WorldWideScience

Sample records for software improvement tools

  1. A coherent environment of software improvement tools for CMS

    CERN Document Server

    Eulisse, G; Osborne, I; Taylor, L; Tuura, L A

    2004-01-01

CMS has developed approximately one million lines of C++ code and uses many more from HEP, Grid and public domain projects. We describe a suite of tools which help to manage this complexity by measuring software dependencies, quality metrics, and CPU and memory performance. This coherent environment integrates and extends existing open-source tools where possible and provides new in-house components where a suitable solution does not already exist. The environment is freely available, has a graphical user interface, and can be applied to any software without the need to recompile or instrument it. We have developed Ignominy, which performs software dependency analysis of source code, binary products and external software. CPU profiling is provided based on oprofile, with added features such as profile snapshots, distributed profiling and aggregate profiles for farm systems, including server-side tools for collecting profile data. Finally, we have developed a low-overhead performance and memory profiling tool, Mem...
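
The abstract does not include the tools' internals, so the following Python sketch is only a rough, hypothetical illustration of what source-level dependency analysis of the Ignominy kind involves: deriving package-to-package dependencies from C++ #include directives. The directory layout, file suffixes, and header-to-package mapping are all assumptions, not the actual tool.

    # Hypothetical sketch of source-level dependency analysis (not the
    # actual Ignominy implementation): map each package directory to the
    # set of other packages whose headers its C++ sources include.
    import re
    from collections import defaultdict
    from pathlib import Path

    INCLUDE_RE = re.compile(r'#include\s*[<"]([^">]+)[">]')

    def package_dependencies(src_root):
        """Assumes a layout src_root/<package>/.../file.{h,cc,cpp,cxx}
        where the first path component of an included header names the
        package that owns it."""
        deps = defaultdict(set)
        root = Path(src_root)
        for path in root.rglob("*"):
            if path.suffix not in {".h", ".cc", ".cpp", ".cxx"}:
                continue
            pkg = path.relative_to(root).parts[0]
            for line in path.read_text(errors="ignore").splitlines():
                m = INCLUDE_RE.match(line.strip())
                if m and "/" in m.group(1):
                    target = m.group(1).split("/")[0]
                    if target != pkg:
                        deps[pkg].add(target)
        return deps

    if __name__ == "__main__":
        for pkg, targets in sorted(package_dependencies("src").items()):
            print(pkg, "->", ", ".join(sorted(targets)))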

  2. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the speed with which deformed areas were identified. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where DEM coverage exists.
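
The Hack (1973) method the paper builds on rests on the stream length-gradient (SL) index. As a minimal sketch only (this is not the Knickpoint Finder code, and the synthetic profile and threshold are assumptions), candidate knickpoints can be flagged where SL is anomalously high along a longitudinal river profile:

    # Illustrative sketch: compute Hack's stream length-gradient index
    # SL = (dH/dL) * L along a river profile and flag reaches whose SL
    # greatly exceeds the profile median (threshold is an assumption).
    def sl_index(distances, elevations):
        """distances: downstream distance from the channel head (m);
        elevations: channel elevation at each station (m)."""
        out = []
        for i in range(1, len(distances)):
            dL = distances[i] - distances[i - 1]
            dH = elevations[i - 1] - elevations[i]   # elevation drop
            midpoint = 0.5 * (distances[i] + distances[i - 1])
            out.append((midpoint, (dH / dL) * midpoint))
        return out

    def candidate_knickpoints(profile, factor=2.0):
        values = sorted(sl for _, sl in profile)
        median = values[len(values) // 2]
        return [(x, sl) for x, sl in profile if sl > factor * median]

    # Synthetic profile with a slope break around x = 5000 m (assumed).
    xs = [i * 500.0 for i in range(13)]
    zs = [1000 - 0.01 * x if x < 5000 else 950 - 0.05 * (x - 5000)
          for x in xs]
    print(candidate_knickpoints(sl_index(xs, zs)))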

  3. Tuning COCOMO-II for Software Process Improvement: A Tool Based Approach

    Directory of Open Access Journals (Sweden)

    SYEDA UMEMA HANI

    2016-10-01

In order to compete in the international software development market, software organizations have to adopt internationally accepted software practices, i.e. standards such as ISO (International Standard Organization) or CMMI (Capability Maturity Model Integration), in spite of having scarce resources and tools. The aim of this study is to develop a tool which could be used to present an actual picture of the benefits of Software Process Improvement to software development companies. The few tools available to assist in making such predictions are too expensive and do not cover datasets that reflect the cultural behavior of software development organizations in developing countries. Extending our previously reported research, which quantified the benefits of SDPI (Software Development Process Improvement) for Pakistani software development organizations, this research used sixty-two datasets from three different software development organizations against the set of metrics used in COCOMO-II (Constructive Cost Model 2000). It derived a verifiable equation for calculating ISF (Ideal Scale Factor) and tuned the COCOMO-II model to provide prediction capability for SDPI benefit-measurement classes such as ESCP (Effort, Schedule, Cost, and Productivity). This research contributes to the software industry by giving a reliable and low-cost mechanism for generating prediction models with high prediction accuracy. Hopefully, this study will help software organizations use this tool not only to predict ESCP but also to predict the exact impact of SDPI.
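
For readers unfamiliar with the model being tuned: COCOMO II.2000 estimates effort as PM = A * Size^E * prod(EM_i), with E = B + 0.01 * sum(SF_j). The sketch below evaluates that published nominal equation with the standard A = 2.94 and B = 0.91 calibration constants; the size, scale factors, and effort multipliers are invented inputs for illustration and are not the paper's datasets or its ISF tuning.

    # Nominal COCOMO II.2000 effort equation (the model this study
    # tunes), with illustrative inputs; the ISF tuning is not shown.
    A, B = 2.94, 0.91          # COCOMO II.2000 calibration constants

    def cocomo2_effort(ksloc, scale_factors, effort_multipliers):
        """Effort in person-months: PM = A * Size^E * prod(EM),
        where E = B + 0.01 * sum(SF)."""
        E = B + 0.01 * sum(scale_factors)
        pm = A * ksloc ** E
        for em in effort_multipliers:
            pm *= em
        return pm

    # Assumed example: 50 KSLOC, five scale factors, three cost drivers.
    print(round(cocomo2_effort(50, [3.72, 3.04, 4.24, 3.29, 4.68],
                               [1.10, 0.87, 1.00]), 1), "person-months")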

  4. GMFilter and SXTestPlate: software tools for improving the SNPlex™ genotyping system

    Directory of Open Access Journals (Sweden)

    Schreiber Stefan

    2009-03-01

Background: Genotyping of single-nucleotide polymorphisms (SNPs) is a fundamental technology in modern genetics. The SNPlex™ mid-throughput genotyping system (Applied Biosystems, Foster City, CA, USA) enables the multiplexed genotyping of up to 48 SNPs simultaneously in a single DNA sample. The high level of automation and the large amount of data produced in a high-throughput laboratory require advanced software tools for quality control and workflow management. Results: We have developed two programs, which address two main aspects of quality control in a SNPlex™ genotyping environment: GMFilter improves the analysis of SNPlex™ plates by removing wells with a low overall signal intensity. It enables scientists to automatically process the raw data in a standardized way before analyzing a plate with the proprietary GeneMapper software from Applied Biosystems. SXTestPlate examines the genotype concordance of a SNPlex™ test plate, which was typed with a control SNP set. This program allows for regular quality control checks of a SNPlex™ genotyping platform; it is compatible with other genotyping methods as well. Conclusion: GMFilter and SXTestPlate provide a valuable tool set for laboratories engaged in genotyping based on the SNPlex™ system. The programs enhance the analysis of SNPlex™ plates with the GeneMapper software and enable scientists to evaluate the performance of their genotyping platform.
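
The GMFilter step can be pictured with the minimal sketch below, which discards wells whose mean signal intensity falls below a cutoff before a plate goes on to genotype analysis. The data layout and threshold are invented and do not reflect GMFilter's actual input format or defaults.

    # Illustration only (not GMFilter itself): remove wells with a low
    # overall signal intensity before a plate is analysed further.
    def filter_wells(plate, min_intensity=1000.0):
        """plate: dict mapping well IDs (e.g. 'A01') to lists of raw
        per-SNP signal intensities. Returns the wells that pass."""
        kept = {}
        for well, signals in plate.items():
            if sum(signals) / len(signals) >= min_intensity:
                kept[well] = signals
        return kept

    plate = {"A01": [1500.0, 1800.0], "A02": [120.0, 90.0]}  # assumed
    print(sorted(filter_wells(plate)))   # -> ['A01']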

  5. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.

  6. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.

  7. Machine Tool Software

    Science.gov (United States)

    1988-01-01

A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using APT (Automatically Programmed Tool) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for control of metal-cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  8. Improvement of a free software tool for the assessment of sediment connectivity

    Science.gov (United States)

    Crema, Stefano; Lanni, Cristiano; Goldin, Beatrice; Marchi, Lorenzo; Cavalli, Marco

    2015-04-01

Sediment connectivity expresses the degree of linkage that controls sediment fluxes throughout the landscape, in particular between sediment sources and downstream areas. The assessment of sediment connectivity becomes a key issue when dealing with risk mitigation and the prioritization of interventions in the territory. In this work, the authors report the improvements made to an open-source, stand-alone application (SedInConnect, http://www.sedalp.eu/download/tools.shtml), along with extensive applications to alpine catchments. SedInConnect calculates a sediment connectivity index as expressed in Cavalli et al. (2013); the software improvements consisted primarily in the introduction of the sink feature, i.e. areas that act as traps for sediment produced upstream (e.g., lakes, sediment traps). Based on user-defined sinks, the software decouples those parts of the catchment that do not deliver sediment to a selected target of interest (e.g., fan apex, main drainage network). In this way the assessment of sediment connectivity takes into consideration only the effective sediment contributing areas. Sediment connectivity analysis has been carried out on several catchments in the South Tyrol alpine area (Northern Italy) with the goal of achieving a fast and objective characterization of the topographic control on sediment transfer. In addition to depicting the variability of sediment connectivity inside each basin, the index of connectivity has proved to be a valuable indicator of the dominant process characterizing the basin sediment dynamics (debris flow, bedload, mixed behavior). The characterization of the dominant process is of great importance for hazard and risk assessment in mountain areas, and for the choice and design of structural and non-structural intervention measures. The recognition of the dominant sediment transport process by the index of connectivity is in agreement with evidence arising from post-event field surveys and with the application of
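
For reference, the index of connectivity of Cavalli et al. (2013) that SedInConnect computes is commonly written as follows (in LaTeX; this reproduces the published general form, not the new sink-handling logic described above):

    IC = \log_{10}\!\left(\frac{D_{up}}{D_{dn}}\right)
       = \log_{10}\!\left(\frac{\bar{W}\,\bar{S}\,\sqrt{A}}
                               {\sum_{i}\frac{d_i}{W_i\,S_i}}\right)

where A is the upslope contributing area, \bar{W} and \bar{S} are its mean weighting factor and mean slope, and d_i, W_i, S_i are the length, weight and slope of the i-th cell along the downslope flow path towards the target; the sink feature described above removes from this computation the areas that drain into a user-defined sink.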

  9. Software Tool Support to Specify and Verify Scientific Sensor Data Properties to Improve Anomaly Detection

    Science.gov (United States)

    Gallegos, I.; Gates, A. Q.; Tweedie, C.; Cybershare

    2010-12-01

Advancements in scientific sensor data acquisition technologies, such as wireless sensor networks and robotic trams equipped with sensors, are increasing the amount of data being collected at field sites. This elevates the challenges of verifying the quality of streamed data and monitoring the correct operation of the instrumentation. Without the ability to evaluate the data collection process in near real time, scientists can lose valuable time and data. In addition, scientists have to rely on their knowledge and experience in the field to evaluate data quality. Such knowledge is rarely shared or reused by other scientists, mostly because of the lack of a well-defined methodology and tool support. Numerous scientific projects address anomaly detection, mostly as part of the verification system's source code; however, anomaly-detection properties, which are often embedded or hard-coded in the source code, are difficult to refine. In addition, a software developer is required to modify the source code every time a new anomaly-detection property, or a modification to an existing property, is needed. This poster describes the tool support that has been developed, based on software engineering techniques, to address these challenges. The overall tool support allows scientists to specify and reuse anomaly-detection properties generated using the specification tool and to use the specified properties to conduct automated anomaly detection in near real time. The anomaly-detection mechanism is independent of the system used to collect the sensor data. With guidance provided by a classification and categorization of anomaly-detection properties, the user specifies properties on scientific sensor data. The properties, which can be associated with particular field sites or instrumentation, document knowledge about data anomalies that otherwise would have limited availability to the scientific community.
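
The poster's specification language is not reproduced in the abstract. As an invented example of the kind of declarative anomaly-detection property it describes (checks applied to a streamed sensor series independently of the acquisition system), consider this sketch; the property names and limits are assumptions:

    # Sketch of declarative anomaly-detection properties for sensor data
    # (invented example; not the poster's actual specification tool).
    def in_range(lo, hi):
        return lambda prev, cur: lo <= cur <= hi

    def max_step(limit):
        return lambda prev, cur: prev is None or abs(cur - prev) <= limit

    # Properties for an (assumed) air-temperature sensor at a field site.
    properties = {
        "plausible range [-40, 60] C": in_range(-40.0, 60.0),
        "step change <= 5 C": max_step(5.0),
    }

    def check_stream(values, properties):
        prev = None
        for i, cur in enumerate(values):
            for name, ok in properties.items():
                if not ok(prev, cur):
                    print(f"sample {i}: value {cur} violates '{name}'")
            prev = cur

    check_stream([21.0, 21.4, 35.2, 80.0], properties)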

  10. Improving Software Reliability Forecasting

    NARCIS (Netherlands)

    Burtsy, Bernard; Albeanu, Grigore; Boros, Dragos N.; Popentiu, Florin; Nicola, V.F.

    1996-01-01

This work investigates some methods for software reliability forecasting. A supermodel is presented as a suitable tool for prediction of reliability in software project development. Also, time series forecasting for cumulative interfailure time is proposed and illustrated.

  11. CSAM Metrology Software Tool

    Science.gov (United States)

    Vu, Duc; Sandor, Michael; Agarwal, Shri

    2005-01-01

CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners; hence, interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring the inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing and can be exported to any database-processing software.
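
CMeST's algorithms are not given in the abstract; purely as a toy illustration of quantifying delamination from color information, the sketch below assumes that strongly red pixels mark delaminated area (an invented color coding) and reports the flagged fraction of an image:

    # Toy illustration of flagging delamination area in a false-color
    # CSAM image; the red-means-delamination coding and thresholds are
    # assumptions, not CMeST's actual algorithm.
    def delamination_fraction(pixels, red_min=200, other_max=80):
        flagged = sum(1 for (r, g, b) in pixels
                      if r >= red_min and g <= other_max and b <= other_max)
        return flagged / len(pixels)

    image = [(250, 20, 30), (10, 200, 40), (230, 60, 50), (0, 0, 0)]  # RGB
    print(f"{delamination_fraction(image):.0%} of area flagged")  # -> 50%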

  12. Improving Agile Software Practice

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte

    2006-01-01

Software process improvement in small and agile organizations is often problematic, but achieving good SPI assessments can still be necessary to stay in the market or to meet the demands of multinational owners. The traditional norm-driven, centralized and control-centered improvement approaches has... from their new multinational owners. In the project we experimented to reach a less centralized and control-centered SPI approach, trying to meet the agile culture of the firm in diagnosing, improvement planning, process design and evaluation, even though the goal was complying with the norm. After... having chosen the improvement area, requirements management, a more formal culture assessment and comparisons with the culture of the CMM norm helped guide the design of the new processes and tools. The paper suggests an SPI approach based on problem diagnosis instead of formal CMMI assessment, culture...

  13. Data-Driven Decision Making as a Tool to Improve Software Development Productivity

    Science.gov (United States)

    Brown, Mary Erin

    2013-01-01

The worldwide software project failure rate, based on a survey of information technology software managers' views of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36%, and software project success has not kept pace with the advances in hardware. The problem addressed by this study was the limited…

  14. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...... and time estimation skills but that the productivity did not decrease and the resulting product quality was improved. The implications of these findings are briefly addressed....

  15. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners... It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice, with a focus on the software process policymaking and process control aspects of improvement efforts...

  16. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  17. Modern Tools for Modern Software

    Energy Technology Data Exchange (ETDEWEB)

    Kumfert, G; Epperly, T

    2001-10-31

This is a proposal for a new software configure/build tool for building, maintaining, deploying, and installing software. At its completion, this new tool will replace current standard tool suites such as autoconf, automake, and libtool, and the de facto standard build tool, make. This ambitious project is born out of the realization that as scientific software has grown in size and complexity over the years, the difficulty of configuring and building software has increased as well. For high-performance scientific software, additional complexities often arise from the need for portability to multiple platforms (including many one-of-a-kind platforms), multilanguage implementations, use of third-party libraries, and a need to adapt algorithms to the specific features of the hardware. Development of scientific software is being hampered by the quality of the configuration and build tools commonly available. Inordinate amounts of time and expertise are required to develop and maintain the configure and build system for a moderately complex project. Better build and configure tools will increase developer productivity. This proposal is a first step in a process of shoring up the foundation upon which DOE software is created and used.

  18. A Software Tool for Improved Noise Source Identification and Understanding Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Innovative Technology Applications Company and Drs. P. Morris and K. Brentner will make improvements in noise prediction and measurement methods for subsonic and...

  19. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...

  20. Software Engineering Improvement Activities/Plan

    Science.gov (United States)

    2003-01-01

bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing a literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.

  1. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  2. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter

    2016-01-01

Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out... directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only a few theories. In particular, standard SPI models are analyzed and evaluated for applicability...

  3. SACA: Software Assisted Call Analysis--an interactive tool supporting content exploration, online guidance and quality improvement of counseling dialogues.

    Science.gov (United States)

    Trinkaus, Hans L; Gaisser, Andrea E

    2010-09-01

    Nearly 30,000 individual inquiries are answered annually by the telephone cancer information service (CIS, KID) of the German Cancer Research Center (DKFZ). The aim was to develop a tool for evaluating these calls, and to support the complete counseling process interactively. A novel software tool is introduced, based on a structure similar to a music score. Treating the interaction as a "duet", guided by the CIS counselor, the essential contents of the dialogue are extracted automatically. For this, "trained speech recognition" is applied to the (known) counselor's part, and "keyword spotting" is used on the (unknown) client's part to pick out specific items from the "word streams". The outcomes fill an abstract score representing the dialogue. Pilot tests performed on a prototype of SACA (Software Assisted Call Analysis) resulted in a basic proof of concept: Demographic data as well as information regarding the situation of the caller could be identified. The study encourages following up on the vision of an integrated SACA tool for supporting calls online and performing statistics on its knowledge database offline. Further research perspectives are to check SACA's potential in comparison with established interaction analysis systems like RIAS. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.

  4. Design of parametric software tools

    DEFF Research Database (Denmark)

    Sabra, Jakob Borrits; Mullins, Michael

    2011-01-01

The studies investigate the field of evidence-based design used in architectural design practice and propose a method using 2D/3D CAD applications to: 1) enhance integration of evidence-based design knowledge in architectural design phases with a focus on lighting and interior design and 2) assess fulfilment of evidence-based design criteria regarding light distribution and location in relation to patient safety in architectural health care design proposals. The study uses the 2D/3D CAD modelling software Rhinoceros 3D with the plug-in Grasshopper to create parametric tool prototypes to exemplify the operations and functions of the design method. To evaluate the prototype potentials, surveys with architectural and healthcare design companies are conducted. Evaluation is done by the administration of questionnaires being part of the development of the tools. The results show that architects, designers...

  5. Software Tools Streamline Project Management

    Science.gov (United States)

    2009-01-01

Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has exponentially reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard; Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; the Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention, Query

  6. Improving Software Citation and Credit

    CERN Document Server

    Allen, Alice; DuPrie, Kimberly; Mink, Jessica; Nemiroff, Robert; Robitaille, Thomas; Shamir, Lior; Shortridge, Keith; Taylor, Mark; Teuben, Peter; Wallin, John

    2015-01-01

    The past year has seen movement on several fronts for improving software citation, including the Center for Open Science's Transparency and Openness Promotion (TOP) Guidelines, the Software Publishing Special Interest Group that was started at January's AAS meeting in Seattle at the request of that organization's Working Group on Astronomical Software, a Sloan-sponsored meeting at GitHub in San Francisco to begin work on a cohesive research software citation-enabling platform, the work of Force11 to "transform and improve" research communication, and WSSSPE's ongoing efforts that include software publication, citation, credit, and sustainability. Brief reports on these efforts were shared at the BoF, after which participants discussed ideas for improving software citation, generating a list of recommendations to the community of software authors, journal publishers, ADS, and research authors. The discussion, recommendations, and feedback will help form recommendations for software citation to those publishers...

  7. Tools & training for more secure software

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Just by fate of nature, software today is shipped out as “beta”, coming with vulnerabilities and weaknesses, which should already have been fixed at the programming stage. This presentation will show the consequences of suboptimal software, why good programming, thorough software design, and a proper software development process is imperative for the overall security of the Organization, and how a few simple tools and training are supposed to make CERN software more secure.

  8. 2016 International Conference on Software Process Improvement

    CERN Document Server

    Muñoz, Mirna; Rocha, Álvaro; Feliu, Tomas; Peña, Adriana

    2017-01-01

    This book offers a selection of papers from the 2016 International Conference on Software Process Improvement (CIMPS’16), held between the 12th and 14th of October 2016 in Aguascalientes, Aguascalientes, México. The CIMPS’16 is a global forum for researchers and practitioners to present and discuss the most recent innovations, trends, results, experiences and concerns in the different aspects of software engineering with a focus on, but not limited to, software processes, security in information and communication technology, and big data. The main topics covered include: organizational models, standards and methodologies, knowledge management, software systems, applications and tools, information and communication technologies and processes in non-software domains (mining, automotive, aerospace, business, health care, manufacturing, etc.) with a clear focus on software process challenges.

  9. FFI: A software tool for ecological monitoring

    Science.gov (United States)

    Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman

    2009-01-01

    A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...

  10. Texas traffic thermostat software tool.

    Science.gov (United States)

    2013-04-01

The traffic thermostat decision tool is built to help guide the user through a logical, step-wise process of examining potential changes to their managed lane/toll facility. : **NOTE: Project Title: Application of the Traffic Thermostat Framework. Ap...

  11. A software tool for network intrusion detection

    CSIR Research Space (South Africa)

    Van der Walt, C

    2012-10-01

This presentation illustrates how a recently developed software tool enables operators to easily monitor a network and detect intrusions without requiring expert knowledge of network intrusion detection...

  12. Culture shock: Improving software quality

    Energy Technology Data Exchange (ETDEWEB)

    de Jong, K.; Trauth, S.L.

    1988-01-01

The concept of software quality can represent a significant shock to an individual who has been developing software for many years and who believes he or she has been doing a high-quality job. The very idea that software includes lines of code and associated documentation is foreign and difficult to grasp, at best. Implementation of a software quality program hinges on the concept that software is a product whose quality needs improving. When this idea is introduced into a technical community that is largely "self-taught" and has been producing "good" software for some time, a fundamental understanding of the concepts associated with software is often weak. Software developers can react as if to say, "What are you talking about? What do you mean I'm not doing a good job? I haven't gotten any complaints about my code yet!" Coupling such surprise and resentment with the shock that software really is a product and software quality concepts do exist can fuel the volatility of these emotions. In this paper, we demonstrate that the concept of software quality can indeed pose a culture shock to developers. We also show that a "typical" quality assurance approach, that of imposing a standard and providing inspectors and auditors to assure its adherence, contributes to this shock and detracts from the very goal the approach should achieve. We offer an alternative, adopted through experience, to implement a software quality program: cooperative assistance. We show how cooperation, education, consultation and friendly assistance can overcome this culture shock. 3 refs.

  13. Improving ATLAS reprocessing software

    CERN Document Server

    Novak, Tadej

    2014-01-01

For my CERN Summer Student programme I have been working with the ATLAS reprocessing group. Data taken at the ATLAS experiment is not only processed after being taken, but is also reprocessed multiple times afterwards. This allows applying new alignments, calibrating the detector, and using improved or faster algorithms. Reprocessing is usually done in campaigns for different periods of data or for different interest groups. The idea of my project was to simplify the definition of tasks and the monitoring of their progress. I created a LIST configuration file generator script in Python and a monitoring webpage for tracking current reprocessing tasks.

  14. A Software Tool for Legal Drafting

    Directory of Open Access Journals (Sweden)

    Daniel Gorín

    2011-09-01

Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with by software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that utilize model checking to find normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.
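
As a toy illustration (our own, not from the paper) of the LTL-style properties a FormaLex-like tool could model-check, an obligation and a prohibition from a contract might be rendered in LaTeX as:

    G\,(\mathit{invoice\_issued} \rightarrow F\,\mathit{payment\_made})
    \qquad
    G\,\lnot(\mathit{is\_minor} \wedge \mathit{signs\_contract})

The first formula states that every issued invoice is eventually paid (an obligation), the second that a minor never signs the contract (a prohibition); a model checker can then search for traces violating a formula, and a normative incoherence surfaces when the conjunction of all formulas is unsatisfiable.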

  15. Tool Support for Software Lookup Table Optimization

    Directory of Open Access Journals (Sweden)

    Chris Wilcox

    2011-01-01

A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation, such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.
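
Mesa's C/C++ source-to-source transformations are not reproduced here, but the underlying LUT technique is simple to sketch: sample the expensive function once over a bounded domain, then answer later calls from the table. The Python sketch below is a generic illustration; the function, domain, and table size are assumptions.

    # Generic software lookup-table (LUT) optimization sketch: trade a
    # small, controllable approximation error for cheaper evaluation.
    import math

    class Lut:
        def __init__(self, fn, lo, hi, entries):
            self.lo, self.hi = lo, hi
            self.step = (hi - lo) / (entries - 1)
            self.table = [fn(lo + i * self.step) for i in range(entries)]

        def __call__(self, x):
            # Nearest-entry lookup; clamp to the sampled domain.
            x = min(max(x, self.lo), self.hi)
            return self.table[int((x - self.lo) / self.step + 0.5)]

    # Assumed use: exp() on [0, 10] with 4096 entries.
    fast_exp = Lut(math.exp, 0.0, 10.0, 4096)
    x = 3.14159
    print(math.exp(x), fast_exp(x), abs(math.exp(x) - fast_exp(x)))

The table size directly exposes the performance/accuracy tradeoff the paper discusses: more entries mean lower approximation error but a larger memory footprint.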

  16. Software tool for physics chart checks.

    Science.gov (United States)

    Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa

    2014-01-01

Physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports, were identified, studied, and summarized. Meanwhile, the reported events related to the physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed under Microsoft .NET in C# to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the authors' radiation oncology clinic. During over 1 year of use, the tool has proven very helpful in chart check management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process. The software tool is potentially useful for any radiation oncology clinic that is either in the process of pursuing or maintaining American College of Radiology accreditation.

  17. Improved water δ2H and δ18O calibration and calculation of measurement uncertainty using a simple software tool.

    Science.gov (United States)

    Gröning, Manfred

    2011-10-15

The calibration of all δ2H and δ18O measurements on the VSMOW/SLAP scale should be performed consistently, based on similar principles, independent of the instrumentation used. The basic principles of a comprehensive calibration strategy are discussed, taking water as an example. The most common raw-data corrections, for memory and drift effects, are described. Those corrections result in a considerable improvement in data consistency, especially in laboratories analyzing samples of quite variable isotopic composition (e.g. doubly labelled water). The need for a reliable uncertainty assessment for all measurements is discussed and an easy implementation method proposed. A versatile evaluation method based on Excel macros and spreadsheets is presented. It corrects measured raw data for memory and drift effects, performs the calibration and calculates the combined standard uncertainty for each measurement. It allows the easy implementation of the discussed principles in any user laboratory. Following these principles will improve the comparability of data among laboratories. Copyright © 2011 John Wiley & Sons, Ltd.
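
The paper's Excel-macro implementation is not reproduced here, but two of the steps it describes, drift correction and two-point VSMOW/SLAP normalization, can be sketched as below (memory correction is omitted for brevity). The measured values and drift rate are invented; SLAP's assigned δ18O of -55.5 per mil relative to VSMOW is the conventional value.

    # Sketch of part of the calibration chain described in the paper
    # (illustrative only; the actual tool is an Excel macro/spreadsheet).

    def drift_correct(delta, t, drift_per_hour):
        """Remove a linear instrumental drift estimated from repeated
        standards; t is the analysis time in hours since run start."""
        return delta - drift_per_hour * t

    def vsmow_slap_scale(measured_vsmow, measured_slap, slap_assigned):
        """Two-point normalization: return a function mapping raw delta
        values onto the VSMOW/SLAP scale (VSMOW is 0 by definition)."""
        a = slap_assigned / (measured_slap - measured_vsmow)
        return lambda d: a * (d - measured_vsmow)

    # Assumed example for delta-18O: standards measured at -0.3 and
    # -54.1 per mil; SLAP is assigned -55.5 per mil by convention.
    calibrate = vsmow_slap_scale(-0.3, -54.1, -55.5)
    raw = drift_correct(-10.02, t=5.0, drift_per_hour=0.004)
    print(round(calibrate(raw), 2), "per mil vs VSMOW")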

  18. Object Relational Mapping Tool and Business Productivity Software Interfaces Group

    Directory of Open Access Journals (Sweden)

    Freddy Patricio Baño Naranjo

    2016-08-01

Software development seeks to improve business productivity through automation and the use of tools. Companies engaged in software development aim to help other companies with automation and with the development of tools for this purpose, but they often forget their own productivity. ORM (Object Relational Mapping) tools have been devised for this purpose, to avoid repeating many lines of abstraction-layer programming, but each has its own standard or language, so additional time is required to adopt them. The ORM developed here aims to provide a tool matched to the company's needs and its standard for programming interfaces, so that software development can focus on other, more important aspects, such as the design of the database, the user interface and the business layer, leaving the abstraction layer almost entirely aside. Using the ORM tool reduced the development time of the test application.
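
As a generic miniature of what any ORM automates (not this article's tool, whose interfaces follow the company's own standards), the following Python sketch maps a class to a relational table so that routine saves and loads need no hand-written SQL at the call site:

    # Generic ORM idea in miniature (illustration only): one mapping
    # class hides the repetitive SQL of the abstraction layer.
    import sqlite3
    from dataclasses import dataclass

    @dataclass
    class User:
        id: int
        name: str

    class UserMapper:
        def __init__(self, conn):
            self.conn = conn
            conn.execute("CREATE TABLE IF NOT EXISTS users "
                         "(id INTEGER PRIMARY KEY, name TEXT)")

        def save(self, user):
            self.conn.execute("INSERT OR REPLACE INTO users VALUES (?, ?)",
                              (user.id, user.name))

        def get(self, user_id):
            row = self.conn.execute(
                "SELECT id, name FROM users WHERE id = ?",
                (user_id,)).fetchone()
            return User(*row) if row else None

    conn = sqlite3.connect(":memory:")
    mapper = UserMapper(conn)
    mapper.save(User(1, "Ada"))
    print(mapper.get(1))   # -> User(id=1, name='Ada')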

  19. Improved tool grinding machine

    Science.gov (United States)

    Dial, C.E. Sr.

    The present invention relates to an improved tool grinding mechanism for grinding single point diamond cutting tools to precise roundness and radius specifications. The present invention utilizes a tool holder which is longitudinally displaced with respect to the remainder of the grinding system due to contact of the tool with the grinding surface with this displacement being monitored so that any variation in the grinding of the cutting surface such as caused by crystal orientation or tool thicknesses may be compensated for during the grinding operation to assure the attainment of the desired cutting tool face specifications.

  20. Social Networks in Software Process Improvement

    DEFF Research Database (Denmark)

    Nielsen, Peter Axel; Tjørnehøj, Gitte

    2010-01-01

Software process improvement in small organisations is often problematic, and communication and knowledge sharing are more informal. To improve software processes, we need to understand how developers communicate and share knowledge. In this article we have studied the company SmallSoft through action research. In the action research we have applied the framework of social network analysis, and we show that it can be used to understand the underlying structures of communication and knowledge sharing between software developers and managers. We show in detail how the analysis can be done and how management can utilise the findings. From this we conclude that social network analysis was a useful framework together with accompanying tools and techniques. Copyright © 2009 John Wiley & Sons, Ltd.

  1. Software Tools Used for Continuous Assessment

    Directory of Open Access Journals (Sweden)

    Corina SBUGHEA

    2016-04-01

The present paper addresses the subject of continuous evaluation and of the IT tools that support it. The approach starts from the main concepts and methods used in the teaching process, according to the assessment methodology, and then it focuses on their implementation in the Wondershare QuizCreator software.

  2. Commercial Expert-System-Building Software Tools

    Science.gov (United States)

    Gevarter, William B.

    1989-01-01

Report evaluates commercially available expert-system-building tools in terms of structures, representations of knowledge, inference mechanisms, interfaces with developers and end users, and capabilities of performing such functions as diagnosis and design. These software tools are commercialized derivatives of artificial-intelligence systems developed by researchers at universities and research organizations. They reduce the time to develop an expert system by an order of magnitude compared to that required with such traditional artificial-intelligence development languages as LISP. Table lists 20 such tools, rating attributes as strong, fair, programmable by user, or having no capability in various criteria.

  3. ATLAS software configuration and build tool optimisation

    CERN Document Server

    Rybkin, G; The ATLAS collaboration

    2013-01-01

The ATLAS software code base is over 6 million lines, organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers, and is used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build-time performance, which was optimised through several approaches: reduction of the number of re...

  4. ATLAS software configuration and build tool optimisation

    CERN Document Server

    Rybkin, G; The ATLAS collaboration

    2014-01-01

The ATLAS software code base is over 6 million lines, organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers, and is used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build-time performance, which was optimised through several approaches: reduction of the number of re...

  5. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere- ocean model called ModelE, written in fixed format Fortran.

  6. Three novel software tools for ASDEX Upgrade

    Energy Technology Data Exchange (ETDEWEB)

Martinov, S. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Löbhard, T. [Conovum GmbH & Co. KG, Nymphenburger Straße 13, D-80335 München (Germany); Lunt, T. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Behler, K., E-mail: karl.behler@ipp.mpg.de [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Drube, R.; Eixenberger, H.; Herrmann, A.; Lohs, A. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Lüddecke, K. [Unlimited Computer Systems GmbH, Seeshaupterstr. 15, D-82393 Iffeldorf (Germany); Merkel, R.; Neu, G.; ASDEX Upgrade Team [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); MPCDF Garching [Max Planck Computing and Data Facility, Boltzmannstr. 2, D-85748 Garching (Germany)]

    2016-11-15

Highlights: • Key features of innovative software tools for data visualization and inspection are presented to the nuclear fusion research community. • 3D animation of experiment geometry together with diagnostic data and images allows better understanding of measurements and of the influence of machine construction details behind them. • A multi-video viewer with fusion-relevant image manipulation abilities and event database features allows faster and better decision making from video streams coming from various plasma and machine diagnostics. • Platform-independent Web technologies enable the inspection of diagnostic raw signals with virtually any kind of display device. - Abstract: Visualization of measurements together with experimental settings is a general subject in experiment analysis. The complex engineering design, 3D geometry, and manifold of diagnostics in larger fusion research experiments justify the development of special analysis and visualization programs. Novel ASDEX Upgrade (AUG) software tools bring together virtual navigation through 3D device models and advanced play-back and interpretation of video streams from plasma discharges. A third little tool allows web-based, platform-independent observation of real-time diagnostic signals. While all three tools stem from spontaneous development ideas and are not considered mission-critical for the operation of a fusion device, with time and growing completeness they have shaped up as valuable helpers to visualize acquired data in fusion research. A short overview of the goals, the features, and the design as well as the operation of these tools is given in this paper.

  7. Knowledge Architect : A Tool Suite for Managing Software Architecture Knowledge

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris

    2009-01-01

Management of software architecture knowledge (AK) is vital for improving an organization's architectural capabilities. To support the architecting process within our industrial partner Astron, the Dutch radio astronomy institute, we implemented the Knowledge Architect (KA): a tool suite for

  8. Structure and software tools of AIDA.

    Science.gov (United States)

    Duisterhout, J S; Franken, B; Witte, F

    1987-01-01

AIDA consists of a set of software tools to allow for fast development of easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. This relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language allows the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA operates terminal-independently and is even, to a great extent, multi-lingual. Most of these features are table-driven; this allows on-line changes in the use of terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build faster, but (of course) less flexible, code from these table definitions. By separating the AIDA software into a source and a run-time version, one is able to write

  9. Software Design Improvements. Part 1; Software Benefits and Limitations

    Science.gov (United States)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

Computer hardware and associated software have been used for many years to process accounting information, to analyze test data, and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The question is: how can this hardware and software be made more reliable? Also, how can software quality be improved? What methodology needs to be applied to large and small software products to improve the design, and how can software be verified?

  10. A software tool for ecosystem services assessments

    Science.gov (United States)

    Riegels, Niels; Klinting, Anders; Butts, Michael; Middelboe, Anne Lise; Mark, Ole

    2017-04-01

The EU FP7 DESSIN project is developing methods and tools for assessment of ecosystem services (ESS) and associated economic values, with a focus on freshwater ESS in urban settings. Although the ESS approach has gained considerable visibility over the past ten years, operationalizing the approach remains a challenge. Therefore, DESSIN is also supporting development of a free software tool to support users implementing the DESSIN ESS evaluation framework. The DESSIN ESS evaluation framework is a structured approach to measuring changes in ecosystem services. The main purpose of the framework is to facilitate the application of the ESS approach in the appraisal of projects that have impacts on freshwater ecosystems and their services. The DESSIN framework helps users evaluate changes in ESS by linking biophysical, economic, and sustainability assessments sequentially. It was developed using the Common International Classification of Ecosystem Services (CICES) and the DPSIR (Drivers, Pressures, States, Impacts, Responses) adaptive management cycle. The former is a standardized system for the classification of ESS developed by the European Union to enhance the consistency and comparability of ESS assessments. The latter is a well-known concept to disentangle the biophysical and social aspects of a system under study. As part of its analytical component, the DESSIN framework also integrates elements of the Final Ecosystem Goods and Services-Classification System (FEGS-CS) of the US Environmental Protection Agency (USEPA). As implemented in the software tool, the DESSIN framework consists of five parts: • In part I of the evaluation, the ecosystem is defined and described and the local stakeholders are identified. In addition, administrative details and objectives of the assessment are defined. • In part II, drivers and pressures are identified. Once these first two elements of the DPSIR scheme have been characterized, the claimed/expected capabilities of a

  11. Mapping social networks in software process improvement

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte; Nielsen, Peter Axel

    2005-01-01

    to map social networks and suggest how it can be used in software process improvement. We applied the mapping approach in a small software company to support the realization of new ways of improving software processes. The mapping approach was found useful in improving social networks, and thus furthers...

  12. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    Software process improvement is a necessity especially since the dynamic nature of today's hardware demands reciprocal improvements in the underlying software systems. Several process improvement models exist where organizations perform an introspective study of the current software development process and ...

  13. A software communication tool for the tele-ICU.

    Science.gov (United States)

    Pimintel, Denise M; Wei, Shang Heng; Odor, Alberto

    2013-01-01

    The Tele Intensive Care Unit (tele-ICU) supports a high-volume, high-acuity population of patients. There is a high volume of incoming and outgoing calls through the tele-ICU hubs, especially during the evening and night hours. The tele-ICU clinicians must be able to communicate effectively with team members in order to support the care of complex and critically ill patients while maintaining a standard that improves time to intervention. This study describes a software communication tool that will improve the time to intervention over the paper-driven communication format presently used in the tele-ICU. The software provides a multi-relational database of message instances from which information can be mined for evaluation and quality improvement for all entities that touch the tele-ICU. The software design incorporates years of critical care and software design experience combined with new skills acquired in an applied Health Informatics program. This software tool will function in the tele-ICU environment and perform as a front-end application that gathers, routes, and displays internal communication messages for intervention by priority and provider.
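    The routing behavior described above (messages gathered and surfaced for intervention by priority and provider) can be sketched with a standard-library priority queue. This is a hedged illustration; the message fields and function names are assumptions, not the study's actual schema.

```python
# Sketch: route tele-ICU messages by priority and provider with heapq.
import heapq, itertools
from dataclasses import dataclass, field

@dataclass(order=True)
class Message:
    priority: int                       # 1 = most urgent
    seq: int                            # tie-breaker keeps FIFO order
    provider: str = field(compare=False)
    text: str = field(compare=False)

counter = itertools.count()
inbox = []

def post(priority, provider, text):
    heapq.heappush(inbox, Message(priority, next(counter), provider, text))

def next_for(provider):
    """Pop the most urgent message addressed to the given provider."""
    kept, found = [], None
    while inbox:
        m = heapq.heappop(inbox)
        if found is None and m.provider == provider:
            found = m
        else:
            kept.append(m)
    for m in kept:                      # restore everything else
        heapq.heappush(inbox, m)
    return found

post(2, "Dr. Lee", "Vent settings review for bed 12")
post(1, "Dr. Lee", "Sepsis alert, bed 4 -- call bedside RN")
print(next_for("Dr. Lee").text)         # the priority-1 sepsis alert first
```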

  14. Software process improvement in the NASA software engineering laboratory

    Science.gov (United States)

    Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin

    1994-01-01

    The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.

  15. How does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable experience are available. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises for instance the optimization of specific activities in the software lifecycle as well as the creation of organization awareness and project culture. In the course of conducting a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Global Software Engineering (GSE) becoming a topic of interest in recent years. Therefore, in this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were...

  16. How Does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable experience are available. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises for instance the optimization of specific activities in the software lifecycle as well as the creation of organization awareness and project culture. In the course of conducting a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Global Software Engineering (GSE) becoming a topic of interest in recent years. Therefore, in this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were...

  17. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  18. SOFTWARE PROCESS ASSESSMENT AND IMPROVEMENT USING MULTICRITERIA DECISION AIDING - CONSTRUCTIVIST

    Directory of Open Access Journals (Sweden)

    Leonardo Ensslin

    2012-12-01

    Full Text Available Software process improvement and software process assessment have received special attention since the 1980s. Some models have been created, but these models rest on a normative approach, where the decision-maker’s participation in a software organization is limited to understanding which process is more relevant to each organization. The proposal of this work is to present the MCDA-C as a constructivist methodology for software process improvement and assessment. The methodology makes it possible to visualize the criteria that must be taken into account according to the decision-makers’ values in the process improvement actions, and to rank those actions in the light of specific organizational needs. This process helped the manager of the company studied to focus on and prioritize process improvement actions. This paper offers an empirical understanding of the application of performance evaluation to software process improvement and identifies tools complementary to the normative models in use today.

  19. Software Maintenance Management Evaluation and Continuous Improvement

    CERN Document Server

    April, Alain

    2008-01-01

    This book explores the domain of software maintenance management and provides road maps for improving software maintenance organizations. It describes full maintenance maturity models organized by levels 1, 2, and 3, which allow for benchmarking and continuous improvement paths. Goals for each key practice area are also provided, and the model presented is fully aligned with the architecture and framework of software development maturity models of CMMI and ISO 15504. It is complete with case studies, figures, tables, and graphs.

  20. Key characteristics relevant for selecting knowledge management software tools

    CSIR Research Space (South Africa)

    Smuts, H

    2011-07-01

    Full Text Available phenomenon that makes the use of technology not an option, but a necessity. Software tools in knowledge management are a collection of technologies and are not necessarily acquired as a single software solution. Furthermore, these knowledge management...

  1. Static and Dynamic Software Quality Metric Tools

    OpenAIRE

    Mayo, Kevin A.; Wake, Steven A.; Henry, Sallie M.

    1990-01-01

    The ability to detect and predict poor software quality is of major importance to software engineers, managers, and quality assurance organizations. Poor software quality leads to increased development costs and expensive maintenance. With so much attention on exacerbated budgetary constraints, a viable alternative is necessary. Software quality metrics are designed for this purpose. Metrics measure aspects of code or PDL representations, and can be collected and used throughout the life ...

  2. Educational Software Tool for Protection System Engineers. Distance Relay

    Directory of Open Access Journals (Sweden)

    Trujillo-Guajardo L.A.

    2012-04-01

    Full Text Available In this article, a graphical software tool aimed at the education of protection system engineers is presented. The theoretical fundamentals used for the design of the operating characteristics of distance relays and their algorithms are presented. The software allows the evaluation and analysis of real-time or simulated events at every stage of the distance relay design. Some example cases are presented to illustrate the activities that can be carried out with the graphical software tool developed.
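    A distance relay trips when the apparent impedance it measures falls inside its operating characteristic. The sketch below shows the textbook mho characteristic such a teaching tool would visualize; the reach, line angle, and phasor values are made-up illustrations, not the tool's own code.

```python
# Sketch: apparent impedance Z = V/I tested against a mho characteristic,
# a circle through the origin with its centre at reach/2 along the line angle.
import cmath

def apparent_impedance(v_phasor, i_phasor):
    """Complex impedance seen by the relay (secondary ohms)."""
    return v_phasor / i_phasor

def mho_trips(z, reach, line_angle_deg=75.0):
    centre = (reach / 2) * cmath.exp(1j * cmath.pi * line_angle_deg / 180)
    return abs(z - centre) <= reach / 2

v = 63.5 * cmath.exp(1j * 0.0)          # 63.5 V secondary, reference phase
i = 5.0 * cmath.exp(-1j * 1.2)          # 5 A lagging by about 69 degrees
z = apparent_impedance(v, i)
print(f"Z = {z:.2f} ohm, trip = {mho_trips(z, reach=15.0)}")
```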

  3. Herramientas libres para modelar software Free tools to model software

    Directory of Open Access Journals (Sweden)

    Mauro Callejas Cuervo Óscar Yovany Baquero Moreno

    2010-11-01

    Full Text Available An observation on free software and its implications for software development processes using 4G tools, carried out by organizations or individuals without astronomical capital and without the monopolistic mentality of dominating the market with costly products that make their vendors multimillionaires yet offer no real guarantee, nor even the possibility of knowing the software one has paid for, much less of modifying it when it does not meet our expectations.

  4. AASERT: Software Tools for Experimentation in Computational Geometry

    National Research Council Canada - National Science Library

    Dobkin, David

    2001-01-01

    This research has considered problems in computer graphics and visualization. The work has aimed to bring theoretical tools to practical problems as well as to develop tools with which to aid in the building of geometric software...

  5. Classroom Live: a software-assisted gamification tool

    Science.gov (United States)

    de Freitas, Adrian A.; de Freitas, Michelle M.

    2013-06-01

    Teachers have come to rely on a variety of approaches in order to elicit and sustain student interest in the classroom. One particular approach, known as gamification, seeks to improve student engagement by transforming the traditional classroom experience into a competitive multiplayer game. Initial attempts at classroom gamification relied on the teacher manually tracking student progress. At the US Air Force Academy, we wanted to experiment with a software gamification tool. Our client/server suite, dubbed Classroom Live, streamlines the gamification process for the teacher by simplifying common tasks. Simultaneously, the tool provides students with an esthetically pleasing user interface that offers in-game rewards in exchange for their participation. Classroom Live is still in development, but our initial experience using the tool has been extremely positive and confirms our belief that students respond positively to gamification, even at the undergraduate level.

  6. Leveraging Code Comments to Improve Software Reliability

    Science.gov (United States)

    Tan, Lin

    2009-01-01

    Commenting source code has long been a common practice in software development. This thesis, consisting of three pieces of work, made novel use of the code comments written in natural language to improve software reliability. Our solution combines Natural Language Processing (NLP), Machine Learning, Statistics, and Program Analysis techniques to…

  7. The Value of Open Source Software Tools in Qualitative Research

    Science.gov (United States)

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  8. Problem Diagnosis in Software Process Improvement

    DEFF Research Database (Denmark)

    Iversen, Jakob; Nielsen, Peter Axel; Nørbjerg, Jacob

    1998-01-01

    This paper addresses software process improvement. In particular it reports on action research undertaken to understand the problems with software processes of a large Danish company. It is argued that in order to understand what the specific problems are we may, on the one hand, rely on process...... models like CMM or Bootstrap. On the other hand, we may also see the specific and unique features of software processes in this company through what we call problem diagnosis. Problem diagnosis deals with eliciting problems perceived by software project managers and with forming commitment structures...

  9. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
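    The combinatorial side of this approach (covering all n-factor interactions with far fewer cases than the full cross product) can be illustrated in a few lines. The sketch below greedily keeps Monte Carlo draws that cover not-yet-seen 2-factor (pairwise) combinations; the parameter names are invented and this is not the tool's actual algorithm.

```python
# Sketch: pairwise (2-factor) combinatorial coverage via random draws.
import itertools, random

params = {
    "mass":   [900, 1000, 1100],
    "thrust": ["low", "nominal", "high"],
    "sensor": ["ok", "degraded"],
}

def pairwise_cases(params, seed=0):
    rng = random.Random(seed)
    names = list(params)
    uncovered = {((a, va), (b, vb))
                 for a, b in itertools.combinations(names, 2)
                 for va in params[a] for vb in params[b]}
    cases = []
    while uncovered:
        case = {n: rng.choice(params[n]) for n in names}   # Monte Carlo draw
        newly = {p for p in uncovered
                 if case[p[0][0]] == p[0][1] and case[p[1][0]] == p[1][1]}
        if newly:                          # keep draws that add coverage
            cases.append(case)
            uncovered -= newly
    return cases

for c in pairwise_cases(params):
    print(c)       # a handful of cases instead of 3 * 3 * 2 = 18
```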

  10. 4th International Conference on Software Process Improvement

    CERN Document Server

    Muñoz, Mirna; Rocha, Álvaro; Calvo-Manzano, Jose

    2016-01-01

    This book contains a selection of papers from the 2015 International Conference on Software Process Improvement (CIMPS’15), held between the 28th and 30th of October in Mazatlán, Sinaloa, México. CIMPS’15 is a global forum for researchers and practitioners who present and discuss the most recent innovations, trends, results, experiences and concerns in several perspectives of Software Engineering, with a clear relationship to, but not limited to, software processes, security in information and communication technology, and the big data field. The main topics covered are: Organizational Models, Standards and Methodologies, Knowledge Management, Software Systems, Applications and Tools, Information and Communication Technologies, and Processes in non-software domains (mining, automotive, aerospace, business, health care, manufacturing, etc.) with a demonstrated relationship to software process challenges.

  11. Towards an Interoperability Ontology for Software Development Tools

    Science.gov (United States)

    2003-03-01

    an ontology allowing interoperability and communication between different software development tools. Lenci defines ontologies as a core ingredient ...

  12. Caesy: A software tool for computer-aided engineering

    Science.gov (United States)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  13. Estimation of toxicity using a Java based software tool

    Science.gov (United States)

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...

  14. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  15. Software Development Methods and Tools: a New Zealand study

    Directory of Open Access Journals (Sweden)

    Chris Phillips

    2005-05-01

    Full Text Available This study is a more detailed follow-up to a preliminary investigation of the practices of software engineers in New Zealand. The focus of this study is on the methods and tools used by software developers in their current organisation. The project involved detailed questionnaires being piloted and sent out to several hundred software developers. A central part of the research involved the identification of factors affecting the use and take-up of existing software development tools in the workplace. The full spectrum of tools from fully integrated I-CASE tools to individual software applications, such as drawing tools was investigated. This paper describes the project and presents the findings.

  16. Software Tools for Fault Management Technologies Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault Management (FM) is a key requirement for safety, efficient onboard and ground operations, maintenance, and repair. QSI's TEAMS Software suite is a leading...

  17. Managing Cultural Variation in Software Process Improvement

    DEFF Research Database (Denmark)

    Kræmmergaard, Pernille; Müller, Sune Dueholm; Mathiassen, Lars

    The scale and complexity of change in software process improvement (SPI) are considerable and managerial attention to organizational culture during SPI can therefore potentially contribute to successful outcomes. However, we know little about the impact of variations in organizational subculture...... CMMI level 2 as planned, ASY struggled to implement even modest improvements. To explain these differences, we analyzed the underlying organizational culture within ISY and ASY using two different methods for subculture assessment. The study demonstrates how variations in culture across software...... organizations can have important implications for SPI outcomes. Furthermore, it provides insights into how software managers can practically assess subcultures to inform decisions about and help prepare plans for SPI initiatives....

  18. EISA 432 Energy Audits Best Practices: Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Maryl Fisher

    2014-11-01

    Five whole building analysis software tools that can aid an energy manager with fulfilling energy audit and commissioning/retro-commissioning requirements were selected for review in this best practices study. A description of each software tool is provided as well as a discussion of the user interface and level of expertise required for each tool, a review of how to use the tool for analyzing energy conservation opportunities, the format and content of reports generated by the tool, and a discussion on the applicability of the tool for commissioning.

  19. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    Science.gov (United States)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.

  20. PLAGIARISM DETECTION PROBLEMS AND ANALYSIS SOFTWARE TOOLS FOR ITS SOLVE

    Directory of Open Access Journals (Sweden)

    V. I. Shynkarenko

    2017-02-01

    Full Text Available Purpose. This study is aimed at: (1) defining plagiarism in texts in formal and natural languages and building a taxonomy of plagiarism; (2) identifying the major problems of plagiarism detection when using automated tools to solve it; (3) analyzing and systematizing the information obtained during the review, testing and analysis of existing detection systems. Methodology. To identify the requirements for plagiarism detection software, methods of analysis of normative documentation (the legislative base) and of competitive tools are applied. To check the requirements, testing methods and a review of GUI interfaces are used. Findings. The paper considers the concept of plagiarism and the issues of its proliferation and classification. A review of existing systems for identifying plagiarism is given, covering desktop applications and online resources. Their functional characteristics are highlighted, the formats of the input and output data and the constraints on them are determined, as are customization features and access. A drill-down of system requirements is made. Originality. The authors propose schemes that complement the existing hierarchical taxonomy of plagiarism. The analysis of existing systems is done in terms of functionality and the possibilities for use on large amounts of data. Practical value. The practical significance is determined by the breadth of the problem of plagiarism in various fields. Ukraine is developing its legal framework for the fight against plagiarism, which requires the active development, improvement and delivery of relevant software (PO). This work contributes to the solution of these problems. The review of existing anti-plagiarism programs, together with the study of research experience in the field and an updated concept of plagiarism, allows the functional performance requirements and the input and output of the developed software to be articulated more fully, and the features of such software to be identified. The article focuses on the features of solving the

  1. An Evaluation Format for "Open" Software Tools.

    Science.gov (United States)

    Murphy, Cheryl A.

    1995-01-01

    Evaluates six "open" (empty of content and customized by users) software programs using the literature-based characteristics of documentation, learner control, branching capabilities, portability, ease of use, and cost-effectiveness. Interviewed computer-knowledgeable individuals to confirm the legitimacy of the evaluative characteristics. (LRW)

  2. Improving ISD Agility in Fast-Moving Software Organizations

    DEFF Research Database (Denmark)

    Persson, John Stouby; Nørbjerg, Jacob; Nielsen, Peter Axel

    2016-01-01

    Fast-moving software organizations must respond quickly to changing technological options and market trends while delivering high-quality services at competitive prices. Improving agility of information systems development (ISD) may reconcile these inherent tensions, but previous research...... study on how to improve ISD agility in a fast-moving software organization. The study maps central problems in the ISD management to direct improvements of agility. Our following intervention addressed method improvements in defining types of ISD by customer relations and integrating the method...... with the task management tool used by the organization. The paper discusses how the action research contributes to our understanding of ISD agility in fast-moving software organizations with a framework for mapping and evaluating improvements of agility. The action research specifically points out that project...

  3. Improving ISD Agility in Fast-moving Software Organizations

    DEFF Research Database (Denmark)

    Persson, John Stouby; Nørbjerg, Jacob; Nielsen, Peter Axel

    2016-01-01

    Fast-moving software organizations must respond quickly to changing technological options and market trends while delivering high-quality services at competitive prices. Improving agility of information systems development (ISD) may reconcile these inherent tensions, but previous research...... study on how to improve ISD agility in a fast-moving software organization. The study maps central problems in the ISD management to direct improvements of agility. Our following intervention addressed method improvements in defining types of ISD by customer relations and integrating the method...... with the task management tool used by the organization. The paper discusses how the action research contributes to our understanding of ISD agility in fast-moving software organizations with a framework for mapping and evaluating improvements of agility. The action research specifically points out that project...

  4. Managing Cultural Variation in Software Process Improvement

    DEFF Research Database (Denmark)

    Müller, Sune Dueholm; Kræmmergaard, Pernille; Mathiassen, Lars

    The scale and complexity of change in software process improvement (SPI) are considerable and managerial attention to organizational culture during SPI can therefore potentially contribute to successful outcomes. However, we know little about the impact of variations in organizational subculture...... organizations can have important implications for SPI outcomes. Furthermore, it provides insights into how software managers can practically assess subcultures to inform decisions about and help prepare plans for SPI initiatives....

  5. Brainwave Monitoring Software Improves Distracted Minds

    Science.gov (United States)

    2014-01-01

    Neurofeedback technology developed at Langley Research Center to monitor pilot awareness inspired Peter Freer to develop software for improving student performance. His company, Fletcher, North Carolina-based Unique Logic and Technology Inc., has gone on to develop technology for improving workplace and sports performance, monitoring drowsiness, and encouraging relaxation.

  6. Software Quality Improvement in the OMC Team

    CERN Document Server

    Maier, Viktor

    Physicists use self-written software as a tool to fulfill their tasks, and often the developed software is used for several years or even decades. If a software product lives for a long time, it has to be changed and adapted to external influences. This implies that the source code has to be read, understood and modified. The same applies to the software of the Optics Measurements and Corrections (OMC) team at CERN. Their task is to track, analyze and correct the beams in the LHC and other accelerators. To solve this task, they resort to a self-written software base with more than 150,000 physical lines of code, which is itself subject to continuous change. Their software does its job and is effective, but it regrettably does not run efficiently, because some parts of the source code are in bad shape and of low quality. The implementation could be faster and more memory-efficient. In addition, the code is difficult to read and understand. Source code files and functions are too big and identifiers do not rev...

  7. Next Generation Static Software Analysis Tools (Dagstuhl Seminar 14352)

    OpenAIRE

    Cousot, Patrick; Kroening, Daniel; Sinz, Daniel

    2014-01-01

    There has been tremendous progress in static software analysis over the last years with, for example, refined abstract interpretation methods, the advent of fast decision procedures like SAT and SMT solvers, new approaches like software (bounded) model checking or CEGAR, or new problem encodings. We are now close to integrating these techniques into every programmer's toolbox. The aim of the seminar was to bring together developers of software analysis tools and algorithms, including ...

  8. Software Tools for Measuring and Calculating Electromagnetic Shielding Effectiveness

    National Research Council Canada - National Science Library

    Tesny, Neal

    2005-01-01

    The evaluation and the analysis of high-altitude electromagnetic pulse response of shielded enclosures require the availability of software tools able to acquire data and calculate shielding effectiveness...

  9. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding through the complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using engineering tools such as MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of a power system. In the fourth chapter, the application of software tools in the project management in power systems ...

  10. Software Construction and Analysis Tools for Future Space Missions

    Science.gov (United States)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting them: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  11. ISWHM: Tools and Techniques for Software and System Health Management

    Science.gov (United States)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation reports the status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: ingredients of a Guidance, Navigation, and Control (GN&C) system; a selected GN&C testbed example; health management of major ingredients; ISWHM testbed architecture; and conclusions and next steps.

  12. From Pragmatic to Systematic Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    Software process improvement (SPI) is a challenging task, as many different stakeholders, project settings, and contexts and goals need to be considered. SPI projects are often operated in a complex and volatile environment and thus require a sound management that is resource-intensive, requiring many stakeholders to contribute to the process assessment, analysis, design, realisation, and deployment. Although there exist many valuable SPI approaches, none address the needs of both process engineers and project managers. This article presents an Artefact-based Software Process Improvement......

  13. Software Tool for Real-Time Power Quality Analysis

    OpenAIRE

    CZIKER, A. C.; CHINDRIS, M. D.; Miron, A

    2013-01-01

    A software tool dedicated to the analysis of power signals containing harmonic and interharmonic components, unbalance, voltage dips and voltage swells is presented. The software tool is a virtual instrument, which uses innovative algorithms based on time- and frequency-domain analysis to process power signals. In order to detect the temporary disturbances, edge detection is proposed, whereas for the harmonic analysis Gaussian filter banks are implemented. Considering that a signal recov...
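    Two of the analyses named above, harmonic magnitudes and dip detection, can be sketched with a short NumPy fragment. This is a hedged illustration using a plain FFT and a one-cycle sliding RMS; it does not reproduce the tool's Gaussian filter banks or edge detector, and all signal parameters are invented.

```python
# Sketch: FFT harmonic magnitudes and RMS-based dip detection with NumPy.
import numpy as np

fs, f0 = 6400, 50                       # sample rate and fundamental (Hz)
t = np.arange(0, 0.2, 1 / fs)
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * f0 * t)
v += 23 * np.sin(2 * np.pi * 5 * f0 * t)          # 10 % fifth harmonic
v[(t >= 0.08) & (t < 0.14)] *= 0.6                # a 60 % voltage dip

# Harmonic peak amplitudes over an integer number of cycles (before the dip)
n = int(fs / f0) * 4                              # four cycles
spec = np.abs(np.fft.rfft(v[:n])) * 2 / n
freqs = np.fft.rfftfreq(n, 1 / fs)
print("fundamental %.1f V peak, 5th harmonic %.1f V peak"
      % (spec[freqs == f0][0], spec[freqs == 5 * f0][0]))

# A one-cycle sliding RMS flags dips (<90 %) and swells (>110 %) of 230 V
win = int(fs / f0)
rms = np.sqrt(np.convolve(v**2, np.ones(win) / win, mode="valid"))
print("dip detected:", bool((rms < 0.9 * 230).any()))
```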

  14. Competing Values in Software Process Improvement

    DEFF Research Database (Denmark)

    Mûller, Sune Dueholm; Nielsen, Peter Axel

    2013-01-01

    Purpose The purpose of the article is to investigate the impact of organizational culture on software process improvement (SPI). Is cultural congruence between an organization and an adopted process model required? How can the level of congruence between an organizational culture and the values and assumptions underlying an adopted process model be assessed? How can cultural incongruence be managed to facilitate the success of software process improvement? Design/methodology/approach The competing values framework and its associated assessment instrument are used in a case study to establish...... Originality/value The proposed culture management process, including the text analysis technique, is a cost-efficient approach to analyzing and managing cultural challenges during SPI in a specific company. The process provides understanding and guidance in dealing with the specific challenges faced by software companies during...

  15. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam-assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board; the product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  16. The evolution of CACSD tools-a software engineering perspective

    DEFF Research Database (Denmark)

    Ravn, Ole; Szymkat, Maciej

    1992-01-01

    The earlier evolution of computer-aided control system design (CACSD) tools is discussed from a software engineering perspective. A model of the design process is presented as the basis for principles and requirements of future CACSD tools. Combinability, interfacing in memory, and an open...... workspace are seen as important concepts in CACSD. Some points are made about the problem of buy or make when new software is required, and the idea of buy and make is put forward. Emphasis is put on the time perspective and the life cycle of the software...

  17. Managing Cultural Variation in Software Process Improvement

    DEFF Research Database (Denmark)

    Kræmmergaard, Pernille; Müller, Sune Dueholm; Mathiassen, Lars

    CMMI level 2 as planned, ASY struggled to implement even modest improvements. To explain these differences, we analyzed the underlying organizational culture within ISY and ASY using two different methods for subculture assessment. The study demonstrates how variations in culture across software...

  18. Improving Hardware Reusability: Software Defined Hardware

    Science.gov (United States)

    2017-03-01

    performance improvements over software, specialization is likely the future of hardware design. This trend will manifest in an increased demand for chip ...design methodologies is critical to meeting the incoming demand for chip diversity.

  19. Software project management tools in global software development: a systematic mapping study.

    Science.gov (United States)

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD), which is a growing trend in the software industry, is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in the literature that provide GSD project managers with support, and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  20. A Prototype for the Support of Integrated Software Process Development and Improvement

    Science.gov (United States)

    Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian

    An efficient software development process is one of the key success factors for quality software. Not only the appropriate establishment but also the continuous improvement of integrated project management and of the software development process can result in efficiency. This paper hence proposes a software process maintenance framework which consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.

  1. HANSIS software tool for the automated analysis of HOLZ lines

    Energy Technology Data Exchange (ETDEWEB)

    Holec, D., E-mail: david.holec@unileoben.ac.at [Department of Materials Science and Metallurgy, University of Cambridge, Pembroke Street, Cambridge CB2 3QZ (United Kingdom); Sridhara Rao, D.V.; Humphreys, C.J. [Department of Materials Science and Metallurgy, University of Cambridge, Pembroke Street, Cambridge CB2 3QZ (United Kingdom)

    2009-06-15

    A software tool, named HANSIS (HOLZ analysis), has been developed for the automated analysis of higher-order Laue zone (HOLZ) lines in convergent beam electron diffraction (CBED) patterns. With this tool, the angles and distances between the HOLZ intersections can be measured and the data can be presented graphically with a user-friendly interface. It is capable of simultaneous analysis of several HOLZ patterns and thus provides a tool for systematic studies of CBED patterns.
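    The geometric core of such an analysis, intersecting fitted lines and measuring the angles and distances between the intersections, is easy to sketch. The fragment below is a generic illustration (lines as point-plus-direction pairs), not HANSIS code.

```python
# Sketch: intersections of 2D lines and angles/distances between them.
import math

def intersect(p1, d1, p2, d2):
    """Intersection of lines p + t*d; returns None if (nearly) parallel."""
    cross = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(cross) < 1e-12:
        return None
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / cross
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

def angle_deg(d1, d2):
    """Angle between two line directions, in degrees."""
    dot = d1[0] * d2[0] + d1[1] * d2[1]
    cosang = dot / (math.hypot(*d1) * math.hypot(*d2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cosang))))

a = intersect((0, 0), (1, 1), (0, 2), (1, -1))   # two crossing "HOLZ lines"
b = intersect((0, 0), (1, 1), (3, 0), (0, 1))
print("intersections:", a, b)
print("angle: %.1f deg" % angle_deg((1, 1), (1, -1)))
print("distance: %.3f" % math.dist(a, b))
```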

  2. Navigating freely-available software tools for metabolomics analysis.

    Science.gov (United States)

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas, as well as in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. This review aims to compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks, e.g. peak picking.

  3. Software as a tool for controlling EMI/EMC

    Science.gov (United States)

    Boone, T. G.

    Traditional methods of controlling electromagnetic interference (EMI) typically deal with hardware based solutions such as grounding, bonding, shielding, filtering, and equipment placement. With the evolution of microprocessor and computer controlled systems, system flexibility and effectiveness increase up to a point where ambiguities reduce effectiveness and introduce the potential for a new form of EMI. Software, however, can become a tool for system designers, E3 and software engineers to use in controlling EMI and managing system EMC.

  4. Meta-tools for software development and knowledge acquisition

    Science.gov (United States)

    Eriksson, Henrik; Musen, Mark A.

    1992-01-01

    The effectiveness of tools that provide support for software development is highly dependent on the match between the tools and their task. Knowledge-acquisition (KA) tools constitute a class of development tools targeted at knowledge-based systems. Generally, KA tools that are custom-tailored for particular application domains are more effective than general KA tools that cover a large class of domains. The high cost of custom-tailoring KA tools manually has encouraged researchers to develop meta-tools for KA tools. Current research issues in meta-tools for knowledge acquisition are the specification styles, or meta-views, used for target KA tools, and the relationships between the specification entered in the meta-tool and other specifications for the target program under development. We examine different types of meta-views and meta-tools. Our current project is to provide meta-tools that produce KA tools from multiple specification sources--for instance, from a task analysis of the target application.

  5. Marlin and LCCD—Software tools for the ILC

    Science.gov (United States)

    Gaede, F.

    2006-04-01

    The next big project proposed in particle physics is the International Linear Collider (ILC), an electron-positron collider with an energy reach of around 1 TeV. The ongoing optimization and development of a detector for the ILC is only possible through the extensive use of sophisticated simulation software. In this paper we give a brief review of the software tools that are available in the currently ongoing three international detector concept studies and present two new software packages that have been developed in the context of the Large Detector Concept (LDC) study. The first is a C++ application framework that provides a platform for the distributed development of reconstruction and analysis software, and the second is a conditions data toolkit. The interoperability with other software packages is discussed.

  6. Management of Astronomical Software Projects with Open Source Tools

    Science.gov (United States)

    Briegel, F.; Bertram, T.; Berwein, J.; Kittmann, F.

    2010-12-01

    In this paper we offer an innovative approach to managing the software development process with free open source tools: for building and automated testing, a system that automates the compile/test cycle on a variety of platforms to validate code changes, using virtualization to compile in parallel on various operating system platforms; version control and change management; an enhanced wiki and issue tracking system for online documentation and reporting; and groupware tools such as blog, discussion and calendar. Initially, starting with the Linc-Nirvana instrument, a new project and configuration management tool for developing astronomical software was sought. After evaluating various systems of this kind, we are satisfied with the selection we are using now. Following the lead of Linc-Nirvana, most of the other software projects at the MPIA are using it now.

  7. Software tool for data mining and its applications

    Science.gov (United States)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelope, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough sets, support vector machines), and computational intelligence (neural networks, genetic algorithms, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rules, fuzzy rules, neural networks, genetic algorithms, hyperenvelope, support vector machines, and visualization. The principle and knowledge representation of some function models of data mining are described. The software tool is implemented in Visual C++ under Windows 2000. Nonmonotony in data mining is dealt with by concept hierarchy and layered mining. The software tool has been satisfactorily applied in the prediction of regularities of the formation of ternary intermetallic compounds in alloy systems and in the diagnosis of brain glioma.

  8. Systems and software variability management concepts, tools and experiences

    CERN Document Server

    Capilla, Rafael; Kang, Kyo-Chul

    2013-01-01

    The success of product line engineering techniques in the last 15 years has popularized the use of software variability as a key modeling approach for describing the commonality and variability of systems at all stages of the software lifecycle. Software product lines enable a family of products to share a common core platform, while allowing for product specific functionality being built on top of the platform. Many companies have exploited the concept of software product lines to increase the resources that focus on highly differentiating functionality and thus improve their competitiveness

  9. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Full Text Available Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. Changing a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of the artifacts that form a bridge between software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the “big-picture-model”. Flow diagram analysis helped to avoid a large number of defects whose repair cost, in extreme cases, could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the “big picture model” improves control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.
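    The "big-picture-model" idea, deriving a message flow graph from formalized requirements and checking it for defects, can be sketched briefly. The requirement tuples, node names, and defect checks below are invented illustrations, not the article's actual formalism.

```python
# Sketch: build a message flow graph from (sender, message, receiver)
# tuples and flag two simple defect classes.
from collections import defaultdict

requirements = [
    ("Controller", "START",  "PumpUnit"),
    ("PumpUnit",   "STATUS", "Controller"),
    ("Controller", "ALARM",  "Panel"),
    ("Sensor",     "LEVEL",  "Logger"),     # Logger is never declared
]
declared = {"Controller", "PumpUnit", "Panel", "Sensor"}

graph = defaultdict(list)
for sender, msg, receiver in requirements:
    graph[sender].append((msg, receiver))
    if receiver not in declared:
        print(f"defect: {sender} sends {msg} to undeclared node {receiver}")

# Reachability from the main controller exposes isolated subsystems
seen, stack = set(), ["Controller"]
while stack:
    node = stack.pop()
    if node not in seen:
        seen.add(node)
        stack.extend(r for _, r in graph.get(node, []))
print("unreachable from Controller:", declared - seen)
```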

  10. PAnalyzer: A software tool for protein inference in shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Prieto Gorka

    2012-11-01

    Full Text Available Abstract Background Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently data independent acquisition (DIA) approaches have emerged as an alternative to the traditional data dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used to name one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However the software available at the moment does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore the inspection, comparison and report of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. Results In this paper we present PAnalyzer, a software tool focused on the protein inference process of shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are considered as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. Conclusions We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analysis by ProteinLynx Global Server and technical replicates
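    The grouping problem the record describes, proteins whose peptide evidence overlaps or coincides, can be illustrated compactly. The sketch below is a hedged toy, not PAnalyzer's actual algorithm: proteins with identical peptide sets collapse into one ambiguity group, while a protein with unique evidence is reported as conclusive.

```python
# Sketch: group proteins by shared peptide evidence.
peptides_of = {
    "P1": {"pepA", "pepB"},      # has unique evidence (pepA)
    "P2": {"pepB"},              # evidence identical to P3's
    "P3": {"pepB"},
}

# Proteins with identical peptide sets are indistinguishable
groups = {}
for protein, peps in peptides_of.items():
    groups.setdefault(frozenset(peps), []).append(protein)

for peps, members in groups.items():
    contained = any(peps < frozenset(other)          # strictly contained
                    for other in peptides_of.values()
                    if frozenset(other) != peps)
    category = ("conclusive" if len(members) == 1 and not contained
                else "ambiguity group")
    print(sorted(members), "->", category, sorted(peps))
```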

  11. Organizational Change Perspectives on Software Process Improvement

    DEFF Research Database (Denmark)

    Müller, Sune Dueholm; Mathiassen, Lars; Balshøj, Hans Henrik

    Many software organizations have engaged in Software Process Improvement (SPI) and experienced the challenges related to managing such complex organizational change efforts. As a result, there is an increasing body of research investigating change management in SPI. To provide an overview of what...... we know and don't know about SPI as organizational change, this paper addresses the following question: What are the dominant perspectives on SPI as organizational change in the literature and how is this knowledge presented and published? All journals on the AIS ranking list were screened...... audience (practitioner versus academic), geographical origin (Scandinavia, the Americas, Europe, or the Asia-Pacific), and publication level (high versus low ranked journal). The review demonstrates that the literature on SPI as organizational change is firmly grounded in both theory and practice...

  12. Improving Software Engineering on NASA Projects

    Science.gov (United States)

    Crumbley, Tim; Kelly, John C.

    2010-01-01

    Software Engineering Initiative: reduces the risk of software failure and increases mission safety; yields more predictable software cost estimates and delivery schedules; makes NASA a smarter buyer of contracted-out software; finds and removes more defects earlier; reduces duplication of effort between projects; and increases the ability to meet the challenges of evolving software technology.

  13. The Systems Biology Research Tool: evolvable open-source software.

    Science.gov (United States)

    Wright, Jeremiah; Wagner, Andreas

    2008-06-29

    Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.

  14. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Full Text Available Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.
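
    The process plug-in mechanism both records describe is essentially a registry the core tool dispatches into, so new analyses require no changes to the core. A hypothetical sketch of such a contract follows (the real SBRT is implemented in Java; every name below is illustrative, not its actual API):

```python
# Hypothetical sketch of a process plug-in registry in the spirit of the
# SBRT's plug-in architecture (names and analyses are invented).

class ProcessPlugin:
    """Contract every plug-in must satisfy: a name and a run() entry point."""
    name = "base"
    def run(self, network):
        raise NotImplementedError

REGISTRY = {}

def register(plugin_cls):
    REGISTRY[plugin_cls.name] = plugin_cls()
    return plugin_cls

@register
class StoichiometricConsistency(ProcessPlugin):
    name = "stoichiometric_consistency"
    def run(self, network):
        # Placeholder analysis: report reactions whose coefficients do not balance.
        return [r for r in network["reactions"] if sum(r["coeffs"]) != 0]

# The core dispatches by name, so new analyses plug in without core changes.
network = {"reactions": [{"coeffs": [-1, 1]}, {"coeffs": [-1, 2]}]}
print(REGISTRY["stoichiometric_consistency"].run(network))
```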

  15. SafetyAnalyst : software tools for safety management of specific highway sites

    Science.gov (United States)

    2010-07-01

    SafetyAnalyst provides a set of software tools for use by state and local highway agencies for highway safety management. SafetyAnalyst can be used by highway agencies to improve their programming of site-specific highway safety improvements. SafetyA...

  16. Understanding Computation of Impulse Response in Microwave Software Tools

    Science.gov (United States)

    Potrebic, Milka M.; Tosic, Dejan V.; Pejovic, Predrag V.

    2010-01-01

    In modern microwave engineering curricula, the introduction of the many new topics in microwave industrial development, or of software tools for design and simulation, sometimes results in students having an inadequate understanding of the fundamental theory. The terminology for and the explanation of algorithms for calculating impulse response in…
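
    Although the abstract is truncated here, the computation it refers to is commonly implemented as an inverse FFT of a sampled frequency response. A minimal sketch under that assumption, with an invented one-pole transfer function standing in for measured S-parameter data:

```python
# Sketch: impulse response from a sampled frequency response via inverse FFT.
import numpy as np

f = np.linspace(0, 10e9, 512)        # frequency grid, DC to 10 GHz
H = 1.0 / (1.0 + 1j * f / 2e9)       # toy one-pole transfer function (placeholder)
h = np.fft.irfft(H)                  # real-valued impulse response samples
dt = 1.0 / (2 * f[-1])               # time step implied by the bandwidth
print(f"peak of h(t) at t = {np.argmax(np.abs(h)) * dt:.3e} s")
```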

  17. Proposing a Mathematical Software Tool in Physics Secondary Education

    Science.gov (United States)

    Baltzis, Konstantinos B.

    2009-01-01

    MathCad® is a very popular software tool for mathematical and statistical analysis in science and engineering. Its low cost, ease of use, extensive function library, and worksheet-like user interface distinguish it among other commercial packages. Its features are also well suited to educational process. The use of natural mathematical notation…

  18. Software Process Improvement as Organizational Change

    DEFF Research Database (Denmark)

    Müller, Sune Dueholm; Mathiassen, Lars; Balshøj, Hans Henrik

    2010-01-01

    Software Process Improvement (SPI) typically involves rather complex organizational changes. Acknowledging that managers can approach these changes in quite different ways, this paper addresses the following question: What perspectives does the research literature offer on SPI as organizational change and how is this knowledge presented and published? To answer this question, we analyzed SPI research publications with a main emphasis on organizational change using Gareth Morgan’s organizational metaphors (1996) as analytical lenses. In addition, we characterized each article along the following...... of these perspectives. Overall, the paper offers research directions and management lessons, and it provides a roadmap to help identify insights and specific articles related to SPI as organizational change.

  19. PT-SAFE: a software tool for development and annunciation of medical audible alarms.

    Science.gov (United States)

    Bennett, Christopher L; McNeer, Richard R

    2012-03-01

    Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving the audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.
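
    The annunciator-to-vital-sign mapping described above can be pictured as a small rule table evaluated against each monitor sample. The sketch below is illustrative only; the thresholds, signal names and wave files are invented, and PT-SAFE itself exposes this logic through MATLAB rather than Python:

```python
# Illustrative mapping of vital-sign data to audible annunciators, in the
# spirit of PT-SAFE's design (thresholds, keys and file names are invented).

ALARM_RULES = [
    # (vital-sign key, predicate, wave file to annunciate)
    ("spo2",  lambda v: v < 90,  "desat_alarm.wav"),
    ("hr",    lambda v: v > 140, "tachy_alarm.wav"),
    ("etco2", lambda v: v < 20,  "hypovent_alarm.wav"),
]

def evaluate(vitals):
    """Return the wave files that should be annunciated for this sample."""
    return [wav for key, pred, wav in ALARM_RULES
            if key in vitals and pred(vitals[key])]

print(evaluate({"spo2": 88, "hr": 150, "etco2": 35}))
# -> ['desat_alarm.wav', 'tachy_alarm.wav']
```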

  20. PerfAndPubTools: Tools for Software Performance Analysis and Publishing of Results

    Directory of Open Access Journals (Sweden)

    Nuno Fachada

    2016-05-01

    Full Text Available PerfAndPubTools consists of a set of MATLAB/Octave functions for post-processing and analyzing software performance benchmark data and for producing associated publication-quality materials.

  1. Collaborative process improvement with examples from the software world

    CERN Document Server

    Yeakley, C

    2007-01-01

    Collaborative Process Improvement specifically addresses software companies that are interested in addressing quality in human terms. Using Collaborative Process Improvement techniques builds effective processes to deliver quality products; it helps readers relate to what quality means to the end-user and provides the essential tools and methods to integrate the face of the customer into the organization's day-to-day processes. It comes complete with real-world examples that are practical and understandable to professionals in every role of a company.

  2. Recent improvements to software used for optimization of SRF linacs

    Energy Technology Data Exchange (ETDEWEB)

    Powers, Tom J. [JLAB

    2014-12-01

    This work describes a software tool that allows one to vary parameters and understand their effects on the optimized cost of construction plus 10 years of operation of an SRF linac, where operating costs include the cost of electrical utilities but not labor or other costs. The program includes estimates for the associated cryogenic facility and controls hardware. The software interface provides the ability to vary the cost of the different aspects of the machine as well as to change the cryomodule and cavity types. Additionally, this work describes the recent improvements to the software that allow one to estimate the costs of energy-recovery based linacs and to enter arbitrary values of the low-field Q0 and Q0 slope. The initial goal when developing the software was to convert a spreadsheet format to a graphical interface and to allow the ability to sweep different parameter sets. The tool also allows one to compare the costs of the different facets of the machine design and operations so as to better understand tradeoffs. An example of how it was used to independently investigate cost-optimization tradeoffs for the LCLS-II linac is also presented.
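
    The construction-versus-operations trade-off the tool sweeps can be illustrated with a toy model: higher gradient means less installed cavity length but more RF heat for the cryoplant to remove. Every coefficient below is an invented placeholder, not the actual JLab cost model:

```python
# Toy sketch of the trade-off the tool sweeps (all coefficients are placeholders).
import numpy as np

gradients = np.linspace(10, 25, 151)            # accelerating gradient (MV/m)
active_len = 4000.0 / gradients                 # metres of cavity for a 4 GV linac
construction = 0.5 * active_len                 # M$ per metre of linac (placeholder)
dissipation = active_len * gradients**2 / 1e3   # heat load grows as gradient^2 (placeholder)
operations = 10 * 0.17 * dissipation            # 10 years of utility costs (placeholder)
total = construction + operations

best = int(np.argmin(total))
print(f"optimum gradient ~ {gradients[best]:.1f} MV/m, total cost ~ {total[best]:.0f} M$")
```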

  3. Software Process Improvement: Blueprints versus Recipes

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2003-01-01

    Viewing software processes as blueprints emphasizes that design is separate from use, and thus that software process designers and users are independent. In the approach presented here, software processes are viewed as recipes; developers individually and collectively design their own software...

  4. Software reliability: Failures, consequences and improvement ...

    African Journals Online (AJOL)

    Software reliability is one of a number of aspects of computer software which can be taken into consideration when determining the quality of the software. Software reliability is an important factor affecting system performance. It differs from hardware reliability in that it reflects the design perfection rather than manufacturing ...

  5. software reliability: failures, consequences and improvement

    African Journals Online (AJOL)

    BARTH EKWUEME

    2009-07-16

    Software reliability is one of a number of aspects of computer software which can be taken into consideration when determining the quality of the software. Software reliability is an important factor affecting system performance. It differs from hardware reliability in that it reflects the design perfection rather ...

  6. Software Tool Integrating Data Flow Diagrams and Petri Nets

    Science.gov (United States)

    Thronesbery, Carroll; Tavana, Madjid

    2010-01-01

    Data Flow Diagram - Petri Net (DFPN) is a software tool for analyzing other software to be developed. The full name of this program reflects its design, which combines the benefit of data-flow diagrams (which are typically favored by software analysts) with the power and precision of Petri-net models, without requiring specialized Petri-net training. (A Petri net is a particular type of directed graph, a description of which would exceed the scope of this article.) DFPN assists a software analyst in drawing and specifying a data-flow diagram, then translates the diagram into a Petri net, then enables graphical tracing of execution paths through the Petri net for verification, by the end user, of the properties of the software to be developed. In comparison with prior means of verifying the properties of software to be developed, DFPN makes verification by the end user more nearly certain, thereby making it easier to identify and correct misconceptions earlier in the development process, when correction is less expensive. After the verification by the end user, DFPN generates a printable system specification in the form of descriptions of processes and data.
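
    The diagram-to-net translation DFPN performs can be sketched with the textbook mapping in which each process becomes a transition and each data flow becomes a place between its producer and consumer. The rules and names below are illustrative, not DFPN's actual translation:

```python
# Sketch of the DFD-to-Petri-net idea: processes -> transitions,
# data flows -> places connecting producer to consumer (illustrative only).

def dfd_to_petri(flows):
    """flows: list of (source_process, data_name, target_process) triples."""
    places, arcs = set(), []
    for src, data, dst in flows:
        places.add(data)
        arcs.append((src, data))   # transition src puts a token in place data
        arcs.append((data, dst))   # place data enables transition dst
    transitions = {src for src, _, _ in flows} | {dst for _, _, dst in flows}
    return places, transitions, arcs

flows = [("validate_order", "valid_order", "ship_order"),
         ("ship_order", "shipment_record", "bill_customer")]
print(dfd_to_petri(flows))
```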

  7. COSTMODL: An automated software development cost estimation tool

    Science.gov (United States)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sector. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
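
    Among the model families COSTMODL incorporates is COCOMO, whose Basic form is public and easy to state. The sketch below uses the published organic-mode coefficients; it shows the shape of the calculation, not COSTMODL's recalibrated version:

```python
# Basic COCOMO, one of the model families COSTMODL incorporates.
# Coefficients are the published organic-mode values (Boehm, 1981).

def basic_cocomo_organic(ksloc):
    effort = 2.4 * ksloc ** 1.05      # person-months
    schedule = 2.5 * effort ** 0.38   # calendar months
    return effort, schedule

effort, months = basic_cocomo_organic(32)   # a hypothetical 32 KSLOC project
print(f"effort ~ {effort:.0f} person-months over ~ {months:.1f} months")
```

    Recalibration, as the abstract describes, amounts to refitting the two coefficients and two exponents to an organization's own project history.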

  8. How Using Dedicated Software Can Improve RECIST Readings

    Directory of Open Access Journals (Sweden)

    Amandine René

    2014-09-01

    Full Text Available Decision support tools exist for oncologic follow-up. Their main interest is to help physicians improve their oncologic readings, but this theoretical benefit has to be quantified by concrete evidence. The purpose of the study was to evaluate and quantify the impact of using dedicated software on RECIST readings. A comparison was made between RECIST readings without a dedicated application vs. readings using dedicated software (Myrian® XL-Onco, Intrasense, France) with specific functionalities such as 3D elastic target matching and automated calculation of tumoral response. A retrospective database of 40 patients who underwent CT scan follow-up was used (thoracic/abdominal lesions). The reading panel was composed of two radiologists. Reading times, intra/inter-operator reproducibility of measurements and RECIST response misclassifications were evaluated. On average, reading time was reduced by 49.7% using dedicated software. A more important saving was observed for lung lesion evaluations (63.4% vs. 36.1% for hepatic targets). Inter- and intra-operator reproducibility of measurements was excellent for both reading methods. Using dedicated software prevented misclassifications on 10 readings out of 120 (eight due to calculation errors). The use of dedicated oncology software optimises RECIST evaluation by decreasing reading times significantly and avoiding response misclassifications due to manual calculation errors or approximations.
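
    The manual-calculation errors the study attributes to unaided readings arise when applying the RECIST thresholds to sums of target-lesion diameters. The sketch below encodes the standard RECIST 1.1 rules in simplified form (for instance, the full CR criterion also involves lymph-node short axes); it reproduces the public criteria, not the Myrian software's internals:

```python
# Simplified RECIST 1.1 response from sums of target-lesion diameters (mm).

def recist_response(baseline_sum, nadir_sum, current_sum):
    if current_sum == 0:
        return "CR"   # complete response: disappearance of all target lesions
    if current_sum >= 1.2 * nadir_sum and current_sum - nadir_sum >= 5:
        return "PD"   # progression: >= 20% and >= 5 mm increase over the nadir
    if current_sum <= 0.7 * baseline_sum:
        return "PR"   # partial response: >= 30% decrease from baseline
    return "SD"       # otherwise stable disease

print(recist_response(baseline_sum=100, nadir_sum=60, current_sum=75))  # -> 'PD'
```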

  9. Software Tools to Support the Assessment of System Health

    Science.gov (United States)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDiMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDiMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard, and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDiMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints, identifying one or more sensor suites that maximize diagnostic performance while meeting the other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of

  10. Comparison of quality control software tools for diffusion tensor imaging.

    Science.gov (United States)

    Liu, Bilan; Zhu, Tong; Zhong, Jianhui

    2015-04-01

    Image quality of diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed and are widely used, each with different tradeoffs, there is still no general agreement on an image quality control routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each of the QC tools will help users make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools: DTI studio (Johns Hopkins University), DTIprep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institutes of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool is discussed, and the ways in which particular techniques affect the output of each of the tools are analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and for integration with other popular image processing tools are also discussed.

  11. Software Tools for Electrical Quality Assurance in the LHC

    CERN Document Server

    Bednarek, Mateusz

    2011-01-01

    There are over 1600 superconducting magnet circuits in the LHC machine. Many of them consist of a large number of components electrically connected in series. This enhances the sensitivity of the whole circuit to electrical faults in individual components. Furthermore, circuits are equipped with a large number of instrumentation wires, which are exposed to accidental damage or swapping. In order to ensure safe operation, an Electrical Quality Assurance (ELQA) campaign is needed after each thermal cycle. Due to the complexity of the circuits, as well as their wide geographical distribution (a tunnel of 27 km circumference divided into 8 sectors), suitable software and hardware platforms had to be developed. The software combines an Oracle database, LabView data acquisition applications and PHP-based web follow-up tools. This paper describes the software used for the ELQA of the LHC.

  12. Software engineering capability for Ada (GRASP/Ada Tool)

    Science.gov (United States)

    Cross, James H., II

    1995-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. A new Motif compliant graphical user interface has been developed for the GRASP/Ada prototype.

  13. Proposing a Mathematical Software Tool in Physics Secondary Education

    Directory of Open Access Journals (Sweden)

    Konstantinos B. Baltzis

    2009-03-01

    Full Text Available MathCad® is a very popular software tool for mathematical and statistical analysis in science and engineering. Its low cost, ease of use, extensive function library, and worksheet–like user interface distinguish it among other commercial packages. Its features are also well suited to educational process. The use of natural mathematical notation and built–in measurement units are its two major advantages in teaching and learning. In this paper, its complementary use in the upper secondary physics education in Greece is explored. In order to demonstrate its application in the teaching process, a set of representative examples are presented. The main features and advantages of the software are also pointed out. The paper aims to present the benefits of the application of mathematical information technology tools in secondary physics education. In this effort, MathCad® is probably the most promising solution.

  14. Software Tool for Real-Time Power Quality Analysis

    Directory of Open Access Journals (Sweden)

    CZIKER, A. C.

    2013-11-01

    Full Text Available A software tool dedicated to the analysis of power signals containing harmonic and interharmonic components, unbalance, voltage dips and voltage swells is presented. The software tool is a virtual instrument, which uses innovative algorithms based on time- and frequency-domain analysis to process power signals. In order to detect temporary disturbances, edge detection is proposed, whereas for the harmonic analysis Gaussian filter banks are implemented. Because a signal recovery algorithm is applied, the harmonic analysis can be performed even if voltage dips or swells appear. The virtual instrument input data can be recorded signals or online signals, the latter acquired through a data acquisition board. The virtual instrument was tested using both virtually created signals and real signals from measurements performed in distribution networks. The paper contains a numeric example based on a synthetic digital signal and an analysis made in real time.
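
    The dip/swell detection the paper bases on edge detection can be approximated with a threshold on a sliding RMS envelope, as in the simplified sketch below. The signal is synthetic and the 90%-of-nominal threshold follows common power-quality practice; this is not the paper's actual algorithm:

```python
# Simplified stand-in for dip detection: flag a voltage dip when a sliding
# one-cycle RMS falls below 90% of nominal (synthetic test signal).
import numpy as np

fs, f0 = 10_000, 50                          # sample rate (Hz), mains frequency (Hz)
t = np.arange(0, 0.4, 1 / fs)
v = np.sin(2 * np.pi * f0 * t)
v[(t > 0.15) & (t < 0.25)] *= 0.6            # synthetic 40% dip lasting 100 ms

cycle = fs // f0                             # samples per mains cycle
rms = np.sqrt(np.convolve(v**2, np.ones(cycle) / cycle, mode="same"))
dip = rms < 0.9 / np.sqrt(2)                 # 90% of the nominal RMS of a unit sine
dip[:cycle] = dip[-cycle:] = False           # discard window edge effects
print(f"dip detected from {t[dip][0]:.3f} s to {t[dip][-1]:.3f} s")
```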

  15. Thermography based prescreening software tool for veterinary clinics

    Science.gov (United States)

    Dahal, Rohini; Umbaugh, Scott E.; Mishra, Deependra; Lama, Norsang; Alvandipour, Mehrdad; Umbaugh, David; Marino, Dominic J.; Sackman, Joseph

    2017-05-01

    Under development is a clinical software tool for use in veterinary clinics as a prescreening tool for three pathologies: anterior cruciate ligament (ACL) disease, bone cancer and feline hyperthyroidism. Currently, veterinary clinical practice uses several imaging techniques including radiology, computed tomography (CT), and magnetic resonance imaging (MRI). However, the harmful radiation involved in imaging, expensive equipment setup, excessive time consumption and the need for a cooperative patient during imaging are major drawbacks of these techniques. In veterinary procedures, it is very difficult for animals to remain still for the time periods necessary for standard imaging without resorting to sedation, which creates another set of complexities. Therefore, clinical application software integrated with a thermal imaging system, together with algorithms with high sensitivity and specificity for these pathologies, can address the major drawbacks of the existing imaging techniques. A graphical user interface (GUI) has been created to allow ease of use for the clinical technician. The technician inputs an image, enters patient information, and selects the camera view associated with the image and the pathology to be diagnosed. The software classifies the image using an optimized classification algorithm that has been developed through thousands of experiments. Optimal image features are extracted, and the feature vector is then used in conjunction with the stored image database for classification. Classification success rates as high as 88% for bone cancer, 75% for ACL disease and 90% for feline hyperthyroidism have been achieved. The software is currently undergoing preliminary clinical testing.

  16. Leveraging Genomics Software to Improve Proteomics Results

    Energy Technology Data Exchange (ETDEWEB)

    Fodor, I K; Nelson, D O

    2005-09-06

    Rigorous data analysis techniques are essential in quantifying the differential expression of proteins in biological samples of interest. Statistical methods from the microarray literature were applied to the analysis of two-dimensional difference gel electrophoresis (2-D DIGE) proteomics experiments, in the context of technical variability studies involving human plasma. Protein expression measurements were corrected to account for observed intensity-dependent biases within gels, and normalized to mitigate observed gel to gel variations. The methods improved upon the results achieved using the best currently available 2-D DIGE proteomics software. The spot-wise protein variance was reduced by 10% and the number of apparently differentially expressed proteins was reduced by over 50%.
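
    The intensity-dependent bias correction borrowed from the microarray literature typically means removing a trend of log-ratio versus mean log-intensity. The sketch below uses a running median in place of the loess-type fit such analyses often use; the data are simulated and the method is an assumption, not the paper's exact procedure:

```python
# Sketch of intensity-dependent bias correction, using a rolling median
# instead of the loess fit such analyses often use (simulated data).
import numpy as np

def correct_bias(log_a, log_ratio, window=101):
    """Subtract a running-median trend of log-ratio vs. mean log-intensity."""
    order = np.argsort(log_a)
    sorted_ratio = log_ratio[order]
    trend = np.empty_like(log_ratio)
    half = window // 2
    for i in range(len(order)):
        lo, hi = max(0, i - half), min(len(order), i + half + 1)
        trend[order[i]] = np.median(sorted_ratio[lo:hi])
    return log_ratio - trend

rng = np.random.default_rng(0)
a = rng.uniform(8, 16, 2000)                       # mean log intensity per spot
m = 0.1 * (a - 12) + rng.normal(0, 0.2, a.size)    # intensity-dependent bias + noise
print(np.std(m), np.std(correct_bias(a, m)))       # spread shrinks after correction
```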

  17. Enhancing the Understanding of Computer Networking Courses through Software Tools

    OpenAIRE

    Dafalla, Z. I.; Balaji, R. D.

    2015-01-01

    Computer networking is an important specialization in Information and Communication Technologies. However imparting the right knowledge to students can be a challenging task due to the fact that there is not enough time to deliver lengthy labs during normal lecture hours. Augmenting the use of physical machines with software tools help the students to learn beyond the limited lab sessions within the environment of higher Institutions of learning throughout the world. The Institutions focus mo...

  18. Improving system quality through software evaluation.

    Science.gov (United States)

    McDaniel, James G

    2002-05-01

    The role of evaluation is examined with respect to quality of software in healthcare. Of particular note is the failure of the Therac-25 radiation therapy machine. This example provides evidence of several types of defect which could have been detected and corrected using appropriate evaluation procedures. The field of software engineering has developed metrics and guidelines to assist in software evaluation but this example indicates that software evaluation must be extended beyond the formally defined interfaces of the software to its real-life operating context.

  19. Improvements to the APBS biomolecular solvation software suite.

    Science.gov (United States)

    Jurrus, Elizabeth; Engel, Dave; Star, Keith; Monson, Kyle; Brandi, Juan; Felberg, Lisa E; Brookes, David H; Wilson, Leighton; Chen, Jiahui; Liles, Karina; Chun, Minju; Li, Peter; Gohara, David W; Dolinsky, Todd; Konecny, Robert; Koes, David R; Nielsen, Jens Erik; Head-Gordon, Teresa; Geng, Weihua; Krasny, Robert; Wei, Guo-Wei; Holst, Michael J; McCammon, J Andrew; Baker, Nathan A

    2018-01-01

    The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has had impact in the study of a broad range of chemical, biological, and biomedical applications. APBS addresses the three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this article, we discuss the models and capabilities that have recently been implemented within the APBS software package, including a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics.

  20. Current practice in software development for computational neuroscience and how to improve it.

    Science.gov (United States)

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  1. Current practice in software development for computational neuroscience and how to improve it.

    Directory of Open Access Journals (Sweden)

    Marc-Oliver Gewaltig

    2014-01-01

    Full Text Available Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  2. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    Science.gov (United States)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo

  3. "SABER": A new software tool for radiotherapy treatment plan evaluation.

    Science.gov (United States)

    Zhao, Bo; Joiner, Michael C; Orton, Colin G; Burmeister, Jay

    2010-11-01

    Both spatial and biological information are necessary in order to perform true optimization of a treatment plan and for predicting clinical outcome. The goal of this work is to develop an enhanced treatment plan evaluation tool which incorporates biological parameters and retains spatial dose information. A software system is developed which provides biological plan evaluation with a novel combination of features. It incorporates hyper-radiosensitivity using the induced-repair model and applies the new concept of dose convolution filter (DCF) to simulate dose wash-out effects due to cell migration, bystander effect, and/or tissue motion during treatment. Further, the concept of spatial DVH (sDVH) is introduced to evaluate and potentially optimize the spatial dose distribution in the target volume. Finally, generalized equivalent uniform dose is derived from both the physical dose distribution (gEUD) and the distribution of equivalent dose in 2 Gy fractions (gEUD2) and the software provides three separate models for calculation of tumor control probability (TCP), normal tissue complication probability (NTCP), and probability of uncomplicated tumor control (P+). TCP, NTCP, and P+ are provided as a function of prescribed dose and multivariable TCP, NTCP, and P+ plots are provided to illustrate the dependence on individual parameters used to calculate these quantities. Ten plans from two clinical treatment sites are selected to test the three calculation models provided by this software. By retaining both spatial and biological information about the dose distribution, the software is able to distinguish features of radiotherapy treatment plans not discernible using commercial systems. Plans that have similar DVHs may have different spatial and biological characteristics and the application of novel tools such as sDVH and DCF within the software may substantially change the apparent plan quality or predicted plan metrics such as TCP and NTCP. For the cases examined
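
    The gEUD quantity the software computes has a standard closed form over DVH bins, gEUD = (sum_i v_i * d_i^a)^(1/a) with fractional volumes v_i. A minimal sketch (the formula is standard; the DVH bins and the values of a are illustrative):

```python
# Generalized equivalent uniform dose from a differential DVH.
import numpy as np

def geud(doses_gy, volumes, a):
    v = np.asarray(volumes, dtype=float)
    v /= v.sum()                                  # normalize to fractional volume
    d = np.asarray(doses_gy, dtype=float)
    return float((v * d ** a).sum() ** (1.0 / a))

doses = [60, 62, 64, 66]                          # Gy per DVH bin
volumes = [0.1, 0.4, 0.4, 0.1]
print(geud(doses, volumes, a=-10))                # a < 0 stresses cold spots (targets)
print(geud(doses, volumes, a=8))                  # a >> 1 stresses hot spots (normal tissue)
```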

  4. Consolidating software tools for DNA microarray design and manufacturing.

    Science.gov (United States)

    Atlas, M; Hundewale, N; Perelygina, L; Zelikovsky, A

    2004-01-01

    As the human genome project progresses and some microbial and eukaryotic genomes are recognized, a novel technology, DNA microarray (also called gene chip, biochip, gene microarray, and DNA chip) technology, has attracted increasing number of biologists, bioengineers and computer scientists recently. This technology promises to monitor the whole genome at once, so that researchers can study the whole genome on the global level and have a better picture of the expressions among millions of genes simultaneously. Today, it is widely used in many fields - disease diagnosis, gene classification, gene regulatory network, and drug discovery. We present a concatenated software solution for the entire DNA array flow exploring all steps of a consolidated software tool. The proposed software tool has been tested on Herpes B virus as well as simulated data. Our experiments show that the genomic data follow the pattern predicted by simulated data although the number of border conflicts (quality of the DNA array design) is several times smaller than for simulated data. We also report a trade-off between the number of border conflicts and the running time for several proposed algorithmic techniques employed in the physical design of DNA arrays.

  5. Managing Change in Software Process Improvement

    DEFF Research Database (Denmark)

    Mathiassen, Lars; Ngwenyama, Ojelanki K.; Aaen, Ivan

    2005-01-01

    When software managers initiate SPI, most are ill prepared for the scale and complexity of the organizational change involved. Although they typically know how to deal with large software projects, few managers have sufficient experience with projects that transform organizations. To succeed...... with SPI, software managers must understand the context, know the organizational elements that are involved, and master the tactics that facilitate successful change....

  6. Evaluation of static analysis tools used to assess software important to nuclear power plant safety

    Directory of Open Access Journals (Sweden)

    Alain Ourghanlian

    2015-03-01

    Full Text Available We describe a comparative analysis of different tools used to assess safety-critical software used in nuclear power plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Electricité de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Currently, new industrial tools based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on one of our shutdown systems software packages is substantially improved. In the first part of this article, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software, and why these characteristics are well adapted for the new analysis tools. In the last part, we present an overview of the results and the limitations of the tools.
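
    Abstract Interpretation, the formal approach these tools share, computes sound over-approximations of program states in an abstract domain such as intervals. A toy flavor of the idea follows; it is in no way representative of PolySpace's engine:

```python
# Toy flavor of Abstract Interpretation over the interval domain.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        # Abstract addition: a sound over-approximation of all concrete sums.
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def may_be_zero(self):
        return self.lo <= 0 <= self.hi

x = Interval(-1, 5)        # the analysis knows only that x is in [-1, 5]
y = x + Interval(1, 1)     # therefore y is in [0, 6]
# A checker can flag a possible division by zero without running the program:
print("possible division by zero" if y.may_be_zero() else "proven safe")
```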

  7. Software tool for horizontal-axis wind turbine simulation

    Energy Technology Data Exchange (ETDEWEB)

    Vitale, A.J. [Instituto Argentino de Oceanografia, Camino La Carrindanga Km. 7, 5 CC 804, B8000FWB Bahia Blanca (Argentina); Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina); Rossi, A.P. [Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina)

    2008-07-15

    The main problem of a wind turbine generator design project is the design of the right blades capable of satisfying the specific energy requirement of an electric system with optimum performance. Once the blade has been designed for optimum operation at a particular rotor angular speed, it is necessary to determine the overall performance of the rotor under the range of wind speed that it will encounter. A software tool that simulates low-power, horizontal-axis wind turbines was developed for this purpose. With this program, the user can calculate the rotor power output for any combination of wind and rotor speeds, with definite blade shape and airfoil characteristics. The software also provides information about distribution of forces along the blade span, for different operational conditions. (author)
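
    The rotor-performance calculation described, power output for any combination of wind and rotor speeds, reduces to P = 0.5 * rho * A * Cp(lambda) * v^3 once a power-coefficient curve Cp(lambda) is known. The sketch below uses an invented toy Cp curve rather than the blade-geometry model the tool implements:

```python
# Rotor power over combinations of wind and rotor speed.
# The Cp curve is an invented toy fit (peak ~0.5 near lambda = 7).
import numpy as np

rho, radius = 1.225, 2.0                     # air density (kg/m^3), rotor radius (m)
area = np.pi * radius**2

def cp(tsr):
    """Toy power-coefficient curve, capped below the Betz limit of 0.593."""
    return np.clip(0.5 * (tsr / 7.0) * np.exp(1.0 - tsr / 7.0), 0.0, 0.593)

def rotor_power(v_wind, omega):
    tsr = omega * radius / v_wind            # tip-speed ratio
    return 0.5 * rho * area * cp(tsr) * v_wind**3

for v in (6.0, 9.0, 12.0):                   # sample wind speeds, fixed rotor speed
    print(f"v = {v:4.1f} m/s -> P = {rotor_power(v, omega=25.0):7.0f} W")
```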

  8. AMIDE: A Free Software Tool for Multimodality Medical Image Analysis

    Directory of Open Access Journals (Sweden)

    Andreas Markus Loening

    2003-07-01

    Full Text Available Amide's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's abilities to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.

  9. AMIDE: a free software tool for multimodality medical image analysis.

    Science.gov (United States)

    Loening, Andreas Markus; Gambhir, Sanjiv Sam

    2003-07-01

    Amide's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's abilities to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.

  10. Northwestern University Schizophrenia Data and Software Tool (NUSDAST

    Directory of Open Access Journals (Sweden)

    Lei eWang

    2013-11-01

    Full Text Available The schizophrenia research community has invested substantial resources on collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging data (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive data (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical data (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic data (20 polymorphisms), collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data, along with the associated meta-data and computational tools, publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research, such as lack of local organization and standard descriptions.

  11. Software process improvement in a research environment

    NARCIS (Netherlands)

    Velden, van der M.J.; Hendriks, P.R.H.; Udink ten Cate, A.J.

    1995-01-01

    Research organizations pay much attention to the quality of their work, but not always to the quality of the software they produce within research projects. This is not a healthy situation since research organizations are becoming more and more dependent on software development. This paper describes

  12. Object-Oriented Software Tools for the Construction of Preconditioners

    Directory of Open Access Journals (Sweden)

    Eva Mossberg

    1997-01-01

    Full Text Available In recent years, there has been considerable progress concerning preconditioned iterative methods for large and sparse systems of equations arising from the discretization of differential equations. Such methods are particularly attractive in the context of high-performance (parallel) computers. However, the implementation of a preconditioner is a nontrivial task. The focus of the present contribution is on a set of object-oriented software tools that support the construction of a family of preconditioners based on fast transforms. By combining objects of different classes, it is possible to conveniently construct any preconditioner within this family.

  13. High-quality real-time temporal segmentation tool for video editing software

    OpenAIRE

    Cuevas Rodríguez, Carlos; García Santos, Narciso

    2012-01-01

    The increasing use of video editing software has resulted in a necessity for faster and more efficient editing tools. Here, we propose a lightweight high-quality video indexing tool that is suitable for video editing software.

  14. Investigating the relationship between software process improvement, situational change, and business success in software SMEs

    OpenAIRE

    Clarke, Paul

    2012-01-01

    While we have learned a great deal from Software Process Improvement (SPI) research to date, no earlier study has been designed from the outset to examine the relationship between SPI and business success in software development small- to- medium- sized companies (software SMEs). Since business processes are generally acknowledged as having an important role to play in supporting business success, it follows that the software development process (a large and complex component of the overall b...

  15. Improvement of E and P PETROBRAS maintenance program by an RCM tool; MCC Net: maintenance plan revision software based on RCM (qualitative method)

    Energy Technology Data Exchange (ETDEWEB)

    Frydman, Bernardo; Okada, Ricardo Yoshinori [PETROBRAS, Rio de Janeiro, RJ (Brazil). Exploracao e Producao; Souza, Arleniro Oliveira de [PETROBRAS, AM (Brazil). Unidade de Exploracao e Producao da Bacia do Solimoes; Frazao, Nelson A. [PETROBRAS, Macae, RJ (Brazil). Unidade de Negocios da Bacia de Campos

    2004-07-01

    The objectives of this paper are to review some basic concepts needed to understand the Reliability Centered Maintenance (RCM) technique; to report on the RCM applications already developed in the exploration and production segment at PETROBRAS, with some examples of the results obtained; to present the premises underlying the development of the MCCNet software, a tool developed by PETROBRAS E and P for applying RCM to the elaboration and/or revision of preventive maintenance plans for equipment or systems, which can be used to study any type of process; and to present the future vision for RCM in this segment of the company. (author)

  16. Improving productivity software through the adaptation of an agile development framework

    Directory of Open Access Journals (Sweden)

    Ángel Fiallos Ordoñez

    2015-06-01

    Full Text Available The current research suggests that using agile methodologies in conjunction with open-source software tools can improve productivity, reduce costs and optimize resources in the software development process, and can help improve user satisfaction through the delivery of high-quality software. The following analysis shows the most important variables for the successful implementation of IT development projects and their relation to the use of traditional and agile software development methodologies.

  17. Software process improvement: controlling developers, managers or users?

    DEFF Research Database (Denmark)

    Nørbjerg, Jacob

    1999-01-01

    The paper discusses how the latest trend in the management of software development: software process improvement (SPI) may affect user-developer relations. At the outset, SPI concerns the "internal workings" of software organisations, but it may also be interpreted as one way to give the developer...

  18. A computer-aided software-tool for sustainable process synthesis-intensification

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Babi, Deenesh K.; Bottlaender, Jack

    2017-01-01

    and determine within the design space, the more sustainable processes. In this paper, an integrated computer-aided software-tool that searches the design space for hybrid/intensified more sustainable process options is presented. Embedded within the software architecture are process synthesis......Currently, the process industry is moving towards the design of innovative, more sustainable processes that show improvements in both economic and environmental factors. The design space of unit operations that can be combined to generate process flowsheet alternatives considering known unit...... constraints while also matching the design targets, they are therefore more sustainable than the base case. The application of the software-tool to the production of biodiesel is presented, highlighting the main features of the computer-aided, multi-stage, multi-scale methods that are able to determine more...

  19. Evaluating, selecting and relevance software tools in technology monitoring

    Directory of Open Access Journals (Sweden)

    Óscar Fernando Castellanos Domínguez

    2010-07-01

    Full Text Available The current setting for industrial and entrepreneurial development has posed the need for incorporating differentiating elements into the production apparatus, leading to anticipating technological change. Technology monitoring (TM) emerges as a methodology focused on analysing these changes to identify challenges and opportunities, mainly supported by information technology (IT) through the search for, capture and analysis of data and information. This article proposes criteria for choosing and efficiently using software tools having different characteristics, requirements, capacities and costs which could be used in monitoring. An approach is made to different TM models, emphasising the identification and analysis of different information sources for covering and supporting information access and monitoring. Some evaluation, selection and analysis criteria are given for using these types of tools according to each production system's individual profile and needs. Some of the existing software packages available on the market for carrying out monitoring projects are described, relating them to their complexity, process characteristics and cost.

  20. Software tools to aid Pascal and Ada program design

    Energy Technology Data Exchange (ETDEWEB)

    Jankowitz, H.T.

    1987-01-01

    This thesis describes a software tool which analyses the style and structure of Pascal and Ada programs by ensuring that some minimum design requirements are fulfilled. The tool is used in much the same way as a compiler is used to teach students the syntax of a language, only in this case issues related to the design and structure of the program are of paramount importance. The tool operates by analyzing the design and structure of a syntactically correct program, automatically generating a report detailing changes that need to be made in order to ensure that the program is structurally sound. The author discusses how the model gradually evolved from a plagiarism detection system, which extracted several measurable characteristics of a program, to a model that analyzed the style of Pascal programs. In order to incorporate more sophisticated concepts like data abstraction, information hiding and data protection, this model was then extended to analyze the composition of Ada programs. The Ada model takes full advantage of facilities offered by the language, and by using this tool the standard and quality of written programs is raised whilst the fundamental principles of program design are grasped through a process of self-tuition.

  1. Software tool for 3D extraction of germinal centers.

    Science.gov (United States)

    Olivieri, David N; Escalona, Merly; Faro, Jose

    2013-01-01

    Germinal Centers (GC) are short-lived micro-anatomical structures, within lymphoid organs, where affinity maturation is initiated. Theoretical modeling of the dynamics of the GC reaction, including follicular CD4+ T helper and the recently described follicular regulatory CD4+ T cell populations, predicts that the intensity and life span of such reactions are driven by both types of T cells, yet controlled primarily by follicular regulatory CD4+ T cells. In order to calibrate GC models, it is necessary to properly analyze the kinetics of GC sizes. Presently, the estimation of spleen GC volumes relies upon confocal microscopy images from 20-30 slices spanning a depth of ~20-50 μm, whose GC areas are analyzed, slice-by-slice, for subsequent 3D reconstruction and quantification. The quantity of data to be analyzed from such images taken for kinetics experiments is usually prohibitively large to extract semi-manually with existing software. As a result, the entire procedure is highly time-consuming and inaccurate, thereby motivating the need for a new software tool that can automatically identify and calculate the 3D spot volumes from GC multidimensional images. We have developed pyBioImage, an open source cross-platform image analysis software application, written in Python with C extensions, that is specifically tailored to the needs of immunologic research involving 4D imaging of GCs. The software provides 1) support for importing many multi-image formats, 2) basic image processing and analysis, and 3) the ExtractGC module, which allows for automatic analysis and visualization of extracted GC volumes from multidimensional confocal microscopy images. We present concrete examples of different microscopy image data sets of GC that have been used in experimental and theoretical studies of mouse model GC dynamics. The pyBioImage software framework seeks to be a general purpose image application for immunological research based on 4D imaging. The ExtractGC module uses a
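
    The slice-by-slice 3D reconstruction described reduces, for volume quantification, to summing the segmented areas weighted by the inter-slice spacing. A minimal sketch with invented numbers (not pyBioImage's actual routine):

```python
# Volume estimate behind slice-by-slice GC reconstruction:
# the sum of segmented areas times the inter-slice spacing.

def gc_volume(slice_areas_um2, z_step_um):
    return sum(slice_areas_um2) * z_step_um        # cubic micrometres

areas = [180, 420, 650, 700, 610, 380, 150]        # segmented GC area per slice (um^2)
print(f"GC volume ~ {gc_volume(areas, z_step_um=2.0):.0f} um^3")
```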

  2. New software tools for enhanced precision in robot-assisted laser phonomicrosurgery.

    Science.gov (United States)

    Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G

    2012-01-01

    This paper describes a new software package created to enhance precision during robot-assisted laser phonomicrosurgery procedures. The new software is composed of three tools for camera calibration, automatic tumor segmentation, and laser tracking. These were designed and developed to improve the outcome of this demanding microsurgical technique, and were tested herein to produce quantitative performance data. The experimental setup was based on the motorized laser micromanipulator created by Istituto Italiano di Tecnologia and the experimental protocols followed are fully described in this paper. The results show the new tools are robust and effective: The camera calibration tool reduced residual errors (RMSE) to 0.009 ± 0.002 mm under 40× microscope magnification; the automatic tumor segmentation tool resulted in deep lesion segmentations comparable to manual segmentations (RMSE= 0.160 ± 0.028 mm under 40× magnification); and the laser tracker tool proved to be reliable even during cutting procedures (RMSE= 0.073 ± 0.023 mm under 40× magnification). These results demonstrate the new software package can provide excellent improvements to the previous microsurgical system, leading to important enhancements in surgical outcome.

  3. A multicenter study benchmarks software tools for label-free proteome quantification.

    Science.gov (United States)

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  4. A multi-center study benchmarks software tools for label-free proteome quantification

    Science.gov (United States)

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation window setups. For consistent evaluation, we developed LFQbench, an R package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
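
    LFQbench itself is an R package; purely as an illustration of the two metric families it reports, the Python sketch below computes within-replicate precision (coefficient of variation) and accuracy (deviation of a measured ratio from the ratio spiked into the hybrid sample) for one hypothetical protein.

    ```python
    import math
    import statistics as stats

    def cv(values):
        """Precision: coefficient of variation across replicate injections."""
        return stats.stdev(values) / stats.mean(values)

    def log_ratio_error(obs_a, obs_b, expected_ratio):
        """Accuracy: deviation of the measured log2 A:B ratio from the
        ratio that was spiked into the hybrid proteome sample."""
        observed = stats.mean(obs_a) / stats.mean(obs_b)
        return math.log2(observed) - math.log2(expected_ratio)

    # Hypothetical intensities of one yeast protein in samples A and B,
    # where the yeast proteome was mixed at an expected A:B ratio of 2:1.
    a = [1.05e6, 0.98e6, 1.02e6]
    b = [0.55e6, 0.49e6, 0.52e6]
    print(f"precision (CV): {cv(a):.3f}")
    print(f"accuracy (log2 ratio error): {log_ratio_error(a, b, 2.0):+.3f}")
    ```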

  5. Learning Photogrammetry with Interactive Software Tool PhoX

    Science.gov (United States)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology, where high-quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that easily generate nice results, e.g. by structure-from-motion approaches. Within this context, the classical approach to teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals in new topics and provide them with more information behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features, and 3D objects. It provides almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises in which they have the opportunity to analyse results in a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and the direct view of the resulting effect in image or object space.

  6. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. The prototype, an open source, web-based software analytical tool, generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
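
    The abstract does not say which disproportionality statistic the prototype computes; a common choice in pharmacovigilance is the proportional reporting ratio (PRR), sketched below over hypothetical MeSH drug-event co-indexing counts.

    ```python
    import math

    def prr(a, b, c, d):
        """Proportional reporting ratio for a 2x2 drug-event table:
        a: citations with drug and event   b: drug, other events
        c: other drugs with event          d: other drugs, other events"""
        return (a / (a + b)) / (c / (c + d))

    def prr_ci95(a, b, c, d):
        """Approximate 95% confidence interval, computed on the log scale."""
        se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
        point = prr(a, b, c, d)
        return point * math.exp(-1.96 * se), point * math.exp(1.96 * se)

    # Invented counts for one candidate drug-adverse event pair.
    a, b, c, d = 42, 958, 120, 98880
    lo, hi = prr_ci95(a, b, c, d)
    print(f"PRR = {prr(a, b, c, d):.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
    ```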

  7. Economic Consequence Analysis of Disasters: The E-CAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
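
    The essence of the methodology, many CGE simulations distilled into a single regression, can be sketched with synthetic stand-in data. Everything below (variables, coefficients, noise) is invented for illustration; it mirrors the reduced-form idea only, not E-CAT's actual equations.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in for CGE runs: each row is one simulated threat scenario with
    # explanatory variables [magnitude, duration (days), resilience factor].
    X = rng.uniform([1.0, 1.0, 0.2], [10.0, 30.0, 0.9], size=(500, 3))
    # Synthetic "CGE output": economic loss, with noise.
    y = 2.0 * X[:, 0] + 0.4 * X[:, 1] - 8.0 * X[:, 2] + rng.normal(0, 0.5, 500)

    # Estimate the reduced-form equation by ordinary least squares.
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    print("reduced-form coefficients:", np.round(coef, 2))

    # Rapid estimate for a new scenario, without rerunning the CGE model.
    scenario = np.array([1.0, 6.5, 14.0, 0.5])  # [const, magnitude, duration, resilience]
    print("estimated loss:", round(float(scenario @ coef), 2))
    ```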

  8. User Guide for the STAYSL PNNL Suite of Software Tools

    Energy Technology Data Exchange (ETDEWEB)

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
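
    A toy numerical illustration of the generalized least-squares update (not the STAYSL PNNL code; every number is invented): the prior spectrum phi0 with covariance M is pulled toward consistency with the measured reaction rates r, where S holds the group-wise activation cross sections, via the gain K = M S^T (S M S^T + V)^-1.

    ```python
    import numpy as np

    # Toy 3-group adjustment: prior flux phi0 with covariance M, response
    # matrix S, and two measured reaction rates r with covariance V.
    phi0 = np.array([1.0, 2.0, 0.5])
    M = np.diag((0.20 * phi0) ** 2)          # 20% prior uncertainty
    S = np.array([[0.80, 0.10, 0.00],
                  [0.05, 0.40, 0.90]])
    r = np.array([1.05, 1.40])
    V = np.diag((0.03 * r) ** 2)             # 3% measurement uncertainty

    # Generalized least-squares update of the spectrum and its covariance.
    K = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)
    phi = phi0 + K @ (r - S @ phi0)
    M_adj = (np.eye(len(phi0)) - K @ S) @ M
    print("adjusted flux:", np.round(phi, 3))
    print("adjusted std devs:", np.round(np.sqrt(np.diag(M_adj)), 3))
    ```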

  9. Metabolic interrelationships software application: Interactive learning tool for intermediary metabolism*.

    Science.gov (United States)

    Verhoeven, Adrie J M; Doets, Mathijs; Lamers, Jos M J; Koster, Johan F

    2005-11-01

    We developed and implemented the software application titled Metabolic Interrelationships as a self-learning and -teaching tool for intermediary metabolism. It is used by undergraduate medical students in an integrated organ systems-based and disease-oriented core curriculum, which started in our medical faculty in 2001. The computer program provides an interactive environment in which students learn to integrate the major metabolic pathways as well as their hormonal control mechanisms as far as they depend on nutritional status. Students can explore the time- and tissue-dependent changes in mammalian intermediary metabolism during a feeding-fasting cycle. Starting from a whole-body view of interorgan nutrient fluxes, the student can make excursions to individual organs and, from there, to increasing levels of molecular detail and to explanatory animations. The application is well received by students and staff. Copyright © 2005 International Union of Biochemistry and Molecular Biology, Inc.

  10. Software Development Of XML Parser Based On Algebraic Tools

    Science.gov (United States)

    Georgiev, Bozhidar; Georgieva, Adriana

    2011-12-01

    This paper presents the development and implementation of software implementing an algebraic method for XML data processing, which accelerates the XML parsing process. The nontraditional approach proposed here for fast XML navigation with algebraic tools contributes to ongoing efforts toward an easier, user-friendly API for XML transformations. The proposed software for XML document processing (a parser) is easy to use and can manage files with a strictly defined data structure. The purpose of the presented algorithm is to offer a new approach for searching and restructuring hierarchical XML data. This approach permits fast processing of XML documents, using an algebraic model developed in detail in previous works by the same authors. The proposed parsing mechanism is easily accessible to the web consumer, who is able to control XML file processing, to search for different elements (tags) in it, and to delete and add new XML content as well. Various tests show higher speed and lower resource consumption in comparison with some existing commercial parsers.

  11. Improving the Quality of Published Chemical Names with Nomenclature Software

    Directory of Open Access Journals (Sweden)

    Gernot A. Eller

    2006-11-01

    This work deals with the use of organic systematic nomenclature in scientific literature, its quality, and computerized methods for its improvement. Criteria for classification of systematic names in terms of quality/correctness are discussed and applied to a sample set of several hundred names extracted from the literature. The same structures are named with three popular state-of-the-art nomenclature programs – AutoNom 2000, ChemDraw 10.0, and ACD/Name 9.0. When comparing the results, all nomenclature tools show a significantly better performance than 'average chemists'. One program allows the generation not only of IUPAC names but also of CAS-like index names that are compared with the officially registered names. The scope and limitations of nomenclature software are discussed and a comparison of the programs' actual capabilities is given.

  12. SNPdetector: a software tool for sensitive and accurate SNP detection.

    Directory of Open Access Journals (Sweden)

    Jinghui Zhang

    2005-10-01

    Identification of single nucleotide polymorphisms (SNPs) and mutations is important for the discovery of genetic predisposition to complex diseases. PCR resequencing is the method of choice for de novo SNP discovery. However, manual curation of putative SNPs has been a major bottleneck in the application of this method to high-throughput screening. Therefore it is critical to develop a more sensitive and accurate computational method for automated SNP detection. We developed a software tool, SNPdetector, for automated identification of SNPs and mutations in fluorescence-based resequencing reads. SNPdetector was designed to model the process of human visual inspection and has a very low false positive and false negative rate. We demonstrate the superior performance of SNPdetector in SNP and mutation analysis by comparing its results with those derived by human inspection, PolyPhred (a popular SNP detection tool), and independent genotype assays in three large-scale investigations. The first study identified and validated inter- and intra-subspecies variations in 4,650 traces of 25 inbred mouse strains that belong to either the Mus musculus species or the M. spretus species. Unexpected heterozygosity in the CAST/Ei strain was observed in two out of 1,167 mouse SNPs. The second study identified 11,241 candidate SNPs in five ENCODE regions of the human genome covering 2.5 Mb of genomic sequence. Approximately 50% of the candidate SNPs were selected for experimental genotyping; the validation rate exceeded 95%. The third study detected ENU-induced mutations (at 0.04% allele frequency) in 64,896 traces of 1,236 zebra fish. Our analysis of three large and diverse test datasets demonstrated that SNPdetector is an effective tool for genome-scale research and for large-sample clinical studies. SNPdetector runs on the Unix/Linux platform and is publicly available (http://lpg.nci.nih.gov).

  14. Organizational management practices for achieving software process improvement

    Science.gov (United States)

    Kandt, Ronald Kirk

    2004-01-01

    The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.

  15. Tracking Flight Software in Cassini Mission Operations Using the FMT Tool

    Science.gov (United States)

    Kan, E.; Uffelman, H.

    1998-01-01

    The arduous task of tracking multiple flight software images, across redundant on-board processors, recorders and across time, has been automated and systematized via the use of a unified Flight Software Memory Tracker (FMT) Tool.

  16. Software Process Improvement Journey: IBM Australia Application Management Services

    Science.gov (United States)

    2005-03-01

    Carnegie Mellon University, Software Engineering Institute report by Robyn Nichols on the software process improvement journey of IBM Australia Application Management Services. The report covers Client Relationship Management (CRM) processes, specifically Solution Design and Solution Delivery, and Worldwide Project Management, spanning complex systems life-cycle management, rapid solutions development, custom development, package selection and implementation, and maintenance.

  17. Improving Software Sustainability: Lessons Learned from Profiles in Science.

    Science.gov (United States)

    Gallagher, Marie E

    2013-01-01

    The Profiles in Science® digital library features digitized surrogates of historical items selected from the archival collections of the U.S. National Library of Medicine as well as collaborating institutions. In addition, it contains a database of descriptive, technical and administrative metadata. It also contains various software components that allow creation of the metadata, management of the digital items, and access to the items and metadata through the Profiles in Science Web site [1]. The choices made building the digital library were designed to maximize the sustainability and long-term survival of all of the components of the digital library [2]. For example, selecting standard and open digital file formats rather than proprietary formats increases the sustainability of the digital files [3]. Correspondingly, using non-proprietary software may improve the sustainability of the software--either through in-house expertise or through the open source community. Limiting our digital library software exclusively to open source software or to software developed in-house has not been feasible. For example, we have used proprietary operating systems, scanning software, a search engine, and office productivity software. We did this when either lack of essential capabilities or the cost-benefit trade-off favored using proprietary software. We also did so knowing that in the future we would need to replace or upgrade some of our proprietary software, analogous to migrating from an obsolete digital file format to a new format as the technological landscape changes. Since our digital library's start in 1998, all of its software has been upgraded or replaced, but the digitized items have not yet required migration to other formats. Technological changes that compelled us to replace proprietary software included the cost of product licensing, product support, incompatibility with other software, prohibited use due to evolving security policies, and product abandonment

  18. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    NARCIS (Netherlands)

    Jauregui Becker, Juan Manuel; Tosello, Guido; Houten, Fred J.A.M.; Hansen, Hans N.

    2013-01-01

    This paper presents a software tool for automating the design of cooling systems for injection moulding and a validation of its performance. Cooling system designs were automatically generated by the proposed software tool and by applying a best practice tool engineering design approach. The two

  19. Can agile software tools bring the benefits of a task board to globally distributed teams?

    NARCIS (Netherlands)

    Katsma, Christiaan; Amrit, Chintan Amrit; van Hillegersberg, Jos; Sikkel, Nicolaas; Oshri, Ilan; Kotlarsky, Julia; Willcocks, Leslie P.

    Software-based tooling has become an essential part of globally distributed software development. In this study we focus on the usage of such tools, and of task boards in particular. We investigate the deployment of these tools through field research in 4 different companies that feature agile and

  20. Software reuse in agile development organizations: a conceptual management tool

    NARCIS (Netherlands)

    Spoelstra, Wouter; Iacob, Maria Eugenia; van Sinderen, Marten J.; Chu, W.C.; Wong, E.C.; Palakal, M.J.; Hung, C.-C.

    2011-01-01

    The reuse of knowledge is considered a major factor for increasing productivity and quality. In the software industry knowledge is embodied in software assets such as code components, functional designs and test cases. This kind of knowledge reuse is also referred to as software reuse. Although the

  1. BEASTling: A software tool for linguistic phylogenetics using BEAST 2.

    Science.gov (United States)

    Maurits, Luke; Forkel, Robert; Kaiping, Gereon A; Atkinson, Quentin D

    2017-01-01

    We present a new open source software tool called BEASTling, designed to simplify the preparation of Bayesian phylogenetic analyses of linguistic data using the BEAST 2 platform. BEASTling transforms comparatively short and human-readable configuration files into the XML files used by BEAST to specify analyses. By taking advantage of Creative Commons-licensed data from the Glottolog language catalog, BEASTling allows the user to conveniently filter datasets using names for recognised language families, to impose monophyly constraints so that inferred language trees are backward compatible with Glottolog classifications, or to assign geographic location data to languages for phylogeographic analyses. Support for the emerging cross-linguistic linked data format (CLDF) permits easy incorporation of data published in cross-linguistic linked databases into analyses. BEASTling is intended to make the power of Bayesian analysis more accessible to historical linguists without strong programming backgrounds, in the hopes of encouraging communication and collaboration between those developing computational models of language evolution (who are typically not linguists) and relevant domain experts.
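
    The core transformation, a short human-readable configuration expanded into verbose XML, follows a pattern that is easy to sketch. The section and key names below are hypothetical stand-ins, not BEASTling's actual configuration schema or BEAST 2's XML vocabulary.

    ```python
    import configparser
    import xml.etree.ElementTree as ET

    # Hypothetical INI-style configuration (invented keys, for illustration).
    CONFIG = "\n".join([
        "[analysis]",
        "chainlength = 10000000",
        "[model vocabulary]",
        "rate_variation = True",
    ])

    def config_to_xml(text):
        """Expand a compact config into an XML skeleton, mirroring the
        config-to-XML step that BEASTling performs for BEAST 2."""
        cfg = configparser.ConfigParser()
        cfg.read_string(text)
        root = ET.Element("beast")
        for section in cfg.sections():
            node = ET.SubElement(root, "section", name=section)
            for key, value in cfg[section].items():
                ET.SubElement(node, "parameter", id=key).text = value
        return ET.tostring(root, encoding="unicode")

    print(config_to_xml(CONFIG))
    ```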

  2. Software Quality Assessment Tool Based on Meta-Models

    OpenAIRE

    Doneva Rositsa; Gaftandzhieva Silvia; Doneva Zhelyana; Staevsky Nevena

    2015-01-01

    In the software industry it is indisputably essential to control the quality of produced software systems in terms of capabilities for easy maintenance, reuse, portability and others in order to ensure reliability in software development. But it is also clear that it is very difficult to achieve such control through a ‘manual’ management of quality. There are a number of approaches for software quality assurance based typically on software quality models (e.g. ISO 9126, McCall’s, Boehm’s...

  3. Effectiveness and usability of Scanning Wizard software: a tool for enhancing switch scanning.

    Science.gov (United States)

    Koester, Heidi Horstmann; Simpson, Richard C

    2017-11-24

    Scanning Wizard software helps scanning users improve the setup of their switch and scanning system. This study evaluated Scanning Wizard's effectiveness and usability. Ten people who use switch scanning and ten practitioners used Scanning Wizard in the initial session. Usability was high, based on survey responses averaging over 4.5 out of 5, and qualitative feedback was very positive. Five switch users were able to complete the multi-week protocol, using settings on their own scanning system that were recommended from the Scanning Wizard session. Using these revised settings, text entry rates improved by an average of 71%, ranging from 29% to 172% improvement. Results suggest that Scanning Wizard is a useful tool for improving the configuration of scanning systems for people who use switch scanning to communicate. Implications for Rehabilitation Some individuals with severe physical impairments use switch scanning for spoken and written communication. Scanning Wizard software helps scanning users improve the setup of their switch and scanning system. This study demonstrated high usability of Scanning Wizard (with 10 switch user-practitioner teams) and increased text entry rate by an average of 71% (for five switch users). Results suggest that Scanning Wizard is a useful tool for improving the configuration of scanning systems for people who use switch scanning to communicate.
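
    The gains reported above come from tuning parameters such as the scan delay. A crude, invented model of row-column scanning shows how such settings translate into text entry rate; the numbers are illustrative only and are not the study's data or Scanning Wizard's algorithm.

    ```python
    def chars_per_minute(scan_delay_s, avg_steps_per_char, error_rate):
        """Toy row-column scanning model: each character costs a number of
        scan steps, and an error roughly doubles that character's cost."""
        time_per_char = scan_delay_s * avg_steps_per_char * (1 + error_rate)
        return 60.0 / time_per_char

    # Hypothetical before/after settings from a tuning session.
    before = chars_per_minute(scan_delay_s=1.2, avg_steps_per_char=8, error_rate=0.15)
    after = chars_per_minute(scan_delay_s=0.8, avg_steps_per_char=8, error_rate=0.10)
    print(f"before: {before:.1f} cpm, after: {after:.1f} cpm, "
          f"improvement: {100 * (after / before - 1):.0f}%")
    ```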

  4. Availability Analysis and Improvement of Software Rejuvenation Using Virtualization

    National Research Council Canada - National Science Library

    PARK, Jong Sou; Thandar, THEIN; Sung-Do, CHI

    2007-01-01

    ... To improve the availability of application servers, we have conducted a study of virtualization technology and software rejuvenation, which follows a proactive fault-tolerant approach to counteract...

  5. Characteristics and possibilities of software tool for metal-oxide surge arresters selection

    Directory of Open Access Journals (Sweden)

    Đorđević Dragan

    2012-01-01

    This paper presents a procedure for the selection of metal-oxide surge arresters based on the instructions given in the Siemens and ABB catalogues, respecting their differences and the characteristics and possibilities of the software tool. The software tool was developed during the preparation of a Master's thesis titled, 'Automation of Metal-Oxide Surge Arresters Selection'. An example is presented of the selection of metal-oxide surge arresters using the developed software tool.

  6. Usage of Wondershare QuizCreator software for assessment as a way of improving math evaluation

    OpenAIRE

    Jovanovska, Dobrila; Atanasova-Pacemska, Tatjana; Lazarova, Limonka; Pacemska, Sanja; Kovacheva, Tcveta

    2015-01-01

    Checking and evaluation are among the most important elements of the learning process, because they provide information about the extent to which students have achieved the previously set educational standards. This paper proposes a way to improve the evaluation process in mathematics by using electronic tests created with the multimedia software known as Wondershare QuizCreator. To create computer-based tests, there are multiple quiz/test tools to choose from, but for the ...

  7. EINSTEIN - Expert system for an Intelligent Supply of Thermal Energy in Industry. Audit methodology and software tool

    Energy Technology Data Exchange (ETDEWEB)

    Schweiger, Hans; Danov, Stoyan (energyXperts.NET (Spain)); Vannoni, Claudia; Facci, Enrico (Sapienza Univ. of Rome, Dept. of Mechanics and Aeronautics, Rome (Italy)); Brunner, Christoph; Slawitsch, Bettina (Joanneum Research, Inst. of Sustainable Techniques and Systems - JOINTS, Graz (Austria))

    2009-07-01

    For optimising thermal energy supply in industry, a holistic, integral approach is required that includes possibilities of demand reduction by heat recovery and process integration, and an intelligent combination of efficient heat and cold supply technologies. EINSTEIN is a tool-kit for fast and high-quality thermal energy audits in industry, composed of an audit guide describing the methodology and of a software tool that guides the auditor through all the audit steps. The main features of EINSTEIN are: (1) a basic questionnaire helps with the systematic collection of the necessary information, with the possibility to acquire data remotely; (2) special tools allow for fast consistency checking and estimation of missing data, so that first predictions can be made even with very few data; (3) the data processing is based on standardised models for industrial processes and industrial heat supply systems; (4) semi-automation: the software tool supports decision making for the generation of alternative heat and cold supply proposals, carries out automatically all the necessary calculations, including dynamic simulation of the heat supply system, and creates a standard audit report. The software tool includes modules for benchmarking, automatic design of heat exchanger networks, and design assistants for the heat and cold supply system. The core of the expert system software tool is available for free, as an open source software project. This type of software development has shown to be very efficient for the dissemination of knowledge and for continuous maintenance and improvement thanks to user contributions.

  8. An Accurate FFPA-PSR Estimator Algorithm and Tool for Software Effort Estimation

    Directory of Open Access Journals (Sweden)

    Senthil Kumar Murugesan

    2015-01-01

    Software companies are now keen to provide secure software with respect to accuracy and reliability of their products especially related to the software effort estimation. Therefore, there is a need to develop a hybrid tool which provides all the necessary features. This paper attempts to propose a hybrid estimator algorithm and model which incorporates quality metrics, reliability factor, and the security factor with a fuzzy-based function point analysis. Initially, this method utilizes a fuzzy-based estimate to control the uncertainty in the software size with the help of a triangular fuzzy set at the early development stage. Secondly, the function point analysis is extended by the security and reliability factors in the calculation. Finally, the performance metrics are added with the effort estimation for accuracy. The experimentation is done with different project data sets on the hybrid tool, and the results are compared with the existing models. It shows that the proposed method not only improves the accuracy but also increases the reliability, as well as the security, of the product.

  9. An Accurate FFPA-PSR Estimator Algorithm and Tool for Software Effort Estimation.

    Science.gov (United States)

    Murugesan, Senthil Kumar; Balasubramanian, Chidhambara Rajan

    2015-01-01

    Software companies are now keen to provide secure software with respect to accuracy and reliability of their products especially related to the software effort estimation. Therefore, there is a need to develop a hybrid tool which provides all the necessary features. This paper attempts to propose a hybrid estimator algorithm and model which incorporates quality metrics, reliability factor, and the security factor with a fuzzy-based function point analysis. Initially, this method utilizes a fuzzy-based estimate to control the uncertainty in the software size with the help of a triangular fuzzy set at the early development stage. Secondly, the function point analysis is extended by the security and reliability factors in the calculation. Finally, the performance metrics are added with the effort estimation for accuracy. The experimentation is done with different project data sets on the hybrid tool, and the results are compared with the existing models. It shows that the proposed method not only improves the accuracy but also increases the reliability, as well as the security, of the product.
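
    As a rough sketch of the fuzzy front end shared by both records above (my own simplification, not the FFPA-PSR algorithm): the early-stage size estimate is held as a triangular fuzzy number, defuzzified by its centroid, and then scaled by the additional factors the paper introduces. All numeric values below are invented.

    ```python
    def centroid(tfn):
        """Defuzzify a triangular fuzzy number (low, mode, high) by centroid."""
        low, mode, high = tfn
        return (low + mode + high) / 3.0

    # Hypothetical early-stage size estimate, in function points:
    # (optimistic, most likely, pessimistic).
    size_fp = (320.0, 400.0, 520.0)

    # Illustrative multipliers standing in for the paper's security,
    # reliability and performance extensions to function point analysis.
    security, reliability, performance = 1.10, 1.05, 1.02
    adjusted_fp = centroid(size_fp) * security * reliability * performance

    productivity_fp_per_pm = 12.0  # hypothetical delivery rate
    print(f"adjusted size: {adjusted_fp:.1f} FP, "
          f"effort ≈ {adjusted_fp / productivity_fp_per_pm:.1f} person-months")
    ```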

  10. [An improved software design of ultrasound bone densitometer].

    Science.gov (United States)

    Yu, Zhengtao; Yang, Lian; Xu, Shijie; Deng, Jiangjun; Dong, Qingqing; He, Aijun

    2014-10-01

    In order to meet the requirements of ultrasound bone density measurement, we propose a software solution to improve the accuracy and speed of bone mineral density measurement in an ultrasound bone densitometer. We used the high-speed USB interface chip FT232H, along with a high-speed AD converter chip, to calculate speed of sound (SOS), broadband ultrasound attenuation (BUA) and other bone density parameters in the PC software. This solution improved the accuracy of the measurement data, reduced the measurement time and increased the quality of the displayed image. It is concluded that the new software can greatly improve the accuracy and transmission speed of bone density measurement data through a high-speed USB interface and software data processing.

  11. On the Role of Software Quality Management in Software Process Improvement

    DEFF Research Database (Denmark)

    Wiedemann Jacobsen, Jan; Kuhrmann, Marco; Münch, Jürgen

    2016-01-01

    Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. SPI addresses many aspects, ranging from individual developer skills to entire organizations. It comprises, for instance, the optimization of specific activities in the software lifecycle as well as the creation of organizational awareness and project culture. In the course of conducting a systematic mapping study on the state of the art in SPI from a general perspective, we observed Software Quality Management (SQM) being of certain relevance in SPI programs; this motivated us to study the topic further and to develop an initial picture of how these topics are addressed in SPI. Our findings show a fairly pragmatic contribution set in which different solutions are proposed, discussed, and evaluated. Among others, our findings indicate a certain reluctance towards standard quality or (test) maturity models

  12. Improving Data Catalogs with Free and Open Source Software

    Science.gov (United States)

    Schweitzer, R.; Hankin, S.; O'Brien, K.

    2013-12-01

    The Global Earth Observation Integrated Data Environment (GEO-IDE) is NOAA's effort to successfully integrate data and information with partners in the national US-Global Earth Observation System (US-GEO) and the international Global Earth Observation System of Systems (GEOSS). As part of the GEO-IDE, the Unified Access Framework (UAF) is working to build momentum towards the goal of increased data integration and interoperability. The UAF project is moving towards this goal with an approach that includes leveraging well known and widely used standards, as well as free and open source software. The UAF project shares the widely held conviction that the use of data standards is a key ingredient necessary to achieve interoperability. Many community-based consensus standards fail, though, due to poor compliance. Compliance problems emerge for many reasons: because the standards evolve through versions, because documentation is ambiguous or because individual data providers find the standard inadequate as-is to meet their special needs. In addition, minimalist use of standards will lead to a compliant service, but one which is of low quality. In this presentation, we will be discussing the UAF effort to build a catalog cleaning tool which is designed to crawl THREDDS catalogs, analyze the data available, and then build a 'clean' catalog of data which is standards compliant and has a uniform set of data access services available. These data services include, among others, OPeNDAP, Web Coverage Service (WCS) and Web Mapping Service (WMS). We will also discuss how we are utilizing free and open source software and services to both crawl, analyze and build the clean data catalog, as well as our efforts to help data providers improve their data catalogs. We'll discuss the use of open source software such as DataNucleus, Thematic Realtime Environmental Distributed Data Services (THREDDS), ncISO and the netCDF Java Common Data Model (CDM). We'll also demonstrate how we are

  13. Software for predictive microbiology and risk assessment: a description and comparison of tools presented at the ICPMF8 Software Fair.

    Science.gov (United States)

    Tenenhaus-Aziza, Fanny; Ellouze, Mariem

    2015-02-01

    The 8th International Conference on Predictive Modelling in Food was held in Paris, France in September 2013. One of the major topics of this conference was the transfer of knowledge and tools between academics and stakeholders of the food sector. During the conference, a "Software Fair" was held to provide information and demonstrations of predictive microbiology and risk assessment software. This article presents an overall description of the 16 software tools demonstrated at the session and provides a comparison based on several criteria such as the modeling approach, the different modules available (e.g. databases, predictors, fitting tools, risk assessment tools), the studied environmental factors (temperature, pH, aw, etc.), the type of media (broth or food) and the number and type of the provided micro-organisms (pathogens and spoilers). The present study is a guide to help users select the software tools which are most suitable to their specific needs, before they test and explore the tool(s) in more depth. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Improving Tools in Artificial Intelligence

    Directory of Open Access Journals (Sweden)

    Angel Garrido

    2010-01-01

    The historical origin of Artificial Intelligence (AI) is usually placed at the Dartmouth Conference of 1956, but many more arcane origins can be found [1]. In more recent times we can also consider very great thinkers such as Janos Neumann (later John von Neumann, after arriving in the USA), Norbert Wiener, Alan Mathison Turing, or Lotfi Zadeh, for instance [12, 14]. AI frequently requires logic, but its classical version shows too many insufficiencies, so it was necessary to introduce more sophisticated tools such as Fuzzy Logic, Modal Logic, Non-Monotonic Logic and so on [1, 2]. Among the things that AI needs to represent are categories, objects, properties, relations between objects, situations, states, time, events, causes and effects, knowledge about knowledge, and so on. The problems in AI can be classified into two general types [3, 5]: search problems and representation problems. On this last 'peak' there exist different ways to reach the summit. So we have [4] Logics, Rules, Frames, Associative Nets, Scripts, and so on, often connected among themselves. In this paper we attempt a panoramic vision of the scope of application of such representation methods in AI. The two most disputable questions of both modern philosophy of mind and AI are perhaps the Turing Test and the Chinese Room Argument. To elucidate these very difficult questions, see our final note.

  15. Using Commercial Off-the-Shelf Software Tools for Space Shuttle Scientific Software

    Science.gov (United States)

    Groleau, Nicolas; Friedland, Peter (Technical Monitor)

    1994-01-01

    In October 1993, the Astronaut Science Advisor (ASA) was on board the STS-58 flight of the space shuttle. ASA is an interactive system providing data acquisition and analysis, experiment step re-scheduling, and various other forms of reasoning. As fielded, the system runs on a single Macintosh PowerBook 170, which hosts the six ASA modules. There is one other piece of hardware, an external (GW Instruments, Somerville, Massachusetts) analog-to-digital converter connected to the PowerBook's SCSI port. Three main software tools were used: LabVIEW, CLIPS, and HyperCard. First, a module written in LabVIEW (National Instruments, Austin, Texas) controls the A/D conversion and stores the resulting data in appropriate arrays. This module also analyzes the numerical data to produce a small set of characteristic numbers or symbols describing the results of an experiment trial. Second, a forward-chaining inference system written in CLIPS (NASA) uses the symbolic information provided by the first stage with a static rule base to infer decisions about the experiment. This expert system shell is used by the system for diagnosis. The third component of the system is the user interface, written in HyperCard (Claris Inc. and Apple Inc., both in Cupertino, California).

  16. Basic quality tools in continuous improvement process

    OpenAIRE

    Soković, Mirko; Jovanović, Jelena; Krivokapić, Zdravko; Vujović, Aleksandar

    2015-01-01

    If organizations wish to achieve continuous quality improvement they need an appropriate selection of quality tools and techniques. In this paper a review of possibilities for the systematic use of the seven basic quality tools (7QC tools) is presented. It is shown that the 7QC tools can be used in all process phases, from the beginning of product development up to the management of a production process and delivery. It is further shown how to involve the 7QC tools in some phases of continuous improveme...
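
    Of the seven basic tools, Pareto analysis is the most directly algorithmic: rank causes by frequency and find the 'vital few' that account for most defects. A minimal sketch over an invented defect log:

    ```python
    from collections import Counter

    # Hypothetical defect log from a production process.
    defects = (["scratch"] * 48 + ["misalignment"] * 27
               + ["discoloration"] * 12 + ["crack"] * 8 + ["other"] * 5)

    counts = Counter(defects).most_common()
    total = sum(n for _, n in counts)
    cumulative = 0
    for cause, n in counts:
        cumulative += n
        print(f"{cause:15s} {n:3d}  {100 * cumulative / total:5.1f}% cumulative")
    # Causes above the ~80% cumulative line are tackled first.
    ```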

  17. ESSCOTS for Learning: Transforming Commercial Software into Powerful Educational Tools.

    Science.gov (United States)

    McArthur, David; And Others

    1995-01-01

    Gives an overview of Educational Support Systems based on commercial off-the-shelf software (ESSCOTS), and discusses the benefits of developing such educational software. Presents results of a study that revealed the learning processes of middle and high school students who used a geographical information system. (JMV)

  18. Software tools for identification, visualization and analysis of protein tunnels and channels.

    Science.gov (United States)

    Brezovsky, Jan; Chovancova, Eva; Gora, Artur; Pavelka, Antonin; Biedermannova, Lada; Damborsky, Jiri

    2013-01-01

    Protein structures contain highly complex systems of voids, making up specific features such as surface clefts or grooves, pockets, protrusions, cavities, pores or channels, and tunnels. Many of them are essential for the migration of solvents, ions and small molecules through proteins, and their binding to the functional sites. Analysis of these structural features is very important for understanding of structure-function relationships, for the design of potential inhibitors or proteins with improved functional properties. Here we critically review existing software tools specialized in rapid identification, visualization, analysis and design of protein tunnels and channels. The strengths and weaknesses of individual tools are reported together with examples of their applications for the analysis and engineering of various biological systems. This review can assist users with selecting a proper software tool for study of their biological problem as well as highlighting possible avenues for further development of existing tools. Development of novel descriptors representing not only geometry, but also electrostatics, hydrophobicity or dynamics, is needed for reliable identification of biologically relevant tunnels and channels. Copyright © 2012 Elsevier Inc. All rights reserved.

  19. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Science.gov (United States)

    2011-02-02

    Employment and Training Administration notice concerning International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA... The fragmentary record indicates the notice relates to IBM employees working on the relevant product within the Data Studio Tools QA group on a part-time basis and...

  20. Software development tool for PicoBlaze multi-processor implementation

    Directory of Open Access Journals (Sweden)

    Claudiu Lung

    2012-06-01

    This paper presents a useful software tool for projects with multiple PicoBlaze microprocessors implemented in FPGA circuits. The application presented in this paper, which uses the PicoBlaze SDK tool for software development, is an Automatic Packet Report System (APRS), with three PicoBlaze microprocessors implemented in an FPGA circuit.

  1. Software Support of Modelling using Ergonomic Tools in Engineering

    Directory of Open Access Journals (Sweden)

    Darina Dupláková

    2017-08-01

    One of the preconditions for the correct development of industrial production is the continuous interconnection of virtual reality and the real world by computer software. Computer software is used for product modelling, creation of technical documentation, scheduling, management and optimization of manufacturing processes, and increasing the efficiency of human work in manufacturing plants. This article describes frequently used ergonomic software that helps to improve human work by reducing error rates, risk factors of the working environment and workplace injuries, and by eliminating emerging occupational diseases. The tools are categorized in the field of micro-ergonomics and are applicable at the manufacturing level, with a flexible approach to solving the established problems.

  2. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of development of new systems in these fields, improved design management techniques and a vast array of computer-aided techniques are applied during the design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure mode and effects analysis - FMEA - of hydraulic systems. The paper explains the underlying...

  3. Possibilities for using software tools in the process of secuirty design

    Directory of Open Access Journals (Sweden)

    Ladislav Mariš

    2013-07-01

    The authors deal with the use of software to support the process of security design. The article proposes a theoretical basis for applying software tools to design activities, based on selected design standards for electrical safety systems, and applies design solutions, especially in drawing documentation. The article should serve the needs of project team members in using selected software tools and in subsequently increasing the degree of automation of design activities.

  4. Refrigerated cutting tools improve machining of superalloys

    Science.gov (United States)

    Dudley, G. M.

    1971-01-01

    Freon-12 applied to tool cutting edge evaporates quickly, leaves no residue, and permits higher cutting rate than with conventional coolants. This technique increases cutting rate on Rene-41 threefold and improves finish of machined surface.

  5. Tool Support for Distributed Software Development : The past - present - and future of gaps between user requirements and tool functionalities

    NARCIS (Netherlands)

    Herrera, Miles; van Hillegersberg, Jos; Harmsen, Frank; Amrit, Chintan Amrit; Geisberger, Eva; Keil, Patrick; Kuhrmann, Marco

    2007-01-01

    This paper presents the past, present, and our view on future user requirements and tool functionalities supporting Globally Distributed Software Teams and highlights the changing emphasis in these user requirements.

  6. OpenROCS: a software tool to control robotic observatories

    Science.gov (United States)

    Colomé, Josep; Sanz, Josep; Vilardell, Francesc; Ribas, Ignasi; Gil, Pere

    2012-09-01

    We present the Open Robotic Observatory Control System (OpenROCS), an open source software platform developed for the robotic control of telescopes. It acts as a software infrastructure that executes all the necessary processes to implement responses to the system events that appear in the routine and non-routine operations associated to data-flow and housekeeping control. The OpenROCS software design and implementation provides a high flexibility to be adapted to different observatory configurations and event-action specifications. It is based on an abstract model that is independent of the specific hardware or software and is highly configurable. Interfaces to the system components are defined in a simple manner to achieve this goal. We give a detailed description of the version 2.0 of this software, based on a modular architecture developed in PHP and XML configuration files, and using standard communication protocols to interface with applications for hardware monitoring and control, environment monitoring, scheduling of tasks, image processing and data quality control. We provide two examples of how it is used as the core element of the control system in two robotic observatories: the Joan Oró Telescope at the Montsec Astronomical Observatory (Catalonia, Spain) and the SuperWASP Qatar Telescope at the Roque de los Muchachos Observatory (Canary Islands, Spain).

  7. A Dedicated Computational Platform for Cellular Monte Carlo T-CAD Software Tools

    Science.gov (United States)

    2015-07-14

    AFRL-OSR-VA-TR-2015-0176: A dedicated computational platform for Cellular Monte Carlo T-CAD software tools. Marco Saraniti, Arizona State University; grant FA9550-14... The fragmentary report abstract describes a computational platform with an optimized architecture for the Cellular Monte Carlo particle-based T-CAD simulation tools developed by the authors' group. Such code is used for the

  8. Governance, risk and compliance in BPM: A survey of software tools

    OpenAIRE

    Kötter, Falko; Kochanowski, Monika; Drawehn, Jens

    2015-01-01

    Governance, risk and compliance (GRC) are current research topics in business process management (BPM). However, the state of the art in research does not match the state of practice. In this work, we investigate the practice of GRC in BPM tools based on a survey of 14 software providers. By identifying commonly shared features and components, we determine the state of the art of GRC support in BPM tools. We found that software providers agree in their definitions of GRC. Today’s tools provide mature solutions...

  9. Hardware and software and machine-tool simulation with parallel structures mechanisms

    Directory of Open Access Journals (Sweden)

    Keba P.V.

    2016-12-01

    The usage spectrum of mechanisms with a parallel structure is spreading all the time. The mechanisms of machine tools and manipulators are becoming more complicated, and it is necessary to improve their program-controlled modules. Closed-circuit mechanisms are most widespread in robotic complexes, where a manipulator performs complicated spatial movements along a given trajectory. The range of uses is very wide, the most popular being sorting, welding, assembling and others. However, the problem of designing the operating programs is still present even today, because the available post-processors are created for existing equipment, while new machine tool constructions appear every day and there is a need to control them. The problems associated with using the hardware and software of mechanisms with a parallel structure in computer-aided simulation are considered. A program for solving the inverse kinematics problem is designed, and a new method of designing the control programs is presented. The kinematic analysis methods, options and calculated data obtained by computer mathematics systems are shown with the «Tools Glide» software taken as an example.
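
    For parallel mechanisms of this kind, the inverse kinematics problem mentioned above is comparatively simple: given a desired platform pose, each actuator length is the distance between its base joint and the transformed platform joint. The sketch below uses an invented Stewart-platform-style geometry, not the «Tools Glide» machine's dimensions.

    ```python
    import numpy as np

    def leg_lengths(base_pts, plat_pts, position, rpy):
        """Inverse kinematics of a 6-leg parallel mechanism: actuator
        lengths for a platform pose (position + roll/pitch/yaw)."""
        roll, pitch, yaw = rpy
        cr, sr = np.cos(roll), np.sin(roll)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        # Rotation matrix R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
        R = np.array([[cy*cp, cy*sp*sr - sy*cr, cy*sp*cr + sy*sr],
                      [sy*cp, sy*sp*sr + cy*cr, sy*sp*cr - cy*sr],
                      [-sp,   cp*sr,            cp*cr]])
        return np.linalg.norm(position + plat_pts @ R.T - base_pts, axis=1)

    # Invented joint layout: base and platform joints on two circles.
    ang = np.deg2rad([0, 60, 120, 180, 240, 300])
    base = np.column_stack([2.0*np.cos(ang), 2.0*np.sin(ang), np.zeros(6)])
    plat = np.column_stack([np.cos(ang), np.sin(ang), np.zeros(6)])
    pose = np.array([0.1, 0.0, 1.5])        # small lateral offset, 1.5 up
    print(np.round(leg_lengths(base, plat, pose, (0.0, 0.05, 0.0)), 3))
    ```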

  10. Analyst Tools and Quality Control Software for the ARM Data System

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Sean; Hughes, Gary

    2008-07-31

    Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed web-based data analysis and visualization tools such as the interactive plotting program NCVweb, various diagnostic plot browsers, and a datastream processing status application. These tools allow even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics, new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software to run in an automated fashion to flag these outliers. We have also embarked on a system to comprehensively generate long time-series plots, frequency distributions, and other relevant statistics for scientific and engineering data in most high-level, publicly available ARM data streams. Furthermore, frequency distributions categorized by month or by season are made available to help define valid data ranges specific to those time domains. These statistics can be used to set limits that when checked, will improve upon the reporting of suspicious data and the early detection of instrument malfunction. The statistics and proposed limits are stored in a database for easy reporting, refining, and for use by other processes. Web-based applications to view the results are also available.
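
    The month-specific valid ranges described above can be sketched in a few lines. The synthetic data and the 0.5/99.5 percentile choice below are assumptions for illustration, not the Data Quality Office's actual rules.

    ```python
    import numpy as np

    def monthly_limits(values, months, lo_pct=0.5, hi_pct=99.5):
        """Derive month-specific valid ranges from historical data."""
        return {m: (np.percentile(values[months == m], lo_pct),
                    np.percentile(values[months == m], hi_pct))
                for m in np.unique(months)}

    def flag_outliers(values, months, limits):
        """Boolean mask of samples falling outside their month's limits."""
        lo = np.array([limits[m][0] for m in months])
        hi = np.array([limits[m][1] for m in months])
        return (values < lo) | (values > hi)

    # Synthetic two years of daily temperature-like data with a seasonal cycle.
    rng = np.random.default_rng(1)
    months = np.tile(np.repeat(np.arange(1, 13), 30), 2)
    values = 15 + 10 * np.sin((months - 4) * np.pi / 6) + rng.normal(0, 2, months.size)

    limits = monthly_limits(values, months)
    print("flagged samples:", int(flag_outliers(values, months, limits).sum()))
    ```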

  11. RFcap: a software analysis tool for multichannel cochlear implant signals.

    Science.gov (United States)

    Lai, Wai Kong; Dillier, Norbert

    2013-03-01

    Being able to display and analyse the output of a speech processor that encodes the parameters of complex stimuli to be presented by a cochlear implant (CI) is useful for software and hardware development as well as for diagnostic purposes. This firstly requires appropriate hardware that is able to receive and decode the radio frequency (RF)-coded signals, and then processing the decoded data using suitable software. The PCI-IF6 clinical hardware for the Nucleus CI system, together with the Nucleus Implant Communicator and Nucleus Matlab Toolbox research software libraries, provide the necessary functionality. RFcap is a standalone Matlab application that encapsulates the relevant functions to capture, display, and analyse the RF-coded signals intended for the Nucleus CI24M/R, CI24RE, and CI500 multichannel CIs.

  12. C++ software quality in the ATLAS experiment: tools and experience

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00236968; The ATLAS collaboration; Kluth, Stefan; Seuster, Rolf; Snyder, Scott; Obreshkov, Emil; Roe, Shaun; Sherwood, Peter; Stewart, Graeme

    2017-01-01

    In this paper we explain how the C++ code quality is managed in ATLAS using a range of tools from compile-time through to run time testing and reflect on the substantial progress made in the last two years largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing with an example of how the GoogleTest framework can be applied to our codebase.

  13. Tool for Validation Software Projects in Programming Labs

    Directory of Open Access Journals (Sweden)

    Antonio J. Sierra

    2012-04-01

    Full Text Available This work presents a testing tool used in the Fundamentals of Programming II laboratory of the Telecommunication Technologies Engineering Degree at the University of Sevilla to check student projects. The tool allows students to test the proper operation of their project autonomously. It is flexible and useful for testing because it identifies whether the student has produced a project that meets the given specifications, which leads to a high success rate when students deliver their projects.

  14. C++ software quality in the ATLAS experiment: tools and experience

    Science.gov (United States)

    Martin-Haugh, S.; Kluth, S.; Seuster, R.; Snyder, S.; Obreshkov, E.; Roe, S.; Sherwood, P.; Stewart, G. A.

    2017-10-01

    In this paper we explain how the C++ code quality is managed in ATLAS using a range of tools from compile-time through to run time testing and reflect on the substantial progress made in the last two years largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing with an example of how the GoogleTest framework can be applied to our codebase.

  15. C++ Software Quality in the ATLAS Experiment: Tools and Experience

    CERN Document Server

    Kluth, Stefan; The ATLAS collaboration; Obreshkov, Emil; Roe, Shaun; Seuster, Rolf; Snyder, Scott; Stewart, Graeme

    2016-01-01

    The ATLAS experiment at CERN uses about six million lines of code and currently has about 420 developers whose background is largely from physics. In this paper we explain how the C++ code quality is managed using a range of tools from compile-time through to run time testing and reflect on the great progress made in the last year largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other tools including cppcheck, Include-What-You-Use and run-time 'sanitizers' are also discussed.

  16. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); LeCompte, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Marshall, Zach [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Borgland, Anders [SLAC National Accelerator Lab., Menlo Park, CA (United States); Viren, Brett [Brookhaven National Lab. (BNL), Upton, NY (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Asai, Makato [SLAC National Accelerator Lab., Menlo Park, CA (United States); Bauerdick, Lothar [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Gottlieb, Steve [Indiana Univ., Bloomington, IN (United States); Hoeche, Stefan [SLAC National Accelerator Lab., Menlo Park, CA (United States); Sheldon, Paul [Vanderbilt Univ., Nashville, TN (United States); Vay, Jean-Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Elmer, Peter [Princeton Univ., NJ (United States); Kirby, Michael [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Patton, Simon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Potekhin, Maxim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yanny, Brian [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Calafiura, Paolo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gutsche, Oliver [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Izubuchi, Taku [Brookhaven National Lab. (BNL), Upton, NY (United States); Lyon, Adam [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Petravick, Don [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA)

    2015-10-29

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  17. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  18. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
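
    The final fitting step of the protocol is a linear regression of the ratio Q against the applied laboratory field: the paleointensity estimate is the field at which the fitted line crosses Q = 0, and the y-intercept is compared against its theoretically prescribed value as a reliability check. A minimal Python sketch of that step (not MSP-Tool's VBA code; the field and ratio values are hypothetical):

      import numpy as np

      def msp_intensity(h_lab, q_ratio):
          """Fit Q = a*H + b; return the field where Q = 0 and the intercept.

          The x-intercept -b/a is the paleointensity estimate; the y-intercept
          b is what MSP-style reliability criteria check against theory.
          """
          a, b = np.polyfit(h_lab, q_ratio, 1)
          return -b / a, b

      # Hypothetical lab fields (microtesla) and their measured Q ratios.
      h = np.array([20.0, 40.0, 60.0, 80.0])
      q = np.array([-0.55, -0.12, 0.31, 0.78])
      intensity, intercept = msp_intensity(h, q)
      print(f"paleointensity ~ {intensity:.1f} uT, intercept {intercept:.2f}")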

  19. A Study of Collaborative Software Development Using Groupware Tools

    Science.gov (United States)

    Defranco-Tommarello, Joanna; Deek, Fadi P.

    2005-01-01

    The experimental results of a collaborative problem solving and program development model that takes into consideration the cognitive and social activities that occur during software development is presented in this paper. This collaborative model is based on the Dual Common Model that focuses on individual cognitive aspects of problem solving and…

  20. Software Tools For Large Scale Interactive Hydrodynamic Modeling

    NARCIS (Netherlands)

    Donchyts, G.; Baart, F.; van Dam, A; Jagers, B; van der Pijl, S.; Piasecki, M.

    2014-01-01

    Developing easy-to-use software that combines components for simultaneous visualization, simulation and interaction is a great challenge, mainly because it involves a number of disciplines, such as computational fluid dynamics, computer graphics and high-performance computing. One of the main

  1. The Development of an Analyses-Intensive Software for Improved ...

    African Journals Online (AJOL)

    The Development of an Analyses-Intensive Software for Improved Cams System Design.

  2. XML-Based Integration of the SMIILE Tool Prototype and Software Metrics Repository

    Science.gov (United States)

    Rakić, Gordana; Gerlec, Črt; Novak, Jernej; Budimac, Zoran

    2011-09-01

    An adequate system for collecting and analyzing software metric results is crucial for the successful interpretation of software characteristics during the software development process. Furthermore, the right interpretation can also identify poor quality in complex modules within software systems. However, there is a gap between raw metrics data and the right interpretation of them. In this paper, the integration of a language independent metric tool (SMIILE prototype [9]) for metrics data extraction and the software metrics repository for storing and analyzing extracted data will be described. With this integration we hope to overcome some of the difficulties that the application of software metrics has experienced in practice.

  3. 75 FR 30387 - Improving Market and Planning Efficiency Through Improved Software; Notice of Agenda and...

    Science.gov (United States)

    2010-06-01

    ... Federal Energy Regulatory Commission Improving Market and Planning Efficiency Through Improved Software... discuss issues related to power system expansion planning models and software. The technical conference... event is available through http://www.ferc.gov. Anyone with Internet access who desires to view this...

  4. Using generic "diagramming" software as a research tool

    OpenAIRE

    Shabajee, Paul

    1997-01-01

    This paper looks at ways in which generic 'diagramming/flowcharting' software can be used to support the research process, and shows how it can help with the organisation and processing of information. The uses described include graphical representation of literature-search results to show inter-relationships, flowcharting the research process to ensure clarity, and organising a contacts database graphically to enable faster access to data and to identify inter-relationships. The pape...

  5. Improving Performance of Software Implemented Floating Point Addition

    DEFF Research Database (Denmark)

    Hindborg, Andreas Erik; Karlsson, Sven

    2011-01-01

    We outline and evaluate hardware extensions to an integer processor pipeline which allow IEEE 754 floating point (FP) addition to be efficiently implemented in software. With a very moderate increase in hardware resources, our performance evaluation shows that, for a benchmark that executes 12.5% FP addition instructions, our approach exhibits a relative slowdown of 3.38 to 15.15 as compared to dedicated hardware. This is a significant improvement over pure software emulation, which leads to relative slowdowns of up to 45.33.
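
    Software-implemented floating-point addition reduces to three integer steps: align the exponents, add the significands, renormalize. A simplified Python sketch of those steps (positive operands only, truncation instead of IEEE rounding; an illustration of the general technique, not the paper's implementation):

      def fp_add(exp_a, sig_a, exp_b, sig_b, sig_bits=24):
          """Add two positive FP numbers given as (exponent, significand).

          Significands are integers with the leading 1 included, i.e.
          2**(sig_bits-1) <= sig < 2**sig_bits.
          """
          # Step 1: align exponents by shifting the smaller operand right.
          if exp_a < exp_b:
              exp_a, sig_a, exp_b, sig_b = exp_b, sig_b, exp_a, sig_a
          sig_b >>= exp_a - exp_b
          # Step 2: add the aligned significands.
          exp, sig = exp_a, sig_a + sig_b
          # Step 3: renormalize if the sum overflowed the significand width.
          if sig >> sig_bits:
              sig >>= 1
              exp += 1
          return exp, sig

      # 1.5 * 2**0 + 1.0 * 2**0 = 1.25 * 2**1
      print(fp_add(0, 0b110000000000000000000000, 0, 0b100000000000000000000000))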

  6. Availability Analysis and Improvement of Software Rejuvenation Using Virtualization

    Directory of Open Access Journals (Sweden)

    Jong Sou PARK

    2007-01-01

    Full Text Available Availability of business-critical application servers is an issue of paramount importance that has received special attention from industry and academia. To improve the availability of application servers, we have conducted a study of virtualization technology and software rejuvenation, which follows a proactive fault-tolerant approach to counteract the software aging problem. We present Markov models for analyzing availability in such continuously running applications and express availability, downtime and downtime costs during rejuvenation in terms of the parameters in the models. Our results show that our approach is a practical way to ensure uninterrupted availability and to optimize performance even for strongly aging applications.
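
    The availability analysis reduces to solving a continuous-time Markov chain for its steady state: with generator matrix Q, solve pi*Q = 0 subject to sum(pi) = 1, and availability is the probability mass on the up states. A minimal Python sketch with a hypothetical three-state rejuvenation model (healthy, aged, rejuvenating; all rates are assumed values, not the paper's):

      import numpy as np

      def steady_state_availability(Q, up_states):
          """Solve pi @ Q = 0 with sum(pi) = 1; return P(state in up_states)."""
          n = Q.shape[0]
          # Replace one balance equation by the normalization constraint.
          A = np.vstack([Q.T[:-1], np.ones(n)])
          b = np.zeros(n)
          b[-1] = 1.0
          pi = np.linalg.solve(A, b)
          return pi[list(up_states)].sum()

      # Rates per hour: aging 1/240, rejuvenation trigger 1/24,
      # rejuvenation completion 1/0.1 (six minutes of planned downtime).
      lam_age, lam_rej, mu = 1 / 240, 1 / 24, 1 / 0.1
      Q = np.array([[-lam_age, lam_age, 0.0],     # healthy -> aged
                    [0.0, -lam_rej, lam_rej],     # aged -> rejuvenating
                    [mu, 0.0, -mu]])              # rejuvenating -> healthy
      print(steady_state_availability(Q, up_states=[0, 1]))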

  7. STADIUM FLIR: a software tool for FLIR92 and ACQUIRE

    Science.gov (United States)

    Hess, Glenn T.; Sanders, Thomas J.

    2000-07-01

    FLIR92 and ACQUIRE have become the standard simulation models used in virtually all Forward Looking Infrared (FLIR) system design. Recently, a software program called STADIUM FLIR has been written for use with the U.S. Army's FLIR92 and ACQUIRE models. This software provides many performance and ease-of-use enhancements for the models. Some of these enhancements include graphical user interfaces for all model parameter entry, data extraction between FLIR92 and ACQUIRE, as well as comprehensive plotting of output curves. All data extraction and plotting is automatic and seamless. STADIUM FLIR is based on AET's STADIUM technology, which adds powerful Design of Experiments and statistical analysis capabilities to simulation environments. The results are presented both quantitatively and graphically. STADIUM FLIR provides comprehensive plotting capabilities for both raw data and 'overlaid' statistical variability data. STADIUM FLIR provides the power to perform multiple FLIR92 and ACQUIRE simulations with inputs (even multiple targets) varying over user-specified ranges. This paper describes the software and how it enhances the power of FLIR92 and ACQUIRE.

  8. Productivity, part 2: cloud storage, remote meeting tools, screencasting, speech recognition software, password managers, and online data backup.

    Science.gov (United States)

    Lackey, Amanda E; Pandey, Tarun; Moshiri, Mariam; Lalwani, Neeraj; Lall, Chandana; Bhargava, Puneet

    2014-06-01

    It is an opportune time for radiologists to focus on personal productivity. The ever increasing reliance on computers and the Internet has significantly changed the way we work. Myriad software applications are available to help us improve our personal efficiency. In this article, the authors discuss some tools that help improve collaboration and personal productivity, maximize e-learning, and protect valuable digital data. Published by Elsevier Inc.

  9. Toolkit of Available EPA Green Infrastructure Modeling Software: Watershed Management Optimization Support Tool (WMOST)

    Science.gov (United States)

    Watershed Management Optimization Support Tool (WMOST) is a software application designed to facilitate integrated water resources management across wet and dry climate regions. It allows water resources managers and planners to screen a wide range of practices across their watersh...

  10. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  11. Focus: Design and Evaluation of a Software Tool for Collecting Reader Feedback.

    Science.gov (United States)

    de Jong, Menno; Lentz, Leo

    2001-01-01

    Describes "Focus," a software tool for collecting reader comments more efficiently. Discusses the design and rationale of the software. Notes that results obtained using Focus were compared to the reader feedback collected under the plus-minus method. Concludes that Focus participants appeared to comment more from a reviewer's and less…

  12. Designing and Using Software Tools for Educational Purposes: FLAT, a Case Study

    Science.gov (United States)

    Castro-Schez, J. J.; del Castillo, E.; Hortolano, J.; Rodriguez, A.

    2009-01-01

    Educational software tools are considered to enrich teaching strategies, providing a more compelling means of exploration and feedback than traditional blackboard methods. Moreover, software simulators provide a more motivating link between theory and practice than pencil-paper methods, encouraging active and discovery learning in the students.…

  13. Slower Algebra Students Meet Faster Tools: Solving Algebra Word Problems with Graphing Software

    Science.gov (United States)

    Yerushalmy, Michal

    2006-01-01

    The article discusses the ways that less successful mathematics students used graphing software with capabilities similar to a basic graphing calculator to solve algebra problems in context. The study is based on interviewing students who learned algebra for 3 years in an environment where software tools were always present. We found differences…

  14. A Process Framework for Designing Software Reference Architectures for Providing Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali; Probst, Christian W.

    2016-01-01

    The complexity and size of certain types of software systems need customized and systematic SRA design and evaluation methods. In this paper, we present a software Reference Architecture Design process Framework (RADeF) that can be used for analysis, design and evaluation of the SRA for provisioning of Tools as a Service as part of a cloud-enabled workSPACE (TSPACE).

  15. A Guide to the Classification and Assessment of Software Engineering Tools

    Science.gov (United States)

    1987-09-01

    collection and data management; communications (within and between projects); quality assurance (reviews and audits)... Does the tool support the communication mechanisms of the methodology (such as a textual or graphical language) without alteration? Does the tool build in... STARS Joint Service Team for Software Engineering Environment, Preliminary System Specification, Software Technology for Adaptable Reliable Systems, 1985.

  16. A Process Framework for Designing Software Reference Architectures for Providing Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali; Probst, Christian W.

    2016-01-01

    Software Reference Architecture (SRA), which is a generic architecture solution for a specific type of software systems, provides foundation for the design of concrete architectures in terms of architecture design guidelines and architecture elements. The complexity and size of certain types of software systems need customized and systematic SRA design and evaluation methods. In this paper, we present a software Reference Architecture Design process Framework (RADeF) that can be used for analysis, design and evaluation of the SRA for provisioning of Tools as a Service as part of a cloud-enabled workSPACE (TSPACE). The framework is based on the state of the art results from literature and our experiences with designing software architectures for cloud-based systems. We have applied RADeF to the SRA design of two types of TSPACE: software architecting TSPACE and software implementation TSPACE...

  17. Software Tools | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    The CPTAC program develops new approaches to elucidate aspects of the molecular complexity of cancer made from large-scale proteogenomic datasets, and advance them toward precision medicine.  Part of the CPTAC mission is to make data and tools available and accessible to the greater research community to accelerate the discovery process.

  18. Integration of life cycle assessment software with tools for economic and sustainability analyses and process simulation for sustainable process design

    DEFF Research Database (Denmark)

    Kalakul, Sawitree; Malakul, Pomthong; Siemanond, Kitipat

    2014-01-01

    The sustainable future of the world challenges engineers to develop chemical process designs that are not only technically and economically feasible but also environmentally friendly. Life cycle assessment (LCA) is a tool for identifying and quantifying the environmental impacts of a chemical product and/or the process that makes it. It can be used in conjunction with process simulation and economic analysis tools to evaluate the design of any existing and/or new chemical-biochemical process and to propose improvement options in order to arrive at the best design among various alternatives. In this work, LCA software is integrated with other process design tools such as sustainable design (SustainPro), economic analysis (ECON) and process simulation. The software framework contains four main tools: Tool-I is for life cycle inventory (LCI) knowledge management that enables easy maintenance and future expansion of the LCI database; Tool...

  19. Crafting a Software Process Improvement Approach - A Retrospective Systematization

    DEFF Research Database (Denmark)

    Kuhrmann, Marco

    2015-01-01

    Structured approaches are beneficial for successful software process improvement (SPI). However, process engineers often struggle with standardized SPI methods, such as capability maturity model integration (CMMI) or International Organization for Standardization (ISO) 15504, and complain about too... the need to develop a new method for artifact-based SPI. In the process, we found that the construction procedures of SPI models are barely documented and, thus, their successful adaptation depends solely on the process engineers' expertise. With this article, we aim to address this lack of support and provide a structured reflection on our experiences from creating and adopting the Artifact-based Software Process Improvement & Management (ArSPI) model. We present the steps of the construction procedure, the validation, and the dissemination of the model. Furthermore, we detail the applied methods...

  20. APUAMA: a software tool for reaction rate calculations.

    Science.gov (United States)

    Euclides, Henrique O; P Barreto, Patricia R

    2017-06-01

    APUAMA is free software designed to determine the reaction rates and thermodynamic properties of the chemical species of a reagent system. Using data from electronic structure calculations, APUAMA determines the rate constant with tunneling corrections, such as Wigner, Eckart and small-curvature, and also includes the rovibrational levels of diatomic molecules. The reaction rates are presented in Arrhenius-Kooij form, and the thermodynamic properties are written in polynomial form. The word APUAMA means "fast" in the Tupi-Guarani Brazilian language, and accordingly the code calculates the reaction rate through a simple, intuitive graphical interface, in a fast and practical way. As output, the program writes several ASCII files with tabulated information on rate constants, rovibrational levels, energy barriers and enthalpies of reaction, and Arrhenius-Kooij coefficients, with the option for the user to save all graphics in BMP format.
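
    Two of the ingredients named above are standard formulas: the Wigner tunneling correction kappa(T) = 1 + (1/24)(h*nu/(kB*T))^2, with nu the magnitude of the imaginary barrier frequency, and the Arrhenius-Kooij form k(T) = A*(T/T0)^n*exp(-E/RT). A minimal Python sketch of both (not APUAMA's code; the barrier parameters are hypothetical):

      import numpy as np

      H = 6.62607015e-34    # Planck constant, J s
      KB = 1.380649e-23     # Boltzmann constant, J/K
      C_CM = 2.99792458e10  # speed of light, cm/s

      def wigner_correction(nu_im_cm1, T):
          """Wigner factor 1 + (1/24)*(h*nu/(kB*T))**2, nu given in cm^-1."""
          nu_hz = nu_im_cm1 * C_CM            # wavenumber -> frequency
          return 1.0 + (H * nu_hz / (KB * T)) ** 2 / 24.0

      def arrhenius_kooij(T, A, n, E, T0=300.0, R=8.314462618):
          """k(T) = A * (T/T0)**n * exp(-E/(R*T)), with E in J/mol."""
          return A * (T / T0) ** n * np.exp(-E / (R * T))

      # Hypothetical barrier: 1200i cm^-1 frequency, 40 kJ/mol barrier height.
      for T in (300.0, 500.0, 1000.0):
          k = wigner_correction(1200.0, T) * arrhenius_kooij(T, 1e12, 1.0, 4.0e4)
          print(f"T = {T:6.1f} K   k = {k:.3e}")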

  1. JULIDE: a software tool for 3D reconstruction and statistical analysis of autoradiographic mouse brain sections.

    Directory of Open Access Journals (Sweden)

    Delphine Ribes

    Full Text Available In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.

  2. The Web Interface Template System (WITS), a software developer's tool

    Energy Technology Data Exchange (ETDEWEB)

    Lauer, L.J.; Lynam, M.; Muniz, T. [Sandia National Labs., Albuquerque, NM (United States). Financial Systems Dept.

    1995-11-01

    The Web Interface Template System (WITS) is a tool for software developers. WITS is a three-tiered, object-oriented system operating in a Client/Server environment. This tool can be used to create software applications that have a Web browser as the user interface and access a Sybase database. Development, modification, and implementation are greatly simplified because the developer can change and test definitions immediately, without writing or compiling any code. This document explains WITS functionality, the system structure and components of WITS, and how to obtain, install, and use the software system.

  3. Software Tools for Design of Reagents for Multiplex Genetic Analyses

    OpenAIRE

    Stenberg, Johan

    2006-01-01

    Methods using oligonucleotide probes are powerful tools for the analysis of nucleic acids. During recent years, many such methods have been developed that enable the simultaneous interrogation of multiple qualities of a sample. Many of these multiplexing techniques share common limitations. This thesis discusses new developments to overcome the problems of multiplex amplification of genomic sequences and design of sets of oligonucleotide probes for multiplex genetic analyses. A novel molecula...

  4. Fighting software piracy: which governance tools matter in Africa?

    OpenAIRE

    Simplice A. Asongu; Andrés, Antonio R.

    2012-01-01

    This article integrates previously missing components of government quality into the governance-piracy nexus in exploring governance mechanisms by which global obligations for the treatment of IPRs are effectively transmitted from international to the national level in the battle against piracy. It assesses the best governance tools in the fight against piracy and upholding of Intellectual Property Rights (IPRs). The instrumentality of IPR laws (treaties) in tackling piracy through good gover...

  5. Using Software Development Tools and Practices in Acquisition

    Science.gov (United States)

    2013-12-01

    extractors to create stand-alone documentation from pre-formatted comments within the code; static analysis enforcing the use of code constructs... a simpler, working baseline version aids in identifying defect root causes. The key objective of this process is to make sure the interfaces that... This class of tools enables root cause analysis, which is a key activity that requires the analytical capabilities and the technical skills of all...

  6. Improving Software Performance in the Compute Unified Device Architecture

    OpenAIRE

    Alexandru PIRJAN

    2010-01-01

    This paper analyzes several aspects regarding the improvement of software performance for applications written in the Compute Unified Device Architecture (CUDA). We address an issue of great importance when programming a CUDA application: the Graphics Processing Unit's (GPU's) memory management through transpose kernels. We also benchmark and evaluate the performance for progressively optimizing a transposing matrix application in CUDA. One particular interest was to research how well the op...

  7. NGS++: a library for rapid prototyping of epigenomics software tools.

    Science.gov (United States)

    Nordell Markovits, Alexei; Joly Beauparlant, Charles; Toupin, Dominique; Wang, Shengrui; Droit, Arnaud; Gevry, Nicolas

    2013-08-01

    The development of computational tools to enable testing and analysis of high-throughput sequencing data is essential to modern genomics research. However, although multiple frameworks have been developed to facilitate access to these tools, comparatively little effort has been made at implementing low-level programming libraries to increase the speed and ease of their development. We propose NGS++, a programming library in C++11 specialized in manipulating both next-generation sequencing (NGS) datasets and genomic information files. This library allows easy integration of new formats and rapid prototyping of new functionalities, with a focus on the analysis of genomic regions and features. It offers a powerful, yet versatile and easily extensible interface to read, write and manipulate multiple genomic file formats. By standardizing the internal data structures and presenting a common interface to the data parser, NGS++ offers an effective framework for epigenomics tool development. NGS++ was written in C++ using the C++11 standard. It requires minimal effort to build and is well-documented via a complete Doxygen guide, online documentation and tutorials. Source code, tests, code examples and documentation are available via the website at http://www.ngsplusplus.ca and the github repository at https://github.com/NGS-lib/NGSplusplus. nicolas.gevry@usherbrooke.ca or arnaud.droit@crchuq.ulaval.ca.

  8. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    Science.gov (United States)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  9. Plans for performance and model improvements in the LISE++ software

    Science.gov (United States)

    Kuchera, M. P.; Tarasov, O. B.; Bazin, D.; Sherrill, B. M.; Tarasova, K. V.

    2016-06-01

    The LISE++ software for fragment-separator simulations is undergoing a major update. LISE++ is the standard software used at in-flight separator facilities for predicting beam intensity and purity. The code simulates nuclear physics experiments in which fragments are produced and then selected with a fragment separator. A set of modifications to improve the functionality of the code is discussed in this work. These modifications include migration to a modern graphics framework and updated compilers, to aid the performance and sustainability of the code. To accommodate the diversity of our users' computer platform preferences, we are extending the software from Windows to a cross-platform application. The calculations of beam transport and isotope production are becoming more computationally intense at the new large-scale facilities. Planned new features include new types of optimization (for example, optimization of ion optics), improvements in reaction models, and new event-generator options. In addition, interfaces between LISE++ and control systems are planned. Computational improvements as well as the schedule for updating this large package will be discussed.

  10. A Tool to Enhance Cooperation and Knowledge Transfer among Software Developers

    Science.gov (United States)

    Aydin, Seçil; Mishra, Deepti

    Software developers have been successfully tailoring software development methods to the project situation, particularly in small-scale software development organizations. There is a need to share this knowledge with other developers who may be facing the same project situation, so that they can benefit from others' experiences. In this paper, an approach to enhance cooperation among software developers by sharing knowledge that was used successfully in past projects is proposed. A web-based tool is developed that assists in the creation, storage and extraction of methods related to the requirements elicitation phase. These methods are categorized according to certain criteria, which helps in searching for the method that will be most appropriate in a given project situation. This approach and tool can also be used for other software development activities.

  11. Computer-aided design in power engineering. Application of software tools

    Energy Technology Data Exchange (ETDEWEB)

    Stojkovic, Zlatan

    2012-07-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems in the design of power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding through complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power systems equipment design; several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools to project management in power systems is discussed; here, the emphasis is put on the standard software MS Excel and MS Project.

  12. Development of a software tool for an internal dosimetry using MIRD method

    Science.gov (United States)

    Chaichana, A.; Tocharoenchai, C.

    2016-03-01

    Currently, many software packages for internal radiation dosimetry have been developed. Many of them, however, do not provide sufficient tools to perform all of the necessary steps from nuclear medicine image analysis to dose calculation. For this reason, we developed the CALRADDOSE software, which performs internal dosimetry using the MIRD method within a single environment. MATLAB version 2015a was used as the development tool. The calculation process of this software proceeds from collecting time-activity data from image data, followed by residence time calculation and absorbed dose calculation using the MIRD method. To evaluate the accuracy of this software, we calculated the residence times and absorbed doses of 5 Ga-67 studies and 5 I-131 MIBG studies and compared the results with those obtained from the OLINDA/EXM software. The results showed no statistically significant differences between the two software packages. CALRADDOSE is a user-friendly, graphical user interface-based software for internal dosimetry. It provides fast and accurate results, which may be useful for routine work.
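
    In the MIRD schema the absorbed dose is D = A_tilde * S, where the cumulated activity A_tilde is the integral of the organ's time-activity curve (often reported as the residence time tau = A_tilde / A0) and S is a tabulated dose factor. A minimal Python sketch of that pipeline (not CALRADDOSE itself; the time-activity data and the S value are hypothetical), assuming a mono-exponential fit:

      import numpy as np
      from scipy.optimize import curve_fit

      def mono_exp(t, a0, lam):
          return a0 * np.exp(-lam * t)

      def residence_time(times_h, activities_mbq, injected_mbq):
          """Fit A(t) = A0*exp(-lam*t); tau = (A0/lam) / injected activity."""
          (a0, lam), _ = curve_fit(mono_exp, times_h, activities_mbq,
                                   p0=(activities_mbq[0], 0.01))
          cumulated = a0 / lam                # integral from 0 to inf, MBq*h
          return cumulated / injected_mbq     # residence time in hours

      # Hypothetical organ time-activity data after a 200 MBq administration.
      t = np.array([1.0, 4.0, 24.0, 48.0, 96.0])
      a = np.array([50.0, 46.0, 30.0, 18.0, 6.5])
      tau = residence_time(t, a, injected_mbq=200.0)
      S = 1.2e-4                              # hypothetical S factor, mGy/(MBq*h)
      print(f"tau = {tau:.2f} h, absorbed dose = {tau * 200.0 * S:.3f} mGy")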

  13. Metric-based method of software requirements correctness improvement

    Directory of Open Access Journals (Sweden)

    Yaremchuk Svitlana

    2017-01-01

    Full Text Available The work highlights the most important principles of software reliability management (SRM). The SRM concept constitutes the basis for developing a method of requirements-correctness improvement. The method assumes that complicated requirements contain more actual and potential design faults/defects. It applies a new metric to evaluate requirements complexity and a double-sorting technique that evaluates the priority and complexity of each particular requirement. The method improves requirements correctness by identifying a higher number of defects with restricted resources. Practical application of the proposed method in the course of requirements review produced a tangible technical and economic effect.
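
    The double-sorting idea (order requirements by priority and, within equal priority, by the complexity metric, so that review effort is spent where defects are most likely) fits in a few lines. A minimal Python sketch with hypothetical field names:

      # Hypothetical requirement records: higher priority and higher complexity
      # should be reviewed first, since complex requirements hide more defects.
      requirements = [
          {"id": "R1", "priority": 2, "complexity": 7.5},
          {"id": "R2", "priority": 1, "complexity": 9.1},
          {"id": "R3", "priority": 1, "complexity": 3.2},
          {"id": "R4", "priority": 2, "complexity": 4.0},
      ]

      # Double sort: primary key priority, secondary key complexity, descending.
      review_order = sorted(requirements,
                            key=lambda r: (r["priority"], r["complexity"]),
                            reverse=True)
      print([r["id"] for r in review_order])  # ['R1', 'R4', 'R2', 'R3']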

  14. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  15. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Full Text Available Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  16. SAGES: A Suite of Freely-Available Software Tools for Electronic Disease Surveillance in Resource-Limited Settings

    Science.gov (United States)

    Lewis, Sheri L.; Feighner, Brian H.; Loschen, Wayne A.; Wojcik, Richard A.; Skora, Joseph F.; Coberly, Jacqueline S.; Blazes, David L.

    2011-01-01

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations. PMID:21572957

  17. A free software tool for the development of decision support systems

    Directory of Open Access Journals (Sweden)

    COLONESE, G

    2008-06-01

    Full Text Available This article describes PostGeoOlap, a free, open-source software tool for decision support that integrates OLAP (On-Line Analytical Processing) and GIS (Geographical Information Systems). Besides describing the tool, we show how it can be used to achieve effective and low-cost decision support that is adequate for small and medium companies and for small public offices.

  18. D-VASim: A Software Tool to Simulate and Analyze Genetic Logic Circuits

    DEFF Research Database (Denmark)

    Baig, Hasan; Madsen, Jan

    2016-01-01

    -stage researchers with limited experience in the field of biology. The Solution: Using LabVIEW to develop a user-friendly simulation tool named Dynamic Virtual Analyzer and Simulator (D-VASim), which is the first software tool in the domain of synthetic biology that provides a virtual laboratory environment...

  19. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    Science.gov (United States)

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvement of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging, and various decision support models are therefore employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for the ranking of BPSS tools based on their technical characteristics, employing DEX and the qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results.
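
    DEX aggregates qualitative values through expert-defined utility tables, but the overall shape of such a hierarchical model (leaf criteria scored per tool, inner nodes aggregating their children, tools ranked by the root score) can be illustrated generically. A minimal Python sketch in which weighted averaging stands in for DEX's rule-based aggregation; the criteria names and weights are invented:

      # Hypothetical criteria tree: each inner node aggregates its children.
      TREE = {
          "overall": {"visual": 0.3, "simulation": 0.5, "reporting": 0.2},
          "simulation": {"statistics": 0.6, "scenarios": 0.4},
      }

      def score(node, leaf_scores):
          """Recursively aggregate leaf scores (0..1) up the criteria tree."""
          if node not in TREE:                # leaf criterion
              return leaf_scores[node]
          return sum(w * score(child, leaf_scores)
                     for child, w in TREE[node].items())

      tools = {
          "ToolA": {"visual": 0.9, "statistics": 0.4,
                    "scenarios": 0.7, "reporting": 0.6},
          "ToolB": {"visual": 0.5, "statistics": 0.8,
                    "scenarios": 0.9, "reporting": 0.7},
      }
      ranking = sorted(tools, key=lambda t: score("overall", tools[t]),
                       reverse=True)
      print(ranking)                          # ['ToolB', 'ToolA']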

  20. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    DEFF Research Database (Denmark)

    Jauregui-Becker, Juan M.; Tosello, Guido; van Houten, Fred J.A.M.

    2013-01-01

    This paper presents a software tool for automating the design of cooling systems for injection moulding and a validation of its performance. Cooling system designs were automatically generated by the proposed software tool and by applying a best-practice tool engineering design approach. The two different design methods (i.e. automatic and manual) were applied to the mould design of two thin-walled products, namely a rectangular flat box and a cylindrical container with a flat base. Injection moulding process simulations based on the finite element method were performed to assess the quality...

  1. Improving Software Performance in the Compute Unified Device Architecture

    Directory of Open Access Journals (Sweden)

    Alexandru PIRJAN

    2010-01-01

    Full Text Available This paper analyzes several aspects regarding the improvement of software performance for applications written in the Compute Unified Device Architecture (CUDA). We address an issue of great importance when programming a CUDA application: the Graphics Processing Unit's (GPU's) memory management through transpose kernels. We also benchmark and evaluate the performance for progressively optimizing a transposing matrix application in CUDA. One particular interest was to research how well the optimization techniques, applied to a software application written in CUDA, scale to the latest generation of general-purpose graphics processing units (GPGPU), like the Fermi architecture implemented in the GTX480 and the previous architecture implemented in the GTX280. Lately, there has been a lot of interest in the literature for this type of optimization analysis, but none of the works so far (to our best knowledge) tried to validate whether the optimizations apply to a GPU from the latest Fermi architecture and how well the Fermi architecture scales to these software performance improving techniques.
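
    The transpose optimization benchmarked in the paper stages tiles of the matrix through fast shared memory so that both global-memory reads and writes are coalesced. The underlying locality idea is loop blocking, which a short sketch can show in plain Python (illustrative only; a real CUDA kernel would map the tile onto shared memory and a thread block):

      import numpy as np

      def transpose_blocked(a, tile=32):
          """Transpose by square tiles to improve memory locality.

          Mirrors the CUDA shared-memory pattern: read one tile, transpose
          it, and write it contiguously at the mirrored position.
          """
          rows, cols = a.shape
          out = np.empty((cols, rows), dtype=a.dtype)
          for i in range(0, rows, tile):
              for j in range(0, cols, tile):
                  block = a[i:i + tile, j:j + tile]
                  out[j:j + tile, i:i + tile] = block.T
          return out

      a = np.arange(12).reshape(3, 4)
      assert (transpose_blocked(a, tile=2) == a.T).all()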

  2. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.

  3. Incremental Method Enactment for Computer Aided Software Engineering Tools

    NARCIS (Netherlands)

    Vlaanderen, K.; Tuijl, G.J. van; Brinkkemper, S.; Jansen, Slinger

    2013-01-01

    In most cases, enactment is the most resource-consuming aspect of process improvement, as large process changes are put into practice. Problems that are typically encountered include ineffective process changes, resistance from employees, and a lack of clarity about the advantages of the new process.

  4. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

    Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, and management of artifacts developed... distributed environment. In this paper, we argue the need for a cloud-enabled platform for supporting GSD and propose a reference architecture of a cloud-based Platform for provisioning an ecosystem of Tools as a Service (PTaaS).

  5. Study of Software Tools to Support Systems Engineering Management

    Science.gov (United States)

    2015-06-01

    to explore the key components of systems engineering management. Systems engineering teaches that before a solution can be developed, the underlying... enhanced knowledge capture; improved ability to teach and learn systems engineering fundamentals. Management is explicitly identified as a... flag requirements that utilize certain keywords or have a particular structure known to be an indicator of bad requirements, similar to a grammar...

  6. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    Full Text Available In this work, the DAE Tools modelling, simulation and optimisation software, its programming paradigms and its main features are presented. The current approaches to mathematical modelling, such as the use of modelling languages and general-purpose programming languages, are analysed. The common set of capabilities required by typical simulation software is discussed, and the shortcomings of the current approaches are recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for runtime model generation; (2) support for runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third-party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model-exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth-generation object-oriented general-purpose programming language such as Python, are discussed. The architecture and the software implementation details, as well as the type of problems that can be solved using DAE Tools software, are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as software as a service are demonstrated.

  7. Improved molecular tools for sugar cane biotechnology.

    Science.gov (United States)

    Kinkema, Mark; Geijskes, Jason; Delucca, Paulo; Palupe, Anthony; Shand, Kylie; Coleman, Heather D; Brinin, Anthony; Williams, Brett; Sainz, Manuel; Dale, James L

    2014-03-01

    Sugar cane is a major source of food and fuel worldwide. Biotechnology has the potential to improve economically-important traits in sugar cane as well as diversify sugar cane beyond traditional applications such as sucrose production. High levels of transgene expression are key to the success of improving crops through biotechnology. Here we describe new molecular tools that both expand and improve gene expression capabilities in sugar cane. We have identified promoters that can be used to drive high levels of gene expression in the leaf and stem of transgenic sugar cane. One of these promoters, derived from the Cestrum yellow leaf curling virus, drives levels of constitutive transgene expression that are significantly higher than those achieved by the historical benchmark maize polyubiquitin-1 (Zm-Ubi1) promoter. A second promoter, the maize phosphonenolpyruvate carboxylate promoter, was found to be a strong, leaf-preferred promoter that enables levels of expression comparable to Zm-Ubi1 in this organ. Transgene expression was increased approximately 50-fold by gene modification, which included optimising the codon usage of the coding sequence to better suit sugar cane. We also describe a novel dual transcriptional enhancer that increased gene expression from different promoters, boosting expression from Zm-Ubi1 over eightfold. These molecular tools will be extremely valuable for the improvement of sugar cane through biotechnology.

  8. An improved financial tool to replace BHT

    CERN Multimedia

    2002-01-01

    In November, the BHT tool used to control financial data will be replaced by an improved and more powerful system, called CET for CERN Expenditure Tracking. The team in charge of CET. From left to right, sitting, Martyn Rankin, David McGlashan, standing, Per Gunnar Jonsson, James Purvis and Mikael Angberg. After 10 years of BHT at CERN, in order to face the challenges of the LHC era, and following the recommendations of the External Review Committee, the BHT application will be replaced by an improved and more powerful expenditure tracking tool called CET for CERN Expenditure Tracking. For those who are not familiar with it, the BHT, Budget Holders Toolkit, is a utility that provides a way to view CERN financial data. It is available for users who have access to at least one budget code. The new CET represents a tool that not only allows powerful analysis of the past, but also assists in forecasting the future. CET will offer significantly more functionality than BHT, including extended contract analysis, ...

  9. FREE OPEN SOURCE AND SOFTWARE IN THE TEACHING OF CAT TOOLS: OMEGAT, A VIABLE ALTERNATIVE

    Directory of Open Access Journals (Sweden)

    Adauto Lúcio Caetano Villela

    2016-12-01

    Full Text Available The teaching of Computer-Aided Translation (CAT) tools is essential to all courses aimed at preparing students for the effective exercise of the translation profession, in particular those students who will translate technical texts, especially in the field of localization. As alternatives to the paid commercial software packages that dominate the translation industry (such as SDL Trados Studio, Wordfast Pro and MemoQ), there are free proprietary tools (such as Wordfast Anywhere and Google Translator Toolkit) and free open-source ones (such as OmegaT and Anaphraseus). Starting from a description of CAT tool types and main functions, the purpose of this article is to point out, through an evaluative-comparative analysis based on research and on a comparison between OmegaT, free proprietary tools and other open-source tools, why and in which situations OmegaT is a viable alternative for the teaching of CAT tools in higher education.

  10. Improving Organizational Learning: Defining Units of Learning from Social Tools

    Science.gov (United States)

    Menolli, André Luís Andrade; Reinehr, Sheila; Malucelli, Andreia

    2013-01-01

    New technologies, such as social networks, wikis, blogs and other social tools, enable collaborative work and are important facilitators of the social learning process. Many companies are using these types of tools as substitutes for their intranets, especially software development companies. However, the content generated by these tools in many…

  11. A tool to include gamma analysis software into a quality assurance program.

    Science.gov (United States)

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance-to-agreement (DTA) and dose-difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages, encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
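
    For reference, the gamma metric named above combines the dose difference (DD) and distance to agreement (DTA) criteria: at each reference point, gamma is the minimum over evaluated points of sqrt((distance/DTA)^2 + (dose difference/DD)^2), and a point passes when gamma <= 1. The following 1-D Python sketch illustrates the calculation under those assumptions; the profiles and criteria are invented, and this is not the code of any package evaluated in the study.

      import numpy as np

      def gamma_pass_rate(ref_dose, eval_dose, positions, dd=0.03, dta=3.0):
          # dd: dose-difference criterion as a fraction of the global maximum dose
          # dta: distance-to-agreement criterion in mm (global gamma analysis)
          dd_abs = dd * ref_dose.max()
          gammas = []
          for x_r, d_r in zip(positions, ref_dose):
              # gamma at a reference point: minimum combined metric over all
              # evaluated points
              dist_term = ((positions - x_r) / dta) ** 2
              dose_term = ((eval_dose - d_r) / dd_abs) ** 2
              gammas.append(np.sqrt(dist_term + dose_term).min())
          return 100.0 * np.mean(np.asarray(gammas) <= 1.0)

      x = np.linspace(-10.0, 10.0, 201)                # positions in mm
      ref = 100 * np.exp(-x ** 2 / 20)                 # reference profile
      ev = 100 * np.exp(-(x - 0.5) ** 2 / 20)          # evaluated profile, 0.5 mm shift
      print(f"pass rate at 3%/3 mm: {gamma_pass_rate(ref, ev, x):.1f}%")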

  12. MyETL: A Java Software Tool to Extract, Transform, and Load Your Business

    Directory of Open Access Journals (Sweden)

    Michele Nuovo

    2015-12-01

    Full Text Available The project follows the development of a Java software tool that extracts data from Flat File (Fixed Length Record Type), CSV (Comma Separated Values) and XLS (Microsoft Excel 97-2003 Worksheet) files, applies transformations to those sources, and finally loads the data into the end-target RDBMS. The software implements a process known as ETL (Extract, Transform and Load); systems of this kind are called ETL systems.
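
    MyETL itself is written in Java; the Python sketch below illustrates only the generic extract-transform-load pattern described here, with a CSV source, a cleaning step and an SQLite target (the file name, column names and table are hypothetical).

      import csv
      import sqlite3

      def extract(path):
          # Extract: stream records from a CSV source file
          with open(path, newline="") as f:
              yield from csv.DictReader(f)

      def transform(rows):
          # Transform: clean and type-convert fields before loading
          for row in rows:
              yield (row["name"].strip().title(), float(row["amount"]))

      def load(rows, db="target.db"):
          # Load: insert the transformed rows into the end-target RDBMS
          con = sqlite3.connect(db)
          con.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
          con.executemany("INSERT INTO sales VALUES (?, ?)", rows)
          con.commit()
          con.close()

      load(transform(extract("sales.csv")))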

  13. A Software Tool for Optimal Sizing of PV Systems in Malaysia

    OpenAIRE

    Tamer Khatib; Azah Mohamed; K. Sopian

    2012-01-01

    This paper presents a MATLAB-based, user-friendly software tool called PV.MY for optimal sizing of photovoltaic (PV) systems. The software has the capability of predicting meteorological variables such as solar energy, ambient temperature and wind speed using an artificial neural network (ANN); it optimizes the PV module/array tilt angle, optimizes the inverter size and calculates optimal capacities of the PV array, battery, wind turbine and diesel generator in hybrid PV systems. The ANN-based mo...
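
    PV.MY's sizing is driven by ANN-predicted meteorological data; for orientation, the deterministic baseline relations often used for a first-pass PV/battery sizing look like the sketch below (formulas and numbers are generic illustrations, not PV.MY's method).

      def size_pv_system(daily_load_kwh, peak_sun_hours, system_eff=0.8,
                         autonomy_days=2, battery_dod=0.6, battery_voltage=48):
          # Required array power: daily load over usable sun hours and losses
          pv_kw = daily_load_kwh / (peak_sun_hours * system_eff)
          # Battery bank sized for the autonomy period at a given depth of discharge
          battery_kwh = daily_load_kwh * autonomy_days / battery_dod
          battery_ah = battery_kwh * 1000 / battery_voltage
          return pv_kw, battery_kwh, battery_ah

      pv, bat_kwh, bat_ah = size_pv_system(daily_load_kwh=10, peak_sun_hours=5.0)
      print(f"array: {pv:.1f} kWp, battery: {bat_kwh:.0f} kWh ({bat_ah:.0f} Ah @ 48 V)")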

  14. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab (R)-based implementation is presented, and special features motivated by industrial users are introduced. Salient features of the tool are presented, including the ability to specify the behavior of a comple...

  15. A Process Framework for Designing Software Reference Architectures for Providing Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali; Probst, Christian W.

    2016-01-01

    Software Reference Architecture (SRA), which is a generic architecture solution for a specific type of software systems, provides the foundation for the design of concrete architectures in terms of architecture design guidelines and architecture elements. The complexity and size of certain types of software systems need customized and systematic SRA design and evaluation methods. In this paper, we present a software Reference Architecture Design process Framework (RADeF) that can be used for analysis, design and evaluation of the SRA for provisioning of Tools as a Service as part of a cloud ... to be provisioned and underlying cloud platforms to be used while designing the SRA. The framework recommends adoption of a multi-faceted approach for evaluation of the SRA and a quantifiable measurement scheme to evaluate the quality of the SRA. We foresee that RADeF can facilitate software architects and researchers during ...

  16. A Reference Architecture for Providing Tools as a Service to Support Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef

    2014-01-01

    Global Software Development (GSD) teams encounter challenges that are associated with distribution of software development activities across multiple geographic regions. The limited support for performing collaborative development and engineering activities and the lack of sufficient support ... -based solutions. The restricted ability of the organizations to have the desired alignment of tools with software engineering and development processes results in administrative and managerial overhead that incurs increased development cost and poor product quality. Moreover, stakeholders involved in the projects have specific constraints regarding availability and deployment of the tools. The artifacts and data produced or consumed by the tools need to be governed according to these constraints and corresponding quality of service (QoS) parameters. In this paper, we present a research agenda to leverage cloud ...

  17. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well-designed process monitoring and analysis system is necessary to consistently achieve any predefined end-product quality. Systematic computer-aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. A software tool to achieve this has been developed. Two supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods and tools) and a model library (consisting of the process operational models), have been extended rigorously and integrated with the user interface, which makes the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  18. Managing the Testing Process Practical Tools and Techniques for Managing Hardware and Software Testing

    CERN Document Server

    Black, Rex

    2011-01-01

    New edition of one of the most influential books on managing software and hardware testing. In this new edition of his top-selling book, Rex Black walks you through the steps necessary to manage rigorous testing programs of hardware and software. The preeminent expert in his field, Mr. Black draws upon years of experience as president of both the International and American Software Testing Qualifications Boards to offer this extensive resource of all the standards, methods, and tools you'll need. The book covers core testing concepts and thoroughly examines the best test management practices.

  19. Learning Tools and Applications for Cognitive Improvement

    Directory of Open Access Journals (Sweden)

    Athanasios Drigas

    2014-06-01

    Full Text Available Learning technologies are an indispensable tool for students' cognitive improvement and assessment. ICTs, in coordination with a concrete pedagogical framework, may provide students and teachers with flexible, engaging, cost-effective and, above all, personalized learning experiences, which focus on the adoption of 21st-century cognitive skills into the actual learning process. Such higher-order thinking skills (HOTS) entail critical thinking, problem solving, independent inquiry, creativity, communication, collaboration and digital literacy. Therefore, technologically supported educational environments aim at self-regulated, inquisitive, constructivist knowledge building rather than knowledge accumulation.

  20. Software for improving the quality of project management, a case study: international manufacture of electrical equipment

    Science.gov (United States)

    Preradović, D. M.; Mićić, Lj S.; Barz, C.

    2017-05-01

    Production conditions in today's world require software support at every stage of production and development of new products, for quality assurance and compliance with ISO standards. In addition to ISO standards as the usual metrics of quality, companies today focus on other optional standards, such as CMMI (Capability Maturity Model Integration), or prescribe their own standards. However, while intensive progress is being made in project management (PM), there is still a significant number of projects, at the global level, that fail to achieve their goals within budget or timeframe. This paper focuses on checking the role of software tools through the rate of success in projects implemented in the case of an international manufacturer of electrical equipment. The results of this research show the level of contribution of the project management software used to manage and develop new products to improving PM processes and PM functions, and how the selection of software tools affects the quality of PM processes and successfully completed projects.

  1. Software tools and surgical guides in dental-implant-guided surgery.

    Science.gov (United States)

    Mora, Maria A; Chenin, Douglas L; Arce, Roger M

    2014-07-01

    Cone beam computed tomography has become an essential tool in the diagnosis and planning for implant dentistry. New hardware and software developments have emerged to help implant surgeons to successfully adopt and use different systems in patients requiring prosthetically driven implant dentistry. However, there is the need to develop an adequate planning protocol that includes appropriate acquisition/data manipulation, appropriate use of software tools for interpretation, and appropriate application of such systems during implant surgery. This article examines essential characteristics of the entire implant-guided surgery planning process and points out potential sources of error that could affect clinical accuracy outcomes. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Proteomics: A Biotechnology Tool for Crop Improvement

    Directory of Open Access Journals (Sweden)

    Moustafa eEldakak

    2013-02-01

    Full Text Available A sharp decline in the availability of arable land and a sufficient supply of irrigation water, along with a continuous steep increase in food demand, have exerted pressure on farmers to produce more with fewer resources. A viable solution to release this pressure is to speed up the plant breeding process by employing biotechnology in breeding programs. The majority of biotechnological applications rely on information generated from various -omic technologies. The latest improvements in proteomic platforms, and many related advances in plant biotechnology techniques, offer various new ways to encourage the use of these technologies by plant scientists in crop improvement programs. A combinatorial approach of accelerated gene discovery through genomics, proteomics and other associated -omic branches of biotechnology is proving to be an effective way to speed up crop improvement programs worldwide. In the near future, swift improvements in -omic databases are becoming critical and demand immediate attention for the effective utilization of these techniques to produce next-generation crops for progressive farmers. Here, we have reviewed the recent advances in proteomics, as tools of biotechnology, which offer great promise and lead the path towards crop improvement for sustainable agriculture.

  3. Online survey software as a data collection tool for medical education: A case study on lesson plan assessment

    Science.gov (United States)

    Kimiafar, Khalil; Sarbaz, Masoumeh; Sheikhtaheri, Abbas

    2016-01-01

    Background: There are no general strategies or tools to evaluate daily lesson plans; assessments conducted using traditional methods usually cover only course plans. This study aimed to evaluate the strengths and weaknesses of online survey software in collecting data on education in medical fields and the application of such software to evaluate students' views and to modify lesson plans. Methods: After investigating the available online survey software, esurveypro was selected for assessing daily lesson plans. After using the software for one semester, a questionnaire was prepared to assess the advantages and disadvantages of this method and students' views in a cross-sectional study. Results: The majority of the students (51.7%) rated the evaluation of classes per session (lesson plans) using the online survey as useful or very useful. About 51% (n=36) of the students considered this method effective in improving the management of each session, 67.1% (n=47) considered it effective in improving the management of sessions for the next semester, and 51.4% (n=36) said it had a high impact on improving the educational content of subsequent sessions. Finally, 61.4% (n=43) of students expressed high and very high levels of satisfaction with using an online survey at each session. Conclusion: The use of online surveys may be appropriate to improve lesson plans and educational planning at different levels. This method can be used for other evaluations and for assessing people's opinions at different levels of an educational system. PMID:28491839

  4. Evaluation of Agricultural Accounting Software. Improved Decision Making. Third Edition.

    Science.gov (United States)

    Lovell, Ashley C., Comp.

    Following a discussion of the evaluation criteria for choosing accounting software, this guide contains reviews of 27 accounting software programs that could be used by farm or ranch business managers. The information in the reviews was provided by the software vendors and covers the following points for each software package: general features,…

  5. GenomeTools: a comprehensive software library for efficient processing of structured genome annotations.

    Science.gov (United States)

    Gremme, Gordon; Steinbiss, Sascha; Kurtz, Stefan

    2013-01-01

    Genome annotations are often published as plain text files describing genomic features and their subcomponents by an implicit annotation graph. In this paper, we present the GenomeTools, a convenient and efficient software library and associated software tools for developing bioinformatics software intended to create, process or convert annotation graphs. The GenomeTools strictly follow the annotation graph approach, offering a unified graph-based representation. This gives the developer intuitive and immediate access to genomic features and tools for their manipulation. To process large annotation sets with low memory overhead, we have designed and implemented an efficient pull-based approach for sequential processing of annotations. This makes it possible to handle even the largest annotation sets, such as a complete catalogue of human variations. Our object-oriented C-based software library enables a developer to conveniently implement their own functionality on annotation graphs and to integrate it into larger workflows, simultaneously accessing compressed sequence data if required. The careful C implementation of the GenomeTools not only ensures a light-weight memory footprint while allowing full sequential as well as random access to the annotation graph, but also facilitates the creation of bindings to a variety of script programming languages (like Python and Ruby) sharing the same interface.
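
    GenomeTools is a C library, but the pull-based idea is easy to illustrate in Python: a generator yields one annotation feature at a time, so downstream consumers pull features on demand and memory use stays flat regardless of annotation size. This is a rough analogue only; the file name and simplified feature dictionary are hypothetical, not the GenomeTools API.

      def stream_gff3(path):
          # Pull-based parsing: yield one feature at a time instead of
          # materialising the whole annotation graph in memory
          with open(path) as fh:
              for line in fh:
                  if line.startswith("#") or not line.strip():
                      continue
                  seqid, source, ftype, start, end, score, strand, phase, attrs = \
                      line.rstrip("\n").split("\t")
                  yield {"seqid": seqid, "type": ftype,
                         "start": int(start), "end": int(end), "strand": strand}

      # The consumer drives the iteration, pulling features as it needs them
      n_genes = sum(1 for feat in stream_gff3("annotation.gff3")
                    if feat["type"] == "gene")
      print(n_genes, "genes")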

  6. Proceedings of the Workshop on software tools for distributed intelligent control systems

    Energy Technology Data Exchange (ETDEWEB)

    Herget, C.J. (ed.)

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  7. Contingency Contractor Optimization Phase 3 Sustainment Software Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa; Jones, Katherine A

    2016-05-01

    This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.
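
    The abstract does not spell out the model's formulation; as a hedged illustration of the general idea, minimizing the cost of a military/civilian/contractor mix subject to a capability requirement can be posed as a small linear program (all coefficients below are invented).

      from scipy.optimize import linprog

      # Decision variables: FTEs of [military, DoD civilian, contractor]
      cost = [120, 100, 150]            # cost per FTE, hypothetical units

      # Inequality constraints in the form A_ub @ x <= b_ub:
      #   total capability of at least 100 FTEs; at least 20 contractor FTEs
      A_ub = [[-1, -1, -1],
              [0, 0, -1]]
      b_ub = [-100, -20]

      res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 60)] * 3)
      print(res.x, res.fun)             # optimal mix and minimum total cost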

  8. Activity Theory applied to Global Software Engineering: Theoretical Foundations and Implications for Tool Builders

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    2012-01-01

    Although a plethora of tools are available for Global Software Engineering (GSE) teams, it is increasingly being realized that the desktop metaphor underpinning the majority of these tools has several inherent limitations. We have proposed that Activity-Based Computing (ABC) can be a promising alternative for building tools for GSE. However, significant effort is required to introduce a new paradigm; there is a need for a sound theoretical foundation based on activity theory to address the challenges faced by tools in GSE. This paper reports our effort aimed at building theoretical foundations for applying activity theory to GSE. We analyze and explain the fundamental concepts of activity theory and how they can be applied by using examples of software architecture design and evaluation processes. We describe the kind of data model and architectural support required for applying activity theory ...

  9. Diabetes care may be improved with Steno Quality Assurance Tool--a self-assessment tool in diabetes management

    DEFF Research Database (Denmark)

    Bjerre-Christensen, Ulla; Nielsen, Annemette Anker; Binder, Christian

    2014-01-01

    AIM: To evaluate whether improvements in the quality of diabetes care in Indian clinics can be obtained by simple self-surveillance PC-based software. METHOD: Nineteen Indian diabetes clinics were introduced to the principles of quality assurance (QA) and to a software program, the Steno Quality Assurance Tool (SQAT). Data were entered for an initial 3-month period. Subsequently, data were analyzed by the users, who designed plans to improve indicator status and set goals for the upcoming period. A second data entry period followed after 7-9 months. RESULTS: QA data were analyzed from 4487 T2DM ... CONCLUSION: Quality of diabetes care can be improved by applying SQAT, a QA self-surveillance software tool that enables documentation of changes in process and outcome indicators.

  10. Should we have blind faith in bioinformatics software? Illustrations from the SNAP web-based tool.

    Directory of Open Access Journals (Sweden)

    Sébastien Robiou-du-Pont

    Full Text Available Bioinformatics tools have gained popularity in biology but little is known about their validity. We aimed to assess the early contribution of 415 single nucleotide polymorphisms (SNPs associated with eight cardio-metabolic traits at the genome-wide significance level in adults in the Family Atherosclerosis Monitoring In earLY Life (FAMILY birth cohort. We used the popular web-based tool SNAP to assess the availability of the 415 SNPs in the Illumina Cardio-Metabochip genotyped in the FAMILY study participants. We then compared the SNAP output with the Cardio-Metabochip file provided by Illumina using chromosome and chromosomal positions of SNPs from NCBI Human Genome Browser (Genome Reference Consortium Human Build 37. With the HapMap 3 release 2 reference, 201 out of 415 SNPs were reported as missing in the Cardio-Metabochip by the SNAP output. However, the Cardio-Metabochip file revealed that 152 of these 201 SNPs were in fact present in the Cardio-Metabochip array (false negative rate of 36.6%. With the more recent 1000 Genomes Project release, we found a false-negative rate of 17.6% by comparing the outputs of SNAP and the Illumina product file. We did not find any 'false positive' SNPs (SNPs specified as available in the Cardio-Metabochip by SNAP, but not by the Cardio-Metabochip Illumina file. The Cohen's Kappa coefficient, which calculates the percentage of agreement between both methods, indicated that the validity of SNAP was fair to moderate depending on the reference used (the HapMap 3 or 1000 Genomes. In conclusion, we demonstrate that the SNAP outputs for the Cardio-Metabochip are invalid. This study illustrates the importance of systematically assessing the validity of bioinformatics tools in an independent manner. We propose a series of guidelines to improve practices in the fast-moving field of bioinformatics software implementation.

  11. Should we have blind faith in bioinformatics software? Illustrations from the SNAP web-based tool.

    Science.gov (United States)

    Robiou-du-Pont, Sébastien; Li, Aihua; Christie, Shanice; Sohani, Zahra N; Meyre, David

    2015-01-01

    Bioinformatics tools have gained popularity in biology but little is known about their validity. We aimed to assess the early contribution of 415 single nucleotide polymorphisms (SNPs) associated with eight cardio-metabolic traits at the genome-wide significance level in adults in the Family Atherosclerosis Monitoring In earLY Life (FAMILY) birth cohort. We used the popular web-based tool SNAP to assess the availability of the 415 SNPs in the Illumina Cardio-Metabochip genotyped in the FAMILY study participants. We then compared the SNAP output with the Cardio-Metabochip file provided by Illumina using chromosome and chromosomal positions of SNPs from NCBI Human Genome Browser (Genome Reference Consortium Human Build 37). With the HapMap 3 release 2 reference, 201 out of 415 SNPs were reported as missing in the Cardio-Metabochip by the SNAP output. However, the Cardio-Metabochip file revealed that 152 of these 201 SNPs were in fact present in the Cardio-Metabochip array (false negative rate of 36.6%). With the more recent 1000 Genomes Project release, we found a false-negative rate of 17.6% by comparing the outputs of SNAP and the Illumina product file. We did not find any 'false positive' SNPs (SNPs specified as available in the Cardio-Metabochip by SNAP, but not by the Cardio-Metabochip Illumina file). The Cohen's Kappa coefficient, which calculates the percentage of agreement between both methods, indicated that the validity of SNAP was fair to moderate depending on the reference used (the HapMap 3 or 1000 Genomes). In conclusion, we demonstrate that the SNAP outputs for the Cardio-Metabochip are invalid. This study illustrates the importance of systematically assessing the validity of bioinformatics tools in an independent manner. We propose a series of guidelines to improve practices in the fast-moving field of bioinformatics software implementation.

  12. Use of collaboration software to improve nuclear power plant outage management

    Energy Technology Data Exchange (ETDEWEB)

    Germain, Shawn

    2015-02-01

    Nuclear Power Plant (NPP) refueling outages create some of the most challenging activities the utilities face in both tracking and coordinating thousands of activities in a short period of time. Other challenges, including nuclear safety concerns arising from atypical system configurations and resource allocation issues, can create delays and schedule overruns, driving up outage costs. Today the majority of outage communication is done using processes that do not take advantage of advances in modern technologies that enable enhanced communication, collaboration and information sharing. Some of the common practices include: runners that deliver paper-based requests for approval, radios, telephones, desktop computers, daily schedule printouts, and static whiteboards that are used to display information. Many gains have been made to reduce the challenges facing outage coordinators; however, new opportunities can be realized by utilizing modern technological advancements in communication and information tools that can enhance the collective situational awareness of plant personnel, leading to improved decision-making. Ongoing research as part of the Light Water Reactor Sustainability (LWRS) Program has been targeting NPP outage improvement. As part of this research, various applications of collaborative software have been demonstrated through pilot-project utility partnerships. Collaborative software can be utilized as part of the larger concept of Computer-Supported Cooperative Work (CSCW). Collaborative software can be used for emergent issue resolution, Outage Control Center (OCC) displays, and schedule monitoring. Use of collaboration software enables outage staff and subject matter experts (SMEs) to view and update critical outage information from any location on site or off.

  13. GraphCrunch 2: Software tool for network modeling, alignment and clustering.

    Science.gov (United States)

    Kuchaiev, Oleksii; Stevanović, Aleksandar; Hayes, Wayne; Pržulj, Nataša

    2011-01-19

    Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other existing tool. Finally, GraphCrunch 2 implements an ...
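
    Two of the "easily computable network properties" mentioned above, the clustering coefficient and the degree distribution, can be compared between a data network and a random model in a few lines with networkx; this illustrates the concept only, not GraphCrunch 2's implementation, and the karate-club graph stands in for a PPI network.

      import networkx as nx

      data_net = nx.karate_club_graph()                 # stand-in for a PPI network
      n, m = data_net.number_of_nodes(), data_net.number_of_edges()

      # Random-model counterpart with matching density: Erdos-Renyi G(n, p)
      p = 2 * m / (n * (n - 1))
      model_net = nx.erdos_renyi_graph(n, p, seed=1)

      for name, g in [("data", data_net), ("model", model_net)]:
          print(name,
                "avg clustering:", round(nx.average_clustering(g), 3),
                "degree histogram:", nx.degree_histogram(g)[:6])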

  14. GraphCrunch 2: Software tool for network modeling, alignment and clustering

    Directory of Open Access Journals (Sweden)

    Hayes Wayne

    2011-01-01

    Full Text Available Abstract Background Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. Results We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other ...

  15. Programming heterogeneous MPSoCs tool flows to close the software productivity gap

    CERN Document Server

    Castrillón Mazo, Jerónimo

    2014-01-01

    This book provides embedded software developers with techniques for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs), capable of executing multiple applications simultaneously. It describes a set of algorithms and methodologies to narrow the software productivity gap, as well as an in-depth description of the underlying problems and challenges of today's programming practices. The authors present four different tool flows: a parallelism extraction flow for applications written using the C programming language, a mapping and scheduling flow for parallel applications, a special mapping flow for baseband applications in the context of Software Defined Radio (SDR) and a final flow for analyzing multiple applications at design time. The tool flows are evaluated on Virtual Platforms (VPs), which mimic different characteristics of state-of-the-art heterogeneous MPSoCs.   • Provides a novel set of algorithms and methodologies for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs)...

  16. Defamiliarization: Flarf, conceptual writing, and using flawed software tools as creative partners

    Directory of Open Access Journals (Sweden)

    Richard P. Gabriel (ACM Fellow; Allen Newell Award)

    2012-06-01

    Full Text Available One form of creativity uses defamiliarization, a mechanism that frees the brain from its rational shackles and permits the abducing brain to run free. Mistakes and flaws in several software tools are shown to be the starting points for increased creativity and better art, and a theory explaining the phenomenon is proposed.

  17. The Data Uncertainty Engine (DUE): a software tool for assessing and simulating uncertain environmental variables

    NARCIS (Netherlands)

    Brown, J.D.; Heuvelink, G.B.M.

    2007-01-01

    This paper describes a software tool for: (1) assessing uncertainties in environmental data; and (2) generating realisations of uncertain data for use in uncertainty propagation analyses: the "Data Uncertainty Engine (DUE)". Data may be imported into DUE from file or from a database, and are ...

  18. Experience with case tools in the design of process-oriented software

    CERN Document Server

    Novakov, O

    1993-01-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime which may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have existed for several years, but this paper shows that they are not entirely adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, the lack of real-time simulation tools and the absence of object-oriented concepts in existing commercial packages. Finally, the paper attempts to show a broader view of software engineering needs in our particular context.

  19. A Guide to the Use of Tool Software for the Apple Computer.

    Science.gov (United States)

    Collett, Charles R.; Goldberg, Fred S.

    Designed to give teachers and supervisors a working knowledge of various approaches to enhancing pupil learning through software application programs, this guide is presented in a hands-on fashion. It supports a dual purpose, i.e., it can serve as an individual tutorial or as a turnkey staff development tool. All program files referred to may be…

  20. The use of software tools and autonomous bots against vandalism : eroding Wikipedia's moral order?

    NARCIS (Netherlands)

    de Laat, Paul B.

    English-language Wikipedia is constantly being plagued by vandalistic contributions on a massive scale. In order to fight them its volunteer contributors deploy an array of software tools and autonomous bots. After an analysis of their functioning and the 'coactivity' in use between humans and bots,

  1. Design, Development and Delivery of Active Learning Tools in Software Verification & Validation Education

    Science.gov (United States)

    Acharya, Sushil; Manohar, Priyadarshan Anant; Wu, Peter; Maxim, Bruce; Hansen, Mary

    2018-01-01

    Active learning tools are critical in imparting real world experiences to the students within a classroom environment. This is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains with little to no training. However, there is a well-recognized need for the…

  2. Review of software tools for design and analysis of large scale MRM proteomic datasets.

    Science.gov (United States)

    Colangelo, Christopher M; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-06-15

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine them for a comprehensive targeted proteomics workflow. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Wiki as a Corporate Learning Tool: Case Study for Software Development Company

    Science.gov (United States)

    Milovanovic, Milos; Minovic, Miroslav; Stavljanin, Velimir; Savkovic, Marko; Starcevic, Dusan

    2012-01-01

    In our study, we attempted to further investigate how Web 2.0 technologies influence workplace learning. Our particular interest was on using Wiki as a tool for corporate exchange of knowledge with the focus on informal learning. In this study, we collaborated with a multinational software development company that uses Wiki as a corporate tool…

  4. Software Construction and Composition Tools for Petascale Computing SCW0837 Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Epperly, T W; Hochstein, L

    2011-09-12

    The majority of scientific software is distributed as source code. As the number of library dependencies and supported platforms increases, so does the complexity of describing the rules for configuring and building software. In this project, we have performed an empirical study of the magnitude of the build problem by examining the development history of two DOE-funded scientific software projects. We have developed MixDown, a meta-build tool, to simplify the task of building applications that depend on multiple third-party libraries. The results of this research indicate that the effort scientific programmers spend on configuring and building software takes a significant fraction of the total development effort and that the use of MixDown can significantly simplify the task of building software with multiple dependencies.
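
    MixDown's internals are not described here, but the core meta-build problem it addresses, deriving a valid configure-and-build order from a dependency graph, can be sketched with a topological sort (package names and the build command below are invented).

      from graphlib import TopologicalSorter  # standard library, Python 3.9+

      # Hypothetical package -> dependencies map for a scientific application
      deps = {
          "app":   {"hdf5", "petsc"},
          "petsc": {"mpich", "blas"},
          "hdf5":  {"mpich"},
          "mpich": set(),
          "blas":  set(),
      }

      # A meta-build tool must configure and build packages in dependency order
      for pkg in TopologicalSorter(deps).static_order():
          print(f"./configure && make && make install   # {pkg}")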

  5. Software tool for improved prediction of Alzheimer's disease

    DEFF Research Database (Denmark)

    Soininen, Hilkka; Mattila, Jussi; Koikkalainen, Juha

    2012-01-01

    Diagnostic criteria of Alzheimer's disease (AD) emphasize the integration of clinical data and biomarkers. In practice, collection and analysis of patient data vary greatly across different countries and clinics. ...

  6. Practitioner's knowledge representation a pathway to improve software effort estimation

    CERN Document Server

    Mendes, Emilia

    2014-01-01

    The main goal of this book is to help organizations improve their effort estimates and effort estimation processes by providing a step-by-step methodology that takes them through the creation and validation of models that are based on their own knowledge and experience. Such models, once validated, can then be used to obtain predictions, carry out risk analyses, enhance their estimation processes for new projects and generally advance them as learning organizations.Emilia Mendes presents the Expert-Based Knowledge Engineering of Bayesian Networks (EKEBNs) methodology, which she has used and adapted during the course of several industry collaborations with different companies world-wide over more than 6 years. The book itself consists of two major parts: first, the methodology's foundations in knowledge management, effort estimation (with special emphasis on the intricacies of software and Web development) and Bayesian networks are detailed; then six industry case studies are presented which illustrate the pra...

  7. APPLICATION OF THE PSP MODEL, MANUALLY AND SUPPORTED BY A CASE TOOL, IN A CASE STUDY OF A BRAZILIAN SOFTWARE FACTORY

    Directory of Open Access Journals (Sweden)

    Denis Ávila Montini

    2006-06-01

    Full Text Available In a context of continuous quality improvement in software development projects, the experimental PSP process was applied to discipline some of the processes suggested by CMMI level 2, using two different strategies. The first consists of observing the behavior of a software factory when the data needed to support the PSP model are collected manually; in the second, collection is supported by a CASE tool. The results show the impacts on performance and on quality patterns of the two strategies, with their respective advantages and vulnerabilities. In both cases, deadlines were met from the moment the specification and the progress of activities were controlled by the two suggested PSP strategies. Keywords: CMMI, PSP, CASE, software factory, process improvement.

  8. Development and implementation of a 'Mental Health Finder' software tool within an electronic medical record system.

    Science.gov (United States)

    Swan, D; Hannigan, A; Higgins, S; McDonnell, R; Meagher, D; Cullen, W

    2017-02-01

    In Ireland, as in many other healthcare systems, mental health service provision is being reconfigured with a move toward more care in the community, and particularly primary care. Recording and surveillance systems for mental health information and activities in primary care are needed for service planning and quality improvement. We describe the development and initial implementation of a software tool ('mental health finder') within a widely used primary care electronic medical record system (EMR) in Ireland to enable large-scale data collection on the epidemiology and management of mental health and substance use problems among patients attending general practice. In collaboration with the Irish Primary Care Research Network (IPCRN), we developed the 'Mental Health Finder' as a software plug-in to a commonly used primary care EMR system to facilitate data collection on mental health diagnoses and pharmacological treatments among patients. The finder searches for and identifies patients based on diagnostic coding and/or prescribed medicines. It was initially implemented among a convenience sample of six GP practices. Prevalence of mental health and substance use problems across the six practices, as identified by the finder, was 9.4% (range 6.9-12.7%). 61.9% of identified patients were female; 25.8% were private patients. One-third (33.4%) of identified patients were prescribed more than one class of psychotropic medication. Of the patients identified by the finder, 89.9% were identifiable via prescribing data, 23.7% via diagnostic coding. The finder is a feasible and promising methodology for large-scale data collection on mental health problems in primary care.
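
    The finder matches patients on diagnostic coding and/or prescribed medicines; against a purely hypothetical EMR schema, that selection logic could look like the SQL below (table names, the ICD 'F' chapter and the ATC N05/N06 classes are illustrative assumptions, not the plug-in's actual rules).

      import sqlite3

      con = sqlite3.connect(":memory:")                 # stand-in for an EMR extract
      con.executescript("""
          CREATE TABLE patients (patient_id INTEGER);
          CREATE TABLE diagnoses (patient_id INTEGER, icd_code TEXT);
          CREATE TABLE prescriptions (patient_id INTEGER, atc_code TEXT);
          INSERT INTO patients VALUES (1), (2), (3);
          INSERT INTO diagnoses VALUES (1, 'F32.1');        -- mood disorder code
          INSERT INTO prescriptions VALUES (2, 'N05BA01');  -- anxiolytic code
      """)

      # Identify patients via diagnostic coding and/or prescribed medicines
      query = """
          SELECT DISTINCT p.patient_id
          FROM patients p
          LEFT JOIN diagnoses d      ON d.patient_id = p.patient_id
          LEFT JOIN prescriptions rx ON rx.patient_id = p.patient_id
          WHERE d.icd_code LIKE 'F%'
             OR rx.atc_code LIKE 'N05%' OR rx.atc_code LIKE 'N06%'
      """
      print([row[0] for row in con.execute(query)])     # -> [1, 2]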

  9. Teaching structure: student use of software tools for understanding macromolecular structure in an undergraduate biochemistry course.

    Science.gov (United States)

    Jaswal, Sheila S; O'Hara, Patricia B; Williamson, Patrick L; Springer, Amy L

    2013-01-01

    Because understanding the structure of biological macromolecules is critical to understanding their function, students of biochemistry should become familiar not only with viewing, but also with generating and manipulating structural representations. We report a strategy from a one-semester undergraduate biochemistry course to integrate use of structural representation tools into both laboratory and homework activities. First, early in the course we introduce the use of readily available open-source software for visualizing protein structure, coincident with modules on amino acid and peptide bond properties. Second, we use these same software tools in lectures and incorporate images and other structure representations in homework tasks. Third, we require a capstone project in which teams of students examine a protein-nucleic acid complex and then use the software tools to illustrate for their classmates the salient features of the structure, relating how the structure helps explain biological function. To ensure engagement with a range of software and database features, we generated a detailed template file that can be used to explore any structure, and that guides students through specific applications of many of the software tools. In presentations, students demonstrate that they are successfully interpreting structural information, and using representations to illustrate particular points relevant to function. Thus, over the semester students integrate information about structural features of biological macromolecules into the larger discussion of the chemical basis of function. Together these assignments provide an accessible introduction to structural representation tools, allowing students to add these methods to their biochemical toolboxes early in their scientific development. © 2013 by The International Union of Biochemistry and Molecular Biology.

  10. An improved COCOMO software cost estimation model | Duke ...

    African Journals Online (AJOL)

    In this paper, we discuss the methodologies adopted previously in software cost estimation using the COnstructive COst MOdels (COCOMOs). From our analysis, COCOMOs produce very high software development efforts, which eventually translate into high software development costs. Consequently, we propose its extension, ...
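
    For context, the classic COCOMO-81 relationship that this family of models builds on estimates effort in person-months as a * KLOC^b, with constants depending on the project mode; a minimal illustration (the 50 KLOC input is arbitrary):

      def basic_cocomo(kloc, mode="organic"):
          # Effort (person-months) = a * KLOC^b, classic COCOMO-81 constants
          a, b = {"organic": (2.4, 1.05),
                  "semidetached": (3.0, 1.12),
                  "embedded": (3.6, 1.20)}[mode]
          return a * kloc ** b

      print(f"{basic_cocomo(50):.0f} person-months")  # organic-mode 50 KLOC project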

  11. S2O - A software tool for integrating research data from general purpose statistic software into electronic data capture systems.

    Science.gov (United States)

    Bruland, Philipp; Dugas, Martin

    2017-01-07

    Data capture for clinical registries or pilot studies is often performed in spreadsheet-based applications like Microsoft Excel or IBM SPSS. Usually, data are transferred into statistics software, such as SAS, R or IBM SPSS Statistics, for analysis afterwards. Spreadsheet-based solutions suffer from several drawbacks: it is generally not possible to ensure sufficient rights and role management, and it is not traced who has changed data, when and why. Therefore, such systems are not able to comply with regulatory requirements for electronic data capture in clinical trials. In contrast, Electronic Data Capture (EDC) software enables reliable, secure and auditable collection of data. In this regard, most EDC vendors support the CDISC ODM standard to define, communicate and archive clinical trial meta- and patient data. Advantages of EDC systems are support for multi-user and multicenter clinical trials as well as auditable data. Migration from spreadsheet-based data collection to EDC systems is labor-intensive and time-consuming at present. Hence, the objectives of this research work are to develop a mapping model and implement a converter between the IBM SPSS and CDISC ODM standards and to evaluate this approach regarding syntactic and semantic correctness. A mapping model between IBM SPSS and CDISC ODM data structures was developed. SPSS variables and patient values can be mapped and converted into ODM. Statistical and display attributes from SPSS do not correspond to any ODM elements; study-related ODM elements are not available in SPSS. The S2O converting tool was implemented as a command-line tool using the SPSS internal Java plugin. Syntactic and semantic correctness was validated with different ODM tools and by reverse transformation from ODM into the SPSS format. Clinical data values were also successfully transformed into the ODM structure. Transformation between the spreadsheet format of IBM SPSS and the ODM standard for definition and exchange of trial data is feasible.
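
    The mapping idea can be sketched quickly: variable metadata on the SPSS side becomes ODM ItemDef elements. The Python snippet below is a deliberately simplified illustration (the variable list is invented and the XML omits most attributes a valid ODM file requires); it is not the S2O converter, which is implemented as an SPSS Java plugin.

      import xml.etree.ElementTree as ET

      # Hypothetical SPSS-style variable metadata: (name, label, data type)
      variables = [("AGE", "Age at enrolment", "integer"),
                   ("SEX", "Sex", "text"),
                   ("SBP", "Systolic blood pressure", "float")]

      odm = ET.Element("ODM")
      mdv = ET.SubElement(odm, "MetaDataVersion", OID="MDV.1", Name="Pilot study")
      for name, label, dtype in variables:
          item = ET.SubElement(mdv, "ItemDef", OID=f"I.{name}",
                               Name=name, DataType=dtype)
          question = ET.SubElement(item, "Question")
          ET.SubElement(question, "TranslatedText").text = label

      print(ET.tostring(odm, encoding="unicode"))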

  12. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    Science.gov (United States)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  13. Capability studies, helpful tools in process quality improvement

    Directory of Open Access Journals (Sweden)

    Simion Carmen

    2017-01-01

    Full Text Available The ability of processes to meet customers' quality requirements has become essential for providing competitive advantages such as cost savings, reducing the number of nonconforming products and increasing customer satisfaction. This paper conducts a capability study for a swaging process that ensures an assembly dimension of a key product characteristic (the outer diameter) of a new part, related to the concept of capability and performance indices and how these metrics can be used and interpreted to become powerful tools for decision making. To achieve the goal of the paper, the following key aspects were analyzed: the capability of the measurement system (gage R&R), of the production equipment (machine capability) and of the process itself (process capability/performance) during the first serial production. The analysis was performed with Minitab® 17, a widely used software package for quality improvement.
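
    The indices referred to are the standard ones: Cp = (USL - LSL) / (6 * sigma) measures potential capability against the specification width, while Cpk = min(USL - mu, mu - LSL) / (3 * sigma) penalizes off-centre processes. A short illustration with made-up outer-diameter data and specification limits:

      import numpy as np

      def capability(samples, lsl, usl):
          mu, sigma = np.mean(samples), np.std(samples, ddof=1)
          cp = (usl - lsl) / (6 * sigma)               # potential capability
          cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # accounts for centring
          return cp, cpk

      # Illustrative outer-diameter measurements (mm) against 10.0 +/- 0.1 limits
      rng = np.random.default_rng(0)
      diameters = rng.normal(10.02, 0.02, size=50)
      cp, cpk = capability(diameters, lsl=9.9, usl=10.1)
      print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")         # >= 1.33 is a common target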

  14. IPAT: a freely accessible software tool for analyzing multiple patent documents with inbuilt landscape visualizer.

    Science.gov (United States)

    Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T

    2015-01-01

    Intelligent Patent Analysis Tool (IPAT) is an online data retrieval tool, based on a text-mining algorithm, that extracts specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages like Google Patents and simultaneously studies various technology trends based on user-defined parameters. In other words, IPAT combined with manual categorization acts as an excellent technology assessment tool in competitive intelligence and due diligence for predicting future R&D directions.

  15. Hardware and software improvements to a low-cost horizontal parallax holographic video monitor.

    Science.gov (United States)

    Henrie, Andrew; Codling, Jesse R; Gneiting, Scott; Christensen, Justin B; Awerkamp, Parker; Burdette, Mark J; Smalley, Daniel E

    2018-01-01

    Displays capable of true holographic video have been prohibitively expensive and difficult to build. With this paper, we present a suite of modularized hardware components and software tools needed to build a HoloMonitor with basic "hacker-space" equipment, highlighting improvements that have enabled the total materials cost to fall to $820, well below that of other holographic displays. It is our hope that the current level of simplicity, development, design flexibility, and documentation will enable the lay engineer, programmer, and scientist to relatively easily replicate, modify, and build upon our designs, bringing true holographic video to the masses.

  16. A proposal for reverse engineering CASE tools to support new software development

    Energy Technology Data Exchange (ETDEWEB)

    Maxted, A.

    1993-06-01

    Current CASE technology provides sophisticated diagramming tools to generate a software design. The design, stored internally to the CASE tool, is bridged to the code via code generators. There are several limitations to this technique: (1) the portability of the design is limited to the portability of the CASE tools, and (2) the code generators offer a clumsy link between design and code. The CASE tool, though valuable during design, becomes a hindrance during implementation. Frustration frequently causes the CASE tool to be abandoned during implementation, permanently severing the link between design and code. Current CASE tools store the design in an internal structure, from which code is generated. The technique presented herein suggests that CASE tools store the system knowledge directly in code. The CASE support then switches from an emphasis on code generators to employing state-of-the-art reverse engineering techniques for document generation. Graphical and textual descriptions of each software component (e.g., an Ada package) may be generated via reverse engineering techniques from the code. These reverse-engineered descriptions can be merged with system overview diagrams to form a top-level design document. The resulting document can readily reflect changes to the software components by automatically generating new component descriptions for the changed components. The proposed auto-documentation technique facilitates the document upgrade task at later stages of development (e.g., design, implementation and delivery) by using the component code as the source of the component descriptions. The CASE technique presented herein is a unique application of reverse engineering techniques to new software systems. This technique contrasts with more traditional CASE auto code generation techniques.

  17. ConsensusCluster: a software tool for unsupervised cluster discovery in numerical data.

    Science.gov (United States)

    Seiler, Michael; Huang, C Chris; Szalma, Sandor; Bhanot, Gyan

    2010-02-01

    We have created a stand-alone software tool, ConsensusCluster, for the analysis of high-dimensional single nucleotide polymorphism (SNP) and gene expression microarray data. Our software implements the consensus clustering algorithm and principal component analysis to stratify the data into a given number of robust clusters. The robustness is achieved by combining clustering results from data and sample resampling as well as by averaging over various algorithms and parameter settings to achieve accurate, stable clustering results. We have implemented several different clustering algorithms in the software, including K-Means, Partition Around Medoids, Self-Organizing Map, and Hierarchical clustering methods. After clustering the data, ConsensusCluster generates a consensus matrix heatmap to give a useful visual representation of cluster membership, and automatically generates a log of selected features that distinguish each pair of clusters. ConsensusCluster gives more robust and more reliable clusters than common software packages and, therefore, is a powerful unsupervised learning tool that finds hidden patterns in data that might shed light on its biological interpretation. This software is free and available from http://code.google.com/p/consensus-cluster .
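
    The consensus idea is to repeat clustering over resampled data and record how often each pair of samples lands in the same cluster; robust clusters show consistently high co-membership. A compact sketch with scikit-learn follows (the data set, 80% resampling rate and single K-Means algorithm are illustrative choices, not ConsensusCluster's defaults).

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.datasets import make_blobs

      X, _ = make_blobs(n_samples=60, centers=3, random_state=0)
      n, runs, rng = len(X), 50, np.random.default_rng(0)
      hits = np.zeros((n, n))   # times samples i and j were resampled together
      same = np.zeros((n, n))   # times they landed in the same cluster

      for _ in range(runs):
          idx = rng.choice(n, size=int(0.8 * n), replace=False)  # sample resampling
          labels = KMeans(n_clusters=3, n_init=10,
                          random_state=0).fit_predict(X[idx])
          for a in range(len(idx)):
              for b in range(len(idx)):
                  hits[idx[a], idx[b]] += 1
                  same[idx[a], idx[b]] += labels[a] == labels[b]

      # Consensus matrix: fraction of co-sampled runs in which i and j co-cluster
      consensus = np.divide(same, hits, out=np.zeros_like(same), where=hits > 0)
      print("mean consensus:", consensus.mean().round(2))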

  18. Integrated software system for improving medical equipment management.

    Science.gov (United States)

    Bliznakov, Z; Pappous, G; Bliznakova, K; Pallikarakis, N

    2003-01-01

    The evolution of biomedical technology has led to an extraordinary use of medical devices in health care delivery. During the last decade, clinical engineering departments (CEDs) turned toward computerization and the application of specific software systems for medical equipment management in order to improve their services and monitor outcomes. Recently, much emphasis has been given to patient safety. Through its Medical Device Directives, the European Union has required all member nations to use a vigilance system to prevent the recurrence of adverse events that could lead to injuries or death of patients or personnel as a result of equipment malfunction or improper use. The World Health Organization also has made this issue a high priority and has prepared a number of actions and recommendations. In the present work, a new integrated, Windows-oriented system is proposed, addressing all tasks of CEDs but also offering a global approach to their management needs, including vigilance. The system architecture is based on a star model, consisting of a central core module and peripheral units. Its development has been based on the integration of 3 software modules, each one addressing specific predefined tasks. The main features of this system include equipment acquisition and replacement management, inventory archiving and monitoring, follow-up of scheduled maintenance, corrective maintenance, user training, data analysis, and reports. It also incorporates vigilance monitoring and information exchange for adverse events, together with a specific application for quality-control procedures. The system offers clinical engineers the ability to monitor and evaluate the quality and cost-effectiveness of the service provided by means of quality and cost indicators. Particular emphasis has been placed on the use of harmonized standards with regard to medical device nomenclature and classification. The system's practical applications have been demonstrated through a pilot

  19. Development of Safety-Critical Software for Nuclear Power Plant using a CASE Tool

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Ho; Oh, Do Young; Kim, Koh Eun; Choi, Woong Seock; Sohn, Se Do; Kim, Jae Hack; Kim, Hang Bae [KEPCO E and C, Daejeon (Korea, Republic of)

    2011-08-15

    The Integrated SOftware Development Environment (ISODE) was developed to support the major S/W life cycle processes, composed of the development process, the V/V process, the requirements traceability process, the automated document generation process, and the process of importing targets to the Programmable Logic Controller (PLC) platform. It provides critical safety software developers with a certified, domain-optimized, model-based development environment and the associated services to reduce the time and effort needed to develop software, such as debugging, simulation, code generation and document generation. It also provides critical safety software verifiers with integrated V/V features for each phase of the software life cycle, using appropriate tools such as model test coverage, formal verification, and automated report generation. In addition to development and verification, the ISODE gives a complete traceability solution from the SW design phase to the testing phase. Using this information, coverage and impact analysis can be done easily whenever software modification is necessary. The final source codes of ISODE are imported into the newly developed PLC environment on a module basis, after being automatically converted into the format required by the PLC. Additional tests at the module and unit level are performed on the target platform.

  20. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Jyri Pakarinen

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
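
    As a hedged illustration of what such an analysis involves (this is not the toolkit itself), the following Python fragment drives a stand-in nonlinearity with a pure sine and reads the harmonic levels off the FFT, yielding a total harmonic distortion figure.

        import numpy as np

        fs, f0 = 48000, 1000          # sample rate and test-tone frequency (Hz)
        t = np.arange(fs) / fs        # exactly one second, so FFT bins are 1 Hz
        x = np.sin(2 * np.pi * f0 * t)
        y = np.tanh(3.0 * x)          # stand-in for a tube-like waveshaper

        spectrum = np.abs(np.fft.rfft(y)) / len(y)
        levels = [spectrum[k * f0] for k in range(1, 6)]  # fundamental + harmonics
        thd = np.sqrt(sum(h * h for h in levels[1:])) / levels[0]
        print(f"THD: {100 * thd:.2f}%")

    Sweeping the input level and frequency in the same way is how a distortion analysis can map out the level-dependent behavior that a plain frequency response cannot capture.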

  1. TINA manual landmarking tool: software for the precise digitization of 3D landmarks

    Directory of Open Access Journals (Sweden)

    Schunke Anja C

    2012-04-01

    Full Text Available Abstract Background Interest in the placing of landmarks and subsequent morphometric analyses of shape for 3D data has increased with the increasing accessibility of computed tomography (CT) scanners. However, current computer programs for this task suffer from various practical drawbacks. We present here a free software tool that overcomes many of these problems. Results The TINA Manual Landmarking Tool was developed for the digitization of 3D data sets. It enables the generation of a modifiable 3D volume rendering display plus matching orthogonal 2D cross-sections from DICOM files. The object can be rotated and axes defined and fixed. Predefined lists of landmarks can be loaded and the landmarks identified within any of the representations. Output files are stored in various established formats, depending on the preferred evaluation software. Conclusions The software tool presented here provides several options facilitating the placing of landmarks on 3D objects, including volume rendering from DICOM files, definition and fixation of meaningful axes, easy import, placement, control, and export of landmarks, and handling of large datasets. The TINA Manual Landmark Tool runs under Linux and can be obtained for free from http://www.tina-vision.net/tarballs/.

  2. Data structure and software engineering challenges and improvements

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Data structure and software engineering is an integral part of computer science. This volume presents new approaches and methods to knowledge sharing, brain mapping, data integration, and data storage. The author describes how to manage an organization's business process and domain data and presents new software and hardware testing methods. The book introduces a game development framework used as a learning aid in a software engineering course at the university level. It also features a review of social software engineering metrics and methods for processing business information. It explains how to

  3. Updates on Resources, Software Tools, and Databases for Plant Proteomics in 2016-2017.

    Science.gov (United States)

    Misra, Biswapriya B

    2018-02-08

    Proteomics data processing, annotation, and analysis can often lead to major hurdles in large-scale high-throughput bottom-up proteomics experiments. Given the recent rise in protein-based big datasets, efforts in in silico tool development have increased at an unprecedented rate; so much so that it has become increasingly difficult to keep track of all the advances in a particular academic year. However, these tools benefit the proteomics community by circumventing critical issues in data analysis and visualization, as these continually developing open-source and community-developed tools hold potential for future research efforts. This review aims to introduce and summarize more than 50 software tools, databases, and resources developed and published during 2016-2017 under the following categories: tools for data re-processing and analysis, statistical analysis tools, peptide identification tools, databases and spectral libraries, and data visualization and interpretation tools. Finally, efforts in data archiving and validation datasets for the community are discussed as well. Additionally, the author delineates the current and most commonly used proteomics tools in order to introduce novice users to this -omics discovery platform. This article is protected by copyright. All rights reserved.

  4. BH-ShaDe: A Software Tool That Assists Architecture Students in the Ill-Structured Task of Housing Design

    Science.gov (United States)

    Millan, Eva; Belmonte, Maria-Victoria; Ruiz-Montiel, Manuela; Gavilanes, Juan; Perez-de-la-Cruz, Jose-Luis

    2016-01-01

    In this paper, we present BH-ShaDe, a new software tool to assist architecture students learning the ill-structured domain/task of housing design. The software tool provides students with automatic or interactively generated floor plan schemas for basic houses. The students can then use the generated schemas as initial seeds to develop complete…

  5. Review of free software tools for image analysis of fluorescence cell micrographs.

    Science.gov (United States)

    Wiesmann, V; Franz, D; Held, C; Münzenmayer, C; Palmisano, R; Wittenberg, T

    2015-01-01

    An increasing number of free software tools have been made available for the evaluation of fluorescence cell micrographs. The main users are biologists and related life scientists with no or little knowledge of image processing. In this review, we give an overview of available tools and guidelines about which tools the users should use to segment fluorescence micrographs. We selected 15 free tools and divided them into stand-alone, Matlab-based, ImageJ-based, free demo versions of commercial tools and data sharing tools. The review consists of two parts: First, we developed a criteria catalogue and rated the tools regarding structural requirements, functionality (flexibility, segmentation and image processing filters) and usability (documentation, data management, usability and visualization). Second, we performed an image processing case study with four representative fluorescence micrograph segmentation tasks with figure-ground and cell separation. The tools display a wide range of functionality and usability. In the image processing case study, we were able to perform figure-ground separation in all micrographs using mainly thresholding. Cell separation was not possible with most of the tools, because cell separation methods are provided only by a subset of the tools and are difficult to parametrize and to use. Most important is that the usability matches the functionality of a tool. To be usable, specialized tools with less functionality need to fulfill less usability criteria, whereas multipurpose tools need a well-structured menu and intuitive graphical user interface. © 2014 Fraunhofer-Institute for Integrated Circuits IIS Journal of Microscopy © 2014 Royal Microscopical Society.
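
    Since the case study found that mainly thresholding sufficed for figure-ground separation, a minimal scikit-image sketch of that step may help orient new users ("cells.tif" is a hypothetical input file, not from the review):

        import numpy as np
        from skimage import io, filters, measure

        img = io.imread("cells.tif", as_gray=True)
        mask = img > filters.threshold_otsu(img)   # global Otsu threshold
        labels = measure.label(mask)               # connected components ~ cells
        print(f"{labels.max()} objects found")

    Cell separation (splitting touching cells) is the step the review found poorly supported; it typically requires watershed-style methods with careful parametrization.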

  6. The Formalism and Language Tools for Semantics Specification of Software Libraries

    Directory of Open Access Journals (Sweden)

    V. M. Itsykson

    2016-01-01

    Full Text Available The paper is dedicated to the specification of the structure and the behaviour of software libraries. It describes the existing problems of library specifications. A brief overview of the research field concerned with formalizing the specification of libraries and library functions is presented. The requirements imposed on the formalism designed are established; the formalism based on these requirements allows specifying all the properties of the libraries needed for automation of several classes of problems: defect detection in the software, migration of applications into a new environment, and generation of software documentation. The requirements on the language tools based on the developed formalism are proposed. The conclusion defines potential directions for further research.

  7. A Software Tool of Technical and Financial-Economic Analysis for Acquistion of Broadband Radio PPDR Networks

    Directory of Open Access Journals (Sweden)

    Gierszal Henryk

    2016-12-01

    Full Text Available In this paper, we present a software tool that allows preparing econometric analyses aimed at selecting optimal business models for the acquisition of broadband mobile networks by PPDR organizations. Such agencies increasingly need broadband services to improve operational activities in order to increase safety, security, and their effectiveness in day-to-day and crisis situations. An upgrade or migration to broadband networks requires careful decisions so as to find a justified trade-off between CAPEX and OPEX. The network evolution can be based on different business models, but no approach may degrade the reliability, security, and resilience required by PSC.

  8. PyElph - a software tool for gel images analysis and phylogenetics

    Directory of Open Access Journals (Sweden)

    Pavel Ana Brânduşa

    2012-01-01

    Full Text Available Abstract Background This paper presents PyElph, a software tool which automatically extracts data from gel images, computes the molecular weights of the analyzed molecules or fragments, compares DNA patterns which result from experiments with molecular genetic markers and, also, generates phylogenetic trees computed by five clustering methods, using the information extracted from the analyzed gel image. The software can be successfully used for population genetics, phylogenetics, taxonomic studies and other applications which require gel image analysis. Researchers and students working in molecular biology and genetics would benefit greatly from the proposed software because it is free, open source, easy to use, has a friendly Graphical User Interface and does not depend on specific image acquisition devices like other commercial programs with similar functionalities do. Results PyElph software tool is entirely implemented in Python which is a very popular programming language among the bioinformatics community. It provides a very friendly Graphical User Interface which was designed in six steps that gradually lead to the results. The user is guided through the following steps: image loading and preparation, lane detection, band detection, molecular weights computation based on a molecular weight marker, band matching and finally, the computation and visualization of phylogenetic trees. A strong point of the software is the visualization component for the processed data. The Graphical User Interface provides operations for image manipulation and highlights lanes, bands and band matching in the analyzed gel image. All the data and images generated in each step can be saved. The software has been tested on several DNA patterns obtained from experiments with different genetic markers. Examples of genetic markers which can be analyzed using PyElph are RFLP (Restriction Fragment Length Polymorphism), AFLP (Amplified Fragment Length Polymorphism), RAPD
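
    The final tree-building step lends itself to a short sketch. Below is a hedged, hypothetical Python analogue (not PyElph code): lanes are represented as binary band-presence vectors from the band-matching step, and a tree is computed with one of the clustering methods mentioned above (here UPGMA, i.e., average linkage).

        import numpy as np
        import matplotlib.pyplot as plt
        from scipy.cluster.hierarchy import linkage, dendrogram
        from scipy.spatial.distance import pdist

        bands = np.array([[1, 1, 0, 1],   # lane A: band present/absent
                          [1, 0, 0, 1],   # lane B
                          [0, 1, 1, 0]])  # lane C
        dist = pdist(bands, metric="jaccard")    # dissimilarity between lanes
        tree = linkage(dist, method="average")   # UPGMA clustering
        dendrogram(tree, labels=["A", "B", "C"])
        plt.show()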

  9. A Software Tool for Atmospheric Correction and Surface Temperature Estimation of Landsat Infrared Thermal Data

    Directory of Open Access Journals (Sweden)

    Benjamin Tardy

    2016-08-01

    Full Text Available Land surface temperature (LST) is an important variable involved in the Earth’s surface energy and water budgets and a key component in many aspects of environmental research. The Landsat program, jointly carried out by NASA and the USGS, has been recording thermal infrared data for the past 40 years. Nevertheless, LST data products for Landsat remain unavailable. The atmospheric correction (AC) method commonly used for mono-window Landsat thermal data requires detailed information concerning the vertical structure (temperature, pressure) and the composition (water vapor, ozone) of the atmosphere. For a given coordinate, this information is generally obtained through either radio-sounding or atmospheric model simulations and is passed to the radiative transfer model (RTM) to estimate the local atmospheric correction parameters. Although this approach yields accurate LST data, results are relevant only near this given coordinate. To meet the scientific community’s demand for high-resolution LST maps, we developed a new software tool dedicated to processing Landsat thermal data. The proposed tool improves on the commonly-used AC algorithm by incorporating spatial variations occurring in the Earth’s atmosphere composition. The ERA-Interim dataset (from the ECMWF meteorological organization) was used to retrieve vertical atmospheric conditions, which are available at a global scale with a resolution of 0.125 degrees and a temporal resolution of 6 h. A temporal and spatial linear interpolation of meteorological variables was performed to match the acquisition dates and coordinates of the Landsat images. The atmospheric correction parameters were then estimated on the basis of this reconstructed atmospheric grid using the commercial RTM software MODTRAN. The needed surface emissivity was derived from the common vegetation index NDVI, obtained from the red and near-infrared (NIR) bands of the same Landsat image. This permitted an estimation of LST for the entire
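
    The emissivity step can be sketched briefly. The following hedged Python fragment computes NDVI from the red and NIR bands and applies a standard NDVI-thresholds emissivity estimate; the coefficients are illustrative textbook values, not necessarily those used by the tool.

        import numpy as np

        def ndvi(red, nir):
            return (nir - red) / (nir + red + 1e-12)

        def emissivity(v):
            # bare soil below NDVI 0.2, full vegetation above 0.5, mixed between
            return np.where(v < 0.2, 0.97,
                   np.where(v > 0.5, 0.99,
                            0.97 + 0.02 * (v - 0.2) / 0.3))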

  10. BMDExpress: a software tool for the benchmark dose analyses of genomic data

    Directory of Open Access Journals (Sweden)

    Thomas Russell S

    2007-10-01

    Full Text Available Abstract Background Dose-dependent processes are common within biological systems and include phenotypic changes following exposures to both endogenous and xenobiotic molecules. The use of microarray technology to explore the molecular signals that underlie these dose-dependent processes has become increasingly common; however, the number of software tools for quantitatively analyzing and interpreting dose-response microarray data has been limited. Results We have developed BMDExpress, a Java application that combines traditional benchmark dose methods with gene ontology classification in the analysis of dose-response data from microarray experiments. The software application is designed to perform a stepwise analysis beginning with a one-way analysis of variance to identify the subset of genes that demonstrate significant dose-response behavior. The second step of the analysis involves fitting the gene expression data to a selection of standard statistical models (linear, 2° polynomial, 3° polynomial, and power models) and selecting the model that best describes the data with the least amount of complexity. The model is then used to estimate the benchmark dose at which the expression of the gene significantly deviates from that observed in control animals. Finally, the software application summarizes the statistical modeling results by matching each gene to its corresponding gene ontology categories and calculating summary values that characterize the dose-dependent behavior for each biological process and molecular function. As a result, the summary values represent the dose levels at which genes in the corresponding cellular process show transcriptional changes. Conclusion The application of microarray technology together with the BMDExpress software tool represents a useful combination in characterizing dose-dependent transcriptional changes in biological systems. The software allows users to efficiently analyze large dose
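
    Steps two and three of the pipeline described above can be conveyed with a small, hedged numerical sketch (the data and the benchmark response are invented; BMDExpress itself implements richer statistics):

        import numpy as np

        dose = np.array([0.0, 0.1, 1.0, 10.0, 100.0])
        expr = np.array([1.0, 1.05, 1.3, 2.1, 3.4])   # one gene, illustrative

        def aic(y, yhat, k):
            rss = np.sum((y - yhat) ** 2)
            return len(y) * np.log(rss / len(y)) + 2 * k

        # fit competing polynomial models, keep the simplest adequate one
        fits = {d: np.polyfit(dose, expr, d) for d in (1, 2, 3)}
        best = min(fits, key=lambda d: aic(expr, np.polyval(fits[d], dose), d + 1))

        # benchmark dose: first dose at which the fitted curve deviates from
        # the control prediction by the benchmark response (BMR)
        bmr, control = 0.5, np.polyval(fits[best], 0.0)
        grid = np.linspace(0.0, dose.max(), 10001)
        bmd = grid[np.abs(np.polyval(fits[best], grid) - control) >= bmr][0]
        print(f"degree-{best} model selected, BMD ~ {bmd:.2f}")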

  11. Analyst Tools and Quality Control Software for the ARM Data System

    Energy Technology Data Exchange (ETDEWEB)

    Moore, S.T.

    2004-12-14

    ATK Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed a web-based data analysis and visualization tool, called NCVweb, that allows for easy viewing of ARM NetCDF files. NCVweb, along with our library of sharable Interactive Data Language procedures and functions, allows even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics, new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software to run in an automated fashion to flag these outliers.
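
    One simple way to flag outliers of the kind mentioned above is a rolling median with a MAD-based threshold; the following hedged Python sketch is illustrative and not the Data Quality Office's actual algorithm.

        import numpy as np

        def flag_outliers(x, half_window=25, nsigma=5.0):
            """Mark points far from the local median, in robust sigma units."""
            flags = np.zeros(len(x), dtype=bool)
            for i in range(len(x)):
                lo, hi = max(0, i - half_window), min(len(x), i + half_window + 1)
                med = np.median(x[lo:hi])
                mad = np.median(np.abs(x[lo:hi] - med)) + 1e-12
                flags[i] = abs(x[i] - med) > nsigma * 1.4826 * mad
            return flags

    Run in an automated fashion over each data stream, such a check produces the flags that feed diagnostic plots and a health-and-status display.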

  12. RAVEN AS A TOOL FOR DYNAMIC PROBABILISTIC RISK ASSESSMENT: SOFTWARE OVERVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Alfonsi Andrea; Mandelli Diego; Rabiti Cristian; Joshua Cogliati; Robert Kinoshita

    2013-05-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed Thermo-Hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities, such as on-line controlling/monitoring and Monte-Carlo sampling. A demo of a Station Black Out PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte-Carlo and clustering capabilities.

  13. Improving the software fault localization process through testability information

    NARCIS (Netherlands)

    Gonzalez-Sanchez, A.; Abreu, R.; Gross, H.; Van Gemund, A.

    2010-01-01

    When failures occur during software testing, automated software fault localization helps to diagnose their root causes and identify the defective components of a program to support debugging. Diagnosis is carried out by selecting test cases in such a way that their pass or fail information will narrow

  14. PC Software graphics tool for conceptual design of space/planetary electrical power systems

    Science.gov (United States)

    Truong, Long V.

    1995-01-01

    This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.

  15. Variation of densitometry on computed tomography in COPD--influence of different software tools.

    Directory of Open Access Journals (Sweden)

    Mark O Wielpütz

    Full Text Available Quantitative multidetector computed tomography (MDCT) as a potential biomarker is increasingly used for severity assessment of emphysema in chronic obstructive pulmonary disease (COPD). The aim of this study was to evaluate the user-independent measurement variability between five different fully-automatic densitometry software tools. MDCT and full-body plethysmography, including forced expiratory volume in 1 s and total lung capacity, were available for 49 patients with advanced COPD (age = 64±9 years, forced expiratory volume in 1 s = 31±6% predicted). Measurement variation regarding lung volume, emphysema volume, emphysema index, and mean lung density was evaluated for two scientific and three commercially available lung densitometry software tools designed to analyze MDCT from different scanner types. One scientific tool and one commercial tool failed to process most or all datasets, respectively, and were excluded. One scientific and another commercial tool analyzed 49 datasets, the remaining commercial tool 30. Lung volume, emphysema volume, emphysema index and mean lung density were significantly different amongst these three tools (p<0.001). Limits of agreement for lung volume were [-0.195, -0.052 l], [-0.305, -0.131 l], and [-0.123, -0.052 l] with correlation coefficients of r = 1.00 each. Limits of agreement for emphysema index were [-6.2, 2.9%], [-27.0, 16.9%], and [-25.5, 18.8%], with r = 0.79 to 0.98. Correlation of lung volume with total lung capacity was good to excellent (r = 0.77 to 0.91, p<0.001), but segmented lung volumes (6.7±1.3 to 6.8±1.3 l) were significantly lower than total lung capacity (7.7±1.7 l, p<0.001). Technical incompatibilities hindered evaluation of two of five tools. The remaining three showed significant measurement variation for emphysema, hampering quantitative MDCT as a biomarker in COPD. Follow-up studies should currently use identical software, and standardization efforts should encompass software as

  16. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of source code is necessary for incorporating user-requested modifications during software evolution and maintenance. However, such comprehension is difficult to achieve in the case of large object-oriented programs due to the size, complexity, and implicit character of mappings between features and source code. To support programmers in overcoming these difficulties, we present a feature-centric analysis tool, Featureous. Our tool extends the NetBeans IDE with mechanisms for efficient location of feature implementations in legacy source code, and an extensive analysis of the discovered feature-code relations through a number of analytical views.

  17. Software Tools for Emittance Measurement and Matching for 12 GeV CEBAF

    Energy Technology Data Exchange (ETDEWEB)

    Turner, Dennis L. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2016-05-01

    This paper discusses model-driven setup of the Continuous Electron Beam Accelerator Facility (CEBAF) for the 12 GeV era, focusing on qsUtility. qsUtility is a set of software tools created to perform emittance measurements, analyze those measurements, and compute optics corrections based upon the measurements. qsUtility was developed as a toolset to help reduce machine configuration time and improve reproducibility by way of an accurate accelerator model, and to provide Operations staff with tools to measure and correct machine optics with little or no assistance from optics experts.

  18. The E-learning Circle – a holistic software design tool for e-learning

    OpenAIRE

    Kolås, Line; Staupe, Arvid

    2010-01-01

    The article introduces the E-learning Circle, a tool developed to assure the quality of the software design process of e-learning systems, considering pedagogical principles as well as technology. The E-learning Circle consists of a number of concentric circles which are divided into three sectors. The content of the inner circles is based on pedagogical principles, while the outer circle specifies how the pedagogical principles may be implemented with technology. The circle’s centre is dedic...

  19. statnet: Software Tools for the Representation, Visualization, Analysis and Simulation of Network Data

    Directory of Open Access Journals (Sweden)

    Mark S. Handcock

    2007-12-01

    Full Text Available statnet is a suite of software packages for statistical network analysis. The packages implement recent advances in network modeling based on exponential-family random graph models (ERGM). The components of the package provide a comprehensive framework for ERGM-based network modeling, including tools for model estimation, model evaluation, model-based network simulation, and network visualization. This broad functionality is powered by a central Markov chain Monte Carlo (MCMC) algorithm. The coding is optimized for speed and robustness.

  20. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    Energy Technology Data Exchange (ETDEWEB)

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James; Garr, Matthew

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation, one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points to a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
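
    The core comparison the tool automates can be reduced to a short, hedged sketch (hypothetical data, not the EPCM itself): align measured and simulated series on a common timestamp index and summarize the gap before plotting.

        import pandas as pd

        idx = pd.date_range("2011-01-01", periods=3, freq="H")
        measured = pd.Series([10.2, 11.0, 12.4], index=idx)   # kWh, measured
        simulated = pd.Series([9.8, 11.5, 12.0], index=idx)   # kWh, simulated
        diff = measured - simulated
        print("mean absolute deviation:", diff.abs().mean())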

  1. A software tool for determination of breast cancer treatment methods using data mining approach.

    Science.gov (United States)

    Cakır, Abdülkadir; Demirel, Burçin

    2011-12-01

    In this work, breast cancer treatment methods are determined using data mining. For this purpose, software was developed to help oncologists choose which treatment methods to suggest for breast cancer patients. Data on 462 breast cancer patients, obtained from Ankara Oncology Hospital, were used to determine treatment methods for new patients. This dataset was processed with the Weka data mining tool. Classification algorithms were applied one by one to this dataset and the results were compared to find the proper treatment method. The developed software program, called "Treatment Assistant", uses different algorithms (IB1, Multilayer Perceptron and Decision Table) to find out which one gives the better result for each attribute to be predicted, with an interface built using Java NetBeans. Treatment methods are determined for the post-surgical treatment of breast cancer patients using this developed software tool. At the modeling step of the data mining process, different Weka algorithms were used for the output attributes: for the hormonotherapy output IB1, for the tamoxifen and radiotherapy outputs Multilayer Perceptron, and for the chemotherapy output the Decision Table algorithm showed the best accuracy performance compared to the others. In conclusion, this work shows that the data mining approach can be a useful tool for medical applications, particularly at the treatment decision step. Data mining helps the doctor to decide in a short time.
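
    The per-attribute model selection described above translates naturally into other toolkits. As a hedged analogue of the Weka workflow (not the authors' code; the candidate mapping and data are hypothetical), a scikit-learn version might read:

        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier   # ~ IB1
        from sklearn.neural_network import MLPClassifier     # ~ Multilayer Perceptron
        from sklearn.tree import DecisionTreeClassifier      # rough Decision Table analogue

        def best_model(X, y):
            """Return the candidate with the highest cross-validated accuracy."""
            candidates = {
                "IB1-like": KNeighborsClassifier(n_neighbors=1),
                "MLP": MLPClassifier(max_iter=1000),
                "tree": DecisionTreeClassifier(),
            }
            scores = {name: cross_val_score(m, X, y, cv=5).mean()
                      for name, m in candidates.items()}
            return max(scores, key=scores.get), scores

    Running such a loop once per output attribute (hormonotherapy, tamoxifen, radiotherapy, chemotherapy) mirrors the finding that no single algorithm wins for every output.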

  2. Oxygen octahedra picker: A software tool to extract quantitative information from STEM images

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yi, E-mail: y.wang@fkf.mpg.de; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y.; Aken, Peter A. van

    2016-09-15

    In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information of the lattice and of BO₆ octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples. - Highlights: • We report a software tool for mapping atomic positions from HAADF and ABF images. • It enables quantification of both crystal lattice and oxygen octahedral distortions. • We test the measurement accuracy and precision on simulated and experimental images. • It works well for different orientations of perovskite structures and interfaces.
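
    The 2D Gaussian fitting method named above can be sketched briefly; the following hedged Python fragment refines one atomic-column position to sub-pixel precision within a small image patch (initial parameters are illustrative, not the tool's).

        import numpy as np
        from scipy.optimize import curve_fit

        def gauss2d(coords, amp, x0, y0, sigma, offset):
            x, y = coords
            g = amp * np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * sigma**2)) + offset
            return g.ravel()

        def fit_column(patch):
            """Fit a symmetric 2D Gaussian to a patch around one atom column."""
            ny, nx = patch.shape
            x, y = np.meshgrid(np.arange(nx), np.arange(ny))
            p0 = (patch.max() - patch.min(), nx / 2, ny / 2, 2.0, patch.min())
            popt, _ = curve_fit(gauss2d, (x, y), patch.ravel(), p0=p0)
            return popt[1], popt[2]   # sub-pixel (x0, y0) of the column

    Octahedral tilt and distortion measures then follow from the geometry of the fitted B-site and oxygen column positions.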

  3. User Driven Development of Software Tools for Open Data Discovery and Exploration

    Science.gov (United States)

    Schlobinski, Sascha; Keppel, Frank; Dihe, Pascal; Boot, Gerben; Falkenroth, Esa

    2016-04-01

    The use of open data in research faces challenges that are not restricted to inherent properties such as the quality and resolution of open data sets. Open data is often catalogued insufficiently or in fragmented form. Software tools that support effective discovery, including assessment of the data's appropriateness for research, have shortcomings such as the lack of essential functionality like support for data provenance. We believe that one of the reasons is the neglect of real end users' requirements in the development process of such software tools. In the context of the FP7 Switch-On project, we have proactively engaged the relevant user community to collaboratively develop a means to publish, find and bind open data relevant for hydrologic research. Implementing key concepts of data discovery and exploration, we have used state-of-the-art web technologies to provide an interactive software tool that is easy to use yet powerful enough to satisfy the data discovery and access requirements of the hydrological research community.

  4. SU-E-T-203: Development of a QA Software Tool for Automatic Verification of Plan Data Transfer and Delivery.

    Science.gov (United States)

    Chen, G; Li, X

    2012-06-01

    Consistency verification between the data from the treatment planning system (TPS), the record and verification system (R&V), and the delivery records using visual inspection is time consuming and subject to human error. The purpose of this work is to develop a software tool to automatically perform such verifications. Using Microsoft Visual C++, a quality assurance (QA) tool was developed to (1) read plan data including gantry/collimator/couch parameters, multi-leaf-collimator leaf positions, and monitor unit (MU) numbers from a TPS (Xio, CMS/Elekta, or RealART, Prowess) via RTP link or DICOM transfer, (2) retrieve imported (prior to delivery) and recorded (after delivery) data from an R&V system (Mosaiq, Elekta) with open database connectivity, (3) calculate MU independently based on the DICOM plan data using a modified Clarkson integration algorithm, and (4) compare all the extracted data to identify possible discrepancies between TPS and R&V, and between R&V and delivery. The tool was tested on 20 patients with 3DCRT and IMRT plans from regular and online adaptive radiotherapy treatments. It was capable of automatically detecting any inconsistency between the beam data from the TPS and the data stored in the R&V system, with an independent MU check, and any significant treatment delivery deviation from the plan within a few seconds. With this tool used prior to and after delivery as an essential QA step, our clinical online adaptive re-planning process can be sped up, saving a few minutes by eliminating the tedious visual inspection. A QA software tool has been developed to automatically verify treatment data consistency from delivery back to plan and to identify discrepancies in MU calculations between the TPS and the secondary MU check. This tool speeds up the clinical QA process and eliminates human errors from visual inspection, thus improving safety. © 2012 American Association of Physicists in Medicine.
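
    Although the original tool was written in C++, the plan-versus-record comparison it performs can be sketched in a few lines of Python with pydicom; the file name, the `recorded` dictionary, and the tolerances below are hypothetical stand-ins for the DICOM plan, the R&V database query, and departmental QA policy.

        import pydicom

        plan = pydicom.dcmread("rtplan.dcm")                    # hypothetical plan file
        recorded = {"Field1": {"gantry": 180.0, "mu": 120.5}}   # stand-in for R&V query

        # assumes beams and their MU references are listed in the same order
        for beam, ref in zip(plan.BeamSequence,
                             plan.FractionGroupSequence[0].ReferencedBeamSequence):
            gantry = float(beam.ControlPointSequence[0].GantryAngle)
            mu = float(ref.BeamMeterset)
            rec = recorded.get(beam.BeamName)
            if rec and (abs(gantry - rec["gantry"]) > 0.1 or abs(mu - rec["mu"]) > 0.5):
                print(f"MISMATCH in {beam.BeamName}: plan ({gantry}, {mu}) vs R&V {rec}")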

  5. Software tools for data modelling and processing of human body temperature circadian dynamics.

    Science.gov (United States)

    Petrova, Elena S; Afanasova, Anastasia I

    2015-01-01

    This paper presents software developed for simulating and processing thermometry data. The motivation of this research is the miniaturization of actuators attached to the human body, which allows frequent temperature measurements and improves medical diagnosis procedures related to circadian dynamics.

  6. Software for improved field surveys of nesting marine turtles

    OpenAIRE

    Anastácio, R.; Gonzalez, J. M.; Slater, K.; Pereira, M. J.

    2017-01-01

    Field data are still recorded on paper in many worldwide beach surveys of nesting marine turtles. The data must subsequently be transferred into an electronic database, and this can introduce errors in the dataset. To minimize such errors, the "Turtles" software was developed and piloted to record field data, with one software user accompanying one Tortuguero on the Akumal beaches, Quintana Roo, Mexico, from June 1st to July 31st during the night patrols. Comparisons were made between exported data ...

  7. Robust optimal design of experiments for model discrimination using an interactive software tool.

    Directory of Open Access Journals (Sweden)

    Johannes Stegmaier

    Full Text Available Mathematical modeling of biochemical processes significantly contributes to a better understanding of biological functionality and underlying dynamic mechanisms. To support time consuming and costly lab experiments, kinetic reaction equations can be formulated as a set of ordinary differential equations, which in turn allows to simulate and compare hypothetical models in silico. To identify new experimental designs that are able to discriminate between investigated models, the approach used in this work solves a semi-infinite constrained nonlinear optimization problem using derivative based numerical algorithms. The method takes into account parameter variabilities such that new experimental designs are robust against parameter changes while maintaining the optimal potential to discriminate between hypothetical models. In this contribution we present a newly developed software tool that offers a convenient graphical user interface for model discrimination. We demonstrate the beneficial operation of the discrimination approach and the usefulness of the software tool by analyzing a realistic benchmark experiment from literature. New robust optimal designs that allow to discriminate between the investigated model hypotheses of the benchmark experiment are successfully calculated and yield promising results. The involved robustification approach provides maximally discriminating experiments for the worst parameter configurations, which can be used to estimate the meaningfulness of upcoming experiments. A major benefit of the graphical user interface is the ability to interactively investigate the model behavior and the clear arrangement of numerous variables. In addition to a brief theoretical overview of the discrimination method and the functionality of the software tool, the importance of robustness of experimental designs against parameter variability is demonstrated on a biochemical benchmark problem. The software is licensed under the GNU
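
    A heavily simplified, hedged illustration of the discrimination idea follows (the paper solves a far richer robust, semi-infinite optimization; this only conveys the core intuition): simulate two rival kinetic models and pick the measurement time at which their predictions diverge most.

        import numpy as np
        from scipy.integrate import solve_ivp

        def model_a(t, y):                 # first-order decay
            return [-0.5 * y[0]]

        def model_b(t, y):                 # Michaelis-Menten-like decay
            return [-0.3 * y[0] / (0.5 + y[0])]

        t_eval = np.linspace(0, 10, 200)
        ya = solve_ivp(model_a, (0, 10), [1.0], t_eval=t_eval).y[0]
        yb = solve_ivp(model_b, (0, 10), [1.0], t_eval=t_eval).y[0]
        t_best = t_eval[np.argmax(np.abs(ya - yb))]
        print(f"most discriminating sample time ~ {t_best:.2f}")

    The robustification described above additionally requires that the chosen design remain discriminating under worst-case parameter perturbations of both models.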

  8. QCScreen: a software tool for data quality control in LC-HRMS based metabolomics.

    Science.gov (United States)

    Simader, Alexandra Maria; Kluger, Bernhard; Neumann, Nora Katharina Nicole; Bueschl, Christoph; Lemmens, Marc; Lirk, Gerald; Krska, Rudolf; Schuhmacher, Rainer

    2015-10-24

    Metabolomics experiments often comprise large numbers of biological samples, resulting in huge amounts of data. This data needs to be inspected for plausibility before data evaluation to detect putative sources of error, e.g. retention time or mass accuracy shifts. Especially in liquid chromatography-high resolution mass spectrometry (LC-HRMS) based metabolomics research, proper quality control checks (e.g. for precision, signal drifts or offsets) are crucial prerequisites to achieve reliable and comparable results within and across experimental measurement sequences. Software tools can support this process. The software tool QCScreen was developed to offer a quick and easy data quality check of LC-HRMS derived data. It allows a flexible investigation and comparison of basic quality-related parameters within user-defined target features and the possibility to automatically evaluate multiple sample types within or across different measurement sequences in a short time. It offers a user-friendly interface that allows an easy selection of processing steps and parameter settings. The generated results include a coloured overview plot of data quality across all analysed samples and targets and, in addition, detailed illustrations of the stability and precision of the chromatographic separation, the mass accuracy and the detector sensitivity. The use of QCScreen is demonstrated with experimental data from metabolomics experiments using selected standard compounds in pure solvent. The application of the software identified problematic features, samples and analytical parameters and suggested which data files or compounds required closer manual inspection. QCScreen is an open source software tool which provides a useful basis for assessing the suitability of LC-HRMS data prior to time-consuming, detailed data processing and subsequent statistical analysis. It accepts the generic mzXML format and thus can be used with many different LC-HRMS platforms to process both multiple
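
    The kind of per-target checks described above reduce to simple statistics. The following hedged Python sketch (arrays are illustrative, not QCScreen output) computes retention-time drift and mass accuracy across a sequence of QC injections:

        import numpy as np

        rt = np.array([301.2, 301.4, 301.9, 303.0])              # s, one target, 4 QC runs
        mz = np.array([445.1202, 445.1205, 445.1213, 445.1229])  # measured m/z
        mz_ref = 445.1200                                        # theoretical m/z

        rt_drift = rt - rt[0]                      # shift vs the first injection
        ppm_err = (mz - mz_ref) / mz_ref * 1e6     # mass accuracy in ppm
        print("RT drift (s):", rt_drift)
        print("mass error (ppm):", np.round(ppm_err, 2))

    Colour-coding such values against user-defined tolerances across all samples and targets yields exactly the kind of overview plot the tool generates.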

  9. Omics Informatics: From Scattered Individual Software Tools to Integrated Workflow Management Systems.

    Science.gov (United States)

    Ma, Tianle; Zhang, Aidong

    2017-01-01

    Omic data analyses pose great informatics challenges. As an emerging subfield of bioinformatics, omics informatics focuses on analyzing multi-omic data efficiently and effectively, and is gaining momentum. There are two underlying trends in the expansion of the omics informatics landscape: the explosion of scattered individual omics informatics tools, each of which focuses on a specific task in single- or multi-omic settings, and the fast-evolving integrated software platforms such as workflow management systems that can assemble multiple tools into pipelines and streamline integrative analysis for complicated tasks. In this survey, we give a holistic view of omics informatics, from scattered individual informatics tools to integrated workflow management systems. We not only outline the landscape and challenges of omics informatics, but also sample a number of widely used and cutting-edge algorithms in omics data analysis to give readers a fine-grained view. We survey various workflow management systems (WMSs), classify them into three levels from simple software toolkits to integrated multi-omic analytical platforms, and point out the emerging need for developing intelligent workflow management systems. We also discuss the challenges, strategies and some existing work in the systematic evaluation of omics informatics tools. We conclude by providing future perspectives of emerging fields and new frontiers in omics informatics.

  10. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  11. Improved Software to Browse the Serial Medical Images for Learning.

    Science.gov (United States)

    Kwon, Koojoo; Chung, Min Suk; Park, Jin Seo; Shin, Byeong Seok; Chung, Beom Sun

    2017-07-01

    The thousands of serial images used for medical pedagogy cannot be included in a printed book; they also cannot be efficiently handled by ordinary image viewer software. The purpose of this study was to provide browsing software to grasp serial medical images efficiently. The primary function of the newly programmed software was to select images using 3 types of interfaces: buttons or a horizontal scroll bar, a vertical scroll bar, and a checkbox. The secondary function was to show the names of the structures that had been outlined on the images. To confirm the functions of the software, 3 different types of image data of cadavers (sectioned and outlined images, volume models of the stomach, and photos of the dissected knees) were inputted. The browsing software is downloadable for free from the homepage (anatomy.co.kr) and is available off-line. The provided data sets can be replaced by other developers to serve their own educational purposes. We anticipate that the software will contribute to medical education by allowing users to browse a variety of images. © 2017 The Korean Academy of Medical Sciences.
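
    The scroll-bar style of browsing described above is easy to prototype; the following hedged matplotlib sketch (random data standing in for serial sections) pages through an image stack with a slider, much like the software's horizontal scroll-bar interface.

        import numpy as np
        import matplotlib.pyplot as plt
        from matplotlib.widgets import Slider

        stack = np.random.rand(100, 256, 256)     # stand-in for serial images

        fig, ax = plt.subplots()
        plt.subplots_adjust(bottom=0.15)
        im = ax.imshow(stack[0], cmap="gray")
        s = Slider(plt.axes([0.2, 0.05, 0.6, 0.03]), "section",
                   0, len(stack) - 1, valinit=0, valstep=1)
        s.on_changed(lambda v: im.set_data(stack[int(v)]))
        plt.show()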

  12. SOFTWARE EFFORT ESTIMATION FRAMEWORK TO IMPROVE ORGANIZATION PRODUCTIVITY USING EMOTION RECOGNITION OF SOFTWARE ENGINEERS IN SPONTANEOUS SPEECH

    Directory of Open Access Journals (Sweden)

    B.V.A.N.S.S. Prabhakar Rao

    2015-10-01

    Full Text Available Productivity is a very important part of any organisation in general and of the software industry in particular. Nowadays, software effort estimation is a challenging task, and effort and productivity are closely interrelated. Productivity ultimately comes from the employees of the organization, and every organisation requires emotionally stable employees for seamless and progressive working. In other industries this may be achievable without manpower, but software project development is a labour-intensive activity: each line of code must be delivered by a software engineer, and tools and techniques can only act as aids or supplements. The software industry has long suffered from a low success rate, facing many problems in delivering projects on time and within the estimated budget. To estimate the required effort of a project, it is important to know the emotional state of the team members. The responsibility of ensuring emotional contentment falls on the human resource department, which can deploy a series of systems to carry out its survey. This analysis can be done using a variety of tools; one such way is through the study of emotion recognition. The data needed for this is readily available and collectable and can be an excellent source for feedback systems. The challenge of recognizing emotion in speech is convoluted primarily due to noisy recording conditions, variations in sentiment across the sample space, and the exhibition of multiple emotions in a single sentence. Ambiguity in the labels of the training set also increases the complexity of the problem addressed. Existing probabilistic models have dominated the study but present a flaw in scalability due to statistical inefficiency. The problem of sentiment prediction in spontaneous speech can thus be addressed using a hybrid system comprising a Convolutional Neural Network and

  13. Emerging role of bioinformatics tools and software in evolution of clinical research

    Directory of Open Access Journals (Sweden)

    Supreet Kaur Gill

    2016-01-01

    Full Text Available Clinical research makes painstaking efforts to promote the health and wellbeing of people. There is a rapid increase in the number and severity of diseases such as cancer, hepatitis and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization. This is followed by preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships make the drug discovery process faster and easier. During preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software like electronic data capture, remote data capture and electronic case report forms (eCRF) is used to store the data. eClinical and Oracle Clinical are software packages used for clinical data management and for statistical analysis of the data. After a drug is marketed, its safety can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug design through drug development, formulation design, clinical trials and pharmacovigilance. This review describes different aspects related to the application of computers and bioinformatics in drug designing, discovery and development, formulation designing and clinical research.

  14. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs, it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES [1]. Herrera Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An axiomatic approach", John Wiley, 243p., 2012. [2]. Herrera, I., de la Cruz L.M. and Rosas-Medina A. "Non Overlapping Discretization Methods for Partial Differential Equations". NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num 21852. (Open source) [3]. Herrera, I., & Contreras Iván "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity". Geofísica Internacional, 2015 (In press)

  15. Lightweight Methods for Effective Verification of Software Product Lines with Off-the-Shelf Tools

    DEFF Research Database (Denmark)

    Iosif-Lazar, Alexandru Florin

    2017-01-01

    Certification is the process of assessing the quality of a product and whether it meets a set of requirements and adheres to functional and safety standards. It is often legally required to provide guarantees for human safety and to make the product available on the market. The certification process relies on objective evidence of quality, which is produced by using qualified and state-of-the-art tools and verification and validation techniques. Software product line (SPL) engineering distributes costs among similar products that are developed simultaneously. However, SPL certification faces major challenges. The methods proposed here address these challenges; they are formally specified and accompanied by proof-of-concept implementations. The first challenge is that of qualifying variant derivation tools: any error that occurs in the process of deriving/building products from an SPL can affect the quality of the products themselves.

  16. MASH Suite Pro: A Comprehensive Software Tool for Top-Down Proteomics*

    Science.gov (United States)

    Cai, Wenxuan; Guner, Huseyin; Gregorich, Zachery R.; Chen, Albert J.; Ayaz-Guner, Serife; Peng, Ying; Valeja, Santosh G.; Liu, Xiaowen; Ge, Ying

    2016-01-01

    Top-down mass spectrometry (MS)-based proteomics is arguably a disruptive technology for the comprehensive analysis of all proteoforms arising from genetic variation, alternative splicing, and posttranslational modifications (PTMs). However, the complexity of top-down high-resolution mass spectra presents a significant challenge for data analysis. In contrast to the well-developed software packages available for data analysis in bottom-up proteomics, the data analysis tools in top-down proteomics remain underdeveloped. Moreover, despite recent efforts to develop algorithms and tools for the deconvolution of top-down high-resolution mass spectra and the identification of proteins from complex mixtures, a multifunctional software platform, which allows for the identification, quantitation, and characterization of proteoforms with visual validation, is still lacking. Herein, we have developed MASH Suite Pro, a comprehensive software tool for top-down proteomics with multifaceted functionality. MASH Suite Pro is capable of processing high-resolution MS and tandem MS (MS/MS) data using two deconvolution algorithms to optimize protein identification results. In addition, MASH Suite Pro allows for the characterization of PTMs and sequence variations, as well as the relative quantitation of multiple proteoforms in different experimental conditions. The program also provides visualization components for validation and correction of the computational outputs. Furthermore, MASH Suite Pro facilitates data reporting and presentation via direct output of the graphics. Thus, MASH Suite Pro significantly simplifies and speeds up the interpretation of high-resolution top-down proteomics data by integrating tools for protein identification, quantitation, characterization, and visual validation into a customizable and user-friendly interface. We envision that MASH Suite Pro will play an integral role in advancing the burgeoning field of top-down proteomics. PMID:26598644

  18. STRAP PTM: Software Tool for Rapid Annotation and Differential Comparison of Protein Post-Translational Modifications.

    Science.gov (United States)

    Spencer, Jean L; Bhatia, Vivek N; Whelan, Stephen A; Costello, Catherine E; McComb, Mark E

    2013-12-01

    The identification of protein post-translational modifications (PTMs) is an increasingly important component of proteomics and biomarker discovery, but very few tools exist for performing fast and easy characterization of global PTM changes and differential comparison of PTMs across groups of data obtained from liquid chromatography-tandem mass spectrometry experiments. STRAP PTM (Software Tool for Rapid Annotation of Proteins: Post-Translational Modification edition) is a program that was developed to facilitate the characterization of PTMs using spectral counting and a novel scoring algorithm to accelerate the identification of differential PTMs from complex data sets. The software facilitates multi-sample comparison by collating, scoring, and ranking PTMs and by summarizing data visually. The freely available software (beta release) installs on a PC and processes data in protXML format obtained from files parsed through the Trans-Proteomic Pipeline. The easy-to-use interface allows examination of results at protein, peptide, and PTM levels, and the overall design offers tremendous flexibility that provides proteomics insight beyond simple assignment and counting.
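
    The core idea of spectral-count-based PTM comparison is compact enough to illustrate directly. The Python sketch below (with invented counts; the actual tool's scoring algorithm is more elaborate than a plain fold change) counts PTM-bearing spectra per site in two sample groups and ranks sites by change:

    ```python
    # Toy illustration of differential PTM comparison by spectral counting.
    # Data are invented; STRAP PTM's own scoring is more sophisticated.
    from collections import Counter

    # (protein, site, modification) observed per identified spectrum
    control = Counter({("P1", "S45", "Phospho"): 3, ("P2", "K12", "Acetyl"): 8})
    treated = Counter({("P1", "S45", "Phospho"): 12, ("P2", "K12", "Acetyl"): 7})

    def fold_changes(a, b, pseudo=1):
        # Pseudo-count avoids division by zero for sites absent in one group
        keys = set(a) | set(b)
        fc = {k: (b[k] + pseudo) / (a[k] + pseudo) for k in keys}
        return sorted(fc.items(), key=lambda kv: kv[1], reverse=True)

    for site, fc in fold_changes(control, treated):
        print(site, f"fold change = {fc:.2f}")
    ```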

  19. SAMPA: A free software tool for skin and membrane permeation data analysis.

    Science.gov (United States)

    Bezrouk, Aleš; Fiala, Zdeněk; Kotingová, Lenka; Krulichová, Iva Selke; Kopečná, Monika; Vávrová, Kateřina

    2017-10-01

    Skin and membrane permeation experiments comprise an important step in the development of a transdermal or topical formulation and in toxicological risk assessment. The standard method for analyzing these data relies on the linear part of a permeation profile. However, it is difficult to objectively determine when the profile becomes linear, or the experiment duration may be insufficient to reach a maximum or steady state. Here, we present a software tool for Skin And Membrane Permeation data Analysis, SAMPA, that is easy to use and overcomes several of these difficulties. The SAMPA method and software have been validated on in vitro and in vivo permeation data on human, pig and rat skin and model stratum corneum lipid membranes, using compounds that range from highly lipophilic polycyclic aromatic hydrocarbons to a highly hydrophilic antiviral drug, with and without two permeation enhancers. The SAMPA performance was compared with the standard method using a linear part of the permeation profile and a complex mathematical model. SAMPA is a user-friendly, open-source software tool for analyzing the data obtained from skin and membrane permeation experiments. It runs on a Microsoft Windows platform and is freely available as a Supporting file to this article. Copyright © 2017 Elsevier Ltd. All rights reserved.
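
    For contrast with SAMPA's approach, the standard linear-part analysis mentioned above can be sketched in a few lines: fit a straight line to the assumed steady-state tail of a cumulative permeation profile; the slope is the flux and the x-intercept the lag time. The data and the cutoff below are invented, and choosing that cutoff objectively is precisely the difficulty SAMPA addresses:

    ```python
    # Sketch of the standard linear-part permeation analysis (not SAMPA itself).
    import numpy as np

    t = np.array([1, 2, 4, 6, 8, 12, 24.0])             # time, h
    q = np.array([0.1, 0.4, 1.5, 3.2, 5.0, 8.9, 20.1])  # cumulative amount, ug/cm^2

    linear_part = t >= 6                                 # subjective cutoff choice
    slope, intercept = np.polyfit(t[linear_part], q[linear_part], 1)
    print(f"flux = {slope:.2f} ug/cm^2/h, lag time = {-intercept/slope:.2f} h")
    ```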

  20. Gbm.auto: A software tool to simplify spatial modelling and Marine Protected Area planning.

    Directory of Open Access Journals (Sweden)

    Simon Dedman

    Full Text Available Marine resource managers and scientists often advocate spatial approaches to manage data-poor species. Existing spatial prediction and management techniques are either insufficiently robust, struggle with sparse input data, or make suboptimal use of multiple explanatory variables. Boosted Regression Trees feature excellent performance and are well suited to modelling the distribution of data-limited species, but are extremely complicated and time-consuming to learn and use, hindering access for a wide potential user base and therefore limiting uptake and usage. We have built a software suite in R which integrates pre-existing functions with new tailor-made functions to automate the processing and predictive mapping of species abundance data: by automating and greatly simplifying Boosted Regression Tree spatial modelling, the gbm.auto R package suite makes this powerful statistical modelling technique more accessible to potential users in the ecological and modelling communities. The package and its documentation allow the user to generate maps of predicted abundance, visualise the representativeness of those abundance maps, and plot the relative influence of explanatory variables and their relationship to the response variables. Databases of the processed model objects and a report explaining all the steps taken within the model are also generated. The package includes a previously unavailable Decision Support Tool which combines estimated escapement biomass (the percentage of an exploited population which must be retained each year to conserve it) with the predicted abundance maps to generate maps showing the location and size of habitat that should be protected to conserve the target stocks (candidate MPAs), based on stakeholder priorities, such as the minimisation of fishing effort displacement. By bridging the gap between advanced statistical methods for species distribution modelling and conservation science, management and policy, these ...
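
    gbm.auto itself is an R package; purely as an illustration of the underlying idea (not the package's API), a boosted regression tree can be fitted to abundance samples and evaluated over a spatial grid to yield a predicted-abundance surface. A Python sketch with invented covariates:

    ```python
    # Illustrative boosted-regression-tree abundance mapping; all data invented.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(500, 3))                  # e.g. depth, temperature, salinity (scaled)
    y = 5 * X[:, 0] - 2 * X[:, 1] ** 2 + rng.normal(0, 0.3, 500)  # fake abundance

    model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
    model.fit(X, y)

    # Predict over a 50x50 grid of the first two covariates, third held fixed
    grid = np.stack(np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50)), -1).reshape(-1, 2)
    grid = np.hstack([grid, np.full((len(grid), 1), 0.5)])
    abundance_map = model.predict(grid).reshape(50, 50)  # raster for mapping
    print(abundance_map.shape)
    ```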

  1. Interactive software tool to comprehend the calculation of optimal sequence alignments with dynamic programming.

    Science.gov (United States)

    Ibarra, Ignacio L; Melo, Francisco

    2010-07-01

    Dynamic programming (DP) is a general optimization strategy that is successfully used across various disciplines of science. In bioinformatics, it is widely applied in calculating the optimal alignment between pairs of protein or DNA sequences. These alignments form the basis of new, verifiable biological hypotheses. Despite its importance, there are no interactive tools available for training and education on understanding the DP algorithm. Here, we introduce an interactive computer application with a graphical interface for the purpose of educating students about DP. The program displays the DP scoring matrix and the resulting optimal alignment(s), while allowing the user to modify key parameters such as the values in the similarity matrix, the sequence alignment algorithm version and the gap opening/extension penalties. We hope that this software will be useful to teachers and students of bioinformatics courses, as well as researchers who implement the DP algorithm for diverse applications. The software is freely available at: http://melolab.org/sat. The software is written in the Java computer language, thus it runs on all major platforms and operating systems including Windows, Mac OS X and Linux. All inquiries or comments about this software should be directed to Francisco Melo at fmelo@bio.puc.cl.
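
    The computation such a tool visualizes is classical. Below is a minimal sketch of global alignment by dynamic programming with a linear gap penalty (the application itself also supports other algorithm variants, custom similarity matrices and gap opening/extension penalties); the scores are arbitrary illustrative choices:

    ```python
    # Minimal global pairwise alignment (Needleman-Wunsch style) with a
    # linear gap penalty; illustrative only.
    def align(a: str, b: str, match=1, mismatch=-1, gap=-2):
        n, m = len(a), len(b)
        # score[i][j] = best score aligning a[:i] with b[:j]
        score = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            score[i][0] = i * gap
        for j in range(1, m + 1):
            score[0][j] = j * gap
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                s = match if a[i - 1] == b[j - 1] else mismatch
                score[i][j] = max(score[i - 1][j - 1] + s,   # (mis)match
                                  score[i - 1][j] + gap,     # gap in b
                                  score[i][j - 1] + gap)     # gap in a
        # Traceback to recover one optimal alignment
        out_a, out_b = [], []
        i, j = n, m
        while i > 0 or j > 0:
            s = match if i > 0 and j > 0 and a[i - 1] == b[j - 1] else mismatch
            if i > 0 and j > 0 and score[i][j] == score[i - 1][j - 1] + s:
                out_a.append(a[i - 1]); out_b.append(b[j - 1]); i -= 1; j -= 1
            elif i > 0 and score[i][j] == score[i - 1][j] + gap:
                out_a.append(a[i - 1]); out_b.append('-'); i -= 1
            else:
                out_a.append('-'); out_b.append(b[j - 1]); j -= 1
        return score[n][m], ''.join(reversed(out_a)), ''.join(reversed(out_b))

    print(align("GATTACA", "GCATGCU"))
    ```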

  2. TESPI (Tool for Environmental Sound Product Innovation): a simplified software tool to support environmentally conscious design in SMEs

    Science.gov (United States)

    Misceo, Monica; Buonamici, Roberto; Buttol, Patrizia; Naldesi, Luciano; Grimaldi, Filomena; Rinaldi, Caterina

    2004-12-01

    TESPI (Tool for Environmental Sound Product Innovation) is the prototype of a software tool developed within the framework of the "eLCA" project. The project (www.elca.enea.it), financed by the European Commission, is realising "On line green tools and services for Small and Medium sized Enterprises (SMEs)". The implementation by SMEs of environmental product innovation (as fostered by the European Integrated Product Policy, IPP) needs specific adaptation to their economic model, their knowledge of production and management processes and their relationships with innovation and the environment. In particular, quality and costs are the main driving forces of innovation in European SMEs, and well-known barriers exist to the adoption of an environmental approach in product design. Starting from these considerations, the TESPI tool has been developed to support the first steps of product design, taking into account both quality and the environment. Two main issues have been considered: (i) classic Quality Function Deployment (QFD) can hardly be proposed to SMEs; (ii) the environmental aspects of the product life cycle need to be integrated with the quality approach. TESPI is a user-friendly web-based tool, has a training approach and applies to modular products. Users are guided through the investigation of the quality aspects of their product (fulfilment of customer's needs and requirements) and the identification of the key environmental aspects in the product's life cycle. A simplified checklist allows analyzing the environmental performance of the product. Help is available for a better understanding of the analysis criteria. As a result, the significant aspects for the redesign of the product are identified.

  3. FORMATION OF SENIOR PUPILS ALGORITHMIC CULTURE IN THE PROCESS OF SOLVING COMPUTATIONAL PROBLEMS USING SOFTWARE TOOLS: RESULTS OF THE STUDY

    Directory of Open Access Journals (Sweden)

    Liudmyla V. Osipа

    2013-06-01

    Full Text Available The article introduces a new practical solution to the urgent problem of forming the algorithmic culture of senior pupils in the process of solving computational problems using software tools. Teaching conditions for forming the algorithmic culture of high school students while solving computational problems with software tools are identified and theoretically grounded, and a training program is developed as the elective course "Solving computational problems using software tools", whose introduction is necessary for implementing the teaching conditions for the formation of algorithmic culture.

  4. Conception and validation software tools for the level 0 muon trigger of LHCb

    CERN Document Server

    Aslanides, E; Cogan, J; Duval, P-Y; Le Gac, R; Hachon, F; Leroy, O; Liotard, P-L; Marin, F; Tsaregorodtsev, A

    2009-01-01

    The Level-0 muon trigger processor of the LHCb experiment looks for straight particles crossing the muon detector and measures their transverse momentum. It processes 40×10^6 proton-proton collisions per second. The tracking uses a road algorithm relying on the projectivity of the muon detector. The architecture of the Level-0 muon trigger is complex, with a dense network of data interconnections. The design and validation of such an intricate system has only been possible with intense use of software tools for the detector simulation, the modelling of the hardware components' behaviour and the validation. A database describing the dataflow is the cornerstone between the software and hardware components.

  6. SLDAssay: A software package and web tool for analyzing limiting dilution assays.

    Science.gov (United States)

    Trumble, Ilana M; Allmon, Andrew G; Archin, Nancie M; Rigdon, Joseph; Francis, Owen; Baldoni, Pedro L; Hudgens, Michael G

    2017-11-01

    Serial limiting dilution (SLD) assays are used in many areas of infectious disease related research. This paper presents SLDAssay, a free and publicly available R software package and web tool for analyzing data from SLD assays. SLDAssay computes the maximum likelihood estimate (MLE) for the concentration of target cells, with corresponding exact and asymptotic confidence intervals. Exact and asymptotic goodness of fit p-values, and a bias-corrected (BC) MLE are also provided. No other publicly available software currently implements the BC MLE or the exact methods. For validation of SLDAssay, results from Myers et al. (1994) are replicated. Simulations demonstrate the BC MLE is less biased than the MLE. Additionally, simulations demonstrate that exact methods tend to give better confidence interval coverage and goodness-of-fit tests with lower type I error than the asymptotic methods. Additional advantages of using exact methods are also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
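
    The statistical model behind such assays is compact enough to sketch. Under the standard single-hit Poisson assumption, a well plated with u cells is positive with probability 1 - exp(-c*u), and the MLE maximizes the resulting binomial likelihood. The Python sketch below illustrates this model only; SLDAssay itself is an R package and additionally provides the bias-corrected estimate and exact methods, which are omitted here. The data are invented:

    ```python
    # Illustrative MLE for target-cell concentration in an SLD assay under
    # the single-hit Poisson model; not the SLDAssay package itself.
    import math
    from scipy.optimize import minimize_scalar

    def neg_log_lik(log_c, cells, positives, replicates):
        c = math.exp(log_c)
        ll = 0.0
        for u, pos, n in zip(cells, positives, replicates):
            p = 1.0 - math.exp(-c * u)          # P(positive well)
            p = min(max(p, 1e-12), 1 - 1e-12)   # guard against log(0)
            ll += pos * math.log(p) + (n - pos) * math.log(1.0 - p)
        return -ll

    cells = [1e6, 2e5, 4e4, 8e3]     # cells plated per well at each dilution
    positives = [6, 4, 1, 0]         # positive wells observed
    replicates = [6, 6, 6, 6]        # wells per dilution

    res = minimize_scalar(neg_log_lik, bounds=(-30, 0), method="bounded",
                          args=(cells, positives, replicates))
    print("MLE concentration (per cell):", math.exp(res.x))
    ```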

  7. Techniques and tools for measuring energy efficiency of scientific software applications

    CERN Document Server

    Abdurachmanov, David; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Nurminen, Jukka K.; Nyback, Filip; Pestana, Goncalo; Ou, Zhonghong; Khan, Kashif

    2014-01-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running o...

  8. GelMap-a novel software tool for building and presenting proteome reference maps.

    Science.gov (United States)

    Rode, Christina; Senkler, Michael; Klodmann, Jennifer; Winkelmann, Traud; Braun, Hans-Peter

    2011-09-06

    Protein separation by two-dimensional gel electrophoresis is of central importance for proteomics. Upon combination with systematic protein identification by mass spectrometry, large data sets are routinely generated in several proteome laboratories which can be used as "reference maps" for future analyses of analogous biochemical fractions. Here we present GelMap, a novel software tool for the building, presentation and evaluation of proteomic reference maps. Variable frames are introduced in order to group proteins into functional categories on three levels or into categories according to differential abundance during comparative proteome analyses. The software is easy to handle as it only requires uploading two digital files to a web site. An additional file including detailed information on all proteins can be combined with the primary map. Two different gel-based projects are presented to illustrate the capacity of GelMap for proteome annotation and evaluation. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Web-based software tool for constraint-based design specification of synthetic biological systems.

    Science.gov (United States)

    Oberortner, Ernst; Densmore, Douglas

    2015-06-19

    miniEugene provides computational support for solving combinatorial design problems, enabling users to specify and enumerate designs for novel biological systems based on sets of biological constraints. This technical note presents a brief tutorial for biologists and software engineers in the field of synthetic biology on how to use miniEugene. After reading this technical note, users should know which biological constraints are available in miniEugene, understand the syntax and semantics of these constraints, and be able to follow a step-by-step guide to specify the design of a classical synthetic biological system, the genetic toggle switch [1]. We also provide links and references to more information on the miniEugene web application and the integration of the miniEugene software library into sophisticated Computer-Aided Design (CAD) tools for synthetic biology (www.eugenecad.org).

  10. Fuzzy cognitive map software tool for treatment management of uncomplicated urinary tract infection.

    Science.gov (United States)

    Papageorgiou, Elpiniki I

    2012-03-01

    Uncomplicated urinary tract infection (uUTI) is a bacterial infection that affects individuals with structurally and functionally normal urinary tracts. Suggesting the appropriate antibiotics and treatment to individuals suffering from uUTI is an important and complex task that demands special attention. How to decrease the unsafe use and consumption of antibiotics is an important issue in medical treatment. Aiming to model medical decision making for uUTI treatment, an innovative and flexible approach called fuzzy cognitive maps (FCMs) is proposed to handle uncertainty and missing information. The FCM is a promising technique for modeling knowledge and/or medical guidelines/treatment suggestions and reasoning with them. A software tool, namely FCM-uUTI DSS, is investigated in this work to produce a decision support module for uUTI treatment management. The software tool was tested (evaluated) on 38 patient cases, showing its functionality and demonstrating that the use of FCMs as dynamic models is reliable. The results show that the suggested FCM-uUTI tool gives a front-end decision on antibiotic suggestions for uUTI treatment, and its outputs are considered helpful references for physicians and patients. Due to its easy graphical representation and simulation process, the proposed FCM formalization could be used to make medical knowledge widely available through computer consultation systems. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
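
    The inference mechanism of a fuzzy cognitive map is simple to state: concept activations are repeatedly propagated through a signed weight matrix and squashed until a fixed point is reached. The sketch below illustrates that generic update rule; the three concepts and their weights are hypothetical placeholders, not the paper's clinical uUTI model:

    ```python
    # Generic FCM inference sketch with hypothetical concepts and weights.
    import numpy as np

    def fcm_infer(weights, state, steps=50, tol=1e-5):
        f = lambda x: 1.0 / (1.0 + np.exp(-x))      # sigmoid squashing
        for _ in range(steps):
            new_state = f(state @ weights + state)  # keep memory of past state
            if np.max(np.abs(new_state - state)) < tol:
                break
            state = new_state
        return state

    # 3 hypothetical concepts: symptom severity, bacterial load, antibiotic dose
    W = np.array([[ 0.0,  0.6, 0.4],
                  [ 0.7,  0.0, 0.5],
                  [-0.8, -0.9, 0.0]])
    print(fcm_infer(W, np.array([0.8, 0.6, 0.2])))
    ```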

  11. Software tools for manipulating fe mesh, virtual surgery and post-processing

    Directory of Open Access Journals (Sweden)

    Milašinović Danko Z.

    2009-01-01

    Full Text Available This paper describes a set of software tools that we developed for the calculation of fluid flow through cardiovascular organs. Our tools work with medical data from a CT scanner, but could be used with any other 3D input data. For meshing we used the Tetgen tetrahedral mesh generator, as well as a mesh re-generator that we developed for converting tetrahedral elements into bricks. After adequate meshing we used our PAKF solver for the calculation of fluid flow. For human-friendly presentation of results we developed a set of post-processing software tools. By modifying the 2D mesh (the boundary of a cardiovascular organ) it is possible to perform virtual surgery; in the case of an aorta with an aneurysm, received from the University Clinical Center in Heidelberg as multi-slice 64-CT scans, we removed the aneurysm and afterwards ran calculations on both geometrical models. The main idea of this methodology is to create a system that could be used in clinics.

  12. New tools for digital medical image processing implemented in DIP software

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Erica A.C.; Santana, Ivan E. [Instituto Federal de Educacao, Ciencia e Tecnologia de Pernambuco, Recife, PE (Brazil); Lima, Fernando R.A., E-mail: falima@cnen.gov.b [Centro Regional de Ciencias Nucleares, (CRCN/NE-CNEN-PE), Recife, PE (Brazil); Viera, Jose W. [Escola Politecnica de Pernambuco, Recife, PE (Brazil)

    2011-07-01

    The anthropomorphic models used in computational dosimetry, also called phantoms, are mostly built from stacks of CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) images obtained from scans of patients or volunteers. The construction of voxel phantoms requires computational processing for transforming image formats, compressing two-dimensional (2D) images to form three-dimensional (3D) arrays, quantization, resampling, enhancement, restoration and image segmentation, among other tasks. Computational dosimetry researchers rarely find all these capabilities in a single software package, which often slows their research or forces the inadequate use of alternative tools. The need to integrate the various digital image processing tasks required to obtain an image usable in a computational exposure model led to the development of the DIP (Digital Image Processing) software. This software reads, writes and edits binary files containing the 3D matrix corresponding to a stack of cross-sectional images of a given geometry, which can be a human body or another volume of interest. It can also read any common image format and perform conversions. When a task involves only one output image, it is saved in the standard JPEG format. When it involves a stack of images, the binary output file is called SGI (Interactive Graphic Simulations, a symbol already used in other publications of the Research Group in Numerical Dosimetry). This paper presents the third version of the DIP software and emphasizes the newly implemented tools. Currently it has the menus Basics, Views, Spatial Domain, Frequency Domain, Segmentations and Study. Each menu contains items and subitems with features that generally require an image as input and produce an image or an attribute as output. (author)
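
    The basic pipeline such a tool automates, reading a stack of cross-sectional images into a 3D matrix and quantizing grey values into tissue IDs, can be sketched as follows; the file names, thresholds and output naming here are illustrative stand-ins, not DIP's actual conventions:

    ```python
    # Sketch: assemble 2D slices into a 3D voxel array and quantize it.
    # Paths, thresholds and output format are assumptions for illustration.
    import glob
    import numpy as np
    from PIL import Image

    slices = sorted(glob.glob("ct_slices/*.png"))
    stack = np.stack([np.array(Image.open(p).convert("L")) for p in slices])  # (z, y, x)

    # Crude threshold quantization of grey values into tissue IDs
    # (0 = air, 1 = soft tissue, 2 = bone)
    tissue_ids = np.digitize(stack, bins=[50, 180]).astype(np.uint8)
    tissue_ids.tofile("phantom.bin")  # flat binary 3D matrix, in the spirit described above
    print(stack.shape, np.unique(tissue_ids))
    ```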

  13. A software tool for advanced MRgFUS prostate therapy planning and follow up

    Science.gov (United States)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR guided HIFU/FUS applications for the prostate have recently started clinical evaluation. Even though MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR guided FUS on prostate cancer was deeply analyzed. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab, with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the following MRgFUS therapy session. In addition, the developed software should help to evaluate therapy success, by synchronizing and displaying pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.

  14. Parameter definition using vibration prediction software leads to significant drilling performance improvements

    Energy Technology Data Exchange (ETDEWEB)

    Amorim, Dalmo; Hanley, Chris Hanley; Fonseca, Isaac; Santos, Juliana [National Oilwell Varco, Houston TX (United States); Leite, Daltro J.; Borella, Augusto; Gozzi, Danilo [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    ... field monitoring. Vibration prediction diminishes the importance of trial-and-error procedures such as drill-off tests, which are valid only for short sections. It also solves an existing lapse in Mechanical Specific Energy (MSE) real-time drilling control programs applying the theory of Teale, which states that a drilling system is perfectly efficient when it spends the exact energy needed to overcome the in situ rock strength. Using the proprietary software tool, this paper will examine the resonant vibration modes that may be initiated while drilling with different BHAs and drill string designs, showing that the combination of a proper BHA design along with the correct selection of input parameters results in an overall improvement to drilling efficiency. Also, because the BHA is analyzed predictively, the potential for vibration or stress fatigue in the drill string components is reduced, leading to a safer operation. In recent years there has been an increased focus on vibration detection, analysis, and mitigation techniques, where new technologies, like the Drilling Dynamics Data Recorders (DDDR), may provide the capability to capture high-frequency dynamics data at multiple points along the drilling system. These tools allow the achievement of drilling performance improvements not possible before, opening a whole new array of opportunities for optimization and for verification of predictions calculated by the drill string dynamics modeling software tool. The results of this study will identify how the dynamics of the drilling system, interacting with the formation, directly relate to inefficiencies and to the possible solutions to mitigate drilling vibrations in order to improve drilling performance. Software vibration prediction and downhole measurements can be used for non-drilling operations like drilling out casing or reaming, where extremely high vibration levels - devastating to the cutting structure of the bit before it has even touched bottom - have ...

  15. Improving Reuse in Software Development for the Life Sciences

    Science.gov (United States)

    Iannotti, Nicholas V.

    2013-01-01

    The last several years have seen unprecedented advancements in the application of technology to the life sciences, particularly in the area of data generation. Novel scientific insights are now often driven primarily by software development supporting new multidisciplinary and increasingly multifaceted data analysis. However, despite the…

  16. Improving Collaborative Learning in Online Software Engineering Education

    Science.gov (United States)

    Neill, Colin J.; DeFranco, Joanna F.; Sangwan, Raghvinder S.

    2017-01-01

    Team projects are commonplace in software engineering education. They address a key educational objective, provide students critical experience relevant to their future careers, allow instructors to set problems of greater scale and complexity than could be tackled individually, and are a vehicle for socially constructed learning. While all…

  17. Mining Program Source Code for Improving Software Quality

    Science.gov (United States)

    2013-01-01

    Rahul Pandita, Tao Xie, Nikolai Tillmann, Jonathan de Halleux. Guided test generation for coverage criteria. 2010 IEEE 26th International Conference on Software Maintenance (ICSM), Timișoara, Romania, 12 September 2010.

  18. The Development of an Analyses-Intensive Software for Improved ...

    African Journals Online (AJOL)

    The computer-aided software developed in this research work is used in designing cam systems by generating various follower motions and cam profiles. It is highly suited for extensive dynamics, kinematics and geometric design analysis based on some inherent features that are unique. The plate cam with either flat-face ...

  19. Software Cost and Schedule Estimating: A Process Improvement Initiative

    Science.gov (United States)

    1994-05-01

    In their text on SADT [Marca 87], Marca and McGowan describe the different interfaces as: Input: things used and transformed; Control: things that ... Software Engineering Institute, Carnegie Mellon University, November 1991. [Marca 87] Marca, D. A. and McGowan, C. L., Structured Analysis and Design ...

  20. Metrics to improve control in outsourcing software development projects

    NARCIS (Netherlands)

    Ponisio, Laura; van Eck, Pascal; Messnarz, R.; Ekert, D.; Christiansen, M.; Johansen, J.; Koinig, S.

    Measurements serve as vital instruments to control projects involving software development outsourcing. However, managers have found it difficult to develop and implement effective measurement programs, in part because guidelines for choosing among concrete measurements are scarce. We address this ...

  1. Evaluation of Farm Accounting Software. Improved Decision Making.

    Science.gov (United States)

    Lovell, Ashley C., Comp.

    This guide contains information on 36 computer programs used for farm and ranch accounting. This information and assessment of software features were provided by the manufacturers and vendors. Information is provided on the following items, among others: program name, vendor's name and address, computer and operating system, type of accounting and…

  2. Improving a data-acquisition software system with abstract data type components

    Science.gov (United States)

    Howard, S. D.

    1990-01-01

    Abstract data types and object-oriented design are active research areas in computer science and software engineering. Much of the interest is aimed at new software development. Abstract data type packages developed for a discontinued software project were used to improve a real-time data-acquisition system under maintenance. The result saved effort and contributed to a significant improvement in the performance, maintainability, and reliability of the Goldstone Solar System Radar Data Acquisition System.

  3. A software tool for STED-AFM correlative super-resolution microscopy

    Science.gov (United States)

    Koho, Sami; Deguchi, Takahiro; Löhmus, Madis; Näreoja, Tuomas; Hänninen, Pekka E.

    2015-03-01

    Multi-modal correlative microscopy allows combining the strengths of several imaging techniques to provide unique contrast. However, it is not always straightforward to set up instruments for such customized experiments, as most microscope manufacturers use their own proprietary software, with limited or no capability to interface with other instruments; this makes correlation of the multi-modal data extremely challenging. We introduce a new software tool for simultaneous use of a STimulated Emission Depletion (STED) microscope with an Atomic Force Microscope (AFM). In our experiments, a Leica TCS STED commercial super-resolution microscope was used together with an Agilent 5500ilm AFM microscope. With our software, it is possible to synchronize the data acquisition between the STED and AFM instruments, as well as to perform automatic registration of the AFM images with the super-resolution STED images. The software was realized in LabVIEW; the registration part was also implemented as an ImageJ script. The synchronization was realized by controlling simple trigger signals, also available in the commercial STED microscope, with a low-cost National Instruments USB-6501 digital I/O card. The registration was based on detecting the positions of the AFM tip inside the STED field-of-view, which were then used as registration landmarks. The registration should work on any STED and tip-scanning AFM microscope combination, at nanometer-scale precision. Our STED-AFM correlation method has been tested with a variety of nanoparticle and fixed cell samples. The software will be released under the BSD open-source license.
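
    The registration step described above reduces to a small least-squares problem once landmark pairs are known: given AFM tip positions detected in the STED field of view and their corresponding AFM coordinates, a 2D affine transform can be solved directly. A sketch with made-up coordinates:

    ```python
    # Landmark-based 2D affine registration by linear least squares;
    # coordinates are invented placeholders.
    import numpy as np

    sted_pts = np.array([[10.0, 12.0], [52.3, 11.8], [31.0, 44.5]])  # px
    afm_pts  = np.array([[1.02, 1.19], [5.21, 1.17], [3.08, 4.44]])  # um

    # Build [x y 1] design matrix and solve afm = [x, y, 1] @ A
    X = np.hstack([sted_pts, np.ones((len(sted_pts), 1))])
    A, *_ = np.linalg.lstsq(X, afm_pts, rcond=None)  # 3x2 affine parameters

    def sted_to_afm(p):
        return np.append(p, 1.0) @ A

    print(sted_to_afm([30.0, 20.0]))
    ```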

  4. adwTools Developed: New Bulk Alloy and Surface Analysis Software for the Alloy Design Workbench

    Science.gov (United States)

    Bozzolo, Guillermo; Morse, Jeffrey A.; Noebe, Ronald D.; Abel, Phillip B.

    2004-01-01

    A suite of atomistic modeling software, called the Alloy Design Workbench, has been developed by the Computational Materials Group at the NASA Glenn Research Center and the Ohio Aerospace Institute (OAI). The main goal of this software is to guide and augment experimental materials research and development efforts by creating powerful, yet intuitive, software that combines a graphical user interface with an operating code suitable for real-time atomistic simulations of multicomponent alloy systems. Targeted for experimentalists, the interface is straightforward and requires minimum knowledge of the underlying theory, allowing researchers to focus on the scientific aspects of the work. The centerpiece of the Alloy Design Workbench suite is the adwTools module, which concentrates on the atomistic analysis of surfaces and bulk alloys containing an arbitrary number of elements. An additional module, adwParams, handles ab initio input for the parameterization used in adwTools. Future modules planned for the suite include adwSeg, which will provide numerical predictions for segregation profiles to alloy surfaces and interfaces, and adwReport, which will serve as a window into the database, providing public access to the parameterization data and a repository where users can submit their own findings from the rest of the suite. The entire suite is designed to run on desktop-scale computers. The adwTools module incorporates a custom OAI/Glenn-developed Fortran code based on the BFS (Bozzolo-Ferrante-Smith) method for alloys (ref. 1). The heart of the suite, this code is used to calculate the energetics of different compositions and configurations of atoms.

  5. A software tool to assist business-process decision-making in the biopharmaceutical industry.

    Science.gov (United States)

    Mustafa, Mustafa A; Washbrook, John; Lim, Ai Chye; Zhou, Yuhong; Titchener-Hooker, Nigel J; Morton, Philip; Berezenko, Steve; Farid, Suzanne S

    2004-01-01

    Conventionally, software tools for the design of bioprocesses have provided only limited business-related information for decision-making. There is an industrial need to investigate manufacturing options and to gauge the impact of various decisions from economic as well as process perspectives. This paper describes the development and use of a tool to provide an assessment of whole flowsheets by capturing both process and business aspects. The tool is demonstrated by considering the issues concerned when making decisions between two potential flowsheets for a common product. A case study approach is used to compare the process and business benefits of a conventional process route employing packed chromatography beds and an alternative that uses expanded bed adsorption (EBA). The tool allows direct evaluation of the benefits of capital cost reduction and increased yield offered by EBA against penalties of using potentially more expensive EBA matrix with lower lifetimes. Furthermore, the tool provides the ability to gauge the process robustness of each flowsheet option.

  6. An open source browser-based software tool for graph drawing and visualisation

    OpenAIRE

    Vogt, Veit-Dieter

    2014-01-01

    In this research work we searched for open source libraries that support graph drawing and visualisation and can run in a browser. Subsequently, these libraries were evaluated to find out which one is best suited to this task. The result was that d3.js is the library with the greatest functionality, flexibility and customisability. Afterwards we developed an open source software tool that includes d3.js and is written in JavaScript so that it runs browser-based.

  7. On the Design of a Knowledge Management System for Incremental Process Improvement for Software Product Management

    NARCIS (Netherlands)

    Vlaanderen, K.; Brinkkemper, S.; van de Weerd, I.

    2012-01-01

    Incremental software process improvement deals with the challenges of step-wise process improvement in a time where resources are scarce and many organizations are struggling with the challenges of effective management of software products. Effective knowledge sharing and incremental approaches are

  8. Server-based enterprise collaboration software improves safety and quality in high-volume PET/CT practice.

    Science.gov (United States)

    McDonald, James E; Kessler, Marcus M; Hightower, Jeremy L; Henry, Susan D; Deloney, Linda A

    2013-12-01

    With increasing volumes of complex imaging cases and rising economic pressure on physician staffing, timely reporting will become progressively challenging. Current and planned iterations of PACS and electronic medical record systems do not offer workflow management tools to coordinate delivery of imaging interpretations with the needs of the patient and ordering physician. The adoption of a server-based enterprise collaboration software system by our Division of Nuclear Medicine has significantly improved our efficiency and quality of service.

  9. How can GPs drive software changes to improve healthcare for Aboriginal and Torres Strait Islanders peoples?

    Science.gov (United States)

    Kehoe, Helen

    2017-01-01

    Changes to the software used in general practice could improve the collection of the Aboriginal and Torres Strait Islander status of all patients, and boost access to healthcare measures specifically for Aboriginal and Torres Strait Islander peoples provided directly or indirectly by general practitioners (GPs). Despite longstanding calls for improvements to general practice software to better support Aboriginal and Torres Strait Islander health, little change has been made. The aim of this article is to promote software improvements by identifying desirable software attributes and encouraging GPs to promote their adoption. Establishing strong links between collecting Aboriginal and Torres Strait Islander status, clinical decision supports, and uptake of GP-mediated health measures specifically for Aboriginal and Torres Strait Islander peoples - and embedding these links in GP software - is a long overdue reform. In the absence of government initiatives in this area, GPs are best placed to advocate for software changes, using the model described here as a starting point for action.

  10. A software tool for creating simulated outbreaks to benchmark surveillance systems

    Directory of Open Access Journals (Sweden)

    Olson Karen L

    2005-07-01

    Full Text Available Abstract. Background: Evaluating surveillance systems for the early detection of bioterrorism is particularly challenging when systems are designed to detect events for which there are few or no historical examples. One approach to benchmarking outbreak detection performance is to create semi-synthetic datasets containing authentic baseline patient data (noise) and injected artificial patient clusters (signal). Methods: We describe a software tool, the AEGIS Cluster Creation Tool (AEGIS-CCT), that enables users to create simulated clusters with controlled feature sets, varying the desired cluster radius, density, distance, relative location from a reference point, and temporal epidemiological growth pattern. AEGIS-CCT does not require the use of an external geographical information system program for cluster creation. The cluster creation tool is an open-source program, implemented in Java, and freely available under the GNU Lesser General Public License at its SourceForge website. Cluster data are written to files or can be appended to existing files so that the resulting file includes both the existing baseline and the artificially added cases. Multiple-cluster file creation is an automated process in which multiple cluster files are created by varying a single parameter within a user-specified range. To evaluate the output of this software tool, sets of test clusters were created and graphically rendered. Results: Based on user-specified parameters describing the location, properties, and temporal pattern of simulated clusters, AEGIS-CCT created clusters accurately and uniformly. Conclusion: AEGIS-CCT enables the ready creation of datasets for benchmarking outbreak detection systems. It may be useful for automating the testing and validation of spatial and temporal cluster detection algorithms.
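
    The essence of the cluster-injection approach is easy to sketch: sample artificial case locations around a centre within a chosen radius, spread them over days following a growth pattern, and append them to a baseline file. The parameter names and file layout below are illustrative, not AEGIS-CCT's actual options:

    ```python
    # Sketch of semi-synthetic cluster injection; parameters are assumptions.
    import csv, math, random

    def make_cluster(cx, cy, radius, cases_per_day, days):
        cases = []
        for day in range(days):
            n = cases_per_day * (day + 1)                 # linear temporal growth
            for _ in range(n):
                r = radius * math.sqrt(random.random())   # uniform over a disc
                a = random.uniform(0, 2 * math.pi)
                cases.append((day, cx + r * math.cos(a), cy + r * math.sin(a)))
        return cases

    # Append the signal to an existing baseline (noise) file
    with open("baseline.csv", "a", newline="") as f:
        csv.writer(f).writerows(make_cluster(42.36, -71.06, 0.02, 3, 5))
    ```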

  11. Transfer Learning for Improving Model Predictions in Highly Configurable Software

    OpenAIRE

    Jamshidi, Pooyan; Velez, Miguel; Kästner, Christian; Siegmund, Norbert; Kawthekar, Prasad

    2017-01-01

    Modern software systems are built to be used in dynamic environments using configuration capabilities to adapt to changes and external uncertainties. In a self-adaptation context, we are often interested in reasoning about the performance of the systems under different configurations. Usually, we learn a black-box model based on real measurements to predict the performance of the system given a specific configuration. However, as modern systems become more complex, there are many configuratio...
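
    One simple instance of this idea, a model-shift style of transfer sketched here under invented data (and not necessarily the method of the paper), learns a performance model from plentiful source-environment measurements and corrects it with a handful of target-environment measurements:

    ```python
    # Illustrative model-shift transfer for configuration performance prediction.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    configs = rng.integers(0, 2, size=(200, 10)).astype(float)   # binary options
    perf_source = configs @ rng.uniform(1, 5, 10) + rng.normal(0, 0.1, 200)
    perf_target = 1.7 * perf_source + 3.0 + rng.normal(0, 0.2, 200)  # related environment

    # Cheap: learn a model in the source environment
    source_model = RandomForestRegressor(n_estimators=100).fit(configs, perf_source)

    # Expensive: only a few measurements in the target environment
    few = rng.choice(200, 15, replace=False)
    shift = LinearRegression().fit(
        source_model.predict(configs[few]).reshape(-1, 1), perf_target[few])

    pred_target = shift.predict(source_model.predict(configs).reshape(-1, 1))
    print("mean abs error:", np.mean(np.abs(pred_target - perf_target)))
    ```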

  12. A Survey of Commonly Applied Methods for Software Process Improvement

    Science.gov (United States)

    1994-02-01

    may be overly optimistic, which may result in the '90 percent syndrome', wherein they claim to be 90 percent done for the last 50 percent of the ... Management, 13, 1-10, 1987. [Abdel-Hamid 88] Abdel-Hamid, T. K., "Understanding the '90% Syndrome' in Software Project Management: A Simulation-Based Case ... experienced by the customer. In a typical automotive example, an engineering factor might be the strength of a windshield wiper spring and the ...

  13. Life Cycle Assessment Studies of Chemical and Biochemical Processes through the new LCSoft Software-tool

    DEFF Research Database (Denmark)

    Supawanich, Perapong; Malakul, Pomthong; Gani, Rafiqul

    2015-01-01

    Life Cycle Assessment (LCA) is an effective tool for quantifying the potential environmental impacts of products, processes, or services in order to support the selection of desired products and/or processes from different alternatives. For more sustainable process designs, technical requ... on the LCI assessment results. The fourth task has been added to validate and improve LCSoft by testing it against several case studies and comparing the assessment results with other available tools.

  14. Easyverifier 1.0: a software tool for revising scientific articles’ bibliographical citations

    Directory of Open Access Journals (Sweden)

    Freddy Alberto Correa Riveros

    2010-05-01

    Full Text Available The first academic revolution, which occurred in developed countries during the late 19th century, made research a university function in addition to the traditional task of teaching. A second academic revolution has tried to transform the university into a teaching, research and socio-economic development enterprise. The scientific article has become an excellent practical means for moving new knowledge between the university and the socio-economic environment. This work had two purposes. One was to present some general considerations regarding research and the scientific article. The second was to provide information about a computational tool that supports the revision of scientific articles' bibliographical citations; this step is usually done manually and requires some experience. The software reads two text files, one containing the scientific article's content and the other the bibliography. A report is then generated that identifies the authors mentioned in the text but not indexed in the bibliography, and determines which authors are listed in the bibliography but not mentioned in the text of the article. The software allows researchers and journal coordinators to detect reference errors between the citations in the text and the bibliographical references. The steps to develop the software were analysis, design, implementation and use. For the analysis, a review of the literature on the preparation of citations in scientific documents was important.
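
    The cross-check such a tool performs can be approximated in a few lines: extract author-year citations from the article text, extract leading surnames from the bibliography, and report both directions of mismatch. Real citation styles need far more robust parsing than this regex sketch:

    ```python
    # Minimal citation/bibliography cross-check sketch; illustrative only.
    import re

    def check(article_text: str, bibliography: str):
        # Author-year citations such as (Smith, 2008) in the body text
        cited = set(re.findall(r"\(([A-Z][a-zA-Z]+),\s*\d{4}\)", article_text))
        # Leading surnames of bibliography entries, one entry per line
        listed = set(re.findall(r"^([A-Z][a-zA-Z]+),", bibliography, re.MULTILINE))
        return sorted(cited - listed), sorted(listed - cited)

    text = "Prior work (Smith, 2008) and (Gomez, 2010) supports this."
    bib = "Gomez, L. (2010). ...\nPerez, J. (2005). ..."
    missing_in_bib, uncited = check(text, bib)
    print("cited but not in bibliography:", missing_in_bib)
    print("in bibliography but never cited:", uncited)
    ```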

  15. Isobar(PTM): a software tool for the quantitative analysis of post-translationally modified proteins.

    Science.gov (United States)

    Breitwieser, Florian P; Colinge, Jacques

    2013-09-02

    The establishment of extremely powerful proteomics platforms able to map thousands of modification sites, e.g. phosphorylations or acetylations, over entire proteomes calls for equally powerful software tools to effectively extract useful and reliable information from such complex datasets. We present a new quantitative PTM analysis platform aimed at processing iTRAQ or Tandem Mass Tags (TMT) labeled peptides. It covers a broad range of needs associated with proper PTM ratio analysis, such as PTM localization validation, robust ratio computation and statistical assessment, and navigable user report generation. Isobar(PTM) is made available as an R Bioconductor package and it can be run from the command line by non-R specialists. IsobarPTM is a new software tool facilitating the quantitative analysis of protein modification regulation, streamlining important issues related to PTM localization and statistical modeling. Users are provided with a navigable spreadsheet report, which also annotates already-public modification sites. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. DSSR: an integrated software tool for dissecting the spatial structure of RNA.

    Science.gov (United States)

    Lu, Xiang-Jun; Bussemaker, Harmen J; Olson, Wilma K

    2015-12-02

    Insight into the three-dimensional architecture of RNA is essential for understanding its cellular functions. However, even the classic transfer RNA structure contains features that are overlooked by existing bioinformatics tools. Here we present DSSR (Dissecting the Spatial Structure of RNA), an integrated and automated tool for analyzing and annotating RNA tertiary structures. The software identifies canonical and noncanonical base pairs, including those with modified nucleotides, in any tautomeric or protonation state. DSSR detects higher-order coplanar base associations, termed multiplets. It finds arrays of stacked pairs, classifies them by base-pair identity and backbone connectivity, and distinguishes a stem of covalently connected canonical pairs from a helix of stacked pairs of arbitrary type/linkage. DSSR identifies coaxial stacking of multiple stems within a single helix and lists isolated canonical pairs that lie outside of a stem. The program characterizes 'closed' loops of various types (hairpin, bulge, internal, and junction loops) and pseudoknots of arbitrary complexity. Notably, DSSR employs isolated pairs and the ends of stems, whether pseudoknotted or not, to define junction loops. This new, inclusive definition provides a novel perspective on the spatial organization of RNA. Tests on all nucleic acid structures in the Protein Data Bank confirm the efficiency and robustness of the software, and applications to representative RNA molecules illustrate its unique features. DSSR and related materials are freely available at http://x3dna.org/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  18. ConfocalCheck - A Software Tool for the Automated Monitoring of Confocal Microscope Performance

    Science.gov (United States)

    Hng, Keng Imm; Dormann, Dirk

    2013-01-01

    Laser scanning confocal microscopy has become an invaluable tool in biomedical research but regular quality testing is vital to maintain the system’s performance for diagnostic and research purposes. Although many methods have been devised over the years to characterise specific aspects of a confocal microscope like measuring the optical point spread function or the field illumination, only very few analysis tools are available. Our aim was to develop a comprehensive quality assurance framework ranging from image acquisition to automated analysis and documentation. We created standardised test data to assess the performance of the lasers, the objective lenses and other key components required for optimum confocal operation. The ConfocalCheck software presented here analyses the data fully automatically. It creates numerous visual outputs indicating potential issues requiring further investigation. By storing results in a web browser compatible file format the software greatly simplifies record keeping allowing the operator to quickly compare old and new data and to spot developing trends. We demonstrate that the systematic monitoring of confocal performance is essential in a core facility environment and how the quantitative measurements obtained can be used for the detailed characterisation of system components as well as for comparisons across multiple instruments. PMID:24224017

  19. USING A PEDAGOGICAL TOOL TO IMPROVE LEARNING:

    DEFF Research Database (Denmark)

    Bagger, Bettan; Taylor Kelly, Hélène; Hørdam, Britta

    Abstract Content: Nurses work in complex organizations and solve multifaceted problems in an ever-changing society when meeting patient needs. Therefore there is a need to develop students' skills in the use of evidence-based literature in clinical decision making. A retrospective study at University College Zealand of nursing students' bachelor projects highlighted students' difficulties when categorizing and evaluating research literature. Students relied upon introductory textbooks as a major source of information and used proportionately few research-based resources. In 2011 a pilot project aimed at raising students' awareness with respect to the choice and assessment of literature was initiated and students were introduced to a new pedagogical tool. Effects of the educational intervention were measured via quantitative and qualitative data and a follow-up clinical intervention ...

  20. Improving the Formatting Tools of CDS Invenio

    CERN Document Server

    Caffaro, J; Pu Faltings, Pearl

    2006-01-01

    CDS Invenio is the web-based integrated digital library system developed at CERN. It is a strategic tool that supports the archival and open dissemination of documents produced by CERN researchers. This paper reports on my Master's thesis work done on BibFormat, a module in CDS Invenio, which formats document metadata. The goal of this project was to implement a completely new formatting module for CDS Invenio. In this report a strong emphasis is put on the user-centered design of the new BibFormat. The bibliographic formatting process and its requirements are discussed. The task analysis and its resulting interaction model are detailed. The document also shows the implemented user interface of BibFormat and gives the results of the user evaluation of this interface. Finally, the results of a small usability study of the formats included in CDS Invenio are discussed.

  1. [Adverse event sheets, a quality improvement tool].

    Science.gov (United States)

    Didry, Pascale; Lapp, Aymeric

    2017-05-01

    The declaration of adverse events comprises a written nurse report which helps to improve the quality and safety of care. Submitted to professionals from the quality department, this report will be used to perform an analysis of the causes and must therefore be descriptive and objective. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  2. Software maintenance : improvement through better development standards and documentation : final report, 1 Jan 80 to 1 Jan 82

    OpenAIRE

    Schneidewind, Norman Floyd

    1982-01-01

    Software maintenance is frequently the most expensive phase of the software life cycle. It is also the phase which has received insufficient attention by management and software developers. Software standards have improved the ability of the software community to develop and design software. Unfortunately, most standards do not deal with the maintenance phase in a substantive way. Since maintainability has to be designed into the software and cannot be achieved... Prepared f...

  3. Beautiful Testing Leading Professionals Reveal How They Improve Software

    CERN Document Server

    Goucher, Adam

    2009-01-01

    Successful software depends as much on scrupulous testing as it does on solid architecture or elegant code. But testing is not a routine process, it's a constant exploration of methods and an evolution of good ideas. Beautiful Testing offers 23 essays from 27 leading testers and developers that illustrate the qualities and techniques that make testing an art. Through personal anecdotes, you'll learn how each of these professionals developed beautiful ways of testing a wide range of products -- valuable knowledge that you can apply to your own projects. Here's a sample of what you'll find i

  4. Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory

    Science.gov (United States)

    Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.

    1994-01-01

    As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.

  5. Online Tool to Improve Language Proficiency

    Directory of Open Access Journals (Sweden)

    Galina Kavaliauskienė

    2011-06-01

    Full Text Available This paper aims at examining students' attitudes to the application of Information and Communication Technology for improving listening skills through online dictation. Dictation has been used in language testing for a long time, but its benefit in language teaching/learning has never been discussed by language teachers. This article is an attempt to put a useful but now undervalued technique back into language teaching activities. However, there is no data on the usefulness of dictation at university level. The research methods include students' responses to a specially designed questionnaire. The participants in this study are students of two different specializations who study English for Specific Purposes at the Faculty of Social Policy, Mykolas Romeris University, Lithuania. Our study shows that class dictations of authentic materials are beneficial to students at tertiary level, as they help improve listening and writing skills and raise awareness of problematic language areas. Statistical processing by means of SPSS (Statistical Package for the Social Sciences) has proved that the findings are valid beyond the studied sample. The advice for language teachers is to employ the online dictation technique in the language classroom in a way that is beneficial to students.

  6. The E-learning Circle – a holistic software design tool for e-learning

    Directory of Open Access Journals (Sweden)

    Arvid Staupe

    2010-07-01

    Full Text Available The article introduces the E-learning Circle, a tool developed to assure the quality of the software design process of e-learning systems, considering pedagogical principles as well as technology. The E-learning Circle consists of a number of concentric circles which are divided into three sectors. The content of the inner circles is based on pedagogical principles, while the outer circle specifies how the pedagogical principles may be implemented with technology. The circle’s centre is dedicated to the subject taught, ensuring focus on the specific subject’s properties. The three sectors represent the student, the teacher and the learning objectives. The strengths of the E-learning Circle are the compact presentation combined with the overview it provides, as well as the usefulness of a design tool dealing with complexity, providing a common language and embedding best practice. The E-learning Circle is not a prescriptive method, but is useful in several design models and processes. The article presents two projects where the E-learning Circle was used as a design tool.

  7. Biogem: an effective tool based approach for scaling up open source software development in bioinformatics

    NARCIS (Netherlands)

    Bonnal, R.J.P.; Smant, G.; Prins, J.C.P.

    2012-01-01

    Biogem provides a software development environment for the Ruby programming language, which encourages community-based software development for bioinformatics while lowering the barrier to entry and encouraging best practices. Biogem, with its targeted modular and decentralized approach, software

  8. Comparison of the Effectiveness of Stylewriter and Microsoft Word Computer Software to Improve English Writing Skills

    Science.gov (United States)

    Prvinchandar, Sunita; Ayub, Ahmad Fauzi Mohd

    2014-01-01

    This study compared the effectiveness of two types of computer software for improving the English writing skills of pupils in a Malaysian primary school. Sixty students who participated in the seven-week training course were divided into two groups, with the experimental group using the StyleWriter software and the control group using the…

  9. The Virtual Genetics Lab II: Improvements to a Freely Available Software Simulation of Genetics

    Science.gov (United States)

    White, Brian T.

    2012-01-01

    The Virtual Genetics Lab II (VGLII) is an improved version of the highly successful genetics simulation software, the Virtual Genetics Lab (VGL). The software allows students to use the techniques of genetic analysis to design crosses and interpret data to solve realistic genetics problems involving a hypothetical diploid insect. This is a brief…

  10. Publicly available software tools for decision-makers during an emergent epidemic-Systematic evaluation of utility and usability.

    Science.gov (United States)

    Heslop, David James; Chughtai, Abrar Ahmad; Bui, Chau Minh; MacIntyre, C Raina

    2017-12-01

    Epidemics and emerging infectious diseases are becoming an increasing threat to global populations, challenging public health practitioners, decision makers and researchers to plan, prepare, identify and respond to outbreaks in near-real-time timeframes. The aim of this research is to evaluate the range of public domain and freely available software epidemic modelling tools. Twenty freely utilisable software tools underwent assessment of software usability, utility and key functionalities. Stochastic and agent-based tools were found to be highly flexible and adaptable, with high utility and many features, but low usability. Deterministic tools were highly usable with average to good levels of utility. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  11. Improving adherence to acne treatment: the emerging role of application software

    Directory of Open Access Journals (Sweden)

    Park C

    2014-02-01

    Full Text Available Chanhyun Park,1 Gilwan Kim,1 Isha Patel,2 Jongwha Chang,3 Xi Tan2 1College of Pharmacy, University of Texas at Austin, Austin, TX, USA; 2College of Pharmacy, University of Michigan, Ann Arbor, MI, USA; 3McWhorter School of Pharmacy, Samford University, Birmingham, AL, USA Objective: To examine recent studies on the effect of mobile and electronic (ME-health) technology on adherence to acne treatment. Background: With the emerging use of ME-health technology, there is a growing interest in evaluating the effectiveness of these tools on medication adherence. Examples of ME-health technology-based tools include text message-based pill reminders and Web-based patient education. Methods: MEDLINE, Cochrane Library, and Web of Science were searched for articles on adherence to acne treatment published through November 2013. A combination of search terms such as "acne" and "adherence" or "compliance" was used. Results: Adherence to oral acne medication was higher than for topical acne medication. The frequency of office visits was also an influencing factor for acne treatment adherence. Telephone-based reminders on a daily basis did not improve acne patients' medication adherence, whereas Web-based educational tools on a weekly basis had a positive effect on medication adherence in acne treatment. Conclusion: In using ME-health interventions, factors such as medication dosage forms, frequency of intervention, and patients' preferences should be taken into consideration. Developing disease-specific text message reminders may be helpful to increase adherence rates. In addition, a combination of text message reminders with another type of intervention may improve medication adherence. Keywords: acne vulgaris, medication adherence, compliance, mobile and electronic health technology, application software, telemedicine

  12. English-based Pediatric Emergency Medicine Software Improves Physician Test Performance on Common Pediatric Emergencies: A Multicenter Study in Vietnam

    Directory of Open Access Journals (Sweden)

    Michelle Lin

    2013-09-01

    Full Text Available Introduction: Global health agencies and the Vietnam Ministry of Health have identified pediatric emergency care and health information technology as high-priority goals. Clinical decision support (CDS) software provides physicians with access to current literature to answer clinical queries, but there is limited impact data in developing countries. We hypothesized that Vietnamese physicians would demonstrate improved test performance on common pediatric emergencies using CDS technologies despite their being in English. Methods: This multicenter, prospective, pretest-posttest study, conducted in 11 Vietnamese hospitals, enrolled a convenience sample of physicians who attended an 80-minute training session on a pediatric CDS software (PEMSoft). Two multiple-choice exams (A, B) were administered before and after the session. Participants who received Test A as a pretest received Test B as a posttest, and vice versa. Participants used the CDS software for the posttest. The primary outcome measure was the mean percentage difference in physician scores between the pretest and posttest, as calculated by a paired, two-tailed t-test. Results: For the 203 participants, the mean pretest, posttest, and improvement scores were 37% (95% CI: 35-38%), 70% (95% CI: 68-72%), and 33% (95% CI: 30-36%), respectively, with p<0.0001. This represents an 89% improvement over baseline. Subgroup analysis of practice setting, clinical experience, and comfort level with written English and computers showed that all subgroups improved their test scores equivalently. Conclusion: After brief training, Vietnamese physicians can effectively use an English-based CDS software, based on improved performance on a written clinical exam. Given this rapid improvement, CDS technologies may serve as a transformative tool in resource-poor environments. [West J Emerg Med. 2013;14(5):471-476.]
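
    The analysis described above reduces to a paired, two-tailed t-test on each physician's pretest and posttest scores. A minimal sketch of that calculation (the score arrays below are invented placeholders, not the study data):

    ```python
    # Paired, two-tailed t-test on pretest/posttest scores, as in the study
    # design described above. The numbers below are invented placeholders.
    import numpy as np
    from scipy import stats

    pretest = np.array([35.0, 40.0, 30.0, 42.0, 38.0])   # % correct before training
    posttest = np.array([68.0, 75.0, 62.0, 71.0, 70.0])  # % correct after training

    t_stat, p_value = stats.ttest_rel(posttest, pretest)  # paired, two-tailed by default
    print(f"mean improvement: {(posttest - pretest).mean():.1f} percentage points")
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    ```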

  13. The use of generalised audit software by internal audit functions in a developing country: The purpose of the use of generalised audit software as a data analytics tool

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2017-11-01

    Full Text Available This article explores the purpose of the use of generalised audit software as a data analytics tool by internal audit functions in the locally controlled banking industry of South Africa. The evolution of the traditional internal audit methodology of collecting audit evidence through the conduct of interviews, the completion of questionnaires, and by testing controls on a sample basis is long overdue, and such practice in the present technological, data-driven era will soon render such an internal audit function obsolete. The research results indicate that respondents are utilising GAS for a variety of purposes, but that its frequency of use is not yet optimal and that there is still much room for improvement for tests of controls purposes. The top five purposes for which the respondents make use of GAS often to always during separate internal audit engagements are: (1) to identify transactions with specific characteristics or control criteria for tests of control purposes; (2) for conducting full population analysis; (3) to identify account balances over a certain amount; (4) to identify and report on the frequency of occurrence of risks or frequency of occurrence of specific events; and (5) to obtain audit evidence about control effectiveness.

  14. BioBrick assembly standards and techniques and associated software tools.

    Science.gov (United States)

    Røkke, Gunvor; Korvald, Eirin; Pahr, Jarle; Oyås, Ove; Lale, Rahmi

    2014-01-01

    The BioBrick idea was developed to introduce the engineering principles of abstraction and standardization into synthetic biology. BioBricks are DNA sequences that serve a defined biological function and can be readily assembled with any other BioBrick parts to create new BioBricks with novel properties. In order to achieve this, several assembly standards can be used. Which assembly standards a BioBrick is compatible with depends on the prefix and suffix sequences surrounding the part. In this chapter, five of the most common assembly standards will be described, as well as some of the most used assembly techniques and cloning procedures, together with a presentation of the available software tools that can be used for deciding on the best method for assembling different BioBricks and for searching for BioBrick parts in the Registry of Standard Biological Parts database.

  15. Automated tools and techniques for distributed Grid Software Development of the testbed infrastructure

    CERN Document Server

    Aguado Sanchez, C

    2007-01-01

    Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great power of this solution requires the definition of a generic stack of services and protocols, and this is the scope of the different Grid initiatives. As a result of international collaborations for its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA), which aims to define the common set of services that will enable interoperability across the different implementations. This master thesis has been developed in this framework, as part of the two European-funded projects ETICS and OMII-Europe. The main objective is to contribute to the design and maintenance of large distributed development projects with an automated tool that enables the implementation of Software Engineering techniques oriented towards achieving an acceptable level of quality in the release process. Specifically, this thesis develops the testbed concept a...

  16. OligoSpawn: a software tool for the design of overgo probes from large unigene datasets

    Directory of Open Access Journals (Sweden)

    Jiang Tao

    2006-01-01

    Full Text Available Abstract Background Expressed sequence tag (EST) datasets represent perhaps the largest collection of genetic information. ESTs can be exploited in a variety of biological experiments and analysis. Here we are interested in the design of overlapping oligonucleotide (overgo) probes from large unigene (EST-contig) datasets. Results OLIGOSPAWN is a suite of software tools that offers two complementary services, namely (1) the selection of "unique" oligos each of which appears in one unigene but does not occur (exactly or approximately) in any other and (2) the selection of "popular" oligos each of which occurs (exactly or approximately) in as many unigenes as possible. In this paper, we describe the functionalities of OLIGOSPAWN and the computational methods it employs, and we report on experimental results for the overgo probes designed with it. Conclusion The algorithms we designed are highly efficient and capable of processing unigene datasets of sizes on the order of several tens of Mb in a few hours on a regular PC. The software has been used to design overgo probes employed to screen a barley BAC library (Hordeum vulgare). OLIGOSPAWN is freely available at http://oligospawn.ucr.edu/.
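
    The two selection services can be illustrated with a simplified, exact-match version of the underlying counting problem: index every length-k window of each unigene, then report oligos occurring in exactly one unigene ("unique") or in the most unigenes ("popular"). A minimal sketch (OLIGOSPAWN itself also handles approximate occurrences, which this toy version ignores):

    ```python
    # Exact-match toy version of "unique" vs. "popular" oligo selection.
    # OLIGOSPAWN also considers approximate matches; this sketch does not.
    from collections import defaultdict

    def oligo_index(unigenes, k):
        """Map each k-mer to the set of unigene ids that contain it."""
        index = defaultdict(set)
        for uid, seq in unigenes.items():
            for i in range(len(seq) - k + 1):
                index[seq[i:i + k]].add(uid)
        return index

    unigenes = {"u1": "ACGTACGTGGCA", "u2": "TTACGTACGTGG", "u3": "GGCATTTTACGG"}
    index = oligo_index(unigenes, k=8)

    unique = sorted(o for o, ids in index.items() if len(ids) == 1)
    popular, members = max(index.items(), key=lambda kv: len(kv[1]))
    print(f"{len(unique)} unique oligos; most popular: {popular} in {sorted(members)}")
    ```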

  17. PREDICTION OF SMARTPHONES’ PERCEIVED IMAGE QUALITY USING SOFTWARE EVALUATION TOOL VIQET

    Directory of Open Access Journals (Sweden)

    Pinchas ZOREA

    2016-12-01

    Full Text Available Considerable resources and effort have been devoted in recent years to assessing how smartphone users perceive image quality. Unfortunately, only limited success has been achieved, and image quality assessment is still based on extensive human visual testing. The paper describes a new model of perceived quality based on human visual tests compared with image analysis by the software evaluation tool. The values of the perceived image quality parameters (brightness, contrast, color saturation and sharpness) were calibrated against results from human visual experiments.

  18. Evaluating a digital ship design tool prototype: Designers' perceptions of novel ergonomics software.

    Science.gov (United States)

    Mallam, Steven C; Lundh, Monica; MacKinnon, Scott N

    2017-03-01

    Computer-aided solutions are essential for naval architects to manage and optimize technical complexities when developing a ship's design. Although there is an array of software solutions aimed at optimizing the human element in design, practical ergonomics methodologies and technological solutions have struggled to gain widespread application in ship design processes. This paper explores how a new ergonomics technology is perceived by naval architecture students using a mixed-methods framework. Thirteen Naval Architecture and Ocean Engineering Masters students participated in the study. Overall, results found that participants perceived the software and its embedded ergonomics tools to benefit their design work, increasing their empathy for and ability to understand the work environment and work demands end-users face. However, participants questioned whether ergonomics could be practically and efficiently implemented under real-world project constraints. This revealed underlying social biases and a fundamental lack of understanding in engineering postgraduate students regarding applied ergonomics in naval architecture. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. A Software Tool for Optimal Sizing of PV Systems in Malaysia

    Directory of Open Access Journals (Sweden)

    Tamer Khatib

    2012-01-01

    Full Text Available This paper presents a MATLAB-based, user-friendly software tool called PV.MY for optimal sizing of photovoltaic (PV) systems. The software has the capability of predicting meteorological variables such as solar energy, ambient temperature and wind speed using an artificial neural network (ANN); it optimizes the PV module/array tilt angle, optimizes the inverter size, and calculates optimal capacities of the PV array, battery, wind turbine and diesel generator in hybrid PV systems. The ANN-based model for meteorological prediction uses four variables, namely sunshine ratio, day number and location coordinates. As for PV system sizing, iterative methods are used for determining the optimal sizing of three types of PV systems: standalone PV systems, hybrid PV/wind systems and hybrid PV/diesel generator systems. The loss of load probability (LLP) technique is used for optimization, in which the energy source capacities are the variables to be optimized considering a very low LLP. As for determining the optimal PV panel tilt angle and inverter size, the Liu and Jordan model for solar energy incident on a tilted surface is used in optimizing the monthly tilt angle, while a model of the inverter efficiency curve is used in the optimization of inverter size.
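
    The loss of load probability criterion at the heart of the sizing routine can be illustrated with a toy hourly energy balance: for each candidate array/battery pair, count the timesteps in which generation plus storage cannot meet demand, and accept the smallest pair whose LLP stays below a target. A rough sketch under invented generation and load series (not the PV.MY models):

    ```python
    # Toy LLP-based sizing loop in the spirit of the iterative method described
    # above. Generation and load series are invented; PV.MY's models differ.

    def llp(pv_kw, batt_kwh, gen_per_kw, load, eff=0.9):
        """Fraction of timesteps in which demand cannot be met (loss of load)."""
        soc, failures = batt_kwh, 0
        for g, d in zip(gen_per_kw, load):
            soc += pv_kw * g * eff - d        # charge on surplus, discharge on deficit
            if soc < 0:                        # battery exhausted: load is lost
                failures += 1
            soc = min(batt_kwh, max(0.0, soc))
        return failures / len(load)

    gen_per_kw = [0, 0, 0.2, 0.5, 0.6, 0.4, 0.1, 0] * 3  # kWh generated per kW installed
    load = [0.3, 0.2, 0.4, 0.5, 0.6, 0.7, 0.8, 0.5] * 3  # kWh demanded per timestep

    feasible = [(pv, b) for pv in range(1, 10) for b in range(1, 10)
                if llp(pv, b, gen_per_kw, load) <= 0.01]  # 1% LLP target
    print("smallest feasible (PV kW, battery kWh):",
          min(feasible, key=sum) if feasible else None)
    ```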

  20. Cloud-Based SimJavaWeb Software Tool to Learn Simulation

    Directory of Open Access Journals (Sweden)

    A. Yu. Bykov

    2017-01-01

    Full Text Available Currently, in simulation there is a trend towards using distributed software tools, particularly ones using cloud technologies and the Internet. The article considers an educational simulation tool, implemented as a web application in Java with a special Java class library developed for simulation. It is focused on a discrete-event approach to modeling, similar to the GPSS language, and is intended for queuing systems simulation. The structure of the models obtained using this class library is similar to that of GPSS models. An interpreter for a simulation language similar to GPSS is also created using this class library, with some differences in individual statements. Simulation experiments are performed on the server side; on the client side, a browser with standard functions is used to enter the source code into an HTML form, so mobile devices can be used as clients. The source code of a model can be represented both in Java, using the class library, and in the GPSS-like language. The simulation system implements functions specifically for the educational process: for example, a student can upload learning materials to the server, send developed software and test-control reports to the teacher via the Internet, and receive a detailed assessment of their results from the teacher. Detailed results of tests passed in learning modules are also recorded, and some other functions are implemented in the system. As examples, the article considers models of an M/M/n/0 queuing system in Java with the class library and in the GPSS-like language, shows simulation results, and presents the analytical model and calculations for this system. Analytical calculations proved the modeling system useful, as the analytical results overlap the simulation results within acceptable error. Some approaches to the interaction with students through the Internet, used in the modeling environment, can...
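
    The analytical check mentioned above is the classical loss-system result: for an M/M/n/0 queue with offered load a = lambda/mu, the blocking probability is given by the Erlang-B formula, computed stably with the recurrence B(0) = 1, B(k) = a*B(k-1)/(k + a*B(k-1)). A short sketch (parameter values are illustrative):

    ```python
    # Erlang-B blocking probability for an M/M/n/0 loss system, via the
    # numerically stable recurrence B(0)=1, B(k) = a*B(k-1) / (k + a*B(k-1)).
    def erlang_b(n_servers, offered_load):
        b = 1.0
        for k in range(1, n_servers + 1):
            b = offered_load * b / (k + offered_load * b)
        return b

    arrival_rate, service_rate, servers = 4.0, 1.0, 6   # illustrative values
    a = arrival_rate / service_rate                      # offered load in Erlangs
    print(f"P(blocking) for M/M/{servers}/0 at a = {a}: {erlang_b(servers, a):.4f}")
    ```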

  1. ELER software – a new tool for urban earthquake loss assessment

    Directory of Open Access Journals (Sweden)

    U. Hancilar

    2010-12-01

    ... (Molina et al., 2008) and ATC-55 (Yang, 2005). An urban loss assessment exercise for a scenario earthquake for the city of Istanbul is conducted and physical and social losses are presented. Damage to the urban environment is compared to the results obtained from similar software, i.e. KOERILoss (KOERI, 2002) and DBELA (Crowley et al., 2004). The European rapid loss estimation tool is expected to help enable effective emergency response, on both local and global levels, as well as public information.

  2. The dynamic of modern software development project management and the software crisis of quality. An integrated system dynamics approach towards software quality improvement

    OpenAIRE

    Nasirikaljahi, Armindokht

    2012-01-01

    The software industry is plagued by cost overruns, delays, poor customer satisfaction and quality issues that are costing clients and customers worldwide billions of dollars each year. The phenomenon is coined "The Software Crisis", and poses a huge challenge for software project management. This thesis addresses one of the core issues of the software crisis, namely software quality. The challenges of software quality are central for understanding the other symptoms of the software crisis. Th...

  3. Learning from open source software projects to improve scientific review.

    Science.gov (United States)

    Ghosh, Satrajit S; Klein, Arno; Avants, Brian; Millman, K Jarrod

    2012-01-01

    Peer-reviewed publications are the primary mechanism for sharing scientific results. The current peer-review process is, however, fraught with many problems that undermine the pace, validity, and credibility of science. We highlight five salient problems: (1) reviewers are expected to have comprehensive expertise; (2) reviewers do not have sufficient access to methods and materials to evaluate a study; (3) reviewers are neither identified nor acknowledged; (4) there is no measure of the quality of a review; and (5) reviews take a lot of time, and once submitted cannot evolve. We propose that these problems can be resolved by making the following changes to the review process. Distributing reviews to many reviewers would allow each reviewer to focus on portions of the article that reflect the reviewer's specialty or area of interest and place less of a burden on any one reviewer. Providing reviewers materials and methods to perform comprehensive evaluation would facilitate transparency, greater scrutiny, and replication of results. Acknowledging reviewers makes it possible to quantitatively assess reviewer contributions, which could be used to establish the impact of the reviewer in the scientific community. Quantifying review quality could help establish the importance of individual reviews and reviewers as well as the submitted article. Finally, we recommend expediting post-publication reviews and allowing for the dialog to continue and flourish in a dynamic and interactive manner. We argue that these solutions can be implemented by adapting existing features from open-source software management and social networking technologies. We propose a model of an open, interactive review system that quantifies the significance of articles, the quality of reviews, and the reputation of reviewers.

  4. Software quality and process improvement in scientific simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, J.; Webster, R. [Los Alamos National Lab., NM (United States)

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. The study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. It is descriptive rather than scientific.

  5. PentaPlot: A software tool for the illustration of genome mosaicism

    Directory of Open Access Journals (Sweden)

    Zhaxybayeva Olga

    2005-06-01

    Full Text Available Abstract Background Dekapentagonal maps depict the phylogenetic relationships of five genomes in a visually appealing diagram and can be viewed as an alternative to a single evolutionary consensus tree. In particular, the generated maps focus attention on those gene families that significantly deviate from the consensus or plurality phylogeny. PentaPlot is a software tool that computes such dekapentagonal maps given an appropriate probability support matrix. Results The visualization with dekapentagonal maps critically depends on the optimal layout of unrooted tree topologies representing different evolutionary relationships among five organisms along the vertices of the dekapentagon. This is a difficult optimization problem given the large number of possible layouts. At its core our tool utilizes a genetic algorithm with demes and a local search strategy to search for the optimal layout. The hybrid genetic algorithm performs satisfactorily even in those cases where the chosen genomes are so divergent that little phylogenetic information has survived in the individual gene families. Conclusion PentaPlot is being made publicly available as an open source project at http://pentaplot.sourceforge.net.

  6. A software tool to assess uncertainty in transient-storage model parameters using Monte Carlo simulations

    Science.gov (United States)

    Ward, Adam S.; Kelleher, Christa A.; Mason, Seth J. K.; Wagener, Thorsten; McIntyre, Neil; McGlynn, Brian L.; Runkel, Robert L.; Payn, Robert A.

    2017-01-01

    Researchers and practitioners alike often need to understand and characterize how water and solutes move through a stream in terms of the relative importance of in-stream and near-stream storage and transport processes. In-channel and subsurface storage processes are highly variable in space and time and difficult to measure. Storage estimates are commonly obtained using transient-storage models (TSMs) of experimentally obtained solute-tracer test data. The TSM equations represent key transport and storage processes with a suite of numerical parameters. Parameter values are estimated via inverse modeling, in which parameter values are iteratively changed until model simulations closely match observed solute-tracer data. Several investigators have shown that TSM parameter estimates can be highly uncertain. When this is the case, parameter values cannot be used reliably to interpret stream-reach functioning. However, authors of most TSM studies do not evaluate or report parameter certainty. Here, we present a software tool linked to the One-dimensional Transport with Inflow and Storage (OTIS) model that enables researchers to conduct uncertainty analyses via Monte Carlo parameter sampling and to visualize uncertainty and sensitivity results. We demonstrate application of our tool to two case studies and compare our results to output obtained from a more traditional implementation of the OTIS model. We conclude by suggesting best practices for transient-storage modeling and recommend that future applications of TSMs include assessments of parameter certainty to support comparisons and more reliable interpretations of transport processes.
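
    The Monte Carlo procedure sketched above amounts to drawing parameter sets from prior ranges, running the forward model for each draw, scoring the fit against the observed tracer data, and examining how tightly the best-fitting draws constrain each parameter. A generic sketch of that loop with a stand-in forward model (the real tool wraps OTIS):

    ```python
    # Generic Monte Carlo parameter-uncertainty loop in the spirit of the tool
    # described above. run_model() is a stand-in; the actual tool runs OTIS.
    import numpy as np

    rng = np.random.default_rng(0)

    def run_model(rate, t):
        """Stand-in forward model: exponential tracer decay at the given rate."""
        return np.exp(-rate * t)

    t = np.linspace(0.0, 10.0, 50)
    observed = np.exp(-0.3 * t) + rng.normal(0.0, 0.02, t.size)  # synthetic "data"

    draws = rng.uniform(0.01, 1.0, size=10_000)                  # prior range for the rate
    rmse = np.array([np.sqrt(np.mean((run_model(r, t) - observed) ** 2)) for r in draws])

    best = draws[rmse < np.quantile(rmse, 0.01)]                 # best-fitting 1% of draws
    print(f"constrained rate: {best.mean():.3f} +/- {best.std():.3f}")
    ```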

  7. Streamlining the Design-to-Build Transition with Build-Optimization Software Tools.

    Science.gov (United States)

    Oberortner, Ernst; Cheng, Jan-Fang; Hillson, Nathan J; Deutsch, Samuel

    2017-03-17

    Scaling-up capabilities for the design, build, and test of synthetic biology constructs hold great promise for the development of new applications in fuels, chemical production, or cellular-behavior engineering. Construct design is an essential component of this process; however, not every designed DNA sequence can be readily manufactured, even using state-of-the-art DNA synthesis methods. Current biological computer-aided design and manufacture tools (bioCAD/CAM) do not adequately consider the limitations of DNA synthesis technologies when generating their outputs. Designed sequences that violate DNA synthesis constraints may require substantial sequence redesign or lead to price premiums and temporal delays, which adversely impact the efficiency of the DNA manufacturing process. We have developed a suite of build-optimization software tools (BOOST) to streamline the design-build transition in synthetic biology engineering workflows. BOOST incorporates knowledge of DNA synthesis success determinants into the design process to output ready-to-build sequences, preempting the need for sequence redesign. The BOOST web application is available at https://boost.jgi.doe.gov and its Application Program Interfaces (API) enable integration into automated, customized DNA design processes. The results presented here highlight the effectiveness of BOOST in reducing DNA synthesis costs and timelines.
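
    The kinds of synthesis-success determinants such a tool screens for can be illustrated with two common ones, overall GC content and long homopolymer runs; a sequence violating either would be flagged for redesign before ordering. A minimal sketch (the thresholds are invented for illustration and are not BOOST's actual rules):

    ```python
    # Illustrative DNA synthesis constraint screen: GC content and homopolymer
    # runs. Thresholds are invented and do not reflect BOOST's actual rules.
    import re

    def synthesis_violations(seq, gc_range=(0.25, 0.65), max_homopolymer=8):
        problems = []
        gc = (seq.count("G") + seq.count("C")) / len(seq)
        if not gc_range[0] <= gc <= gc_range[1]:
            problems.append(f"GC content {gc:.2f} outside {gc_range}")
        longest_run = max(len(m.group()) for m in re.finditer(r"(.)\1*", seq))
        if longest_run > max_homopolymer:
            problems.append(f"homopolymer run of length {longest_run}")
        return problems

    print(synthesis_violations("ATGC" * 20))                  # clean sequence -> []
    print(synthesis_violations("ATG" + "A" * 12 + "GGGGCC"))  # flags the poly-A run
    ```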

  8. MilQuant: a free, generic software tool for isobaric tagging-based quantitation.

    Science.gov (United States)

    Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo

    2012-09-18

    Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and are especially useful for samples that demand in vitro labeling. Due to diversity in choices of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator: a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.
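
    A core step in any isobaric-tagging pipeline of this kind is turning reporter-ion intensities into normalized relative abundances, for example by scaling each channel so that its median ratio to a reference channel equals one. A minimal sketch of such a normalization (intensities invented; MilQuant's own algorithms are more elaborate):

    ```python
    # Minimal reporter-ion ratio normalization: force each channel's median
    # ratio to the reference channel to 1. Intensities below are invented.
    import numpy as np

    # rows = peptide-spectrum matches, columns = reporter channels (e.g. 4-plex)
    intensities = np.array([[1200.0, 1500.0, 900.0, 1300.0],
                            [800.0, 1100.0, 600.0, 850.0],
                            [2000.0, 2600.0, 1500.0, 2100.0]])

    reference = 0                                    # channel used as the denominator
    ratios = intensities / intensities[:, [reference]]
    normalized = ratios / np.median(ratios, axis=0)  # per-channel median -> 1

    print(np.round(normalized, 2))
    ```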

  9. Software development for dynamic position emission tomography: Dynamic image analysis (DIA) tool

    Energy Technology Data Exchange (ETDEWEB)

    Pyeon, Do Yeong; Jung, Young Jin [Dongseo University, Busan (Korea, Republic of); Kim, Jung Su [Dept. of Radiological Science, Dongnam Health University, Suwon (Korea, Republic of)

    2016-09-15

    Positron emission tomography (PET) is a nuclear medicine examination in which compounds labeled with a radioactive isotope are injected into the body to quantitatively measure metabolic rates. In particular, the increased glucose metabolism of cancer tissue, measured using 18F-FDG (fluorodeoxyglucose), is widely exploited in cancer diagnosis. Numerous studies have also reported its high utility in the modern diagnosis of brain diseases such as dementia and Parkinson's disease. Using dynamic PET images, which add time information to the static information provided for diagnosis, can increase diagnostic accuracy. For this reason dynamic PET (dPET) has attracted great attention from clinical researchers, but there is a lack of tools with which to conduct such research, and the complex mathematical algorithms and programming skills required have hindered its wider adoption. In this study, in order to make dPET research easy and accessible, we developed software based on a graphical user interface (GUI). In the future, the DIA-Tool is expected to be of great help to the many clinical researchers conducting dPET research.

  10. Development of a software tool to support chemical and biological terrorism intelligence analysis

    Science.gov (United States)

    Hunt, Allen R.; Foreman, William

    1997-01-01

    AKELA has developed a software tool which uses a systems-analytic approach to model the critical processes that support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these model components.

  11. GPU-FS-kNN: a software tool for fast and scalable kNN computation using GPUs.

    Directory of Open Access Journals (Sweden)

    Ahmed Shamsul Arefin

    Full Text Available BACKGROUND: The analysis of biological networks has become a major challenge due to the recent development of high-throughput techniques that are rapidly producing very large data sets. The exploding volumes of biological data call for extreme computational power and special computing facilities (i.e., supercomputers). An inexpensive solution, such as general purpose computation based on graphics processing units (GPGPU), can be adapted to tackle this challenge, but the limitation of the device's internal memory can pose a new problem of scalability. Efficient data and computational parallelism with partitioning is required to provide a fast and scalable solution to this problem. RESULTS: We propose an efficient parallel formulation of the k-nearest neighbour (kNN) search problem, which is a popular method for classifying objects in several fields of research, such as pattern recognition, machine learning and bioinformatics. Although very simple and straightforward, the performance of the kNN search degrades dramatically for large data sets, since the task is computationally intensive. The proposed approach is not only fast but also scalable to large-scale instances. Based on our approach, we implemented a software tool, GPU-FS-kNN (GPU-based Fast and Scalable k-Nearest Neighbour), for CUDA-enabled GPUs. The basic approach is simple and adaptable to other available GPU architectures. We observed speed-ups of 50-60 times compared with a CPU implementation on a well-known breast microarray study and its associated data sets. CONCLUSION: Our GPU-based fast and scalable k-nearest neighbour search technique (GPU-FS-kNN) provides a significant performance improvement for nearest neighbour computation in large-scale networks. Source code and the software tool are available under GNU Public License (GPL) at https://sourceforge.net/p/gpufsknn/.
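
    The partitioning idea behind the tool, splitting the distance computation into chunks so the full distance matrix never has to reside in limited device memory, can be demonstrated CPU-side with NumPy standing in for the CUDA kernels. A small sketch:

    ```python
    # Chunked brute-force kNN: process queries block-by-block so the n-by-n
    # distance matrix never exists in memory at once. NumPy stands in for the
    # CUDA kernels used by GPU-FS-kNN.
    import numpy as np

    def knn_chunked(data, k=5, chunk=256):
        n = data.shape[0]
        neighbours = np.empty((n, k), dtype=np.int64)
        sq_norms = (data ** 2).sum(axis=1)
        for start in range(0, n, chunk):
            stop = min(start + chunk, n)
            block = data[start:stop]
            # squared Euclidean distances of this block against all points
            d2 = sq_norms[start:stop, None] - 2.0 * block @ data.T + sq_norms[None, :]
            d2[np.arange(stop - start), np.arange(start, stop)] = np.inf  # exclude self
            neighbours[start:stop] = np.argsort(d2, axis=1)[:, :k]
        return neighbours

    points = np.random.default_rng(0).normal(size=(1000, 16))
    print(knn_chunked(points)[:3])  # indices of each point's 5 nearest neighbours
    ```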

  12. User Guide for the Plotting Software for the Los Alamos National Laboratory Nuclear Weapons Analysis Tools Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Cleland, Timothy James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-02

    The Los Alamos National Laboratory Plotting Software for the Nuclear Weapons Analysis Tools is a Java™ application based upon the open-source library JFreeChart. The software provides a capability for plotting data on graphs with a rich variety of display options, while allowing viewer interaction via graph manipulation and scaling to best view the data. The graph types include XY plots, Date XY plots, Bar plots and Histogram plots.

  13. Development of software tool support for enterprise architecture in small and medium-sized enterprises

    OpenAIRE

    Dumeez, Joost; Bernaert, Maxime; Poels, Geert

    2013-01-01

    Enterprise architecture (EA) is used to improve the alignment of different facets of a company. The recognition of the need for EA in small and medium-sized enterprises (SMEs) has recently risen as a means to manage complexity and change [1]. Due to the specific problems and characteristics of SMEs, a different approach is necessary. CHOOSE was therefore developed as an EA approach focused o...

  14. Blink Animation Software to Improve Blinking and Dry Eye Symptoms.

    Science.gov (United States)

    Nosch, Daniela S; Foppa, Curdin; Tóth, Mike; Joos, Roland E

    2015-09-01

    To evaluate if the animation "blink blink" increases blink rate and improves dry eye symptoms during prolonged computer use. Study part A: Blink rate was recorded at baseline and during computer work of normal subjects without symptoms of dry eye. Half of the subjects used "blink blink," instructed to blink on animation appearance; the other half used a placebo version for 1 week during computer use. Thereafter, blink rate was recorded again with the use of "blink blink." Study part B: Blink rate was recorded during computer work of subjects with dry eye symptoms (modified Ocular Surface Disease Index > 15.0). Subjects used the test and placebo version of "blink blink" each for 1 week (1 week washout; crossover) and were instructed to blink twice on presentation of the animation. Blink rate and dry eye symptoms were assessed after each phase and compared with baseline. Study part A: Ten subjects participated (mean [± SD] age, 38.3 [± 16.0] years; 5 women). A greater increase in blink rate was observed in the test group (5.62 blinks/min for the test group and 0.96 blinks/min for the control group). Study part B: Twenty-four subjects participated (mean [± SD] age, 39.3 [± 19.1] years; 11 women). Dry eye symptoms improved during both phases (with test and placebo) to a statistically significant degree (each, p < ...). Blink rate increased with the program by 6.75 (± 3.80) blinks/min (p < ...), compared with ... blinks/min with placebo (p = 0.396); this difference between test and placebo was statistically significant (p < ...). Subjects accepted "blink blink" well during computer use. Blink rate and dry eye symptoms improved with "blink blink." The double blink prompted by the animation allowed a decrease in the number of presentations and improved acceptance of "blink blink."

  15. Improving the quality of numerical software through user-centered design

    Energy Technology Data Exchange (ETDEWEB)

    Pancake, C. M., Oregon State University

    1998-06-01

    The software interface - whether graphical, command-oriented, menu-driven, or in the form of subroutine calls - shapes the user's perception of what software can do. It also establishes upper bounds on software usability. Numerical software interfaces typically are based on the designer's understanding of how the software should be used. That is a poor foundation for usability, since the features that are "instinctively right" from the developer's perspective are often the very ones that technical programmers find most objectionable or most difficult to learn. This paper discusses how numerical software interfaces can be improved by involving users more actively in design, a process known as user-centered design (UCD). While UCD requires extra organization and effort, it results in much higher levels of usability and can actually reduce software costs. This is true not just for graphical user interfaces, but for all software interfaces. Examples show how UCD improved the usability of a subroutine library, a command language, and an invocation interface.

  16. A Case of Engineering Quality for Mobile Healthcare Applications Using Augmented Personal Software Process Improvement

    Directory of Open Access Journals (Sweden)

    Shahbaz Ahmed Khan Ghayyur

    2016-01-01

    Full Text Available Mobile healthcare systems are currently considered a key research area in the domain of software engineering, and the adoption of modern technologies for mobile healthcare systems is a quick option for industry professionals. Software architecture is a key feature that contributes to a software product, solution, or service. Software architecture helps in better communication and documentation of design decisions, risk identification, and provides a basis for reusability, scalability, scheduling, and reduced maintenance cost; lastly, it helps to avoid software failures. Hence, in order to solve the abovementioned issues in mobile healthcare, software architecture is integrated with the personal software process. The personal software process has been applied successfully, but it is unable to address issues related to architectural design and evaluation capabilities. Hence, a new technique, the architecture-augmented personal process, is presented in order to enhance the quality of mobile healthcare systems through the use of architectural design integrated with the personal software process. The proposed process was validated by case studies. It was found that the proposed process helped in reducing overall costs and effort. Moreover, an improved architectural design helped in the development of high-quality mobile healthcare systems.

  17. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    Science.gov (United States)

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values were observed to corroborate well with extensive experimental investigations and were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for the system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
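
    The two accuracy measures quoted above are straightforward to reproduce: the relative error compares predictions with observations point by point, and Willmott's index of agreement is d = 1 - sum((P_i - O_i)^2) / sum((|P_i - O_mean| + |O_i - O_mean|)^2). A short sketch on illustrative values (not the plant data):

    ```python
    # Relative error and Willmott's index of agreement (d), the two accuracy
    # measures quoted above. Predicted/observed values are illustrative only.
    import numpy as np

    def relative_error(predicted, observed):
        return float(np.mean(np.abs(predicted - observed) / np.abs(observed)))

    def willmott_d(predicted, observed):
        obar = observed.mean()
        num = np.sum((predicted - observed) ** 2)
        den = np.sum((np.abs(predicted - obar) + np.abs(observed - obar)) ** 2)
        return float(1.0 - num / den)

    observed = np.array([10.0, 12.5, 15.0, 14.0, 16.5])
    predicted = np.array([10.4, 12.1, 15.6, 13.8, 16.0])
    print(f"RE = {relative_error(predicted, observed):.3f}, "
          f"d = {willmott_d(predicted, observed):.3f}")
    ```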

  18. A practical overview and comparison of certain commercial forensic software tools for processing large-scale digital investigations

    Science.gov (United States)

    Kröger, Knut; Creutzburg, Reiner

    2013-05-01

    The aim of this paper is to show the usefulness of modern forensic software tools for processing large-scale digital investigations. In particular, we focus on the new version of Nuix 4.2 and compare it with AccessData FTK 4.2, X-Ways Forensics 16.9 and Guidance Software EnCase Forensic 7 regarding performance, functionality, usability and capability. We show how these software tools work with large forensic images and how capable they are of examining complex and big data scenarios.

  19. Diva software, a tool for European regional seas and Ocean climatologies production

    Science.gov (United States)

    Ouberdous, M.; Troupin, C.; Barth, A.; Alvera-Azcàrate, A.; Beckers, J.-M.

    2012-04-01

    Diva (Data-Interpolating Variational Analysis) is software based on a method designed to perform data-gridding (or analysis) tasks, with the asset of taking into account the intrinsic nature of oceanographic data, i.e., the uncertainty of in situ measurements and the anisotropy due to advection and irregular coastlines and topography. The Variational Inverse Method (VIM; Brasseur et al., 1996) implemented in Diva consists in minimizing a variational principle which accounts for the differences between the observations and the reconstructed field, and for the influence of the gradients and variability of the reconstructed field. The resolution of the numerical problem is based on the finite-element method, which allows great numerical efficiency and the consideration of complicated contours. Along with the analysis, Diva also provides error fields (Brankart and Brasseur, 1998; Rixen et al., 2000) based on the data coverage and noise. Diva is used for the production of climatologies in the pan-European network SeaDataNet. SeaDataNet connects the existing marine data centres of more than 30 countries and sets up a data management infrastructure consisting of a standardized distributed system. The consortium has elaborated integrated products, using common procedures and methods. Among these, it uses the Diva software as the reference tool for climatology computation for various European regional seas, the Atlantic and the global ocean. During the first phase of the SeaDataNet project, a number of additional tools were developed to make climatology production easier for the users. Among these tools: the advection constraint during the field reconstruction through the specification of a velocity field on a regular grid, forcing the analysis to align with the velocity vectors; the Generalized Cross Validation for the determination of analysis parameters (signal-to-noise ratio); the creation of contours at selected depths; the detection of possible outliers; the

  20. A software tool for integrated risk assessment of spent fuel transportation and storage

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Mi Rae; Almomani, Belal; Ham, Jae Hyun; Kang, Hyun Gook [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Christian, Robby [Dept. of Mechanical, Aerospace, and Nuclear Engineering, Rensselaer Polytechnic Institute, Troy (United States); Kim, Bo Gyung [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Lee, Sang Hoon [Dept. of Mechanical and Automotive Engineering, Keimyung University, Daegu (Korea, Republic of)

    2017-06-15

    When temporary spent fuel storage pools at nuclear power plants reach their capacity limit, the spent fuel must be moved to an alternative storage facility. However, radioactive materials must be handled and stored carefully to avoid severe consequences to the environment. In this study, the risks of three potential accident scenarios (i.e., maritime transportation, an aircraft crashing into an interim storage facility, and on-site transportation) associated with the spent fuel transportation process were analyzed using a probabilistic approach. For each scenario, the probabilities and the consequences were calculated separately to assess the risks: the probabilities were calculated using existing data and statistical models, and the consequences were calculated using computation models. Risk assessment software was developed to conveniently integrate the three scenarios. The risks were analyzed using the developed software according to the shipment route, building characteristics, and spent fuel handling environment. As a result of the risk analysis with varying accident conditions, transportation and storage strategies with relatively low risk were developed for regulators and licensees. The focus of this study was the risk assessment methodology; however, the applied model and input data have some uncertainties. Further research to reduce these uncertainties will improve the accuracy of this model.
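
    The probabilistic framing used here, estimating probability and consequence separately per scenario, reduces to aggregating expected consequences across scenarios. A toy sketch with invented numbers (the real tool derives probabilities from statistical accident models and consequences from computation models):

    ```python
    # Toy scenario-risk aggregation: risk = sum over scenarios of
    # P(accident) * consequence. All numbers are invented placeholders.
    scenarios = {
        "maritime transportation":   (1e-6, 5000.0),   # (probability, consequence)
        "aircraft crash on storage": (1e-8, 80000.0),
        "on-site transportation":    (1e-7, 1200.0),
    }

    total = 0.0
    for name, (p, c) in scenarios.items():
        print(f"{name}: risk contribution = {p * c:.2e}")
        total += p * c
    print(f"total risk = {total:.2e}")
    ```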

  1. A Retrofit Tool for Improving Energy Efficiency of Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Mark; Feng, Wei; Ke, Jing; Hong, Tianzhen; Zhou, Nan

    2013-06-06

    Existing buildings will dominate energy use in commercial buildings in the United States for three decades or longer, and in China for about two decades. Retrofitting these buildings to improve energy efficiency and reduce energy use is thus critical to achieving the target of reducing energy use in the buildings sector. However, there are few evaluation tools that can quickly identify and evaluate the energy savings and cost effectiveness of energy conservation measures (ECMs) for retrofits, especially for buildings in China. This paper discusses the methods used to develop such a tool and demonstrates an application of the tool for a retrofit analysis. The tool builds on a building performance database with pre-calculated energy consumption of ECMs for selected commercial prototype buildings using the EnergyPlus program. The tool allows users to evaluate individual ECMs or a package of ECMs. It covers building envelope, lighting and daylighting, HVAC, plug loads, service hot water, and renewable energy. The prototype building can be customized to represent an actual building, with some limitations. Energy consumption from utility bills can be entered into the tool to compare and calibrate the energy use of the prototype building. The tool currently can evaluate energy savings and payback of ECMs for shopping malls in China. We have used the tool to assess energy and cost savings for retrofit of the prototype shopping mall in Shanghai. Future work on the tool will simplify its use and expand it to cover other commercial building types and other countries.
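
    The ECM evaluation such a tool performs reduces, in its simplest form, to looking up pre-calculated baseline and retrofit energy use and converting the difference into savings and simple payback. A toy sketch with invented ECM data (the actual tool draws on EnergyPlus simulations of prototype buildings):

    ```python
    # Toy ECM screening: percentage savings and simple payback from a
    # pre-calculated lookup. All numbers are invented placeholders.
    baseline_kwh = 1_200_000                   # prototype building annual energy use
    tariff = 0.10                              # electricity price, $ per kWh

    ecms = {                                   # ECM -> (annual kWh saved, install cost $)
        "LED lighting retrofit": (90_000, 45_000),
        "HVAC controls upgrade": (60_000, 30_000),
        "roof insulation":       (25_000, 40_000),
    }

    for name, (saved_kwh, cost) in ecms.items():
        payback_years = cost / (saved_kwh * tariff)  # simple payback
        print(f"{name}: {saved_kwh / baseline_kwh:.1%} savings, "
              f"payback {payback_years:.1f} years")
    ```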

  2. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there has been no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different types of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
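
    Pixel-wise relaxometry of the kind MRmap automates reduces to fitting a mono-exponential signal model, S(TE) = S0 * exp(-TE/T2), at every pixel of a multi-echo series. A compact sketch on synthetic data using a log-linear fit (real tools, MRmap included, offer more robust fitting):

    ```python
    # Pixel-wise T2 mapping via log-linear fit of S(TE) = S0 * exp(-TE / T2)
    # on a synthetic multi-echo series; production tools fit more robustly.
    import numpy as np

    echo_times = np.array([10.0, 20.0, 40.0, 80.0])        # TE in ms
    true_t2 = np.full((32, 32), 50.0)                       # synthetic 50 ms tissue
    signals = np.exp(-echo_times[:, None, None] / true_t2)  # shape (echo, y, x)

    # log(S) = log(S0) - TE/T2 is linear in TE, so the slope is -1/T2 per pixel
    log_s = np.log(signals.reshape(len(echo_times), -1))
    slopes = np.polyfit(echo_times, log_s, deg=1)[0]
    t2_map = (-1.0 / slopes).reshape(true_t2.shape)

    print(f"median fitted T2: {np.median(t2_map):.1f} ms")  # ~50.0
    ```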

  3. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    Science.gov (United States)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    ... useful in emergency situations. The backtracking modelling feature and the possibility of importing spill locations from remote servers with observed data (for example, from flight surveillance or remote sensing) allow potential application to the evaluation of possible contamination sources. The third tool developed is an innovative system to dynamically produce quantified risk levels in real time, integrating the best available information from numerical forecasts and existing monitoring tools. This system provides coastal pollution risk levels associated with potential (or real) oil spill incidents, taking into account regional statistical information on vessel accidents and coastal sensitivity indexes (determined in the EROCIPS project), real-time vessel information (positioning, cargo type, speed and vessel type) obtained from AIS, best-available metocean numerical forecasts (hydrodynamics, meteorology - including visibility - and wave conditions) and scenarios simulated by the oil spill fate and behaviour component of the MOHID Water Modelling System (www.mohid.com). Different spill fate and behaviour simulations are continuously generated and processed in the background (assuming hypothetical spills from vessels), based on variable vessel information and metocean conditions, and results from these simulations are used in quantifying the consequences of potential spills. The Dynamic Risk Tool was not designed to replace conventional mapping tools, but to complement that type of information with an innovative approach to risk mapping. Taking advantage of interoperability between forecasting models, oil spill simulations, AIS monitoring systems, statistical data and coastal vulnerability, this software can provide end-users with real-time risk levels, giving decision-makers an improved decision support model and enabling intelligent risk-based traffic monitoring. For instance, this tool allows the prioritisation of individual

  4. SIGSAC Software: A tool for the Management of Chronic Disease and Telecare.

    Science.gov (United States)

    Claudia, Bustamante; Claudia, Alcayaga; Ilta, Lange; Iñigo, Meza

    2012-01-01

    Chronic disease management is highly complex because multiple interventions are required to improve clinical outcomes. From the patient's perspective, the main problems are dealing with self-management without support and feeling isolated between clinical visits. A strategy for providing continuous self-management support is the use of communication technologies, such as the telephone. However, to be efficient and effective, an information system is required for telecare planning and follow-up. The use of electronic clinical records facilitates the implementation of telecare, but those systems often do not allow usual care (visits to the health clinics) to be combined with telecare. This paper presents the experience of developing an application called SIGSAC (Software de Información, Gestión y Seguimiento para el Autocuidado Crónico) for chronic disease management and telecare follow-up.

  5. The Use of Software Tools for ChE Education: Students' Evaluations.

    Science.gov (United States)

    Abbas, Abderrahim; Al-Bastaki, Nader

    2002-01-01

    Describes three computer software programs implemented in the chemical engineering curriculum at the University of Bahrain and explains students' evaluations of the usefulness and effectiveness of the software packages. Programs include Control Station (CS), HYSYS, and MATHCAD. (YDS)

  6. Improving Video Game Development: Facilitating Heterogeneous Team Collaboration through Flexible Software Processes

    Science.gov (United States)

    Musil, Juergen; Schweda, Angelika; Winkler, Dietmar; Biffl, Stefan

    Based on our observations of Austrian video game software development (VGSD) practices we identified a lack of systematic process/method support and inefficient collaboration between the various disciplines involved, i.e. engineers and artists. VGSD includes heterogeneous disciplines, e.g. creative arts, game/content design, and software. Improving team collaboration and process support therefore remains an ongoing challenge in enabling a comprehensive view on game development projects. Lessons learned from software engineering practices can help game developers to improve game development processes within a heterogeneous environment. Based on a state-of-the-practice survey in the Austrian games industry, this paper (a) presents first results with a focus on process/method support and (b) suggests a candidate flexible process approach based on Scrum to improve VGSD and team collaboration. Results (a) showed a trend towards highly flexible software processes involving various disciplines and (b) identified the suggested flexible process approach as feasible and useful for project application.

  7. The effect of construction cost estimating (CCE software on job performance: An improvement plan

    Directory of Open Access Journals (Sweden)

    Mohd Mukelas M.F.

    2014-01-01

    Full Text Available This paper presents a comprehensive statistical study of the effect of construction cost estimating software features on estimating job performance. The objectives of this study are to identify cost estimating software features, to analyse the significant relation of those features to job performance, to explore the problems faced during implementation and, lastly, to propose a plan to improve cost estimating software usage among contractors in Malaysia. The study statistically reveals four features of cost estimating software that have a significant impact on cost estimating job performance. These findings were refined through interviews with a focus group of respondents to observe the actual problems encountered during implementation. Eventually, the proposed improvement plan was validated by the focus group of respondents to enhance cost estimating software implementation among contractors in Malaysia.

  8. Medical SisRadiologia: a new software tool for analysis of radiological accidents and incidents in medical radiology

    Energy Technology Data Exchange (ETDEWEB)

    Lima, Camila M. Araujo; Silva, Francisco C.A. da, E-mail: araujocamila@yahoo.com.br, E-mail: dasilva@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Araujo, Rilton A.; Pelegrineli, Samuel Q., E-mail: consultoria@maximindustrial.com.br, E-mail: samuelfisica@maximindustrial.com.br [Maxim Industrial, Rio de Janeiro, RJ (Brazil)

    2013-07-01

    Human exposure to ionizing radiation in the health area has increased considerably, due not only to the great demand for medical examinations but also to the improvement of the techniques used in diagnostic imaging, for example, equipment for conventional X-rays, CT scans, mammography, hemodynamics and others. Although the benefits of using radiology techniques are unquestionable, the lack of radiation protection training of the workers, associated with procedure errors, has been responsible for the increasing number of radiation overexposures of these workers. Sometimes these high doses are real and constitute a true radiological accident. Radiation workers, named Occupationally Exposed Individuals (IOE, in the Brazilian acronym), must comply with two national regulations: Governmental Decree 453/1998 of the National Agency of Sanitary Surveillance (Portaria 453/1998 ANVISA, Agencia Nacional de Vigilancia Sanitaria), which establishes the basic guidelines for radiation protection in medical and dental radiology, and Governmental Decree NR-32/2002 of the Ministry of Labour and Employment (Ministerio do Trabalho e Emprego), which establishes the basic guidelines for workers' health. Both mandatory regulations postulate a detailed investigation in the event of a radiation overexposure of an IOE. In order to help the diagnostic institution perform an efficient analysis, investigation and report of high doses, the use of a computational tool named 'Medical SisRadiologia' is proposed. This software tool enables the compilation and recording of abnormal radiological data occurring in a diagnostic institution. It also facilitates the detailed analysis of the event and increases the effectiveness of the work performed by the Radiation Protection Service. At the end, a technical report is issued in accordance with the technical regulations, which can also be used as a training tool to avoid similar events in the future. (author)

  9. Prescription, Description, Reflection: the shape of the software process improvement field

    DEFF Research Database (Denmark)

    Hansen, Bo; Rose, Jeremy; Tjørnehøj, Gitte

    2004-01-01

    This article reviews 322 representative contributions to the Software Process Improvement (SPI) literature. The contributions are categorised according to a simple framework: whether their primary goal is prescriptive (to tell SPI professionals what to do), descriptive (to report actual instances of SPI programs in software organisations), or reflective (theoretically analytical). The field is found to be rather dominated by one approach (the Capability Maturity Model (CMM)) and heavily biased towards prescriptive contributions. Neither of these trends is necessarily beneficial...

  10. SOTEA, a software tool for ascertaining the efficiency potential of electrical drives - Final report; SOTEA, Softwaretool zur Ermittlung des Effizienzpotenzials bei elektrischen Antrieben - Schlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Brunner, C. U. [S.A.F.E. Schweizerische Agentur fuer Energieeffizienz, Zuerich (Switzerland); Heldstab, T. [hematik, Heldstab Systemoptimierung und Informatik, Zuerich (Switzerland)

    2009-08-15

    As a scientific basis for the Swiss electric motor efficiency implementation program Topmotors, a software tool for industry managers has been developed and tested. The software allows an energy efficiency engineer, in a first contact with industrial managers and with a few simple data on the plant operation, to estimate the energy efficiency potential of electric motor systems, including payback and investment. The data can be fed into a laptop computer on site and the results can be shown immediately. The software was programmed and tested with five prime users. The generally positive reactions were evaluated and the tool subsequently improved. 11 industrial sites with a total of 77.6 GWh electricity consumption and 7.9 million CHF electricity cost were studied. The SOTEA estimate is an annual efficiency improvement of the electric motor systems of 6.9 GWh (11% of the electricity used for motors) with an average payback time of 1.7 years. The SOTEA software tool has been publicly available since September 2008 at www.topmotors.ch, and since 1 April 2009 in a Beta-2b version. By 28 June 2009 it had been downloaded 218 times by 132 persons. It will be improved with results from new pilot studies. To avoid problems with different update versions, a direct internet solution will be studied. The program will also be made available internationally for English-speaking users through the IEA 4E EMSA project: International Energy Agency, Implementing Agreement for Efficient Electrical End-Use Equipment, Electric Motor Systems Annex, www.motorsystems.org. (authors)
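
    The first-pass estimate such a tool produces boils down to simple arithmetic: annual savings from an assumed efficiency gain, and payback as investment divided by yearly savings. The sketch below uses invented example values, not the study's data.

```python
# Minimal sketch of the kind of first-pass estimate an efficiency tool makes;
# the efficiency gain, tariff and investment below are invented examples.
def motor_system_estimate(annual_kwh, tariff_chf_per_kwh, efficiency_gain, investment_chf):
    savings_kwh = annual_kwh * efficiency_gain            # energy saved per year
    savings_chf = savings_kwh * tariff_chf_per_kwh        # money saved per year
    payback_years = investment_chf / savings_chf          # simple payback
    return savings_kwh, savings_chf, payback_years

kwh, chf, payback = motor_system_estimate(5_000_000, 0.12, 0.10, 100_000)
print(f"{kwh:,.0f} kWh/a saved, {chf:,.0f} CHF/a, payback {payback:.1f} years")
# -> 500,000 kWh/a saved, 60,000 CHF/a, payback 1.7 years
```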

  11. Decision support tools in conservation: a workshop to improve user-centred design

    Directory of Open Access Journals (Sweden)

    David Rose

    2017-09-01

    Full Text Available A workshop held at the University of Cambridge in May 2017 brought developers, researchers, knowledge brokers, and users together to discuss user-centred design of decision support tools. Decision support tools are designed to take users through logical decision steps towards an evidence-informed final decision. Although they may exist in different forms, including on paper, decision support tools are generally considered to be computer-based (online, software or app-based). Studies have illustrated the potential value of decision support tools for conservation, and there are several papers describing the design of individual tools. Rather less attention, however, has been paid to the characteristics desirable for use, and even less to whether tools are actually being used in practice. This is concerning because if a tool is not used by its intended end user, for example a policy-maker or practitioner, then the resources spent on its design will have been wasted. Based on an analysis of papers on tool use in conservation, there is a lack of social science research on improving design, and relatively few examples where users have been incorporated into the design process. Evidence from other disciplines, particularly human-computer interaction research, illustrates that involving users throughout the design of decision support tools increases the relevance, usability, and impact of systems. User-centred design of tools is, however, seldom mentioned in the conservation literature. The workshop started the necessary process of bringing together developers and users to share knowledge about how to conduct good user-centred design of decision support tools. This will help to ensure that tools are usable and make an impact in conservation policy and practice.

  12. GTest: a software tool for graphical assessment of empirical distributions' Gaussianity.

    Science.gov (United States)

    Barca, E; Bruno, E; Bruno, D E; Passarella, G

    2016-03-01

    their request for an effective tool for addressing such difficulties motivated us to adopt the inference-by-eye paradigm and implement an easy-to-use, quick and reliable statistical tool. GTest visualizes its outcomes as a modified version of the Q-Q plot. The application has been developed in Visual Basic for Applications (VBA) within MS Excel 2010, which proved to have all the robustness and reliability needed. GTest provides true graphical normality tests which are as reliable as any quantitative statistical approach but much easier to understand. The Q-Q plots have been integrated with the outlining of an acceptance region around the representation of the theoretical distribution, defined in accordance with the alpha level of significance and the data sample size. The test decision rule is the following: if the empirical scatterplot falls completely within the acceptance region, then it can be concluded that the empirical distribution fits the theoretical one at the given alpha level. A comprehensive case study has been carried out with simulated and real-world data in order to check the robustness and reliability of the software.
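
    The acceptance-region idea can be reproduced with a Monte-Carlo envelope of order statistics. This is a hedged sketch of the same inference-by-eye principle, not GTest itself (which is a VBA/Excel tool whose exact band construction may differ).

```python
# Sketch: normal Q-Q plot with a simulated acceptance region and the
# "completely inside the band" decision rule described above.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

def qq_with_band(x, alpha=0.05, n_sim=2000, seed=0):
    rng = np.random.default_rng(seed)
    n = len(x)
    z = np.sort((x - x.mean()) / x.std(ddof=1))          # standardised sample
    theo = stats.norm.ppf((np.arange(1, n + 1) - 0.5) / n)
    # Envelope of order statistics from simulated standard-normal samples
    sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)
    lo = np.quantile(sims, alpha / 2, axis=0)
    hi = np.quantile(sims, 1 - alpha / 2, axis=0)
    plt.fill_between(theo, lo, hi, alpha=0.3, label=f"{1-alpha:.0%} acceptance region")
    plt.plot(theo, z, "o", ms=3, label="empirical quantiles")
    plt.plot(theo, theo, "k--", lw=1)
    plt.xlabel("theoretical quantiles"); plt.ylabel("sample quantiles")
    plt.legend(); plt.show()
    return bool(np.all((z >= lo) & (z <= hi)))           # the decision rule

print(qq_with_band(np.random.default_rng(1).normal(size=100)))  # True: Gaussian
```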

  13. UNBizPlanner: a software tool for preparing a business plan

    Directory of Open Access Journals (Sweden)

    Oscar Ávila Cifuentes

    2010-04-01

    Full Text Available Universities are currently expected to play a new role in society (in addition to research and teaching) by engaging in a third mission concerning socio-economic development. Universities also play an important role in encouraging entrepreneurs through training them in business planning. A business plan is a document summarising how an entrepreneur will create an organisation to exploit a business opportunity. Preparing a business plan draws on a wide range of knowledge from many business disciplines (e.g. finance, human resource management, intellectual property management, supply chain management, operations management and marketing). This article presents a computational tool for drawing up a business plan from a Colombian viewpoint by identifying the most relevant stages, which were borne in mind by the national entities having most experience in creating and consolidating companies. Special emphasis was placed on analysing, designing and implementing a systems development life cycle for developing the software. Reviewing the literature concerning business plans formed an important part of the analysis stage (bearing a Colombian viewpoint in mind).

  14. TRANSIT--A Software Tool for Himar1 TnSeq Analysis.

    Directory of Open Access Journals (Sweden)

    Michael A DeJesus

    2015-10-01

    Full Text Available TnSeq has become a popular technique for determining the essentiality of genomic regions in bacterial organisms. Several methods have been developed to analyze the wealth of data that has been obtained through TnSeq experiments. We developed a tool for analyzing Himar1 TnSeq data called TRANSIT. TRANSIT provides a graphical interface to three different statistical methods for analyzing TnSeq data. These methods cover a variety of approaches capable of identifying essential genes in individual datasets as well as comparative analysis between conditions. We demonstrate the utility of this software by analyzing TnSeq datasets of M. tuberculosis grown on glycerol and cholesterol. We show that TRANSIT can be used to discover genes which have previously been implicated in growth on these carbon sources. TRANSIT is written in Python, and thus can be run on Windows, OSX and Linux platforms. The source code is distributed under the GNU GPL v3 license and can be obtained from the following GitHub repository: https://github.com/mad-lab/transit.
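
    One of the comparative approaches such tools offer is a permutation (resampling) test on a gene's insertion counts between two conditions. The sketch below is a bare-bones version of that idea, not TRANSIT code; the TA-site counts are invented.

```python
# Not TRANSIT's code: a minimal resampling test comparing a gene's Himar1
# insertion counts between two growth conditions.
import numpy as np

def resampling_test(counts_a, counts_b, n_perm=10000, seed=0):
    """counts_a/b: read counts at the gene's TA sites in each condition.
    Returns the two-sided permutation p-value for a mean-count difference."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([counts_a, counts_b])
    observed = counts_b.mean() - counts_a.mean()
    n_a = len(counts_a)
    diffs = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(pooled)
        diffs[i] = perm[n_a:].mean() - perm[:n_a].mean()
    return np.mean(np.abs(diffs) >= abs(observed))

glycerol    = np.array([ 0,  3,  0,  1,  0,  2])   # hypothetical TA-site counts
cholesterol = np.array([12, 25,  8, 30, 17, 22])
print(resampling_test(glycerol, cholesterol))      # small p => condition-dependent
```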

  15. Software Tool for Analysis of Breathing-Related Errors in Transthoracic Electrical Bioimpedance Spectroscopy Measurements

    Science.gov (United States)

    Abtahi, F.; Gyllensten, I. C.; Lindecrantz, K.; Seoane, F.

    2012-12-01

    During the last decades, Electrical Bioimpedance Spectroscopy (EBIS) has been applied in a range of different applications, mainly using the frequency-sweep technique. Traditionally the tissue under study is considered to be time-invariant, and dynamic changes of tissue activity are ignored and instead treated as a noise source. This assumption has not been adequately tested and could have a negative impact on, and limit the accuracy of, impedance monitoring systems. In order to successfully use frequency-sweep EBIS for monitoring time-variant systems, it is paramount to study the effect of the frequency-sweep delay on Cole model-based analysis. In this work, we present a software tool that can be used to simulate the influence of respiration activity in frequency-sweep EBIS measurements of the human thorax and to analyse the effects of the different error sources. Preliminary results indicate that the deviation in the EBIS measurement might be significant at any frequency, and especially in the impedance plane. Therefore the impact on Cole-model analysis might differ depending on the method applied for Cole parameter estimation.
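
    The effect being simulated can be sketched directly from the Cole model: because a frequency sweep visits one frequency at a time, a respiration-modulated tissue parameter folds the time variation into the measured spectrum. This is an illustrative sketch, not the authors' tool; dwell time, breathing rate and Cole parameters are invented.

```python
# Sketch: Cole impedance with a respiration-modulated R0, sampled one
# frequency after another, so the sweep delay distorts the spectrum.
import numpy as np

def cole(freq, r_inf=20.0, r0=50.0, tau=1e-6, alpha=0.8):
    """Cole model: Z = R_inf + (R0 - R_inf) / (1 + (j*2*pi*f*tau)**alpha)."""
    return r_inf + (r0 - r_inf) / (1 + (1j * 2 * np.pi * freq * tau) ** alpha)

freqs = np.logspace(3, 6, 50)                        # 1 kHz .. 1 MHz sweep
dwell = 5e-3                                         # assumed 5 ms per step
t = np.arange(len(freqs)) * dwell
r0_t = 50.0 + 2.0 * np.sin(2 * np.pi * 0.25 * t)     # breathing at 0.25 Hz

z_static = cole(freqs)                               # time-invariant tissue
z_swept = np.array([cole(f, r0=r) for f, r in zip(freqs, r0_t)])
print(np.abs(z_swept - z_static).max())              # deviation from sweep delay
```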

  16. A Software Tool to Visualize Verbal Protocols to Enhance Strategic and Metacognitive Abilities in Basic Programming

    Directory of Open Access Journals (Sweden)

    Carlos A. Arévalo

    2011-07-01

    Full Text Available Learning to program is difficult for many first year undergraduate students. Instructional strategies of traditional programming courses tend to focus on syntactic issues and assigning practice exercises using the presentation-examples-practice formula and by showing the verbal and visual explanation of a teacher during the “step by step” process of writing a computer program. Cognitive literature regarding the mental processes involved in programming suggests that the explicit teaching of certain aspects, such as mental models, strategic knowledge and metacognitive abilities, is critical to learning how to write and assemble the pieces of a computer program. Verbal protocols are often used in software engineering as a technique to record the short-term cognitive processes of a user or expert in evaluation or problem-solving scenarios. We argue that verbal protocols can be used as a mechanism to explicitly show the strategic and metacognitive process of an instructor when writing a program. In this paper we present an information system prototype developed to store and visualize worked examples derived from transcribed verbal protocols during the process of writing introductory-level programs. Empirical data comparing the grades obtained by two groups of novice programming students, analysed using ANOVA, indicate a statistically significant difference in performance in favour of the group using the tool, even though these results still cannot be extrapolated to the general population, given the reported limitations of this study.

  17. Software Application Profile: PHESANT: a tool for performing automated phenome scans in UK Biobank.

    Science.gov (United States)

    Millard, Louise A C; Davies, Neil M; Gaunt, Tom R; Davey Smith, George; Tilling, Kate

    2017-10-05

    Epidemiological cohorts typically contain a diverse set of phenotypes such that automation of phenome scans is non-trivial, because they require highly heterogeneous models. For this reason, phenome scans have to date tended to use a smaller homogeneous set of phenotypes that can be analysed in a consistent fashion. We present PHESANT (PHEnome Scan ANalysis Tool), a software package for performing comprehensive phenome scans in UK Biobank. PHESANT tests the association of a specified trait with all continuous, integer and categorical variables in UK Biobank, or a specified subset. PHESANT uses a novel rule-based algorithm to determine how to appropriately test each trait, then performs the analyses and produces plots and summary tables. The PHESANT phenome scan is implemented in R. PHESANT includes a novel Javascript D3.js visualization and accompanying Java code that converts the phenome scan results to the required JavaScript Object Notation (JSON) format. PHESANT is available on GitHub at [https://github.com/MRCIEU/PHESANT]. Git tag v0.5 corresponds to the version presented here.
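
    The core of the rule-based algorithm is a dispatch on variable type: each outcome is routed to an appropriate model family before testing. PHESANT itself is implemented in R; the Python sketch below only mirrors the dispatch idea, and the field names, rules and data are simplified placeholders.

```python
# Schematic rendering of a rule-based phenome-scan dispatch (not PHESANT code).
import pandas as pd
import statsmodels.formula.api as smf

def phenome_test(trait, outcome, df):
    """Pick a model family from the outcome's type, then test trait -> outcome."""
    y = df[outcome]
    if pd.api.types.is_float_dtype(y):                    # continuous outcome
        model = smf.ols(f"{outcome} ~ {trait}", data=df).fit()
    elif y.nunique() == 2:                                # binary categorical
        model = smf.logit(f"{outcome} ~ {trait}", data=df).fit(disp=0)
    else:                                                 # unordered categorical
        model = smf.mnlogit(f"{outcome} ~ {trait}", data=df).fit(disp=0)
    return model.pvalues

df = pd.DataFrame({"bmi": [22.0, 31.5, 27.2, 24.8, 29.9, 21.4],
                   "diabetes": [0, 1, 0, 1, 1, 0]})       # toy data
print(phenome_test("bmi", "diabetes", df))                # routed to logit
```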

  18. 76 FR 28022 - Increasing Market and Planning Efficiency Through Improved Software; Notice of Technical...

    Science.gov (United States)

    2011-05-13

    ... Energy Regulatory Commission Increasing Market and Planning Efficiency Through Improved Software; Notice of Technical Conference: Increasing Real-Time and Day- Ahead Market Efficiency Through Improved...:30 a.m. to 4:30 p.m., to discuss opportunities for increasing real-time and day-ahead market...

  19. The Design and Development of a Computerized Tool Support for Conducting Senior Projects in Software Engineering Education

    Science.gov (United States)

    Chen, Chung-Yang; Teng, Kao-Chiuan

    2011-01-01

    This paper presents a computerized tool support, the Meetings-Flow Project Collaboration System (MFS), for designing, directing and sustaining the collaborative teamwork required in senior projects in software engineering (SE) education. Among many schools' SE curricula, senior projects serve as a capstone course that provides comprehensive…

  20. Plagiarism Detection: A Comparison of Teaching Assistants and a Software Tool in Identifying Cheating in a Psychology Course

    Science.gov (United States)

    Seifried, Eva; Lenhard, Wolfgang; Spinath, Birgit

    2015-01-01

    Essays that are assigned as homework in large classes are prone to cheating via unauthorized collaboration. In this study, we compared the ability of a software tool based on Latent Semantic Analysis (LSA) and student teaching assistants to detect plagiarism in a large group of students. To do so, we took two approaches: the first approach was…

  1. Communication Styles of Interactive Tools for Self-Improvement.

    Science.gov (United States)

    Niess, Jasmin; Diefenbach, Sarah

    Interactive products for self-improvement (e.g., online trainings to reduce stress, fitness gadgets) have become increasingly popular among consumers and healthcare providers. In line with the idea of positive computing, these tools aim to support their users on their way to improved well-being and human flourishing. As an interdisciplinary domain, the design of self-improvement technologies requires psychological, technological, and design expertise. One needs to know how to support people in behavior change, and one needs to find ways to do this through technology design. However, as recent reviews show, the interlocking relationship between these disciplines is still improvable. Many existing technologies for self-improvement neglect psychological theory on behavior change; especially motivational factors are not sufficiently considered. To counteract this, we suggest a focus on the dialog and emerging communication between product and user, considering the self-improvement tool as an interactive coach and advisor. The present qualitative interview study (N = 18) explored the user experience of self-improvement technologies. A special focus was on the perceived dialog between tool and user, which we analyzed in terms of models from communication psychology. Our findings show that users are sensitive to the way the product "speaks to them" and consider this essential for their experience and for successful change. Analysis revealed different communication styles of self-improvement tools (e.g., helpful-cooperative, rational-distanced, critical-aggressive), each linked to specific emotional consequences. These findings form one starting point for a more psychologically founded design of self-improvement technology. On a more general level, our approach aims to contribute to a better integration of psychological and technological knowledge and, in consequence, to supporting users on their way to enhanced well-being.

  2. Improved Technologies for the Process Energy and Pollution Reduction (PEPR) Analysis Tool

    National Research Council Canada - National Science Library

    Lin, Mike

    2000-01-01

    ...) software tool has been developed. The PEPR tool helps DOD facility personnel identify and quantify energy conservation and pollution prevention opportunities for the following industrial processes: load and pack (LAP...

  3. Mars, accessing the third dimension: a software tool to exploit Mars ground penetrating radars data.

    Science.gov (United States)

    Cantini, Federico; Ivanov, Anton B.

    2016-04-01

    The Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS), on board ESA's Mars Express, and the SHAllow RADar (SHARAD), on board NASA's Mars Reconnaissance Orbiter, are two ground penetrating radars (GPRs) aimed at probing the crust of Mars to explore the subsurface structure of the planet. They have now been collecting data for about 10 years, covering a large fraction of the Martian surface. As on Earth, GPRs collect data by sending electromagnetic (EM) pulses toward the surface and listening to the return echoes occurring at the dielectric discontinuities on the planet's surface and subsurface. The wavelengths used allow MARSIS EM pulses to penetrate the crust for several kilometers. The data products (radargrams) are matrices where the x-axis spans different sampling points on the planet surface and the y-axis is the power of the echoes over time in the listening window. No standard way to manage this kind of data is established in the planetary science community, and data analysis and interpretation very often require some knowledge of radar signal processing. Our software tool is aimed at easing access to these data, in particular for scientists without a specific background in signal processing. MARSIS and SHARAD geometrical data, such as probing point latitude and longitude and spacecraft altitude, are stored, together with relevant acquisition metadata, in a geo-enabled relational database implemented using PostgreSQL and PostGIS. Data are extracted from officially released ESA and NASA data using self-developed Python classes and scripts and inserted into the database using OGR utilities. This software is also intended to be the core of a collection of classes and scripts implementing more complex GPR data analysis. Geometrical data and metadata are exposed as WFS layers using a QGIS server, which can be further integrated with other data, such as imaging, spectroscopy and topography. Radar geometry data will be available as a part of the iMars Web
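
    The ingestion step described above can be sketched as storing a radargram's ground track in a PostGIS geometry column. This is a hypothetical sketch: the table schema, column names and connection string are invented, and it assumes a reachable PostGIS-enabled PostgreSQL database.

```python
# Hypothetical ingestion sketch: one radargram ground track into PostGIS.
import psycopg2

ddl = """
CREATE TABLE IF NOT EXISTS radargram (
    id          serial PRIMARY KEY,
    orbit       integer,
    instrument  text,
    track       geometry(LineString, 4326)   -- PostGIS ground-track geometry
);
"""

points = [(23.5, -12.1), (23.6, -12.3), (23.7, -12.5)]   # lon, lat samples
wkt = "LINESTRING(" + ", ".join(f"{lon} {lat}" for lon, lat in points) + ")"

conn = psycopg2.connect("dbname=mars user=science")      # assumed DSN
with conn, conn.cursor() as cur:
    cur.execute(ddl)
    cur.execute(
        "INSERT INTO radargram (orbit, instrument, track) "
        "VALUES (%s, %s, ST_GeomFromText(%s, 4326))",
        (2896, "MARSIS", wkt),
    )
```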

  4. YANA – a software tool for analyzing flux modes, gene-expression and enzyme activities

    Directory of Open Access Journals (Sweden)

    Engels Bernd

    2005-06-01

    Full Text Available Abstract Background A number of algorithms for steady state analysis of metabolic networks have been developed over the years. Of these, Elementary Mode Analysis (EMA) has proven especially useful. Despite its low user-friendliness, METATOOL, a reliable high-performance implementation of the algorithm, has been the instrument of choice up to now. As reported here, the analysis of metabolic networks has been improved by an editor and analyzer of metabolic flux modes. Analysis routines for expression levels and for the most central, well-connected metabolites and their metabolic connections are of particular interest. Results YANA features a platform-independent, dedicated toolbox for metabolic networks with a graphical user interface to calculate (integrating METATOOL), edit (including support for the SBML format), visualize, centralize, and compare elementary flux modes. Further, YANA calculates expected flux distributions for a given Elementary Mode (EM) activity pattern and vice versa. Moreover, a dissection algorithm, a centralization algorithm, and an average diameter routine can be used to simplify and analyze complex networks. Proteomics or gene expression data give a rough indication of some individual enzyme activities, whereas the complete flux distribution in the network is often not known. As such data are noisy, YANA features a fast evolutionary algorithm (EA) for the prediction of EM activities with minimum error, including alerts for inconsistent experimental data. We offer the possibility to include further known constraints (e.g. growth constraints) in the EA calculation process. The redox metabolism around glutathione reductase serves as an illustrative example. All software and documentation are available for download at http://yana.bioapps.biozentrum.uni-wuerzburg.de. Conclusion A graphical toolbox and an editor for METATOOL as well as a series of additional routines for metabolic network analyses constitute a new user
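
    The linear-algebra core implied above is compact: a flux distribution is a non-negative combination of elementary modes, v = E a, and activities can be recovered from measured fluxes by non-negative least squares. The toy network, modes and activities below are invented; this is not YANA's evolutionary algorithm, just the underlying relation.

```python
# Sketch: fluxes from elementary-mode activities, and the inverse problem.
import numpy as np
from scipy.optimize import nnls

# Columns = elementary modes, rows = reactions (toy 4-reaction network)
E = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0],
              [1.0, 2.0]])

a_true = np.array([2.0, 0.5])        # EM activity pattern
v = E @ a_true                        # expected flux distribution for it

a_est, residual = nnls(E, v)          # activities back from fluxes
print(v, a_est, residual)             # recovers [2.0, 0.5] with ~0 residual
```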

  5. Technical Note: Development and performance of a software tool for quality assurance of online replanning with a conventional Linac or MR-Linac.

    Science.gov (United States)

    Chen, Guang-Pei; Ahunbay, Ergun; Li, X Allen

    2016-04-01

    To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from the treatment planning system (TPS) to the record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without the presence of a magnetic field from an MR-Linac, and validating the delivery record's consistency with the plan. The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose-volume histograms. Beams in between the TPS and R&V system are matched based on geometry configurations. To consider the effect of a 1.5 T transverse magnetic field from the MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. ArtQA has been used in the authors' clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency of plan checks, including plan quality, data transfer, and delivery checks, can be improved by at least 60%. The newly developed independent MU calculation tool for the MR-Linac reduces the difference between the planned and calculated MUs by 10%. The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with a conventional Linac or MR-Linac and is an essential tool for online replanning where the QA check needs to be performed rapidly.
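
    Classic Clarkson integration, which the paper modifies for the magnetic field, splits an irregular field into equal angular sectors around the calculation point and averages a scatter contribution looked up per sector radius. The sketch below is greatly simplified: the scatter-to-primary table is fake, not clinical data, and the MR-Linac field correction is not modelled.

```python
# Greatly simplified Clarkson-style sector integration for a point-dose check.
import numpy as np

def scatter_ratio(radius_cm):
    """Fake monotone scatter-primary ratio vs. circular-field radius (lookup)."""
    radii = np.array([0.0, 2.0, 5.0, 10.0, 15.0])
    spr   = np.array([0.0, 0.04, 0.09, 0.14, 0.16])
    return np.interp(radius_cm, radii, spr)

def clarkson_scatter(field_radii_cm, n_sectors=36):
    """field_radii_cm: distance from calc point to field edge in each sector.
    Each sector contributes 1/n_sectors of the circular-field scatter."""
    return sum(scatter_ratio(r) for r in field_radii_cm) / n_sectors

# Irregular field: edge 8 cm away in half the sectors, 4 cm in the other half
radii = [8.0] * 18 + [4.0] * 18
print(clarkson_scatter(radii))    # effective scatter-primary ratio at the point
```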

  6. Technical Note: Development and performance of a software tool for quality assurance of online replanning with a conventional Linac or MR-Linac

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Guang-Pei, E-mail: gpchen@mcw.edu; Ahunbay, Ergun; Li, X. Allen [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin 53226 (United States)

    2016-04-15

    Purpose: To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from the treatment planning system (TPS) to the record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without the presence of a magnetic field from an MR-Linac, and validating the delivery record's consistency with the plan. Methods: The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose–volume histograms. Beams in between the TPS and R&V system are matched based on geometry configurations. To consider the effect of a 1.5 T transverse magnetic field from the MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA has been used in the authors' clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency of plan checks, including plan quality, data transfer, and delivery checks, can be improved by at least 60%. The newly developed independent MU calculation tool for the MR-Linac reduces the difference between the planned and calculated MUs by 10%. Conclusions: The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with a conventional Linac or MR-Linac and is an essential tool for online replanning where the QA check needs to be performed rapidly.

  7. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    Science.gov (United States)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water-scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision-making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, the code within each node, link and institution can be executed independently, allowing for their fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation is performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing the necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners etc) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and
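
    The node/link/institution structure described above can be sketched in plain Python. This stand-in is not the real Pynsim API: it only illustrates agents acting autonomously each time step, with an institution taking a decision over its grouped nodes.

```python
# Plain-Python stand-in for the multi-agent structure described (not Pynsim).
class Node:
    def __init__(self, name, storage=0.0):
        self.name, self.storage = name, storage
    def step(self, t):
        self.storage += 1.0                      # e.g. local inflow each step

class Institution:
    def __init__(self, name, nodes):
        self.name, self.nodes = name, nodes
    def step(self, t):                           # decision over the whole group
        total = sum(n.storage for n in self.nodes)
        for n in self.nodes:                     # reallocate storage equally
            n.storage = total / len(self.nodes)

nodes = [Node("reservoir_a", 10.0), Node("reservoir_b", 0.0)]
utility = Institution("water_utility", nodes)
for t in range(3):                               # the simulation clock
    for agent in [*nodes, utility]:
        agent.step(t)
print([n.storage for n in nodes])                # [8.0, 8.0]
```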

  8. A MODEL FOR INTEGRATED SOFTWARE TO IMPROVE COMMUNICATION POLICY IN DENTAL TECHNICAL LABS

    Directory of Open Access Journals (Sweden)

    Minko M. Milev

    2017-06-01

    Full Text Available Introduction: Integrated marketing communications (IMC) are all kinds of communications between organisations and customers, partners, other organisations and society. Aim: To develop and present an integrated software model which can improve the effectiveness of communications in dental technical services. Material and Methods: The model of integrated software is based on the recommendations of a total of 700 respondents (students of dental technology, dental physicians, dental technicians and patients of dental technical laboratories) in Northeastern Bulgaria. Results and Discussion: We present the benefits of the future integrated software for improving the communication policy in the dental technical laboratory, meeting the need for fast cooperation and a well-built communication network between dental physicians, dental technicians, patients and students. Conclusion: The use of integrated communications could be a powerful unified approach to improving the communication policy between all players in the market for dental technical services.

  9. Combining Capability Assessment and Value Engineering: a New Two-dimensional Method for Software Process Improvement

    Directory of Open Access Journals (Sweden)

    Pasi Ojala

    2008-02-01

    Full Text Available During the last decades software process improvement (SPI) has been recognized as a usable possibility to increase the quality of software development. Implemented SPI investments have often indicated increased process capabilities as well. Recently more attention has been focused on the costs of SPI as well as on the cost-effectiveness and productivity of software development, although the roots of economics-driven software engineering originate from the very early days of software engineering research. This research combines Value Engineering and capability assessment into a usable new method in order to better respond to the challenges that cost-effectiveness and productivity have brought to software companies. This is done in part by defining the concepts of value, worth and cost and in part by defining the Value Engineering process and the different enhancements it has to offer to software assessment. The practical industrial cases show that the proposed two-dimensional method works in practice and is useful to the assessed companies.
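
    The value/worth/cost relation the method builds on reduces to a simple ratio per assessed process area: a value index above 1 indicates worth exceeding cost. The process areas and figures below are invented for illustration.

```python
# Toy illustration of the value = worth / cost relation in Value Engineering.
def value_index(worth, cost):
    return worth / cost            # > 1: good value, < 1: improvement candidate

process_areas = {"requirements": (120_000, 80_000),   # (worth, cost) in EUR
                 "testing":      (90_000, 110_000)}
for name, (worth, cost) in process_areas.items():
    print(f"{name}: value index {value_index(worth, cost):.2f}")
# requirements: 1.50 (worth its cost); testing: 0.82 (target for improvement)
```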

  10. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that had been used in other projects but were not currently being used by the SA team, and to report them to the Software Assurance team to see if any of the metrics could be implemented in their software assurance life cycle process.

  11. Mass Counseling: Effective Tool to Improve Knowledge, Attitude and ...

    African Journals Online (AJOL)

    Mass Counseling: Effective Tool to Improve Knowledge, Attitude and Behavior Regarding Blood Donation. ... Annals of Medical and Health Sciences Research ... and post‑counseling knowledge, attitude, and behavior (KAB) scores regarding blood donation were assessed using a pre‑tested semi‑structured questionnaire.

  12. Coaching as a Performance Improvement Tool at School

    Science.gov (United States)

    Yirci, Ramazan; Karakose, Turgut; Kocabas, Ibrahim

    2016-01-01

    The purpose of this study is to examine the current literature and gain insight into coaching as a performance improvement tool at school. In today's world, schools have to survive and keep their organizational success at the highest level because of the high expectations of school stakeholders. Taking place in such a fiercely competitive…

  13. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    Science.gov (United States)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory, Vhub.org [Palma et al, J. App. Volc. 3:2 doi:10.1186/2191-5040-3-2], as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (current active users > 6000 from an estimated community of comparable size) embeds the tools in the collaboratory into educational and research workflows, it has become imperative to: a) redesign tools into robust, open-source, reusable software for online and offline usage/enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold - firstly to use best practices in software engineering and new hardware like multi-core and graphics processing units. Secondly we wish to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. Among the software engineering practices we follow are open-source development facilitating community contributions, modularity and reusability. Our initial targets are four popular tools on Vhub - TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven data sets, e.g. digital elevation models of topography, satellite imagery, field observations on deposits etc. These data are often maintained in private repositories that are privately shared by "sneaker-net". As a partial solution to this we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted the use of workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage like uncertainty quantification for hazard analysis using physical
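
    The ensemble/design-of-experiments workflow mentioned above can be sketched as Latin hypercube sampling of model inputs followed by an exceedance-probability estimate. This assumes SciPy 1.7 or newer for scipy.stats.qmc, and run_titan2d below is a placeholder function, not the tool's real interface.

```python
# Sketch: Latin hypercube ensemble for a hypothetical flow model.
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=2, seed=42)
unit = sampler.random(n=64)                           # 64 ensemble members
# Scale to physical ranges: volume [1e5, 1e7] m^3, friction angle [15, 35] deg
samples = qmc.scale(unit, l_bounds=[1e5, 15.0], u_bounds=[1e7, 35.0])

def run_titan2d(volume, friction_deg):                # placeholder model
    return volume ** 0.5 / np.tan(np.radians(friction_deg))  # fake runout proxy

runouts = np.array([run_titan2d(v, phi) for v, phi in samples])
print(f"P(runout > 1500): {np.mean(runouts > 1500.0):.2f}")   # hazard estimate
```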

  14. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming.

    Directory of Open Access Journals (Sweden)

    Michael Rosenberg

    Full Text Available While it has been established that using full body motion to play active video games results in increased levels of energy expenditure, there is little information on the classification of human movement during active video game play in relation to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS) during active video game play. Two human assessors rated jumping and side-stepping, and these assessments were compared to the Kinect Action Recognition Tool (KART) to establish a level of agreement and determine the number of movements completed during five minutes of active video game play for 43 children (m = 12 years 7 months ± 1 year 6 months). During five minutes of active video game play, inter-rater reliability between the two human raters was found to be higher for the jump (r = 0.94, p < .01) than the sidestep (r = 0.87, p < .01), although both were excellent. Excellent reliability was also found between the human raters and the KART system for the jump (r = 0.84, p < .01), and moderate reliability for the sidestep (r = 0.6983, p < .01) during game play, demonstrating that both humans and KART had higher agreement for jumps than sidesteps in the game play condition. The results of the study provide confidence that the Kinect sensor can be used to count the number of jumps and sidesteps during five minutes of active video game play with a similar level of accuracy as human raters. However, in contrast to humans, the KART system required a fraction of the time to analyse and tabulate the results.

  15. Digital Aquifer - Integrating modeling, technical, software and policy aspects to develop a groundwater management tool

    Science.gov (United States)

    Tirupathi, S.; McKenna, S. A.; Fleming, K.; Wambua, M.; Waweru, P.; Ondula, E.

    2016-12-01

    Groundwater management has traditionally been treated as a matter of long-term policy measures to ensure that the water resource is sustainable. IBM Research, in association with the World Bank, extended this traditional analysis to include real-time groundwater management by building a context-aware water rights management and permitting system. As part of this effort, one of the primary objectives was to develop a groundwater flow model that can help policy makers with a visual overview of the current groundwater distribution. In addition, the system helps policy makers simulate a range of scenarios and check the sustainability of the groundwater resource in a given region. The system also enables a license provider to check the effect of the introduction of a new well on the existing wells in the domain, as well as on the groundwater resource in general. This process simplifies how an engineer determines whether a new well should be approved. Distance to the nearest well neighbours and the maximum decreases in the water levels of nearby wells are continually assessed and presented as evidence for an engineer to make the final judgment on approving the permit. The system also provides updated insights on the amount of groundwater left in an area and advice on how water fees should be structured to balance conservation and economic development goals. In this talk, we will discuss the concept of the Digital Aquifer and the challenges of integrating modelling, technical and software aspects to develop a management system that helps policy makers and license providers with a robust decision-making tool. We will concentrate on the groundwater model, developed using the analytic element method, which plays a very important role in the decision-making aspects. Finally, the efficiency of this system and methodology is shown through a case study in Laguna Province, Philippines, carried out in collaboration with the National Water Resource Board, Philippines and the World Bank.
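
    The permit check described can be illustrated with a textbook drawdown estimate: compute the extra drawdown a proposed well would cause at its neighbours and approve only if all stay below a tolerance. This sketch uses the Theis confined-aquifer idealisation, not the system's analytic element model, and all parameter values are invented.

```python
# Illustrative Theis-solution check of a proposed well's effect on neighbours.
import numpy as np
from scipy.special import exp1            # exponential integral = Theis W(u)

def theis_drawdown(q, transmissivity, storativity, r, t):
    """q [m3/d], T [m2/d], S [-], r [m], t [d] -> drawdown [m]."""
    u = r**2 * storativity / (4 * transmissivity * t)
    return q / (4 * np.pi * transmissivity) * exp1(u)

neighbours_m = np.array([150.0, 400.0, 900.0])     # distances to existing wells
dd = theis_drawdown(q=100.0, transmissivity=250.0, storativity=1e-4,
                    r=neighbours_m, t=365.0)       # drawdown after one year
print(dd)
print(bool(np.all(dd < 0.5)))   # approve only if all below e.g. 0.5 m tolerance
```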

  16. A flexible, interactive software tool for fitting the parameters of neuronal models

    Directory of Open Access Journals (Sweden)

    Péter Friedrich

    2014-07-01

    Full Text Available The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problem of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential integrate-and-fire) neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting tool.
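
    The task such a tool automates can be shown in miniature: fit two parameters of a toy membrane response to a target trace by minimising a squared-error cost with a global optimizer. This is a hedged sketch, not Optimizer code; the membrane model, parameter bounds and noise level are invented, and differential evolution stands in for the kind of global algorithm such tools typically expose.

```python
# Sketch: global parameter fit of a toy step-current membrane charging curve.
import numpy as np
from scipy.optimize import differential_evolution

t = np.linspace(0.0, 0.1, 200)                         # s
def membrane(params, i_inj=1e-10):
    r_m, tau = params                                  # input resistance, time constant
    return i_inj * r_m * (1.0 - np.exp(-t / tau))      # charging toward I*R

target = membrane([2e8, 0.02]) + np.random.default_rng(0).normal(0, 1e-4, t.size)

def cost(params):
    return np.sum((membrane(params) - target) ** 2)    # squared-error criterion

res = differential_evolution(cost, bounds=[(1e7, 1e9), (1e-3, 1e-1)], seed=1)
print(res.x)                                           # ~ [2e8, 0.02]
```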

  17. Improvement of Selected Logistics Processes Using Quality Engineering Tools

    Science.gov (United States)

    Zasadzień, Michał; Žarnovský, Jozef

    2018-03-01

    The increase in the number of orders, rising quality requirements and the required speed of order preparation demand the implementation of new solutions and the improvement of logistics processes. Any disruption that occurs during the execution of an order often leads to customer dissatisfaction, as well as a loss of the customer's confidence. The article presents a case study of the use of quality engineering methods and tools to improve an e-commerce logistics process. This made it possible to identify and prioritize key issues, identify their causes, and formulate improvement and prevention measures.
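
    The "identify and prioritize key issues" step is typically a Pareto analysis: rank causes by frequency and find the vital few that account for most complaints. The complaint categories and counts below are invented for illustration.

```python
# Simple Pareto prioritisation of complaint causes (invented data).
from collections import Counter

complaints = Counter({"late delivery": 58, "damaged packaging": 21,
                      "wrong item": 12, "missing invoice": 6, "other": 3})
total = sum(complaints.values())
cumulative = 0.0
for cause, count in complaints.most_common():
    cumulative += 100.0 * count / total
    print(f"{cause:18s} {count:3d}  cumulative {cumulative:5.1f}%")
# The first two causes cover ~79% of complaints: the vital few to fix first.
```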

  18. The Development and Use of an Evaluation Mechanism for the Assessment of Software Configuration Management Tools

    Science.gov (United States)

    1993-12-01

    [Front-matter fragment from the thesis: list of figures, including Figure 2-4, "Generic Change Control Process", and Figure 2-5, "Computer Software Development Cycle" (Berlack, 1992:47), which depicts the functional, allocated and developmental configuration baselines and the product baseline.]

  19. A tool-supported approach for modular design of energy-aware software

    NARCIS (Netherlands)

    te Brinke, Steven; Malakuti Khah Olun Abadi, Somayeh; Bockisch, Christoph; Bergmans, Lodewijk; Aksit, Mehmet; Katz, Shmuel

    The reduction of energy usage by software-controlled systems has many advantages, including prolonged battery life and reduction of greenhouse gas emissions. Thus, being able to implement energy optimization in software is essential. This requires a model of the energy utilization—or more general

  20. Software sensors as a tool for optimization of animal-cell cultures

    NARCIS (Netherlands)

    Dorresteijn, R.C.

    1997-01-01

    In this thesis software sensors are introduced that predict the biomass activity and the concentrations of glucose, glutamine, lactic acid, and ammonium online. The software sensors for biomass activity, glucose and lactic acid can be applied for any type of animal cell that is grown in a

  1. Sound pressure level tools design used in occupational health by means of Labview software

    Directory of Open Access Journals (Sweden)

    Farhad Forouharmajd

    2015-01-01

    Conclusion: LabVIEW's programming capabilities in the field of sound include sound measurement, frequency analysis, and sound control; the software effectively acts as a sound level meter and sound analyzer. Given these features, the software can be used to analyze and process sound and vibration as a monitoring system.
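
    The core computation such a virtual sound level meter performs is RMS pressure converted to dB SPL re 20 µPa. The sketch below shows it in Python (no frequency weighting applied); the sampling rate and test tone are invented.

```python
# Sketch: sound pressure level from a pressure waveform, Lp = 20 log10(p_rms/p0).
import numpy as np

P_REF = 20e-6                       # Pa, reference pressure in air

def spl_db(pressure_pa: np.ndarray) -> float:
    rms = np.sqrt(np.mean(pressure_pa ** 2))
    return 20.0 * np.log10(rms / P_REF)

fs = 48_000
t = np.arange(fs) / fs
tone = 1.0 * np.sin(2 * np.pi * 1000 * t)     # 1 Pa amplitude, 1 kHz tone
print(f"{spl_db(tone):.1f} dB SPL")           # ~91.0 dB
```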

  2. Model-driven design of simulation support for the TERRA robot software tool suite

    NARCIS (Netherlands)

    Lu, Zhou; Bezemer, M.M.; Broenink, Johannes F.

    2015-01-01

    Model-Driven Development (MDD) – based on the concepts of model, meta-model and model transformation – is an approach to develop predictable and reliable software for Cyber-Physical Systems (CPS). The work presented here concerns a methodology to design simulation software based on MDD techniques,

  3. A tool framework for deriving the application architecture for global software development projects

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Tekinerdogan, B.; Cetin, Semih

    In order to meet the communication, coordination and control requirements of distributed Global Software Development (GSD) teams, it is necessary to define a proper software architecture. Designing a GSD architecture, however, involves a multitude of design decisions that are related in different

  4. Understanding How the "Open" of Open Source Software (OSS) Will Improve Global Health Security.

    Science.gov (United States)

    Hahn, Erin; Blazes, David; Lewis, Sheri

    2016-01-01

    Improving global health security will require bold action in all corners of the world, particularly in developing settings, where poverty often contributes to an increase in emerging infectious diseases. In order to mitigate the impact of emerging pandemic threats, enhanced disease surveillance is needed to improve early detection and rapid response to outbreaks. However, the technology to facilitate this surveillance is often unattainable because of high costs, software and hardware maintenance needs, limited technical competence among public health officials, and internet connectivity challenges experienced in the field. One potential solution is to leverage open source software, a concept that is unfortunately often misunderstood. This article describes the principles and characteristics of open source software and how it may be applied to solve global health security challenges.

  5. MsViz: A Graphical Software Tool for In-Depth Manual Validation and Quantitation of Post-translational Modifications.

    Science.gov (United States)

    Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo

    2017-08-04

    Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.
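
    One building block behind the quantitation view described above is the extracted ion chromatogram (XIC). The sketch below is not MsViz code: it computes a minimal XIC over already-parsed spectra, with an invented data layout and ppm tolerance.

```python
# Minimal extracted-ion-chromatogram (XIC) computation (not MsViz code).
import numpy as np

def xic(spectra, target_mz, tol_ppm=10.0):
    """spectra: list of (retention_time, mz_array, intensity_array).
    Returns (rt, summed intensity) of peaks within +/- tol_ppm of target_mz."""
    tol = target_mz * tol_ppm * 1e-6
    rts, intens = [], []
    for rt, mz, inten in spectra:
        mask = np.abs(mz - target_mz) <= tol
        rts.append(rt)
        intens.append(inten[mask].sum())
    return np.array(rts), np.array(intens)

spectra = [(12.0, np.array([500.2700, 642.31]), np.array([1e4, 3e3])),
           (12.1, np.array([500.2704, 815.40]), np.array([5e4, 2e3]))]
rt, signal = xic(spectra, target_mz=500.2701)
print(signal)              # intensity profile of the (modified) peptide ion
```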

  6. Video Games and Software Engineers : Designing a study based on the benefits from Video Games and how they can improve Software Engineers

    OpenAIRE

    Cosic Prica, Srdjan

    2017-01-01

    Context: This is a study about investigating whether playing video games can improve any skills and characteristics in a software engineer. Due to a lack of resources and time, this study will focus on designing a study that others may use to measure the results and determine whether video games actually can improve software engineers. Objectives: The main objectives are finding the benefits of playing video games and how those benefits are discovered, meaning what types of games and for how long someone needs to ...

  7. Can digital tools be used for improving immunization programmes?

    Directory of Open Access Journals (Sweden)

    Alberto Eugenio Tozzi

    2016-03-01

    Full Text Available In order to successfully control and eliminate vaccine-preventable infectious diseases, appropriate vaccine coverage has to be achieved and maintained. This task requires a high level of effort, as it may be compromised by a number of barriers. Public health agencies have issued specific recommendations to address these barriers and thereby improve immunization programmes. In the present review, we characterize the issues and challenges of immunization programmes for which digital tools are a potential solution. In particular, we explore previously published research on the use of digital tools in the following vaccine-related areas: immunization registries, dose tracking, and decision support systems; vaccine-preventable disease surveillance; surveillance of adverse events following immunization; vaccine confidence monitoring; and delivery of information on vaccines to the public. Subsequently, we analyze the limits of the use of digital tools in such contexts and envision future possibilities and challenges.

  8. The Use of the Software MATLAB To Improve Chemical Engineering Education.

    Science.gov (United States)

    Damatto, T.; Maegava, L. M.; Filho, R. Maciel

    In all the Brazilian universities involved with the project "Prodenge-Reenge", the main objective is to improve teaching and learning procedures for the engineering disciplines. The Chemical Engineering College of Campinas State University focused its effort on the use of engineering software. The work developed by this project has…

  9. Ergonomics and simulation tools for service & industrial process improvement

    Science.gov (United States)

    Sánchez, A.; García, M.

    2012-04-01

    Human interaction within designed processes is an important factor in how efficiently any process operates, yet how a person will function in relation to a process is not easy to predict. Traditionally, ergonomic considerations have been evaluated outside of 3D product design. Today, 3D process design and simulation tools make this evaluation possible from the earliest stages of the design process, and they can also be used to improve existing processes in order to increase human comfort, productivity and safety. This work presents a methodology that uses 3D design and simulation tools to improve industrial and service processes, with the objective of detecting, evaluating and controlling work-related musculoskeletal disorders (WMSDs).
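    The evaluation step is not detailed in the abstract; as a rough sketch, the posture stream produced by such a 3D simulation could be screened against joint-angle comfort limits (the joint names and thresholds below are illustrative assumptions loosely inspired by RULA-style scoring, not values from the paper):

```python
# Illustrative joint-angle comfort limits in degrees (assumed values)
LIMITS = {"trunk_flexion": 20.0, "shoulder_flexion": 45.0, "wrist_flexion": 15.0}

def wmsd_risk(posture_frames):
    """Fraction of simulated frames in which each joint exceeds its limit.

    posture_frames: iterable of dicts mapping joint name -> angle in degrees,
    one dict per simulation time step.
    """
    frames = list(posture_frames)
    n = max(len(frames), 1)
    return {
        joint: sum(1 for f in frames if f.get(joint, 0.0) > limit) / n
        for joint, limit in LIMITS.items()
    }  # e.g. {'trunk_flexion': 0.32, ...}; high fractions flag redesign
```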

  10. APPLICATION OF SPECTRUM ANALYSIS USING THE BURG METHOD TO DEVELOP SPECIAL SOFTWARE TOOLS FOR AN OPTICAL VIBRATION DIAGNOSTICS SYSTEM

    Directory of Open Access Journals (Sweden)

    E. O. Zaitsev

    2016-01-01

    Full Text Available The objective of this paper is the development and experimental verification of special software for spectral analysis of the vibrations of monitored objects. The spectral analysis is based on the maximum-entropy autoregressive method using the Burg algorithm. Measured signals first undergo a preliminary analysis based on regression analysis, which removes uninformative components such as noise and trend; special software tools were developed for this preliminary step. Non-contact measurement of the mechanical vibration parameters of rotating, diffusely reflecting surfaces is used in circumstances where contact sensors are difficult or impossible to apply, for example when there is no access to the object, the controlled area is small, or the controlled portion has a high temperature or is exposed to strong electromagnetic fields. A laser measuring system is proposed for this task; it overcomes the shortcomings of interferometric and Doppler optical measuring systems, such as difficulties in measuring large-amplitude and inharmonic vibrations. Special software tools for the laser measuring system were developed in LabVIEW. The proposed method of vibration signal processing was verified experimentally by analyzing diagnostic information obtained from vibration measurements of a grinding system with a diamond wheel for cold machining of the solid tungsten-containing alloy TK8. The result was a complex spectrum "purified" of uninformative parameters, corresponding to the vibration of the observed object.
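    The estimator itself is compact; below is a minimal numpy sketch of Burg's maximum-entropy autoregressive spectral estimation for a real-valued, uniformly sampled signal (Python rather than the authors' LabVIEW implementation, and not their code):

```python
import numpy as np

def arburg(x, order):
    """Burg's maximum-entropy estimate of AR coefficients.

    Returns (a, e): polynomial [1, a1, ..., ap] and final error power.
    """
    x = np.asarray(x, dtype=float)
    e = np.dot(x, x) / x.size           # zeroth-order prediction error power
    a = np.array([1.0])
    f, b = x[1:].copy(), x[:-1].copy()  # forward / delayed backward errors
    for _ in range(order):
        # Reflection coefficient minimizing combined fwd/bwd error power
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))
        # Levinson recursion for the AR polynomial
        a = np.concatenate((a, [0.0])) + k * np.concatenate(([0.0], a[::-1]))
        e *= 1.0 - k * k
        # Update and shift the error sequences for the next stage
        f, b = f[1:] + k * b[1:], b[:-1] + k * f[:-1]
    return a, e

def ar_psd(a, e, nfft=1024, fs=1.0):
    """Power spectrum of the fitted AR model (up to normalization)."""
    freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
    return freqs, e / np.abs(np.fft.rfft(a, nfft)) ** 2
```

    The AR order trades spectral resolution against variance; in vibration work it is typically chosen with a criterion such as AIC.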

  11. Development and assessment of a digital X-ray software tool to determine vertebral rotation in adolescent idiopathic scoliosis.

    Science.gov (United States)

    Eijgenraam, Susanne M; Boselie, Toon F M; Sieben, Judith M; Bastiaenen, Caroline H G; Willems, Paul C; Arts, Jacobus J; Lataster, Arno

    2017-02-01

    The amount of vertebral rotation in the axial plane is of key importance in the prognosis and treatment of adolescent idiopathic scoliosis (AIS). Current methods to determine vertebral rotation are either designed for use with analogue plain radiographs and not useful for digital images, or lack measurement precision and are therefore less suitable for the follow-up of rotation in AIS patients. This study aimed to develop a digital X-ray software tool with high measurement precision to determine vertebral rotation in AIS, and to assess its (concurrent) validity and reliability. In this study, a combination of basic science and reliability methodology was applied in both laboratory and clinical settings. Software was developed using the algorithm of the Perdriolle torsion meter for analogue AP plain radiographs of the spine. The software was then assessed for (1) concurrent validity and (2) intra- and interobserver reliability. Plain radiographs of both human cadaver vertebrae and outpatient AIS patients were used. Concurrent validity was measured by two independent observers, both experienced in the assessment of plain radiographs. Reliability measurements were performed by three independent spine surgeons. Pearson correlation of the software with the analogue Perdriolle torsion meter was 0.98 for mid-thoracic vertebrae, 0.97 for low-thoracic vertebrae and 0.97 for lumbar vertebrae. Measurement exactness of the software was within 5° in 62% of cases and within 10° in 97% of cases. The intraclass correlation coefficient (ICC) for inter-observer reliability was 0.92 (0.91-0.95); the ICC for intra-observer reliability was 0.96 (0.94-0.97). We developed a digital X-ray software tool to determine vertebral rotation in AIS with substantial concurrent validity and reliability, which may be useful for the follow-up of vertebral rotation in AIS patients. Copyright © 2015 Elsevier Inc. All rights reserved.
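    For reference, the inter-observer statistic reported can be computed in a few lines of numpy; the sketch below implements the two-way random-effects, absolute-agreement ICC(2,1) of Shrout and Fleiss (the abstract does not state which ICC form the authors used, so this choice is an assumption):

```python
import numpy as np

def icc2_1(ratings):
    """Shrout-Fleiss ICC(2,1) for an (n_subjects, k_raters) array."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    # Two-way ANOVA sums of squares (no replication)
    ss_rows = k * np.sum((Y.mean(axis=1) - grand) ** 2)   # subjects
    ss_cols = n * np.sum((Y.mean(axis=0) - grand) ** 2)   # raters
    ss_err = np.sum((Y - grand) ** 2) - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )
```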

  12. Novel PM Tool Steel with improved hardness and toughness

    OpenAIRE

    Deirmina, Faraz

    2017-01-01

    Ultrafine-grained (~ 1 μm) steels have been the subject of extensive research in recent years, as they generally offer interesting prospects for improved mechanical properties. UFG powder-metallurgy hot work tool steels (HWTS) can be fabricated by high-energy mechanical milling (MM) followed by spark plasma sintering (SPS). However, as in most UFG and nano-crystalline (NC) metals, reduced ductility and toughness result from the early plastic instabilities in t...

  13. Development and verification of a software tool for the acoustic location of partial discharge in a power transformer

    Directory of Open Access Journals (Sweden)

    Polužanski Vladimir

    2014-01-01

    Full Text Available This paper discusses the development and verification of a software tool for determining the location of partial discharge in a power transformer using the acoustic method. The paper begins with a classification and systematization of the physical principles, detection methods and tests of partial discharge in power transformers. The most important mathematical models, features, algorithms and practical problems that affect measurement accuracy are highlighted. The paper then describes the development and implementation of a software tool for determining the location of partial discharge in a power transformer based on a non-iterative mathematical algorithm. The verification and measurement accuracy are demonstrated both by computer simulation and by experimental results available in the literature.
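    The paper's non-iterative algorithm is not reproduced in the abstract; the sketch below shows one standard non-iterative approach, which linearizes the time-difference-of-arrival (TDOA) equations into a least-squares problem (the function name, the reference-sensor convention and the oil sound speed are assumptions):

```python
import numpy as np

def locate_pd(sensors, tdoa, v=1400.0):
    """Acoustic source localization from TDOA, without iteration.

    sensors : (m, 3) sensor coordinates in metres; sensor 0 is the reference.
    tdoa    : (m,) arrival times relative to sensor 0, in seconds (tdoa[0] == 0).
    v       : assumed acoustic wave speed in transformer oil, m/s.
    Needs m >= 5 sensors; returns the estimated (x, y, z) source position.
    """
    s = np.asarray(sensors, dtype=float)
    r = v * np.asarray(tdoa, dtype=float)          # range differences
    s0 = s[0]
    # For each i >= 1, squaring ||x - s_i|| = d0 + r_i and subtracting the
    # i = 0 equation gives a relation linear in the unknowns (x, d0):
    #   2*(s0 - s_i).x - 2*r_i*d0 = r_i^2 - |s_i|^2 + |s0|^2
    A = np.hstack([2.0 * (s0 - s[1:]), -2.0 * r[1:, None]])
    b = r[1:] ** 2 - np.sum(s[1:] ** 2, axis=1) + np.sum(s0 ** 2)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3]                                 # drop the auxiliary d0
```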

  14. Development of a Kinect Software Tool to Classify Movements during Active Video Gaming

    National Research Council Canada - National Science Library

    Rosenberg, Michael; Thornton, Ashleigh L; Lay, Brendan S; Ward, Brodie; Nathan, David; Hunt, Daniel; Braham, Rebecca

    2016-01-01

    ... movement during active video game play in relationship to fundamental movement skills. The aim of this study was to validate software utilising Kinect sensor motion capture technology to recognise fundamental movement skills (FMS...

  15. 1st International Workshop on Tools for Managing Globally Distributed Software Development (TOMAG 2007)

    NARCIS (Netherlands)

    Harmsen, Frank; van Hillegersberg, Jos; Harmsen, Frank; Amrit, Chintan Amrit; Geisberger, Eva; Keil, Patrick; Kuhrmann, Marco

    2007-01-01

    The advent of global distribution of software development has made managing collaboration and coordination among developers more difficult for various reasons, including physical distance, time differences and cultural differences. A nearly total absence of informal communication among

  16. Assessment of ICount software, a precise and fast egg counting tool for the mosquito vector Aedes aegypti

    Directory of Open Access Journals (Sweden)

    Julie Gaburro

    2016-11-01

    Full Text Available Abstract Background Widespread in the tropics, the mosquito Aedes aegypti is an important vector of many viruses, posing a significant threat to human health. Vector monitoring often requires estimating fecundity by counting the eggs laid by female mosquitoes. Traditionally, this has been done by manual data analysis, which requires considerable effort and is prone to error. An easy tool to assess the number of eggs laid would facilitate experimentation and vector control operations. Results This study introduces a software tool called ICount that allows automatic egg counting for the mosquito vector Aedes aegypti. Egg counts estimated by ICount are statistically equivalent to manual counts, making the software effective for automatic and semi-automatic data analysis. The technique is also much faster than manual methods. Finally, the software was used to assess p-cresol oviposition choices under laboratory conditions in order to test the system with different egg densities. Conclusions ICount is a powerful tool for fast and precise egg count analysis, freeing experimenters from manual data processing. Software access is free and its user-friendly interface allows easy use by non-experts. Its efficiency has been tested in our laboratory with oviposition dual choices of Aedes aegypti females. The next step will be the development of a mobile application, based on the ICount platform, for vector monitoring surveys in the field.
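    ICount's algorithm is not described in the abstract; a minimal sketch of the common threshold-and-label approach to counting dark eggs on a light substrate is shown below (the area bounds and the clump-splitting heuristic are illustrative assumptions):

```python
import numpy as np
from scipy import ndimage

def count_eggs(image, min_area=15, max_area=400):
    """Count eggs in a 2D grayscale image via threshold-and-label.

    min_area/max_area (pixels) bound the size of a single egg and are
    assumed values; blobs outside the bounds are noise or clumps.
    """
    img = np.asarray(image, dtype=float)
    thresh = img.mean() - img.std()              # crude global threshold
    mask = img < thresh                          # eggs darker than substrate
    labels, n = ndimage.label(mask)              # connected components
    areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    singles = areas[(areas >= min_area) & (areas <= max_area)]
    clumps = areas[areas > max_area]             # touching eggs merge
    per_egg = np.median(singles) if singles.size else float(max_area)
    return int(singles.size + np.round(clumps / per_egg).sum())
```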

  17. Design and Implementation of a Tool for Automating Cluster Configuration : For a Software Defined Storage System

    OpenAIRE

    Marakani, Sindhusha

    2015-01-01

    Context Traditional storage systems are proving inefficient at handling the growing storage needs of a modern IT organization. The need for a cost-effective and scalable storage framework has led to the development of Software Defined Storage (SDS) solutions. SDS can be defined as an enterprise-class distributed storage solution that uses standard hardware, with all the important storage and management functions performed by intelligent software. Configuration and maintenance of these st...

  18. An open source software tool to assign the material properties of bone for ABAQUS finite element simulations.

    Science.gov (United States)

    Pegg, Elise C; Gill, Harinderjit S

    2016-09-06

    A new software tool to assign the material properties of bone to an ABAQUS finite element mesh was created and compared with Bonemat, a similar tool originally designed to work with Ansys finite element models. Our software tool (py_bonemat_abaqus) was written in Python, which is the chosen scripting language for ABAQUS. The purpose of this study was to compare the software packages in terms of the material assignment calculation and processing speed. Three element types were compared (linear hexahedral (C3D8), linear tetrahedral (C3D4) and quadratic tetrahedral elements (C3D10)), both individually and as part of a mesh. Comparisons were made using a CT scan of a hemi-pelvis as a test case. A small difference, of -0.05 kPa on average, was found between Bonemat version 3.1 (the current version) and our Python package. Errors were found in the previous release of Bonemat (version 3.0 downloaded from www.biomedtown.org) during calculation of the quadratic tetrahedron Jacobian, and conversion of the apparent density to modulus when integrating over the Young's modulus field. These issues caused up to 2 GPa error in the modulus assignment. For these reasons, we recommend users upgrade to the most recent release of Bonemat. Processing speeds were assessed for the three different element types. Our Python package took significantly longer (110 s on average) to perform the calculations compared with the Bonemat software (10 s). Nevertheless, the workflow advantages of the package and added functionality make 'py_bonemat_abaqus' a useful tool for ABAQUS users. Copyright © 2016 Elsevier Ltd. All rights reserved.
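    Neither package's exact mapping is given in the abstract; the sketch below outlines the general Bonemat-style workflow of calibrating CT values to apparent density, applying a density-modulus power law and binning elements into a limited set of materials (the calibration values and power-law coefficients are illustrative assumptions, not those of either package):

```python
import numpy as np

def assign_materials(hu, n_bins=50,
                     slope=0.0008, intercept=0.0,   # HU -> density, assumed
                     a=0.0, b=6.85, c=1.49):        # E = a + b*rho^c, assumed
    """Map per-element CT Hounsfield units to a binned Young's modulus.

    Returns (E_element, bin_index): the representative modulus assigned to
    each element and the material bin it falls in.
    """
    rho = np.maximum(slope * np.asarray(hu, dtype=float) + intercept, 1e-6)
    E = a + b * rho ** c                         # modulus per element (GPa)
    # FE pre-processors group elements into a limited number of materials;
    # bin the modulus range and assign each bin's midpoint
    edges = np.linspace(E.min(), E.max(), n_bins + 1)
    idx = np.clip(np.digitize(E, edges) - 1, 0, n_bins - 1)
    return 0.5 * (edges[:-1] + edges[1:])[idx], idx
```

    Each bin would then be written out as one material definition and assigned to the corresponding element set in the ABAQUS input file.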

  19. Improving ICT Governance by Reorganizing Operation of ICT and Software Applications: The First Step to Outsource

    Science.gov (United States)

    Johansson, Björn

    During recent years, great attention has been paid to outsourcing as well as to its reverse, insourcing (Dibbern et al., 2004). There has been a strong focus on how the management of software applications and information and communication technology (ICT), expressed as ICT management versus ICT governance, should be carried out (Grembergen, 2004). The maintenance and operation of software applications and ICT consume a large share of the resources organizations spend on ICT today (Bearingpoint, 2004), and managers are asked to increase the business benefits of these investments (Weill & Ross, 2004). That is, they are asked to improve the usage of ICT and to develop new business-critical solutions supported by ICT. It also means that investments in ICT and software applications need to be shown to be worthwhile. Basically, there are two considerations to take into account with ICT usage: cost reduction and improving business value. How the governance and management of ICT and software applications are organized is important, which means that improving the control of maintenance and operation may be of interest to executives. It can be stated that usage depends on how it is organized. So, if improved ICT governance is the same as having well-organized ICT resources, could this be seen as the first step for organizations striving for external provision of ICT? This question is dealt with to some degree in this paper.

  20. Improvement of Open Source Software Usability: An Empirical Evaluation from Developers' Perspective

    Directory of Open Access Journals (Sweden)

    Arif Raza

    2010-01-01

    Full Text Available User satisfaction has always been important for software success, whether it is Open Source Software (OSS) or closed proprietary software. Even though we do not presume that OSS always has poor usability, as there are examples of highly usable open source software, it would still be agreed that OSS usability has room for further improvement. This paper presents an empirical investigation of the impact of some key factors on OSS usability from developers' points of view. It is one of a series of four studies that we are conducting on the improvement of OSS usability from the perspectives of OSS developers, users, contributors and industry. The research model of this empirical investigation studies and establishes the relationship between the key usability factors from the developers' perspective and OSS usability. A data set of 106 OSS developers from 18 open source projects of varied size has been used to study the research model. The results provide empirical evidence that the studied key factors play a significant role in improving OSS usability.
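    The abstract does not name the statistical technique; the sketch below shows how such a relationship between factor scores and perceived usability could be examined with ordinary least squares (the Likert-scale data and all variable names are fabricated purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 5-point Likert responses: 106 developers x 4 usability factors
factors = rng.integers(1, 6, size=(106, 4)).astype(float)
usability = factors @ np.array([0.4, 0.3, 0.2, 0.1]) + rng.normal(0, 0.5, 106)

# Ordinary least squares: usability ~ intercept + factor scores
X = np.column_stack([np.ones(len(factors)), factors])
beta, *_ = np.linalg.lstsq(X, usability, rcond=None)
resid = usability - X @ beta
r2 = 1.0 - resid @ resid / np.sum((usability - usability.mean()) ** 2)
print("coefficients:", beta.round(2), " R^2:", round(r2, 2))
```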