WorldWideScience

Sample records for extraction software volume

  1. Sandia software guidelines, Volume 4: Configuration management

    Energy Technology Data Exchange (ETDEWEB)

    1992-06-01

    This volume is one in a series of Sandia Software Guidelines for use in producing quality software within Sandia National Laboratories. This volume is based on the IEEE standard and guide for software configuration management. The basic concepts and detailed guidance on implementation of these concepts are discussed for several software project types. Example planning documents for both projects and organizations are included.

  2. Sandia Software Guidelines, Volume 2. Documentation

    Energy Technology Data Exchange (ETDEWEB)

    1995-09-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standards for software documentation, this volume provides guidance in the selection of an adequate document set for a software project and example formats for many types of software documentation. A tutorial on life cycle documentation is also provided. Extended document thematic outlines and working examples of software documents are available on electronic media as an extension of this volume.

  3. Collected software engineering papers, volume 9

    Science.gov (United States)

    1991-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1990 through October 1991. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the ninth such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. For the convenience of this presentation, the eight papers contained here are grouped into three major categories: (1) software models studies; (2) software measurement studies; and (3) Ada technology studies. The first category presents studies on reuse models, including a software reuse model applied to maintenance and a model for an organization to support software reuse. The second category includes experimental research methods and software measurement techniques. The third category presents object-oriented approaches using Ada and object-oriented features proposed for Ada. The SEL is actively working to understand and improve the software development process at GSFC.

  4. Collected software engineering papers, volume 11

    Science.gov (United States)

    1993-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1992 through November 1993. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the 11th such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document.

  5. Collected software engineering papers, volume 12

    Science.gov (United States)

    1994-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1993 through October 1994. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the 12th such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document.

  6. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  7. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    Science.gov (United States)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  8. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    Science.gov (United States)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  9. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  10. Reaction Wheel Disturbance Model Extraction Software - RWDMES

    Science.gov (United States)

    Blaurock, Carl

    2009-01-01

The RWDMES is a tool for modeling the disturbances imparted on spacecraft by spinning reaction wheels. Reaction wheels are usually the largest disturbance source on a precision pointing spacecraft, and can be the dominating source of pointing error. Accurate knowledge of the disturbance environment is critical to accurate prediction of the pointing performance. In the past, it has been difficult to extract an accurate wheel disturbance model since the forcing mechanisms are difficult to model physically, and the forcing amplitudes are filtered by the dynamics of the reaction wheel. RWDMES captures the wheel-induced disturbances using a hybrid physical/empirical model that is extracted directly from measured forcing data. The empirical models capture the tonal forces that occur at harmonics of the spin rate, and the broadband forces that arise from random effects. The empirical forcing functions are filtered by a physical model of the wheel structure that includes spin-rate-dependent moments (gyroscopic terms). The resulting hybrid model creates a highly accurate prediction of wheel-induced forces. It accounts for variation in disturbance frequency, as well as the shifts in structural amplification by the whirl modes, as the spin rate changes. This software provides a point-and-click environment for producing accurate models with minimal user effort. Where conventional approaches may take weeks to produce a model of variable quality, RWDMES can create a demonstrably high-accuracy model in two hours. The software consists of a graphical user interface (GUI) that enables the user to specify all analysis parameters, to evaluate analysis results and to iteratively refine the model. Underlying algorithms automatically extract disturbance harmonics, initialize and tune harmonic models, and initialize and tune broadband noise models. The component steps are described in the RWDMES user's guide and include: converting time domain data to waterfall PSDs (power spectral
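The harmonic-extraction step described above can be illustrated with a short sketch: given a measured force time series, estimate the tonal amplitude near each harmonic of the wheel spin rate from the one-sided FFT amplitude spectrum. This is a deliberate simplification under stated assumptions, not the RWDMES algorithm; the function name and the fixed search bandwidth are illustrative.

```python
import numpy as np

def harmonic_amplitudes(force, fs, spin_hz, n_harmonics=4, bw_hz=0.5):
    """Estimate tonal force amplitudes at harmonics of the wheel spin rate.

    force    : measured force time series (N)
    fs       : sample rate (Hz)
    spin_hz  : wheel spin rate (Hz)
    Returns a list of peak amplitudes found within +/- bw_hz of each harmonic.
    """
    n = len(force)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # One-sided amplitude spectrum, scaled so a pure tone of amplitude A reads A
    spec = np.abs(np.fft.rfft(force)) * 2.0 / n
    amps = []
    for k in range(1, n_harmonics + 1):
        f_k = k * spin_hz
        band = (freqs > f_k - bw_hz) & (freqs < f_k + bw_hz)
        amps.append(float(spec[band].max()) if band.any() else 0.0)
    return amps
```

In a real pipeline these tonal estimates would be tracked across spin rates (the waterfall PSDs mentioned above) before fitting the harmonic model.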

  11. Collected software engineering papers, volume 2

    Science.gov (United States)

    1983-01-01

    Topics addressed include: summaries of the software engineering laboratory (SEL) organization, operation, and research activities; results of specific research projects in the areas of resource models and software measures; and strategies for data collection for software engineering research.

  12. Collected Software Engineering Papers, Volume 10

    Science.gov (United States)

    1992-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from Oct. 1991 - Nov. 1992. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document. For the convenience of this presentation, the 11 papers contained here are grouped into 5 major sections: (1) the Software Engineering Laboratory; (2) software tools studies; (3) software models studies; (4) software measurement studies; and (5) Ada technology studies.

  13. TMS communications software. Volume 1: Computer interfaces

    Science.gov (United States)

    Brown, J. S.; Lenker, M. D.

    1979-01-01

    A prototype bus communications system, which is being used to support the Trend Monitoring System (TMS) as well as for evaluation of the bus concept is considered. Hardware and software interfaces to the MODCOMP and NOVA minicomputers are included. The system software required to drive the interfaces in each TMS computer is described. Documentation of other software for bus statistics monitoring and for transferring files across the bus is also included.

  14. Collected software engineering papers, volume 8

    Science.gov (United States)

    1990-01-01

    A collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period November 1989 through October 1990 is presented. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography. The seven presented papers are grouped into four major categories: (1) experimental research and evaluation of software measurement; (2) studies on models for software reuse; (3) a software tool evaluation; and (4) Ada technology and studies in the areas of reuse and specification.

  15. Collected software engineering papers, volume 7

    Science.gov (United States)

    1989-01-01

    A collection is presented of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period Dec. 1988 to Oct. 1989. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. For the convenience of this presentation, the seven papers contained here are grouped into three major categories: (1) Software Measurement and Technology Studies; (2) Measurement Environment Studies; and (3) Ada Technology Studies. The first category presents experimental research and evaluation of software measurement and technology; the second presents studies on software environments pertaining to measurement. The last category represents Ada technology and includes research, development, and measurement studies.

  16. Collected software engineering papers, volume 6

    Science.gov (United States)

    1988-01-01

    A collection is presented of technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period 1 Jun. 1987 to 1 Jan. 1989. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. For the convenience of this presentation, the twelve papers contained here are grouped into three major categories: (1) Software Measurement and Technology Studies; (2) Measurement Environment Studies; and (3) Ada Technology Studies. The first category presents experimental research and evaluation of software measurement and technology; the second presents studies on software environments pertaining to measurement. The last category represents Ada technology and includes research, development, and measurement studies.

  17. Guidance and Control Software Project Data - Volume 1: Planning Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.

  18. Software for Extracting 3D - MSSTs

    DEFF Research Database (Denmark)

    Somchaipeng, Kerawit; Sporring, Jon; Kreiborg, Sven

    2003-01-01

The deep structure of an image is investigated, and a Multi-Scale Singularity Tree (MSST) is constructed based on the pair-wise annihilations of critical points. This report contains two main contributions. Firstly, we describe a fast, simple, and robust method of extracting feature lines from data sets of up to four dimensions, which we apply in order to extract critical paths from scale-spaces. Secondly, we investigate the extraction of MSSTs using either support regions or extrema partitions. Given an image, both methods produce a binary tree that mathematically represents the topological...

  19. Extracting software static defect models using data mining

    Directory of Open Access Journals (Sweden)

    Ahmed H. Yousef

    2015-03-01

Large software projects are subject to quality risks from defective modules that cause failures during software execution. Several software repositories contain the source code of large projects composed of many modules. These repositories include data on the software metrics of these modules and the defective state of each module. In this paper, a data mining approach is used to identify the attributes that predict the defective state of software modules. A software solution architecture is proposed to convert the extracted knowledge into data mining models that can be integrated with current software project metrics and bug data in order to enhance prediction. The results show better prediction capabilities when all the algorithms are combined using weighted votes. Among individual algorithms, Naïve Bayes gives the best results, followed by the Neural Network and Decision Tree algorithms.
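The weighted-vote combination described above can be sketched in a few lines: each classifier's binary prediction per module is weighted (e.g. by its validation accuracy) and the weighted majority decides. The function name and the 0.5 decision threshold are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def weighted_vote(predictions, weights):
    """Combine binary defect predictions (0 = clean, 1 = defective) from
    several classifiers using weighted voting.

    predictions : shape (n_classifiers, n_modules), entries 0/1
    weights     : per-classifier weights, e.g. validation accuracies
    Returns the combined 0/1 prediction per module.
    """
    predictions = np.asarray(predictions, dtype=float)
    weights = np.asarray(weights, dtype=float)
    # Weighted fraction of classifiers voting "defective" for each module
    score = weights @ predictions / weights.sum()
    return (score >= 0.5).astype(int)
```

With three classifiers weighted 3:1:1, the heaviest voter dominates unless both lighter voters disagree with it, which mirrors how a strong base learner such as Naïve Bayes would steer the ensemble.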

  20. Guidance and Control Software Project Data - Volume 2: Development Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the development documents from the GCS project. Volume 2 contains three appendices: A. Guidance and Control Software Development Specification; B. Design Description for the Pluto Implementation of the Guidance and Control Software; and C. Source Code for the Pluto Implementation of the Guidance and Control Software

  1. Guidance and Control Software Project Data - Volume 3: Verification Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  2. Ground and Space Radar Volume Matching and Comparison Software

    Science.gov (United States)

    Morris, Kenneth; Schwaller, Mathew

    2010-01-01

This software enables easy comparison of ground- and space-based radar observations. The software was initially designed to compare ground radar reflectivity from operational, ground-based S- and C-band meteorological radars with comparable measurements from the Tropical Rainfall Measuring Mission (TRMM) satellite's Precipitation Radar (PR) instrument. The software is also applicable to other ground-based and space-based radars. The ground and space radar volume matching and comparison software was developed in response to requirements defined by the Ground Validation System (GVS) of Goddard's Global Precipitation Mission (GPM) project. This software innovation is specifically concerned with simplifying the comparison of ground- and space-based radar measurements for the purpose of GPM algorithm and data product validation. The software is unique in that it provides an operational environment to routinely create comparison products, and uses a direct geometric approach to derive common volumes of space- and ground-based radar data. In this approach, spatially coincident volumes are defined by the intersection of individual space-based Precipitation Radar rays with each of the conical elevation sweeps of the ground radar. Thus, the resampled volume elements of the space and ground radar reflectivity can be directly compared to one another.
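Locating where a near-vertical PR ray crosses each conical sweep requires knowing the height of the ground-radar beam at a given slant range and elevation. A standard ingredient for this kind of geometry is the beam-height formula under the 4/3 effective earth radius refraction model; the sketch below is illustrative and is not taken from the GVS software.

```python
import math

def beam_height(slant_range_m, elev_deg, radar_alt_m=0.0):
    """Height (m) of a radar beam above the radar site at a given slant range
    and elevation angle, using the standard 4/3-effective-earth-radius model
    to account for atmospheric refraction.
    """
    re_eff = 4.0 / 3.0 * 6371000.0  # effective earth radius (m)
    elev = math.radians(elev_deg)
    return (math.sqrt(slant_range_m ** 2 + re_eff ** 2
                      + 2.0 * slant_range_m * re_eff * math.sin(elev))
            - re_eff + radar_alt_m)
```

Evaluating this along a sweep gives the cone's height profile, against which the satellite ray's positions can be intersected to define the common volumes.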

  3. Development of Automatic Visceral Fat Volume Calculation Software for CT Volume Data

    Directory of Open Access Journals (Sweden)

    Mitsutaka Nemoto

    2014-01-01

Objective. To develop automatic visceral fat volume calculation software for computed tomography (CT) volume data and to evaluate its feasibility. Methods. A total of 24 sets of whole-body CT volume data and anthropometric measurements were obtained, with three sets for each of four BMI categories (under 20, 20 to 25, 25 to 30, and over 30) in both sexes. True visceral fat volumes were defined on the basis of manual segmentation of the whole-body CT volume data by an experienced radiologist. Software to automatically calculate visceral fat volumes was developed using a region segmentation technique based on morphological analysis with a CT value threshold. Automatically calculated visceral fat volumes were evaluated in terms of the correlation coefficient with the true volumes and the error relative to the true volume. Results. Automatic visceral fat volume calculation results for all 24 data sets were obtained successfully, and the average calculation time was 252.7 seconds/case. The correlation coefficients between the true visceral fat volume and the automatically calculated visceral fat volume were over 0.999. Conclusions. The newly developed software is feasible for calculating visceral fat volumes in a reasonable time and was proved to have high accuracy.
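The thresholding step of such a pipeline can be sketched briefly: fat voxels are selected by a Hounsfield unit window and the volume follows from the voxel count and spacing. The HU window of -190 to -30 is a commonly used range for adipose tissue; the function name and interface are illustrative assumptions, not the paper's implementation, and separating visceral from subcutaneous fat would additionally require the morphological analysis the paper describes.

```python
import numpy as np

def fat_volume_cm3(ct_hu, spacing_mm, hu_range=(-190.0, -30.0)):
    """Estimate total fat volume from a CT volume by HU thresholding.

    ct_hu      : 3D array of CT values in Hounsfield units
    spacing_mm : (dz, dy, dx) voxel spacing in millimetres
    Returns the fat volume in cubic centimetres.
    """
    lo, hi = hu_range
    mask = (ct_hu >= lo) & (ct_hu <= hi)          # candidate fat voxels
    voxel_mm3 = float(np.prod(spacing_mm))        # volume of one voxel
    return mask.sum() * voxel_mm3 / 1000.0        # mm^3 -> cm^3
```

Correlating such automatic volumes against a radiologist's manual segmentation, as the study does, is what validates the threshold and morphology choices.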

  4. Software Engineering and Knowledge Engineering Theory and Practice Volume 2

    CERN Document Server

    2012-01-01

The volume includes a set of selected papers, extended and revised, from the 2009 Pacific-Asia Conference on Knowledge Engineering and Software Engineering (KESE 2009), held on December 19-20, 2009, in Shenzhen, China. Volume 2 provides a forum for researchers, educators, engineers, and government officials involved in the general areas of Knowledge Engineering and Communication Technology to disseminate their latest research results and exchange views on the future research directions of these fields. 135 high-quality papers are included in the volume. Each paper has been peer-reviewed by at least two program committee members and selected by the volume editor, Prof. Yanwen Wu. On behalf of this volume, we would like to express our sincere appreciation to all of the authors and referees for their efforts in reviewing the papers. We hope readers will find many profound research ideas and results in the related fields of Knowledge Engineering and Communication Technology.

  5. Object-oriented software design in semiautomatic building extraction

    Science.gov (United States)

    Guelch, Eberhard; Mueller, Hardo

    1997-08-01

Developing a system for semiautomatic building acquisition is a complex process that requires constant integration and updating of software modules and user interfaces. To facilitate these processes we apply an object-oriented design not only for the data but also for the software involved. We use the Unified Modeling Language (UML) to describe the object-oriented modeling of the system at different levels of detail. We can distinguish between use cases from the user's point of view, which represent a sequence of actions yielding an observable result, and use cases for programmers, who can use the system as a class library to integrate the acquisition modules into their own software. The structure of the system is based on the model-view-controller (MVC) design pattern. An example from the integration of automated texture extraction for the visualization of results demonstrates the feasibility of this approach.

  6. IMAGE information monitoring and applied graphics software environment. Volume 4. Applications description

    International Nuclear Information System (INIS)

    Hallam, J.W.; Ng, K.B.; Upham, G.L.

    1986-09-01

The EPRI Information Monitoring and Applied Graphics Environment (IMAGE) system is designed for 'fast proto-typing' of advanced concepts for computer-aided plant operations tools. It is a flexible software system which can be used for rapidly creating, dynamically driving and evaluating advanced operator aid displays. The software is written to be both host computer and graphic device independent. This four-volume report includes an Executive Overview of the IMAGE package (Volume I), followed by the Software Description (Volume II), User's Guide (Volume III), and Description of Example Applications (Volume IV)

  7. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis of core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam-assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  8. Isobio software: biological dose distribution and biological dose volume histogram from physical dose conversion using linear-quadratic-linear model.

    Science.gov (United States)

    Jaikuna, Tanwiwat; Khadsiri, Phatchareewan; Chawapun, Nisa; Saekho, Suwit; Tharavichitkul, Ekkasit

    2017-02-01

To develop an in-house software program able to calculate and generate the biological dose distribution and biological dose volume histogram by physical dose conversion using the linear-quadratic-linear (LQL) model. The Isobio software was developed using MATLAB version 2014b to calculate and generate biological dose distributions and biological dose volume histograms. The physical dose from each voxel in treatment planning was extracted through the Computational Environment for Radiotherapy Research (CERR), and the accuracy was verified by the difference between the dose volume histogram from CERR and that from the treatment planning system. An equivalent dose in 2 Gy fractions (EQD2) was calculated using the biological effective dose (BED) based on the LQL model. The software calculation and a manual calculation were compared for EQD2 verification with paired t-test statistical analysis using IBM SPSS Statistics version 22 (64-bit). Two- and three-dimensional biological dose distributions and biological dose volume histograms were displayed correctly by the Isobio software. Different physical doses were found between CERR and the treatment planning system (TPS) in Oncentra, with 3.33% in the high-risk clinical target volume (HR-CTV) determined by D90%, 0.56% in the bladder and 1.74% in the rectum determined by D2cc, and less than 1% in Pinnacle. The difference in EQD2 between the software calculation and the manual calculation was not statistically significant (0.00% difference, with p-values of 0.820, 0.095, and 0.593 for external beam radiation therapy (EBRT) and 0.240, 0.320, and 0.849 for brachytherapy (BT) in the HR-CTV, bladder, and rectum, respectively). The Isobio software is a feasible tool for generating the biological dose distribution and biological dose volume histogram for treatment plan evaluation in both EBRT and BT.
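The EQD2 conversion at the heart of such a tool follows from the biologically effective dose: under the standard linear-quadratic model, BED = D(1 + d/(alpha/beta)) and EQD2 = BED / (1 + 2/(alpha/beta)), which simplifies to D(d + alpha/beta)/(2 + alpha/beta). The LQL model used by Isobio additionally applies a linear tail at high dose per fraction, which this minimal per-voxel sketch omits.

```python
def eqd2(total_dose_gy, dose_per_fraction_gy, alpha_beta_gy):
    """Equivalent dose in 2 Gy fractions from the linear-quadratic model.

    total_dose_gy        : total physical dose D (Gy)
    dose_per_fraction_gy : dose per fraction d (Gy)
    alpha_beta_gy        : tissue alpha/beta ratio (Gy),
                           e.g. ~10 for tumour, ~3 for late-responding tissue
    """
    ab = alpha_beta_gy
    # EQD2 = BED / (1 + 2/ab) = D * (d + ab) / (2 + ab)
    return total_dose_gy * (dose_per_fraction_gy + ab) / (2.0 + ab)
```

Applying this voxel-by-voxel to a physical dose grid yields the biological dose distribution the software displays; by construction, a plan delivered in 2 Gy fractions maps to itself.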

  9. TMS communications software. Volume 2: Bus interface unit

    Science.gov (United States)

    Gregor, P. J.

    1979-01-01

    A data bus communication system to support the space shuttle's Trend Monitoring System (TMS) and to provide a basis for evaluation of the bus concept is described. Installation of the system included developing both hardware and software interfaces between the bus and the specific TMS computers and terminals. The software written for the microprocessor-based bus interface units is described. The software implements both the general bus communications protocol and also the specific interface protocols for the TMS computers and terminals.

  10. Software Assurance Curriculum Project Volume 1: Master of Software Assurance Reference Curriculum

    Science.gov (United States)

    2010-08-01

    Glossary fragment: the curriculum's definitions are derived from ISO/IEC 12207 (IEEE Std 12207-2008, Systems and Software Engineering) and [CNSS 2009]; software quality is defined as the capability of a software product to satisfy stated and implied needs when used under specified conditions [ISO]. Listed acronyms include ISO (International Organization for Standardization), IT (information technology), KA (knowledge area), KU (knowledge unit), and MBA.

  11. IMAGE information monitoring and applied graphics software environment. Volume 2. Software description

    International Nuclear Information System (INIS)

    Hallam, J.W.; Ng, K.B.; Upham, G.L.

    1986-09-01

    The EPRI Information Monitoring and Applied Graphics Environment (IMAGE) system is designed for 'fast prototyping' of advanced concepts for computer-aided plant operations tools. It is a flexible software system that can be used for rapidly creating, dynamically driving, and evaluating advanced operator aid displays. The software is written to be both host-computer and graphic-device independent.

  12. Can we replace curation with information extraction software?

    Science.gov (United States)

    Karp, Peter D

    2016-01-01

    Can we use programs for automated or semi-automated information extraction from scientific texts as practical alternatives to professional curation? I show that error rates of current information extraction programs are too high to replace professional curation today. Furthermore, current IEP programs extract single narrow slivers of information, such as individual protein interactions; they cannot extract the large breadth of information extracted by professional curators for databases such as EcoCyc. They also cannot arbitrate among conflicting statements in the literature as curators can. Therefore, funding agencies should not hobble the curation efforts of existing databases on the assumption that a problem that has stymied Artificial Intelligence researchers for more than 60 years will be solved tomorrow. Semi-automated extraction techniques appear to have significantly more potential based on a review of recent tools that enhance curator productivity. But a full cost-benefit analysis for these tools is lacking. Without such analysis it is possible to expend significant effort developing information-extraction tools that automate small parts of the overall curation workflow without achieving a significant decrease in curation costs. © The Author(s) 2016. Published by Oxford University Press.

  13. Guidance and Control Software Project Data - Volume 4: Configuration Management and Quality Assurance Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.

  14. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-03-01

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems

  15. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    Energy Technology Data Exchange (ETDEWEB)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems.

  16. MyETL: A Java Software Tool to Extract, Transform, and Load Your Business

    Directory of Open Access Journals (Sweden)

    Michele Nuovo

    2015-12-01

    The project follows the development of a Java software tool that extracts data from flat files (fixed-length record type), CSV (comma-separated values) files, and XLS (Microsoft Excel 97-2003 worksheet) files, applies transformations to those sources, and finally loads the data into the end-target RDBMS. The software implements a process known as ETL (Extract, Transform, and Load); such systems are called ETL systems.
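
    MyETL itself is written in Java; as a language-neutral sketch of the extract-transform-load pattern it implements, here is a minimal version using only the Python standard library. The table name, column names, and transformations are illustrative for the example, not taken from MyETL.

    ```python
    import csv
    import io
    import sqlite3

    def etl(csv_text, conn):
        # Extract: parse rows from the CSV source.
        rows = csv.DictReader(io.StringIO(csv_text))
        # Transform: trim and normalize names, cast amounts to float.
        cleaned = [(r["name"].strip().title(), float(r["amount"])) for r in rows]
        # Load: insert the transformed rows into the target RDBMS.
        conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
        conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
        conn.commit()
        return len(cleaned)

    conn = sqlite3.connect(":memory:")
    loaded = etl("name,amount\n alice ,10.5\nBOB,2\n", conn)
    ```

    A real ETL tool adds source adapters (fixed-length records, XLS), configurable transformation rules, and drivers for the target database, but the three-stage pipeline is the same.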

  17. PySE: Software for extracting sources from radio images

    Science.gov (United States)

    Carbone, D.; Garsden, H.; Spreeuw, H.; Swinbank, J. D.; van der Horst, A. J.; Rowlinson, A.; Broderick, J. W.; Rol, E.; Law, C.; Molenaar, G.; Wijers, R. A. M. J.

    2018-04-01

    PySE is a Python software package for finding and measuring sources in radio telescope images. The software was designed to detect sources in the LOFAR telescope images, but can be used with images from other radio telescopes as well. We introduce the LOFAR Telescope, the context within which PySE was developed, the design of PySE, and describe how it is used. Detailed experiments on the validation and testing of PySE are then presented, along with results of performance testing. We discuss some of the current issues with the algorithms implemented in PySE and their interaction with LOFAR images, concluding with the current status of PySE and its future development.

  18. Extraction and LOD control of colored interval volumes

    Science.gov (United States)

    Miyamura, Hiroko N.; Takeshima, Yuriko; Fujishiro, Issei; Saito, Takafumi

    2005-03-01

    An interval volume serves as a generalized isosurface, representing the three-dimensional subvolume in which the associated scalar field values lie within a user-specified closed interval. In general, it is not an easy task for novices to specify the scalar field interval corresponding to their regions of interest. In order to extract interval volumes from which desirable geometric features can be mined effectively, we propose a suggestive technique that extracts interval volumes automatically based on a global examination of the field contrast structure. Also proposed here is a simplification scheme for decimating the resultant triangle patches to realize efficient transmission and rendering of large-scale interval volumes. Color distributions as well as geometric features are taken into account to select the best edges to be collapsed. In addition, when a user wants to selectively display and analyze the original dataset, the simplified dataset is restored to the original quality. Several simulated and acquired datasets are used to demonstrate the effectiveness of the present methods.
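
    The core selection step, testing each voxel's scalar value for membership in a user-specified closed interval, can be sketched as below. The actual technique extracts a triangulated boundary and then decimates it; this illustration only builds the voxel mask and measures the resulting subvolume, with all names assumed for the example.

    ```python
    def interval_volume_mask(field, lo, hi):
        """Mark voxels whose scalar value lies within the closed interval [lo, hi]."""
        return [[[lo <= v <= hi for v in row] for row in plane] for plane in field]

    def subvolume(field, lo, hi, voxel_volume=1.0):
        """Approximate the interval volume as (selected voxel count) * (voxel volume)."""
        mask = interval_volume_mask(field, lo, hi)
        return voxel_volume * sum(v for plane in mask for row in plane for v in row)

    # A 2x2x2 scalar field; select the subvolume with values in [0.4, 0.8].
    field = [[[0.1, 0.5], [0.6, 0.9]], [[0.4, 0.2], [0.8, 0.7]]]
    vol = subvolume(field, 0.4, 0.8)
    ```

    Setting lo equal to hi recovers the ordinary isosurface case, which is why the interval volume is described as a generalized isosurface.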

  19. Dynamic Displays for Tactical Planning. Volume 3. Software Documentation

    Science.gov (United States)

    1979-12-01

    Figure 3-7. High-Level Flow of Data. Section 4.0, Major Software Modules, contains write-ups on the major subprograms in TOMM, with pages arranged in alphabetical order. Each write-up includes the name of the subprogram at the top of the page, followed by (1) a list of the overlays …

  20. Software Assurance Curriculum Project Volume 3: Master of Software Assurance Course Syllabi

    Science.gov (United States)

    2011-07-01

    Course syllabus fragment covering the methods and process of model-driven development. Readings include: • Pressman, Roger S., Software Engineering: A Practitioner's Approach, 6th ed., McGraw Hill, 2009 • [Bishop 2002] Chapter 18 • [Allen 2008] Chapters 1-4 • [Pressman 2009] Chapters 1, 3, 4 • [Merkow 2010] Chapters 3, 5 • [Mouratidis 2007] • [DHS 2008-2009a] • [Mellado 2010] • [CERT 2009]

  1. Evaluation of a new software tool for the automatic volume calculation of hepatic tumors. First results

    International Nuclear Information System (INIS)

    Meier, S.; Mildenberger, P.; Pitton, M.; Thelen, M.; Schenk, A.; Bourquain, H.

    2004-01-01

    Purpose: computed tomography has become the preferred method for detecting liver carcinomas. The introduction of spiral CT added volumetric assessment of intrahepatic tumors, which was unattainable in the clinical routine with incremental CT due to complex planimetric revisions and excessive computing time. In an ongoing clinical study, a new software tool was tested for the automatic determination of tumor volume and the time needed for this procedure. Materials and methods: we analyzed patients suffering from hepatocellular carcinoma (HCC). All patients underwent treatment with repeated transcatheter chemoembolization of the hepatic artery. The volumes of the HCC lesions detected on CT were measured with the new software tool in HepaVision (MeVis, Germany). The results were compared with manual planimetric calculation of the volume performed by three independent radiologists. Results: our first results in 16 patients show a correlation of 96.8% between the automatically and manually calculated volumes (up to a difference of 2 ml). While the manual method of analyzing the volume of a lesion requires 2.5 minutes on average, the automatic method merely requires about 30 seconds of user interaction time. Conclusion: these preliminary results show a good correlation between automatic and manual calculations of the tumor volume. The new software tool requires less time for accurate determination of the tumor volume and can be applied in the daily clinical routine.

  2. Bottom-Up Technologies for Reuse: Automated Extractive Adoption of Software Product Lines

    OpenAIRE

    Martinez, Jabier; Ziadi, Tewfik; Bissyandé, Tegawendé; Klein, Jacques; Le Traon, Yves

    2017-01-01

    Adopting Software Product Line (SPL) engineering principles demands a high up-front investment. Bottom-Up Technologies for Reuse (BUT4Reuse) is a generic and extensible tool aimed at leveraging existing similar software products in order to help in extractive SPL adoption. The envisioned users are 1) SPL adopters and 2) integrators of techniques and algorithms to provide automation in SPL adoption activities. We present the methodology it implies for both types of users ...

  3. CrossTalk. The Journal of Defense Software Engineering. Volume 15, Number 12, December 2002

    Science.gov (United States)

    2002-12-01

    CrossTalk: The Journal of Defense Software Engineering, Volume 15, Number 12, December 2002. Excerpt: • You sit backwards on Disneyland rides to see how they do the special effects. • You've tried to repair a $5 radio. • You look forward to Christmas so …

  4. Measuring stone volume - three-dimensional software reconstruction or an ellipsoid algebra formula?

    Science.gov (United States)

    Finch, William; Johnston, Richard; Shaida, Nadeem; Winterbottom, Andrew; Wiseman, Oliver

    2014-04-01

    To determine the optimal method for assessing stone volume, and thus stone burden, by comparing the accuracy of scalene, oblate, and prolate ellipsoid volume equations with three-dimensional (3D)-reconstructed stone volume. Kidney stone volume may be helpful in predicting treatment outcome for renal stones. While the precise measurement of stone volume by 3D reconstruction can be accomplished using modern computed tomography (CT) scanning software, this technique is not available in all hospitals or with routine acute colic scanning protocols. Therefore, maximum diameters as measured by either X-ray or CT are used in the calculation of stone volume based on a scalene ellipsoid formula, as recommended by the European Association of Urology. In all, 100 stones with both X-ray and CT (1-2-mm slices) were reviewed. Complete and partial staghorn stones were excluded. Stone volume was calculated using software designed to measure tissue density of a certain range within a specified region of interest. Correlation coefficients among all measured outcomes were compared. Stone volumes were analysed to determine the average 'shape' of the stones. The maximum stone diameter on X-ray was 3-25 mm and on CT was 3-36 mm, with a reasonable correlation (r = 0.77). The average stone shape varied with size, with stones >15 mm trending towards scalene ellipsoids. There was no difference in stone shape by location within the kidney. As the average shape of renal stones changes with diameter, no single equation for estimating stone volume can be recommended. As the maximum diameter increases, calculated stone volume becomes less accurate, suggesting that larger stones have more asymmetric shapes. We recommend that research looking at stone clearance rates use 3D-reconstructed stone volumes when available, followed by prolate, oblate, or scalene ellipsoid formulas depending on the maximum stone diameter. © 2013 The Authors. BJU International © 2013 BJU International.
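
    The three ellipsoid approximations compared in this study all reduce to V = (π/6)·a·b·c with a, b, c the three stone diameters; the oblate and prolate cases simply constrain two of the diameters to be equal. A minimal sketch (function names assumed for the example):

    ```python
    from math import pi

    def scalene_volume(a, b, c):
        # General (scalene) ellipsoid: V = (pi/6) * a * b * c, with diameters a, b, c.
        return pi / 6.0 * a * b * c

    def prolate_volume(d_max, d_min):
        # Prolate spheroid: one long diameter, two equal short ones.
        return scalene_volume(d_max, d_min, d_min)

    def oblate_volume(d_max, d_min):
        # Oblate spheroid: two equal long diameters, one short one.
        return scalene_volume(d_max, d_max, d_min)

    # A 10 x 8 x 6 mm stone under the scalene formula:
    v = scalene_volume(10.0, 8.0, 6.0)  # 80 * pi, roughly 251.3 mm^3
    ```

    When a = b = c = d this collapses to the sphere volume (π/6)·d³, which is a quick way to check the constant. The study's point is that the right choice among these formulas depends on the maximum stone diameter.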

  5. Software Engineering Laboratory (SEL) data base reporting software user's guide and system description. Volume 1: Introduction and user's guide

    Science.gov (United States)

    1983-01-01

    Reporting software programs provide formatted listings and summary reports of the Software Engineering Laboratory (SEL) data base contents. The operating procedures and system information for 18 different reporting software programs are described. Sample output reports from each program are provided.

  6. Technical Reviews and Audits for Systems, Equipment and Computer Software. Volume 1

    Science.gov (United States)

    2009-09-15

    This new-issue SMC standard comprises the text of The Aerospace Corporation report number TOR-2007(8583)-6414 and is intended for use in acquisitions and technology developments. Referenced documents include the Technology Readiness Assessment (TRA) Deskbook, DUSD(S&T) (May 2005); the IMP & IMS Preparation and Use Guide, Version 0.9 (21 October 2005); ISO/IEC STD 15939, Software Measurement; and MIL-STD-1521B (TOR-2007(8583)-6414, Volume 1). Purpose: the guidelines contained herein implement Department of Defense Directive 4120.21.

  7. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand.

    Science.gov (United States)

    Chung, Beom Sun; Chung, Min Suk; Shin, Byeong Seok; Kwon, Koojoo

    2018-02-19

    The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high-quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of the stereoscopic morphology of the hand. On the basis of horizontally sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were produced. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were developed. All of the software tools are downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand can be employed according to individual needs. These new tools, involving realistic images of a cadaver and diverse functions, are expected to improve comprehensive knowledge of the hand's shape. © 2018 The Korean Academy of Medical Sciences.

  8. Software development to estimate the leaked volume from ruptured submarine pipelines; Desenvolvimento de um software para estimativa do volume vazado a partir de dutos submarions rompidos

    Energy Technology Data Exchange (ETDEWEB)

    Quadri, Marintho B.; Machado, Ricardo A.F.; Nogueira, Andre L.; Lopes, Toni J. [Universidade Federal de Santa Catarina (UFSC), Florianopolis, SC (Brazil). Dept. de Engenharia Quimica; Baptista, Renan M. [PETROBRAS, Rio de Janeiro, RJ (Brazil). Centro de Pesquisas (CENPES)

    2004-07-01

    The considerable increase in world petroleum consumption, together with the exhaustion of onshore reserves in recent decades, has led companies to exploit petroleum in offshore reserves (both shallow and deep water). As in onshore operations, accidents may also occur in submarine exploration. Leaking from submarine pipelines arises from corrosion pits and from axial or radial breakage. In all three situations, the leak proceeds in three stages: pipeline depressurization until the internal pressure equals the external one; advective migration, in which the driving force is the difference in the physical properties of the fluids; and oil spill movement on the sea surface. A great number of mathematical models are available for the first and third stages. For the second, theoretically the most important one, only a restricted number of works address the leaked oil volume. The present study presents software capable of accurately simulating leakage through the advective migration phenomenon. The software was validated for different hole radii located in the upper side of a horizontal pipeline. Model results presented very good agreement with experimental data.

  9. Guidelines for the verification and validation of expert system software and conventional software: Project summary. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally. A V&V methodology for expert systems is presented based on three factors: (1) a system's judged need for V&V (based in turn on its complexity and degree of required integrity); (2) the life-cycle phase; and (3) the system component being tested.

  10. Guidelines for the verification and validation of expert system software and conventional software. Volume 1: Project summary. Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally.

  11. Guidelines for the verification and validation of expert system software and conventional software: Project summary. Volume 1

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally. A V&V methodology for expert systems is presented based on three factors: (1) a system's judged need for V&V (based in turn on its complexity and degree of required integrity); (2) the life-cycle phase; and (3) the system component being tested.

  12. Network, system, and status software enhancements for the autonomously managed electrical power system breadboard. Volume 1: Project summary

    Science.gov (United States)

    Mckee, James W.

    1990-01-01

    This volume (1 of 4) gives a summary of the original AMPS software system configuration, points out some of the problem areas in the original software design that this project is to address, and in the appendix collects all the bimonthly status reports. The purpose of AMPS is to provide a self reliant system to control the generation and distribution of power in the space station. The software in the AMPS breadboard can be divided into three levels: the operating environment software, the protocol software, and the station specific software. This project deals only with the operating environment software and the protocol software. The present station specific software will not change except as necessary to conform to new data formats.

  13. Guidelines for the verification and validation of expert system software and conventional software: Validation scenarios. Volume 6

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    This report is the sixth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project, which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity was concerned with the development of a methodology for selecting validation scenarios and subsequently applying it to two expert systems used for nuclear utility applications. Validation scenarios were defined and classified into five categories: PLANT, TEST, BASICS, CODE, and LICENSING. A sixth type, REGRESSION, is a composite of the others and refers to the practice of using trusted scenarios to ensure that modifications to software did not change unmodified functions. Rationale was developed for preferring scenarios selected from the categories in the order listed and for determining under what conditions to select scenarios from other types. A procedure incorporating all of the recommendations was developed as a generalized method for generating validation scenarios. The procedure was subsequently applied to two expert systems used in the nuclear industry and was found to be effective, given that an experienced nuclear engineer made the final scenario selections. A method for generating scenarios directly from the knowledge base component was suggested.

  14. Guidelines for the verification and validation of expert system software and conventional software: Validation scenarios. Volume 6

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the sixth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity was concerned with the development of a methodology for selecting validation scenarios and subsequently applying it to two expert systems used for nuclear utility applications. Validation scenarios were defined and classified into five categories: PLANT, TEST, BASICS, CODE, and LICENSING. A sixth type, REGRESSION, is a composite of the others and refers to the practice of using trusted scenarios to ensure that modifications to software did not change unmodified functions. Rationale was developed for preferring scenarios selected from the categories in the order listed and for determining under what conditions to select scenarios from other types. A procedure incorporating all of the recommendations was developed as a generalized method for generating validation scenarios. The procedure was subsequently applied to two expert systems used in the nuclear industry and was found to be effective, given that an experienced nuclear engineer made the final scenario selections. A method for generating scenarios directly from the knowledge base component was suggested.

  15. LLCEDATA and LLCECALC for Windows version 1.0, Volume 3: Software verification and validation

    International Nuclear Information System (INIS)

    McFadden, J.G.

    1998-01-01

    LLCEDATA and LLCECALC for Windows are user-friendly computer software programs that work together to determine the proper waste designation, handling, and disposition requirements for Long Length Contaminated Equipment (LLCE). LLCEDATA reads from a variety of databases to produce an equipment data file (EDF) that represents a snapshot of both the LLCE and the tank from which it originates. LLCECALC reads the EDF and the gamma assay file (AV2) that is produced by the Flexible Receiver Gamma Energy Analysis System. LLCECALC performs corrections to the AV2 file as it is being read and characterizes the LLCE. Both programs produce a variety of reports, including a characterization report and a status report. The status report documents each action taken by the user, LLCEDATA, and LLCECALC. Documentation for LLCEDATA and LLCECALC for Windows is available in three volumes. Volume 1 is a user's manual, intended as a quick reference for both LLCEDATA and LLCECALC. Volume 2 is a technical manual, which discusses system limitations and provides recommendations for the LLCE process. Volume 3 documents the verification and validation of LLCEDATA and LLCECALC. Two of the three installation test cases from Volume 1 are independently confirmed. The databases used in LLCEDATA are verified and referenced. Both phases of LLCECALC processing, gamma assay and characterization, are extensively tested to verify that the methodology and algorithms used are correct.

  16. Comparison of Perfusion CT Software to Predict the Final Infarct Volume After Thrombectomy.

    Science.gov (United States)

    Austein, Friederike; Riedel, Christian; Kerby, Tina; Meyne, Johannes; Binder, Andreas; Lindner, Thomas; Huhndorf, Monika; Wodarg, Fritz; Jansen, Olav

    2016-09-01

    Computed tomographic perfusion is a physiological imaging modality of interest for selecting patients for reperfusion therapy in acute ischemic stroke. The purpose of our study was to determine the accuracy of different commercial perfusion CT software packages (Philips (A), Siemens (B), and RAPID (C)) in predicting the final infarct volume (FIV) after mechanical thrombectomy. Single-institutional computed tomographic perfusion data from 147 mechanically recanalized acute ischemic stroke patients were postprocessed. Ischemic core and FIV were compared with respect to the thrombolysis in cerebral infarction (TICI) score and the time interval to reperfusion. FIV was measured at follow-up imaging between days 1 and 8 after stroke. In 118 successfully recanalized patients (TICI 2b/3), a moderately to strongly positive correlation was observed between ischemic core and FIV. The highest accuracy and best correlation were seen in early and fully recanalized patients (Pearson r for A=0.42, B=0.64, and C=0.83; P<0.001). Bland-Altman plots and boxplots demonstrated smaller ranges for package C than for A and B. Significant differences were found between the packages with respect to over- and underestimation of the ischemic core. Package A, compared with B and C, classified more than twice as many patients as having a malignant stroke profile (P<0.001). Package C best predicted hypoperfusion volume in unsuccessfully recanalized patients. Our study demonstrates the best accuracy and closest agreement between the results of a fully automated software package (RAPID) and FIV, especially in early and fully recanalized patients. Furthermore, this software package overestimated the FIV to a significantly lower degree and estimated a malignant mismatch profile less often than the other software. © 2016 American Heart Association, Inc.
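The study's headline statistics, Pearson correlation between predicted ischemic core and FIV plus Bland-Altman agreement, can be reproduced on any table of paired volumes with a few lines of NumPy. The volumes below are fabricated for illustration and are not the study's data:

```python
import numpy as np

def compare_core_to_final_infarct(core_ml, fiv_ml):
    """Compare predicted ischemic core volumes to final infarct volumes.

    Returns the Pearson correlation plus the Bland-Altman bias (mean
    difference) and 95% limits of agreement, the same summary
    statistics used to compare the software packages.
    """
    core = np.asarray(core_ml, dtype=float)
    fiv = np.asarray(fiv_ml, dtype=float)
    r = np.corrcoef(core, fiv)[0, 1]        # Pearson r
    diff = core - fiv                       # per-patient over-/underestimation
    bias = diff.mean()                      # mean difference (bias)
    half_width = 1.96 * diff.std(ddof=1)    # half-width of 95% limits
    return r, bias, (bias - half_width, bias + half_width)

# Illustrative (fabricated) volumes in mL for five patients:
core = [10, 25, 40, 18, 55]
fiv = [12, 30, 38, 20, 60]
r, bias, limits = compare_core_to_final_infarct(core, fiv)
```

A negative bias here would indicate that the package tends to underestimate the final infarct, the direction of error the study scored for each package.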

  17. Extrusion Process by Finite Volume Method Using OpenFoam Software

    International Nuclear Information System (INIS)

    Matos Martins, Marcelo; Tonini Button, Sergio; Divo Bressan, Jose; Ivankovic, Alojz

    2011-01-01

    Computational codes are important tools for solving engineering problems. The analysis of metal forming processes such as extrusion is no exception, because computational codes allow the process to be analyzed at reduced cost. Traditionally, the Finite Element Method is used to solve solid mechanics problems; however, the Finite Volume Method (FVM) has been gaining ground in this field of application. This paper presents velocity field and friction coefficient variation results obtained by numerical simulation of an aluminum direct cold extrusion process, using the OpenFOAM software and the FVM.

  18. Oxygen octahedra picker: A software tool to extract quantitative information from STEM images

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yi, E-mail: y.wang@fkf.mpg.de; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y.; Aken, Peter A. van

    2016-09-15

    In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information of the lattice and of BO₆ octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples. - Highlights: • We report a software tool for mapping atomic positions from HAADF and ABF images. • It enables quantification of both crystal lattice and oxygen octahedral distortions. • We test the measurement accuracy and precision on simulated and experimental images. • It works well for different orientations of perovskite structures and interfaces.
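The 2D Gaussian fitting used to refine atom-column positions can be sketched with SciPy. The paper's own implementation is not described in this abstract, so the symmetric-Gaussian model, patch size, and initial guesses below are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_2d(coords, amp, x0, y0, sigma, offset):
    """Symmetric 2D Gaussian evaluated on flattened (x, y) coordinates."""
    x, y = coords
    return offset + amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2)
                                 / (2 * sigma ** 2))

def locate_column(patch):
    """Fit a 2D Gaussian to an image patch around one atom column
    and return the refined sub-pixel (x0, y0) position."""
    ny, nx = patch.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    coords = (xx.ravel(), yy.ravel())
    p0 = (patch.max() - patch.min(),     # rough amplitude guess
          nx / 2, ny / 2,                # centre of the patch
          2.0, patch.min())              # width and background offset
    popt, _ = curve_fit(gaussian_2d, coords, patch.ravel(), p0=p0)
    return popt[1], popt[2]

# Synthetic test: a noiseless column centred at (10.3, 9.7) in a 21x21 patch
yy, xx = np.mgrid[0:21, 0:21]
patch = 100 * np.exp(-((xx - 10.3) ** 2 + (yy - 9.7) ** 2)
                     / (2 * 2.0 ** 2)) + 5
x0, y0 = locate_column(patch)
```

On noiseless synthetic data the fit recovers the centre to far better than a pixel, which is the precision regime the paper evaluates on simulated images.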

  19. SpecOp: Optimal Extraction Software for Integral Field Unit Spectrographs

    Science.gov (United States)

    McCarron, Adam; Ciardullo, Robin; Eracleous, Michael

    2018-01-01

    The Hobby-Eberly Telescope’s new low resolution integral field spectrographs, LRS2-B and LRS2-R, each cover a 12”x6” area on the sky with 280 fibers and generate spectra with resolutions between R=1100 and R=1900. To extract 1-D spectra from the instrument’s 3D data cubes, a program is needed that is flexible enough to work for a wide variety of targets, including continuum point sources, emission line sources, and compact sources embedded in complex backgrounds. We therefore introduce SpecOp, a user-friendly Python program for optimally extracting spectra from integral-field unit spectrographs. As input, SpecOp takes a sky-subtracted data cube consisting of images at each wavelength increment set by the instrument’s spectral resolution, and an error file for each count measurement. All of these files are generated by the current LRS2 reduction pipeline. The program then collapses the cube in the image plane using the optimal extraction algorithm detailed by Keith Horne (1986). The various user-selected options include the fraction of the total signal enclosed in a contour-defined region, the wavelength range to analyze, and the precision of the spatial profile calculation. SpecOp can output the weighted counts and errors at each wavelength in various table formats using Python’s astropy package. We outline the algorithm used for extraction and explain how the software can be used to easily obtain high-quality 1-D spectra. We demonstrate the utility of the program by applying it to spectra of a variety of quasars and AGNs. In some of these targets, we extract the spectrum of a nuclear point source that is superposed on a spatially extended galaxy.
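Horne's (1986) optimal extraction weights each pixel by the spatial profile squared over its variance, so collapsing one wavelength slice reduces to a weighted sum. The following is a minimal sketch of that core step, not SpecOp itself; the full algorithm's profile estimation and cosmic-ray rejection stages are omitted:

```python
import numpy as np

def horne_extract(data, var, profile):
    """Optimal extraction (Horne 1986) of one wavelength slice.

    data    : 2D sky-subtracted counts in the image plane
    var     : 2D variance of each pixel
    profile : 2D spatial profile of the source (normalized to sum to 1)

    Returns the optimally weighted flux and its 1-sigma error.
    """
    p = profile / profile.sum()          # enforce normalization
    w = p ** 2 / var                     # inverse-variance profile weights
    flux = np.sum(p * data / var) / np.sum(w)
    err = np.sqrt(1.0 / np.sum(w))
    return flux, err

# Synthetic point source of total flux 1000 on a Gaussian profile:
yy, xx = np.mgrid[0:15, 0:15]
p = np.exp(-((xx - 7) ** 2 + (yy - 7) ** 2) / (2 * 2.0 ** 2))
p /= p.sum()
data = 1000.0 * p                        # noiseless for the demo
var = np.ones_like(data)                 # unit variance per pixel
flux, err = horne_extract(data, var, p)
```

Because the weights concentrate on high-signal pixels, this estimator beats a plain aperture sum in signal-to-noise while remaining unbiased for the total flux.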

  20. Guidelines for the verification and validation of expert system software and conventional software. Volume 7, User's manual: Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    Reliable software is required for nuclear power industry applications. Verification and validation techniques applied during the software development process can help eliminate errors that could inhibit the proper operation of digital systems and cause availability and safety problems. Most of the techniques described in this report are valid for conventional software systems as well as for expert systems. The project resulted in a set of 16 V&V guideline packages and 11 sets of procedures based on the class, development phase, and system component being tested. These guideline packages and procedures help a utility define the level of V&V, which involves evaluating the complexity and type of software component along with the consequences of failure. In all, the project identified 153 V&V techniques for conventional software systems and demonstrated their application to all aspects of expert systems except for the knowledge base, which requires specially developed tools. Each of these conventional techniques covers from 2 to 52 types of conventional software defects, and each defect is covered by 21-50 V&V techniques. The project also identified automated tools to support V&V activities.

  1. Automated concept-level information extraction to reduce the need for custom software and rules development.

    Science.gov (United States)

    D'Avolio, Leonard W; Nguyen, Thien M; Goryachev, Sergey; Fiore, Louis D

    2011-01-01

    Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval. A 'learn by example' approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance. Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments, while precision was near or above 0.90 in all tasks. With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach to more specific clinical information extraction tasks. There was not one best configuration, supporting an iterative approach to model creation. Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation are available for download.
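The reported pattern, precision near 0.90 with recall lagging and F around 0.83, follows directly from the standard F-measure definitions used to score the i2b2/VA task. The counts below are illustrative, chosen to mirror that pattern, and are not the paper's raw tallies:

```python
def f_measure(tp, fp, fn):
    """Precision, recall, and F1 from true/false positive and
    false negative counts, as scored in concept extraction tasks."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Illustrative counts: high precision, lagging recall, F near 0.83
p, r, f1 = f_measure(tp=900, fp=100, fn=270)
```

Because F1 is the harmonic mean, a recall deficit pulls the score down faster than the same-sized precision surplus pushes it up, which is why strong precision alone left the average F at 0.83.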

  2. Two-dimensional particle simulation of negative ion extraction from a volume source

    International Nuclear Information System (INIS)

    Naitou, H.; Fukumasa, O.; Sakachou, K.; Mutou, K.

    1995-01-01

    A two-dimensional electrostatic particle simulation was performed to study the extraction of negative ions from a volume plasma source. The simulation model is a rectangular system consisting of an extraction grid, a plasma grid, and a grounded wall. The full dynamics of electrons, positive ions, and negative ions are followed. Negative ions are extracted from the plasma region to the extraction grid through a slit in the plasma grid. For lower values of the extraction grid potential, the simulation results agree with the Child-Langmuir law, in which the extracted negative ion current is proportional to the three-halves power of the extraction grid potential. For higher values of the extraction grid potential, the space charge of the negative ions that enter the beamline at the top of the concavity of the positive ion boundary reduces the negative ion current below the prediction of the Child-Langmuir law.
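The three-halves-power scaling that the simulation recovers at low extraction potentials can be checked numerically from the Child-Langmuir law for a planar gap. The gap width and the H⁻ ion mass below are illustrative assumptions, not values from the paper:

```python
import math

EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
Q = 1.602176634e-19       # elementary charge, C
M_HMINUS = 1.6735e-27     # approximate H- ion mass, kg (assumed species)

def child_langmuir_j(voltage, gap):
    """Space-charge-limited current density (A/m^2) across a planar gap:
    J = (4/9) * eps0 * sqrt(2q/m) * V^(3/2) / d^2
    """
    return (4.0 / 9.0) * EPS0 * math.sqrt(2.0 * Q / M_HMINUS) \
        * voltage ** 1.5 / gap ** 2

# The 3/2-power scaling: doubling V multiplies J by 2^1.5 ≈ 2.83
j1 = child_langmuir_j(1000.0, 0.005)
j2 = child_langmuir_j(2000.0, 0.005)
```

At higher potentials the simulation's extracted current falls below this curve, which is the space-charge effect of the negative ions themselves described in the abstract.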

  3. Clinical records anonymisation and text extraction (CRATE): an open-source software system.

    Science.gov (United States)

    Cardinal, Rudolf N

    2017-04-26

    Electronic medical records contain information of value for research, but contain identifiable and often highly sensitive confidential information. Patient-identifiable information cannot in general be shared outside clinical care teams without explicit consent, but anonymisation/de-identification allows research uses of clinical data without explicit consent. This article presents CRATE (Clinical Records Anonymisation and Text Extraction), an open-source software system with separable functions: (1) it anonymises or de-identifies arbitrary relational databases, with sensitivity and precision similar to previous comparable systems; (2) it uses public secure cryptographic methods to map patient identifiers to research identifiers (pseudonyms); (3) it connects relational databases to external tools for natural language processing; (4) it provides a web front end for research and administrative functions; and (5) it supports a specific model through which patients may consent to be contacted about research. Creation and management of a research database from sensitive clinical records with secure pseudonym generation, full-text indexing, and a consent-to-contact process is possible and practical using entirely free and open-source software.
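Function (2), mapping patient identifiers to research pseudonyms with public cryptographic methods, can be illustrated with a keyed hash. CRATE's actual construction may differ, so treat the HMAC-SHA256 scheme, the key, and the `RID_` prefix here as assumptions for the sketch:

```python
import hashlib
import hmac

def research_id(patient_id: str, secret_key: bytes) -> str:
    """Map a patient identifier to a stable research pseudonym.

    Keyed hashing (HMAC-SHA256) is a standard cryptographic way to
    derive research identifiers: the mapping is deterministic, so
    records for the same patient link together, but it cannot be
    reversed without the secret key held in the clinical zone.
    """
    digest = hmac.new(secret_key, patient_id.encode("utf-8"),
                      hashlib.sha256).hexdigest()
    return "RID_" + digest[:16]   # truncated for readability (assumption)

key = b"keep-this-key-inside-the-clinical-zone"  # illustrative key
rid = research_id("NHS-123-456-7890", key)
```

Keeping the key on the clinical side of the boundary is what makes the pseudonyms safe to release: without it, an attacker cannot enumerate patient identifiers and test them against the research IDs.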

  4. Increased sinusoidal volume and solute extraction during retrograde liver perfusion

    International Nuclear Information System (INIS)

    Bass, N.M.; Manning, J.A.; Weisiger, R.A.

    1989-01-01

    Retrograde isolated liver perfusion has been used to probe acinar functional heterogeneity, but the hemodynamic effects of backward flow have not been characterized. In this study, extraction of a long-chain fatty acid derivative, 12-N-methyl-7-nitrobenzo-2-oxa-1,3-diazol-amino stearate (12-NBDS), was greater during retrograde than during anterograde perfusion of isolated rat liver. To determine whether hemodynamic differences between anterograde and retrograde perfused livers could account for this finding, the hepatic extracellular space was measured for both directions of flow by means of [14C]sucrose washout during perfusion as well as by direct measurement of [14C]sucrose entrapped during perfusion. A three- to fourfold enlargement of the total hepatic extracellular space was found during retrograde perfusion by both approaches. Examination of perfusion-fixed livers by light microscopy and morphometry revealed that marked distension of the sinusoids occurred during retrograde perfusion and that this accounts for the observed increase in the [14C]sucrose space. These findings support the hypothesis that maximum resistance to perfusate flow in the isolated perfused rat liver is located at the presinusoidal level. In addition, increased transit time of perfusate through the liver and greater sinusoidal surface area resulting from sinusoidal distension may account for the higher extraction of 12-NBDS, and possibly other compounds, by the retrograde perfused liver.

  5. Method for quantifying the uncertainty with the extraction of the raw data of a gamma ray spectrum by deconvolution software

    International Nuclear Information System (INIS)

    Vigineix, Thomas; Guillot, Nicolas; Saurel, Nicolas

    2013-06-01

    Gamma ray spectrometry is a passive non-destructive assay commonly used to identify and quantify the radionuclides present in large, complex objects such as nuclear waste packages. The treatment of spectra from the measurement of nuclear waste is done in two steps: the first step is to extract the raw data from the spectra (the energies and net areas of the photoelectric absorption peaks), and the second step is to determine the detection efficiency of the measuring scene. Commercial software packages use different methods to extract the raw spectrum data, but none is optimal for the treatment of spectra containing actinides. Spectra must be handled individually, requiring configuration and substantial feedback from the operator, which prevents automatic processing of spectra and increases the risk of human error. In this context, the Nuclear Measurement and Valuation Laboratory (LMNE) at the Atomic Energy Commission Valduc (CEA Valduc) has developed a new methodology for quantifying the uncertainty associated with the extraction of the raw data from a spectrum. This methodology was applied with raw data and commercial software that require configuration by the operator (GENIE2000, Interwinner...). This robust and fully automated uncertainty-calculation methodology covers the entire processing chain of the software. The methodology ensures, for all peaks processed by the deconvolution software, extraction of peak energies to within 2 channels and extraction of net areas with an uncertainty of less than 5 percent. The methodology was tested experimentally with actinide spectra. (authors)
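The 5-percent net-area criterion can be related to counting statistics with the textbook gross-minus-background formula. The counts and the single background-region model below are illustrative assumptions, not the LMNE procedure itself:

```python
import math

def net_peak_area(gross_counts, bkg_counts, bkg_scale):
    """Net photopeak area and its counting uncertainty.

    gross_counts : total counts in the peak region
    bkg_counts   : counts in the flanking background regions
    bkg_scale    : ratio of peak-region width to background-region width

    For the net area N = G - s*B, Poisson statistics give
    sigma^2 = G + s^2 * B (counts are independent Poisson variates).
    """
    net = gross_counts - bkg_scale * bkg_counts
    sigma = math.sqrt(gross_counts + bkg_scale ** 2 * bkg_counts)
    return net, sigma

# Illustrative peak: 12000 gross counts over an equal-width background
net, sigma = net_peak_area(12000, 4000, 1.0)
rel = sigma / net   # relative uncertainty, cf. the <5% acceptance level
```

Even this simplest estimate shows why actinide spectra are hard: when peaks overlap and backgrounds rise, the effective background term grows and the relative uncertainty of small peaks quickly exceeds a few percent.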

  6. Brain extraction in partial volumes T2*@7T by using a quasi-anatomic segmentation with bias field correction.

    Science.gov (United States)

    Valente, João; Vieira, Pedro M; Couto, Carlos; Lima, Carlos S

    2018-02-01

    Poor brain extraction in Magnetic Resonance Imaging (MRI) has negative consequences for several post-extraction processing steps, such as tissue segmentation and related statistical measures or pattern recognition algorithms. Current state-of-the-art algorithms for brain extraction work on T1- and T2-weighted images and are not adequate for non-whole-brain images such as T2*FLASH@7T partial volumes. This paper proposes two new methods that work directly on T2*FLASH@7T partial volumes. The first is an improvement of the semi-automatic threshold-with-morphology approach, adapted to incomplete volumes. The second method uses an improved version of a current implementation of the fuzzy c-means algorithm with bias correction for brain segmentation. Under high inhomogeneity conditions the performance of the first method degrades, requiring user intervention, which is unacceptable. The second method performed well for all volumes and is entirely automatic. State-of-the-art algorithms for brain extraction are mainly semi-automatic, requiring correct initialization by the user and knowledge of the software; they cannot deal with partial volumes and/or need atlas information that is not available for T2*FLASH@7T. Also, combined volumes suffer from manipulations such as re-sampling, which significantly degrades voxel intensity structure and makes segmentation tasks difficult. The proposed method overcomes all these difficulties, reaching good results for brain extraction using only T2*FLASH@7T volumes. The development of this work will lead to improved automatic segmentation of brain lesions in T2*FLASH@7T volumes, which becomes more important when lesions such as those of cortical multiple sclerosis need to be detected. Copyright © 2017 Elsevier B.V. All rights reserved.
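The clustering core of the fuzzy c-means family used by the second method can be sketched in a few lines. This minimal version omits the bias-field correction term the paper's implementation adds, and the deterministic quantile initialization is an assumption made here for reproducibility:

```python
import numpy as np

def fuzzy_cmeans(x, n_clusters=3, m=2.0, n_iter=50):
    """Minimal fuzzy c-means on a 1-D intensity vector.

    Alternates the two standard updates: fuzzy memberships
    u_ij = d_ij^(-2/(m-1)) / sum_k d_ik^(-2/(m-1)), then
    centers as membership-weighted means. No bias-field term.
    """
    x = np.asarray(x, dtype=float)
    # deterministic init: spread centers across the intensity range
    q = (np.arange(n_clusters) + 0.5) / n_clusters
    centers = np.quantile(x, q)
    for _ in range(n_iter):
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12  # avoid /0
        u = d ** (-2.0 / (m - 1))
        u /= u.sum(axis=1, keepdims=True)     # rows sum to 1
        um = u ** m
        centers = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
    return centers, u

# Three well-separated synthetic intensity modes (e.g. CSF / GM / WM-like):
x = np.concatenate([np.full(50, 10.0), np.full(50, 100.0), np.full(50, 200.0)])
centers, u = fuzzy_cmeans(x, n_clusters=3)
```

The bias-corrected variants modify the distance term so that a smooth multiplicative intensity field is estimated alongside the memberships, which is what makes the approach robust at 7T inhomogeneity levels.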

  7. [Extraction of buildings three-dimensional information from high-resolution satellite imagery based on Barista software].

    Science.gov (United States)

    Zhang, Pei-feng; Hu, Yuan-man; He, Hong-shi

    2010-05-01

    The demand for accurate and up-to-date spatial information on urban buildings is becoming more and more important for urban planning, environmental protection, and other fields. Today's commercial high-resolution satellite imagery offers the potential to extract three-dimensional information about urban buildings. This paper extracted the three-dimensional information of urban buildings from QuickBird imagery and validated the precision of the extraction based on the Barista software. It was shown that the extraction of three-dimensional building information from high-resolution satellite imagery based on Barista software has the advantages of modest demands on professional expertise, broad applicability, simple operation, and high precision. Point positioning and height determination accuracy at the one-pixel level could be achieved when the digital elevation model (DEM) and sensor orientation model were sufficiently precise and the off-nadir view angle was favorable.

  8. CONTRIBUTION TO THE DEVELOPMENT OF A SIMULATION SOFTWARE PERFORMANCE AND SHARING RATIO IN LIQUID-LIQUID EXTRACTION

    Directory of Open Access Journals (Sweden)

    A. Hadj Seyd

    2015-07-01

    The present work develops software to predict the yield and the distribution coefficient in liquid-liquid extraction of the components of a mixture, from mathematical models expressing these quantities based on the equilibrium equations between the liquid phases, and to predict the conditions under which the extraction operation is favorable, unfavorable, or impossible to realize. This is done by studying the variation of the quantities cited as a function of the parameters influencing the extraction: initial concentrations, solvent ratio, and pH, in the case of simple extraction (extraction of neutral products) or reactive extraction (extraction of complexes, acids, or bases), for one or more components. The programming language used is Delphi, a powerful object-oriented programming environment for Windows.
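For a single contact, the equilibrium relations such software evaluates reduce to simple closed forms. The formulas below are the standard single-stage mass balance and the weak-acid speciation correction, offered as a hedged sketch rather than the authors' actual models:

```python
def extraction_yield(D, solvent_ratio):
    """Fraction of solute recovered in one liquid-liquid contact.

    D             : distribution (partition) coefficient, C_org / C_aq
    solvent_ratio : V_org / V_aq

    The phase mass balance gives E = D*R / (1 + D*R).
    """
    dr = D * solvent_ratio
    return dr / (1.0 + dr)

def effective_D_weak_acid(D_neutral, pH, pKa):
    """pH-corrected distribution coefficient for a weak acid HA,
    assuming only the neutral form partitions into the organic phase:
    D_eff = D / (1 + 10**(pH - pKa)).
    """
    return D_neutral / (1.0 + 10.0 ** (pH - pKa))

# Example: D = 4 with equal phase volumes recovers 80% in one contact
y = extraction_yield(4.0, 1.0)
```

The pH correction captures the "reactive" case described above: above the pKa the acid ionizes, the effective D collapses, and the extraction becomes unfavorable, exactly the kind of condition the software is meant to flag.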

  9. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    Science.gov (United States)

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.
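Of the tool's three models, the simplest, linear regression on time, is easy to sketch with NumPy; the Holt-Winters variants add level, trend, and seasonal smoothing on top of the same idea. The monthly volumes below are fabricated for illustration:

```python
import numpy as np

def linear_trend_forecast(volumes, horizon):
    """Forecast future test volumes with ordinary least-squares
    linear regression on time, the simplest of the tool's models.

    volumes : historic per-period test counts
    horizon : number of future periods to predict
    """
    y = np.asarray(volumes, dtype=float)
    t = np.arange(y.size)
    slope, intercept = np.polyfit(t, y, 1)        # least-squares line
    future_t = np.arange(y.size, y.size + horizon)
    return intercept + slope * future_t

# A steadily growing monthly volume, +10 tests/month:
hist = [1000, 1010, 1020, 1030, 1040, 1050]
forecast = linear_trend_forecast(hist, 3)   # ≈ [1060, 1070, 1080]
```

Comparing such a forecast against realized volumes is the utilization-management check described above: a realized volume well below the projection suggests the initiative reduced demand.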

  10. Open-source software for demand forecasting of clinical laboratory test volumes using time-series analysis

    Directory of Open Access Journals (Sweden)

    Emad A Mohammed

    2017-01-01

    Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand.

  11. Improvement of productivity in low volume production industry layout by using witness simulation software

    Science.gov (United States)

    Jaffrey, V.; Mohamed, N. M. Z. N.; Rose, A. N. M.

    2017-10-01

    In almost all manufacturing industries, increased productivity and better efficiency of the production line are the most important goals. Most factories, especially small-scale ones, have little awareness of manufacturing system optimization and rely on traditional ways of management. Problems commonly identified in such factories are high labour idle time and low production output. This study was conducted in a Small and Medium Enterprise (SME) low volume production company. Data were collected and the problems affecting productivity and efficiency identified. In this study, the Witness simulation software is used to simulate the layout, with the output focusing on improving the layout in terms of productivity and efficiency. The layout is rearranged to reduce the travel time between workstations. The improved layout is then modelled, and the machine and labour statistics of both the original and improved layouts are taken. Productivity and efficiency are calculated for both layouts and then compared.

  12. CrossTalk. The Journal of Defense Software Engineering. Volume 26, Number 1

    Science.gov (United States)

    2013-02-01

    comprehensive software development process, which incorporates best practices as well as standards such as IEEE 12207-2008. The contractor will be ... [5], defines software quality as the degree to which software possesses a desired combination of attributes. Similarly, ISO/IEC 9126-1:2001 [6], one ... attributes of the quality characteristics defined in ISO/IEC 9126-1. It should be noted that the 9126-series is being revised as part of the Software Product ...

  13. Spacelab user implementation assessment study. (Software requirements analysis). Volume 2: Technical report

    Science.gov (United States)

    1976-01-01

    The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.

  14. Renal cortical volume measured using automatic contouring software for computed tomography and its relationship with BMI, age and renal function

    International Nuclear Information System (INIS)

    Muto, Natalia Sayuri; Kamishima, Tamotsu; Harris, Ardene A.; Kato, Fumi; Onodera, Yuya; Terae, Satoshi; Shirato, Hiroki

    2011-01-01

    Purpose: To evaluate the relationship between renal cortical volume, measured by an automatic contouring software, and body mass index (BMI), age, and renal function. Materials and methods: The study was performed in accordance with the institutional guidelines at our hospital. Sixty-four patients (34 men, 30 women), aged 19 to 79 years, had their CT scans for diagnosis or follow-up of hepatocellular carcinoma retrospectively examined on a computer workstation using software that automatically contours the renal cortex and the renal parenchyma. Body mass index and estimated glomerular filtration rate (eGFR) were calculated based on the data collected. Statistical analysis was done using the Student t-test, multiple regression analysis, and the intraclass correlation coefficient (ICC). Results: The ICCs for total renal and renal cortical volumes were 0.98 and 0.99, respectively. Renal volume measurements yielded a mean cortical volume of 105.8 cm³ ± 28.4 SD, mean total volume of 153 cm³ ± 39 SD, and mean medullary volume of 47.8 cm³ ± 19.5 SD. The correlations between body weight/height/BMI and both total renal and cortical volumes were r = 0.6, 0.6 and 0.4, respectively, p < 0.05, while the correlation between renal cortex and age was r = -0.3, p < 0.05. eGFR correlated with renal cortical volume, r = 0.6, p < 0.05. Conclusion: This study demonstrated that renal cortical volume had a moderate positive relationship with BMI, a moderate negative relationship with age, and a strong positive relationship with renal function, and provided a new method for routine volumetric assessment of the kidney.

  15. Human-system interface design review guideline -- Review software and user's guide: Final report. Revision 1, Volume 3

    International Nuclear Information System (INIS)

    1996-06-01

    NUREG-0700, Revision 1, provides human factors engineering (HFE) guidance to the US Nuclear Regulatory Commission staff for its: (1) review of the human-system interface (HSI) design submittals prepared by licensees or applicants for a license or design certification of commercial nuclear power plants, and (2) performance of HSI reviews that could be undertaken as part of an inspection or other type of regulatory review involving HSI design or incidents involving human performance. The guidance consists of a review process and HFE guidelines. The document describes those aspects of the HSI design review process that are important to the identification and resolution of human engineering discrepancies that could adversely affect plant safety. Guidance is provided that could be used by the staff to review an applicant's HSI design review process or to guide the development of an HSI design review plan, e.g., as part of an inspection activity. The document also provides detailed HFE guidelines for the assessment of HSI design implementations. NUREG-0700, Revision 1, consists of three stand-alone volumes. Volume 3 contains an interactive software application of the NUREG-0700, Revision 1, guidance and a user's guide for this software. The software supports reviewers during review preparation, evaluation design using the human factors engineering guidelines, and report preparation. The user's guide provides system requirements and installation instructions, detailed explanations of the software's functions and features, and a tutorial on using the software.

  16. Clinical Application of an Open-Source 3D Volume Rendering Software to Neurosurgical Approaches.

    Science.gov (United States)

    Fernandes de Oliveira Santos, Bruno; Silva da Costa, Marcos Devanir; Centeno, Ricardo Silva; Cavalheiro, Sergio; Antônio de Paiva Neto, Manoel; Lawton, Michael T; Chaddad-Neto, Feres

    2018-02-01

    Preoperative recognition of the anatomic individualities of each patient can help to achieve more precise and less invasive approaches. It also may help to anticipate potential complications and intraoperative difficulties. Here we describe the use, accuracy, and precision of a free tool for planning microsurgical approaches using 3-dimensional (3D) reconstructions from magnetic resonance imaging (MRI). We used the 3D volume rendering tool of a free open-source software program for 3D reconstruction of images of surgical sites obtained by MRI volumetric acquisition. We recorded anatomic reference points, such as the sulcus and gyrus, and vascularization patterns for intraoperative localization of lesions. Lesion locations were confirmed during surgery by intraoperative ultrasound and/or electrocorticography and later by postoperative MRI. Between August 2015 and September 2016, a total of 23 surgeries were performed using this technique for 9 low-grade gliomas, 7 high-grade gliomas, 4 cortical dysplasias, and 3 arteriovenous malformations. The technique helped delineate lesions with an overall accuracy of 2.6 ± 1.0 mm. 3D reconstructions were successfully performed in all patients, and images showed sulcus, gyrus, and venous patterns corresponding to the intraoperative images. All lesion areas were confirmed both intraoperatively and at the postoperative evaluation. With the technique described herein, it was possible to successfully perform 3D reconstruction of the cortical surface. This reconstruction tool may serve as an adjunct to neuronavigation systems or may be used alone when such a system is unavailable. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Guidelines for the verification and validation of expert system software and conventional software: User's manual. Volume 7

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report provides a step-by-step guide, or user manual, for personnel responsible for the planning and execution of the verification and validation (V&V), and developmental testing, of expert systems, conventional software systems, and various other types of artificial intelligence systems. While the guide was developed primarily for applications in the utility industry, it applies well to all industries. The user manual has three sections. In Section 1 the user assesses the stringency of V&V needed for the system under consideration, identifies the development stage the system is in, and identifies the component(s) of the system to be tested next. These three pieces of information determine which Guideline Package of V&V methods is most appropriate for those conditions. The V&V Guideline Packages are provided in Section 2. Each package consists of an ordered set of V&V techniques to be applied to the system, guides on choosing the review/evaluation team, measurement criteria, and references to a book or report which describes the application of the method. Section 3 presents details of 11 of the most important (or least well-explained in the literature) methods to assist the user in applying these techniques accurately.

  18. Guidelines for the verification and validation of expert system software and conventional software: User's manual. Volume 7

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    This report provides a step-by-step guide, or user manual, for personnel responsible for the planning and execution of the verification and validation (V&V), and developmental testing, of expert systems, conventional software systems, and various other types of artificial intelligence systems. While the guide was developed primarily for applications in the utility industry, it applies well to all industries. The user manual has three sections. In Section 1 the user assesses the stringency of V&V needed for the system under consideration, identifies the development stage the system is in, and identifies the component(s) of the system to be tested next. These three pieces of information determine which Guideline Package of V&V methods is most appropriate for those conditions. The V&V Guideline Packages are provided in Section 2. Each package consists of an ordered set of V&V techniques to be applied to the system, guides on choosing the review/evaluation team, measurement criteria, and references to a book or report which describes the application of the method. Section 3 presents details of 11 of the most important (or least well-explained in the literature) methods to assist the user in applying these techniques accurately.

  19. CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 9

    Science.gov (United States)

    2006-09-01

    activities to ISO/IEC 15288 system life cycle and ISO/IEC 12207 software life cycle processes. • Microsoft Security Development Lifecycle (SDL) [18, 19...Standardization/International Electrotechnical Commission (ISO/IEC) Standard 15026 System and Software Assurance, which adds security assurance...Software ProcessSM (TSPSM Secure) [21]. The CMM and ISO/IEC process models are defined at a higher level of abstraction than SDL and CLASP, which

  20. CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 8

    Science.gov (United States)

    2008-08-01

    distributed black-and-white copy to its current form and shape. It is my hope that CrossTalk continues to publish insightful articles for another 20...Free: The Art of Making Quality Certain. New York: Mentor, New American Library, 1979. 7. Humphrey, Watts S. Managing the Software Process. Reading, MA...managing editor. The jazz trio worked their magic attracting a broader software audience as they infused topics outside the original embedded software

  1. CrossTalk: The Journal of Defense Software Engineering. Volume 20, Number 8, August 2007

    National Research Council Canada - National Science Library

    2007-01-01

    ... organizations in multiple countries. Next, Nelson Perez and Earnest Ambrose relate their story of successful software process improvement in "Lessons Learned in Using Agile Methods for Process Improvement...

  2. CrossTalk: The Journal of Defense Software Engineering. Volume 20, Number 2

    National Research Council Canada - National Science Library

    Phillips, Mike; Craig, Rushby; Jackelen, George; Humphrey, Watts S; Konrad, Michael D; Over, James W; Pries-Heje, Jan; Johansen, Joern; Christiansen, Mads; Korsaa, Morten; Laporte, Claude Y; April, Alain; Renault, Alain

    2007-01-01

    ...: This article describes how the 309th Software Maintenance Group used Standard Capability Maturity Model Integration Appraisal Method for Process Improvement B to identify opportunities for additional...

  3. Network, system, and status software enhancements for the autonomously managed electrical power system breadboard. Volume 3: Commands specification

    Science.gov (United States)

    Mckee, James W.

    1990-01-01

    This volume (3 of 4) contains the specification of the command language for the AMPS system. The volume contains a requirements specification for the operating system and commands and a design specification for the operating system and commands. The operating system and commands sit on top of the protocol. The commands are an extension of the present set of AMPS commands in that they are more compact, allow multiple sub-commands to be bundled into one command, and have provisions for identifying the sender and the intended receiver. The commands make no change to the actual software that implements them.
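
As a sketch of the bundling and addressing features described above (the field names and wire format here are invented for illustration; the actual syntax is defined in the commands specification itself):

```python
from dataclasses import dataclass, field

@dataclass
class Command:
    """Hypothetical compact command frame: one sender, one intended
    receiver, and several sub-commands bundled into a single message."""
    sender: str
    receiver: str
    subcommands: list = field(default_factory=list)

    def encode(self) -> str:
        # Invented compact wire form: "sender>receiver:sub1;sub2;..."
        return f"{self.sender}>{self.receiver}:" + ";".join(self.subcommands)

cmd = Command("ground", "amps-node-1", ["OPEN RELAY 4", "READ BUS A"])
print(cmd.encode())  # ground>amps-node-1:OPEN RELAY 4;READ BUS A
```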

  4. Server-based enterprise collaboration software improves safety and quality in high-volume PET/CT practice.

    Science.gov (United States)

    McDonald, James E; Kessler, Marcus M; Hightower, Jeremy L; Henry, Susan D; Deloney, Linda A

    2013-12-01

    With increasing volumes of complex imaging cases and rising economic pressure on physician staffing, timely reporting will become progressively challenging. Current and planned iterations of PACS and electronic medical record systems do not offer workflow management tools to coordinate delivery of imaging interpretations with the needs of the patient and ordering physician. The adoption of a server-based enterprise collaboration software system by our Division of Nuclear Medicine has significantly improved our efficiency and quality of service.

  5. High integrity software for nuclear power plants: Candidate guidelines, technical basis and research needs. Executive summary: Volume 1

    International Nuclear Information System (INIS)

    Seth, S.; Bail, W.; Cleaves, D.; Cohen, H.; Hybertson, D.; Schaefer, C.; Stark, G.; Ta, A.; Ulery, B.

    1995-06-01

    The work documented in this report was performed in support of the US Nuclear Regulatory Commission to examine the technical basis for candidate guidelines that could be considered in reviewing and evaluating high integrity computer software used in the safety systems of nuclear power plants. The framework for the work consisted of the following software development and assurance activities: requirements specification; design; coding; verification and validation, including static analysis and dynamic testing; safety analysis; operation and maintenance; configuration management; quality assurance; and planning and management. Each activity (framework element) was subdivided into technical areas (framework subelements). The report describes the development of approximately 200 candidate guidelines that span the entire range of software life-cycle activities; the assessment of the technical basis for those candidate guidelines; and the identification, categorization and prioritization of research needs for improving the technical basis. The report has two volumes: Volume 1, Executive Summary, includes an overview of the framework and of each framework element, the complete set of candidate guidelines, the results of the assessment of the technical basis for each candidate guideline, and a discussion of research needs that support the regulatory function; Volume 2 is the main report

  6. Guidelines for the verification and validation of expert system software and conventional software: Survey and assessment of conventional software verification and validation methods. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Groundwater, E.H.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)]

    1995-03-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the identified components of knowledge-based and expert systems, as well as for the system as a whole.
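
The report's Cost-Benefit and Effectiveness metrics are built from the eight rating factors; the exact formulas are defined in the volume itself. A generic weighted-score ranking of this kind might look as follows (the factor names, weights, and ratings below are placeholders, not the report's values):

```python
# Rank V&V methods by a weighted score over rating factors.
# All names and numbers here are illustrative only.
methods = {
    "static analysis":  {"ease_of_use": 3, "defect_power": 5},
    "code walkthrough": {"ease_of_use": 4, "defect_power": 3},
    "branch testing":   {"ease_of_use": 2, "defect_power": 4},
}
weights = {"ease_of_use": 0.4, "defect_power": 0.6}

def score(ratings):
    """Weighted sum of a method's factor ratings."""
    return sum(weights[f] * r for f, r in ratings.items())

ranked = sorted(methods, key=lambda m: score(methods[m]), reverse=True)
print(ranked)  # ['static analysis', 'code walkthrough', 'branch testing']
```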

  7. Guidelines for the verification and validation of expert system software and conventional software: Survey and assessment of conventional software verification and validation methods. Volume 2

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Groundwater, E.H.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the identified components of knowledge-based and expert systems, as well as for the system as a whole.

  8. CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 4

    National Research Council Canada - National Science Library

    Jones, Capers; Henderson, Kym; Zwikael, Ofer; Lipke, Walt; Coe, David J; Premeaux, David; Armour, Phillip G

    2008-01-01

    CONTENTS: 1) Software Tracking: The Last Defense Against Failure by Capers Jones: This article concentrates on four worst practices and the factors that most often lead to failure and litigation and gives advice on how to avoid them. 2...

  9. CrossTalk: The Journal of Defense Software Engineering. Volume 20, Number 12

    National Research Council Canada - National Science Library

    Jones, Capers; Huff, Lloyd; Novak, George; Lau, Yun-Tung; Torri, Stephen; Sanders, Derek; Hamilton, Drew; Evans, Gordon; Frost, Alison A; Campo, Michael J

    2007-01-01

    CONTENTS: 1) Geriatric Issues of Aging Software by Capers Jones: Capers Jones discusses the need for every company to evaluate and consider best practices for maintenance and to avoid common worst practices. 2...

  10. CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 6

    National Research Council Canada - National Science Library

    Jones, Capers; Jost, Alan C; Perkins, Timothy K; Fleming, Quentin W; Koppelman, Joel M; Lipke, Walt; Olson, Timothy G; Kimmerly, Paul

    2006-01-01

    "Social and Technical Reasons for Software Project Failures," by Capers Jones -- Applying a careful program of risk analysis and risk abatement can lower the effects of the technical and social issues...

  11. Crosstalk: The Journal of Defense Software Engineering. Volume 22, Number 2, February 2009

    Science.gov (United States)

    2009-02-01

    possible system attacks. by Ron Greenfield and Dr. Charley Tichenor Enforcing Static Program Properties to Enable Safety-Critical Use of Java Software...Assurance by Ron Greenfield and Dr. Charley Tichenor, and Dr. Kelvin Nilsen’s Enforcing Static Program Properties in Safety-Critical Java Software Components...01&lang=en>. 5. Shakespeare, William. The Tempest. 6. Intel. “How Chips are Made.” 2008 <www.intel.com/ education /making chips/preparation.htm>. 7

  12. Achieving Better Buying Power through Acquisition of Open Architecture Software Systems: Volume 1

    Science.gov (United States)

    2016-01-06

    markets [GuW12]. We thus believe our complementary research places us at an extraordinary advantage to conduct the proposed study that addresses a major...supporting “Bring Your Own Devices” (BYOD)? 22 New business models for OA software components ● Franchising ● Enterprise licensing ● Metered usage...paths IP and cybersecurity requirements will need continuous attention! 35 New business models for OA software components ● Franchising ● Enterprise

  13. Base excision repair efficiency and mechanism in nuclear extracts are influenced by the ratio between volume of nuclear extraction buffer and nuclei—Implications for comparative studies

    International Nuclear Information System (INIS)

    Akbari, Mansour; Krokan, Hans E.

    2012-01-01

    Highlights: • We examine the effect of the volume of extraction buffer relative to the volume of isolated nuclei on the repair activity of nuclear extracts. • The base excision repair activity of nuclear extracts prepared from the same batch and number of nuclei varies inversely with the volume of nuclear extraction buffer. • The effect of the volume of extraction buffer on BER activity of nuclear extracts can only be partially reversed by concentrating the more diluted extract by ultrafiltration. - Abstract: The base excision repair (BER) pathway corrects many different DNA base lesions and is important for genomic stability. The mechanism of BER cannot easily be investigated in intact cells, and therefore in vitro methods that reflect the in vivo processes are in high demand. Reconstitution of BER using purified proteins essentially mirrors the properties of the proteins used and does not necessarily reflect the mechanism as it occurs in the cell. Nuclear extracts from cultured cells have the capacity to carry out complete BER and can give important information on the mechanism. Furthermore, candidate proteins in extracts can be inhibited or depleted in a controlled way, making defined extracts an important source for mechanistic studies. The major drawback is that there is no standardized method of preparing nuclear extract for BER studies, and it does not appear to be a topic given much attention. Here we have examined the BER activity of nuclear extracts from HeLa cells, using as substrate a circular DNA molecule with either uracil or an AP-site in a defined position. We show that the BER activity of nuclear extracts from the same batch of cells varies inversely with the volume of nuclear extraction buffer relative to nuclei volume, in spite of identical protein concentrations in the BER assay mixture. Surprisingly, the uracil–DNA glycosylase activity (mainly UNG2), but not the amount of UNG2, also correlated negatively with the volume of extraction buffer. These studies demonstrate

  14. Guidelines for the verification and validation of expert system software and conventional software: Volume 5, Rationale and description of verification and validation guideline packages and procedures. Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by the US NRC and EPRI toward formulating guidelines for V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves (and the accompanying 11 step-by-step procedures) are presented in Volume 7, User's Manual. Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation), (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software), and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity to form three Classes). A V&V guideline package is provided for each of the combinations of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The guidelines can apply to conventional procedural software systems as well as all kinds of AI systems.

  15. Guidelines for the verification and validation of expert system software and conventional software: Rationale and description of V ampersand V guideline packages and procedures. Volume 5

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by the U.S. Nuclear Regulatory Commission and the Electric Power Research Institute toward the objective of formulating Guidelines for the V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves are presented in Volume 7, "User's Manual." Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation); (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software); and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity to form three Classes). A V&V Guideline package is provided for each of the combinations of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the Guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The Guidelines can apply to conventional procedural software systems as well as all kinds of AI systems.

  16. Automatic Extraction of Myocardial Mass and Volume Using Parametric Images from Dynamic Nongated PET.

    Science.gov (United States)

    Harms, Hendrik Johannes; Stubkjær Hansson, Nils Henrik; Tolbod, Lars Poulsen; Kim, Won Yong; Jakobsen, Steen; Bouchelouche, Kirsten; Wiggers, Henrik; Frøkiaer, Jørgen; Sörensen, Jens

    2016-09-01

    Dynamic cardiac PET is used to quantify molecular processes in vivo. However, measurements of left ventricular (LV) mass and volume require electrocardiogram-gated PET data. The aim of this study was to explore the feasibility of measuring LV geometry using nongated dynamic cardiac PET. Thirty-five patients with aortic-valve stenosis and 10 healthy controls underwent a 27-min 11C-acetate PET/CT scan and cardiac MRI (CMR). The controls were scanned twice to assess repeatability. Parametric images of uptake rate K1 and the blood pool were generated from nongated dynamic data. Using software-based structure recognition, the LV wall was automatically segmented from K1 images to derive functional assessments of LV mass (mLV) and wall thickness. End-systolic and end-diastolic volumes were calculated using blood pool images and applied to obtain stroke volume and LV ejection fraction (LVEF). PET measurements were compared with CMR. High, linear correlations were found for LV mass (r = 0.95), end-systolic volume (r = 0.93), and end-diastolic volume (r = 0.90), and slightly lower correlations were found for stroke volume (r = 0.74), LVEF (r = 0.81), and thickness (r = 0.78). Bland-Altman analyses showed significant differences for mLV and thickness only and an overestimation for LVEF at lower values. Intra- and interobserver correlations were greater than 0.95 for all PET measurements. PET repeatability accuracy in the controls was comparable to CMR. LV mass and volume are accurately and automatically generated from dynamic 11C-acetate PET without electrocardiogram gating. This method can be incorporated in a standard routine without any additional workload and can, in theory, be extended to other PET tracers. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
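
The Bland-Altman analysis used above to compare PET against CMR reduces to the mean difference (bias) and its 95% limits of agreement. A standard NumPy sketch of that computation, on invented example numbers rather than the study's data:

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two measurement methods."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)                      # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Toy paired measurements (e.g., LV mass by PET vs. CMR, in grams).
pet = [150, 160, 145, 170, 155]
cmr = [148, 158, 150, 165, 152]
bias, (lo, hi) = bland_altman(pet, cmr)
print(round(bias, 1))  # 1.4
```

A nonzero bias alone is not "significant"; the study's conclusion about mLV and thickness would come from testing the differences against zero in addition to inspecting the limits of agreement.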

  17. CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 11

    Science.gov (United States)

    2006-11-01

    8>. 7. Wallace, Delores R. Practical Software Reliability Modeling. Proc. of the 26th Annual NASA Goddard Software Engineering Workshop, Nov. 2001...STAR WARS TO STAR TREK To Request Back Issues on Topics Not Listed Above, Please Contact <stsc.customerservice@hill.af.mil>. About the Authors Kym

  18. CrossTalk. The Journal of Defense Software Engineering. Volume 25, Number 2

    Science.gov (United States)

    2012-04-01

    both editorial oversight and technical review of the journal. CrossTalk’s mission is to encourage the engineering development of software to improve...sending e-mail. (Robertson, 2011) Mobile Workers and related products - Telecommuting -- the home office - Pressure to provide tools and access to

  19. CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 12, December 2006

    Science.gov (United States)

    2006-12-01

    process. The industry average for project investment in its requirements process is 3 percent of project costs; data from NASA shows that when 8-14...requirements error is a defect that is discovered in delivered code that is a result of a requirement statement. Data from NASA provided by requirements...CENTRICITY AUG2006 c ADA 2005 SEPT2006 c SOFTWARE ASSURANCE OCT2006 c STAR WARS TO STAR TREK NOV2006 c MANAGEMENT BASICS To Request Back Issues on Topics

  20. CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 6

    Science.gov (United States)

    2005-06-01

    Bollinger The MITRE Corporation1 Progress brings new dangers: Powerful home computers, inexpensive high-speed Internet access, telecommuting, and software...been using for years. Sure enough, GIANT was able to finish removing the hard-core spyware. At some point, my Internet security package had been... The sad truth is that if you do nothing more than attach a Windows PC to the Internet over a high-speed line, it will be subjected to the first

  1. Computer-Aided Design for Built-In-Test (CADBIT) - Software Specification. Volume 3

    Science.gov (United States)

    1989-10-01

    [OCR residue of a figure and data sheet; text largely unrecoverable. Legible fragments: Figure 3-13, tutorial figure placement in CAD environment; "...have software package for reliability calculation"; LIBRARY ELEMENT DATA SHEET -- BIT TECHNIQUE: ON-BOARD ROM, CATEGORY: LONG TUTORIAL, page 5 of 14.]

  2. Crosstalk: The Journal of Defense Software Engineering. Volume 22, Number 4

    Science.gov (United States)

    2009-06-01

    to measure impacted functions not covered by the International Function Point User’s Group (IFPUG) Counting Practices Manual (CPM) 4.2. If you are...foggy on the ins and outs of IFPUG or CPM 4.2, or just want a refresher, Total Metrics has simplified the concepts down to an easily understood two...information from your organization’s perspective with the software defense industry. • Dedicated space in each issue. • Advertisements ranging from a full

  3. WebPlotDigitizer, a polyvalent and free software to extract spectra from old astronomical publications: application to ultraviolet spectropolarimetry

    Science.gov (United States)

    Marin, F.; Rohatgi, A.; Charlot, S.

    2017-12-01

    In this contribution, we present WebPlotDigitizer, a versatile and free software package developed to facilitate easy and accurate data extraction from a variety of plot types. We describe the numerous features of this numerical tool and present its relevance when applied to astrophysical archival research. We exploit WebPlotDigitizer to extract ultraviolet spectropolarimetric spectra from old publications that used the Hubble Space Telescope, Lick Observatory 3 m Shane telescope and Astro-2 mission to observe the Seyfert-2 AGN NGC 1068. By doing so, we compile all the existing ultraviolet polarimetric data on NGC 1068 to prepare the ground for further investigations with the future high-resolution spectropolarimeter POLLUX on board the proposed Large UV/Optical/Infrared Surveyor (LUVOIR) NASA mission.
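
At its core, a plot digitizer such as WebPlotDigitizer maps pixel coordinates back into data coordinates using user-supplied calibration points on each axis. A minimal sketch for linear axes (WebPlotDigitizer itself also supports log, polar, and map axes; all calibration numbers below are invented):

```python
def make_axis(pixel1, value1, pixel2, value2):
    """Linear pixel->data mapping built from two calibration points."""
    scale = (value2 - value1) / (pixel2 - pixel1)
    return lambda px: value1 + (px - pixel1) * scale

# Calibrate: x-axis pixels 100..900 span 2000..8000 (wavelength, Angstroms);
# y-axis pixels 500..50 span 0..2 (polarization %) -- y grows upward on screen.
to_wavelength = make_axis(100, 2000.0, 900, 8000.0)
to_polarization = make_axis(500, 0.0, 50, 2.0)

print(to_wavelength(500))    # 5000.0
print(to_polarization(275))  # 1.0
```

Each clicked data point on the scanned figure is then converted by applying both mappings to its (x, y) pixel position, which is how a spectrum can be recovered from a decades-old published plot.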

  4. Comparison of automatic quantification software for the measurement of ventricular volume and ejection fraction in gated myocardial perfusion SPECT

    International Nuclear Information System (INIS)

    Van Staden, J.A.; Herbst, C.P.; Du Raan, H.; Lotter, M.G.; Otto, A.C.

    2004-01-01

    Full text: Introduction: Gated myocardial perfusion SPECT has been used to calculate left ventricular ejection fraction (LVEF) and left ventricular end-diastolic volume (LVEDV) and has correlated well with conventional methods. However, the comparative accuracy of and correlations across various types of gated SPECT software are not well understood. Materials and methods: Twelve patients participated in a radionuclide gated blood-pool (GBP) study in addition to undergoing 99mTc-sestamibi gated SPECT. Three different software algorithms, Quantitative Gated SPECT (QGS) from Cedars-Sinai, MultiDim from Stanford University Medical School, and GQUANT from Alfa Nuclear, were used to compute LVEF and LVEDV. These software algorithms operate in 3-dimensional space, two dependent on surface detection and the other on statistical parameters. The LVEF calculated from gated SPECT myocardial perfusion images was compared with LVEF calculated from the GBP studies in the same patients to assess the accuracy of the three software algorithms. Results: The software success rate was 92% (11/12 patients) for MultiDim and 100% for QGS and GQUANT. Agreement among LVEF measured with MultiDim, QGS, and GQUANT was excellent (LVEF-MultiDim = 0.80 LVEF-QGS + 5.02, r = 0.93; LVEF-GQUANT = 1.10 LVEF-MultiDim - 1.33, r = 0.90; and LVEF-GQUANT = 1.02 LVEF-QGS - 1.40, r = 0.96). The correlation coefficient for LVEF between gated SPECT and the GBP study was 0.95, 0.95 and 0.97 for MultiDim, GQUANT and QGS, respectively. Conclusion: All three software programs showed good correlation between LVEF for gated SPECT and the GBP study. Good agreement for LVEF was also observed among the three software algorithms. However, because each method has unique characteristics that depend on its specific algorithm and thus behaves differently in the various patients, the methods should not be used interchangeably. (author)
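
The pairwise agreement figures quoted above (slope, intercept, and r between software packages) are ordinary least-squares fits on paired LVEF values. With NumPy and invented example data the computation reduces to:

```python
import numpy as np

def agreement(x, y):
    """Least-squares slope/intercept and Pearson r between two LVEF series."""
    slope, intercept = np.polyfit(x, y, 1)
    r = np.corrcoef(x, y)[0, 1]
    return slope, intercept, r

# Toy paired ejection fractions (%) from two software packages.
qgs    = np.array([35, 45, 55, 60, 70], float)
gquant = np.array([34, 46, 54, 62, 69], float)
slope, intercept, r = agreement(qgs, gquant)
print(round(r, 2))  # 0.99
```

High r values like these show the methods track each other, but, as the authors note, a slope or intercept away from the identity line still means absolute values differ between packages, which is why they should not be mixed within a patient's follow-up.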

  5. GENII: The Hanford Environmental Radiation Dosimetry Software System: Volume 1, Conceptual representation

    International Nuclear Information System (INIS)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-12-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). The purpose of this coupled system of computer codes is to analyze environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil. This is accomplished by calculating radiation doses to individuals or populations. GENII is described in three volumes of documentation. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. The third volume is a Code Maintenance Manual for the user who requires knowledge of code detail. It includes code logic diagrams, global dictionary, worksheets, example hand calculations, and listings of the code and its associated data libraries. 72 refs., 15 figs., 34 tabs

  6. GENII: The Hanford Environmental Radiation Dosimetry Software System: Volume 1, Conceptual representation

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-12-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). The purpose of this coupled system of computer codes is to analyze environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil. This is accomplished by calculating radiation doses to individuals or populations. GENII is described in three volumes of documentation. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. The third volume is a Code Maintenance Manual for the user who requires knowledge of code detail. It includes code logic diagrams, global dictionary, worksheets, example hand calculations, and listings of the code and its associated data libraries. 72 refs., 15 figs., 34 tabs.

  7. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume II. Software description and listings

    International Nuclear Information System (INIS)

    Ruhter, W.D.

    1984-05-01

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV 241Pu and 208-keV 237U peaks are required for spectral analysis that gives plutonium isotopic ratios and weight percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operation instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings.

  8. Automatic extraction of forward stroke volume using dynamic PET/CT

    DEFF Research Database (Denmark)

    Harms, Hans; Tolbod, Lars Poulsen; Hansson, Nils Henrik Stubkjær

    2015-01-01

    Background: The aim of this study was to develop and validate an automated method for extracting forward stroke volume (FSV) using indicator dilution theory directly from dynamic positron emission tomography (PET) studies for two different tracers and scanners. Methods: 35 subjects underwent...... a dynamic 11C-acetate PET scan on a Siemens Biograph TruePoint-64 PET/CT (scanner I). In addition, 10 subjects underwent both dynamic 15O-water PET and 11C-acetate PET scans on a GE Discovery-ST PET/CT (scanner II). The left ventricular (LV)-aortic time-activity curve (TAC) was extracted automatically...... from PET data using cluster analysis. The first-pass peak was isolated by automatic extrapolation of the downslope of the TAC. FSV was calculated as the injected dose divided by the product of heart rate and the area under the curve of the first-pass peak. Gold standard FSV was measured using phase...
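The FSV calculation summarized in this abstract (injected dose divided by the product of heart rate and the first-pass area under the curve) can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the authors' code: the function name, the units, and the simple mono-exponential downslope fit used to extrapolate the first-pass peak are all assumptions.

```python
import numpy as np

def forward_stroke_volume(t, tac, injected_dose_kbq, heart_rate_bpm):
    """Estimate forward stroke volume (mL/beat) by indicator dilution.

    t ................. frame mid-times in seconds
    tac ............... LV-aortic activity concentration in kBq/mL
    injected_dose_kbq . injected dose in kBq
    heart_rate_bpm .... heart rate in beats per minute
    """
    t = np.asarray(t, dtype=float)
    tac = np.asarray(tac, dtype=float)

    # Isolate the first-pass peak: fit a mono-exponential to the downslope
    # (peak down to ~50% of peak) and extrapolate it over the tail, so that
    # recirculating tracer is excluded from the area under the curve.
    i_peak = int(np.argmax(tac))
    n_down = int(np.searchsorted(-tac[i_peak:], -0.5 * tac[i_peak]))
    fit = slice(i_peak, i_peak + max(n_down, 2))
    slope, intercept = np.polyfit(t[fit], np.log(tac[fit]), 1)
    first_pass = tac.copy()
    first_pass[i_peak:] = np.exp(intercept + slope * t[i_peak:])

    # Trapezoidal area under the first-pass curve, in (kBq/mL)*s.
    auc = float(np.sum((first_pass[1:] + first_pass[:-1]) * np.diff(t)) / 2.0)

    flow_ml_per_s = injected_dose_kbq / auc          # cardiac output, mL/s
    return flow_ml_per_s / (heart_rate_bpm / 60.0)   # mL per beat
```

At 60 beats/min the result reduces to dose/AUC, which makes the indicator-dilution relationship (flow = dose / concentration-time integral) easy to verify on synthetic data.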

  9. Automatic extraction of forward stroke volume using dynamic 11C-acetate PET/CT

    DEFF Research Database (Denmark)

    Harms, Hans; Tolbod, Lars Poulsen; Hansson, Nils Henrik

    Objectives: Dynamic PET with 11C-acetate can be used to quantify myocardial blood flow and oxidative metabolism, the latter of which is used to calculate myocardial external efficiency (MEE). Calculation of MEE requires forward stroke volume (FSV) data. FSV is affected by cardiac loading conditions......, potentially introducing bias if measured with a separate modality. The aim of this study was to develop and validate methods for automatically extracting FSV directly from the dynamic PET used for measuring oxidative metabolism. Methods: 16 subjects underwent a dynamic 27 min PET scan on a Siemens Biograph...... TruePoint 64 PET/CT scanner after bolus injection of 399±27 MBq of 11C-acetate. The LV-aortic time-activity curve (TAC) was extracted automatically from dynamic PET data using cluster analysis. The first-pass peak was derived by automatic extrapolation of the down-slope of the TAC. FSV...

  10. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    International Nuclear Information System (INIS)

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M.

    1995-03-01

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  11. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  12. High integrity software for nuclear power plants: Candidate guidelines, technical basis and research needs. Main report, Volume 2

    International Nuclear Information System (INIS)

    Seth, S.; Bail, W.; Cleaves, D.; Cohen, H.; Hybertson, D.; Schaefer, C.; Stark, G.; Ta, A.; Ulery, B.

    1995-06-01

    The work documented in this report was performed in support of the US Nuclear Regulatory Commission to examine the technical basis for candidate guidelines that could be considered in reviewing and evaluating high integrity computer software, covering the following software development and assurance activities: requirements specification; design; coding; verification and validation, including static analysis and dynamic testing; safety analysis; operation and maintenance; configuration management; quality assurance; and planning and management. Each activity (framework element) was subdivided into technical areas (framework subelements). The report describes the development of approximately 200 candidate guidelines that span the entire range of these activities; the identification, categorization, and prioritization of the technical basis for those candidate guidelines; and the identification, categorization, and prioritization of research needs for improving the technical basis. The report has two volumes: Volume 1, Executive Summary, includes an overview of the framework and of each framework element, the complete set of candidate guidelines, the results of the assessment of the technical basis for each candidate guideline, and a discussion of research needs that support the regulatory function; this document, Volume 2, is the main report.

  13. High integrity software for nuclear power plants: Candidate guidelines, technical basis and research needs. Main report, Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Seth, S.; Bail, W.; Cleaves, D.; Cohen, H.; Hybertson, D.; Schaefer, C.; Stark, G.; Ta, A.; Ulery, B. [Mitre Corp., McLean, VA (United States)

    1995-06-01

    The work documented in this report was performed in support of the US Nuclear Regulatory Commission to examine the technical basis for candidate guidelines that could be considered in reviewing and evaluating high integrity computer software, covering the following software development and assurance activities: requirements specification; design; coding; verification and validation, including static analysis and dynamic testing; safety analysis; operation and maintenance; configuration management; quality assurance; and planning and management. Each activity (framework element) was subdivided into technical areas (framework subelements). The report describes the development of approximately 200 candidate guidelines that span the entire range of these activities; the identification, categorization, and prioritization of the technical basis for those candidate guidelines; and the identification, categorization, and prioritization of research needs for improving the technical basis. The report has two volumes: Volume 1, Executive Summary, includes an overview of the framework and of each framework element, the complete set of candidate guidelines, the results of the assessment of the technical basis for each candidate guideline, and a discussion of research needs that support the regulatory function; this document, Volume 2, is the main report.

  14. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  15. Automatic extraction of forward stroke volume using dynamic PET/CT

    DEFF Research Database (Denmark)

    Harms, Hans; Tolbod, Lars Poulsen; Hansson, Nils Henrik

    Background: Dynamic PET can be used to extract forward stroke volume (FSV) by the indicator dilution principle. The technique employed can be automated and is in theory independent of the tracer used and may therefore be added to any dynamic cardiac PET protocol. The aim of this study...... was to validate automated methods for extracting FSV directly from dynamic PET studies for two different tracers and to examine potential scanner hardware bias. Methods: 21 subjects underwent a dynamic 27 min 11C-acetate PET scan on a Siemens Biograph TruePoint 64 PET/CT scanner (scanner I). In addition, 8...... subjects underwent a dynamic 6 min 15O-water PET scan followed by a 27 min 11C-acetate PET scan on a GE Discovery ST PET/CT scanner (scanner II). The LV-aortic time-activity curve (TAC) was extracted automatically from dynamic PET data using cluster analysis. The first-pass peak was isolated by automatic...

  16. The Environment for Application Software Integration and Execution (EASIE), version 1.0. Volume 2: Program integration guide

    Science.gov (United States)

    Jones, Kennie H.; Randall, Donald P.; Stallcup, Scott S.; Rowell, Lawrence F.

    1988-01-01

    The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational data base management system. In Volume 2, a SYSTEM LIBRARY PROCESSOR is used to construct a DATA DICTIONARY describing all relations defined in the data base, and a TEMPLATE LIBRARY. A TEMPLATE is a description of all subsets of relations (including conditional selection criteria and sorting specifications) to be accessed as input or output for a given application. Together, these form the SYSTEM LIBRARY, which is used to automatically produce the data base schema, FORTRAN subroutines to retrieve/store data from/to the data base, and instructions to a generic REVIEWER program providing review/modification of data for a given template. Automation of these functions eliminates much of the tedious, error-prone work required by the usual approach to data base integration.

  17. Extracting Metrics for Three-dimensional Root Systems: Volume and Surface Analysis from In-soil X-ray Computed Tomography Data.

    Science.gov (United States)

    Suresh, Niraj; Stephens, Sean A; Adams, Lexor; Beck, Anthon N; McKinney, Adriana L; Varga, Tamas

    2016-04-26

    Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as processes with important implications to climate change and crop management. Quantitative size information on roots in their native environment is invaluable for studying root growth and environmental processes involving plants. X-ray computed tomography (XCT) has been demonstrated to be an effective tool for in situ root scanning and analysis. We aimed to develop a cost-free and efficient tool that approximates the surface and volume of the root regardless of its shape from three-dimensional (3D) tomography data. The root structure of a Prairie dropseed (Sporobolus heterolepis) specimen was imaged using XCT. The root was reconstructed, and the primary root structure was extracted from the data using a combination of licensed and open-source software. An isosurface polygonal mesh was then created for ease of analysis. We have developed the standalone application imeshJ, generated in MATLAB, to calculate root volume and surface area from the mesh. The outputs of imeshJ are surface area (in mm²) and volume (in mm³). The process, utilizing a unique combination of tools from imaging to quantitative root analysis, is described. A combination of XCT and open-source software proved to be a powerful combination to noninvasively image plant root samples, segment root data, and extract quantitative information from the 3D data. This methodology of processing 3D data should be applicable to other material/sample systems where there is connectivity between components of similar X-ray attenuation and difficulties arise with segmentation.
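The surface-and-volume computation that a mesh tool such as imeshJ performs can be illustrated with a short divergence-theorem sketch. This is a generic calculation on a closed triangle mesh, not the imeshJ source (which the abstract says is MATLAB); Python is used here for brevity.

```python
import numpy as np

def mesh_surface_and_volume(vertices, faces):
    """Surface area and enclosed volume of a closed triangle mesh.

    vertices: (N, 3) array of coordinates; faces: (M, 3) array of vertex
    indices with consistent winding. The volume follows from the divergence
    theorem: each face spans a signed tetrahedron with the origin, and the
    signed volumes sum to the enclosed volume.
    """
    v = np.asarray(vertices, dtype=float)
    f = np.asarray(faces, dtype=int)
    a, b, c = v[f[:, 0]], v[f[:, 1]], v[f[:, 2]]
    # Triangle area = half the norm of the edge cross product.
    area = 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1).sum()
    # Signed tetrahedron volume = a . (b x c) / 6, summed over all faces.
    volume = abs(np.einsum('ij,ij->i', a, np.cross(b, c)).sum()) / 6.0
    return area, volume
```

On a unit right tetrahedron this returns the textbook values (area 1.5 + √3/2, volume 1/6), which is a convenient sanity check before applying the same formula to an isosurface mesh.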

  18. Maintenance Manual for AUDIT. A System for Analyzing SESCOMP Software. Volume 4: Appendix D. Listings of the AUDIT Software for the IBM 360.

    Science.gov (United States)

    1977-08-01

    The AUDIT documentation provides the maintenance programmer personnel with the information to effectively maintain and use the AUDIT software. The ...SESCOMPSPEC’s) and produces reports detailing the deviations from those standards. The AUDIT software also examines a program unit to detect and report...changes in word length on the output of computer programs. This report contains the listings of the AUDIT software for the IBM 360. (Author)

  19. Maintenance Manual for AUDIT. A System for Analyzing SESCOMP Software. Volume 3: Appendix C - Listings of the AUDIT Software for the UNIVAC 1108.

    Science.gov (United States)

    1977-08-01

    The AUDIT documentation provides the maintenance programmer personnel with the information to effectively maintain and use the AUDIT software. The ...SESCOMPSPEC’s) and produces reports detailing the deviations from those standards. The AUDIT software also examines a program unit to detect and report...changes in word length on the output of computer programs. This report contains the listings of the AUDIT software for the UNIVAC 1108. (Author)

  20. Maintenance Manual for AUDIT. A System for Analyzing SESCOMP Software. Volume 2: Appendix B. Listings of the Audit Software for the CDC 6000.

    Science.gov (United States)

    1977-08-01

    The AUDIT documentation provides the maintenance programmer personnel with the information to effectively maintain and use the AUDIT software. The ...SESCOMPSPEC’s) and produces reports detailing the deviations from those standards. The AUDIT software also examines a program unit to detect and report...changes in word length on the output of computer programs. This report contains the listings of the AUDIT software for the CDC 6000. (Author)

  1. A high-throughput platform for low-volume high-temperature/pressure sealed vessel solvent extractions

    International Nuclear Information System (INIS)

    Damm, Markus; Kappe, C. Oliver

    2011-01-01

    Highlights: ► Parallel low-volume coffee extractions in sealed-vessel HPLC/GC vials. ► Extractions are performed at high temperatures and pressures (200 °C/20 bar). ► Rapid caffeine determination from the liquid phase. ► Headspace analysis of volatiles using solid-phase microextraction (SPME). - Abstract: A high-throughput platform for performing parallel solvent extractions in sealed HPLC/GC vials inside a microwave reactor is described. The system consists of a strongly microwave-absorbing silicon carbide plate with 20 cylindrical wells of appropriate dimensions to be fitted with standard HPLC/GC autosampler vials serving as extraction vessels. Due to the possibility of heating up to four heating platforms simultaneously (80 vials), efficient parallel analytical-scale solvent extractions can be performed using volumes of 0.5–1.5 mL at a maximum temperature/pressure limit of 200 °C/20 bar. Since the extraction and subsequent analysis by either gas chromatography or liquid chromatography coupled with mass detection (GC–MS or LC–MS) is performed directly from the autosampler vial, errors caused by sample transfer can be minimized. The platform was evaluated for the extraction and quantification of caffeine from commercial coffee powders assessing different solvent types, extraction temperatures and times. For example, 141 ± 11 μg caffeine (5 mg coffee powder) were extracted during a single extraction cycle using methanol as extraction solvent, whereas only 90 ± 11 μg were obtained performing the extraction in methylene chloride, applying the same reaction conditions (90 °C, 10 min). In multiple extraction experiments a total of ∼150 μg caffeine was extracted from 5 mg commercial coffee powder. In addition to the quantitative caffeine determination, a comparative qualitative analysis of the liquid phase coffee extracts and the headspace volatiles was performed, placing special emphasis on headspace analysis using solid-phase microextraction (SPME

  2. A high-throughput platform for low-volume high-temperature/pressure sealed vessel solvent extractions

    Energy Technology Data Exchange (ETDEWEB)

    Damm, Markus [Christian Doppler Laboratory for Microwave Chemistry (CDLMC) and Institute of Chemistry, Karl-Franzens-University Graz, Heinrichstrasse 28, A-8010 Graz (Austria); Kappe, C. Oliver, E-mail: oliver.kappe@uni-graz.at [Christian Doppler Laboratory for Microwave Chemistry (CDLMC) and Institute of Chemistry, Karl-Franzens-University Graz, Heinrichstrasse 28, A-8010 Graz (Austria)

    2011-11-30

    Highlights: ► Parallel low-volume coffee extractions in sealed-vessel HPLC/GC vials. ► Extractions are performed at high temperatures and pressures (200 °C/20 bar). ► Rapid caffeine determination from the liquid phase. ► Headspace analysis of volatiles using solid-phase microextraction (SPME). - Abstract: A high-throughput platform for performing parallel solvent extractions in sealed HPLC/GC vials inside a microwave reactor is described. The system consists of a strongly microwave-absorbing silicon carbide plate with 20 cylindrical wells of appropriate dimensions to be fitted with standard HPLC/GC autosampler vials serving as extraction vessels. Due to the possibility of heating up to four heating platforms simultaneously (80 vials), efficient parallel analytical-scale solvent extractions can be performed using volumes of 0.5-1.5 mL at a maximum temperature/pressure limit of 200 °C/20 bar. Since the extraction and subsequent analysis by either gas chromatography or liquid chromatography coupled with mass detection (GC-MS or LC-MS) is performed directly from the autosampler vial, errors caused by sample transfer can be minimized. The platform was evaluated for the extraction and quantification of caffeine from commercial coffee powders assessing different solvent types, extraction temperatures and times. For example, 141 ± 11 μg caffeine (5 mg coffee powder) were extracted during a single extraction cycle using methanol as extraction solvent, whereas only 90 ± 11 μg were obtained performing the extraction in methylene chloride, applying the same reaction conditions (90 °C, 10 min). In multiple extraction experiments a total of ~150 μg caffeine was extracted from 5 mg commercial coffee powder. In addition to the quantitative caffeine determination, a comparative qualitative analysis of the liquid phase coffee

  3. Purification of nattokinase by reverse micelles extraction from fermentation broth: effect of temperature and phase volume ratio.

    Science.gov (United States)

    Liu, Jun-Guo; Xing, Jian-Min; Chang, Tian-Shi; Liu, Hui-Zhou

    2006-03-01

    Nattokinase is a novel fibrinolytic enzyme that is considered to be a promising agent for thrombosis therapy. In this study, reverse micelles extraction was applied to purify and concentrate nattokinase from fermentation broth. The effects of temperature and phase volume ratio used for the forward and backward extraction on the extraction process were examined. The optimal temperatures for forward and backward extraction were 25 °C and 35 °C, respectively. Nattokinase became more thermosensitive during reverse micelles extraction, and it could be enriched eightfold in the stripping phase during backward extraction. It was found that nattokinase could be purified by AOT reverse micelles with up to 80% activity recovery and with a purification factor of 3.9.
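The two figures of merit quoted in this abstract, activity recovery and purification factor, have simple arithmetic definitions. The sketch below illustrates the relationship with a hypothetical helper and hypothetical numbers; it is not taken from the paper.

```python
def purification_metrics(activity_in, protein_in, activity_out, protein_out):
    """Activity recovery (%) and purification factor for one step.

    The purification factor is the ratio of specific activities
    (enzyme activity per mg total protein) after vs. before the step;
    recovery is the fraction of total activity retained.
    """
    recovery = 100.0 * activity_out / activity_in
    factor = (activity_out / protein_out) / (activity_in / protein_in)
    return recovery, factor
```

For example, a step that keeps 800 of 1000 activity units while cutting total protein from 100 mg to 20 mg yields 80% recovery and a purification factor of 4.0 (illustrative values, not the paper's data).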

  4. Reduced left ventricular filling following blood volume extraction does not result in compensatory augmentation of cardiac mechanics.

    Science.gov (United States)

    Lord, Rachel; MacLeod, David; George, Keith; Oxborough, David; Shave, Rob; Stembridge, Mike

    2018-04-01

    What is the central question of this study? A reduction in left ventricular (LV) filling, and concomitant increase in heart rate, augments LV mechanics to maintain stroke volume (SV); however, the impact of reduced LV filling in isolation on SV and LV mechanics is currently unknown. What is the main finding and its importance? An isolated decrease in LV filling did not provoke a compensatory increase in mechanics to maintain SV; in contrast, LV mechanics and SV were reduced. These data indicate that when LV filling is reduced without changes in heart rate, LV mechanics do not compensate to maintain SV. An acute non-invasive reduction in preload has been shown to augment cardiac mechanics to maintain stroke volume and cardiac output. Such interventions induce concomitant changes in heart rate, whereas blood volume extraction reduces preload without changes in heart rate. Therefore, the purpose of this study was to determine whether a preload reduction in isolation resulted in augmented stroke volume achieved via enhanced cardiac mechanics. Nine healthy volunteers (four female, age 29 ± 11 years) underwent echocardiography for the assessment of left ventricular (LV) volumes and mechanics in a supine position at baseline and end extraction after the controlled removal of 25% of total blood volume (1062 ± 342 ml). Arterial blood pressure was monitored continuously by a pressure transducer attached to an indwelling radial artery catheter. Heart rate and total peripheral resistance were unchanged from baseline to end extraction, but systolic blood pressure was reduced (from 148 to 127 mmHg). From baseline to end extraction there were significant reductions in left ventricular end-diastolic volume (from 89 to 71 ml) and stroke volume (from 56 to 37 ml); however, there was no change in LV twist, basal or apical rotation. In contrast, LV longitudinal strain (from -20 to -17%) and basal circumferential strain (from -22 to -19%) were significantly reduced from

  5. Interim report on the development and application of environmental mapped data digitization, encoding, analysis, and display software for the ALICE system. Volume II. [MAP, CHAIN, FIX, and DOUT, in FORTRAN IV for PDP-10

    Energy Technology Data Exchange (ETDEWEB)

    Amiot, L.W.; Lima, R.J.; Scholbrock, S.D.; Shelman, C.B.; Wehman, R.H.

    1979-06-01

    Volume I of An Interim Report on the Development and Application of Environmental Mapped Data Digitization, Encoding, Analysis, and Display Software for the ALICE System provided an overall description of the software developed for the ALICE System and presented an example of its application. The information presented in Volume I was directed to both users and developers of digitization, encoding, analysis, and display software. Volume II presents information directly related to the actual computer code and operational characteristics (keys and subroutines) of the software, and will therefore be of more interest to developers of software than to users. However, developers of software should be aware that the code developed for the ALICE System operates in an environment where much of the peripheral hardware to the PDP-10 is ANL/AMD-built. For this reason, portions of the code may have to be modified for implementation on other computer system configurations. 11 tables.

  6. PEACE: pulsar evaluation algorithm for candidate extraction - a software package for post-analysis processing of pulsar survey candidates

    Science.gov (United States)

    Lee, K. J.; Stovall, K.; Jenet, F. A.; Martinez, J.; Dartez, L. P.; Mata, A.; Lunsford, G.; Cohen, S.; Biwer, C. M.; Rohr, M.; Flanigan, J.; Walker, A.; Banaszak, S.; Allen, B.; Barr, E. D.; Bhat, N. D. R.; Bogdanov, S.; Brazier, A.; Camilo, F.; Champion, D. J.; Chatterjee, S.; Cordes, J.; Crawford, F.; Deneva, J.; Desvignes, G.; Ferdman, R. D.; Freire, P.; Hessels, J. W. T.; Karuppusamy, R.; Kaspi, V. M.; Knispel, B.; Kramer, M.; Lazarus, P.; Lynch, R.; Lyne, A.; McLaughlin, M.; Ransom, S.; Scholz, P.; Siemens, X.; Spitler, L.; Stairs, I.; Tan, M.; van Leeuwen, J.; Zhu, W. W.

    2013-07-01

    Modern radio pulsar surveys produce a large volume of prospective candidates, the majority of which are polluted by human-created radio frequency interference or other forms of noise. Typically, large numbers of candidates need to be visually inspected in order to determine if they are real pulsars. This process can be labour intensive. In this paper, we introduce an algorithm called Pulsar Evaluation Algorithm for Candidate Extraction (PEACE) which improves the efficiency of identifying pulsar signals. The algorithm ranks the candidates based on a score function. Unlike popular machine-learning-based algorithms, no prior training data sets are required. This algorithm has been applied to data from several large-scale radio pulsar surveys. Using the human-based ranking results generated by students in the Arecibo Remote Command Center programme, the statistical performance of PEACE was evaluated. It was found that PEACE ranked 68 per cent of the student-identified pulsars within the top 0.17 per cent of sorted candidates, 95 per cent within the top 0.34 per cent and 100 per cent within the top 3.7 per cent. This clearly demonstrates that PEACE significantly increases the pulsar identification rate by a factor of about 50 to 1000. To date, PEACE has been directly responsible for the discovery of 47 new pulsars, 5 of which are millisecond pulsars that may be useful for pulsar timing based gravitational-wave detection projects.
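The performance figures above (e.g. 68 per cent of student-identified pulsars within the top 0.17 per cent of sorted candidates) are recall-at-top-fraction statistics over a score-based ranking. A minimal sketch of how such a number is computed follows; the function name and data layout are assumptions for illustration, not the PEACE code or its score function.

```python
def ranking_recall(scores, pulsar_ids, top_fraction):
    """Fraction of known pulsars ranked within the top `top_fraction`
    of candidates when sorted by descending score.

    scores: dict mapping candidate id -> scalar score
    pulsar_ids: iterable of candidate ids confirmed as real pulsars
    """
    ranked = sorted(scores, key=scores.get, reverse=True)
    cutoff = max(1, round(top_fraction * len(ranked)))
    top = set(ranked[:cutoff])
    pulsars = set(pulsar_ids)
    return len(top & pulsars) / len(pulsars)
```

Evaluating this at a sequence of cutoff fractions (0.17%, 0.34%, 3.7%, ...) against a human-labeled candidate set reproduces the kind of recall curve the paper reports.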

  7. Achieving Better Buying Power through Acquisition of Open Architecture Software Systems. Volume 2 Understanding Open Architecture Software Systems: Licensing and Security Research and Recommendations

    Science.gov (United States)

    2016-01-06

    KWD00], as are CORBA, Microsoft's .NET, and Enterprise Java Beans. ● Configured system or sub-system – These are software systems built to conform to...background. Some OSS is multiply-licensed, or distributed under two or more licenses. The MySQL database software is distributed either under GPLv2 for...Automation. The license metamodel, calculation, and an assortment of license interpretations are implemented in a Java package. The calculation

  8. Architecture and data processing alternatives for the TSE computer. Volume 2: Extraction of topological information from an image by the Tse computer

    Science.gov (United States)

    Jones, J. R.; Bodenheimer, R. E.

    1976-01-01

    A simple programmable Tse processor organization and arithmetic operations necessary for extraction of the desired topological information are described. Hardware additions to this organization are discussed along with trade-offs peculiar to the tse computing concept. An improved organization is presented along with the complementary software for the various arithmetic operations. The performance of the two organizations is compared in terms of speed, power, and cost. Software routines developed to extract the desired information from an image are included.

  9. CrossTalk: The Journal of Defense Software Engineering. Volume 24, Number 1, Jan/Feb 2011

    Science.gov (United States)

    2011-02-01

    Aircraft Sustainment Group Tony Henderson 309th Software Maintenance Group Lt. Col. Brian Hermann, Ph.D. Defense Information Systems Agency Lt. Col...Solutions, Inc. Gordon Sleve Robbins Gioia LLC Larry Smith Software Technology Support Center Dr. John Sohl Weber State University Elizabeth Starrett OO-ALC

  10. Freely-available, true-color volume rendering software and cryohistology data sets for virtual exploration of the temporal bone anatomy.

    Science.gov (United States)

    Kahrs, Lüder Alexander; Labadie, Robert Frederick

    2013-01-01

    Cadaveric dissection of temporal bone anatomy is not always possible or feasible in certain educational environments. Volume rendering using CT and/or MRI helps in understanding spatial relationships, but these modalities suffer from nonrealistic depiction of anatomical structures, especially regarding color. Freely available, nonstained histological data sets, together with software able to render such data sets in realistic color, could overcome this limitation and be a very effective teaching tool. With the recent availability of specialized public-domain software, volume rendering of true-color histological data sets is now possible. We present both the feasibility of this approach and step-by-step instructions to allow processing of publicly available data sets (Visible Female Human and Visible Ear) into easily navigable 3-dimensional models using free software. Example renderings are shown to demonstrate the utility of these free methods in virtual exploration of the complex anatomy of the temporal bone. After exploring the data sets, the Visible Ear appears more natural than the Visible Human. We provide directions for an easy-to-use, open-source software in conjunction with freely available histological data sets. This work facilitates self-education of spatial relationships of anatomical structures inside the human temporal bone, and it allows exploration of surgical approaches prior to cadaveric testing and/or clinical implementation. Copyright © 2013 S. Karger AG, Basel.

  11. CrossTalk, The Journal of Defense Software Engineering. Volume 27, Number 4. July/August 2014

    Science.gov (United States)

    2014-07-01

    his writing. His works include “The Psychology of Computer Programming,” “An Introduction to General Systems Thinking,” and “Becoming a Technical Leader” ...en.wikipedia.org/wiki/Six_phases_of_a_big_project). I am certainly not the originator. In 1997, Robert Glass published a book entitled “Software Runaways” ...failures. In fact, to quote from the Amazon.com “blurb” on the book, Runaways brings a software engineer’s perspective to projects like: American

  12. Guidelines for the verification and validation of expert system software and conventional software: Volume 4, Evaluation of knowledge base certification methods. Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    The objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity is concerned with the development and testing of various methods for assuring the quality of knowledge bases. The testing procedure used was a behavioral experiment, the first known such evaluation of any V&V activity; its value lies in the capability to provide empirical evidence for or against the effectiveness of plausible methods in helping people find problems in knowledge bases. The three-day experiment included 20 participants from three nuclear utilities, the Nuclear Regulatory Commission's Technical Training Center, the University of Maryland, EG&G Idaho, and SAIC. The study used two real nuclear expert systems: a boiling water reactor emergency operating procedures tracking system and a pressurized water reactor safety assessment system. Ten participants were assigned to each of the expert systems. All participants were trained in and then used a sequence of four different V&V methods selected as being the best and most appropriate. These methods either involved the analysis and tracing of requirements to elements in the knowledge base or direct inspection of the knowledge base for various kinds of errors. Half of the subjects within each system group used the best manual variant of the V&V methods (the control group), while the other half were supported by the results of applying real or simulated automated tools to the knowledge bases (the experimental group). The four groups of participants were similar in nuclear engineering and software experience characteristics. It is concluded that the use of tools in static knowledge base certification results in significant improvement in detecting all types of defects, avoiding false alarms, and completing the effort in less time. The simulated knowledge-checking tool was based on supplemental engineering information about the systems

  13. Three-dimensional binding sites volume assessment during cardiac pacing lead extraction

    Directory of Open Access Journals (Sweden)

    Bich Lien Nguyen

    2015-07-01

    Conclusions: Real-time 3D binding sites assessment is feasible and improves transvenous lead extraction outcomes. Its role as a complementary information requires extensive validation, and might be beneficial for a tailored strategy.

  14. Simple and efficient method for region of interest value extraction from picture archiving and communication system viewer with optical character recognition software and macro program.

    Science.gov (United States)

    Lee, Young Han; Park, Eun Hae; Suh, Jin-Suck

    2015-01-01

    The objectives are: 1) to introduce a simple and efficient method for extracting region of interest (ROI) values from a Picture Archiving and Communication System (PACS) viewer using optical character recognition (OCR) software and a macro program, and 2) to evaluate the accuracy of this method with a PACS workstation. The module was designed to extract ROI values from images in the PACS, and was created using open-source OCR software and an open-source macro program. The principal processes are as follows: (1) capture a region of the ROI values as a graphic file for OCR, (2) recognize the text from the captured image with OCR software, (3) perform error-correction, (4) extract the values, including area, average, standard deviation, max, and min, from the text, (5) reformat the values into temporary strings with tabs, and (6) paste the temporary strings into the spreadsheet. This process was repeated for each ROI. The accuracy of this module was evaluated on 1040 recognitions from 280 randomly selected ROIs of magnetic resonance images. The input times of ROIs were compared between the conventional manual method and the module-assisted input method. The module for extracting ROI values operated successfully using the OCR and macro programs. The values of the area, average, standard deviation, maximum, and minimum could be recognized and error-corrected with the AutoHotkey-coded module. The average input times using the conventional method and the proposed module-assisted method were 34.97 seconds and 7.87 seconds, respectively. A simple and efficient method for ROI value extraction was developed with open-source OCR and a macro program. Accurate inputs of various numbers from ROIs can be extracted with this module. The proposed module could be applied to the next generation of PACS or existing PACS that have not yet been upgraded. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
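
    Steps (3)-(5) of the module above (error-correction, value extraction, tab-separated reformatting) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' AutoHotkey code; the label names and the OCR misread corrections are assumptions.

```python
import re


def parse_roi_text(ocr_text):
    """Parse OCR-recognized ROI statistics text into numeric values.

    Expects lines such as 'Area: 12.5' for the five statistics named in
    the abstract; common OCR digit misreads (O->0, l/I->1) are corrected
    before the numbers are extracted.
    """
    # Step (3): simple error-correction for digit look-alikes.
    corrected = ocr_text.translate(
        str.maketrans({"O": "0", "o": "0", "l": "1", "I": "1"})
    )
    # Step (4): pull one number per expected statistic label.
    values = {}
    for key in ("Area", "Average", "SD", "Max", "Min"):
        m = re.search(rf"{key}\s*[:=]?\s*(-?\d+(?:\.\d+)?)", corrected)
        if m:
            values[key] = float(m.group(1))
    return values


def to_row(values):
    # Step (5): reformat into a tab-separated string for a spreadsheet paste.
    return "\t".join(str(values[k]) for k in ("Area", "Average", "SD", "Max", "Min"))
```

    For example, `parse_roi_text("Area: 3O.5\nAverage: 12.l\nSD: 4.2\nMax: 25\nMin: 2")` corrects the misread characters and yields Area = 30.5 and Average = 12.1.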

  15. Process mining software repositories

    NARCIS (Netherlands)

    Poncin, W.; Serebrenik, A.; Brand, van den M.G.J.

    2011-01-01

    Software developers' activities are in general recorded in software repositories such as version control systems, bug trackers and mail archives. While abundant information is usually present in such repositories, successful information extraction is often challenged by the necessity to

  16. Quantitative assessment of primary mitral regurgitation using left ventricular volumes obtained with new automated three-dimensional transthoracic echocardiographic software: A comparison with 3-Tesla cardiac magnetic resonance.

    Science.gov (United States)

    Levy, Franck; Marechaux, Sylvestre; Iacuzio, Laura; Schouver, Elie Dan; Castel, Anne Laure; Toledano, Manuel; Rusek, Stephane; Dor, Vincent; Tribouilloy, Christophe; Dreyfus, Gilles

    2018-03-30

    Quantitative assessment of primary mitral regurgitation (MR) using left ventricular (LV) volumes obtained with three-dimensional transthoracic echocardiography (3D TTE) recently showed encouraging results. Nevertheless, 3D TTE is not incorporated into everyday practice, as current LV chamber quantification software products are time-consuming. To investigate the accuracy and reproducibility of new automated fast 3D TTE software (HeartModel A.I.; Philips Healthcare, Andover, MA, USA) for the quantification of LV volumes and MR severity in patients with isolated degenerative primary MR, and to compare the regurgitant volume (RV) obtained with 3D TTE with a cardiac magnetic resonance (CMR) reference. Fifty-three patients (37 men; mean age 64±12 years) with at least mild primary isolated MR, and having comprehensive 3D TTE and CMR studies within 24 h, were eligible for inclusion. MR RV was calculated using the proximal isovelocity surface area (PISA) method and the volumetric method (total LV stroke volume minus aortic stroke volume) with either CMR or 3D TTE. Inter- and intraobserver reproducibility of 3D TTE was excellent (coefficient of variation ≤10%) for LV volumes. MR RV was similar using CMR and 3D TTE (57±23 mL vs 56±28 mL; P=0.22), but was significantly higher using the PISA method (69±30 mL; P<0.05 compared with CMR and 3D TTE). The PISA method consistently overestimated MR RV compared with CMR (bias 12±21 mL), while no significant bias was found between 3D TTE and CMR (bias 2±14 mL). Concordance between echocardiography and CMR was higher using 3D TTE MR grading (intraclass correlation coefficient [ICC]=0.89) than with PISA MR grading (ICC=0.78). Complete agreement with CMR grading was more frequent with 3D TTE than with the PISA method (76% vs 63%). 3D TTE RV assessment using the new generation of automated software correlates well with CMR in patients with isolated degenerative primary MR. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
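
    The volumetric method used above (total LV stroke volume minus aortic stroke volume) is a one-line computation. A minimal sketch, with purely hypothetical illustrative volumes:

```python
def regurgitant_volume(lv_edv_ml, lv_esv_ml, aortic_sv_ml):
    """Volumetric MR quantification: regurgitant volume equals the total
    LV stroke volume (end-diastolic minus end-systolic volume) minus the
    forward stroke volume measured across the aortic valve."""
    total_stroke_volume = lv_edv_ml - lv_esv_ml
    return total_stroke_volume - aortic_sv_ml


# Hypothetical example: EDV 160 mL, ESV 60 mL, aortic SV 45 mL
# gives a regurgitant volume of 55 mL.
```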

  17. CrossTalk, The Journal of Defense Software Engineering. Volume 28 Number 1. Jan/Feb 2015

    Science.gov (United States)

    2015-02-01

    minority participation is low [24]. At UT Tyler, we conduct science camps for high-school girl students in the summer to encourage them to pursue STEM ...Intellectual Property Rights (Data Rights for commercial built software applications), CLE074 on Cybersecurity (March 2015 deployment), and CLL

  18. CrossTalk: The Journal of Defense Software Engineering. Volume 23, Number 1, Jan/Feb 2010

    Science.gov (United States)

    2010-02-01

    Maintenance Group Weber State University Arrowpoint Solutions, Inc. Robbins Gioia LLC Software Technology Support Center Weber State University OO-ALC...Les Dupaix Monika Fast Robert W. Ferguson Dr. Doretta Gordon Dr. John A. “Drew” Hamilton Jr. Gary Hebert Tony Henderson Lt. Col. Brian Hermann, Ph.D

  19. CrossTalk. The Journal of Defense Software Engineering. Volume 27, Number 2. March/April 2014

    Science.gov (United States)

    2014-04-01

    integrity and authenticity of the end product The Software Maintenance Group at Hill Air Force Base is recruiting civilians (U.S. Citizenship...B.Claise, “ Cisco Systems NetFlow Services Export Version 9”, October 2004, RFC 3954, <http://tools.ietf.org/html/rfc3954> 6. L. Daigle, “WHOIS

  20. CrossTalk: The Journal of Defense Software Engineering. Volume 25, Number 1, January/February 2012

    Science.gov (United States)

    2012-02-01

    February 2012 25 HIGH MATURITY - THE PAYOFF expanded target audience of producers, buyers, and users of software products and systems bring with it...diet Pepsi and Ho-Hos. • You have read this list, and realized that several items apply to you. And it got less and less funny as you kept

  1. Extraction of volume produced H- or D- ions from a sheet plasma, 2

    International Nuclear Information System (INIS)

    Uramoto, Joshin.

    1984-02-01

    Development of a large-area H^- or D^- ion source is attempted using three extraction electrodes: the first electrode is biased near the wall potential (floating), the second electrode is set near 13% of the main extraction voltage, and the third electrode is the main acceleration electrode. An ion current of 13 mA (3.3 mA/cm^2) for H^- or 11 mA (2.8 mA/cm^2) for D^- at 3 keV is extracted from 9 apertures of 6 mm diameter within 4 cm^2 outside of the sheet plasma (14 cm wide and 1.0 cm thick), under a H2 or D2 gas pressure of 7.7 x 10^-4 and a weak magnetic field of 50 gauss. The corresponding electron current is suppressed below 1/10 of the H^- or D^- ion current. (author)

  2. EXTRACT

    DEFF Research Database (Denmark)

    Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra

    2016-01-01

    The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, manual sample annotation is a highly labor-intensive process and requires familiarity with the terminologies used. We have the… organism, tissue and disease terms. The evaluators in the BioCreative V Interactive Annotation Task found the system to be intuitive, useful, well documented and sufficiently accurate to be helpful in spotting relevant text passages and extracting organism and environment terms. Comparison of fully manual and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15-25% and helps curators to detect terms that would otherwise have been missed. Database URL: https://extract.hcmr.gr/

  3. Salt removal from microliter sample volumes by multiple phase microelectromembrane extractions across free liquid membranes

    Czech Academy of Sciences Publication Activity Database

    Kubáň, Pavel

    2017-01-01

    Roč. 89, č. 16 (2017), s. 8476-8483 ISSN 0003-2700 R&D Projects: GA ČR(CZ) GA16-09135S Institutional support: RVO:68081715 Keywords: desalting * microelectromembrane extraction * electrospray ionization-mass spectrometry Subject RIV: CB - Analytical Chemistry, Separation OBOR OECD: Analytical chemistry Impact factor: 6.320, year: 2016

  5. CrossTalk: The Journal of Defense Software Engineering. Volume 27, Number 6, November/December 2014

    Science.gov (United States)

    2014-12-01

    conclusion lest the analysis is biased. SOFTWARE ENGINEERING TOOLS AND THE PROCESSES THEY SUPPORT To demonstrate...In some Agile development environments the relative separation between tools and processes is so seamless as to be almost subconscious to process...a well-articulated process description, but if the performer is a tool lover, he will find fault with the defined process, always having a bias for

  6. CrossTalk. The Journal of Defense Software Engineering. Volume 23, Number 6, Nov/Dec 2010

    Science.gov (United States)

    2010-11-01

    such standards are International Standards Organization/International Electrotechnical Commission (ISO/IEC) standards 15288 for system engineering...and 12207 for software development. 13. Office of the DoD CIO. White Paper Phase I: A Competency Framework for the DoD Architect. Washington...processes in later phases. DoD-Centric Use Case Current support for net-centric operations is based on isolated deployments of relevant services in

  7. CrossTalk. The Journal of Defense Software Engineering. Volume 24, Number 5, Sep/Oct 2011

    Science.gov (United States)

    2011-09-01

    ancillary materials is counterfeit. Counterfeiting of ancillary materials may involve copying and slightly altering legitimate product data or...tools that implemented/applied the software protection mechanisms were authentic and trustworthy. IPR Enforcement Efforts In 2005, KPMG and the...2005. 5. KPMG and the Alliance for Gray Market and Counterfeit Abatement. “Managing the Risks of Counterfeiting in the Information Technology Industry

  8. CrossTalk: The Journal of Defense Software Engineering. Volume 22, Number 7, Nov/Dec 2009

    Science.gov (United States)

    2009-12-01

    anyone, and a public key, which is publicly advertised. A signer encrypts data using the recipient’s public key, and the receiver decrypts it with...of service. 6. Encryption certificates are advertised in the DoD via the Joint Enterprise Directory Service (located at <https://jeds.gds.disa.mil...mainframe legacy code into an iPhone app. In Hell’s Kitchen, programmers try to find a software bug nestled inside two million lines of undocumented

  9. CrossTalk: The Journal of Defense Software Engineering. Volume 28, Number 2, March/April 2015

    Science.gov (United States)

    2015-04-01

    software is like striking gold in that you get a very high return on investment for testing. Optimization Technique #2 - Apply the Pareto Principle The... Pareto Principle (the 80/20 rule) that says you can get the majority of value from the minority of people, time or effort. In fact, my experience is...week test in two weeks. So, the client and I worked together to focus on key areas and skip minor areas. The test design was based on critical

  10. CrossTalk: The Journal of Defense Software Engineering. Volume 26, Number 3, May-June 2013

    Science.gov (United States)

    2013-06-01

    in which the pieces are being shaped at the same time they are being assembled. If I am honest, software is probably more like a Rube Goldberg...losing the focus on architecting activities that help maintain the desired state, enable cost savings, and ensure delivery tempo when other agile...masters degrees and working as developers), accepted by the same instructor, with LOC counted identically, yielded variations as great as 22:1, and

  11. Automatic extraction of myocardial mass and volumes using parametric images from dynamic nongated PET

    DEFF Research Database (Denmark)

    Harms, Hendrik Johannes; Hansson, Nils Henrik Stubkjær; Tolbod, Lars Poulsen

    2016-01-01

    Dynamic cardiac positron emission tomography (PET) is used to quantify molecular processes in vivo. However, measurements of left-ventricular (LV) mass and volumes require electrocardiogram (ECG)-gated PET data. The aim of this study was to explore the feasibility of measuring LV geometry using non-gated dynamic cardiac PET. METHODS: Thirty-five patients with aortic-valve stenosis and 10 healthy controls (HC) underwent a 27-min 11C-acetate PET/CT scan and cardiac magnetic resonance imaging (CMR). HC were scanned twice to assess repeatability. Parametric images of uptake rate K1 and the blood pool were… LV and WT only, and an overestimation for LVEF at lower values. Intra- and inter-observer correlations were >0.95 for all PET measurements. PET repeatability accuracy in HC was comparable to CMR. CONCLUSION: LV mass and volumes are accurately and automatically generated from dynamic 11C-acetate PET without...

  12. GENII (Generation II): The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.

  13. GENII [Generation II]: The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    International Nuclear Information System (INIS)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs

  14. GENII: The Hanford Environmental Radiation Dosimetry Software System: Volume 2, Users' manual: Hanford Environmental Dosimetry Upgrade Project

    International Nuclear Information System (INIS)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-11-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). The purpose of this coupled system of computer codes is to analyze environmental contamination of air, water, or soil. This is accomplished by calculating radiation doses to individuals or populations. GENII is described in three volumes of documentation. This second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. The first volume describes the theoretical considerations of the system. The third volume is a Code Maintenance Manual for the user who requires knowledge of code detail. It includes logic diagrams, a global dictionary, worksheets, example hand calculations, and listings of the code and its associated data libraries. 27 refs., 17 figs., 23 tabs

  15. PEACE: pulsar evaluation algorithm for candidate extraction - a software package for post-analysis processing of pulsar survey candidates

    NARCIS (Netherlands)

    Lee, K.J.; Stovall, K.; Jenet, F.A.; Martinez, J.; Dartez, L.P.; Mata, A.; Lunsford, G.; Cohen, S.; Biwer, C.M.; Rohr, M.; Flanigan, J.; Walker, A.; Banaszak, S.; Allen, B.; Barr, E.D.; Bhat, N.D.R.; Bogdanov, S.; Brazier, A.; Camilo, F.; Champion, D.J.; Chatterjee, S.; Cordes, J.; Crawford, F.; Deneva, J.; Desvignes, G.; Ferdman, R.D.; Freire, P.; Hessels, J.W.T.; Karuppusamy, R.; Kaspi, V.M.; Knispel, B.; Kramer, M.; Lazarus, P.; Lynch, R.; Lyne, A.; McLaughlin, M.; Ransom, S.; Scholz, P.; Siemens, X.; Spitler, L.; Stairs, I.; Tan, M.; van Leeuwen, J.; Zhu, W.W.

    2013-01-01

    Modern radio pulsar surveys produce a large volume of prospective candidates, the majority of which are polluted by human-created radio frequency interference or other forms of noise. Typically, large numbers of candidates need to be visually inspected in order to determine if they are real pulsars.

  16. Large volume TENAX® extraction of the bioaccessible fraction of sediment-associated organic compounds for a subsequent effect-directed analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schwab, K.; Brack, W. [UFZ - Helmholtz Centre or Environmental Research, Leipzig (Germany). Dept. of Effect-Directed Analysis

    2007-06-15

    Background, Aim and Scope: Effect-directed analysis (EDA) is a powerful tool for the identification of key toxicants in complex environmental samples. In most cases, EDA is based on total extraction of organic contaminants, leading to an erroneous prioritization with regard to hazard and risk. Bioaccessibility-directed extraction aims to discriminate between contaminants that take part in partitioning between sediment and biota in a relevant time frame and those that are enclosed in structures that do not allow rapid desorption. Standard protocols for targeted extraction of the rapidly desorbing, and thus bioaccessible, fraction using TENAX® are based on only small amounts of sediment. In order to obtain sufficient amounts of extract for subsequent biotesting, fractionation, and structure elucidation, a large volume extraction technique needs to be developed, applying one selected extraction time and excluding toxic procedural blanks. Materials and Methods: The desorption behaviour of sediment contaminants was determined by consecutive solid-solid extraction of sediment using TENAX®, fitting a tri-compartment model to the experimental data. The time needed to remove the rapidly desorbing fraction, t_rap, was calculated in order to select a fixed extraction time for single extraction procedures. Up-scaling by about a factor of 100 provided a large volume extraction technique for EDA. Reproducibility and comparability to the small volume approach were proved. Blanks of the respective TENAX® mass were investigated using Scenedesmus vacuolatus and Artemia salina as test organisms. Results: Desorption kinetics showed that 12 to 30% of sediment-associated pollutants are available for rapid desorption. t_rap is compound dependent and covers a range of 2 to 18 h. On that basis a fixed extraction time of 24 h was selected. Validation of the large volume approach was done by means of comparison to the small volume method and reproducibility. The large volume showed a good

  17. CrossTalk: The Journal of Defense Software Engineering. Volume 22, Number 6, September/October 2009

    Science.gov (United States)

    2009-10-01

    distinguished intermediate gates (Y6, Y7) known as outputs. We define a signal as a vertical reading of a column in the truth table (a fully enu...glanced at the EKG and noticed severe bradycardia. He realized he had never restarted the ventilator. This patient ultimately died. [10] This accident...attacker to achieve his objective. Google “Ariane 5 Flight 501,” “Therac-25 accidents,” or “Toyota Prius software bug” to read about some dramatic

  18. YAdumper: extracting and translating large information volumes from relational databases to structured flat files.

    Science.gov (United States)

    Fernández, José M; Valencia, Alfonso

    2004-10-12

    Downloading the information stored in relational databases into XML and other flat formats is a common task in bioinformatics. This periodical dumping of information requires considerable CPU time, disk and memory resources. YAdumper has been developed as a purpose-specific tool to deal with the integral structured information download of relational databases. YAdumper is a Java application that organizes database extraction following an XML template based on an external Document Type Declaration. Compared with other non-native alternatives, YAdumper substantially reduces memory requirements and considerably improves writing performance.
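
    The template-driven table-to-XML dump described above can be sketched briefly. This is an illustrative reconstruction, not YAdumper's actual Java code or DTD-based template syntax; it uses an in-memory sqlite3 database in place of a production RDBMS, and the element names are assumptions.

```python
import sqlite3
import xml.etree.ElementTree as ET


def dump_table_to_xml(conn, table, root_tag, row_tag):
    """Dump one relational table into an XML tree: one element per row,
    one child element per column - the flat-file layout a template-driven
    dumper produces."""
    root = ET.Element(root_tag)
    cur = conn.execute(f"SELECT * FROM {table}")
    columns = [d[0] for d in cur.description]
    for row in cur:
        row_el = ET.SubElement(root, row_tag)
        for col, val in zip(columns, row):
            ET.SubElement(row_el, col).text = str(val)
    return ET.tostring(root, encoding="unicode")


# Hypothetical usage with a one-row table:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE genes (id INTEGER, name TEXT)")
conn.execute("INSERT INTO genes VALUES (1, 'BRCA1')")
xml_out = dump_table_to_xml(conn, "genes", "genes", "gene")
```

    Streaming rows from a cursor, as above, is what keeps memory requirements low: the whole table never needs to be materialized before writing.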

  19. Feasibility and performance of novel software to quantify metabolically active volumes and 3D partial volume corrected SUV and metabolic volumetric products of spinal bone marrow metastases on 18F-FDG-PET/CT.

    Science.gov (United States)

    Torigian, Drew A; Lopez, Rosa Fernandez; Alapati, Sridevi; Bodapati, Geetha; Hofheinz, Frank; van den Hoff, Joerg; Saboury, Babak; Alavi, Abass

    2011-01-01

    Our aim was to assess feasibility and performance of novel semi-automated image analysis software called ROVER to quantify metabolically active volume (MAV), maximum standardized uptake value-maximum (SUV(max)), 3D partial volume corrected mean SUV (cSUV(mean)), and 3D partial volume corrected mean MVP (cMVP(mean)) of spinal bone marrow metastases on fluorine-18 fluorodeoxyglucose-positron emission tomography/computerized tomography ((18)F-FDG-PET/CT). We retrospectively studied 16 subjects with 31 spinal metastases on FDG-PET/CT and MRI. Manual and ROVER determinations of lesional MAV and SUV(max), and repeated ROVER measurements of MAV, SUV(max), cSUV(mean) and cMVP(mean) were made. Bland-Altman and correlation analyses were performed to assess reproducibility and agreement. Our results showed that analyses of repeated ROVER measurements revealed MAV mean difference (D)=-0.03±0.53cc (95% CI(-0.22, 0.16)), lower limit of agreement (LLOA)=-1.07cc, and upper limit of agreement (ULOA)=1.01cc; SUV(max) D=0.00±0.00 with LOAs=0.00; cSUV(mean) D=-0.01±0.39 (95% CI(-0.15, 0.13)), LLOA=-0.76, and ULOA=0.75; cMVP(mean) D=-0.52±4.78cc (95% CI(-2.23, 1.23)), LLOA=-9.89cc, and ULOA=8.86cc. Comparisons between ROVER and manual measurements revealed volume D= -0.39±1.37cc (95% CI (-0.89, 0.11)), LLOA=-3.08cc, and ULOA=2.30cc; SUV(max) D=0.00±0.00 with LOAs=0.00. Mean percent increase in lesional SUV(mean) and MVP(mean) following partial volume correction using ROVER was 84.25±36.00% and 84.45±35.94% , respectively. In conclusion, it is feasible to estimate MAV, SUV(max), cSUV(mean), and cMVP(mean) of spinal bone marrow metastases from (18)F-FDG-PET/CT quickly and easily with good reproducibility via ROVER software. Partial volume correction is imperative, as uncorrected SUV(mean) and MVP(mean) are significantly underestimated, even for large lesions. This novel approach has great potential for practical, accurate, and precise combined structural-functional PET
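
    The limits of agreement quoted above follow the standard Bland-Altman formula, mean difference ± 1.96 × SD of the differences; a minimal sketch that reproduces the reported repeated-measurement limits for metabolically active volume:

```python
def bland_altman_limits(mean_diff, sd_diff):
    """95% Bland-Altman limits of agreement between paired measurements."""
    half_width = 1.96 * sd_diff
    return mean_diff - half_width, mean_diff + half_width


# Reported repeated-ROVER MAV differences, D = -0.03 +/- 0.53 cc,
# give LLOA = -1.07 cc and ULOA = 1.01 cc, matching the abstract.
lloa, uloa = bland_altman_limits(-0.03, 0.53)
```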

  20. Network, system, and status software enhancements for the autonomously managed electrical power system breadboard. Volume 4: Graphical status display

    Science.gov (United States)

    Mckee, James W.

    1990-01-01

    This volume (4 of 4) contains the description, structured flow charts, prints of the graphical displays, and the source code that generates the displays for the AMPS graphical status system. The function of these displays is to present to the manager of the AMPS system a graphical status display with hot boxes that allow the manager to get more detailed status on selected portions of the AMPS system. The development of the graphical displays is divided into two processes: the creation of the screen images and their storage in files on the computer, and the running of the status program, which uses the screen images.

  1. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry.

  2. Design and implementation of a control automatic module for the volume extraction of a 99mTc generator

    International Nuclear Information System (INIS)

    Lopez, Yon; Urquizo, Rafael; Gago, Javier; Mendoza, Pablo

    2014-01-01

    A module for the automatic extraction of volumes from 0.05 mL to 1 mL has been developed using a 3D printer, with acrylonitrile butadiene styrene (ABS) as the base material. The design allows automation of the 99mTc eluate input and ejection processes in the 99Mo/99mTc generator prototype; use in other systems is feasible due to its high degree of versatility, depending on the selection of the main components: a precision syringe and a multi-way solenoid valve. An accuracy equivalent to that of commercial equipment has been obtained, but at lower cost. This article describes the mechanical design, the design calculations for the movement mechanism, the electronics, and the automatic syringe dispenser control. (authors).

  3. Guidelines for the verification and validation of expert system software and conventional software: Volume 2, Survey and assessment of conventional software verification and validation methods Revision 1, Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Groundwater, E.H.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation (V&V) of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit Metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the four identified components of knowledge-based and expert systems, as well as the system as a whole
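
    The rank-ordering step described above can be sketched as a scoring exercise over the eight rating factors. The report's actual Cost-Benefit and Effectiveness Metric formulas are not given in this abstract, so the combination rule below (average each group of four ratings, then sum) is a placeholder assumption, as are the method names and rating values.

```python
def overall_score(ease_of_use, detection_power):
    """Combine the two groups of four rating factors into one score.
    Averaging each group and summing the averages is a placeholder
    formula for illustration, not the report's actual metric."""
    return sum(ease_of_use) / len(ease_of_use) + sum(detection_power) / len(detection_power)


def rank_methods(methods):
    """Rank V&V methods from best to worst overall.

    methods maps a method name to a pair of rating lists:
    (four ease-of-use ratings, four defect-detection ratings).
    """
    return sorted(methods, key=lambda name: overall_score(*methods[name]), reverse=True)


# Hypothetical ratings on a 1-5 scale:
methods = {
    "code inspection": ([3, 4, 3, 4], [4, 5, 4, 5]),
    "ad hoc testing": ([5, 5, 5, 5], [2, 2, 1, 2]),
}
ranking = rank_methods(methods)
```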

  4. Network, system, and status software enhancements for the autonomously managed electrical power system breadboard. Volume 2: Protocol specification

    Science.gov (United States)

    Mckee, James W.

    1990-01-01

    This volume (2 of 4) contains the specification, structured flow charts, and code listing for the protocol. The purpose of an autonomous power system on a spacecraft is to relieve humans from having to continuously monitor and control the generation, storage, and distribution of power in the craft. This implies that algorithms will have been developed to monitor and control the power system. The power system will contain computers on which the algorithms run. There should be one control computer system that makes the high-level decisions and sends commands to and receives data from the other distributed computers. This requires a communications network and an efficient protocol by which the computers communicate. A major requirement on the protocol is that it be real-time, because of the need to control the power elements.
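    The actual protocol is specified in this volume; purely as an illustration of the command/response framing idea between the control computer and the distributed computers, a toy frame with a one-byte checksum might look like this (the field layout is hypothetical):

```python
def make_frame(dest, src, payload):
    """Build a toy frame: destination address, source address,
    payload length, payload bytes, then a modulo-256 checksum."""
    body = bytes([dest, src, len(payload)]) + payload
    checksum = sum(body) % 256
    return body + bytes([checksum])

def parse_frame(frame):
    """Validate the checksum and recover (dest, src, payload)."""
    *body, checksum = frame
    if sum(body) % 256 != checksum:
        raise ValueError("checksum mismatch")
    dest, src, length = body[0], body[1], body[2]
    payload = bytes(body[3:3 + length])
    return dest, src, payload

# Control computer (0x10) queries the status of a distributed node (0x01)
f = make_frame(0x01, 0x10, b"STATUS?")
```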

  5. Analysis of polycyclic aromatic hydrocarbons in water and beverages using membrane-assisted solvent extraction in combination with large volume injection-gas chromatography-mass spectrometric detection.

    Science.gov (United States)

    Rodil, Rosario; Schellin, Manuela; Popp, Peter

    2007-09-07

    Membrane-assisted solvent extraction (MASE) in combination with large volume injection-gas chromatography-mass spectrometry (LVI-GC-MS) was applied for the determination of 16 polycyclic aromatic hydrocarbons (PAHs) in aqueous samples. The MASE conditions were optimized to achieve high enrichment of the analytes from aqueous samples, in terms of extraction conditions (shaking speed, extraction temperature and time), extraction solvent, and sample composition (ionic strength, sample pH and presence of organic solvent). Parameters such as linearity and reproducibility of the procedure were determined. The extraction efficiency was above 65% for all the analytes, and the relative standard deviation (RSD) for five consecutive extractions ranged from 6 to 18%. Under optimized conditions, detection limits at the ng/L level were achieved. The effectiveness of the method was tested by analyzing real samples, such as river water, apple juice, red wine and milk.
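    The precision figures reported above (extraction efficiency and RSD over five consecutive extractions) follow from standard formulas; a minimal sketch with made-up replicate data:

```python
from statistics import mean, stdev

def recovery_pct(measured, spiked):
    """Extraction efficiency as a percentage of the spiked amount."""
    return 100.0 * measured / spiked

def rsd_pct(replicates):
    """Relative standard deviation of replicate extractions, in %."""
    return 100.0 * stdev(replicates) / mean(replicates)

# Hypothetical peak areas from five consecutive extractions of one PAH
areas = [10450, 9980, 10210, 10820, 10110]
precision = rsd_pct(areas)  # RSD of the replicate series, in percent
```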

  6. On-line micro-volume introduction system developed for lower density than water extraction solvent and dispersive liquid–liquid microextraction coupled with flame atomic absorption spectrometry

    International Nuclear Information System (INIS)

    Anthemidis, Aristidis N.; Mitani, Constantina; Balkatzopoulou, Paschalia; Tzanavaras, Paraskevas D.

    2012-01-01

    Highlights: ► A dispersive liquid–liquid microextraction method for lead and copper determination. ► A micro-volume transportation system for an extraction solvent lighter than water. ► Analysis of natural water samples. - Abstract: A simple and fast preconcentration/separation dispersive liquid–liquid microextraction (DLLME) method for metal determination, based on the use of an extraction solvent with lower density than water, has been developed. For this purpose, a novel micro-volume introduction system was developed, enabling the on-line injection of the organic solvent into flame atomic absorption spectrometry (FAAS). The effectiveness and efficiency of the proposed system were demonstrated for lead and copper preconcentration in environmental water samples using di-isobutyl ketone (DIBK) as extraction solvent. Under the optimum conditions, the enhancement factor for lead and copper was 187 and 310, respectively. For a sample volume of 10 mL, the detection limit (3s) and the relative standard deviation were 1.2 μg L−1 and 3.3% for lead, and 0.12 μg L−1 and 2.9% for copper, respectively. The developed method was evaluated by analyzing certified reference material and was applied successfully to the analysis of environmental water samples.

  7. Solid-Phase Extraction and Large-Volume Sample Stacking-Capillary Electrophoresis for Determination of Tetracycline Residues in Milk

    Directory of Open Access Journals (Sweden)

    Gabriela Islas

    2018-01-01

    Solid-phase extraction in combination with large-volume sample stacking-capillary electrophoresis (SPE-LVSS-CE) was applied to measure chlortetracycline, doxycycline, oxytetracycline, and tetracycline in milk samples. Under optimal conditions, the proposed method had a linear range of 29 to 200 µg·L−1, with limits of detection ranging from 18.6 to 23.8 µg·L−1 and inter- and intraday repeatabilities < 10% (as relative standard deviation) in all cases. The enrichment factors obtained were from 50.33 to 70.85 for all the TCs compared with conventional capillary zone electrophoresis (CZE). This method is adequate to analyze tetracyclines below the most restrictive established maximum residue limits. The proposed method was employed in the analysis of 15 milk samples from different brands. Two of the tested samples were positive for the presence of oxytetracycline, with concentrations of 95 and 126 µg·L−1. SPE-LVSS-CE is a robust, easy, and efficient strategy for online preconcentration of tetracycline residues in complex matrices.

  8. Framework Programmable Platform for the Advanced Software Development Workstation (FPP/ASDW). Demonstration framework document. Volume 1: Concepts and activity descriptions

    Science.gov (United States)

    Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.

    1992-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent, integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to effectively automate the management of the software development process, so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into the development of advanced technologies for Computer Aided Software Engineering (CASE).

  9. Highly selective solid-phase extraction and large volume injection for the robust gas chromatography-mass spectrometric analysis of TCA and TBA in wines.

    Science.gov (United States)

    Insa, S; Anticó, E; Ferreira, V

    2005-09-30

    A reliable solid-phase extraction (SPE) method for the simultaneous determination of 2,4,6-trichloroanisole (TCA) and 2,4,6-tribromoanisole (TBA) in wines has been developed. In the proposed procedure, 50 mL of wine is extracted in a 1 mL cartridge filled with 50 mg of LiChrolut EN resins. Most wine volatiles are washed out with 12.5 mL of a water:methanol solution (70%, v/v) containing 1% of NaHCO3. The analytes are then eluted with 0.6 mL of dichloromethane. A 40 microL aliquot of this extract is directly injected into a PTV injector operated in the solvent split mode, and analysed by gas chromatography (GC)-ion trap mass spectrometry using the selected ion storage mode. The solid-phase extraction, including sample volume and rinsing and elution solvents, and the large volume GC injection have been carefully evaluated and optimized. The resulting method is precise (RSD (%) for TCA and TBA, respectively), robust (the absolute recoveries of both analytes are higher than 80% and consistent wine to wine) and friendly to the GC-MS system (the extract is clean, simple and free from non-volatiles).

  10. Miniaturised pressurised liquid extraction of aromatic hydrocarbons from soil and sediment with subsequent large-volume injection-gas chromatography

    NARCIS (Netherlands)

    Ramos, L.; Vreuls, J.J.; Brinkman, U.A.T.

    2000-01-01

    Analyte extraction is the main limitation when developing at-line, or on-line, procedures for the preparation of (semi)solid environmental samples. Pressurised liquid extraction (PLE) is an analyte- and matrix-independent technique which provides cleaner extracts than the time-consuming classical

  11. Base excision repair efficiency and mechanism in nuclear extracts are influenced by the ratio between volume of nuclear extraction buffer and nuclei-Implications for comparative studies

    DEFF Research Database (Denmark)

    Akbari, Mansour; Krokan, Hans E

    2012-01-01

    The base excision repair (BER) pathway corrects many different DNA base lesions and is important for genomic stability. The mechanism of BER cannot easily be investigated in intact cells, and therefore in vitro methods that reflect the in vivo processes are in high demand. Reconstitution of BER using purified proteins essentially mirrors the properties of the proteins used, and does not necessarily reflect the mechanism as it occurs in the cell. Nuclear extracts from cultured cells have the capacity to carry out complete BER and can give important information on the mechanism. Furthermore, candidate proteins in extracts can be inhibited or depleted in a controlled way, making defined extracts an important source for mechanistic studies. The major drawback is that there is no standardized method of preparing nuclear extract for BER studies, and it does not appear to be a topic given much...

  12. Twenty-third water reactor safety information meeting: Volume 2, Human factors research; Advanced I and C hardware and software; Severe accident research; Probabilistic risk assessment topics; Individual plant examination: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Monteleone, S. [comp.] [Brookhaven National Lab., Upton, NY (United States)

    1996-03-01

    This three-volume report contains papers presented at the Twenty-Third Water Reactor Safety Information Meeting held at the Bethesda Marriott Hotel, Bethesda, Maryland, October 23-25, 1995. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Italy, Japan, Norway, Russia, Sweden, and Switzerland. This document, Volume 2, presents topics in human factors research; advanced instrumentation and control hardware and software; severe accident research; probabilistic risk assessment; and individual plant examination. Individual papers have been cataloged separately.

  13. A SIMPLE AND FAST EXTRACTION METHOD FOR ORGANOCHLORINE PESTICIDES AND POLYCHLORINATED BIPHENYLS IN SMALL VOLUMES OF AVIAN SERUM

    Science.gov (United States)

    A solid-phase extraction (SPE) method was developed using 8 M urea to desorb and extract organochlorine pesticides (OCs) and polychlorinated biphenyls (PCBs) from avian serum for analysis by capillary gas chromatography with electron capture detection (GC-ECD). The analytes were ...

  14. Task-specific feature extraction and classification of fMRI volumes using a deep neural network initialized with a deep belief network: Evaluation using sensorimotor tasks.

    Science.gov (United States)

    Jang, Hojin; Plis, Sergey M; Calhoun, Vince D; Lee, Jong-Hwan

    2017-01-15

    Feedforward deep neural networks (DNNs), artificial neural networks with multiple hidden layers, have recently demonstrated record-breaking performance in multiple application areas, including computer vision and speech processing. Following these successes, DNNs have been applied to neuroimaging modalities including functional/structural magnetic resonance imaging (MRI) and positron-emission tomography data. However, no study has explicitly applied DNNs to 3D whole-brain fMRI volumes and thereby extracted hidden volumetric representations of fMRI that are discriminative for a task performed as the fMRI volume was acquired. Our study applied a fully connected feedforward DNN to fMRI volumes collected in four sensorimotor tasks (i.e., left-hand clenching, right-hand clenching, auditory attention, and visual stimulus) undertaken by 12 healthy participants. Using a leave-one-subject-out cross-validation scheme, a restricted Boltzmann machine-based deep belief network was pretrained and used to initialize the weights of the DNN. The pretrained DNN was fine-tuned while systematically controlling weight-sparsity levels across hidden layers. Optimal weight-sparsity levels were determined from a minimum validation error rate of fMRI volume classification. Minimum error rates (mean±standard deviation; %) of 6.9 (±3.8) were obtained from the three-layer DNN with the sparsest condition of weights across the three hidden layers. These error rates were even lower than the error rates from the single-layer network (9.4±4.6) and the two-layer network (7.4±4.1). The estimated DNN weights showed spatial patterns that are remarkably task-specific, particularly in the higher layers. The output values of the third hidden layer represented distinct patterns/codes of the 3D whole-brain fMRI volume and encoded the information of the tasks as evaluated from representational similarity analysis. Our reported findings show the ability of the DNN to classify a single fMRI volume based on the ...
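    The leave-one-subject-out cross-validation scheme used in the study is straightforward to sketch; for its 12 participants it yields 12 train/test splits, each holding out one subject. The split logic below is generic, not the authors' code:

```python
def leave_one_subject_out(subject_ids):
    """Yield (train, test) splits in which each subject is held out
    exactly once, as in leave-one-subject-out cross-validation."""
    unique = sorted(set(subject_ids))
    for held_out in unique:
        train = [s for s in unique if s != held_out]
        yield train, [held_out]

subjects = list(range(1, 13))               # 12 healthy participants
splits = list(leave_one_subject_out(subjects))
```

Each split trains on 11 subjects and validates on the remaining one, so the reported error rates are means over 12 folds.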

  15. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited to application at the software-architecture level. • A template for failure modes of a specific software language is established. • A detailed-level software FMEA analysis of nuclear safety software is presented. - Abstract: A method of software safety analysis is described in this paper for safety-related application software. The target software system is the software code installed in the Automatic Test and Interface Processor (ATIP) of a digital reactor protection system (DRPS). For the ATIP software safety analysis, first an overall safety (hazard) analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effects Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA is carried out based on a so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) of the ATIP software. The software FMEA analysis, applied to the ATIP software code, which has been integrated and has passed through a very rigorous system test procedure, is proven to be able to provide valuable results (i.e., software defects) that could not be identified during the various system tests.
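    The failure-mode template idea, one set of failure modes per FBD function-block type, expanded into worksheet rows per block instance, can be sketched as a data structure (field names and the example modes are illustrative, not taken from the paper):

```python
# Hypothetical shape of a failure-mode template entry for an FBD
# function block; field names are illustrative, not from the paper.
FAILURE_MODE_TEMPLATE = {
    "AND": [
        {"failure_mode": "output stuck at TRUE",
         "local_effect": "downstream logic sees a permanent demand",
         "system_effect": "spurious trip signal",
         "detection": "output comparison during system test"},
        {"failure_mode": "output stuck at FALSE",
         "local_effect": "demand never propagated",
         "system_effect": "failure to actuate on demand",
         "detection": "periodic surveillance test"},
    ],
}

def worksheet_rows(template, block_instances):
    """Expand the per-block-type template into FMEA worksheet rows,
    one row per (block instance, failure mode) pair."""
    rows = []
    for instance, block_type in block_instances:
        for mode in template.get(block_type, []):
            rows.append({"instance": instance, "block": block_type, **mode})
    return rows

rows = worksheet_rows(FAILURE_MODE_TEMPLATE,
                      [("trip_vote_1", "AND"), ("trip_vote_2", "AND")])
```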

  16. Joint Logistics Commanders’ Biennial Software Workshop (4th) Orlando II: Solving the PDSS (Post Deployment Software Support) Challenge Held in Orlando, Florida on 27-29 January 87. Volume 2. Proceedings

    Science.gov (United States)

    1987-06-01

    described the state of maturity of software engineering as being equivalent to the state of maturity of Civil Engineering before Pythagoras invented the ... formal verification languages, theorem provers or secure configuration management tools would have to be maintained and used in the PDSS Center to

  17. A computer-aided system for automatic extraction of femur neck trabecular bone architecture using isotropic volume construction from clinical hip computed tomography images.

    Science.gov (United States)

    Vivekanandhan, Sapthagirivasan; Subramaniam, Janarthanam; Mariamichael, Anburajan

    2016-10-01

    Hip fractures due to osteoporosis are increasing progressively across the globe. It is also difficult for fractured patients to undergo dual-energy X-ray absorptiometry scans, owing to the complicated protocol and associated cost. The utilisation of computed tomography for fracture treatment has become common in clinical practice. It would be helpful for orthopaedic clinicians if they could get some additional information related to bone strength for better treatment planning. The aim of our study was to develop an automated system to segment the femoral neck region, extract the cortical and trabecular bone parameters, and assess bone strength using an isotropic volume construction from clinical computed tomography images. Right hip computed tomography and right femur dual-energy X-ray absorptiometry measurements were taken from 50 south-Indian females aged 30-80 years. Each computed tomography image volume was re-constructed to form isotropic volumes. An automated system incorporating active contour models was used to segment the neck region. A minimum distance boundary method was applied to isolate the cortical and trabecular bone components. The trabecular bone was enhanced and segmented using a trabecular enrichment approach. The cortical and trabecular bone features were extracted and statistically compared with the dual-energy X-ray absorptiometry-measured femur neck bone mineral density. The extracted bone measures demonstrated a significant correlation with neck bone mineral density (r > 0.7). These measures, derived from clinical computed tomography images scanned with low dose, could eventually be helpful in osteoporosis diagnosis and its treatment planning. © IMechE 2016.

  18. Solid phase extraction of large volume of water and beverage samples to improve detection limits for GC-MS analysis of bisphenol A and four other bisphenols.

    Science.gov (United States)

    Cao, Xu-Liang; Popovic, Svetlana

    2018-01-01

    Solid phase extraction (SPE) of large volumes of water and beverage products was investigated for the GC-MS analysis of bisphenol A (BPA), bisphenol AF (BPAF), bisphenol F (BPF), bisphenol E (BPE), and bisphenol B (BPB). While absolute recoveries of the method were improved for water and some beverage products (e.g. diet cola, iced tea), breakthrough may also have occurred during SPE of 200 mL of other beverages (e.g. BPF in cola). Improvements in method detection limits were observed with the analysis of large sample volumes for all bisphenols, at ppt (pg/g) to sub-ppt levels. This improvement was found to be proportional to sample volume for water and beverage products, with less interference and lower noise levels around the analytes. Matrix effects and interferences were observed during SPE of larger volumes (100 and 200 mL) of the beverage products, and affected the accurate analysis of BPF. This improved method was used to analyse bisphenols in various beverage samples; only BPA was detected, with levels ranging from 0.022 to 0.030 ng/g for products in PET bottles, and 0.085 to 0.32 ng/g for products in cans.

  19. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  20. Upgrade Software and Computing

    CERN Document Server

    The LHCb Collaboration, CERN

    2018-01-01

    This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis.

  1. Revisiting software ecosystems research

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2016-01-01

    ‘Software ecosystems’ is argued to have first appeared as a concept more than 10 years ago, and software ecosystem research started to take off in 2010. We conduct a systematic literature study, based on the most extensive literature review in the field to date, with two primary aims: (a) to provide an updated overview of the field and (b) to document evolution in the field. In total, we analyze 231 papers from 2007 until 2014 and provide an overview of the research in software ecosystems. Our analysis reveals a field that is rapidly growing both in volume and empirical focus while becoming more mature ... from evolving. We propose means for future research and the community to address them. Finally, our analysis shapes the view of the field as having evolved outside the existing definitions of software ecosystems, and we thus propose an update of the definition of software ecosystems.

  2. Performance of new automated transthoracic three-dimensional echocardiographic software for left ventricular volumes and function assessment in routine clinical practice: Comparison with 3 Tesla cardiac magnetic resonance.

    Science.gov (United States)

    Levy, Franck; Dan Schouver, Elie; Iacuzio, Laura; Civaia, Filippo; Rusek, Stephane; Dommerc, Carinne; Marechaux, Sylvestre; Dor, Vincent; Tribouilloy, Christophe; Dreyfus, Gilles

    2017-11-01

    Three-dimensional (3D) transthoracic echocardiography (TTE) is superior to the two-dimensional Simpson's method for assessment of left ventricular (LV) volumes and LV ejection fraction (LVEF). Nevertheless, 3D TTE is not incorporated into everyday practice, as current LV chamber quantification software products are time-consuming. To evaluate the feasibility, accuracy and reproducibility of new fully automated fast 3D TTE software (HeartModel A.I.; Philips Healthcare, Andover, MA, USA) for quantification of LV volumes and LVEF in routine practice; to compare the 3D LV volumes and LVEF obtained with a cardiac magnetic resonance (CMR) reference; and to optimize automated default border settings with CMR as reference. Sixty-three consecutive patients, who had comprehensive 3D TTE and CMR examinations within 24 hours, were eligible for inclusion. Nine patients (14%) were excluded because of insufficient echogenicity in the 3D TTE. Thus, 54 patients (40 men; mean age 63±13 years) were prospectively included in the study. The inter- and intraobserver reproducibilities of 3D TTE were excellent (coefficient of variation < 10%) for end-diastolic volume (EDV), end-systolic volume (ESV) and LVEF. Despite a slight underestimation of EDV using 3D TTE compared with CMR (bias = -22±34 mL; P < 0.0001), a significant correlation was found between the two measurements (r = 0.93; P = 0.0001). Enlarging default border detection settings led to frequent volume overestimation in the general population, but improved agreement with CMR in patients with LVEF ≤ 50%. Correlations between 3D TTE and CMR for ESV and LVEF were excellent (r = 0.93 and r = 0.91, respectively; P < 0.0001). 3D TTE using new-generation fully automated software is a feasible, fast, reproducible and accurate imaging modality for LV volumetric quantification in routine practice. Optimization of border detection settings may increase agreement with CMR for EDV assessment in dilated ventricles. Copyright © 2017 Elsevier Masson
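    The agreement figures reported above (e.g. an EDV bias of -22±34 mL versus CMR) are the mean and standard deviation of paired differences; a minimal sketch with hypothetical paired volumes:

```python
from statistics import mean, stdev

def bias_and_sd(method_a, method_b):
    """Agreement between paired measurements: mean difference (bias)
    and SD of the differences, in the style of the reported EDV bias."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    return mean(diffs), stdev(diffs)

# Hypothetical paired end-diastolic volumes (mL): 3D TTE vs CMR
tte = [120, 150, 95, 180, 140]
cmr = [138, 170, 118, 200, 165]
bias, sd = bias_and_sd(tte, cmr)  # negative bias: TTE underestimates EDV
```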

  3. OPEN SOURCE SOFTWARE, FREE SOFTWARE?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    The enactment of Indonesia's Intellectual Property Rights Law (HAKI) has given rise to a new alternative: the use of open source software. The use of open source software is spreading in step with current global issues in Information and Communication Technology (ICT). Some organizations and companies have begun to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software that is unlicensed. These notions are not entirely accurate, so the concept of open source software needs to be introduced, covering its history, its licenses and how to choose one, and considerations in selecting among the available open source software. Keywords: license, open source, intellectual property rights (HAKI)

  4. Software Epistemology

    Science.gov (United States)

    2016-03-01

    in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment ... Acronyms: LTS, Label Transition System; MUSE, Mining and Understanding Software Enclaves; RTEMS, Real-Time Executive for Multi-processor Systems; SaaS, Software as a Service; SSA, Static Single Assignment; SWE, Software Epistemology; UD/DU, Def-Use/Use-Def Chains (dataflow graph)

  5. Extraction and formation dynamic of oak-related volatile compounds from different volume barrels to wine and their behavior during bottle storage.

    Science.gov (United States)

    Pérez-Prieto, Luis J; López-Roca, Jose M; Martínez-Cutillas, Adrián; Pardo-Mínguez, Francisco; Gómez-Plaza, Encarna

    2003-08-27

    The extraction rates of furfuryl aldehydes, guaiacol and 4-methylguaiacol, cis- and trans-oak lactone, and vanillin, and the formation rates of furfuryl alcohol and the volatile phenols 4-ethylguaiacol and 4-ethylphenol, have been studied in wines matured in oak barrels of different capacities (220, 500, and 1000 L). The behavior of these compounds during 1 year of wine bottle storage was also followed. The lactones were extracted at a linear rate, with large differences that depended on barrel volume. The compounds related to oak toasting (guaiacol, 4-methylguaiacol, furfuryl aldehydes, and vanillin) seemed to be extracted faster during the first days of oak maturation, except for vanillin, which required at least 3 months to accumulate in the wine. The volatile phenols, 4-ethylphenol and 4-ethylguaiacol, were formed in large quantities after the first 90 days of oak maturation, coinciding with the end of spring and the beginning of summer. Wines matured in 1000-L oak barrels showed the lowest levels of volatile compound accumulation. During bottle storage, some compounds decreased in concentration (5-methylfurfural, vanillin), others increased (lactones, furfural, 4-ethylguaiacol, 4-ethylphenol), and the concentration of other compounds hardly changed (guaiacol, furfuryl alcohol).

  6. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth models.

  7. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  8. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, volume 2, part 1. Appendix A: Software documentation

    Science.gov (United States)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.

  9. Three-dimensional computer reconstruction of large tissue volumes based on composing series of high-resolution confocal images by GlueMRC and LinkMRC software

    Czech Academy of Sciences Publication Activity Database

    Karen, Petr; Jirkovská, M.; Tomori, Z.; Demjénová, E.; Janáček, Jiří; Kubínová, Lucie

    2003-01-01

    Vol. 62, No. 5 (2003), p. 415-422, ISSN 1059-910X R&D Projects: GA ČR GA304/01/0257 Grant - others: VEGA(SK) 2/1146/21; CZ-SK GA MŠk(CZ) KONTAKT 126/184 Institutional research plan: CEZ:AV0Z5011922 Keywords: 3D reconstruction * confocal microscopy * image processing Subject RIV: JC - Computer Hardware; Software Impact factor: 2.307, year: 2003

  10. Genotyping for DQA1 and PM loci in urine using PCR-based amplification: effects of sample volume, storage temperature, preservatives, and aging on DNA extraction and typing.

    Science.gov (United States)

    Vu, N T; Chaturvedi, A K; Canfield, D V

    1999-05-31

    Urine is often the sample of choice for drug screening in aviation/general forensic toxicology and in workplace drug testing. In some instances, the origin of the submitted samples may be challenged because of the medicolegal and socioeconomic consequences of a positive drug test. Methods for individualization of biological samples have reached a new boundary with the application of the polymerase chain reaction (PCR) in DNA profiling, but a successful characterization of the urine specimens depends on the quantity and quality of DNA present in the samples. Therefore, the present study investigated the influence of storage conditions, sample volume, concentration modes, extraction procedures, and chemical preservatives on the quantity of DNA recovered, as well as the success rate of PCR-based genotyping for DQA1 and PM loci in urine. Urine specimens from male and female volunteers were divided and stored at various temperatures for up to 30 days. The results suggested that sample purification by diafiltration, using 3000-100,000 molecular weight cut-off filters, did not enhance DNA recovery and typing rate as compared with simple centrifugation procedures. Extraction of urinary DNA by the organic method and by the resin method gave comparable typing results. Larger sample volume yielded a higher amount of DNA, but the typing rates were not affected for sample volumes between 1 and 5 ml. The quantifiable amounts of DNA present were found to be greater in female (14-200 ng/ml) than in male (4-60 ng/ml) samples and decreased with the elapsed time under both room temperature (RT) and frozen storage. Typing of the male samples also demonstrated that RT storage produced significantly higher success rates than frozen storage, while there was only a marginal difference in the DNA typing rates among the conditions tested using female samples. Successful assignment of DQA1 + PM genotype was achieved for all samples of fresh urine, independent of gender.

  11. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  12. Software engineering design theory and practice

    CERN Document Server

    Otero, Carlos

    2012-01-01

    … intended for use as a textbook for an advanced course in software design. Each chapter ends with review questions and references. … provides an overview of the software development process, something that would not be out of line in a course on software engineering including such topics as software process, software management, balancing conflicting values of stakeholders, testing, quality, and ethics. The author has principally focused on software design though, extracting the design phase from the surrounding software development lifecycle. … Software design strategies are addressed

  13. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

      Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application.  Traditional forms of technical education pay little attention to creativity - often encouraging overly...

  14. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  15. Evidence synthesis software.

    Science.gov (United States)

    Park, Sophie Elizabeth; Thomas, James

    2018-06-07

    It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways. Here, a range of review conditions and matching software solutions is considered: for example, facilitating contemporaneous collaboration across time and geographical space; in-built bias assessment tools; and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer has text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released. The EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews.

  16. Results of a Survey Software Development Project Management in the U.S. Aerospace Industry. Volume II. Project Management Techniques, Procedures and Tools.

    Science.gov (United States)

    1979-12-18


  17. Combined sabal and urtica extract compared with finasteride in men with benign prostatic hyperplasia: analysis of prostate volume and therapeutic outcome.

    Science.gov (United States)

    Sökeland, J

    2000-09-01

    To test the hypothesis that in patients with benign prostatic hyperplasia (BPH), the outcome of drug therapy with finasteride may be predictable from the baseline prostate volume and that positive clinical effects might be expected only in patients with prostate volumes of > 40 mL, using a subgroup analysis of results from a previously reported clinical trial of finasteride and phytotherapy. A subgroup of 431 patients was analysed from a randomized, multicentre, double-blind clinical trial involving 543 patients with the early stages of BPH. Patients received a fixed combination of extracts of saw palmetto fruit (Serenoa repens) and nettle root (Urtica dioica) (PRO 160/120) or the synthetic 5alpha-reductase inhibitor finasteride. The patients assessed had valid ultrasonographic measurements and baseline prostate volumes of either < 40 mL or > 40 mL. All 516 patients were included in the safety analysis. The results of the original trial showed equivalent efficacy for both treatments. The mean (SD) maximum urinary flow (the main outcome variable) increased (from baseline values) after 24 weeks by 1.9 (5.6) mL/s with PRO 160/120 and by 2.4 (6.3) mL/s with finasteride. There were no statistically significant group differences (P = 0.52). The increases in the subgroups with small (< 40 mL) and large (> 40 mL) prostates were similar, at 2.3 (6.1) and 2.2 (5.3) mL/s, respectively. There were improvements in the International Prostate Symptom Score in both treatment groups, with no statistically significant differences. The subgroup analysis showed slightly better results for voiding symptoms in the patients with prostates of > 40 mL, but there were also improvements in the subgroup with smaller prostates. The safety analysis showed that more patients in the finasteride group reported adverse events, and there were also more adverse events in this group than in patients treated with PRO 160/120. The present analysis showed that the efficacy of both PRO 160/120 and finasteride was equivalent and unrelated to prostate volume.

  18. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cycle.

  19. Twenty-second water reactor safety information meeting: Proceedings. Volume 1: Plenary session; Advanced instrumentation and control hardware and software; Human factors research; IPE and PRA

    Energy Technology Data Exchange (ETDEWEB)

    Monteleone, S. [comp.] [Brookhaven National Lab., Upton, NY (United States)

    1995-04-01

    This three-volume report contains papers presented at the Twenty-Second Water Reactor Safety Information Meeting held at the Bethesda Marriott Hotel, Bethesda, Maryland, during the week of October 24--26, 1994. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from Finland, France, Italy, Japan, Russia, and the United Kingdom. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  20. Twenty-second water reactor safety information meeting: Proceedings. Volume 1: Plenary session; Advanced instrumentation and control hardware and software; Human factors research; IPE and PRA

    International Nuclear Information System (INIS)

    Monteleone, S.

    1995-04-01

    This three-volume report contains papers presented at the Twenty-Second Water Reactor Safety Information Meeting held at the Bethesda Marriott Hotel, Bethesda, Maryland, during the week of October 24--26, 1994. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from Finland, France, Italy, Japan, Russia, and the United Kingdom. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database

  1. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  2. Software Reviews.

    Science.gov (United States)

    Davis, Shelly J., Ed.; Knaupp, Jon, Ed.

    1984-01-01

    Reviewed is computer software on: (1) classification of living things, a tutorial program for grades 5-10; and (2) polynomial practice using tiles, a drill-and-practice program for algebra students. (MNS)

  3. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Three pieces of computer software are described and reviewed: HyperCard, to build and use varied applications; Iggy's Gnees, for problem solving with shapes in grades kindergarten-two; and Algebra Shop, for practicing skills and problem solving. (MNS)

  4. Automatic extraction of forward stroke volume using dynamic PET/CT: a dual-tracer and dual-scanner validation in patients with heart valve disease.

    Science.gov (United States)

    Harms, Hendrik Johannes; Tolbod, Lars Poulsen; Hansson, Nils Henrik Stubkjær; Kero, Tanja; Orndahl, Lovisa Holm; Kim, Won Yong; Bjerner, Tomas; Bouchelouche, Kirsten; Wiggers, Henrik; Frøkiær, Jørgen; Sörensen, Jens

    2015-12-01

    The aim of this study was to develop and validate an automated method for extracting forward stroke volume (FSV) using indicator dilution theory directly from dynamic positron emission tomography (PET) studies for two different tracers and scanners. 35 subjects underwent a dynamic (11)C-acetate PET scan on a Siemens Biograph TruePoint-64 PET/CT (scanner I). In addition, 10 subjects underwent both dynamic (15)O-water PET and (11)C-acetate PET scans on a GE Discovery-ST PET/CT (scanner II). The left ventricular (LV)-aortic time-activity curve (TAC) was extracted automatically from PET data using cluster analysis. The first-pass peak was isolated by automatic extrapolation of the downslope of the TAC. FSV was calculated as the injected dose divided by the product of heart rate and the area under the curve of the first-pass peak. Gold-standard FSV was measured using phase-contrast cardiovascular magnetic resonance (CMR). FSVPET correlated highly with FSVCMR (r = 0.87, slope = 0.90 for scanner I; r = 0.87, slope = 1.65, and r = 0.85, slope = 1.69 for scanner II for (15)O-water and (11)C-acetate, respectively), although a systematic bias was observed for both scanners. FSV can thus be extracted automatically from dynamic PET/CT using cluster analysis. Results are almost identical for (11)C-acetate and (15)O-water. A scanner-dependent bias was observed, and a scanner calibration factor is required for multi-scanner studies. Generalization of the method to other tracers and scanners requires further validation.
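    The indicator-dilution relation quoted in this abstract (FSV equals the injected dose divided by the product of heart rate and the first-pass AUC) can be sketched as a minimal calculation. The function name, units and numbers below are illustrative assumptions, not values from the study:

    ```python
    # Forward stroke volume (FSV) from indicator dilution theory:
    #   FSV = injected dose / (heart rate * AUC of the first-pass peak)

    def forward_stroke_volume(injected_dose, heart_rate_bpm, first_pass_auc):
        """FSV in mL per beat.

        injected_dose  : injected activity (e.g. MBq)
        heart_rate_bpm : heart rate in beats per minute
        first_pass_auc : area under the first-pass peak of the LV-aortic
                         time-activity curve, in (activity/mL) * minutes
        """
        return injected_dose / (heart_rate_bpm * first_pass_auc)

    # Illustrative numbers only:
    fsv = forward_stroke_volume(400.0, 60.0, 0.08)
    print(round(fsv, 1))  # 400 / (60 * 0.08) = 83.3 mL per beat
    ```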

  5. Sample Preparation and Extraction in Small Sample Volumes Suitable for Pediatric Clinical Studies: Challenges, Advances, and Experiences of a Bioanalytical HPLC-MS/MS Method Validation Using Enalapril and Enalaprilat

    Science.gov (United States)

    Burckhardt, Bjoern B.; Laeer, Stephanie

    2015-01-01

    In USA and Europe, medicines agencies force the development of child-appropriate medications and intend to increase the availability of information on the pediatric use. This asks for bioanalytical methods which are able to deal with small sample volumes as the trial-related blood lost is very restricted in children. Broadly used HPLC-MS/MS, being able to cope with small volumes, is susceptible to matrix effects. The latter restrains the precise drug quantification through, for example, causing signal suppression. Sophisticated sample preparation and purification utilizing solid-phase extraction was applied to reduce and control matrix effects. A scale-up from vacuum manifold to positive pressure manifold was conducted to meet the demands of high-throughput within a clinical setting. Faced challenges, advances, and experiences in solid-phase extraction are exemplarily presented on the basis of the bioanalytical method development and validation of low-volume samples (50 μL serum). Enalapril, enalaprilat, and benazepril served as sample drugs. The applied sample preparation and extraction successfully reduced the absolute and relative matrix effect to comply with international guidelines. Recoveries ranged from 77 to 104% for enalapril and from 93 to 118% for enalaprilat. The bioanalytical method comprising sample extraction by solid-phase extraction was fully validated according to FDA and EMA bioanalytical guidelines and was used in a Phase I study in 24 volunteers. PMID:25873972
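    The recovery figures reported above (77-104% for enalapril, 93-118% for enalaprilat) follow the usual validation definition: measured concentration as a percentage of the nominal spiked concentration. A minimal sketch, with illustrative values not taken from the study:

    ```python
    # Extraction recovery as used in bioanalytical method validation:
    #   recovery (%) = measured concentration / nominal (spiked) concentration * 100

    def recovery_percent(measured, nominal):
        return 100.0 * measured / nominal

    # Illustrative values (ng/mL):
    print(round(recovery_percent(38.5, 50.0), 1))  # 77.0
    ```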

  6. Sample Preparation and Extraction in Small Sample Volumes Suitable for Pediatric Clinical Studies: Challenges, Advances, and Experiences of a Bioanalytical HPLC-MS/MS Method Validation Using Enalapril and Enalaprilat

    Directory of Open Access Journals (Sweden)

    Bjoern B. Burckhardt

    2015-01-01

    Full Text Available In USA and Europe, medicines agencies force the development of child-appropriate medications and intend to increase the availability of information on the pediatric use. This asks for bioanalytical methods which are able to deal with small sample volumes as the trial-related blood lost is very restricted in children. Broadly used HPLC-MS/MS, being able to cope with small volumes, is susceptible to matrix effects. The latter restrains the precise drug quantification through, for example, causing signal suppression. Sophisticated sample preparation and purification utilizing solid-phase extraction was applied to reduce and control matrix effects. A scale-up from vacuum manifold to positive pressure manifold was conducted to meet the demands of high-throughput within a clinical setting. Faced challenges, advances, and experiences in solid-phase extraction are exemplarily presented on the basis of the bioanalytical method development and validation of low-volume samples (50 μL serum. Enalapril, enalaprilat, and benazepril served as sample drugs. The applied sample preparation and extraction successfully reduced the absolute and relative matrix effect to comply with international guidelines. Recoveries ranged from 77 to 104% for enalapril and from 93 to 118% for enalaprilat. The bioanalytical method comprising sample extraction by solid-phase extraction was fully validated according to FDA and EMA bioanalytical guidelines and was used in a Phase I study in 24 volunteers.

  7. Determination of tributyltin in environmental water matrices using stir bar sorptive extraction with in-situ derivatisation and large volume injection-gas chromatography-mass spectrometry.

    Science.gov (United States)

    Neng, N R; Santalla, R P; Nogueira, J M F

    2014-08-01

    Stir bar sorptive extraction with in-situ derivatization using sodium tetrahydridoborate (NaBH4), followed by liquid desorption and large volume injection-gas chromatography-mass spectrometry detection under the selected ion monitoring mode (SBSE(NaBH4)in-situ-LD/LVI-GC-MS(SIM)), was successfully developed for the determination of tributyltin (TBT) in environmental water matrices. NaBH4 proved to be an effective and easy in-situ speciation agent for TBT in aqueous media, allowing the formation of adducts with enough stability and suitable polarity for SBSE analysis. Assays performed on water samples spiked at the 10.0 μg/L level yielded convenient recoveries (68.2 ± 3.0%), good accuracy, suitable precision (RSD < 9.0%), low detection limits (23 ng/L) and an excellent linear dynamic range (r(2) = 0.9999) from 0.1 to 170.0 μg/L under optimized experimental conditions. By using the standard addition method, the application of the present methodology to real surface water samples gave very good performance at the trace level. The proposed methodology proved to be a feasible alternative for routine quality control analysis: easy to implement, reliable, and sensitive enough to monitor TBT in environmental water matrices. Copyright © 2014 Elsevier B.V. All rights reserved.
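    The standard addition method mentioned in the abstract quantifies an analyte in a matrix-affected sample by spiking known amounts, fitting signal against added concentration, and reading the original concentration from the x-intercept (intercept divided by slope). A generic sketch of that calculation, with invented data, not the authors' implementation:

    ```python
    # Standard addition: fit signal vs. added concentration by least squares;
    # the sample's original concentration is the magnitude of the x-intercept,
    # i.e. intercept / slope.

    def standard_addition(added, signal):
        n = len(added)
        mean_x = sum(added) / n
        mean_y = sum(signal) / n
        slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(added, signal))
                 / sum((x - mean_x) ** 2 for x in added))
        intercept = mean_y - slope * mean_x
        return intercept / slope  # original concentration in the sample

    # Illustrative data (concentration added vs. detector response):
    conc = standard_addition([0.0, 5.0, 10.0, 20.0], [4.0, 9.0, 14.0, 24.0])
    print(round(conc, 2))  # slope 1, intercept 4 -> 4.0
    ```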

  8. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta vision of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  9. Software Authentication

    International Nuclear Information System (INIS)

    Wolford, J.K.; Geelhood, B.D.; Hamilton, V.A.; Ingraham, J.; MacArthur, D.W.; Mitchell, D.J.; Mullens, J.A.; Vanier, P. E.; White, G.K.; Whiteson, R.

    2001-01-01

    The effort to define guidance for authentication of software for arms control and nuclear material transparency measurements draws on a variety of disciplines and has involved synthesizing established criteria and practices with newer methods. Challenges include the need to protect classified information that the software manipulates as well as deal with the rapid pace of innovation in the technology of nuclear material monitoring. The resulting guidance will shape the design of future systems and inform the process of authentication of instruments now being developed. This paper explores the technical issues underlying the guidance and presents its major tenets

  10. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  11. Reviews, Software.

    Science.gov (United States)

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  12. Software Reviews.

    Science.gov (United States)

    Kinnaman, Daniel E.; And Others

    1988-01-01

    Reviews four educational software packages for Apple, IBM, and Tandy computers. Includes "How the West was One + Three x Four,""Mavis Beacon Teaches Typing,""Math and Me," and "Write On." Reviews list hardware requirements, emphasis, levels, publisher, purchase agreements, and price. Discusses the strengths…

  13. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game…

  14. Software Reviews.

    Science.gov (United States)

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  15. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    is automatically generated. Furthermore, MIAWARE software is accompanied with an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through the specially developed ontology. As a result...

  16. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01


  17. A novel optimised and validated method for analysis of multi-residues of pesticides in fruits and vegetables by microwave-assisted extraction (MAE)-dispersive solid-phase extraction (d-SPE)-retention time locked (RTL)-gas chromatography-mass spectrometry with Deconvolution reporting software (DRS).

    Science.gov (United States)

    Satpathy, Gouri; Tyagi, Yogesh Kumar; Gupta, Rajinder Kumar

    2011-08-01

    A rapid, effective and ecofriendly method for sensitive screening and quantification of residues of 72 pesticides in fruits and vegetables, by microwave-assisted extraction (MAE) followed by dispersive solid-phase extraction (d-SPE) and retention-time-locked (RTL) capillary gas-chromatographic separation with trace-ion-mode mass spectrometric determination, has been validated as per ISO/IEC 17025:2005. Identification and reporting with total and extracted ion chromatograms were facilitated to a great extent by Deconvolution Reporting Software (DRS). For all compounds, LODs were 0.002-0.02 mg/kg and LOQs were 0.025-0.100 mg/kg. Correlation coefficients of the calibration curves in the range of 0.025-0.50 mg/kg were >0.993. To validate matrix effects, repeatability, reproducibility, recovery and overall uncertainty were calculated for the 35 matrices at 0.025, 0.050 and 0.100 mg/kg. Recovery ranged between 72% and 114% with RSD of <20% for repeatability and intermediate precision. The reproducibility of the method was evaluated by inter-laboratory participation, and the Z scores obtained were within ±2. Copyright © 2011 Elsevier Ltd. All rights reserved.
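    The inter-laboratory Z score cited above is the standard proficiency-test statistic: the laboratory's result minus the assigned value, divided by the standard deviation for proficiency assessment, with |z| ≤ 2 conventionally rated satisfactory. A minimal sketch with illustrative numbers, not values from the study:

    ```python
    # Proficiency-test z-score used in inter-laboratory comparisons:
    #   z = (lab result - assigned value) / standard deviation for proficiency
    # |z| <= 2 is conventionally considered satisfactory.

    def z_score(lab_result, assigned_value, sigma_pt):
        return (lab_result - assigned_value) / sigma_pt

    # Illustrative values (mg/kg):
    z = z_score(0.055, 0.050, 0.005)
    print(abs(z) <= 2)  # True
    ```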

  18. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

    The support of the EPIQR method is a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  19. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, communication switchboards). Extensive knowledge and practical experience of digital long-term preservation technologies have also been acquired. This wide spectrum of activities puts us in the position to discuss an often overlooked aspect of digital preservation - the preservation of software programs. There are many resources dedicated to the digital preservation of data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - dynamic response to inputs - render computer programs rich compared to documents or linear multimedia. The article opens questions on the beginning of the way to permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ..., where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  20. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

    This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  1. Software Prototyping

    Science.gov (United States)

    Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R.

    2016-01-01

    Summary Background Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. Objective To describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Methods Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Results Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included having 1) subject-matter experts heavily involved with the design; 2) flexibility to make rapid changes, 3) the ability to minimize software development efforts early in the design stage; 4) rapid finalization of requirements; 5) early visualization of designs; 6) and a powerful vehicle for communication of the design to the programmers. Challenges included 1) time and effort to develop the prototypes and case scenarios; 2) no simulation of system performance; 3) not having all proposed functionality available in the final product; and 4) missing needed data elements in the PCC information system. PMID:27081404

  2. Formal Verification of Mathematical Software. Volume 2

    Science.gov (United States)

    1990-05-01

    iter ZERO f s = s    iter (SUCC n) f s = iter n f (f s)
    PROVE 'nplus ZERO n' = 'n'    PROVE 'nless (SUCC n) (SUCC m)' = 'nless n m'    PROVE 'nless n (SUCC m)' = 'true'

  3. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  4. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  5. Visual querying and analysis of large software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on

  6. FY1995 study of very flexible software structures based on soft-software components; 1995 nendo yawarankana software buhin ni motozuku software no choju kozo ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

The purpose of this study is to develop methods and tools for changing software structure flexibly along with the continuous change of its environment and conditions of use. The goal is software of very high adaptability achieved through soft-software components and flexible assembly. The CASE tool platform Sapid, based on a fine-grained repository, was developed and applied to raising the abstraction level of program code and to mining potential flexible components. To reconstruct software adaptable to a required environment, the SQM (Software Quark Model) was used to manage interconnectivity and other semantic relationships among components. On these two basic systems, we developed various methods and tools, such as those for static and dynamic analysis of very flexible software structures, program transformation description, program pattern extraction and composition, component optimization by partial evaluation, component extraction by function slicing, code encapsulation, and component navigation and application. (NEDO)

  7. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

software license, software usage, ELA, Software as a Service, SaaS, Software Asset...PaaS Platform as a Service SaaS Software as a Service SAM Software Asset Management SMS System Management Server SEWP Solutions for Enterprise Wide...delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service. Software

  8. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  9. Advances in software science and technology

    CERN Document Server

    Hikita, Teruo; Kakuda, Hiroyasu

    1993-01-01

    Advances in Software Science and Technology, Volume 4 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems.Organized into two parts encompassing 10 chapters, this volume begins with an overview of the historical survey of programming languages for vector/parallel computers in Japan and describes compiling methods for supercomputers in Japan. This text then explains the model of a Japanese software factory, which is presented by the logical configuration that has been satisfied by

  10. Multi-Level Formation of Complex Software Systems

    Directory of Open Access Journals (Sweden)

    Hui Li

    2016-05-01

Full Text Available We present a multi-level formation model for complex software systems. Previous works extract software systems into software networks for further study, but usually investigate the networks at the class level. In contrast to these works, our treatment of software systems as multi-level networks is more realistic. In particular, the software networks are organized at three levels of granularity, representing the modularity and hierarchy in the formation process of real-world software systems. More importantly, simulations based on this model have generated more realistic structural properties of software networks, such as power-law, clustering and modularization. On the basis of this model, we then explore how the structure of software systems affects software design principles, which could be helpful for understanding software evolution and software engineering practices.
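The structural properties named in this record, degree distribution and clustering, can be computed directly from a class-level dependency graph. A minimal sketch on a hypothetical five-class network (not the paper's model or data):

```python
from collections import defaultdict
from itertools import combinations

# A hypothetical class-level dependency network (undirected edges).
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]
adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def clustering(node):
    """Fraction of a node's neighbour pairs that are themselves linked."""
    nbrs = adj[node]
    if len(nbrs) < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return links / (len(nbrs) * (len(nbrs) - 1) / 2)

# Degree distribution: the raw input to a power-law check.
degrees = {n: len(adj[n]) for n in adj}
print(degrees["C"], clustering("A"))
```

Node C has degree 3; A's two neighbours (B, C) are linked, so its clustering coefficient is 1.0.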

  11. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  12. Automated Fault Interpretation and Extraction using Improved Supplementary Seismic Datasets

    Science.gov (United States)

    Bollmann, T. A.; Shank, R.

    2017-12-01

    During the interpretation of seismic volumes, it is necessary to interpret faults along with horizons of interest. With the improvement of technology, the interpretation of faults can be expedited with the aid of different algorithms that create supplementary seismic attributes, such as semblance and coherency. These products highlight discontinuities, but still need a large amount of human interaction to interpret faults and are plagued by noise and stratigraphic discontinuities. Hale (2013) presents a method to improve on these datasets by creating what is referred to as a Fault Likelihood volume. In general, these volumes contain less noise and do not emphasize stratigraphic features. Instead, planar features within a specified strike and dip range are highlighted. Once a satisfactory Fault Likelihood Volume is created, extraction of fault surfaces is much easier. The extracted fault surfaces are then exported to interpretation software for QC. Numerous software packages have implemented this methodology with varying results. After investigating these platforms, we developed a preferred Automated Fault Interpretation workflow.
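The semblance attribute mentioned above quantifies the similarity of adjacent traces, dropping across faults. A minimal sketch of a simplified global semblance over a window of traces (this is not Hale's fault-likelihood algorithm, only the classic semblance ratio):

```python
import numpy as np

def semblance(traces):
    """Semblance of a set of adjacent seismic traces (time x trace array).

    S = sum_t (sum_i a[t,i])^2 / (N * sum_t sum_i a[t,i]^2);
    equals 1.0 for identical traces and drops across discontinuities.
    """
    traces = np.asarray(traces, dtype=float)
    n = traces.shape[1]
    num = (traces.sum(axis=1) ** 2).sum()
    den = n * (traces ** 2).sum()
    return num / den if den else 0.0

coherent = np.tile([[1.0], [2.0], [-1.0]], (1, 4))  # four identical traces
print(semblance(coherent))  # identical traces give semblance 1.0
```

In practice the ratio is evaluated in small sliding windows so that low-semblance voxels trace out fault surfaces.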

  13. Software and Network Engineering

    CERN Document Server

    2012-01-01

    The series "Studies in Computational Intelligence" (SCI) publishes new developments and advances in the various areas of computational intelligence – quickly and with a high quality. The intent is to cover the theory, applications, and design methods of computational intelligence, as embedded in the fields of engineering, computer science, physics and life science, as well as the methodologies behind them. The series contains monographs, lecture notes and edited volumes in computational intelligence spanning the areas of neural networks, connectionist systems, genetic algorithms, evolutionary computation, artificial intelligence, cellular automata, self-organizing systems, soft computing, fuzzy systems, and hybrid intelligent systems. Critical to both contributors and readers are the short publication time and world-wide distribution - this permits a rapid and broad dissemination of research results.   The purpose of the first ACIS International Symposium on Software and Network Engineering held on Decembe...

  14. ESTSC - Software Best Practices

    Science.gov (United States)

    DOE Scientific and Technical Software Best Practices December 2010 Table of Contents 1.0 Introduction 2.0 Responsibilities 2.1 OSTI/ESTSC 2.2 SIACs 2.3 Software Submitting Sites/Creators 2.4 Software Sensitivity Review 3.0 Software Announcement and Submission 3.1 STI Software Appropriate for Announcement 3.2

  15. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS...2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the...

  16. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
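The monitoring idea described above, checking an executing program against prespecified requirement constraints, can be sketched in miniature. The constraint names, state fields, and thresholds below are hypothetical illustrations; SAGE itself is not reproduced:

```python
# Each constraint is a named predicate over the program state, checked
# against every state snapshot as the program executes.
constraints = {
    "queue_bounded": lambda s: s["queue_len"] <= 100,
    "temp_in_range": lambda s: 0 <= s["temp"] <= 85,
}

def monitor(states):
    """Yield (step, constraint_name) for every violated constraint."""
    for step, state in enumerate(states):
        for name, pred in constraints.items():
            if not pred(state):
                yield step, name

trace = [{"queue_len": 10, "temp": 40},
         {"queue_len": 120, "temp": 40},
         {"queue_len": 5, "temp": 90}]
print(list(monitor(trace)))  # [(1, 'queue_bounded'), (2, 'temp_in_range')]
```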

  17. Software Atom: An approach towards software components structuring to improve reusability

    Directory of Open Access Journals (Sweden)

    Muhammad Hussain Mughal

    2017-12-01

Full Text Available The diversity of application domains compels the design of a sustainable classification scheme for a significantly growing software repository. Atomic reusable software components are articulated to improve software component reusability in a volatile industry. Numerous approaches to software classification have been proposed over past decades; each has limitations related to coupling and cohesion. In this paper, we propose a novel approach that constitutes software from radical functionalities to improve software reusability. We analyze the semantics of elements in the Periodic Table used in chemistry to design our classification approach, present it using tree-based classification to curtail the search-space complexity of the software repository, and refine it further with semantic search techniques. We developed a Globally Unique Identifier (GUID) for indexing functions and related components, and exploited the correlation between chemical elements and software elements to simulate a one-to-one mapping between them. Our approach is inspired by the sustainability of the chemical periodic table. We propose a software periodic table (SPT) representing atomic software components extracted from real application software. Parsing and extraction over the SPT-classified repository tree enable users to program their software by customizing the ingredients of their software requirements. The classified repository of software ingredients assists users in conveying their requirements to software engineers, and enables requirements engineers to develop rapid large-scale prototypes. Furthermore, we predict the usability of the categorized repository based on user feedback. The proposed repository will be fine-tuned through continuous evolution based on utilization, and the SPT will be gradually optimized by ant colony optimization techniques. Succinctly, this would help automate the software development process.
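The tree-based classification with GUID indexing described above can be sketched as follows. The namespace, category paths, and component names are hypothetical illustrations, not the paper's SPT; deterministic name-based UUIDs stand in for the proposed GUIDs:

```python
import uuid

# Deterministic GUIDs: the same category path + name always maps to the
# same identifier, so the repository index is reproducible.
NAMESPACE = uuid.uuid5(uuid.NAMESPACE_DNS, "software-periodic-table.example")

repo = {}  # category path tuple -> {component name: guid}

def register(path, name):
    guid = uuid.uuid5(NAMESPACE, "/".join(path) + "/" + name)
    repo.setdefault(tuple(path), {})[name] = guid
    return guid

def search(prefix):
    """All components in the subtree rooted at the given category path;
    the lookup walks only that subtree, narrowing the search space."""
    return {name: guid
            for path, members in repo.items()
            if path[:len(prefix)] == tuple(prefix)
            for name, guid in members.items()}

register(["io", "serialization"], "json_codec")
register(["io", "network"], "http_client")
register(["math"], "matrix_ops")
print(sorted(search(["io"])))  # ['http_client', 'json_codec']
```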

  18. Turning the volume down on heavy metals using tuned diatomite. A review of diatomite and modified diatomite for the extraction of heavy metals from water.

    Science.gov (United States)

    Danil de Namor, Angela F; El Gamouz, Abdelaziz; Frangie, Sofia; Martinez, Vanina; Valiente, Liliana; Webb, Oliver A

    2012-11-30

Contamination of water by heavy metals is a global problem, to which an inexpensive and simple solution is required. Within this context the unique properties of diatomite and its abundance in many regions of the world have led to the current widespread interest in this material for water purification purposes. Defined sections on articles published on the use of raw and modified diatomite for the removal of heavy metal pollutants from water are critically reviewed. The capability of the materials as extracting agents for individual species and mixtures of heavy metals are considered in terms of the kinetics, the thermodynamics and the recyclability for both the pollutant and the extracting material. The concept of 'selectivity' for the enrichment of naturally occurring materials such as diatomite through the introduction of suitable functionalities in their structure to target a given pollutant is emphasised. Suggestions for further research in this area are given. Copyright © 2012 Elsevier B.V. All rights reserved.
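The extraction capacities that such sorption studies report are commonly summarized by isotherm fits. As one illustration (a standard Langmuir form, not taken from this review; the parameter values are hypothetical), equilibrium uptake can be computed as:

```python
def langmuir(ce, q_max, k):
    """Langmuir isotherm: q_e = q_max * K * C_e / (1 + K * C_e).

    ce    -- equilibrium solute concentration (e.g. mg/L)
    q_max -- monolayer sorption capacity (e.g. mg/g)
    k     -- Langmuir affinity constant (L/mg)
    """
    return q_max * k * ce / (1.0 + k * ce)

# Hypothetical sorbent: q_max = 50 mg/g, K = 0.2 L/mg.
# At K*C_e = 1 the sorbent is at half saturation; at high C_e it
# approaches q_max.
print(langmuir(5.0, 50.0, 0.2))  # 25.0
```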

  19. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP) [de

  20. Turning the volume down on heavy metals using tuned diatomite. A review of diatomite and modified diatomite for the extraction of heavy metals from water

    Energy Technology Data Exchange (ETDEWEB)

    Danil de Namor, Angela F., E-mail: A.Danil-De-Namor@surrey.ac.uk [Instituto Nacional de Tecnologia Industrial, Parque Tecnologico Industrial Miguelete, Buenos Aires (Argentina); Department of Chemistry, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom); El Gamouz, Abdelaziz [Department of Chemistry, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom); Frangie, Sofia; Martinez, Vanina; Valiente, Liliana [Instituto Nacional de Tecnologia Industrial, Parque Tecnologico Industrial Miguelete, Buenos Aires (Argentina); Webb, Oliver A. [Department of Chemistry, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom)

    2012-11-30

Highlights: ► Critical assessment of published work on raw and modified diatomites. ► Counter-ion effect on the extraction of heavy metal speciation by diatomite. ► Selection of the counter-ion by the use of existing thermodynamic data. ► Enrichment of diatomites by attaching heavy metal selective functionalities. ► Supramolecular chemistry for conferring selectivity to diatomites. - Abstract: Contamination of water by heavy metals is a global problem, to which an inexpensive and simple solution is required. Within this context the unique properties of diatomite and its abundance in many regions of the world have led to the current widespread interest in this material for water purification purposes. Defined sections on articles published on the use of raw and modified diatomite for the removal of heavy metal pollutants from water are critically reviewed. The capability of the materials as extracting agents for individual species and mixtures of heavy metals are considered in terms of the kinetics, the thermodynamics and the recyclability for both the pollutant and the extracting material. The concept of 'selectivity' for the enrichment of naturally occurring materials such as diatomite through the introduction of suitable functionalities in their structure to target a given pollutant is emphasised. Suggestions for further research in this area are given.

  1. Turning the volume down on heavy metals using tuned diatomite. A review of diatomite and modified diatomite for the extraction of heavy metals from water

    International Nuclear Information System (INIS)

    Danil de Namor, Angela F.; El Gamouz, Abdelaziz; Frangie, Sofia; Martinez, Vanina; Valiente, Liliana; Webb, Oliver A.

    2012-01-01

Highlights: ► Critical assessment of published work on raw and modified diatomites. ► Counter-ion effect on the extraction of heavy metal speciation by diatomite. ► Selection of the counter-ion by the use of existing thermodynamic data. ► Enrichment of diatomites by attaching heavy metal selective functionalities. ► Supramolecular chemistry for conferring selectivity to diatomites. - Abstract: Contamination of water by heavy metals is a global problem, to which an inexpensive and simple solution is required. Within this context the unique properties of diatomite and its abundance in many regions of the world have led to the current widespread interest in this material for water purification purposes. Defined sections on articles published on the use of raw and modified diatomite for the removal of heavy metal pollutants from water are critically reviewed. The capability of the materials as extracting agents for individual species and mixtures of heavy metals are considered in terms of the kinetics, the thermodynamics and the recyclability for both the pollutant and the extracting material. The concept of ‘selectivity’ for the enrichment of naturally occurring materials such as diatomite through the introduction of suitable functionalities in their structure to target a given pollutant is emphasised. Suggestions for further research in this area are given.

  2. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  3. Data extraction system for underwater particle holography

    Science.gov (United States)

    Nebrensky, J. J.; Craig, Gary; Hobson, Peter R.; Lampitt, R. S.; Nareid, Helge; Pescetto, A.; Trucco, Andrea; Watson, John

    2000-08-01

Pulsed laser holography is an extremely powerful technique for the study of particle fields as it allows instantaneous, non-invasive, high-resolution recording of substantial volumes. By relaying the real image one can obtain the size, shape, position and - if multiple exposures are made - velocity of every object in the recorded field. Manual analysis of large volumes containing thousands of particles is, however, an enormous and time-consuming task, with operator fatigue an unpredictable source of errors. Clearly the value of holographic measurements also depends crucially on the quality of the reconstructed image: not only will poor resolution degrade the size and shape measurements, but aberrations such as coma and astigmatism can change the perceived centroid of a particle, affecting position and velocity measurements. For large-scale applications of particle field holography, specifically the in situ recording of marine plankton with Holocam, we have developed an automated data extraction system that can be readily switched between the in-line and off-axis geometries and provides optimised reconstruction from holograms recorded underwater. As a video camera is automatically stepped through the 200 by 200 by 1000 mm sample volume, image processing and object tracking routines locate and extract particle images for further classification by a separate software module.
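The locate-and-extract step for one reconstructed plane can be sketched in miniature: threshold the image, label connected bright regions, and report each region's centroid. This is a simplified stand-in for the system's actual image-processing and tracking routines, with a synthetic plane in place of real hologram data:

```python
import numpy as np
from collections import deque

def find_particles(plane, threshold):
    """Centroids (row, col) of connected above-threshold regions."""
    mask = plane > threshold
    seen = np.zeros_like(mask, dtype=bool)
    centroids = []
    for x, y in zip(*np.nonzero(mask)):
        if seen[x, y]:
            continue
        # Flood-fill one connected region (4-connectivity).
        pixels, queue = [], deque([(x, y)])
        seen[x, y] = True
        while queue:
            i, j = queue.popleft()
            pixels.append((i, j))
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                a, b = i + di, j + dj
                if (0 <= a < mask.shape[0] and 0 <= b < mask.shape[1]
                        and mask[a, b] and not seen[a, b]):
                    seen[a, b] = True
                    queue.append((a, b))
        rows, cols = zip(*pixels)
        centroids.append((sum(rows) / len(rows), sum(cols) / len(cols)))
    return centroids

plane = np.zeros((8, 8))
plane[2:4, 2:4] = 1.0   # one synthetic particle
plane[6, 6] = 1.0       # another
print(find_particles(plane, 0.5))  # [(2.5, 2.5), (6.0, 6.0)]
```

Repeating this over every plane stepped through the sample volume yields the 3-D particle positions.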

  4. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  5. Agile Processes in Software Engineering and Extreme Programming

    DEFF Research Database (Denmark)

The volume constitutes the proceedings of the 18th International Conference on Agile Software Development, XP 2017, held in Cologne, Germany, in May 2017. The 14 full and 6 short papers presented in this volume were carefully reviewed and selected from 46 submissions. They were organized in topical sections named: improving agile processes; agile in organization; and safety critical software. In addition, the volume contains 3 doctoral symposium papers (from 4 papers submitted).

  6. Computational intelligence and quantitative software engineering

    CERN Document Server

    Succi, Giancarlo; Sillitti, Alberto

    2016-01-01

    In a down-to-the earth manner, the volume lucidly presents how the fundamental concepts, methodology, and algorithms of Computational Intelligence are efficiently exploited in Software Engineering and opens up a novel and promising avenue of a comprehensive analysis and advanced design of software artifacts. It shows how the paradigm and the best practices of Computational Intelligence can be creatively explored to carry out comprehensive software requirement analysis, support design, testing, and maintenance. Software Engineering is an intensive knowledge-based endeavor of inherent human-centric nature, which profoundly relies on acquiring semiformal knowledge and then processing it to produce a running system. The knowledge spans a wide variety of artifacts, from requirements, captured in the interaction with customers, to design practices, testing, and code management strategies, which rely on the knowledge of the running system. This volume consists of contributions written by widely acknowledged experts ...

  7. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    Full Text Available At many points in the life of a software enterprise, determination of intellectual property (IP cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  8. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes there is an enormous amount of software available for use by teachers of reading and literacy--whereas drill-and-practice software is the largest category of software available, large numbers of…

  9. Ensuring Software IP Cleanliness

    OpenAIRE

    Mahshad Koohgoli; Richard Mayer

    2007-01-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  10. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

    multiversion software subject to coincident errors. IEEE Trans. Software Eng. SE-11:1511-1517. Eckhardt, D.E., A.K Caglayan, J.C. Knight, L.D. Lee, D.F...J.C. and N.G. Leveson. 1986. Experimental evaluation of the assumption of independence in multiversion software. IEEE Trans. Software

  11. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  12. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  13. Software - Naval Oceanography Portal

    Science.gov (United States)

The USNO Earth Orientation portal's Software section lists Earth Orientation products (GPS-based and VLBI-based), auxiliary software, supporting software, and an Earth Orientation Matrix Calculator.

  14. Software Engineering Education Directory

    Science.gov (United States)

    1990-04-01

and Engineering (CMSC 735) Codes: GPEV2 * Textbooks: IEEE Tutorial on Models and Metrics for Software Management and Engineering by Basili, Victor R...Software Engineering (Comp 227) Codes: GPRY5 Textbooks: IEEE Tutorial on Software Design Techniques by Freeman, Peter and Wasserman, Anthony I. Software

  15. Automatic extraction of left ventricular mass and volumes using parametric images from non-ECG-gated 15O-water PET/CT

    DEFF Research Database (Denmark)

    Nordström, J; Harms, Hans; Lubberink, Mark

Introduction: 15O-water positron emission tomography (PET) is considered the gold standard for non-invasive quantification of myocardial blood flow (MBF). It has been shown to identify patients with significant coronary artery disease (CAD) with high accuracy. Hypertrophy with or without dilatation...... The aim of the present study was to investigate the feasibility of measuring LV geometry using dynamic 15O-water PET/CT without ECG-gating. Methods: Parametric images of MBF, perfusable tissue fraction (PTF) and LV blood pool were generated automatically using kinetic modelling. Segmentation of the LV wall using PTF...... combined to measure stroke volume (SV=EDV-ESV) and ejection fraction (EF=SV/EDV). Accuracy was determined by comparing PET to cardiac magnetic resonance (CMR) in 30 asymptomatic patients with high grade LV regurgitation (group A). Precision was determined as inter-observer variation in group...
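The derived quantities in this record follow directly from the end-diastolic and end-systolic volumes: SV = EDV - ESV and EF = SV/EDV. A trivial sketch with hypothetical ventricular volumes:

```python
def stroke_volume(edv, esv):
    """SV = EDV - ESV (ml): blood ejected per beat."""
    return edv - esv

def ejection_fraction(edv, esv):
    """EF = SV / EDV: ejected fraction of the end-diastolic volume."""
    return stroke_volume(edv, esv) / edv

# Hypothetical example volumes (ml): EDV 120, ESV 50.
print(stroke_volume(120, 50))              # 70
print(round(ejection_fraction(120, 50), 3))  # 0.583
```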

  16. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis, shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  17. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

The field of testability is an active, well-established part of engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or depreciate the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the softwa...

  18. Agile software assessment

    OpenAIRE

    Nierstrasz Oscar; Lungu Mircea

    2012-01-01

    Informed decision making is a critical activity in software development but it is poorly supported by common development environments which focus mainly on low level programming tasks. We posit the need for agile software assessment which aims to support decision making by enabling rapid and effective construction of software models and custom analyses. Agile software assessment entails gathering and exploiting the broader context of software information related to the system at hand as well ...

  19. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  20. Japan society for software science and technology

    CERN Document Server

    Nakajima, Reiji; Hagino, Tatsuya

    1990-01-01

    Advances in Software Science and Technology, Volume 1 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems.Organized into three parts encompassing 13 chapters, this volume begins with an overview of the phase structure grammar for Japanese called JPSG, and a parser based on this grammar. This text then explores the logic-based knowledge representation called Uranus, which uses a multiple world mechanism. Other chapters consider the optimal file segmentation techniques for multi-at

  1. Advances in software science and technology

    CERN Document Server

    Kamimura, Tsutomu

    1994-01-01

    This serial is a translation of the original works within the Japan Society of Software Science and Technology. A key source of information for computer scientists in the U.S., the serial explores the major areas of research in software and technology in Japan. These volumes are intended to promote worldwide exchange of ideas among professionals.This volume includes original research contributions in such areas as Augmented Language Logic (ALL), distributed C language, Smalltalk 80, and TAMPOPO-an evolutionary learning machine based on the principles of Realtime Minimum Skyline Detection.

  2. Advances in software science and technology

    CERN Document Server

    Ohno, Yoshio; Kamimura, Tsutomu

    1991-01-01

    Advances in Software Science and Technology, Volume 2 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems.Organized into four parts encompassing 12 chapters, this volume begins with an overview of categorical frameworks that are widely used to represent data types in computer science. This text then provides an algorithm for generating vertices of a smoothed polygonal line from the vertices of a digital curve or polygonal curve whose position contains a certain amount of error. O

  3. Advances in software science and technology

    CERN Document Server

    Kakuda, Hiroyasu; Ohno, Yoshio

    1992-01-01

    Advances in Software Science and Technology, Volume 3 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems.Organized into two parts encompassing 11 chapters, this volume begins with an overview of the development of a system of writing tools called SUIKOU that analyzes a machine-readable Japanese document textually. This text then presents the conditioned attribute grammars (CAGs) and a system for evaluating them that can be applied to natural-language processing. Other chapters c

  4. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements with a degree of excellence and refinement in a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated; the set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product: the process is measured to improve it, and the product is measured to increase quality throughout the life cycle of the software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce it, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that had been used in other projects but were not currently being used by the SA team, and to report them to the Software Assurance team to see whether any metrics could be implemented in its software assurance life cycle process.
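As a concrete sketch of the kind of product metric this record describes, defect density relates defect counts to code size and can feed a simple estimation baseline. All numbers below are invented for illustration; they are not from the report.

```python
# Hypothetical project data -- illustrative only, not from the report.
defects_found = 46          # confirmed defects logged this release
size_ksloc = 12.5           # size in thousands of source lines of code

# Defect density is a classic SQA product metric: defects per KSLOC.
defect_density = defects_found / size_ksloc
print(f"defect density: {defect_density:.2f} defects/KSLOC")

# A baseline like this supports estimation: if the next release is
# expected to be 20 KSLOC, a naive projection of defects to plan for is:
projected = defect_density * 20
print(f"projected defects at 20 KSLOC: {projected:.0f}")
```

The projection assumes the next release resembles the baseline release, which is exactly the kind of caveat an SQA team would attach before adopting such a metric.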

  5. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  6. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  7. Building Software with Gradle

    CERN Multimedia

    CERN. Geneva; Studer, Etienne

    2014-01-01

    In this presentation, we will give an overview of the key concepts and main features of Gradle, the innovative build system that has become the de-facto standard in the enterprise. We will cover task declaration and task graph execution, incremental builds, multi-project builds, dependency management, applying plugins, extracting reusable build logic, bootstrapping a build, and using the Gradle daemon. By the end of this talk, you will have a good understanding of what makes Gradle so powerful yet easy to use. You will also understand why companies like Pivotal, LinkedIn, Google, and other giants with complex builds count on Gradle. About the speakers Etienne is leading the Tooling Team at Gradleware. He has been working as a developer, architect, project manager, and CTO over the past 15 years. He has spent most of his time building software products from the ground up and successfully shipping them to happy customers. He had ...

  8. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; hide

    2009-01-01

The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.
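The "link, sequence, and execute" pattern at the heart of MAS can be sketched as a pipeline runner. MAS itself is a set of PERL scripts driving Cassini-specific programs; the Python sketch and placeholder commands below are illustrative only.

```python
import subprocess

# Hypothetical stage list. The real maneuver-design, command-sequencing,
# and prediction programs are stand-ins here: each stage is (name, command).
PIPELINE = [
    ("design maneuver",   ["echo", "running-maneuver-design"]),
    ("build command seq", ["echo", "building-command-sequence"]),
    ("predict and report", ["echo", "generating-reports"]),
]

def run_pipeline(stages):
    """Execute each stage in order; stop at the first failure."""
    completed = []
    for name, cmd in stages:
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode != 0:
            # In MAS, a failed stage would block the serial hand-off
            # of data products to the next program in the chain.
            raise RuntimeError(f"stage failed: {name}")
        completed.append(name)
    return completed

done = run_pipeline(PIPELINE)
print(f"{len(done)} stages completed")
```

The single `run_pipeline` call plays the role of the "single button": one action drives the whole serial chain that previously required manual hand-offs.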

  9. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  10. Spotting software errors sooner

    International Nuclear Information System (INIS)

    Munro, D.

    1989-01-01

    Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)

  11. Avionics and Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the AES Avionics and Software (A&S) project is to develop a reference avionics and software architecture that is based on standards and that can be...

  12. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square-feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  13. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  14. Optimization of Antivirus Software

    OpenAIRE

    Catalin BOJA; Adrian VISOIU

    2007-01-01

The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined that help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyzes...

  15. Open Source Software Development

    Science.gov (United States)

    2011-01-01

Open Source Software Development. Walt Scacchi, Institute for Software Research, University of California, Irvine, Irvine, CA 92697-3455 USA. ...appropriate to refer to FOSS or FLOSS (L for Libre, where the alternative term "libre software" has popularity in some parts of the world) in order... Applying Social Network Analysis to Community-Driven Libre Software Projects, Intern. J. Info. Tech. and Web Engineering, 2006, 1(3), 27-28.

  16. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  17. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concerns in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and qualit

  18. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic impact in software production. As is also apparent from the call for papers, these perspectives focus on qualities such as reuse, adaptability, and maintainability.

  19. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  20. XES Software Communication Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  1. Neutron Scattering Software

    Science.gov (United States)

A new portal for neutron scattering has just been established... KUPLOT: data plotting and fitting software; ILL/TAS: Matlab programs for analyzing triple axis data.

  2. XES Software Event Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  3. ARC Software and Models

    Science.gov (United States)

Most research conducted at the ARC produces software code and methodologies that are transferred to TARDEC and industry partners. These

  4. XES Software Telemetry Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  5. Specifications in software prototyping

    OpenAIRE

    Luqi; Chang, Carl K.; Zhu, Hong

    1998-01-01

We explore the use of software specifications for software prototyping. This paper describes a process model for software prototyping, and shows how specifications can be used to support such a process via a cellular mobile phone switch example.

  6. Software Engineering for Portability.

    Science.gov (United States)

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programing; and suggestions for further research and…

  7. Software Acquisition and Software Engineering Best Practices

    National Research Council Canada - National Science Library

    Eslinger, S

    1999-01-01

    The purpose of this white paper is to address the issues raised in the recently published Senate Armed Services Committee Report 106-50 concerning Software Management Improvements for the Department of Defense (DoD...

  8. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

Software quality is a specific property which tells what kind of standards software should meet. In a software project, quality is the key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software-related organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good-quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  9. Licensing safety critical software

    International Nuclear Information System (INIS)

    Archinoff, G.H.; Brown, R.A.

    1990-01-01

    Licensing difficulties with the shutdown system software at the Darlington Nuclear Generating Station contributed to delays in starting up the station. Even though the station has now been given approval by the Atomic Energy Control Board (AECB) to operate, the software issue has not disappeared - Ontario Hydro has been instructed by the AECB to redesign the software. This article attempts to explain why software based shutdown systems were chosen for Darlington, why there was so much difficulty licensing them, and what the implications are for other safety related software based applications

  10. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process

  11. An Integrated Software Suite for Surface-based Analyses of Cerebral Cortex

    Science.gov (United States)

    Van Essen, David C.; Drury, Heather A.; Dickson, James; Harwell, John; Hanlon, Donna; Anderson, Charles H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database. PMID:11522765

  12. An integrated software suite for surface-based analyses of cerebral cortex

    Science.gov (United States)

    Van Essen, D. C.; Drury, H. A.; Dickson, J.; Harwell, J.; Hanlon, D.; Anderson, C. H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database.

  13. Software Validation in ATLAS

    International Nuclear Information System (INIS)

    Hodgkinson, Mark; Seuster, Rolf; Simmons, Brinick; Sherwood, Peter; Rousseau, David

    2012-01-01

    The ATLAS collaboration operates an extensive set of protocols to validate the quality of the offline software in a timely manner. This is essential in order to process the large amounts of data being collected by the ATLAS detector in 2011 without complications on the offline software side. We will discuss a number of different strategies used to validate the ATLAS offline software; running the ATLAS framework software, Athena, in a variety of configurations daily on each nightly build via the ATLAS Nightly System (ATN) and Run Time Tester (RTT) systems; the monitoring of these tests and checking the compilation of the software via distributed teams of rotating shifters; monitoring of and follow up on bug reports by the shifter teams and periodic software cleaning weeks to improve the quality of the offline software further.

  14. [Optimization of ultrasonic-assisted extraction of total flavonoids from leaves of the Artocarpus heterophyllus by response surface methodology].

    Science.gov (United States)

    Wang, Hong-wu; Liu, Yan-qing; Wang, Yuan-hong

    2011-07-01

To investigate the ultrasonic-assisted extraction of total flavonoids from leaves of the Artocarpus heterophyllus, the effects of ethanol concentration, extraction time, and liquid-solid ratio on flavonoid yield were investigated. A 17-run response surface design involving three factors at three levels was generated by the Design-Expert software, and the experimental data obtained were subjected to quadratic regression analysis to create a mathematical model describing flavonoid extraction. The optimum ultrasonic-assisted extraction conditions were: ethanol volume fraction 69.4% and liquid-solid ratio of 22.6:1 for 32 min. Under these optimized conditions, the yield of flavonoids was 7.55 mg/g. The Box-Behnken design and response surface analysis can well optimize the ultrasonic-assisted extraction of total flavonoids from Artocarpus heterophyllus.
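The core of the response-surface approach in this record is a quadratic least-squares fit whose stationary point gives the predicted optimum. A minimal one-factor sketch is below; the yield numbers are invented for illustration, whereas the paper's actual Box-Behnken design has three factors and 17 runs.

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a*x^2 + b*x + c via the normal equations."""
    n = len(xs)
    s = lambda k: sum(x ** k for x in xs)
    sy = lambda k: sum((x ** k) * y for x, y in zip(xs, ys))
    # 3x3 normal-equation system for the monomials x^2, x, 1.
    A = [[s(4), s(3), s(2)],
         [s(3), s(2), s(1)],
         [s(2), s(1), n]]
    b = [sy(2), sy(1), sy(0)]
    # Gaussian elimination (tiny system; no pivoting concerns for this data).
    for i in range(3):
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            b[j] -= f * b[i]
    coeffs = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):  # back substitution
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j]
                                for j in range(i + 1, 3))) / A[i][i]
    return coeffs  # a, b, c

ethanol = [50.0, 60.0, 70.0, 80.0, 90.0]  # ethanol volume fraction, %
yield_mg = [5.9, 7.0, 7.5, 7.1, 5.8]      # flavonoid yield, mg/g (made up)

a, b, c = fit_quadratic(ethanol, yield_mg)
x_opt = -b / (2 * a)  # stationary point of the fitted quadratic
print(f"predicted optimum ethanol fraction: {x_opt:.1f}%")
```

With three factors, the same idea extends to a full quadratic surface with interaction terms, which is what Design-Expert fits before reporting the optimum at 69.4% ethanol.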

  15. Science and Software

    Science.gov (United States)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  16. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

Software Maintenance and Evolution: The Implication for Software Development. ... Software maintenance is the process of modifying existing operational software by correcting errors, ...

  17. Zirconia coated stir bar sorptive extraction combined with large volume sample stacking capillary electrophoresis-indirect ultraviolet detection for the determination of chemical warfare agent degradation products in water samples.

    Science.gov (United States)

    Li, Pingjing; Hu, Bin; Li, Xiaoyong

    2012-07-20

    In this study, a sensitive, selective and reliable analytical method by combining zirconia (ZrO₂) coated stir bar sorptive extraction (SBSE) with large volume sample stacking capillary electrophoresis-indirect ultraviolet (LVSS-CE/indirect UV) was developed for the direct analysis of chemical warfare agent degradation products of alkyl alkylphosphonic acids (AAPAs) (including ethyl methylphosphonic acid (EMPA) and pinacolyl methylphosphonate (PMPA)) and methylphosphonic acid (MPA) in environmental waters. ZrO₂ coated stir bar was prepared by adhering nanometer-sized ZrO₂ particles onto the surface of stir bar with commercial PDMS sol as adhesion agent. Due to the high affinity of ZrO₂ to the electronegative phosphonate group, ZrO₂ coated stir bars could selectively extract the strongly polar AAPAs and MPA. After systematically optimizing the extraction conditions of ZrO₂-SBSE, the analytical performance of ZrO₂-SBSE-CE/indirect UV and ZrO₂-SBSE-LVSS-CE/indirect UV was assessed. The limits of detection (LODs, at a signal-to-noise ratio of 3) obtained by ZrO₂-SBSE-CE/indirect UV were 13.4-15.9 μg/L for PMPA, EMPA and MPA. The relative standard deviations (RSDs, n=7, c=200 μg/L) of the corrected peak area for the target analytes were in the range of 6.4-8.8%. Enhancement factors (EFs) in terms of LODs were found to be from 112- to 145-fold. By combining ZrO₂ coating SBSE with LVSS as a dual preconcentration strategy, the EFs were magnified up to 1583-fold, and the LODs of ZrO₂-SBSE-LVSS-CE/indirect UV were 1.4, 1.2 and 3.1 μg/L for PMPA, EMPA, and MPA, respectively. The RSDs (n=7, c=20 μg/L) were found to be in the range of 9.0-11.8%. The developed ZrO₂-SBSE-LVSS-CE/indirect UV method has been successfully applied to the analysis of PMPA, EMPA, and MPA in different environmental water samples, and the recoveries for the spiked water samples were found to be in the range of 93.8-105.3%. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performances....... This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper......, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. These issues we discuss along with use case scenarios. Here in this paper we aim to identify challenges...

  19. A method for the direct injection and analysis of small volume human blood spots and plasma extracts containing high concentrations of organic solvents using reversed-phase 2D UPLC/MS.

    Science.gov (United States)

    Rainville, Paul D; Simeone, Jennifer L; Root, Dan S; Mallet, Claude R; Wilson, Ian D; Plumb, Robert S

    2015-03-21

    The emergence of micro sampling techniques holds great potential to improve pharmacokinetic data quality, reduce animal usage, and save costs in safety assessment studies. The analysis of these samples presents new challenges for bioanalytical scientists, both in terms of sample processing and analytical sensitivity. The use of two-dimensional LC/MS with at-column dilution for the direct analysis of highly organic extracts prepared from biological fluids such as dried blood spots and plasma is demonstrated. This technique negated the need to dry down and reconstitute, or dilute samples with water/aqueous buffer solutions, prior to injection onto a reversed-phase LC system. A mixture of model drugs, including bromhexine, triprolidine, enrofloxacin, and procaine, was used to test the feasibility of the method. Finally, an LC/MS assay for the probe pharmaceutical rosuvastatin was developed from dried blood spots and protein-precipitated plasma. The assays showed acceptable recovery, accuracy and precision according to US FDA guidelines. The resulting analytical method showed an increase in assay sensitivity of up to forty-fold as compared to conventional methods by maximizing the amount loaded onto the system and the MS response for the probe pharmaceutical rosuvastatin from small volume samples.

  20. A Novel Method for Mining SaaS Software Tag via Community Detection in Software Services Network

    Science.gov (United States)

    Qin, Li; Li, Bing; Pan, Wei-Feng; Peng, Tao

    The number of online software services based on SaaS paradigm is increasing. However, users usually find it hard to get the exact software services they need. At present, tags are widely used to annotate specific software services and also to facilitate the searching of them. Currently these tags are arbitrary and ambiguous since mostly of them are generated manually by service developers. This paper proposes a method for mining tags from the help documents of software services. By extracting terms from the help documents and calculating the similarity between the terms, we construct a software similarity network where nodes represent software services, edges denote the similarity relationship between software services, and the weights of the edges are the similarity degrees. The hierarchical clustering algorithm is used for community detection in this software similarity network. At the final stage, tags are mined for each of the communities and stored as ontology.
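
    The clustering step described above can be sketched in a few lines. The snippet below is a minimal single-linkage agglomerative clustering over a toy similarity matrix; the service names, similarity values, and threshold are invented for illustration and are not taken from the paper.

```python
def communities(names, sim, threshold):
    """Single-linkage agglomerative clustering on a similarity matrix.

    Repeatedly merges the two clusters whose most-similar member pair
    exceeds `threshold`; the clusters that remain are the communities.
    """
    clusters = [{i} for i in range(len(names))]
    while True:
        best, pair = threshold, None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                s = max(sim[i][j] for i in clusters[a] for j in clusters[b])
                if s > best:
                    best, pair = s, (a, b)
        if pair is None:
            break
        a, b = pair
        clusters[a] |= clusters.pop(b)
    return [sorted(names[i] for i in c) for c in clusters]

# Hypothetical pairwise similarities between four software services
services = ["mail", "smtp-relay", "photo-edit", "image-resize"]
sim = [[1.0, 0.9, 0.1, 0.2],
       [0.9, 1.0, 0.2, 0.1],
       [0.1, 0.2, 1.0, 0.8],
       [0.2, 0.1, 0.8, 1.0]]
print(communities(services, sim, threshold=0.5))
```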

  1. Software engineering in industry

    Science.gov (United States)

    Story, C. M.

    1989-12-01

    Can software be "engineered"? Can a few people with limited resources and a negligible budget produce high quality software solutions to complex software problems? Is it possible to resolve the conflict between research activities and the necessity to view software development as a means to an end rather than as an end in itself? The aim of this paper is to encourage further thought and discussion on various topics which, in the author's experience, are becoming increasingly critical in today's large software production and development projects, inside and outside high energy physics (HEP). This is done by briefly exploring some of the software engineering ideas and technologies now used in the information industry, using, as a case study, a project with many similarities to those currently under way in HEP.

  2. Software for tomographic analysis: application in ceramic filters

    International Nuclear Information System (INIS)

    Figuerola, W.B.; Assis, J.T.; Oliveira, L.F.; Lopes, R.T.

    2001-01-01

    and UNIX). Various digital image processing techniques were implemented to extract physical properties such as distance, volume, area and perimeter; digital filters such as the median filter, histogram equalization, threshold quantization, and boundary detection (Laplace and Sobel); and, for volume rendering, the ray casting technique. The results obtained with this software permit its use in ceramic filter applications and in the analysis of other types of tomographic images
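
    One of the digital filters listed, the median filter, is easy to illustrate. The sketch below applies a 3×3 median filter in pure Python to a tiny image with one impulse-noise pixel; the image data is made up, and leaving border pixels unchanged is a simplifying assumption, not necessarily the paper's border handling.

```python
def median_filter_3x3(img):
    """3x3 median filter; border pixels are left unchanged for simplicity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]  # middle of the 9 sorted values
    return out

# A flat region with one speck of impulse noise
noisy = [[10, 10, 10],
         [10, 99, 10],
         [10, 10, 10]]
print(median_filter_3x3(noisy))
```

The median suppresses the isolated spike without blurring the flat region, which is exactly why median filters are preferred over averaging for salt-and-pepper noise.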

  3. SOFTWARE FOR REGIONS OF INTEREST RETRIEVAL ON MEDICAL 3D IMAGES

    Directory of Open Access Journals (Sweden)

    G. G. Stromov

    2014-01-01

    Full Text Available Background. Implementation of software for retrieval of areas of interest (ROIs) in 3D medical images is described in this article. It has been tested against a large volume of model MRIs. Material and methods. We tested the software against normal and pathological (severe multiple sclerosis) model MRIs from the BrainWeb resource. The technological stack is based on open-source, cross-platform solutions. We implemented the storage system on MariaDB (an open-source fork of MySQL) with PL/SQL extensions. Python 2.7 scripting was used for automation of extract-transform-load operations. The computational core is written in Java 7 with Spring Framework 3. MongoDB was used as a cache in the cluster of workstations. Maven 3 was chosen as the dependency manager and build system, and the project is hosted on GitHub. Results. As testing on SSMU's LAN has shown, the software retrieves quite efficiently the ROIs matching the morphological substratum in pathological MRIs. Conclusion. Automation of the diagnostic process using medical imaging helps reduce the subjective component in decision making and increases the availability of high-tech medicine. The software shown in this article is a complete solution for the ROI retrieval and segmentation process on model medical images in fully automated mode. We would like to thank Robert Vincent for his great help with consulting on usage of the BrainWeb resource.

  4. A software product certification model

    NARCIS (Netherlands)

    Heck, P.M.; Klabbers, M.D.; van Eekelen, Marko

    2010-01-01

    Certification of software artifacts offers organizations more certainty and confidence about software. Certification of software helps software sales, acquisition, and can be used to certify legislative compliance or to achieve acceptable deliverables in outsourcing. In this article, we present a

  5. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The paper considers why verification of software products throughout the software life cycle is necessary. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  6. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. It explains two maintenance standards (IEEE/EIA 1219 and ISO/IEC 14764), discusses several commercial reverse and domain engineering toolkits, and bases its content on the IEEE SWEBOK (Software Engineering Body of Knowledge). Slides for instructors are available online.

  7. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-10-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. The basics of microprogramming and new microcircuits have already been discussed. In this course, the methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogrammed circuit itself. 15 figures, 2 tables

  8. Hospital Management Software Development

    OpenAIRE

    sobogunGod, olawale

    2012-01-01

    The purpose of this thesis was to implement a hospital management software which is suitable for small private hospitals in Nigeria, especially for the ones that use a file-based system for storing information rather than having it stored in a more efficient and safer environment like databases or Excel programming software. The software developed within this thesis project was specifically designed for the Rainbow specialist hospital which is based in Lagos, the commercial neurological cente...

  9. Computer software configuration management

    International Nuclear Information System (INIS)

    Pelletier, G.

    1987-08-01

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  10. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1993-01-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an object-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere)

  11. Software quality management

    International Nuclear Information System (INIS)

    Bishop, D.C.; Pymm, P.

    1991-01-01

    As programmable electronic (software-based) systems are increasingly being proposed as design solutions for high integrity applications in nuclear power stations, the need to adopt suitable quality management arrangements is paramount. The authors describe Scottish Nuclear's strategy for software quality management and, using the main on-line monitoring system at Torness Power Station as an example, explain how this strategy is put into practice. Particular attention is given to the topics of software quality planning and change control. (author)

  12. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners. It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice, with a focus on the software process policymaking and process control aspects of improvement efforts.

  13. Assuring Software Reliability

    Science.gov (United States)

    2014-08-01

    technologies and processes to achieve a required level of confidence that software systems and services function in the intended manner. 1.3 Security Example... that took three high-voltage lines out of service and a software failure (a race condition) that disabled the computing service that notified the... service had failed. Instead of analyzing the details of the alarm server failure, the reviewers asked why the following software assurance claim had

  14. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu

    2004-01-01

    This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), which is an XML-based metaprogramming technique. As the software evolves, a large number of variants may arise, especially when such kinds of evolution are related to multiple platforms, as shown in our case study. Handling variants and tracing the impact of variants across the development lifecycle is a challenge. This chapter shows how we can maintain different versions of software in a reuse-based way.

  15. FASTBUS software status

    International Nuclear Information System (INIS)

    Gustavson, D.B.

    1980-10-01

    Computer software will be needed in addition to the mechanical, electrical, protocol and timing specifications of the FASTBUS, in order to facilitate the use of this flexible new multiprocessor and multisegment data acquisition and processing system. Software considerations have been important in the FASTBUS design, but standard subroutines and recommended algorithms will be needed as the FASTBUS comes into use. This paper summarizes current FASTBUS software projects, goals and status

  16. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  17. Solar Asset Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Iverson, Aaron [Ra Power Management, Inc., Oakland, CA (United States); Zviagin, George [Ra Power Management, Inc., Oakland, CA (United States)

    2016-09-30

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  18. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  19. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model; Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models; Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept

  20. Agile software development

    CERN Document Server

    Dingsoyr, Torgeir; Moe, Nils Brede

    2010-01-01

    Agile software development has become an umbrella term for a number of changes in how software developers plan and coordinate their work, how they communicate with customers and external stakeholders, and how software development is organized in small, medium, and large companies, from the telecom and healthcare sectors to games and interactive media. Still, after a decade of research, agile software development is the source of continued debate due to its multifaceted nature and insufficient synthesis of research results. Dingsoyr, Dyba, and Moe now present a comprehensive snapshot of the kno

  1. Optimization of Antivirus Software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined that help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyzes some of the optimization concepts applied to this category of applications

  2. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
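
    The directory-scanning behaviour described above can be sketched as follows. This is not the actual dtest implementation, only a minimal Python illustration of pattern-based test-directory discovery; the directory names and the `test_*` pattern are invented for the example.

```python
import fnmatch
import os
import tempfile

def find_test_dirs(root, pattern="test_*"):
    """Walk `root` and return every directory whose name matches `pattern`,
    mimicking how a dtest-style runner discovers test directories."""
    matches = []
    for dirpath, dirnames, _ in os.walk(root):
        for d in dirnames:
            if fnmatch.fnmatch(d, pattern):
                matches.append(os.path.join(dirpath, d))
    return sorted(matches)

# Demonstrate on a throwaway directory tree
with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "pkg", "test_parser"))
    os.makedirs(os.path.join(root, "pkg", "docs"))
    os.makedirs(os.path.join(root, "test_io"))
    found = find_test_dirs(root)
    print([os.path.relpath(p, root) for p in found])
```

A real runner would then execute the tests described by configuration files inside each matched directory, possibly distributing them over CPU cores.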

  3. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. The book includes a supplementary website with an instructor's guide and solutions, applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI), and illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors.

  4. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  5. Software as quality product

    International Nuclear Information System (INIS)

    Enders, A.

    1975-01-01

    In many discussions on the reliability of computer systems, software is presented as the weak link in the chain. The contribution attempts to identify the reasons for this situation as seen from software development. The concepts of correctness and reliability of programmes are explained as they are understood in the specialist discussion of today. Measures and methods are discussed which are particularly relevant as far as obtaining fault-free and reliable programmes is concerned. Conclusions are drawn for the user of software so that he is in a position to judge for himself what can justly be expected from the product software compared to other products. (orig./LH)

  6. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

    This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives – Product, Project, Process, and People – to identify an outlook for software innovation. The paper then describes a new facility – the Software Innovation Research Lab (SIRL) – and a new method concept for software innovation – Essence – based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research.

  7. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than... and experience reported at the IEEE International Conference on Global Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage...

  8. Software Intensive Systems

    National Research Council Canada - National Science Library

    Horvitz, E; Katz, D. J; Rumpf, R. L; Shrobe, H; Smith, T. B; Webber, G. E; Williamson, W. E; Winston, P. H; Wolbarsht, James L

    2006-01-01

    .... Additionally, recommend that DoN invest in software engineering, particularly as it complements commercial industry developments and promotes the application of systems engineering methodology...

  9. Contractor Software Charges

    National Research Council Canada - National Science Library

    Granetto, Paul

    1994-01-01

    .... Examples of computer software costs that contractors charge through indirect rates are material management systems, security systems, labor accounting systems, and computer-aided design and manufacturing...

  10. Decentralized Software Architecture

    National Research Council Canada - National Science Library

    Khare, Rohit

    2002-01-01

    .... While the term "decentralization" is familiar from political and economic contexts, it has been applied extensively, if indiscriminately, to describe recent trends in software architecture towards...

  11. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  12. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    Full Text Available In this paper, the definition and the key features of Software Radar, which is a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar and Intelligent Radar, and the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open system architecture that decouples application software from low-level hardware can easily adopt a "user requirements-oriented" development methodology instead of the traditional "specific function-oriented" development methodology. Compared with traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, which has been developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are also given in the conclusion.

  13. Software Library for Bruker TopSpin NMR Data Files

    Energy Technology Data Exchange (ETDEWEB)

    2016-10-14

    A software library for parsing and manipulating frequency-domain data files that have been processed using the Bruker TopSpin NMR software package. In the context of NMR, the term "processed" indicates that the end-user of the Bruker TopSpin NMR software package has (a) Fourier transformed the raw, time-domain data (the Free Induction Decay) into the frequency-domain and (b) has extracted the list of NMR peaks.
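
    A parser of this kind ultimately reduces to reading a columnar peak list. The sketch below parses a hypothetical two-column format (chemical shift in ppm, intensity); real TopSpin peak-list files have their own layout with more fields, so treat the format here purely as an illustration of the parsing pattern, not as the Bruker file specification.

```python
import io

def parse_peak_list(fh):
    """Parse a simple two-column peak list: chemical shift (ppm), intensity.

    Skips blank lines and '#' comments; the column layout is a
    hypothetical stand-in for a real TopSpin peak list.
    """
    peaks = []
    for line in fh:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        shift, intensity = line.split()[:2]
        peaks.append((float(shift), float(intensity)))
    return peaks

sample = io.StringIO("""# ppm  intensity
7.26   1.00e6
3.31   4.50e5
1.94   2.10e5
""")
peaks = parse_peak_list(sample)
print(peaks)
```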

  14. Social software in global software development

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2010-01-01

    variety of tools such as: instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, virtual worlds. Though it normally belongs to the private realm, the use of social software in a corporate context has been reported, e.g. as a way...

  15. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960s there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner.
This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  16. Geopressured geothermal bibliography. Volume 1 (citation extracts)

    Energy Technology Data Exchange (ETDEWEB)

    Hill, T.R.; Sepehrnoori, K.

    1981-08-01

    This bibliography was compiled by the Center for Energy Studies at The University of Texas at Austin to serve as a tool for researchers in the field of geopressured geothermal energy resources. The bibliography represents citations of papers on geopressured geothermal energy resources over the past eighteen years. Topics covered in the bibliography range from the technical aspects of geopressured geothermal reservoirs to social, environmental, and legal aspects of tapping those reservoirs for their energy resources. The bibliography currently contains more than 750 entries. For quick reference to a given topic, the citations are indexed into five divisions: author, category, conference title, descriptor, and sponsor. These indexes are arranged alphabetically and cross-referenced by page number.

  17. Relevance of biotic pathways to the long-term regulation of nuclear waste disposal. Estimation of radiation dose to man resulting from biotic transport: the BIOPORT/MAXI1 software package. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    McKenzie, D.H.; Cadwell, L.L.; Gano, K.A.; Kennedy, W.E. Jr.; Napier, B.A.; Peloquin, R.A.; Prohammer, L.A.; Simmons, M.A.

    1985-10-01

    BIOPORT/MAXI1 is a collection of five computer codes designed to estimate the potential magnitude of the radiation dose to man resulting from biotic transport processes. Dose to man is calculated for ingestion of agricultural crops grown in contaminated soil, inhalation of resuspended radionuclides, and direct exposure to penetrating radiation resulting from the radionuclide concentrations established in the available soil surface by the biotic transport model. This document is designed as both an instructional and reference document for the BIOPORT/MAXI1 computer software package and has been written for two major audiences. The first audience includes persons concerned with the mathematical models of biological transport of commercial low-level radioactive wastes and the computer algorithms used to implement those models. The second audience includes persons concerned with exercising the computer program and exposure scenarios to obtain results for specific applications. The report contains sections describing the mathematical models, user operation of the computer programs, and program structure. Input and output for five sample problems are included. In addition, listings of the computer programs, data libraries, and dose conversion factors are provided in appendices.

  18. Relevance of biotic pathways to the long-term regulation of nuclear waste disposal. Estimation of radiation dose to man resulting from biotic transport: the BIOPORT/MAXI1 software package. Volume 5

    International Nuclear Information System (INIS)

    McKenzie, D.H.; Cadwell, L.L.; Gano, K.A.; Kennedy, W.E. Jr.; Napier, B.A.; Peloquin, R.A.; Prohammer, L.A.; Simmons, M.A.

    1985-10-01

    BIOPORT/MAXI1 is a collection of five computer codes designed to estimate the potential magnitude of the radiation dose to man resulting from biotic transport processes. Dose to man is calculated for ingestion of agricultural crops grown in contaminated soil, inhalation of resuspended radionuclides, and direct exposure to penetrating radiation resulting from the radionuclide concentrations established in the available soil surface by the biotic transport model. This document is designed as both an instructional and reference document for the BIOPORT/MAXI1 computer software package and has been written for two major audiences. The first audience includes persons concerned with the mathematical models of biological transport of commercial low-level radioactive wastes and the computer algorithms used to implement those models. The second audience includes persons concerned with exercising the computer program and exposure scenarios to obtain results for specific applications. The report contains sections describing the mathematical models, user operation of the computer programs, and program structure. Input and output for five sample problems are included. In addition, listings of the computer programs, data libraries, and dose conversion factors are provided in appendices
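
    The pathway summation described above (doses from crop ingestion, inhalation of resuspended radionuclides, and direct exposure, each driven by the soil-surface concentration) can be sketched generically. All numerical factors below are invented placeholders for illustration, not BIOPORT/MAXI1 library values or dose conversion factors.

```python
def dose_to_man(soil_conc, pathways):
    """Sum dose over exposure pathways: dose = C_soil * intake * DCF per pathway.

    soil_conc: radionuclide concentration in surface soil (Bq/kg).
    pathways:  per-pathway (intake_factor, dose_conversion_factor) pairs,
               where the intake factor folds together uptake/resuspension
               and annual intake. Factors here are illustrative only.
    """
    return sum(soil_conc * intake * dcf for intake, dcf in pathways.values())

# Hypothetical pathway parameters (intake factor, Sv/Bq)
pathways = {
    "crop ingestion":  (0.05, 1.3e-8),
    "inhalation":      (1.0e-4, 8.9e-9),
    "direct exposure": (1.0, 2.0e-10),
}
print(dose_to_man(soil_conc=5.0e3, pathways=pathways))
```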

  19. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    Full Text Available within the Namibian context. An implication for software project managers and software developers is that customer interaction should be properly managed to ensure that the software methodologies for improving software development processes...

  20. Marketing Mix del Software.

    Directory of Open Access Journals (Sweden)

    Yudith del Carmen Rodríguez Pérez

    2006-03-01

    For this reason, this paper defines the concept of a software product, characterizes it, and presents its quality attributes. It also addresses the marketing mix that software requires, which differs from that of other products, if it is to succeed in the market.

  1. Sustainability in Software Engineering

    NARCIS (Netherlands)

    Wolfram, N.J.E.; Lago, P.; Osborne, Francesco

    2017-01-01

    The intersection between software engineering research and issues related to sustainability and green IT has been the subject of increasing attention. In spite of that, we observe that sustainability is still not clearly defined, or understood, in the field of software engineering. This lack of

  2. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.

    1992-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be

  3. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.; Heemstra, F.J.

    1993-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be

  4. Software engineering ethics

    Science.gov (United States)

    Bown, Rodney L.

    1991-01-01

    Software engineering ethics is reviewed. The following subject areas are covered: lack of a system viewpoint; arrogance of PC DOS software vendors; violation of upward compatibility; internet worm; internet worm revisited; student cheating and company hiring interviews; computing practitioners and the commodity market; new projects and old programming languages; schedule and budget; and recent public domain comments.

  5. Computer Software Reviews.

    Science.gov (United States)

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  6. Software product family evaluation

    NARCIS (Netherlands)

    van der Linden, F; Bosch, J; Kamsties, E; Kansala, K; Krzanik, L; Obbink, H; VanDerLinden, F

    2004-01-01

    This paper proposes a 4-dimensional software product family engineering evaluation model. The 4 dimensions relate to the software engineering concerns of business, architecture, organisation and process. The evaluation model is meant to be used within organisations to determine the status of their

  7. Selecting the Right Software.

    Science.gov (United States)

    Shearn, Joseph

    1987-01-01

    Selection of administrative software requires analyzing present needs and, to meet future needs, choosing software that will function with a more powerful computer system. Other important factors to include are a professional system demonstration, maintenance and training, and financial considerations that allow leasing or renting alternatives.…

  8. TAPSOFT'95: Theory and Practice of Software Development

    DEFF Research Database (Denmark)

    This volume presents the proceedings of the Sixth International Joint Conference on the Theory and Practice of Software Development, TAPSOFT '95, held in Aarhus, Denmark in May 1995. TAPSOFT '95 celebrates the 10th anniversary of this conference series started in Berlin in 1985 to bring together...... theoretical computer scientists and software engineers (researchers and practitioners) with a view to discussing how formal methods can usefully be applied in software development. The volume contains seven invited papers, among them one by Vaughan Pratt on the recently revealed bug in the Pentium chip...

  9. leaves extract on mild steel in acid

    African Journals Online (AJOL)

    ADOWIE PERE

    The volume of the cathodic hydrogen gas evolved was also plotted as a .... prevent the escape of hydrogen gas. The volume .... Clivia nobilis leaves extract on the flow of current ... behaviour of ethanol extract of Piper guinensis as a green ...

  10. Trends in software testing

    CERN Document Server

    Mohanty, J; Balakrishnan, Arunkumar

    2017-01-01

    This book is focused on the advancements in the field of software testing and the innovative practices that the industry is adopting. Considering the widely varied nature of software testing, the book addresses contemporary aspects that are important for both academia and industry. There are dedicated chapters on seamless high-efficiency frameworks, automation on regression testing, software by search, and system evolution management. There are a host of mathematical models that are promising for software quality improvement by model-based testing. There are three chapters addressing this concern. Students and researchers in particular will find these chapters useful for their mathematical strength and rigor. Other topics covered include uncertainty in testing, software security testing, testing as a service, test technical debt (or test debt), disruption caused by digital advancement (social media, cloud computing, mobile application and data analytics), and challenges and benefits of outsourcing. The book w...

  11. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  12. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what...... makes these countries an attractive host for software companies. Often, high-quality engineering and excellent quality of products, e.g., machinery and equipment, are mentioned. Yet, the question is: Can such arguments be also found for the software industry? We aim at investigating the degree...... of professionalism and systematization of software development to draw a map of strengths and weaknesses. To this end, we conducted as a first step an exploratory survey in Germany, presented in this paper. In this survey, we focused on the perceived importance of the two general software engineering process areas...

  13. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution....... The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss...... one particular framework named Tran SAT, which addresses the above problems of software architecture evolution. Tran SAT provides a new element in the software architecture descriptions language, called an architectural aspect, for describing new concerns and their integration into an existing...

  14. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD) the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  15. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Full Text Available Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is through the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third part discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  16. Software licenses: Stay honest!

    CERN Multimedia

    Computer Security Team

    2012-01-01

    Do you recall our article about copyright violation in the last issue of the CERN Bulletin, “Music, videos and the risk for CERN”? Now let’s be more precise. “Violating copyright” not only means the illegal download of music and videos, it also applies to software packages and applications.   Users must respect proprietary rights in compliance with the CERN Computing Rules (OC5). Not having legitimately obtained a program or the required licenses to run that software is not a minor offense. It violates CERN rules and puts the Organization at risk! Vendors deserve credit and compensation. Therefore, make sure that you have the right to use their software. In other words, you have bought the software via legitimate channels and use a valid and honestly obtained license. This also applies to “Shareware” and software under open licenses, which might also come with a cost. Usually, only “Freeware” is complete...

  17. Colorado Conference on iterative methods. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    The conference provided a forum on many aspects of iterative methods. Volume I topics were: domain decomposition, nonlinear problems, integral equations and inverse problems, eigenvalue problems, and iterative software kernels. Volume II presents nonsymmetric solvers, parallel computation, theory of iterative methods, software and programming environments, ODE solvers, multigrid and multilevel methods, applications, robust iterative methods, preconditioners, Toeplitz and circulant solvers, and saddle point problems. Individual papers are indexed separately on the EDB.

  18. Towards an Ontology of Software

    OpenAIRE

    Wang, Xiaowei

    2016-01-01

    Software is permeating every aspect of our personal and social life. And yet, the cluster of concepts around the notion of software, such as the notions of a software product, software requirements, software specifications, are still poorly understood with no consensus on the horizon. For many, software is just code, something intangible best defined in contrast with hardware, but it is not particularly illuminating. This erroneous notion, software is just code, presents both in the ontology ...

  19. LDUA software custodian's notebook

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements.

  20. Software Formal Inspections Guidebook

    Science.gov (United States)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  1. The STARLINK software collection

    Science.gov (United States)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  2. An Introduction to the Special Volume on Political Methodology

    Directory of Open Access Journals (Sweden)

    Micah Altman

    2011-08-01

    Full Text Available This special volume of the Journal of Statistical Software on political methodology includes 14 papers, with wide-ranging software contributions of political scientists to their own field, and more generally to statistical data analysis in the social sciences and beyond. Special emphasis is given to software that is written in or can cooperate with the R system for statistical computing.

  3. Beginning software engineering

    CERN Document Server

    Stephens, Rod

    2015-01-01

    Beginning Software Engineering demystifies the software engineering methodologies and techniques that professional developers use to design and build robust, efficient, and consistently reliable software. Free of jargon and assuming no previous programming, development, or management experience, this accessible guide explains important concepts and techniques that can be applied to any programming language. Each chapter ends with exercises that let you test your understanding and help you elaborate on the chapter's main concepts. Everything you need to understand waterfall, Sashimi, agile, RAD, Scrum, Kanban, Extreme Programming, and many other development models is inside!

  4. Software industrial flexible

    OpenAIRE

    Díaz Araya, Daniel; Muñoz, Leandro; Sirerol, Daniel; Oviedo, Sandra; Ibáñez, Francisco S.

    2012-01-01

    This work aims to investigate and propose techniques, methods, and technologies that enable the development of flexible software in industrial environments. The objective is to generate methods and techniques that facilitate the development of flexible software in industrial settings. The research areas are production scheduling systems, software generation for open hardware platforms, and innovation.

  5. Thyroid uptake software

    International Nuclear Information System (INIS)

    Alonso, Dolores; Arista, Eduardo

    2003-01-01

    The DETEC-PC software was developed as a complement to a measurement system (hardware) able to perform iodine thyroid uptake studies. The software was designed according to the principles of object-oriented programming, using the C++ language. It automatically sets the spectrometric measurement parameters and, besides patient measurements, also performs statistical analysis of batches of samples. It includes a PARADOX database holding the information of all measured patients, and a help system covering the system options and the medical concepts related to the thyroid uptake study.

  6. Criteria for software modularization

    Science.gov (United States)

    Card, David N.; Page, Gerald T.; Mcgarry, Frank E.

    1985-01-01

    A central issue in programming practice involves determining the appropriate size and information content of a software module. This study attempted to determine the effectiveness of two widely used criteria for software modularization, strength and size, in reducing fault rate and development cost. Data from 453 FORTRAN modules developed by professional programmers were analyzed. The results indicated that module strength is a good criterion with respect to fault rate, whereas arbitrary module size limitations inhibit programmer productivity. This analysis is a first step toward defining empirically based standards for software modularization.
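The study's core measurement, fault rate as a function of module strength, can be sketched in Python. The module records, strength categories, and numbers below are invented for illustration; they are not the study's FORTRAN data.

```python
from collections import defaultdict

# Hypothetical module records: (strength rating, faults found, executable lines).
# "Strength" rates how single-purpose a module is, as in the study's criterion.
modules = [
    ("high", 0, 120), ("high", 1, 80), ("high", 0, 200),
    ("medium", 2, 150), ("medium", 1, 90),
    ("low", 4, 100), ("low", 3, 60),
]

faults = defaultdict(int)
lines = defaultdict(int)
for strength, n_faults, n_lines in modules:
    faults[strength] += n_faults
    lines[strength] += n_lines

# Fault rate per 1000 executable lines, grouped by module strength.
rates = {s: 1000.0 * faults[s] / lines[s] for s in faults}
print(rates)  # with this toy data, low-strength modules fault most often
```

Grouping by a design attribute rather than by raw size is what lets this kind of analysis separate the two candidate modularization criteria.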

  7. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  8. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    SOFTWARE, LIKE ALL industry products, is the result of complex multinational supply chains with many partners from concept to development to production and maintenance. Global software engineering (GSE), IT outsourcing, and business process outsourcing during the past decade have showed growth...... rates of 10 to 20 percent per year. This instalment of Practitioner’s Digest summarizes experiences and guidance from industry to facilitate knowledge and technology transfer for GSE. It’s based on industry feedback from the annual IEEE International Conference on Global Software Engineering, which had...

  9. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-01-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. In the previous paper the author has already discussed the basics of microprogramming and has studied in some detail two types of new microcircuits. In this paper, methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogrammed circuit itself. (Auth.)

  10. Guide to software export

    CERN Document Server

    Philips, Roger A

    2014-01-01

    An ideal reference source for CEOs, marketing and sales managers, sales consultants, and students of international marketing, Guide to Software Export provides a step-by-step approach to initiating or expanding international software sales. It teaches you how to examine critically your candidate product for exportability; how to find distributors, agents, and resellers abroad; how to identify the best distribution structure for export; and much, much more! Not content with providing just the guidelines for setting up, expanding, and managing your international sales channels, Guide to Software

  11. Software takes command

    CERN Document Server

    Manovich, Lev

    2013-01-01

    Software has replaced a diverse array of physical, mechanical, and electronic technologies used before the 21st century to create, store, distribute and interact with cultural artifacts. It has become our interface to the world, to others, to our memory and our imagination - a universal language through which the world speaks, and a universal engine on which the world runs. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. Offering the first theoretical and historical account of software for media authoring and its effects on the prac

  12. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  13. Sobre software libre

    OpenAIRE

    Matellán Olivera, Vicente; González Barahona, Jesús; Heras Quirós, Pedro de las; Robles Martínez, Gregorio

    2004-01-01

    220 p. "Sobre software libre" brings together almost thirty essays on highly topical issues related to free software (of which Linux is the best-known exponent). The essays the reader will find are divided into thematic blocks that range from intellectual property and the economic and social questions of this model to its use in education and public administration, including one that reviews the history of free software in...

  14. Data structure and software engineering challenges and improvements

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Data structure and software engineering is an integral part of computer science. This volume presents new approaches and methods to knowledge sharing, brain mapping, data integration, and data storage. The author describes how to manage an organization's business process and domain data and presents new software and hardware testing methods. The book introduces a game development framework used as a learning aid in software engineering courses at the university level. It also features a review of social software engineering metrics and methods for processing business information. It explains how to

  15. Nano-electromembrane extraction

    DEFF Research Database (Denmark)

    Payán, María D Ramos; Li, Bin; Petersen, Nickolaj J.

    2013-01-01

    as extraction selectivity. Compared with conventional EME, the acceptor phase volume in nano-EME was down-scaled by a factor of more than 1000. This resulted in a very high enrichment capacity. With loperamide as an example, an enrichment factor exceeding 500 was obtained in only 5 min of extraction...... electrophoresis (CE). In that way the sample preparation performed by nano-EME was coupled directly with a CE separation. Separation performance of 42,000-193,000 theoretical plates could easily be obtained by this direct sample preparation and injection technique that both provided enrichment as well...
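The enrichment capacity described above follows from a simple mass balance: the enrichment factor (EF) is the extraction recovery times the donor-to-acceptor volume ratio, so a 1000-fold down-scaled acceptor phase makes very large EFs possible. A back-of-the-envelope sketch, with all volumes and the recovery chosen hypothetically just to show the arithmetic:

```python
# Hypothetical nano-EME mass balance. EF = C_acceptor,final / C_sample,initial,
# which for an extraction recovery R equals R * (V_sample / V_acceptor).
sample_volume_ul = 200.0      # hypothetical donor (sample) volume, in microliters
acceptor_volume_nl = 160.0    # hypothetical nano-scale acceptor volume, in nanoliters
recovery = 0.45               # hypothetical fraction of analyte transferred

acceptor_volume_ul = acceptor_volume_nl / 1000.0  # nL -> uL
enrichment_factor = recovery * sample_volume_ul / acceptor_volume_ul
print(enrichment_factor)  # the >1000-fold volume downscale is what makes EF > 500 reachable
```

Even at modest recovery, the volume ratio dominates, which is consistent with the enrichment factor exceeding 500 reported for loperamide.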

  16. User book of SPAD software

    International Nuclear Information System (INIS)

    Eschylle, R.

    1987-01-01

    Thanks to computer science, data gathering and accumulation are growing in every domain. Data analysis aims to extract useful information in a form the user can easily read. Among the mathematical methods used, three are implemented in the Spad software: - principal component analysis, - factorial analysis of simple correspondences, - factorial analysis of multiple correspondences. This paper first defines the data tables, together with the terms used by statisticians. The user manual then describes the six analysis steps (or modules) included in the program, and the chains of steps that give classical data analyses. Finally, for each step a ''user notice'' is given, which describes the step parameters after some comments on the purpose of the step [fr
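All three factorial methods listed above reduce to an eigendecomposition of a suitably transformed data table. A minimal sketch of the first, principal component analysis, on a toy table in Python (the data values are invented, and this is of course not SPAD's own code):

```python
import numpy as np

# Toy data table: 5 observations x 3 variables (hypothetical values).
X = np.array([[2.0, 1.0, 0.5],
              [1.5, 0.8, 0.3],
              [3.1, 1.9, 0.9],
              [0.4, 0.2, 0.1],
              [2.6, 1.4, 0.8]])

Xc = X - X.mean(axis=0)               # center each variable (column)
cov = np.cov(Xc, rowvar=False)        # covariance matrix of the variables
vals, vecs = np.linalg.eigh(cov)      # eigendecomposition (ascending order)
order = np.argsort(vals)[::-1]        # sort axes by variance, descending
vals, vecs = vals[order], vecs[:, order]

scores = Xc @ vecs                    # observation coordinates on the new axes
print(vals / vals.sum())              # share of variance carried by each axis
```

The correspondence-analysis modules follow the same pattern, but first rescale the table by its row and column margins before the decomposition.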

  17. Intellectual Property Protection of Software – At the Crossroads of Software Patents and Open Source Software

    OpenAIRE

    Tantarimäki, Maria

    2018-01-01

    The thesis considers the intellectual property protection of software in Europe and in the US, an increasingly important subject as the world globalizes and digitalizes. The special nature of software has challenged intellectual property rights. The current protection of software is based on copyright protection but in this thesis, two other options are considered: software patents and open source software. Software patents provide strong protection for software whereas the pur...

  18. Center for Adaptive Optics | Software

    Science.gov (United States)

    The Center for Adaptive Optics acts as a clearinghouse for distributing adaptive optics software to institutes; it gives specialists in adaptive optics a place to distribute their software. All software is shared on an "as-is" basis, and users should consult the software authors with any

  19. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    Seo, Yong Seok; Seong, Seung Hwan; Park, Keun Ok; Hur, Sub; Kim, Dong Hoon

    2001-01-01

    In order to achieve high quality in the SMART MMIS software, a well-constructed software testing concept is required. This paper establishes a software testing concept to be applied to the SMART MMIS software, in terms of software testing organization, documentation, procedures, and methods. The software testing methods are classified into static analysis of source code and dynamic testing. The dynamic testing methods are discussed from two aspects: white-box and black-box testing. As the software testing concept introduced in this paper is applied to the SMART MMIS software, high-quality software will be produced. In the future, software failure data will be collected through the construction of the SMART MMIS prototyping facility to which the software testing concept of this paper is applied
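The white-box/black-box distinction mentioned above can be made concrete with a toy example (the function and tests below are illustrative only, not taken from the SMART MMIS project):

```python
def saturate(x, lo, hi):
    """Clamp x into the interval [lo, hi] (toy function under test)."""
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

# Black-box tests: derived only from the specification ("clamp into [lo, hi]"),
# without looking at the implementation.
assert saturate(5, 0, 10) == 5     # in range: unchanged
assert saturate(-3, 0, 10) == 0    # below range: clamped to lo
assert saturate(42, 0, 10) == 10   # above range: clamped to hi

# White-box tests: chosen by inspecting the implementation, to force every
# branch and exercise the boundaries where the branch conditions flip.
assert saturate(0, 0, 10) == 0     # x == lo: both guards are false
assert saturate(10, 0, 10) == 10   # x == hi: second guard is false
print("all toy tests passed")
```

Static analysis, the other method class named above, would instead examine the source text itself (for example, flagging unreachable branches) without executing it.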

  20. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  1. Petroleum software profiles

    International Nuclear Information System (INIS)

    Anon.

    1996-01-01

    A profile of twenty-two software packages designed for petroleum exploration and production was provided. Some focussed on the oil and gas engineering industry, and others on mapping systems containing well history files and well data summaries. Still other programs provided accounting systems designed to address the complexities of the oil and gas industry. The software packages reviewed were developed by some of the best-known groups involved in software development for the oil and gas industry, including, among others, Geoquest, the Can Tek Group, Applied Terravision Systems Inc., Neotechnology Consultants Ltd., OGCI Software Inc., Oracle Energy, Production Revenue Information Systems Management, Virtual Computing Services Ltd., and geoLogic Systems Ltd.

  2. Next Generation Software Development

    National Research Council Canada - National Science Library

    Manna, Zohar

    2005-01-01

    Under this grant we have studied the development of a scientifically sound basis for software development that builds on widely used pragmatic methods but is firmly grounded in well-established formal...

  3. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but it also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand the management of distributed software projects, but it also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques into a framework for managing risks in distributed contexts. This framework was developed iteratively...

  4. eSoftwareList

    Data.gov (United States)

    US Agency for International Development — USAID Software Database reporting tool created in Oracle Application Express (APEX). This version provides read only access to a database view of the JIRA SAR...

  5. Software didattico: integrazione scolastica

    Directory of Open Access Journals (Sweden)

    Lucia Ferlino

    1996-01-01

    Full Text Available A discussion of the use of educational software for school integration: it requires awareness of the software's potential effectiveness, and recognition that this effectiveness also lies in the choice of functional products.

  6. Core Flight Software

    Data.gov (United States)

    National Aeronautics and Space Administration — The AES Core Flight Software (CFS) project purpose is to analyze applicability, and evolve and extend the reusability of the CFS system originally developed by...

  7. Tier2 Submit Software

    Science.gov (United States)

    Download this tool for Windows or Mac, which helps facilities prepare a Tier II electronic chemical inventory report. The data can also be exported into the CAMEOfm (Computer-Aided Management of Emergency Operations) emergency planning software.

  8. SEER Data & Software

    Science.gov (United States)

    Options for accessing datasets for incidence, mortality, county populations, standard populations, expected survival, and SEER-linked and specialized data. Plus variable definitions, documentation for reporting and using datasets, statistical software (SEER*Stat), and observational research resources.

  9. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  10. Software for nuclear spectrometry

    International Nuclear Information System (INIS)

    1998-10-01

    The Advisory Group Meeting (AGM) on Software for Nuclear Spectrometry was dedicated to reviewing the present status of software for nuclear spectrometry and to advising on future activities in this field. Because similar AGMs and consultants' meetings had been held in the past, and in an attempt to be more streamlined, this AGM was devoted to the specific field of software for gamma-ray spectrometry. Nevertheless, many of the issues discussed and the recommendations made are of general concern for any software for nuclear spectrometry. The report is organized in sections. The 'Summary' gives the conclusions and recommendations adopted at the AGM. These conclusions and recommendations resulted from the discussions held during and after presentations of the scientific and technical papers, which are reproduced here in full in the following sections.

  11. Software for radiation protection

    International Nuclear Information System (INIS)

    Graffunder, H.

    2002-01-01

    The software products presented are universally usable programs for radiation protection. The systems were designed to establish a comprehensive database specific to radiation protection and, on this basis, to model radiation protection subjects in programs. Development initially focused on the creation of the database: each software product was to access the same nuclide-specific data, so that input errors and differences in spelling were excluded from the outset. This makes the products more compatible with each other and able to exchange data among each other. The software products are modular in design. Functions recurring in radiation protection are always treated the same way in different programs, and also represented the same way on the program surface. The recognition effect makes it easy for users to become familiar with the products quickly. All software products are written in German and are tailored to the administrative needs and codes and regulations in Germany and Switzerland. (orig.) [de]

  12. ITSY Handheld Software Radio

    National Research Council Canada - National Science Library

    Bose, Vanu

    2001-01-01

    .... A handheld software radio platform would enable the construction of devices that could inter-operate with multiple legacy systems, download new waveforms and be used to construct adhoc networks...

  13. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suite for automatically developing ultra-reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production-quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer-aided software engineering product in the industry to concentrate on automatically supporting the development of an ultra-reliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  14. ABC/2 Method Does not Accurately Predict Cerebral Arteriovenous Malformation Volume.

    Science.gov (United States)

    Roark, Christopher; Vadlamudi, Venu; Chaudhary, Neeraj; Gemmete, Joseph J; Seinfeld, Joshua; Thompson, B Gregory; Pandey, Aditya S

    2018-02-01

    Stereotactic radiosurgery (SRS) is a treatment option for cerebral arteriovenous malformations (AVMs) to prevent intracranial hemorrhage. The decision to proceed with SRS is usually based on calculated nidal volume. Physicians commonly use the ABC/2 formula, based on digital subtraction angiography (DSA), when counseling patients for SRS. The aim was to determine whether AVM volume calculated using the ABC/2 method on DSA is accurate when compared to the exact volume calculated from the thin-cut axial sections used for SRS planning. A retrospective search of a neurovascular database was performed to identify AVMs treated with SRS from 1995 to 2015. Maximum nidal diameters in orthogonal planes on DSA images were recorded to determine volume using the ABC/2 formula. Nidal target volume was extracted from operative reports of SRS. Volumes were then compared using descriptive statistics and paired t-tests. Ninety intracranial AVMs were identified. Median volume was 4.96 cm3 [interquartile range (IQR) 1.79-8.85] with SRS planning methods and 6.07 cm3 (IQR 1.3-13.6) with the ABC/2 methodology. Moderate correlation was seen between the SRS and ABC/2 volumes (r = 0.662), and the two methods differed significantly (t = -3.2; P = .002). When AVMs were dichotomized based on ABC/2 volume, significant differences remained (t = 3.1, P = .003 for ABC/2 volume below 7 cm3, with a significant difference also for ABC/2 volume > 7 cm3). The ABC/2 method overestimates cerebral AVM volume when compared to volumetric analysis from SRS planning software. For AVMs > 7 cm3, the overestimation is even greater. SRS planning techniques were also significantly different from values derived from equations for cones and cylinders. Copyright © 2017 by the Congress of Neurological Surgeons
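    For reference, ABC/2 is a hand approximation of the volume of an ellipsoid with orthogonal diameters A, B, C (exactly (π/6)·A·B·C, with π/6 ≈ 0.524 rounded to 1/2). A minimal sketch of the two formulas (illustrative function names, not the study's planning software):

```python
import math

def abc2_volume(a_cm, b_cm, c_cm):
    """ABC/2 estimate (cm^3) from maximum orthogonal nidal diameters."""
    return a_cm * b_cm * c_cm / 2.0

def ellipsoid_volume(a_cm, b_cm, c_cm):
    """Exact volume (cm^3) of an ellipsoid with diameters a, b, c: (pi/6)*a*b*c."""
    return math.pi / 6.0 * a_cm * b_cm * c_cm

# Hypothetical nidus measured as 3.0 x 2.5 x 2.0 cm on DSA:
v_abc2 = abc2_volume(3.0, 2.5, 2.0)            # 7.5 cm^3
v_ellipsoid = ellipsoid_volume(3.0, 2.5, 2.0)  # ~7.85 cm^3
```

    Note that for a perfect ellipsoid ABC/2 slightly underestimates the true value (1/2 < π/6); the overestimation reported in the study arises because irregular nidi occupy less volume than the bounding ellipsoid the formula assumes.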

  15. Estimation of Apple Volume and Its Shape Indentation Using Image Processing Technique and Neural Network

    Directory of Open Access Journals (Sweden)

    M Jafarlou

    2014-04-01

    Full Text Available Physical properties of agricultural products, such as volume, are the most important parameters influencing grading and packaging systems, and they should be measured accurately as they are considered in any good system design. Image processing and neural network techniques are both non-destructive and useful methods which have recently been used for this purpose. In this study, the images of apples were captured from a constant distance and then processed in MATLAB software, and the edges of the apple images were extracted. The interior area of the apple image was divided into thin trapezoidal elements perpendicular to the longitudinal axis. Total volume of the apple was estimated by the summation of the incremental volumes of these elements revolved around the apple's longitudinal axis. The picture of a half-cut apple was also captured in order to obtain the volume of the apple shape's indentation, which was subtracted from the previously estimated total volume. The real volume of the apples was measured using the water displacement method, and the relation between the real volume and the estimated volume was obtained. The t-test and Bland-Altman analysis indicated that the difference between the real volume and the estimated volume was not significant (p > 0.05); the mean difference was 1.52 cm3 and the accuracy of measurement was 92%. Utilizing a neural network with input variables of dimensions and mass increased the accuracy to 97% and decreased the difference between the mean volumes to 0.7 cm3.
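    The element-summation step described above is a solid-of-revolution computation. A minimal sketch, assuming consecutive boundary radii along the longitudinal axis are joined into conical frustum elements (names hypothetical, not the authors' MATLAB code):

```python
import math

def volume_of_revolution(radii_cm, slice_height_cm):
    """Approximate volume (cm^3) of a solid of revolution, given boundary
    radii sampled at uniform steps along the axis of rotation. Each pair
    of consecutive radii bounds a conical frustum element:
        V = (pi * h / 3) * (r1^2 + r1*r2 + r2^2)
    """
    total = 0.0
    for r1, r2 in zip(radii_cm, radii_cm[1:]):
        total += math.pi * slice_height_cm / 3.0 * (r1 * r1 + r1 * r2 + r2 * r2)
    return total

# Sanity check on a cone (base radius 3, height 3): exact volume is 9*pi.
cone = volume_of_revolution([0.0, 1.0, 2.0, 3.0], 1.0)
```

    The radii would come from the extracted edge profile; the indentation volume can be computed the same way from the half-cut image and subtracted.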

  16. Linear accelerator quality assurance using EPIQA software

    International Nuclear Information System (INIS)

    Bozhikov, S.; Sokerov, H.; Tonev, A.; Ivanova, K.

    2012-01-01

    Unlike treatment with static fields, using a dynamic multileaf collimator (dMLC) raises significant dosimetric issues which must be assessed before dynamic therapy can be implemented. The advanced techniques require additional commissioning and quality assurance tests. The results of standard quality assurance (QA) machine tests and commissioning tests for volumetric modulated arc therapy (VMAT) using an electronic portal imaging device (EPID) and the 'EPIQA' software are presented. (authors)

  17. MARS software package status

    International Nuclear Information System (INIS)

    Azhgirej, I.L.; Talanov, V.V.

    2000-01-01

    The MARS software package is intended for simulating nuclear-electromagnetic cascades and the transport of secondary neutrons and muons in heterogeneous media of arbitrary complexity in the presence of magnetic fields. The inclusive approach to describing particle production in nuclear and electromagnetic interactions and in the decay of unstable particles is realized in the package. The MARS software package has been actively applied to solving various radiation physics problems [ru]

  18. MAGIC user's group software

    International Nuclear Information System (INIS)

    Warren, G.; Ludeking, L.; McDonald, J.; Nguyen, K.; Goplen, B.

    1990-01-01

    The MAGIC User's Group has been established to facilitate the use of electromagnetic particle-in-cell software by universities, government agencies, and industrial firms. The software consists of a series of independent executables that are capable of inter-communication. MAGIC, SOS, and μSOS are used to perform electromagnetic simulations, while POSTER is used to provide post-processing capabilities. Each is described in the paper. Use of the codes for Klystrode simulation is discussed.

  19. Global software development

    DEFF Research Database (Denmark)

    Matthiesen, Stina

    2016-01-01

    This overview presents the mid stages of my doctoral research, based on ethnographic work conducted in IT companies in India and in Denmark, on collaborative work within global software development (GSD). In the following I briefly introduce how this research seeks to spark a debate in CSCW by challenging contemporary ideals about software development outsourcing through the exploration of the multiplicities and asymmetric dynamics inherent in the collaborative work of GSD.

  20. Principles of Antifragile Software

    OpenAIRE

    Monperrus, Martin

    2014-01-01

    The goal of this paper is to study and define the concept of "antifragile software". For this, I start from Taleb's statement that antifragile systems love errors, and discuss whether traditional software dependability fits into this class. The answer is somewhat negative, although adaptive fault tolerance is antifragile: the system learns something when an error happens, and always improves. Automatic runtime bug fixing is changing the code in response to errors, fault injection in productio...

  1. Software product quality measurement

    OpenAIRE

    Godliauskas, Eimantas

    2016-01-01

    This paper analyses Ruby product quality measures, suggesting three new measures for Ruby product quality measurement tool Rubocop to measure Ruby product quality characteristics defined in ISO 2502n standard series. This paper consists of four main chapters. The first chapter gives a brief view of software product quality and software product quality measurement. The second chapter analyses object oriented quality measures. The third chapter gives a brief view of the most popular Ruby qualit...

  2. Extraction process

    International Nuclear Information System (INIS)

    Rendall, J.S.; Cahalan, M.J.

    1979-01-01

    A process is described for extracting at least two desired constituents from a mineral, using a liquid reagent which produces the constituents, or compounds thereof, in separable form and independently extracting those constituents, or compounds. The process is especially valuable for the extraction of phosphoric acid and metal values from acidulated phosphate rock, the slurry being contacted with selective extractants for phosphoric acid and metal (e.g. uranium) values. In an example, uranium values are oxidized to uranyl form and extracted using an ion exchange resin. (U.K.)

  3. Solvent extraction

    Energy Technology Data Exchange (ETDEWEB)

    Coombs, D.M.; Latimer, E.G.

    1988-01-05

    It is an object of this invention to provide for the demetallization and general upgrading of heavy oil via a solvent extraction process, and to improve the efficiency of solvent extraction operations. The yield and demetallization of product oil from heavy, high-metal-content oil are maximized by solvent extractions which employ any or all of the following techniques: premixing of a minor amount of the solvent with the feed and using countercurrent flow for the remaining solvent; use of certain solvent/feed ratios; and use of segmental baffle tray extraction column internals and the proper extraction column residence time. The solvent premix/countercurrent flow feature of the invention substantially improves extractions where temperatures and pressures above the critical point of the solvent are used. By using this technique, a greater yield of extract oil can be obtained at the same metals content, or a lower metals-containing extract oil product can be obtained at the same yield. Furthermore, the premixing of part of the solvent with the feed before countercurrent extraction gives high extract oil yields and high-quality demetallization. The solvent/feed ratio features of the invention substantially lower the capital and operating costs for such processes while not suffering a loss in selectivity for metals rejection. The column internals and residence time features of the invention further improve the extractor metals rejection at a constant yield, or allow for an increase in extract oil yield at a constant extract oil metals content. 13 figs., 3 tabs.

  4. Examining software complexity and quality for scientific software

    International Nuclear Information System (INIS)

    Kelly, D.; Shepard, T.

    2005-01-01

    Research has not found a simple relationship between software complexity and software quality, and in particular no relationship between commonly used software complexity metrics and the occurrence of software faults. A study with an example of scientific software from the nuclear power industry illustrates the importance of addressing cognitive complexity, the complexity related to understanding the intellectual content of the software. Simple practices such as aptly named variables contribute more to high-quality software than limiting code sizes. This paper examines the research into complexity and quality and reports on a longitudinal study using the example of nuclear software. (author)

  5. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different from the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine...
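    The shared nest-survival likelihood mentioned above can be sketched for the simplest case of a constant daily survival rate s: a nest observed alive across an interval of t days contributes s^t, and a nest found failed during the interval contributes 1 - s^t. A simplified illustration (not the SAS or Program MARK implementation):

```python
import math

def neg_log_likelihood(s, intervals):
    """Negative log-likelihood for a constant daily nest survival rate s.
    intervals: list of (days, survived) pairs; an interval of t days ending
    with the nest alive contributes s**t, a failed interval 1 - s**t."""
    nll = 0.0
    for t, survived in intervals:
        p = s ** t
        nll -= math.log(p if survived else 1.0 - p)
    return nll

def mle_daily_survival(intervals, grid=10000):
    """Grid-search maximum-likelihood estimate of s on (0, 1)."""
    best_s, best_nll = None, float("inf")
    for i in range(1, grid):
        s = i / grid
        nll = neg_log_likelihood(s, intervals)
        if nll < best_nll:
            best_s, best_nll = s, nll
    return best_s
```

    For example, nine survived one-day intervals and one failure give an estimated daily survival rate of 0.9; the packages compared by Rotella et al. maximize this same likelihood with covariates and nest-age effects added.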

  6. Software Engineering Reviews and Audits

    CERN Document Server

    Summers, Boyd L

    2011-01-01

    Accurate software engineering reviews and audits have become essential to the success of software companies and military and aerospace programs. These reviews and audits define the framework and specific requirements for verifying software development efforts. Authored by an industry professional with three decades of experience, Software Engineering Reviews and Audits offers authoritative guidance for conducting and performing software first article inspections, and functional and physical configuration software audits. It prepares readers to answer common questions for conducting and perform

  7. Business Management Software Axolon ERP

    OpenAIRE

    Axolon ERP Solution

    2018-01-01

    Axolon ERP (www.axolonerp.com), business management software by Micromind, is a comprehensive business management solution. We deliver business management software in Dubai, the UAE and the GCC countries; products also include ERP software, HR & payroll, inventory software, project management, and software development solutions and services in Dubai, UAE for small and medium-sized enterprises (SMEs) in the Middle East, with an easy-to-use, secure and efficient business management...

  8. BNL volume H- source

    International Nuclear Information System (INIS)

    Prelec, K.; Alessi, J.G.

    1991-01-01

    The volume H⁻ ion source under development at Brookhaven is unique in that it has a toroidal plasma region, which feeds ions into the central extraction region through a conically shaped filter field. In pulsed operation, it produced 25 mA of H⁻ in a 1 cm² aperture, with an electron-to-H⁻ ratio of ∼3. At 19 mA, a normalized 90% emittance of 0.44 π mm·mrad has been measured. Up to 50 mA has been extracted through a 1.87 cm² aperture. Although not designed for steady-state operation, up to 6 mA has been extracted d.c. The addition of xenon to the discharge was found to improve the source output by 20-70%. The circular magnetic cusp field geometry was found to be more favorable than radial cusp fields. 4 refs., 5 figs.

  9. Tevatron extraction microcomputer

    International Nuclear Information System (INIS)

    Chapman, L.; Finley, D.A.; Harrison, M.; Merz, W.

    1985-01-01

    Extraction in the Fermilab Tevatron is controlled by a multi-processor Multibus microcomputer system called QXR (Quad eXtraction Regulator). QXR monitors several analog beam signals and controls three sets of power supplies: the "bucker" and "pulse" magnets at a rate of 5760 Hz, and the "QXR" magnets at 720 Hz. QXR supports multiple slow spills (up to a total of 35 seconds) with multiple fast pulses intermixed. It linearizes the slow spill and bucks out the high-frequency components. Fast extraction is done by outputting a variable pulse waveform. Closed-loop learning techniques are used to improve performance from cycle to cycle for both slow and fast extraction. The system is connected to the Tevatron clock system so that it can track the machine cycle. QXR is also connected to the rest of the Fermilab control system, ACNET. Through ACNET, human operators and central computers can monitor and control extraction through communications with QXR. The controls hardware and software both employ some standard and some specialized components. This paper gives an overview of QXR as a control system; another paper (1) summarizes performance.
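    The cycle-to-cycle learning mentioned above can be sketched as an iterative learning update, where a stored feedforward waveform is corrected by a fraction of the previous cycle's spill-rate error at each time sample (an illustrative toy model, not the actual QXR algorithm):

```python
def learn_feedforward(feedforward, measured_spill, target_spill, gain=0.5):
    """One cycle-to-cycle learning update (illustrative sketch): correct
    each sample of the stored feedforward waveform by a fraction of the
    previous cycle's spill-rate error at that sample."""
    return [ff + gain * (target - meas)
            for ff, meas, target in zip(feedforward, measured_spill, target_spill)]

# Toy plant: spill rate responds proportionally to the waveform. Iterating
# over machine cycles flattens the measured spill onto the target.
target = [1.0] * 8          # desired uniform slow-spill rate
ff = [0.0] * 8              # initial feedforward waveform
for _cycle in range(20):
    measured = [0.8 * x for x in ff]    # toy linear plant response
    ff = learn_feedforward(ff, measured, target)
```

    With any proportional plant and a modest gain, the per-sample error contracts geometrically from cycle to cycle, which is the sense in which such a regulator "learns" the spill waveform.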
  11. The Solid* toolset for software visual analytics of program structure and metrics comprehension : From research prototype to product

    NARCIS (Netherlands)

    Reniers, Dennie; Voinea, Lucian; Ersoy, Ozan; Telea, Alexandru

    2014-01-01

    Software visual analytics (SVA) tools combine static program analysis and fact extraction with information visualization to support program comprehension. However, building efficient and effective SVA tools is highly challenging, as it involves extensive software development in program analysis,

  12. Factors negatively influencing knowledge sharing in software development

    Directory of Open Access Journals (Sweden)

    Lucas T. Khoza

    2017-07-01

    Objective: This study seeks to identify factors that negatively influence knowledge sharing in software development in the developing-country context. Method: Expert sampling, a subcategory of purposive sampling, was employed to extract information, views and opinions from experts in the field of information and communication technology, more specifically from those who are involved in software development projects. Four Johannesburg-based software development organisations listed on the Johannesburg Stock Exchange (JSE), South Africa, participated in this research study. Quantitative data were collected using an online questionnaire with closed-ended questions. Results: Findings of this research reveal that job security, motivation, time constraints, physiological factors, communication, resistance to change and rewards are core factors negatively influencing knowledge sharing in software development organisations. Conclusions: Improved understanding of the factors negatively influencing knowledge sharing is expected to assist software development organisations in closing the gap for software development projects failing to meet the triple constraint of time, cost and scope.

  13. The Ettention software package

    International Nuclear Information System (INIS)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-01-01

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.
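    The Kaczmarz row-projection step underlying such block-iterative schemes can be sketched as follows (a plain sequential variant for illustration; the Ettention implementation adds electron-tomography-specific adaptations, blocking, and GPU/Xeon Phi back-ends):

```python
import numpy as np

def kaczmarz(A, b, sweeps=50, relax=1.0):
    """Solve A @ x = b by cyclically projecting the estimate onto the
    hyperplane of each row: x += relax * (b_i - a_i.x) / ||a_i||^2 * a_i.
    In tomography, rows of A model projection rays and b holds the
    measured projection values."""
    x = np.zeros(A.shape[1])
    row_norms = (A * A).sum(axis=1)
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            if row_norms[i] == 0.0:
                continue
            x += relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Toy "projection" system: each measurement sums a subset of 3 pixels.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
x_true = np.array([2.0, 3.0, 1.0])
x_rec = kaczmarz(A, A @ x_true)   # converges to x_true
```

    Block-iterative variants apply this update to groups of rows at once, which is what makes the method amenable to the GPU and many-core platforms the package targets.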

  14. Software reliability assessment

    International Nuclear Information System (INIS)

    Barnes, M.; Bradley, P.A.; Brewer, M.A.

    1994-01-01

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom has spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  15. The Ettention software package

    Energy Technology Data Exchange (ETDEWEB)

    Dahmen, Tim, E-mail: Tim.Dahmen@dfki.de [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany); Marsalek, Lukas [Eyen SE, Na Nivách 1043/16, 141 00 Praha 4 (Czech Republic); Saarland University, 66123 Saarbrücken (Germany); Marniok, Nico [Saarland University, 66123 Saarbrücken (Germany); Turoňová, Beata [Saarland University, 66123 Saarbrücken (Germany); IMPRS-CS, Max-Planck Institute for Informatics, Campus E 1.4, 66123 Saarbrücken (Germany); Bogachev, Sviatoslav [Saarland University, 66123 Saarbrücken (Germany); Trampert, Patrick; Nickels, Stefan [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Slusallek, Philipp [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany)

    2016-02-15

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.

  16. Extraction method

    International Nuclear Information System (INIS)

    Stary, J.; Kyrs, M.; Navratil, J.; Havelka, S.; Hala, J.

    1975-01-01

    Definitions of the basic terms and relations are given, and the current knowledge of the possibilities for extracting elements, oxides, covalently bound halogenides and heteropolyacids is described. Greatest attention is devoted to a detailed analysis of the extraction of chelates and ion associates using diverse agents. For both types of compounds, detailed separation conditions are given and the effects of the individual factors are listed. Attention is also devoted to extractions using mixtures of organic agents and their synergic effects, and to extractions in non-aqueous solvents. The effects of radiation on extraction and the main types of apparatus used for laboratory extractions are described. (L.K.)

  17. Rapid extraction of PCDD/Fs from soil and fly ash samples. Pressurized fluid extraction (PFE) and microwave-assisted extraction (MAE)

    Energy Technology Data Exchange (ETDEWEB)

    Sanz, P.; Fabrellas, B. [Centro de Investigaciones Energeticas Medioambientales y Tecnologicas (CIEMAT), Madrid (Spain)

    2004-09-15

    The main reference extraction method in the analysis of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) is still Soxhlet extraction. However, it requires long extraction times (up to 24 h), large volumes of hazardous organic solvents (100-300 ml), and its automation is limited. Pressurized Fluid Extraction (PFE) and Microwave-Assisted Extraction (MAE) are two relatively new extraction techniques that reduce the time and the volume of solvent required for extraction. However, very different PFE extraction conditions are reported in the literature for the same environmental matrices. MAE is not yet widely applied to the analysis of PCDD/Fs, although it is used for the determination of other organic compounds, such as PCBs and PAHs. In this study, PFE and MAE extraction conditions were optimized to determine PCDDs and PCDFs in fly ash and soil/sediment samples. Conventional Soxhlet extraction with toluene was used to compare the extraction efficiency of both techniques.

  18. Methods, software and datasets to verify DVH calculations against analytical values: Twenty years late(r)

    Energy Technology Data Exchange (ETDEWEB)

    Nelms, Benjamin [Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Stambaugh, Cassandra [Department of Physics, University of South Florida, Tampa, Florida 33612 (United States); Hunt, Dylan; Tonner, Brian; Zhang, Geoffrey; Feygelman, Vladimir, E-mail: vladimir.feygelman@moffitt.org [Department of Radiation Oncology, Moffitt Cancer Center, Tampa, Florida 33612 (United States)

    2015-08-15

    Purpose: The authors designed data, methods, and metrics that can serve as a standard, independent of any software package, to evaluate dose-volume histogram (DVH) calculation accuracy and detect limitations. The authors use simple geometrical objects at different orientations combined with dose grids of varying spatial resolution with linear 1D dose gradients; when combined, ground truth DVH curves can be calculated analytically in closed form to serve as the absolute standards. Methods: DICOM RT structure sets containing a small sphere, cylinder, and cone were created programmatically with axial plane spacing varying from 0.2 to 3 mm. Cylinders and cones were modeled in two different orientations with respect to the IEC 1217 Y axis. The contours were designed to stringently but methodically test voxelation methods required for DVH. Synthetic RT dose files were generated with 1D linear dose gradient and with grid resolution varying from 0.4 to 3 mm. Two commercial DVH algorithms—PINNACLE (Philips Radiation Oncology Systems) and PlanIQ (Sun Nuclear Corp.)—were tested against analytical values using custom, noncommercial analysis software. In Test 1, axial contour spacing was constant at 0.2 mm while dose grid resolution varied. In Tests 2 and 3, the dose grid resolution was matched to varying subsampled axial contours with spacing of 1, 2, and 3 mm, and difference analysis and metrics were employed: (1) histograms of the accuracy of various DVH parameters (total volume, D{sub max}, D{sub min}, and doses to % volume: D99, D95, D5, D1, D0.03 cm{sup 3}) and (2) volume errors extracted along the DVH curves were generated and summarized in tabular and graphical forms. Results: In Test 1, PINNACLE produced 52 deviations (15%) while PlanIQ produced 5 (1.5%). In Test 2, PINNACLE and PlanIQ differed from analytical by >3% in 93 (36%) and 18 (7%) times, respectively. Excluding D{sub min} and D{sub max} as least clinically relevant would result in 32 (15%) vs 5 (2

  19. Methods, software and datasets to verify DVH calculations against analytical values: Twenty years late(r).

    Science.gov (United States)

    Nelms, Benjamin; Stambaugh, Cassandra; Hunt, Dylan; Tonner, Brian; Zhang, Geoffrey; Feygelman, Vladimir

    2015-08-01

    The authors designed data, methods, and metrics that can serve as a standard, independent of any software package, to evaluate dose-volume histogram (DVH) calculation accuracy and detect limitations. The authors use simple geometrical objects at different orientations combined with dose grids of varying spatial resolution with linear 1D dose gradients; when combined, ground truth DVH curves can be calculated analytically in closed form to serve as the absolute standards. DICOM RT structure sets containing a small sphere, cylinder, and cone were created programmatically with axial plane spacing varying from 0.2 to 3 mm. Cylinders and cones were modeled in two different orientations with respect to the IEC 1217 Y axis. The contours were designed to stringently but methodically test voxelation methods required for DVH. Synthetic RT dose files were generated with 1D linear dose gradient and with grid resolution varying from 0.4 to 3 mm. Two commercial DVH algorithms, PINNACLE (Philips Radiation Oncology Systems) and PlanIQ (Sun Nuclear Corp.), were tested against analytical values using custom, noncommercial analysis software. In Test 1, axial contour spacing was constant at 0.2 mm while dose grid resolution varied. In Tests 2 and 3, the dose grid resolution was matched to varying subsampled axial contours with spacing of 1, 2, and 3 mm, and difference analysis and metrics were employed: (1) histograms of the accuracy of various DVH parameters (total volume, Dmax, Dmin, and doses to % volume: D99, D95, D5, D1, D0.03 cm^3) and (2) volume errors extracted along the DVH curves were generated and summarized in tabular and graphical forms. In Test 1, PINNACLE produced 52 deviations (15%) while PlanIQ produced 5 (1.5%). In Test 2, PINNACLE and PlanIQ differed from analytical by >3% in 93 (36%) and 18 (7%) times, respectively. Excluding Dmin and Dmax as least clinically relevant would result in 32 (15%) vs 5 (2%) scored deviations for PINNACLE vs PlanIQ in Test 1, while Test 2
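
    The quantity being benchmarked is simple to state: the cumulative DVH value at dose level D is the fraction of the structure volume receiving at least D. A minimal sketch (illustrative only, names ours; voxelating contours onto the dose grid, which these tests are designed to stress, is the hard part):

```python
# Minimal cumulative DVH sketch: given the dose sampled in each voxel of a
# structure, report the fractional volume receiving at least each dose level.
# (Hypothetical illustration; ignores contour voxelation, partial volumes
# and grid/contour resolution mismatches probed by the paper's tests.)

def cumulative_dvh(voxel_doses, dose_levels):
    n = len(voxel_doses)
    return [sum(1 for d in voxel_doses if d >= level) / n
            for level in dose_levels]

doses = [1.0, 2.0, 3.0, 4.0]          # Gy, one value per voxel
levels = [0.0, 2.5, 5.0]
print(cumulative_dvh(doses, levels))   # [1.0, 0.5, 0.0]
```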

  20. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  1. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  2. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burden tightly budgeted information technology (IT) organizations. The agile software development approach delivers business value early, but its implications for software maintainability are still unknown. The purpose of this quantitative study…

  3. Belle II Software

    International Nuclear Information System (INIS)

    Kuhr, T; Ritter, M

    2016-01-01

    Belle II is a next generation B factory experiment that will collect 50 times more data than its predecessor, Belle. The higher luminosity at the SuperKEKB accelerator leads to higher background levels and requires a major upgrade of the detector. As a consequence, the simulation, reconstruction, and analysis software must also be upgraded substantially. Most of the software has been redesigned from scratch, taking into account the experience from Belle and other experiments and utilizing new technologies. The large amount of experimental and simulated data requires a high level of reliability and reproducibility, even in parallel environments. Several technologies, tools, and organizational measures are employed to evaluate and monitor the performance of the software during development. (paper)

  4. New Media as Software

    Directory of Open Access Journals (Sweden)

    Manuel Portela

    2014-03-01

    Review of Lev Manovich, Software Takes Command: Extending the Language of New Media. London: Bloomsbury, 2013, 358 pp. ISBN 978-1-6235-6817-7. In Lev Manovich’s most recent book, this programmatic interrogation of our medial condition leads to the following question: do media still exist after software? This is the question that triggers Manovich’s dialogue both with computing history and with theories of digital media of recent decades, including the extension of his own previous formulations in The Language of New Media, published in 2001, and which became a major reference work in the field. The subtitle of the new book points precisely to this critical revisiting of his earlier work in the context of ubiquitous computing and accelerated transcoding of social, cultural and artistic practices by software.

  5. LHCb software strategy

    CERN Document Server

    Van Herwijnen, Eric

    1998-01-01

    This document describes the software strategy of the LHCb experiment. The main objective is to reuse designs and code wherever possible. We will implement an architecturally driven design process; this architectural process will be implemented using Object Technology. We aim for platform independence; we will try to take advantage of distributed computing, use industry standards and commercial software, and profit from HEP developments. We will implement a common software process and development environment. One of the major problems that we are immediately faced with is the conversion of our current code from Fortran into an Object Oriented language and the conversion of our current developers to Object technology. Some technical terms related to OO programming are defined in Annex A.1

  6. Test af Software

    DEFF Research Database (Denmark)

    This document constitutes the final report of the network collaboration “Testnet”, carried out in the period 1 April 2006 to 31 December 2008. The network deals in particular with topics in the testing of embedded and technical software, but a number of examples of problems and solutions related to the testing of administrative software are also included. The report is divided into the following 3 parts: Overview. Here we give a summary of the network’s purpose, activities and results; the state of the art of software testing is outlined; and we note that CISS and the network are taking new initiatives. The network. Purpose, participants and topics covered at...

  7. ORNL's DCAL software package

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    2007-01-01

    Oak Ridge National Laboratory has released its Dose and Risk Calculation software, DCAL. DCAL, developed with the support of the U.S. Environmental Protection Agency, consists of a series of computational modules, driven in either an interactive or a batch mode for computation of dose and risk coefficients from intakes of radionuclides or exposure to radionuclides in environmental media. The software package includes extensive libraries of biokinetic and dosimetric data that represent the current state of the art. The software has unique capability for addressing intakes of radionuclides by non-adults. DCAL runs as 32-bit extended DOS and console applications under Windows 98/NT/2000/XP. It is intended for users familiar with the basic elements of computational radiation dosimetry. Components of DCAL have been used to prepare U.S. Environmental Protection Agency's Federal Guidance Reports 12 and 13 and several publications of the International Commission on Radiological Protection. (author)

  8. Aircraft Design Software

    Science.gov (United States)

    1997-01-01

    Successful commercialization of the AirCraft SYNThesis (ACSYNT) tool has resulted in the creation of Phoenix Integration, Inc. ACSYNT has been exclusively licensed to the company, an outcome of a seven-year, $3 million effort to provide unique software technology to a focused design engineering market. Ames Research Center formulated ACSYNT and, working with the Virginia Polytechnic Institute CAD Laboratory, began to design and code a computer-aided design capability for ACSYNT. Using a Joint Sponsored Research Agreement, Ames formed an industry-government-university alliance to improve and foster research and development for the software. As a result of the ACSYNT Institute, the software is becoming a predominant tool for aircraft conceptual design. ACSYNT has been successfully applied to high-speed civil transport configurations, subsonic transports, and supersonic fighters.

  9. Lecture 2: Software Security

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development, testing and deployment. Sebastian Lopienski is CERN’s deputy Computer Security Officer. He works on security strategy and policies; offers internal consultancy and audit services; develops and ...

  10. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods vary both in how they operate and in how they achieve their results. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, the deductive method and model checking are discussed and described, and the pros and cons of each particular method are emphasized. The article considers a classification of test techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or when working with multiple object values. Dependences connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, along with some kinds of tools that can be applied to software when using dynamic analysis methods.
    Based on this work, a conclusion is drawn that describes the most relevant problems of the analysis techniques, methods for their solution and
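
    The static-analysis side of this survey, and in particular the false-positive problem it raises when states from different paths are merged, can be seen even in a toy abstract interpretation over intervals. This is a hypothetical minimal sketch (the `add`/`mul` helpers are ours); real analyzers add branches, loops with widening, and the dependency tracking described above:

```python
# Toy abstract interpretation over intervals: bound an expression's value
# for all inputs without executing it on any concrete input.

def add(a, b):
    """Interval addition: [a0+b0, a1+b1]."""
    return (a[0] + b[0], a[1] + b[1])

def mul(a, b):
    """Interval multiplication: min/max over all corner products."""
    products = [x * y for x in a for y in b]
    return (min(products), max(products))

x = (-2, 3)            # all we know statically: x lies in [-2, 3]
y = (1, 4)             # y in [1, 4]
r = add(mul(x, x), y)  # abstract value of x*x + y
print(r)               # (-5, 13): sound, but x*x really lies in [0, 9];
                       # losing the x-x correlation is what breeds false positives
```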

  11. Limiting volume with modern ventilators.

    Science.gov (United States)

    Wing, Thomas J; Haan, Lutana; Ashworth, Lonny J; Anderson, Jeff

    2015-06-01

    The acute respiratory distress syndrome (ARDS) network low tidal-volume study comparing tidal volumes of 12 ml/kg versus 6 ml/kg was published in 2000. The study was stopped early as data revealed a 22% relative reduction in mortality rate when using 6 ml/kg tidal volume. The current generation of critical care ventilators allows the tidal volume to be set during volume-targeted, assist/control (volume A/C); however, some ventilators include options that may prevent the tidal volume from being controlled. The purpose of this bench study was to evaluate the delivered tidal volume, when these options are active, in a spontaneously breathing lung model using an electronic breathing simulator. Four ventilators were evaluated: CareFusion AVEA (AVEA), Dräger Evita® XL (Evita XL), Covidien Puritan Bennett® 840(TM) (PB 840), and Maquet SERVO-i (SERVO-i). Each ventilator was connected to the Hans Rudolph Electronic Breathing Simulator at an amplitude of 0 cm H2O and then 10 cm H2O. All four ventilators were set to deliver volume A/C, tidal volume 400 ml, respiratory rate 20 bpm, positive end-expiratory pressure 5 cm H2O, peak flowrate 60 L/min. The displayed tidal volume was recorded for each ventilator at the above settings with additional options OFF and then ON. The AVEA has two options in volume A/C: demand breaths and V-sync. When activated, these options allow the patient to exceed the set tidal volume. When using the Evita XL, the option AutoFlow can be turned ON or OFF, and when this option is ON, the tidal volume may vary. The PB 840 does not have any additional options that affect volume delivery, and it maintains the set tidal volume regardless of patient effort. The SERVO-i's demand valve allows additional flow if the patient's inspiratory flowrate exceeds the set flowrate, increasing the delivered tidal volume; this option can be turned OFF with the latest software upgrade. Modern ventilators have an increasing number of optional settings. These settings may

  12. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    A method for estimating software reliability for nuclear safety software is proposed in this paper. This method is based on the software reliability growth model (SRGM), where the behavior of software failure is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects from very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was identified that these models are capable of reasonably estimating the remaining number of software defects, which directly affects the reactor trip functions. The software reliability might be estimated from these modeling equations, and one approach to obtaining a software reliability value is proposed in this paper.
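
    For the classic Goel-Okumoto model, one concrete instance of the NHPP assumption above, the mean value function has the closed form m(t) = a(1 - exp(-b t)). A hypothetical sketch with made-up parameter values (the paper instead infers parameters by Bayesian methods from rare failure data):

```python
import math

# Goel-Okumoto NHPP mean value function: expected cumulative number of
# failures observed by test time t, where a = total expected defects and
# b = per-defect detection rate. Parameter values here are illustrative only.

def expected_failures(t, a=100.0, b=0.05):
    return a * (1.0 - math.exp(-b * t))

def remaining_defects(t, a=100.0, b=0.05):
    return a - expected_failures(t, a, b)

print(round(expected_failures(20.0), 1))  # about 63.2 of 100 defects by t = 20
print(round(remaining_defects(20.0), 1))  # about 36.8 remaining
```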

  13. Agile software development

    CERN Document Server

    Stober, Thomas

    2009-01-01

    Software development is moving towards a more agile and more flexible approach. It turns out that the traditional 'waterfall' model is not supportive in an environment where technical, financial and strategic constraints are changing almost every day. But what is agility? What are today's major approaches? And especially: what is the impact of agile development principles on the development teams, on project management and on software architects? How can large enterprises become more agile and improve their business processes, which have existed for many, many years? What are the limit

  14. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-06-01

    The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, many books have been published whose titles refer to it as an art, role or process. But because software complexity is increasing every year, this paper proposes a new approach, conceiving of testing as a science, since the processes by which tests are applied follow the steps of the scientific method: inputs, processes, outputs. This paper examines the ways in which the characteristics of testing resemble those of a science.

  15. Provider software buyer's guide.

    Science.gov (United States)

    1994-03-01

    To help long term care providers find new ways to improve quality of care and efficiency, Provider magazine presents the fourth annual listing of software firms marketing computer programs for all areas of nursing facility operations. On the following five pages, more than 80 software firms display their wares, with programs such as minimum data set and care planning, dietary, accounting and financials, case mix, and medication administration records. The guide also charts compatible hardware, integration ability, telephone numbers, company contacts, and easy-to-use reader service numbers.

  16. Model of software quality

    OpenAIRE

    Valencia Ayala, Luz Estela; Villa Sánchez, Paula Andréa; Ocampo S., Carlos Alberto

    2009-01-01

    In a globalized market where companies must innovate and improve continuously in order to grow and become more competitive, it is necessary to have access to international quality certifications that give them backing and allow them to remain in this market. Quality certifications in the software industry help companies become more productive by reducing the cost and time of their development work. The software development companies of our country are mostly micro and small...

  17. Security System Software

    Science.gov (United States)

    1993-01-01

    C Language Integrated Production System (CLIPS), a NASA-developed expert systems program, has enabled a security systems manufacturer to design a new generation of hardware. C.CURESystem 1 Plus, manufactured by Software House, is a software-based system that is used with a variety of access control hardware at installations around the world. Users can manage large amounts of information, solve unique security problems and control entry and time scheduling. CLIPS acts as an information management tool when accessed by C.CURESystem 1 Plus. It asks questions about the hardware and, when given the answers, recommends possible quick solutions that can be applied by non-experts.

  18. Software product quality control

    CERN Document Server

    Wagner, Stefan

    2013-01-01

    Quality is not a fixed or universal property of software; it depends on the context and goals of its stakeholders. Hence, when you want to develop a high-quality software system, the first step must be a clear and precise specification of quality. Yet even if you get it right and complete, you can be sure that it will become invalid over time. So the only solution is continuous quality control: the steady and explicit evaluation of a product's properties with respect to its updated quality goals.This book guides you in setting up and running continuous quality control in your environment. Star

  19. Software Safety and Security

    CERN Document Server

    Nipkow, T; Hauptmann, B

    2012-01-01

    Recent decades have seen major advances in methods and tools for checking the safety and security of software systems. Automatic tools can now detect security flaws not only in programs of the order of a million lines of code, but also in high-level protocol descriptions. There has also been something of a breakthrough in the area of operating system verification. This book presents the lectures from the NATO Advanced Study Institute on Tools for Analysis and Verification of Software Safety and Security; a summer school held at Bayrischzell, Germany, in 2011. This Advanced Study Institute was

  20. Maintenance simulation: Software issues

    Energy Technology Data Exchange (ETDEWEB)

    Luk, C.H.; Jette, M.A.

    1995-07-01

    The maintenance of a distributed software system in a production environment involves: (1) maintaining software integrity, (2) maintaining database integrity, (3) adding new features, and (4) adding new systems. These issues will be discussed in general: what they are and how they are handled. This paper will present our experience with a distributed resource management system that accounts for resources consumed, in real time, on a network of heterogeneous computers. The simulated environments used to maintain this system will be presented as they relate to the four maintenance areas.

  1. Processeringsoptimering med Canons software

    DEFF Research Database (Denmark)

    Precht, Helle

    2009-01-01

    . Possibilities for software optimization were studied in relation to optimal image quality and control exposures, to investigate whether it was possible to accept diagnostic image quality and thereby take ALARA as the starting point. Methods and materials: a quantitative experimental study based on experiments with a technical and...... a human phantom. The CD Rad phantom was used as the technical phantom; its images were analyzed with the CD Rad software, yielding an objective IQF value. The human phantom was a lamb pelvis with femur, which according to the NRPB is comparable in absorption to a five-year-old child. The human test images were...

  2. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality...... requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...

  3. Six Sigma software development

    CERN Document Server

    Tayntor, Christine B

    2002-01-01

    Since Six Sigma has had marked success in improving quality in other settings, and since the quality of software remains poor, it seems a natural evolution to apply the concepts and tools of Six Sigma to system development and the IT department. Until now however, there were no books available that applied these concepts to the system development process. Six Sigma Software Development fills this void and illustrates how Six Sigma concepts can be applied to all aspects of the evolving system development process. It includes the traditional waterfall model and in the support of legacy systems,

  4. Inventory of safeguards software

    International Nuclear Information System (INIS)

    Suzuki, Mitsutoshi; Horino, Koichi

    2009-03-01

    This survey activity will serve as a basis for determining what needs may exist in this arena for the development of next-generation safeguards systems and approaches. Twenty-three software tools were surveyed by JAEA and NMCC. Exchanging information regarding existing software tools for safeguards, and discussing a next R and D program for developing a general-purpose safeguards tool, should be beneficial to safeguards system design and is indispensable for evaluating a safeguards system for future nuclear fuel facilities. (author)

  5. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer-Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for the control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  6. Green in software engineering

    CERN Document Server

    Calero Munoz, Coral

    2015-01-01

    This is the first book that presents a comprehensive overview of sustainability aspects in software engineering. Its format follows the structure of the SWEBOK and covers the key areas involved in the incorporation of green aspects in software engineering, encompassing topics from requirement elicitation to quality assurance and maintenance, while also considering professional practices and economic aspects. The book consists of thirteen chapters, which are structured in five parts. First the "Introduction" gives an overview of the primary general concepts related to Green IT, discussing wha

  7. Idioms-based Business Rule Extraction

    NARCIS (Netherlands)

    R Smit (Rob)

    2011-01-01

    This thesis studies the extraction of embedded business rules, using the idioms of the framework in use to identify them. Embedded business rules exist as source code in the software system, and knowledge about them may get lost. Extraction of those business rules could make them accessible

  8. Vacuum extraction

    DEFF Research Database (Denmark)

    Maagaard, Mathilde; Oestergaard, Jeanett; Johansen, Marianne

    2012-01-01

    Objectives. To develop and validate an Objective Structured Assessment of Technical Skills (OSATS) scale for vacuum extraction. Design. Two-part study design: Primarily, development of a procedure-specific checklist for vacuum extraction. Hereafter, validation of the developed OSATS scale for vac...

  9. Electromembrane extraction

    DEFF Research Database (Denmark)

    Huang, Chuixiu; Chen, Zhiliang; Gjelstad, Astrid

    2017-01-01

    Electromembrane extraction (EME) was inspired by solid-phase microextraction and developed from hollow fiber liquid-phase microextraction in 2006 by applying an electric field over the supported liquid membrane (SLM). EME provides rapid extraction, efficient sample clean-up and selectivity based...

  10. Mining dynamic noteworthy functions in software execution sequences.

    Science.gov (United States)

    Zhang, Bing; Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong

    2017-01-01

As the quality of crucial entities can directly affect that of the software, their identification and protection are an important premise for effective software development, management, maintenance and testing, and thus contribute to improving software quality and its ability to withstand attacks. Most analyses and evaluations of important entities, such as code-based static structure analysis, disregard the actual running of the software. In this paper, from the perspective of the software execution process, we propose an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, by decompiling the software and tracking stack changes, execution traces composed of a series of function addresses are acquired. These traces are then modeled as execution sequences and simplified to obtain simplified sequences (SFS), followed by the extraction of patterns from the SFS through a pattern extraction (PE) algorithm. After that, the evaluating indicators inner-importance and inter-importance are designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, the functions are sorted by their noteworthiness. Comparison and contrast were conducted against the results of two traditional complex-network-based node mining methods, namely PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely.
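The mining pipeline described in the abstract (execution traces, simplified sequences, pattern extraction, importance scoring, ranking) can be illustrated roughly as follows. The frequency-based scoring and the additive combination of the two indicators are toy stand-ins, not the paper's actual PE or DNFM algorithms:

```python
from collections import Counter

def simplify(trace):
    """Collapse consecutive repeated calls in an execution trace
    (a toy stand-in for the paper's simplified-sequence (SFS) step)."""
    out = []
    for fn in trace:
        if not out or out[-1] != fn:
            out.append(fn)
    return out

def extract_patterns(sequences, length=2):
    """Count contiguous call patterns of a fixed length
    (a crude proxy for the PE pattern-extraction algorithm)."""
    patterns = Counter()
    for seq in sequences:
        for i in range(len(seq) - length + 1):
            patterns[tuple(seq[i:i + length])] += 1
    return patterns

def noteworthiness(traces):
    """Rank functions by inner-importance (own call frequency) plus
    inter-importance (participation in frequent patterns); the additive
    combination is an assumption, not the paper's exact indicators."""
    simplified = [simplify(t) for t in traces]
    inner = Counter(fn for seq in simplified for fn in seq)
    inter = Counter()
    for pattern, count in extract_patterns(simplified).items():
        for fn in pattern:
            inter[fn] += count
    return sorted(inner, key=lambda fn: inner[fn] + inter[fn], reverse=True)

traces = [["main", "init", "parse", "parse", "eval"],
          ["main", "init", "eval", "log"]]
print(noteworthiness(traces))  # ['init', 'eval', 'main', 'parse', 'log']
```

Functions that both appear often and participate in frequent call patterns rise to the top of the ranking, which is the intuition behind combining the two indicators.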

  11. Model-driven software engineering

    NARCIS (Netherlands)

    Amstel, van M.F.; Brand, van den M.G.J.; Protic, Z.; Verhoeff, T.; Hamberg, R.; Verriet, J.

    2014-01-01

    Software plays an important role in designing and operating warehouses. However, traditional software engineering methods for designing warehouse software are not able to cope with the complexity, size, and increase of automation in modern warehouses. This chapter describes Model-Driven Software

  12. Package-based software development

    NARCIS (Netherlands)

    Jonge, de M.; Chroust, G.; Hofer, C.

    2003-01-01

    The main goal of component-based software engineering is to decrease development time and development costs of software systems, by reusing prefabricated building blocks. Here we focus on software reuse within the implementation of such component-based applications, and on the corresponding software

  13. The fallacy of Software Patents

    CERN Multimedia

    CERN. Geneva

    2015-01-01

Software patents are usually presented as an argument for innovation, but do they really promote innovation? Who really benefits from software patents? This talk attempts to show the problems with software patents and how they can actually harm innovation, having little value for software users and our society in general.

  14. A methodology for software documentation

    OpenAIRE

    Torres Júnior, Roberto Dias; Ahlert, Hubert

    2000-01-01

With the growing complexity of window-based software and the use of object orientation, software development is getting more complex than ever. Based on that, this article presents a methodology for software documentation and analyzes our experience and how this methodology can aid software maintenance

  15. A software tool for automatic classification and segmentation of 2D/3D medical images

    International Nuclear Information System (INIS)

    Strzelecki, Michal; Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur

    2013-01-01

Modern medical diagnosis utilizes techniques for visualization of human internal organs (CT, MRI) or of their metabolism (PET). However, evaluation of the acquired images by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of the partial volume effect in PET images acquired with PET/MR scanners. This article briefly presents the MaZda software package, which supports 2D and 3D medical image analysis aimed at quantification of image texture. MaZda implements procedures for evaluation, selection and extraction of highly discriminative texture attributes, combined with various classification, visualization and segmentation tools. Examples of MaZda application in medical studies are also provided
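Texture-attribute extraction of the kind MaZda performs can be illustrated with a minimal gray-level co-occurrence computation. The feature pair and the image data below are invented for illustration and are far simpler than MaZda's actual attribute set:

```python
from collections import Counter

def glcm_features(image):
    """Horizontal gray-level co-occurrence counts with two classic texture
    attributes (contrast and energy) -- a deliberately simplified example of
    the discriminative texture features a package like MaZda extracts."""
    cooc = Counter()
    pairs = 0
    for row in image:
        for a, b in zip(row, row[1:]):
            cooc[(a, b)] += 1
            pairs += 1
    contrast = sum((a - b) ** 2 * n / pairs for (a, b), n in cooc.items())
    energy = sum((n / pairs) ** 2 for n in cooc.values())
    return {"contrast": contrast, "energy": energy}

smooth = [[1, 1, 1, 1]] * 4   # uniform texture: low contrast, high energy
noisy = [[0, 3, 0, 3]] * 4    # alternating texture: high contrast
print(glcm_features(smooth)["contrast"] < glcm_features(noisy)["contrast"])  # True
```

Attributes like these, computed per tissue region, are what feed the downstream selection and classification tools the abstract mentions.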

  16. A software tool for automatic classification and segmentation of 2D/3D medical images

    Energy Technology Data Exchange (ETDEWEB)

    Strzelecki, Michal, E-mail: michal.strzelecki@p.lodz.pl [Institute of Electronics, Technical University of Lodz, Wolczanska 211/215, 90-924 Lodz (Poland); Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur [Institute of Electronics, Technical University of Lodz, Wolczanska 211/215, 90-924 Lodz (Poland)

    2013-02-21

Modern medical diagnosis utilizes techniques for visualization of human internal organs (CT, MRI) or of their metabolism (PET). However, evaluation of the acquired images by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of the partial volume effect in PET images acquired with PET/MR scanners. This article briefly presents the MaZda software package, which supports 2D and 3D medical image analysis aimed at quantification of image texture. MaZda implements procedures for evaluation, selection and extraction of highly discriminative texture attributes, combined with various classification, visualization and segmentation tools. Examples of MaZda application in medical studies are also provided.

  17. Optimization of palm oil extraction from Decanter cake of small crude palm oil mill by aqueous surfactant solution using RSM

    Science.gov (United States)

    Ahmadi Pirshahid, Shewa; Arirob, Wallop; Punsuvon, Vittaya

    2018-04-01

The use of hexane to extract vegetable oil from oilseeds or seed cake is of growing concern due to its environmental impact, such as its odor and toxicity. In our method, Response Surface Methodology (RSM) was applied to study the optimum condition for extracting decanter cake, obtained from a small crude palm oil mill, with an aqueous surfactant solution. For the first time, we provide an optimum condition from a preliminary study of decanter cake extraction to obtain the maximum oil yield. The result from the preliminary study was then used in the RSM study, employing a Central Composite Design (CCD) that consisted of thirty experiments. The effects of four independent variables on the dependent variables were studied: the concentration of Sodium Dodecyl Sulfate (SDS) as surfactant, the temperature, the weight-to-volume ratio of cake to surfactant solution, and the amount of sodium chloride (NaCl). Data were analyzed using Design-Expert 8 software. The results showed that the optimum conditions for decanter cake extraction were an SDS solution concentration of 0.016 M, an extraction temperature of 73°C, a decanter cake to SDS solution ratio of 1:10 (g:ml) and a NaCl amount of 2% (w/w). This condition gave a 77.05% (w/w) oil yield. The chemical properties of the palm oil extracted by this aqueous surfactant method were further investigated and compared with those from hexane extraction. The results showed that all properties of both extracts were nearly the same.
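The Central Composite Design mentioned above can be sketched generically; for four factors with six center replicates this reproduces the thirty experimental runs the study reports. The coded-point layout and the rotatable-alpha convention are standard CCD practice, not details taken from the paper:

```python
from itertools import product

def central_composite(k, alpha=None, center_runs=2):
    """Coded design points for a k-factor central composite design:
    2**k factorial corners, 2*k axial (star) points at +/-alpha,
    plus replicated center points."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25  # rotatable-design convention
    corners = [list(p) for p in product([-1.0, 1.0], repeat=k)]
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            point = [0.0] * k
            point[i] = a
            axial.append(point)
    centers = [[0.0] * k for _ in range(center_runs)]
    return corners + axial + centers

# Four factors (SDS concentration, temperature, cake-to-solution ratio, NaCl),
# with six center replicates: 16 + 8 + 6 = 30 runs, matching the study's design.
design = central_composite(4, center_runs=6)
print(len(design))  # 30
```

Each coded point is then mapped to physical factor levels before the experiments are run; the fitted second-order model over these runs is what yields the optimum condition.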

  18. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test management. This book examines the principles, concepts, and processes that are fundamental to the software testing function. It is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c

  19. Patterns for Parallel Software Design

    CERN Document Server

    Ortega-Arjona, Jorge Luis

    2010-01-01

Essential reading to understand patterns for parallel programming. Software patterns have revolutionized the way we think about how software is designed, built, and documented, and the design of parallel software requires you to consider other particular design aspects and special skills. From clusters to supercomputers, success heavily depends on the design skills of software developers. Patterns for Parallel Software Design presents a pattern-oriented software architecture approach to parallel software design. This approach is not a design method in the classic sense, but a new way of managin

  20. The Art of Software Testing

    CERN Document Server

    Myers, Glenford J; Badgett, Tom

    2011-01-01

The classic, landmark work on software testing. The hardware and software of computing have changed markedly in the three decades since the first edition of The Art of Software Testing, but this book's powerful underlying analysis has stood the test of time. Whereas most books on software testing target particular development techniques, languages, or testing methods, The Art of Software Testing, Third Edition provides a brief but powerful and comprehensive presentation of time-proven software testing approaches. If your software development project is mission critical, this book is an investme

  1. What Counts in Software Process?

    DEFF Research Database (Denmark)

    Cohn, Marisa

    2009-01-01

    and conversations in negotiating between prescriptions from a model and the contingencies that arise in an enactment. A qualitative field study at two Agile software development companies was conducted to investigate the role of artifacts in the software development work and the relationship between these artifacts...... and the Software Process. Documentation of software requirements is a major concern among software developers and software researchers. Agile software development denotes a different relationship to documentation, one that warrants investigation. Empirical findings are presented which suggest a new understanding...

  2. Software for noise measurements

    International Nuclear Information System (INIS)

    Zyryanov, V.A.

    1987-01-01

The CURS program library, comprising 38 Fortran programs designed for processing discrete experimental data in the form of random or determined periodic processes, is described. The library is based on the modular construction principle, which allows one to create from it any set of programs to solve tasks related to NPP operation and to develop special software

  3. Software complex "remember me"

    OpenAIRE

    Kosheutova, N. V.; Osina, P. M.

    2016-01-01

    The article describes the importance of time management and effective planning in modern society and is devoted to an Android OS application development. It points out the main features of a mobile application such as cross-platform capability and synchronization. Much attention is given to the software architecture as well as user data protection via password hashing methods.

  4. Software management issues

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1990-06-01

The difficulty of managing the software in large HEP collaborations appears to become progressively worse with each new generation of detector. If one were to extrapolate to the SSC, it would become a major problem. This paper explores the possible causes of the difficulty and makes suggestions on what corrective actions should be taken

  5. Application software profiles 2010

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2010-04-15

This article presented information on new software applications designed to facilitate petroleum exploration, drilling and production activities. Computer modelling and analysis enables oil and gas producers to characterize reservoirs, estimate reserves, forecast production, plan operations and manage assets. Seven Calgary-based organizations were highlighted along with their sophisticated software tools, the applications and the new features available in each product. The geoSCOUT version 7.7 by GeoLOGIC Systems Ltd. integrates public and proprietary data on wells, well logs, reserves, pipelines, production, ownership and seismic location data. The Value Navigator and AFE Navigator by Energy Navigator provide control over reserves, production and cash flow forecasting. FAST Harmony, FAST Evolution, FAST CBM, FAST FieldNotes, Fast Piper, FAST RTA, FAST VirtuWell and FAST WellTest by Fekete Associates Inc. provide reserve evaluations for reservoir engineering projects and production data analysis. The esi.manage software program by 3esi improves business results for upstream oil and gas companies through enhanced decision making and workforce effectiveness. WELLFLO, PIPEFLO, FORGAS, OLGA, Drillbench, and MEPO wellbore solutions by Neotec provide unique platforms for flow simulation to optimize oil and gas production systems. Petrel, ECLIPSE, Avocet, PipeSim and Merak software tools by Schlumberger Information Solutions are petroleum systems modelling tools for geologic mapping, visualization modelling and reservoir engineering. StudioSL by Streamsim Technologies Inc. is a modelling tool for optimizing flood management. figs.

  6. Software Geometry in Simulations

    Science.gov (United States)

    Alion, Tyler; Viren, Brett; Junk, Tom

    2015-04-01

The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other Liquid Argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it be a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way that allows multiple authors to collaborate easily. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett Viren is the author of the framework discussed here, the General Geometry Description (GGD).

  7. Software configuration management

    International Nuclear Information System (INIS)

    Arribas Peces, E.; Martin Faraldo, P.

    1993-01-01

    Software Configuration Management is directed towards identifying system configuration at specific points of its life cycle, so as to control changes to the configuration and to maintain the integrity and traceability of the configuration throughout its life. SCM functions and tasks are presented in the paper

  8. Patterns in Software Development

    DEFF Research Database (Denmark)

    Corry, Aino Vonge

    the university and I entered a project to industry within Center for Object Technology (COT). I focused on promoting the pattern concept to the Danish software industry in order to help them take advantage of the benefits of applying patterns in system development. In the obligatory stay abroad, I chose to visit...

  9. Open Source Software Acquisition

    DEFF Research Database (Denmark)

    Holck, Jesper; Kühn Pedersen, Mogens; Holm Larsen, Michael

    2005-01-01

Lately we have seen a growing interest from both public and private organisations to adopt Open Source Software (OSS), not only for a few specific applications but also on a more general level throughout the organisation. As a consequence, the organisations' decisions on adoption of OSS are becoming...

  10. SEER*Stat Software

    Science.gov (United States)

    If you have access to SEER Research Data, use SEER*Stat to analyze SEER and other cancer-related databases. View individual records and produce statistics including incidence, mortality, survival, prevalence, and multiple primary. Tutorials and related analytic software tools are available.

  11. Improving Agile Software Practice

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte

    2006-01-01

Software process improvement in small and agile organizations is often problematic, but achieving good SPI assessments can still be necessary to stay in the market or to meet the demands of multinational owners. The traditional norm-driven, centralized and control-centered improvement approaches have...

  12. Software Defined Coded Networking

    DEFF Research Database (Denmark)

    Di Paola, Carla; Roetter, Daniel Enrique Lucani; Palazzo, Sergio

    2017-01-01

    the quality of each link and even across neighbouring links and using simulations to show that an additional reduction of packet transmission in the order of 40% is possible. Second, to advocate for the use of network coding (NC) jointly with software defined networking (SDN) providing an implementation...

  13. MOCASSIN-prot software

    Science.gov (United States)

MOCASSIN-prot is a software tool, implemented in Perl and Matlab, for constructing protein similarity networks to classify proteins. Both domain composition and quantitative sequence similarity information are utilized in constructing the directed protein similarity networks. For each reference protein i...

  14. Writing testable software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Knirk, D. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

This tutorial identifies common problems in analyzing requirements and constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.

  15. Green Software Products

    NARCIS (Netherlands)

    Jagroep, E.A.

    2017-01-01

    The rising energy consumption of the ICT industry has triggered a quest for more green, energy efficient ICT solutions. The role of software as the true consumer of power and its potential contribution to reach sustainability goals has increasingly been acknowledged. At the same time, it is shown to

  16. Iterative software kernels

    Energy Technology Data Exchange (ETDEWEB)

    Duff, I.

    1994-12-31

This workshop focuses on kernels for iterative software packages. Specifically, the three speakers discuss various aspects of sparse BLAS kernels. Their topics are: 'Current status of user level sparse BLAS'; 'Current status of the sparse BLAS toolkit'; and 'Adding matrix-matrix and matrix-matrix-matrix multiply to the sparse BLAS toolkit'.

  17. APPHi: Automated Photometry Pipeline for High Cadence Large Volume Data

    Science.gov (United States)

    Sánchez, E.; Castro, J.; Silva, J.; Hernández, J.; Reyes, M.; Hernández, B.; Alvarez, F.; García T.

    2018-04-01

APPHi (Automated Photometry Pipeline) carries out aperture and differential photometry of TAOS-II project data. It is computationally efficient and can also be used with other astronomical wide-field image data. APPHi works with large volumes of data and handles both FITS and HDF5 formats. Due to the large number of stars that the software has to handle in an enormous number of frames, it is optimized to automatically find the best values for the parameters used to carry out the photometry, such as the mask size for the aperture, the size of the window for extraction of a single star, and the count threshold for detecting a faint star. Although intended to work with TAOS-II data, APPHi can analyze any set of astronomical images and is a robust and versatile tool for performing stellar aperture and differential photometry.
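The aperture and differential photometry that APPHi automates can be sketched in miniature. The functions and the synthetic frame below are illustrative only and unrelated to APPHi's actual implementation:

```python
import math

def aperture_sum(image, cx, cy, radius):
    """Sum pixel counts inside a circular aperture centered at column cx,
    row cy -- a toy version of the aperture step APPHi automates."""
    total = 0.0
    r2 = radius * radius
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if (x - cx) ** 2 + (y - cy) ** 2 <= r2:
                total += value
    return total

def differential_magnitude(image, target, reference, radius=2):
    """Brightness of a target star relative to a reference star, in magnitudes."""
    flux_target = aperture_sum(image, *target, radius)
    flux_reference = aperture_sum(image, *reference, radius)
    return -2.5 * math.log10(flux_target / flux_reference)

# Tiny synthetic frame: a bright star at column 2 and a fainter one at column 7.
frame = [[1.0] * 10 for _ in range(5)]
frame[2][2] = 100.0
frame[2][7] = 50.0
print(round(differential_magnitude(frame, (2, 2), (7, 2)), 3))  # -0.642
```

A production pipeline would additionally subtract the sky background and, as the abstract notes, tune the aperture radius and detection threshold automatically rather than taking them as fixed arguments.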

  18. Flight Software Math Library

    Science.gov (United States)

    McComas, David

    2013-01-01

The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. The library depends only on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas and code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter order, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance, Navigation and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the library to be maintained with the same strategy used in its initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as in other environments, such as the GN&C analysts' simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.

  19. TWRS engineering bibliography software listing

    International Nuclear Information System (INIS)

    Husa, E.I.

    1995-01-01

This document contains the computer software listing for the Engineering Bibliography software, developed by E. Ivar Husa. This software is in the working prototype stage of development; the code has not been tested against requirements. TWRS Engineering created this software for engineers to share bibliographic references across the Hanford site network (HLAN). The software is intended to store several hundred to several thousand references (a compendium with limited range). Code changes are needed to support a larger number of references

  20. Interface-based software testing

    OpenAIRE

    Aziz Ahmad Rais

    2016-01-01

    Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of softwar...

  1. Self-assembling software generator

    Science.gov (United States)

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
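The three inspection steps described in the patent abstract (what entities to generate, how they will be linked, and what logic each executes) can be mimicked with a small spec-driven generator. The dictionary layout and entity names below are invented for illustration, not taken from the patent:

```python
import types

# Hypothetical task specification; the key names and entity names are
# illustrative assumptions.
task_spec = {
    "entities": ["reader", "doubler", "printer"],
    "logic": {
        "reader": lambda data: list(data),
        "doubler": lambda data: [x * 2 for x in data],
        "printer": lambda data: data,
    },
    "links": [("reader", "doubler"), ("doubler", "printer")],
}

def generate_task(spec):
    """Mirror the claimed steps: inspect the spec for which entities to
    generate, attach the logic each executes, and link them into one
    executable task."""
    entities = {name: types.SimpleNamespace(run=spec["logic"][name], next=None)
                for name in spec["entities"]}
    for src, dst in spec["links"]:
        entities[src].next = entities[dst]

    def executable_task(data):
        node = entities[spec["entities"][0]]
        while node is not None:
            data = node.run(data)
            node = node.next
        return data

    return executable_task

task = generate_task(task_spec)
print(task([1, 2, 3]))  # [2, 4, 6]
```

The point of the pattern is that the executable behavior is assembled entirely from the specification data structure, so changing the task means editing the spec rather than the generator.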

  2. Product-oriented Software Certification Process for Software Synthesis

    Science.gov (United States)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  3. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality into software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing to properly understand or design the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software's overall testability
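Testing against an interface contract rather than a GUI, as the article advocates, might look like this minimal sketch; the Calculator interface and its single method are a hypothetical example, not from the article:

```python
from abc import ABC, abstractmethod

class Calculator(ABC):
    """The contract under test: quality is asserted against this interface,
    not against whatever GUI sits on top of it."""
    @abstractmethod
    def add(self, a: float, b: float) -> float: ...

class SimpleCalculator(Calculator):
    """One concrete implementation; any implementation can be dropped in."""
    def add(self, a, b):
        return a + b

def run_interface_tests(impl: Calculator):
    """Exercise every method of the contract directly, independent of any UI."""
    assert impl.add(2, 3) == 5
    assert impl.add(-1, 1) == 0
    return "all interface checks passed"

print(run_interface_tests(SimpleCalculator()))  # all interface checks passed
```

Because the tests target the interface, they can run as soon as any implementation exists, which is what enables the early, automatable testing the abstract calls for.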

  4. Automating the Transformational Development of Software. Volume 2. Appendices.

    Science.gov (United States)

    1983-03-01

We will present the development as an alternating series of goals and methods for achieving those goals. Goals posted by the user will be

  5. Aviation Trainer Technology Test Plan. Volume II. Software Development

    Science.gov (United States)

    1991-11-25

  6. Software Assurance Curriculum Project Volume 4: Community College Education

    Science.gov (United States)

    2011-09-01

Prerequisites: no previous programming or computer science experience expected; Precalculus-ready (that is, proficiency sufficient to enter a college-level precalculus course); English Composition I-ready (that is, proficiency sufficient to enter a college-level English I course). Co-requisite: Discrete

  7. Software Design Document PVD CSCI (3). Volume 2, Appendices

    Science.gov (United States)

    1991-06-01

  8. Turbine Engine Control Synthesis. Volume 2. Simulation and Controller Software

    Science.gov (United States)

    1975-03-01

  9. Predictive Software Cost Model Study. Volume I. Final Technical Report.

    Science.gov (United States)

    1980-06-01

development phase to identify computer resources necessary to support computer programs after transfer of program management responsibility and system... classical model development with refinements specifically applicable to avionics systems. The refinements are the result of the Phase I literature search

  10. Software Independent Verification and Validation (SIV&V) Simplified

    Science.gov (United States)

    2006-12-01

requirements are extracted and traced to the developer's Software and Hardware Test Plans (STP and HTP). This ensures adequacy of the test plans and also

  11. An online database for plant image analysis software tools

    OpenAIRE

    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire

    2013-01-01

Background: Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software that is...

  12. Reaction Wheel Disturbance Model Extraction Software, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Reaction wheel disturbances are some of the largest sources of noise on sensitive telescopes. Such wheel-induced mechanical noises are not well characterized....

  13. Reaction Wheel Disturbance Model Extraction Software, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Reaction wheel mechanical noise is one of the largest sources of disturbance forcing on space-based observatories. Such noise arises from mass imbalance, bearing...

  14. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modification. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is an expository account of a viable, high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process differs significantly from a conventional process in its rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements and software design, and verifying the design against the requirements and the code against the design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also fosters a high degree of confidence in the 'correctness' of the software produced, and permits a relatively simple and straightforward code implementation effort. 1 fig., 10 refs. (Author)
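
    The information-hiding decomposition described above can be illustrated with a minimal sketch (all names are hypothetical, not taken from the Wolsung shutdown system software): each module exposes a narrow interface and hides its data representation, so a change to the representation stays confined to one module.

```python
class TripParameterTable:
    """Hypothetical module that hides the storage of trip setpoints
    behind a narrow interface; callers never see the internal dict."""

    def __init__(self):
        # Hidden representation; could become a sorted array or a
        # ROM lookup table without changing any caller.
        self._setpoints = {}

    def set_point(self, channel: str, limit: float) -> None:
        self._setpoints[channel] = limit

    def exceeds_limit(self, channel: str, reading: float) -> bool:
        # The only decision exported to callers; the comparison rule
        # and storage format can change behind this interface.
        return reading > self._setpoints[channel]


table = TripParameterTable()
table.set_point("pressure", 12.5)
print(table.exceeds_limit("pressure", 13.0))  # True
```

    If an error is later found in how setpoints are stored or compared, the affected scope is confined to this one class, which is exactly the maintainability property the abstract emphasises.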

  15. Accelerated Solvent Extraction: An Innovative Sample Extraction Technique for Natural Products

    International Nuclear Information System (INIS)

    Hazlina Ahmad Hassali; Azfar Hanif Abd Aziz; Rosniza Razali

    2015-01-01

    Accelerated solvent extraction (ASE) is one of the novel techniques that have been developed for the extraction of phytochemicals from plants in order to shorten the extraction time, decrease solvent consumption, increase the extraction yield and enhance the quality of extracts. The technique combines elevated temperature and pressure with liquid solvents. This paper gives a brief overview of the accelerated solvent extraction technique for sample preparation and its application to the extraction of natural products. Through practical examples, the effects of operational parameters such as temperature, volume of solvent used and extraction time on the extraction yield and overall performance of ASE are discussed. It is demonstrated that the ASE technique allows reduced solvent consumption and shorter extraction times, while the extraction yields are even higher than those obtained with conventional methods. (author)

  16. TOUGH2 software qualification

    International Nuclear Information System (INIS)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.
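
    TOUGH2 itself integrates coupled multiphase mass- and energy-balance equations; as a deliberately simplified illustration of the class of discretized conservation law such simulators advance in time, the sketch below takes one explicit finite-difference step of the 1-D heat equation (the grid, values and names are hypothetical, not TOUGH2's formulation).

```python
def heat_step(T, alpha, dx, dt):
    """One explicit finite-difference step of dT/dt = alpha * d2T/dx2
    on a 1-D grid with fixed-temperature boundary nodes.
    (Explicit scheme: stable only while alpha * dt / dx**2 <= 0.5.)"""
    T_new = T[:]
    for i in range(1, len(T) - 1):
        T_new[i] = T[i] + alpha * dt / dx**2 * (T[i + 1] - 2 * T[i] + T[i - 1])
    return T_new

# Hot centre, cold edges: one step moves heat toward the neighbours.
T = [0.0, 0.0, 100.0, 0.0, 0.0]
T = heat_step(T, alpha=1.0e-6, dx=0.1, dt=1.0)
```

    Production simulators such as TOUGH2 solve far richer physics (multiphase flow, phase change, fully implicit time stepping), but each time step is still an update of discretized balance equations of this general shape.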

  17. Software Configurable Multichannel Transceiver

    Science.gov (United States)

    Freudinger, Lawrence C.; Cornelius, Harold; Hickling, Ron; Brooks, Walter

    2009-01-01

    Emerging test instrumentation and test scenarios increasingly require network communication to manage complexity. Adapting wireless communication infrastructure to accommodate challenging testing needs can benefit from reconfigurable radio technology. A fundamental requirement for a software-definable radio system is independence from carrier frequencies, one of the radio components that to date has seen only limited progress toward programmability. This paper overviews an ongoing project to validate the viability of a promising chipset that performs conversion of radio frequency (RF) signals directly into digital data for the wireless receiver and, for the transmitter, converts digital data into RF signals. The Software Configurable Multichannel Transceiver (SCMT) enables four transmitters and four receivers in a single unit the size of a commodity disk drive, programmable for any frequency band between 1 MHz and 6 GHz.
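
    Downstream of the direct RF-to-digital conversion described above, a software-defined receiver typically shifts the band of interest to baseband with a digital mixer. The sketch below is a generic quadrature downconversion, not the SCMT's actual signal chain: the sampled signal is multiplied by a complex exponential at the carrier frequency, after which low-pass filtering and decimation would follow.

```python
import cmath
import math

def downconvert(samples, carrier_hz, sample_rate):
    """Mix real-valued samples down to complex baseband by multiplying
    each sample by exp(-j*2*pi*f_c*n/f_s); filtering comes next."""
    return [s * cmath.exp(-2j * math.pi * carrier_hz * n / sample_rate)
            for n, s in enumerate(samples)]

# A pure tone at the carrier frequency mixes down to (near) DC.
fs, fc = 48_000, 6_000
tone = [math.cos(2 * math.pi * fc * n / fs) for n in range(48)]
baseband = downconvert(tone, fc, fs)
```

    Averaging the baseband samples of a tone at the carrier frequency recovers a constant complex value carrying the tone's amplitude and phase, which is why programmable carrier frequency is such a pivotal capability for a software-definable radio.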

  18. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: Choosing the right architecture for the market – laboratory, military, or commercial Hardware platforms – FPGAs, GPPs, specialized and hybrid devices Standardization efforts to ens...

  19. TOUGH2 software qualification

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  20. Guidance and Control Software,

    Science.gov (United States)

    1980-05-01

    user, by forcing him subconsciously to make faster decisions than necessary and giving him fewer choices than possible. It may be compared to the... reprogramming, and two real-time references. Interfaced to the main computer but still within the same physical case are a 12-bit HUD processor, an HDD...redesigned and reprogrammed many areas of the UPDATE I Mission Software to rectify this problem. The lesson learned was that the in-house staff must devise