WorldWideScience

Sample records for extraction software volume

  1. Sandia Software Guidelines, Volume 2. Documentation

    Energy Technology Data Exchange (ETDEWEB)

    1995-09-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standards for software documentation, this volume provides guidance in the selection of an adequate document set for a software project and example formats for many types of software documentation. A tutorial on life cycle documentation is also provided. Extended document thematic outlines and working examples of software documents are available on electronic media as an extension of this volume.

  2. The SIFT hardware/software systems. Volume 2: Software listings

    Science.gov (United States)

    Palumbo, Daniel L.

    1985-01-01

    This document contains software listings of the SIFT operating system and application software. The software is coded for the most part in a variant of the Pascal language, Pascal*. Pascal* is a cross-compiler running on the VAX and Eclipse computers. The output of Pascal* is BDX-390 assembler code. When necessary, modules are written directly in BDX-390 assembler code. The listings in this document supplement the description of the SIFT system found in Volume 1 of this report, A Detailed Description.

  3. Collected software engineering papers, volume 11

    Science.gov (United States)

    1993-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1992 through November 1993. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the 11th such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document.

  4. Collected software engineering papers, volume 12

    Science.gov (United States)

    1994-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1993 through October 1994. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the 12th such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document.

  5. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  6. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    Science.gov (United States)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  7. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  8. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    Science.gov (United States)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  9. Sandia software guidelines. Volume 3. Standards, practices, and conventions

    Energy Technology Data Exchange (ETDEWEB)

    1986-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies software standards, conventions, and practices. These guidelines are the result of a collective effort within Sandia National Laboratories to define recommended deliverables and to document standards, practices, and conventions which will help ensure quality software. 66 refs., 5 figs., 6 tabs.

  10. Software for Extracting 3D - MSSTs

    DEFF Research Database (Denmark)

    Somchaipeng, Kerawit; Sporring, Jon; Kreiborg, Sven

    2003-01-01

The deep structure of an image is investigated, and a Multi-Scale Singularity Tree (MSST) is constructed based on the pair-wise annihilations of critical points. This report contains two main contributions. Firstly, we describe a fast, simple, and robust method of extracting feature lines from da… structures of the image. The software described in this report is available through the EU-project Deep Structure, Singularities, and Computer Vision.

  11. Collected software engineering papers, volume 2

    Science.gov (United States)

    1983-01-01

    Topics addressed include: summaries of the software engineering laboratory (SEL) organization, operation, and research activities; results of specific research projects in the areas of resource models and software measures; and strategies for data collection for software engineering research.

  12. Reaction Wheel Disturbance Model Extraction Software - RWDMES

    Science.gov (United States)

    Blaurock, Carl

    2009-01-01

The RWDMES is a tool for modeling the disturbances imparted on spacecraft by spinning reaction wheels. Reaction wheels are usually the largest disturbance source on a precision pointing spacecraft, and can be the dominating source of pointing error. Accurate knowledge of the disturbance environment is critical to accurate prediction of the pointing performance. In the past, it has been difficult to extract an accurate wheel disturbance model since the forcing mechanisms are difficult to model physically, and the forcing amplitudes are filtered by the dynamics of the reaction wheel. RWDMES captures the wheel-induced disturbances using a hybrid physical/empirical model that is extracted directly from measured forcing data. The empirical models capture the tonal forces that occur at harmonics of the spin rate, and the broadband forces that arise from random effects. The empirical forcing functions are filtered by a physical model of the wheel structure that includes spin-rate-dependent moments (gyroscopic terms). The resulting hybrid model creates a highly accurate prediction of wheel-induced forces. It accounts for variation in disturbance frequency, as well as the shifts in structural amplification by the whirl modes, as the spin rate changes. This software provides a point-and-click environment for producing accurate models with minimal user effort. Where conventional approaches may take weeks to produce a model of variable quality, RWDMES can create a demonstrably high-accuracy model in two hours. The software consists of a graphical user interface (GUI) that enables the user to specify all analysis parameters, to evaluate analysis results, and to iteratively refine the model. Underlying algorithms automatically extract disturbance harmonics, initialize and tune harmonic models, and initialize and tune broadband noise models. The component steps are described in the RWDMES user's guide and include converting time domain data to waterfall PSDs (power spectral densities).
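The first processing step listed in the user's guide, converting time-domain data to waterfall PSDs, can be sketched as a periodogram computed per time chunk (each chunk spanning a different wheel speed). This is an illustrative sketch, not the RWDMES implementation; the function name and the equal-chunk splitting scheme are assumptions.

```python
import numpy as np

def waterfall_psd(force, fs, n_chunks):
    """Split a force time history into chunks and compute a one-sided
    Hann-windowed periodogram PSD per chunk; stacking the rows gives
    the waterfall (chunk index / wheel speed vs. frequency)."""
    chunks = np.array_split(np.asarray(force, dtype=float), n_chunks)
    n = min(len(c) for c in chunks)        # common length -> common freq bins
    window = np.hanning(n)
    rows = []
    for c in chunks:
        c = c[:n] - c[:n].mean()           # remove DC offset per chunk
        spec = np.fft.rfft(c * window)
        # normalize by sampling rate and window power (PSD units: x^2/Hz)
        pxx = (np.abs(spec) ** 2) / (fs * np.sum(window ** 2))
        pxx[1:-1] *= 2.0                   # fold in negative frequencies
        rows.append(pxx)
    return np.fft.rfftfreq(n, d=1.0 / fs), np.vstack(rows)
```

For a wheel spin-up test, each row then shows the tonal harmonics drifting upward in frequency as the spin rate increases.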

  13. Volume-Enclosing Surface Extraction

    CERN Document Server

    Schlei, B R

    2010-01-01

    A new method is presented here, which allows one to construct triangular surfaces from three-dimensional data sets, such as 3D image data and/or numerical simulation data that are based on regularly shaped, cubic lattices. This novel volume-enclosing surface extraction technique, which has been named VESTA, is guaranteed to always produce surfaces that do not contain any holes, e.g., in contrast to the well-known and very popular Marching Cubes algorithm, which has been developed by W.E. Lorensen and H.E. Cline in the mid-1980s. VESTA is not template based. In fact, the surface tiles are determined with a fast and robust construction technique. Among other things, VESTA's relationship to the DICONEX algorithm is explained, which -- in a lower-dimensional analogy -- produces contours from two-dimensional data sets, such as 2D gray-level images. In particular, the generation of isosurfaces from initially created VESTA surfaces is demonstrated here for the very first time. A few examples are provided, namely in ...
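The hole-free guarantee that distinguishes VESTA has a lower-tech cousin: collecting the axis-aligned voxel faces where the data crosses the iso-value (a cuberille-style surface). The sketch below is illustrative only, not the VESTA algorithm (which constructs smoother triangular tiles); the function name is an assumption.

```python
import numpy as np

def boundary_face_count(volume, iso):
    """Count axis-aligned voxel faces separating values >= iso from
    values < iso. Every inside/outside flip contributes exactly one
    face, so the collected faces always form a closed surface."""
    inside = np.asarray(volume) >= iso
    count = 0
    for axis in range(3):
        # pad with 'outside' so faces on the volume boundary are counted
        pad = [(1, 1) if a == axis else (0, 0) for a in range(3)]
        padded = np.pad(inside, pad, constant_values=False)
        lo = np.take(padded, range(padded.shape[axis] - 1), axis=axis)
        hi = np.take(padded, range(1, padded.shape[axis]), axis=axis)
        count += int(np.count_nonzero(lo ^ hi))
    return count

# a single voxel above the iso-value exposes all 6 of its faces
print(boundary_face_count(np.ones((1, 1, 1)), 0.5))  # 6
```

The closure property holds by construction: the face set is the boundary of a union of cubes, which is always watertight, whereas template-based approaches such as Marching Cubes can produce ambiguous (and hence holed) configurations.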

  14. Collected software engineering papers, volume 6

    Science.gov (United States)

    1988-01-01

    A collection is presented of technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period 1 Jun. 1987 to 1 Jan. 1989. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. For the convenience of this presentation, the twelve papers contained here are grouped into three major categories: (1) Software Measurement and Technology Studies; (2) Measurement Environment Studies; and (3) Ada Technology Studies. The first category presents experimental research and evaluation of software measurement and technology; the second presents studies on software environments pertaining to measurement. The last category represents Ada technology and includes research, development, and measurement studies.

  15. Collected software engineering papers, volume 7

    Science.gov (United States)

    1989-01-01

    A collection is presented of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period Dec. 1988 to Oct. 1989. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. For the convenience of this presentation, the seven papers contained here are grouped into three major categories: (1) Software Measurement and Technology Studies; (2) Measurement Environment Studies; and (3) Ada Technology Studies. The first category presents experimental research and evaluation of software measurement and technology; the second presents studies on software environments pertaining to measurement. The last category represents Ada technology and includes research, development, and measurement studies.

  16. CADDIS Volume 4. Data Analysis: Download Software

    Science.gov (United States)

Overview of the data analysis tools available for download on CADDIS. Provides instructions for downloading and installing CADStat, access to a Microsoft Excel macro for computing SSDs, and a brief overview of command-line use of R, a statistical software package.

  17. Guidance and Control Software Project Data - Volume 1: Planning Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.

  18. Guidance and Control Software Project Data - Volume 2: Development Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the development documents from the GCS project. Volume 2 contains three appendices: A. Guidance and Control Software Development Specification; B. Design Description for the Pluto Implementation of the Guidance and Control Software; and C. Source Code for the Pluto Implementation of the Guidance and Control Software

  19. Extracting excited mesons from the finite volume

    Energy Technology Data Exchange (ETDEWEB)

    Doring, Michael [George Washington Univ., Washington, DC (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)]

    2014-12-01

    As quark masses come closer to their physical values in lattice simulations, finite volume effects dominate the level spectrum. Methods to extract excited mesons from the finite volume are discussed, like moving frames in the presence of coupled channels. Effective field theory can be used to stabilize the determination of the resonance spectrum.

  20. Guidance and Control Software Project Data - Volume 3: Verification Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  1. Software for chemical and extractive metallurgy

    Science.gov (United States)

    Morris, Arthur E.; Stephenson, James B.; Wadsley, Michael W.

    1990-04-01

    A dramatic software migration has occurred from mainframe to microcomputer since the first conference in this biennial series was held in 1985, and this migration will continue. Two major observations may be made—there is a lack of critically evaluated thermodynamic data, especially for solution models, and there is a continuing need for accurate thermodynamic property measurements to be utilized by the current and next generation of increasingly powerful and unique thermodynamic packages. The authors encourage researchers working on the development of solution phase models and solution databases to consider the format of this data so it can readily be incorporated into thermodynamic software.

  2. TANGO standard software to control the Nuclotron beam slow extraction

    Science.gov (United States)

    Andreev, V. A.; Volkov, V. I.; Gorbachev, E. V.; Isadov, V. A.; Kirichenko, A. E.; Romanov, S. V.; Sedykh, G. S.

    2016-09-01

    TANGO Controls is a basis of the NICA control system. The report describes the software which integrates the Nuclotron beam slow extraction subsystem into the TANGO system of NICA. Objects of control are power supplies for resonance lenses. The software consists of the subsystem device server, remote client and web-module for viewing the subsystem data.

  3. Software Engineering and Knowledge Engineering Theory and Practice Volume 2

    CERN Document Server

    2012-01-01

    This volume comprises a set of selected papers, extended and revised, from the 2009 Pacific-Asia Conference on Knowledge Engineering and Software Engineering (KESE 2009), held December 19-20, 2009, in Shenzhen, China. Volume 2 provides a forum for researchers, educators, engineers, and government officials involved in the general areas of Knowledge Engineering and Communication Technology to disseminate their latest research results and exchange views on the future research directions of these fields. The volume includes 135 high-quality papers, each peer-reviewed by at least two program committee members and selected by the volume editor, Prof. Yanwen Wu. On behalf of this volume, we express our sincere appreciation to all of the authors and referees for their efforts in reviewing the papers. We hope you find many profound research ideas and results in the related fields of Knowledge Engineering and Communication Technology.

  4. Software for Extracting 3D - MSSTs

    DEFF Research Database (Denmark)

    Somchaipeng, Kerawit; Sporring, Jon; Kreiborg, Sven

    2003-01-01

    The deep structure of an image is investigated, and a Multi-Scale Singularity Tree (MSST) is constructed based on the pair-wise annihilations of critical points. This report contains two main contributions. Firstly, we describe a fast, simple, and robust method of extracting feature lines from da...

  5. Ada Implementation Guide. Software Engineering With Ada. Volume 2

    Science.gov (United States)

    1994-04-01


  6. Object-oriented software design in semiautomatic building extraction

    Science.gov (United States)

    Guelch, Eberhard; Mueller, Hardo

    1997-08-01

    Developing a system for semiautomatic building acquisition is a complex process that requires constant integration and updating of software modules and user interfaces. To facilitate these processes, we apply an object-oriented design not only to the data but also to the software involved. We use the Unified Modeling Language (UML) to describe the object-oriented modeling of the system at different levels of detail. We can distinguish between use cases from the user's point of view, which represent a sequence of actions yielding an observable result, and use cases for programmers, who can use the system as a class library to integrate the acquisition modules into their own software. The structure of the system is based on the model-view-controller (MVC) design pattern. An example from the integration of automated texture extraction for the visualization of results demonstrates the feasibility of this approach.
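The model-view-controller split the abstract describes can be sketched minimally as follows. The class names and the text-only view are hypothetical illustrations, not the actual building-acquisition system's API.

```python
class BuildingModel:
    """Holds acquired building outlines and notifies registered views."""
    def __init__(self):
        self.buildings = []
        self._views = []

    def attach(self, view):
        self._views.append(view)

    def add_building(self, outline):
        self.buildings.append(outline)
        for v in self._views:       # model pushes changes to every view
            v.update(self)

class TextView:
    """A view renders the model's state; here, a plain-text summary."""
    def __init__(self):
        self.last = ""

    def update(self, model):
        self.last = f"{len(model.buildings)} building(s) acquired"

class Controller:
    """The controller translates user actions into model operations."""
    def __init__(self, model):
        self.model = model

    def on_digitize(self, outline):
        self.model.add_building(outline)

model = BuildingModel()
view = TextView()
model.attach(view)
Controller(model).on_digitize([(0, 0), (10, 0), (10, 8), (0, 8)])
print(view.last)  # 1 building(s) acquired
```

The point of the pattern here is the one the abstract makes: new acquisition or visualization modules (e.g., automated texture extraction) attach as additional views or controllers without touching the model.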

  7. Small volume liquid extraction of amphetamines in saliva.

    Science.gov (United States)

    Meng, Pinjia; Wang, Yanyan

    2010-04-15

    The present study introduced a procedure for small-volume liquid extraction of amphetamines in saliva, including amphetamine (AM); methamphetamine (MA); 3,4-methylenedioxyamphetamine (MDA); and 3,4-methylenedioxymethamphetamine (MDMA). Extraction efficiencies were compared between conventional-volume liquid-phase extraction (LPE) and the small-volume procedure. The method was applied to saliva samples from amphetamine abusers and proven practical for detecting trace amounts of amphetamines in saliva.

  8. National Utility Financial Statement model (NUFS). Volume III of III: software description. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1981-10-29

    This volume contains a description of the software comprising the National Utility Financial Statement Model (NUFS). This is the third of three volumes describing NUFS provided by ICF Incorporated under contract DEAC-01-79EI-10579. The three volumes are entitled: model overview and description, user's guide, and software guide.

  9. The Slitless Spectroscopy Data Extraction Software aXe

    CERN Document Server

    Kümmel, Martin; Pirzkal, Norbert; Kuntschner, Harald; Pasquali, Anna

    2008-01-01

    The methods and techniques for the slitless spectroscopy software aXe, which was designed to reduce data from the various slitless spectroscopy modes of Hubble Space Telescope instruments, are described. aXe can treat slitless spectra from different instruments such as ACS, NICMOS and WFC3 through the use of a configuration file which contains all the instrument dependent parameters. The basis of the spectral extraction within aXe is the position, morphology and photometry of the objects on a companion direct image. Several aspects of slitless spectroscopy, such as the overlap of spectra, an extraction dependent on object shape and the provision of flat-field cubes, motivate a dedicated software package, and the solutions offered within aXe are discussed in detail. The effect of the mutual contamination of spectra can be quantitatively assessed in aXe, using spectral and morphological information from the companion direct image(s). A new method named 'aXedrizzle' for 2D rebinning and co-adding spectral data,...

  10. Isobio software: biological dose distribution and biological dose volume histogram from physical dose conversion using linear-quadratic-linear model.

    Science.gov (United States)

    Jaikuna, Tanwiwat; Khadsiri, Phatchareewan; Chawapun, Nisa; Saekho, Suwit; Tharavichitkul, Ekkasit

    2017-02-01

    To develop an in-house software program able to calculate and generate the biological dose distribution and biological dose volume histogram by physical dose conversion using the linear-quadratic-linear (LQL) model. The Isobio software was developed using MATLAB version 2014b to calculate and generate the biological dose distribution and biological dose volume histograms. The physical dose from each voxel in treatment planning was extracted through the Computational Environment for Radiotherapy Research (CERR), and the accuracy was verified by the differentiation between the dose volume histogram from CERR and the treatment planning system. An equivalent dose in 2 Gy fractions (EQD2) was calculated using the biological effective dose (BED) based on the LQL model. The software calculation and the manual calculation were compared for EQD2 verification with paired t-test statistical analysis using IBM SPSS Statistics version 22 (64-bit). Two- and three-dimensional biological dose distributions and biological dose volume histograms were displayed correctly by the Isobio software. Different physical doses were found between CERR and the treatment planning system (TPS) in Oncentra: 3.33% in the high-risk clinical target volume (HR-CTV) determined by D90%, 0.56% in the bladder and 1.74% in the rectum determined by D2cc, and less than 1% in Pinnacle. The difference in EQD2 between the software calculation and the manual calculation was not statistically significant (0.00%, with p-values of 0.820, 0.095, and 0.593 for external beam radiation therapy (EBRT) and 0.240, 0.320, and 0.849 for brachytherapy (BT) in HR-CTV, bladder, and rectum, respectively). The Isobio software is a feasible tool for generating the biological dose distribution and biological dose volume histogram for treatment plan evaluation in both EBRT and BT.
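The EQD2 conversion the abstract relies on can be sketched in the standard linear-quadratic (LQ) form. This is a simplification: the full LQL model used by Isobio adds a linear correction at high dose per fraction. The function name and the example alpha/beta values are illustrative, not from the paper.

```python
def eqd2(total_dose, dose_per_fraction, alpha_beta):
    """EQD2 via the standard LQ relation:
    BED  = D * (1 + d / (alpha/beta))
    EQD2 = BED / (1 + 2 / (alpha/beta))
    with D the total dose (Gy), d the dose per fraction (Gy)."""
    bed = total_dose * (1 + dose_per_fraction / alpha_beta)
    return bed / (1 + 2 / alpha_beta)

# e.g. 28 Gy in 4 fractions (7 Gy/fraction), alpha/beta = 10 Gy (tumour):
print(round(eqd2(28.0, 7.0, 10.0), 2))  # 39.67
```

As a sanity check, any schedule already delivered in 2 Gy fractions maps to itself, since BED and the 2 Gy normalization share the same factor.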

  11. Spacelab software development and integration concepts study report, volume 1

    Science.gov (United States)

    Rose, P. L.; Willis, B. G.

    1973-01-01

    The proposed software guidelines to be followed by the European Space Research Organization in the development of software for the Spacelab being developed for use as a payload for the space shuttle are documented. Concepts, techniques, and tools needed to assure the success of a programming project are defined as they relate to operation of the data management subsystem, support of experiments and space applications, use with ground support equipment, and for integration testing.

  12. MCTSSA Software Reliability Handbook, Volume III: Integration of Software Metrics with Quality and Reliability

    OpenAIRE

    Schneidewind, Norman F.

    1997-01-01

    The purpose of this handbook is threefold. Specifically, it: (1) Serves as a reference guide for implementing standard software reliability practices at Marine Corps Tactical Systems Support Activity and aids in applying the software reliability model. (2) Serves as a tool for managing the software reliability program. (3) Serves as a training aid. U.S. Marine Corps Tactical Systems Support Activity, Camp Pendleton, CA. RLACH

  13. Simulation of Extractive Distillation for Recycling Tetrahydrofuran from Pharmaceutical Wastewater with Chem CAD Software

    OpenAIRE

    Xiaoguang Wang; Yueyun Yang

    2013-01-01

    The functions and application of the ChemCAD simulation software are introduced. A mathematical model of the extractive distillation process was established. The extractive distillation process for preparation of tetrahydrofuran (THF) was simulated by the SCDS rectification model in ChemCAD, with methanol-THF solution and ethanediol-lithium chloride solution as extractants. Influence of the extraction agent on the vapor-liquid equilibrium curve of the methanol-THF system and effects of theoretical plate number, feed...

  14. MODIS. Volume 1: MODIS level 1A software baseline requirements

    Science.gov (United States)

    Masuoka, Edward; Fleig, Albert; Ardanuy, Philip; Goff, Thomas; Carpenter, Lloyd; Solomon, Carl; Storey, James

    1994-01-01

    This document describes the level 1A software requirements for the moderate resolution imaging spectroradiometer (MODIS) instrument. This includes internal and external requirements. Internal requirements include functional, operational, and data processing as well as performance, quality, safety, and security engineering requirements. External requirements include those imposed by data archive and distribution systems (DADS); scheduling, control, monitoring, and accounting (SCMA); product management (PM) system; MODIS log; and product generation system (PGS). Implementation constraints and requirements for adapting the software to the physical environment are also included.

  15. PILOT: A Precision Intercoastal Loran Translocator. Volume 3. Software.

    Science.gov (United States)

    1982-03-01

    SCOPE: This PILOT software design manual provides a description of the software associated with the two microprocessors mentioned in Sections 1.2... The interface card formats the raw data prior to placing it into absolute address locations 8600H (MSB) and 8601H (LSB).

  16. Guidance and Control Software Project Data - Volume 4: Configuration Management and Quality Assurance Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.

  17. CrossTalk. The Journal of Defense Software Engineering. Volume 16, Number 11, November 2003

    Science.gov (United States)

    2003-11-01

from approximately 10 top-level accidents/events. The Hazard and Operability (HAZOP) [6] approach to system and software functionality assessment has... requirements within international standards early in the development life cycle. by Brian Dobbing and Alan Burns. Software Static Code Analysis

  18. Resonance Extraction from the Finite Volume

    Energy Technology Data Exchange (ETDEWEB)

    Doring, Michael [George Washington Univ., Washington, DC (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Molina Peralta, Raquel [George Washington Univ., Washington, DC (United States)

    2016-06-01

The spectrum of excited hadrons becomes accessible in simulations of Quantum Chromodynamics on the lattice. Extensions of Lüscher's method make it possible to address multi-channel scattering problems using moving frames or modified boundary conditions to obtain more eigenvalues in finite volume. As these lie at different energies, interpolations are needed to relate the eigenvalues and to help determine the amplitude. Expanding the T- or the K-matrix locally provides a controlled scheme by removing the known non-analyticities of thresholds, and can be stabilized by using Chiral Perturbation Theory. Different examples of determining resonance pole parameters and of disentangling resonances from thresholds are discussed, such as the scalar meson f0(980) and the excited baryons N(1535)1/2^- and Lambda(1405)1/2^-.
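As a toy illustration of reading pole parameters off an amplitude continued into the complex plane (not the authors' actual lattice analysis), the sketch below locates the pole of a Breit-Wigner-like amplitude with Newton's method; the mass and width values are illustrative, loosely f0(980)-like:

```python
# Toy illustration: a Breit-Wigner amplitude T(s) = -m*Gamma / D(s) with
# D(s) = s - m**2 + 1j*m*Gamma has its resonance pole where D vanishes.
# Newton's method in the complex s-plane recovers the pole position, from
# which a mass and width can be read off. Values are illustrative.

def find_pole(D, dD, s0, tol=1e-12, max_iter=50):
    """Newton iteration for a complex zero of D."""
    s = s0
    for _ in range(max_iter):
        step = D(s) / dD(s)
        s = s - step
        if abs(step) < tol:
            return s
    return s

m, gamma = 0.98, 0.05  # GeV: illustrative mass and width, not fitted values
D = lambda s: s - m**2 + 1j * m * gamma
dD = lambda s: 1.0 + 0j

pole = find_pole(D, dD, s0=1.0 + 0.0j)  # expect s = m**2 - 1j*m*gamma
```

In a real analysis the amplitude comes from the local T- or K-matrix expansion fitted to the lattice eigenvalues, but the final pole search proceeds in the same way.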

  19. Software Assurance Curriculum Project Volume 3: Master of Software Assurance Course Syllabi

    Science.gov (United States)

    2011-07-01

Nicola. “Computer-Aided Support for Secure Tropos.” Automated Software Engineering 14, 3 (September 2007): 341–364. • Zannone, Nicola. “The Si... that are specific to software assurance, such as CLASP and Secure Tropos. Discuss the pros and cons of standard development process models... CLASP or Secure Tropos could be applied to the project. 4 Teach BSIMM, SAFECode and OWASP best practices. Discuss the pros and cons of security

  20. Three-dimensional active net for volume extraction

    Science.gov (United States)

    Takanashi, Ikuko; Muraki, Shigeru; Doi, Akio; Kaufman, Arie E.

    1998-05-01

    3D Active Net, which is a 3D extension of Snakes, is an energy-minimizing surface model which can extract a volume of interest from 3D volume data. It is deformable and evolves in 3D space to be attracted to salient features, according to its internal and image energy. The net can be fitted to the contour of a target object by defining the image energy suitable for the contour property. We present testing results of the extraction of a muscle from the Visible Human Data by two methods: manual segmentation and the application of 3D Active Net. We apply principal component analysis, which utilizes the color information of the 3D volume data to emphasize an ill-defined contour of the muscle, and then apply 3D Active Net. We recognize that the extracted object has a smooth and natural contour in contrast with a comparable manual segmentation, proving an advantage of our approach.
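The energy-minimizing idea behind 3D Active Net can be sketched in 2D: contour points descend a weighted sum of an internal (elasticity) force and an image force. The synthetic circular "contour" and the weights below are illustrative assumptions, not the authors' energy terms:

```python
import numpy as np

# 2D sketch of the energy-minimizing idea behind active nets/snakes: each
# contour point moves downhill on a weighted sum of an internal force
# (elasticity, pulling a point toward its neighbours) and an image force
# (here the gradient of a synthetic potential attracting points to a circle
# of radius 1, standing in for an object contour).

def evolve_snake(pts, alpha=0.4, beta=0.6, steps=200):
    for _ in range(steps):
        # internal force: move toward the midpoint of the two neighbours
        internal = 0.5 * (np.roll(pts, 1, axis=0) + np.roll(pts, -1, axis=0)) - pts
        # image force: gradient of -(r - 1)**2 / 2 pulls points to radius 1
        r = np.linalg.norm(pts, axis=1, keepdims=True)
        image = (1.0 - r) * pts / np.maximum(r, 1e-9)
        pts = pts + alpha * internal + beta * image
    return pts

theta = np.linspace(0.0, 2.0 * np.pi, 40, endpoint=False)
init = np.stack([2.0 * np.cos(theta), 2.0 * np.sin(theta)], axis=1)  # too large
final = evolve_snake(init)
radii = np.linalg.norm(final, axis=1)  # settles near the target radius of 1
```

The 3D version replaces the contour with a surface net and derives the image energy from the volume data, but the iteration is the same descent on internal plus image energy.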

  1. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    Energy Technology Data Exchange (ETDEWEB)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems.

  2. Ada Implementation Guide. Software Engineering With Ada. Volume 1

    Science.gov (United States)

    1994-04-01

teaching, the student is less likely to readily adopt new, more powerful ways of accomplishing old tasks... Department of the Navy Training and Education... Capability Maturity Model (CMU/SEI-92-TR-25, ESC-TR-92-0M5). Pittsburgh, PA: Carnegie-Mellon University, 1992. Boehm, B.W. Software Engineering Economics... Pittsburgh, PA: Carnegie-Mellon University, 19-21 March 1991. Contrast: Ada 9X and C++, Schonberg, E. New York University, 1992 (Distributed by Ada IC on

  3. Software Reliability, Measurement, and Testing. Volume 2. Guidebook for Software Reliability Measurement and Testing

    Science.gov (United States)

    1992-04-01

test experiments. Of the three static techniques... SOFTWARE TEST TECHNIQUES: Code Review, Error/Anomaly Detection, Structure... anomaly is an unforeseen event, which may not be detected by error-protection mechanisms in time to prevent system failure. The existence of extensive... event, the more difficult it is to make a meaningful prediction. As an example, it can be seen that the reliability of an electronic equipment is known

  4. SArEM: A SPEM extension for software architecture extraction process

    Directory of Open Access Journals (Sweden)

    Mira Abboud

    2016-04-01

Full Text Available In order to maintain a system, it is critical to understand its architecture. However, even though every system has an architecture, not every system has a reliable representation of that architecture. To deal with this problem, many researchers have engaged in software architecture extraction, where the system's architecture is recovered from its source code. While there is a plethora of approaches aiming at extracting software architectures, there is no metric or tool for measuring these approaches, which makes comparison between the different approaches a hard task. To address this lack, we developed a meta-model, based on the SPEM meta-model, that specifies the software architecture extraction process. Such a meta-model serves as a tool to compare, analyze and evaluate approaches in the research field. In this paper we detail our meta-model, called SArEM (Software Architecture Extraction Meta-model), and clarify its concepts.

  5. Can we replace curation with information extraction software?

    Science.gov (United States)

    Karp, Peter D

    2016-01-01

Can we use programs for automated or semi-automated information extraction from scientific texts as practical alternatives to professional curation? I show that error rates of current information extraction programs are too high to replace professional curation today. Furthermore, current IEP programs extract single narrow slivers of information, such as individual protein interactions; they cannot extract the large breadth of information extracted by professional curators for databases such as EcoCyc. They also cannot arbitrate among conflicting statements in the literature as curators can. Therefore, funding agencies should not hobble the curation efforts of existing databases on the assumption that a problem that has stymied Artificial Intelligence researchers for more than 60 years will be solved tomorrow. Semi-automated extraction techniques appear to have significantly more potential based on a review of recent tools that enhance curator productivity. But a full cost-benefit analysis for these tools is lacking. Without such analysis it is possible to expend significant effort developing information-extraction tools that automate small parts of the overall curation workflow without achieving a significant decrease in curation costs.

  6. MyETL: A Java Software Tool to Extract, Transform, and Load Your Business

    OpenAIRE

    Michele Nuovo

    2015-01-01

The project follows the development of a Java software tool that extracts data from flat files (fixed-length record type), CSV (comma-separated values) files, and XLS (Microsoft Excel 97-2003 worksheet) files, applies transformations to those sources, and finally loads the data into the end-target RDBMS. The software implements a process known as ETL (Extract, Transform and Load); systems of this kind are called ETL systems.

  7. MyETL: A Java Software Tool to Extract, Transform, and Load Your Business

    Directory of Open Access Journals (Sweden)

    Michele Nuovo

    2015-12-01

Full Text Available The project follows the development of a Java software tool that extracts data from flat files (fixed-length record type), CSV (comma-separated values) files, and XLS (Microsoft Excel 97-2003 worksheet) files, applies transformations to those sources, and finally loads the data into the end-target RDBMS. The software implements a process known as ETL (Extract, Transform and Load); systems of this kind are called ETL systems.
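The actual MyETL tool is written in Java, but the Extract-Transform-Load pattern it implements can be sketched briefly, here in Python with an in-memory CSV source and an SQLite target (the table and data are invented for illustration):

```python
import csv
import io
import sqlite3

# Minimal sketch of the three ETL stages: parse a delimited source,
# clean and type-convert the fields, and load the result into an RDBMS.

raw = "id,name,amount\n1, alice ,10.5\n2, bob ,3.25\n"

# Extract: read the CSV source into dict rows
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: strip whitespace, normalize names, convert types
clean = [(int(r["id"]), r["name"].strip().title(), float(r["amount"]))
         for r in rows]

# Load: insert into the target database (SQLite standing in for the RDBMS)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, name TEXT, amount REAL)")
db.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean)
total = db.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

A production ETL system adds per-format readers (fixed-length records, XLS), configurable transformation rules, and error handling, but the staging is the same.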

  8. A Review of Feature Extraction Software for Microarray Gene Expression Data

    Directory of Open Access Journals (Sweden)

    Ching Siang Tan

    2014-01-01

Full Text Available When gene expression data are too large to be processed, they are transformed into a reduced representation set of genes. Transforming large-scale gene expression data into a set of genes is called feature extraction. If the genes extracted are carefully chosen, this gene set can extract the relevant information from the large-scale gene expression data, allowing further analysis by using this reduced representation instead of the full-size data. In this paper, we review numerous software applications that can be used for feature extraction. The software reviewed is mainly for Principal Component Analysis (PCA), Independent Component Analysis (ICA), Partial Least Squares (PLS), and Local Linear Embedding (LLE). A summary and sources of the software are provided in the last section for each feature extraction method.
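As a minimal sketch of the first method reviewed, PCA-based feature extraction projects a samples-by-genes matrix onto its top principal components (the random "expression" matrix and the choice of five components below are illustrative assumptions):

```python
import numpy as np

# Minimal sketch of PCA-based feature extraction: project a samples-by-genes
# matrix onto its top principal components via the SVD of the centred data.

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 200))           # 50 samples x 200 "genes"
Xc = X - X.mean(axis=0)                  # centre each gene (column)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 5
scores = Xc @ Vt[:k].T                   # reduced representation: 50 x 5
explained = (S[:k] ** 2).sum() / (S ** 2).sum()  # fraction of variance retained
```

Downstream analysis then operates on the 50 x 5 score matrix instead of the full 50 x 200 data, which is the dimensionality reduction the review is concerned with.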

  9. Semi-Automatically Extracting FAQs to Improve Accessibility of Software Development Knowledge

    CERN Document Server

    Henß, Stefan; Mezini, Mira

    2012-01-01

    Frequently asked questions (FAQs) are a popular way to document software development knowledge. As creating such documents is expensive, this paper presents an approach for automatically extracting FAQs from sources of software development discussion, such as mailing lists and Internet forums, by combining techniques of text mining and natural language processing. We apply the approach to popular mailing lists and carry out a survey among software developers to show that it is able to extract high-quality FAQs that may be further improved by experts.

  10. ICSOFT 2006 : First International Conference on Software and Data Technologies, Volume 1

    NARCIS (Netherlands)

    Filipe, Joaquim; Shishkov, Boris; Helfert, Markus

    2006-01-01

    This volume contains the proceedings of the first International Conference on Software and Data Technologies (ICSOFT 2006), organized by the Institute for Systems and Technologies of Information, Communication and Control (INSTICC) in cooperation with the Object Management Group (OMG), sponsored by

  11. Measuring stone volume - three-dimensional software reconstruction or an ellipsoid algebra formula?

    Science.gov (United States)

    Finch, William; Johnston, Richard; Shaida, Nadeem; Winterbottom, Andrew; Wiseman, Oliver

    2014-04-01

To determine the optimal method for assessing stone volume, and thus stone burden, by comparing the accuracy of scalene, oblate, and prolate ellipsoid volume equations with three-dimensional (3D)-reconstructed stone volume. Kidney stone volume may be helpful in predicting treatment outcome for renal stones. While the precise measurement of stone volume by 3D reconstruction can be accomplished using modern computed tomography (CT) scanning software, this technique is not available in all hospitals or with routine acute colic scanning protocols. Therefore, maximum diameters as measured by either X-ray or CT are used in the calculation of stone volume based on a scalene ellipsoid formula, as recommended by the European Association of Urology. In all, 100 stones with both X-ray and CT (1-2-mm slices) were reviewed. Complete and partial staghorn stones were excluded. Stone volume was calculated using software designed to measure tissue density of a certain range within a specified region of interest. Correlation coefficients among all measured outcomes were compared. Stone volumes were analysed to determine the average 'shape' of the stones. The maximum stone diameter on X-ray was 3-25 mm and on CT was 3-36 mm, with a reasonable correlation (r = 0.77). Smaller stones (<15 mm) approximated prolate or oblate ellipsoids, with those >15 mm tending towards scalene ellipsoids. There was no difference in stone shape by location within the kidney. As the average shape of renal stones changes with diameter, no single equation for estimating stone volume can be recommended. As the maximum diameter increases, calculated stone volume becomes less accurate, suggesting that larger stones have more asymmetric shapes. We recommend that research looking at stone clearance rates should use 3D-reconstructed stone volumes when available, followed by prolate, oblate, or scalene ellipsoid formulas depending on the maximum stone diameter. © 2013 The Authors. BJU International © 2013 BJU International.
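The ellipsoid volume equations being compared are straightforward; a sketch follows, with the assumption about which axes coincide in the prolate and oblate cases made explicit (the abstract does not specify this):

```python
import math

# The three ellipsoid formulas compared in the study, written in terms of
# orthogonal stone diameters a, b, c (mm). Which axes are equated for the
# prolate and oblate cases is an assumption made here for illustration.

def scalene_volume(a, b, c):
    """General ellipsoid, all diameters distinct: V = (pi/6) * a * b * c."""
    return math.pi / 6.0 * a * b * c

def prolate_volume(a, b):
    """Elongated stone, the two short diameters equal: V = (pi/6) * a * b**2."""
    return math.pi / 6.0 * a * b * b

def oblate_volume(a, c):
    """Flattened stone, the two long diameters equal: V = (pi/6) * a**2 * c."""
    return math.pi / 6.0 * a * a * c

v = scalene_volume(10, 8, 6)  # a 10 x 8 x 6 mm stone, volume in mm^3
```

When only the maximum diameter is known, the choice among the three formulas amounts to an assumption about the missing diameters, which is exactly the source of error the study quantifies.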

  12. Simulation of Extractive Distillation for Recycling Tetrahydrofuran from Pharmaceutical Wastewater with ChemCAD Software

    Directory of Open Access Journals (Sweden)

    Xiaoguang Wang

    2013-05-01

Full Text Available The functions and application of the ChemCAD simulation software are introduced. A mathematical model of the extractive distillation process was established. The extractive distillation process for the preparation of tetrahydrofuran (THF) was simulated with the SCDS rectification model in the ChemCAD software, using methanol-THF solution and ethanediol-lithium chloride solution as the extractant. The influence of the extraction agent on the vapor-liquid equilibrium curve of the methanol-THF system, and the effects of the number of theoretical plates, the feed and extractant input positions, the extraction agent ratio (m(lithium chloride) : v(ethanediol)), the extractant ratio (m(extractant) : m(feed)) and the reflux ratio on the mass fraction of THF at the top of the tower were investigated. The simulation results were compared with experimental data. Under the optimum extractive distillation conditions (30 theoretical plates, extractant fed on plate 6, feed on plate 18, extraction agent ratio 0.24 g/mL, extractant ratio 2.7 and reflux ratio 5.0), the mass fraction of THF at the top of the tower can reach 97.2%. The distribution characteristic parameters of the column were also simulated. The simulated and experimental results agree satisfactorily.

  13. Modelling the Process of Induction Heating in Volume of a Bar Strip Using Flux 2D Software, coupled with Minitab Experimental Design Software

    Directory of Open Access Journals (Sweden)

    CODREAN Marius

    2016-05-01

Full Text Available The purpose of this optimization is the identification of optimal parameters for processing the workpiece (an OLC45 steel bar) using inductive volume heating. The Flux 9.3.2 software, in 2D, has been employed to perform the numerical simulations, while the Minitab software has been used to determine the optimal parameters.

  14. SHTEREOM I SIMPLE WINDOWS® BASED SOFTWARE FOR STEREOLOGY. VOLUME AND NUMBER ESTIMATIONS

    Directory of Open Access Journals (Sweden)

    Emin Oğuzhan Oğuz

    2011-05-01

Full Text Available Stereology was defined by Wiebel (1970) as: "a body of mathematical methods relating to three dimensional parameters defining the structure from two dimensional measurements obtainable on sections of the structure." SHTEREOM I is a simple Windows-based software package for stereological estimation. In this first part, we describe the implementation of the number and volume estimation tools for unbiased design-based stereology. The software is written in Visual Basic and can be used on personal computers running Microsoft Windows® operating systems, connected to a conventional camera attached to a microscope and to a microcator or a simple dial gauge. Microsoft .NET Framework version 1.1 also needs to be installed for full use. The features of the SHTEREOM I software are illustrated through examples of stereological estimations of volume and particle number at different magnifications (4X–100X). Point-counting grids are available for area estimations and for use with the most efficient volume estimation tool, the Cavalieri technique, and are applied to lizard testicle volume. An unbiased counting frame system is available for number estimations of the objects under investigation, and an on-screen manual stepping module for number estimations through the optical fractionator method is also available, measuring increments along the X and Y axes of the microscope stage for the estimation of rat brain hippocampal pyramidal neurons.
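The Cavalieri technique mentioned above estimates volume as the section spacing times the area represented by each grid point times the total count of points hitting the object; a minimal sketch with invented numbers:

```python
# Sketch of the Cavalieri volume estimator used in design-based stereology:
# V = t * a_per_point * sum(P_i), where t is the distance between sections,
# a_per_point is the area associated with each grid point, and P_i is the
# number of grid points hitting the object on section i.

def cavalieri_volume(point_counts, section_spacing_mm, area_per_point_mm2):
    return section_spacing_mm * area_per_point_mm2 * sum(point_counts)

counts = [12, 18, 25, 22, 9]   # points hitting the object on 5 sections
volume = cavalieri_volume(counts, section_spacing_mm=0.5, area_per_point_mm2=4.0)
# volume = 0.5 * 4.0 * 86 = 172.0 mm^3
```

The software automates exactly this bookkeeping: the user clicks grid points over each section and the estimator multiplies the running count by the known spacing and grid constant.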

  15. Software development to estimate the leaked volume from ruptured submarine pipelines; Desenvolvimento de um software para estimativa do volume vazado a partir de dutos submarinos rompidos

    Energy Technology Data Exchange (ETDEWEB)

    Quadri, Marintho B.; Machado, Ricardo A.F.; Nogueira, Andre L.; Lopes, Toni J. [Universidade Federal de Santa Catarina (UFSC), Florianopolis, SC (Brazil). Dept. de Engenharia Quimica; Baptista, Renan M. [PETROBRAS, Rio de Janeiro, RJ (Brazil). Centro de Pesquisas (CENPES)

    2004-07-01

The considerable increase in world petroleum consumption and the exhaustion of onshore reserves in recent decades have led companies to exploit offshore petroleum reserves (in both shallow and deep water). As in onshore operations, accidents may also occur in submarine exploration. Leaking from submarine pipelines arises from corrosion pits and from axial or radial breakage. In all three situations, the leakage is divided into three steps: pipeline depressurization, until the internal pressure equals the external one; advective migration, in which the driving force is the difference in the physical properties of the fluids; and oil spill movement on the sea surface. A great number of mathematical models are available for the first and third steps. For the second, theoretically the most important one, there is only a restricted number of works concerning the leaked oil volume. The present study presents a software package capable of accurately simulating leakage through the advective migration phenomenon. The software was validated for different hole radii located on the upper side of a horizontal pipeline. Model results presented very good agreement with experimental data. (author)

  17. Guidelines for the verification and validation of expert system software and conventional software: Project summary. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all the components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally. A V&V methodology for expert systems is presented based on three factors: (1) a system's judged need for V&V (based in turn on its complexity and degree of required integrity); (2) the life-cycle phase; and (3) the system component being tested.

  18. Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report

    Science.gov (United States)

    Malin, Jane T.

    2009-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
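The graph-analysis step, finding possible paths from hazard sources to vulnerable entities, can be sketched as a breadth-first search over a component/connection model (the tiny model below is an invented illustration, not Orion data):

```python
from collections import deque

# Sketch of the hazard-path analysis: find a propagation path from a hazard
# source to a vulnerable entity in a directed component/connection graph.

model = {
    "thruster_valve": ["prop_line"],
    "prop_line": ["tank", "controller"],
    "controller": ["flight_software"],
    "tank": [],
    "flight_software": [],
}

def find_path(graph, source, target):
    """Breadth-first search; returns a shortest source->target path or None."""
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

path = find_path(model, "thruster_valve", "flight_software")
```

Each path found this way becomes a candidate scenario for simulation and, ultimately, for software integration testing, as described in the report.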

  19. Guidelines for the verification and validation of expert system software and conventional software: Validation scenarios. Volume 6

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the sixth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity was concerned with the development of a methodology for selecting validation scenarios and subsequently applying it to two expert systems used for nuclear utility applications. Validation scenarios were defined and classified into five categories: PLANT, TEST, BASICS, CODE, and LICENSING. A sixth type, REGRESSION, is a composite of the others and refers to the practice of using trusted scenarios to ensure that modifications to software did not change unmodified functions. Rationale was developed for preferring scenarios selected from the categories in the order listed and for determining under what conditions to select scenarios from other types. A procedure incorporating all of the recommendations was developed as a generalized method for generating validation scenarios. The procedure was subsequently applied to two expert systems used in the nuclear industry and was found to be effective, given that an experienced nuclear engineer made the final scenario selections. A method for generating scenarios directly from the knowledge base component was suggested.

  1. MCTSSA Software Reliability Handbook, Volume II: Data Collection Demonstration and Software Reliability Modeling for a Multi-Function Distributed System

    OpenAIRE

    Schneidewind, Norman F.

    1997-01-01

The purpose of this handbook is threefold. Specifically, it serves as a reference guide for implementing standard software reliability practices at the Marine Corps Tactical Systems Support Activity and aids in applying the software reliability model; serves as a tool for managing the software reliability program; and serves as a training aid. U.S. Marine Corps Tactical Systems Support Activity, Camp Pendleton, CA.

  2. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    Science.gov (United States)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.

  3. Software Test and Evaluation Manual. Volume 2. Guidelines for Software Test and Evaluation in the Department of Defense

    Science.gov (United States)

    1987-02-25

Section 4.1. For example, if a critical operational software issue can be addressed during DT as opposed to OT, the OT Agency should ensure that DT plans... have been overlooked and not allocated to lower level components at all. In addition, traceability matrices can be used to reflect the completeness of... elements of information that are reported in the traceability matrices become available as the software development progresses, the traceability matrices

  4. Orbiter subsystem hardware/software interaction analysis. Volume 8: AFT reaction control system, part 2

    Science.gov (United States)

    Becker, D. D.

    1980-01-01

    The orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are examined. Potential interaction with the software is examined through an evaluation of the software requirements. The analysis is restricted to flight software requirements and excludes utility/checkout software. The results of the hardware/software interaction analysis for the forward reaction control system are presented.

  5. Does upper premolar extraction affect the changes of pharyngeal airway volume after bimaxillary surgery in skeletal class III patients?

    Science.gov (United States)

    Kim, Min-Ah; Park, Yang-Ho

    2014-01-01

The purpose of this study was to assess the change in pharyngeal airway volume after bimaxillary surgery in patients with skeletal Class III malocclusion and to evaluate the difference in postoperative pharyngeal airway space between upper premolar extraction and nonextraction cases. Cone-beam computed tomographic scans were obtained for 23 patients (13 in the extraction group and 10 in the nonextraction group) who were diagnosed with mandibular prognathism, before surgery (T0) and then 2 months (T2) and 6 months (T3) after surgery. Using InVivoDental 3-dimensional imaging software, volumetric changes in the pharyngeal airway space were assessed at T0, T2, and T3. The Wilcoxon signed-rank test was used to determine whether there were significant changes in pharyngeal airway volume between time points, and the Mann-Whitney U test to determine whether there were significant differences in volumetric changes between the extraction and nonextraction groups. Volumes in all subsections of the pharyngeal airway were decreased (P < .05) after bimaxillary surgery. Copyright © 2014 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  6. NI-79RAPID ASSESSMENT OF LESION VOLUMES FOR PATIENTS WITH GLIOMA USING THE SMARTBRUSH SOFTWARE PACKAGE

    Science.gov (United States)

    Vaziri, Sana; Lafontaine, Marisa; Olson, Beck; Crane, Jason C.; Chang, Susan; Lupo, Janine; Nelson, Sarah J.

    2014-01-01

The increasing interest in enhancing the RANO criteria by using quantitative assessments of changes in lesion size and image intensities has highlighted the need for rapid, easy-to-use tools that provide DICOM-compatible outputs for evaluation of patients with glioma. To evaluate the performance of the SmartBrush software (Brainlab AG), which provides computer-assisted definitions of regions of interest (ROIs), a cohort of 20 patients with glioma (equal numbers having high and low grade, and treated and untreated disease) was scanned using a 3T whole-body MR system prior to surgical resection. The T2-weighted FLAIR and pre- and post-contrast T1-weighted gradient echo DICOM images were pushed from the scanner to an offline workstation, where analysis of lesion volumes was performed using SmartBrush. Volumes of the T2Ls ranged from 7.9 to 110.2 cm3 and volumes of the CELs from 0.1 to 28.5 cm3, with 19/20 of the subjects having CELs and all 20 having T2Ls. The computer-assisted analysis was performed rapidly and efficiently, with a mean time for defining both lesions of 5.77 minutes per subject (range 3.5 to 7.5). Prior analysis of ROIs with the SLICER package (www.slicer.org) took approximately 30 minutes per subject. SmartBrush provides lesion volumes and cross-sectional diameters as a PDF report, which can be stored in DICOM. The ROIs were also saved as DICOM objects and transferred to other packages for performing histogram analysis of ADC or other functional parameter maps. Ongoing studies that will be reported in this presentation perform a similar analysis with multiple users in order to compare the relative intra- and inter-operator variations in terms of both the speed of analysis and the ROIs that are identified. Acknowledgements: The authors would like to acknowledge Rowena Thomson and Natalie Wright from Brainlab for helping to set up this study.

  7. Achieving Better Buying Power through Acquisition of Open Architecture Software Systems: Volume 1

    Science.gov (United States)

    2016-01-06

2009. [Sca10] Scacchi, W. (2010). The Future of Research in Free/Open Source Software Development, Proc. ACM Workshop Future of Software Engineering...review and approve choices between functionally similar low- or no-cost open source software components, and commercially priced closed source ... Naval Postgraduate School, Monterey, CA. [Ke12] Kenyon, H. (2012). DoD, Intel Officials Bullish On Open Source Software; Government-wide Software

  8. Open-source software for demand forecasting of clinical laboratory test volumes using time-series analysis

    Directory of Open Access Journals (Sweden)

    Emad A Mohammed

    2017-01-01

Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand.

  9. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    Science.gov (United States)

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.
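Of the three models the tool offers, simple linear regression is the easiest to sketch. Below is a minimal stdlib-only illustration of trend-based test-volume forecasting (the monthly figures are hypothetical, and this is not the tool's own code):

```python
def linear_forecast(volumes, horizon):
    """Fit volume = a + b*t by ordinary least squares over the observed
    series (t = 0, 1, 2, ...) and forecast the next `horizon` points."""
    n = len(volumes)
    ts = range(n)
    mean_t = sum(ts) / n
    mean_v = sum(volumes) / n
    # Slope and intercept of the least-squares trend line.
    b = sum((t - mean_t) * (v - mean_v) for t, v in zip(ts, volumes)) \
        / sum((t - mean_t) ** 2 for t in ts)
    a = mean_v - b * mean_t
    return [a + b * (n + h) for h in range(horizon)]

# Hypothetical monthly test volumes, illustration only.
monthly_volumes = [1000, 1050, 1110, 1150, 1210]
print(linear_forecast(monthly_volumes, 3))  # → [1260.0, 1312.0, 1364.0]
```

The Holt-Winters variants additionally smooth level, trend, and seasonality, which matters for laboratory volumes with weekly or annual cycles; the regression above captures trend only.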

  10. Extracting three-body observables from finite-volume quantities

    CERN Document Server

    Hansen, Maxwell T

    2015-01-01

Scattering and transition amplitudes with three-hadron final states play an important role in nuclear and particle physics. However, predicting such quantities using numerical Lattice QCD is very difficult, in part because of the effects of Euclidean time and finite volume. In this review we highlight recent formal developments that work towards overcoming these issues. We organize the presentation into three parts: large volume expansions, non-relativistic nonperturbative analyses, and nonperturbative studies based in relativistic field theory. In the first part we discuss results for ground state energies and matrix elements given by expanding in inverse box length, $1/L$. We describe complications that arise at $\mathcal{O}(1/L^6)$ and include a table summarizing the results of different calculations. In the second part we summarize three recent non-relativistic non-perturbative studies and highlight the main conclusions of these works. This includes demonstrating that the three-particle finite-volume spect...

  11. Distribution and determinants of choroidal thickness and volume using automated segmentation software in a population-based study.

    Science.gov (United States)

    Gupta, Preeti; Jing, Tian; Marziliano, Pina; Cheung, Carol Y; Baskaran, Mani; Lamoureux, Ecosse L; Wong, Tien Yin; Cheung, Chui Ming Gemmy; Cheng, Ching-Yu

    2015-02-01

To objectively quantify choroidal thickness and choroidal volume using fully automated choroidal segmentation software applied to images obtained from enhanced depth imaging spectral-domain optical coherence tomography (EDI SD OCT) in a population-based study; and to evaluate the ocular and systemic determinants of choroidal thickness and choroidal volume. Prospective cross-sectional study. Participants ranging in age from 45 to 85 years were recruited from the Singapore Malay Eye Study-2 (SiMES-2), a follow-up population-based study. All participants (n = 540) underwent a detailed ophthalmic examination, including EDI SD OCT for measurements of thickness and volume of the choroid. The intrasession repeatability of choroidal thickness at 5 measured horizontal locations and of macular choroidal volume using automated choroidal segmentation software was excellent (intraclass correlation coefficient, 0.97-0.99). The choroid was significantly thicker under the fovea (242.28 ± 97.58 μm), followed by 3 mm temporal (207.65 ± 80.98 μm), and was thinnest at the 3 mm nasal (142.44 ± 79.19 μm) location. The mean choroidal volume at the central macular region (within a circle of 1 mm diameter) was 0.185 ± 0.69 mm³. Among the range of ocular and systemic factors studied, age, sex, and axial length were the only significant predictors of choroidal thickness and choroidal volume. Using fully automated choroidal segmentation software, we provide fast, reliable, and objective measurements of choroidal thickness and volume in a population-based sample. Male sex, younger age, and shorter axial length are the factors independently associated with thicker choroid and larger choroidal volume. These factors should be taken into consideration when interpreting EDI SD OCT-based choroidal thickness measurements in clinics. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Splenic volume measurements on computed tomography utilizing automatically contouring software and its relationship with age, gender, and anthropometric parameters

    Energy Technology Data Exchange (ETDEWEB)

    Harris, Ardene, E-mail: ardene_b@yahoo.co [Department of Radiology, Hokkaido University Hospital, North 15, West 7, Kita-ku, Sapporo 060-0815 (Japan); Kamishima, Tamotsu, E-mail: ktamotamo2@yahoo.co.j [Department of Radiology, Hokkaido University Hospital, North 15, West 7, Kita-ku, Sapporo 060-0815 (Japan); Hao, Hong Yi, E-mail: haohongyi88@yahoo.co.j [Department of Radiology, Hokkaido University Hospital, North 15, West 7, Kita-ku, Sapporo 060-0815 (Japan); Kato, Fumi [Department of Radiology, Hokkaido University Hospital, North 15, West 7, Kita-ku, Sapporo 060-0815 (Japan); Omatsu, Tokuhiko, E-mail: omatoku@me.co [Department of Radiology, Hokkaido University Hospital, North 15, West 7, Kita-ku, Sapporo 060-0815 (Japan); Onodera, Yuya, E-mail: yuyaonodera@med.hokudai.ac.j [Department of Radiology, Hokkaido University Hospital, North 15, West 7, Kita-ku, Sapporo 060-0815 (Japan); Terae, Satoshi, E-mail: saterae@med.hokudai.ac.j [Department of Radiology, Hokkaido University Hospital, North 15, West 7, Kita-ku, Sapporo 060-0815 (Japan); Shirato, Hiroki, E-mail: shirato@med.hokudai.ac.j [Department of Radiology, Hokkaido University Hospital, North 15, West 7, Kita-ku, Sapporo 060-0815 (Japan)

    2010-07-15

Objective: The present research was conducted to establish the normal splenic volume in adults using a novel and fast technique. The relationship between splenic volume and age, gender, and anthropometric parameters was also examined. Materials and methods: The splenic volume was measured in 230 consecutive patients who underwent computed tomography (CT) scans for various indications. Patients with conditions that have a known effect on spleen size were not included in this study. A new technique using volumetric software to automatically contour the spleen in each CT slice and quickly calculate splenic volume was employed. Inter- and intra-observer variability were also examined. Results: The average splenic volume of all the subjects was 127.4 ± 62.9 cm³, ranging from 22 to 417 cm³. The splenic volume (S) correlated with age (A) (r = -0.33, p < 0.0001), body weight (W) (r = 0.35, p < 0.0001), body mass index (r = 0.24, p < 0.0001) and body surface area (BSA) (r = 0.31, p < 0.0001). The age-adjusted splenic volume index correlated with gender (p = 0.0089). The formulae S = W[6.47A^(-0.31)] and S = BSA[278A^(-0.36)] were derived and can be used to estimate the splenic volume. Inter- and intra-observer variability were 6.4 ± 9.8% and 2.8 ± 3.5%, respectively. Conclusion: Of the anthropometric parameters, the splenic volume was most closely linked to body weight. The automatic contouring software, as well as the formulae, can be used to obtain the volume of the spleen in regular practice.
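The derived formulae are simple enough to sketch in code. The following is an illustrative transcription of S = W[6.47A^(-0.31)] and S = BSA[278A^(-0.36)] (function and variable names are ours, not from the paper):

```python
def splenic_volume_from_weight(weight_kg, age_yr):
    """Estimate splenic volume (cm^3) from body weight W (kg) and age A
    (years) via the study's formula S = W * 6.47 * A**(-0.31)."""
    return weight_kg * 6.47 * age_yr ** -0.31

def splenic_volume_from_bsa(bsa_m2, age_yr):
    """Estimate splenic volume (cm^3) from body surface area (m^2) and
    age (years) via S = BSA * 278 * A**(-0.36)."""
    return bsa_m2 * 278.0 * age_yr ** -0.36

# Illustrative inputs: a 70 kg, 40-year-old adult.
print(splenic_volume_from_weight(70.0, 40.0))  # ≈ 144 cm^3, within the 22-417 cm^3 range
```

Both formulae decrease with age, matching the negative age correlation (r = -0.33) reported above.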

  13. Clinical records anonymisation and text extraction (CRATE): an open-source software system.

    Science.gov (United States)

    Cardinal, Rudolf N

    2017-04-26

    Electronic medical records contain information of value for research, but contain identifiable and often highly sensitive confidential information. Patient-identifiable information cannot in general be shared outside clinical care teams without explicit consent, but anonymisation/de-identification allows research uses of clinical data without explicit consent. This article presents CRATE (Clinical Records Anonymisation and Text Extraction), an open-source software system with separable functions: (1) it anonymises or de-identifies arbitrary relational databases, with sensitivity and precision similar to previous comparable systems; (2) it uses public secure cryptographic methods to map patient identifiers to research identifiers (pseudonyms); (3) it connects relational databases to external tools for natural language processing; (4) it provides a web front end for research and administrative functions; and (5) it supports a specific model through which patients may consent to be contacted about research. Creation and management of a research database from sensitive clinical records with secure pseudonym generation, full-text indexing, and a consent-to-contact process is possible and practical using entirely free and open-source software.
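The abstract mentions public secure cryptographic methods for mapping patient identifiers to research pseudonyms. One common approach to such a mapping is a keyed hash (HMAC); the sketch below illustrates the idea with Python's standard library and is not CRATE's actual implementation (the key and identifier are invented):

```python
import hashlib
import hmac

# Secret key held only by the database administrator; illustrative value.
SECRET_KEY = b"replace-with-a-long-random-secret"

def pseudonymise(patient_id: str) -> str:
    """Map a patient identifier to a stable research pseudonym using
    HMAC-SHA-256: deterministic, so records for one patient link across
    tables, but irreversible without the secret key."""
    digest = hmac.new(SECRET_KEY, patient_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]  # truncated here for readability

# The same identifier always yields the same pseudonym...
assert pseudonymise("NHS1234567") == pseudonymise("NHS1234567")
# ...while distinct identifiers yield distinct pseudonyms.
assert pseudonymise("NHS1234567") != pseudonymise("NHS7654321")
```

Determinism is what makes a research database joinable; keeping the key outside the research environment is what prevents re-identification.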

  14. Renal cortical volume measured using automatic contouring software for computed tomography and its relationship with BMI, age and renal function

    Energy Technology Data Exchange (ETDEWEB)

    Muto, Natalia Sayuri, E-mail: nataliamuto@gmail.com [Department of Radiology, Hokkaido University Hospital, N15 W7, kita-ku, Sapporo City, 0608638 (Japan); Kamishima, Tamotsu, E-mail: ktamotamo2@yahoo.co.jp [Department of Radiology, Hokkaido University Hospital, N15 W7, kita-ku, Sapporo City, 0608638 (Japan); Harris, Ardene A., E-mail: ardene_b@yahoo.com [Department of Radiology, Hokkaido University Hospital, N15 W7, kita-ku, Sapporo City, 0608638 (Japan); Kato, Fumi, E-mail: fumikato@med.hokudai.ac.jp [Department of Radiology, Hokkaido University Hospital, N15 W7, kita-ku, Sapporo City, 0608638 (Japan); Onodera, Yuya, E-mail: yuyaonodera@med.hokudai.ac.jp [Department of Radiology, Hokkaido University Hospital, N15 W7, kita-ku, Sapporo City, 0608638 (Japan); Terae, Satoshi, E-mail: saterae@yahoo.co.jp [Department of Radiology, Hokkaido University Hospital, N15 W7, kita-ku, Sapporo City, 0608638 (Japan); Shirato, Hiroki, E-mail: shirato@med.hokudai.ac.jp [Department of Radiology, Hokkaido University Hospital, N15 W7, kita-ku, Sapporo City, 0608638 (Japan)

    2011-04-15

Purpose: To evaluate the relationship between renal cortical volume, measured by automatic contouring software, and body mass index (BMI), age, and renal function. Materials and methods: The study was performed in accordance with the institutional guidelines at our hospital. Sixty-four patients (34 men, 30 women), aged 19 to 79 years, had their CT scans for diagnosis or follow-up of hepatocellular carcinoma retrospectively examined on a computer workstation using software that automatically contours the renal cortex and the renal parenchyma. Body mass index and estimated glomerular filtration rate (eGFR) were calculated from the collected data. Statistical analysis was done using the Student t-test, multiple regression analysis, and the intraclass correlation coefficient (ICC). Results: The ICCs for total renal and renal cortical volumes were 0.98 and 0.99, respectively. Renal volume measurements yielded a mean cortical volume of 105.8 cm³ ± 28.4 SD, a mean total volume of 153 cm³ ± 39 SD and a mean medullary volume of 47.8 cm³ ± 19.5 SD. The correlations between body weight/height/BMI and both total renal and cortical volumes were r = 0.6, 0.6 and 0.4, respectively, p < 0.05, while the correlation between renal cortical volume and age was r = -0.3, p < 0.05. eGFR correlated with renal cortical volume with r = 0.6, p < 0.05. Conclusion: This study demonstrated that renal cortical volume had a moderate positive relationship with BMI, a moderate negative relationship with age, and a strong positive relationship with renal function, and provided a new method to routinely produce volumetric assessments of the kidney.

15. Guidelines for the verification and validation of expert system software and conventional software: User's manual. Volume 7

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report provides a step-by-step guide, or user manual, for personnel responsible for the planning and execution of the verification and validation (V&V), and developmental testing, of expert systems, conventional software systems, and various other types of artificial intelligence systems. While the guide was developed primarily for applications in the utility industry, it applies well to all industries. The user manual has three sections. In Section 1 the user assesses the stringency of V&V needed for the system under consideration, identifies the development stage the system is in, and identifies the component(s) of the system to be tested next. These three pieces of information determine which Guideline Package of V&V methods is most appropriate for those conditions. The V&V Guideline Packages are provided in Section 2. Each package consists of an ordered set of V&V techniques to be applied to the system, guides on choosing the review/evaluation team, measurement criteria, and references to a book or report which describes the application of the method. Section 3 presents details of 11 of the most important (or least well-explained in the literature) methods to assist the user in applying these techniques accurately.

  16. CrossTalk. The Journal of Defense Software Engineering. Volume 17, Number 3, March 2004

    Science.gov (United States)

    2004-03-01

402 Guanajuato, Gto., 36000 MEXICO Phone: +52 (473) 732 7155 ext. 49577 E-mail: moca@cimat.mx Using the Team Software Process in an Outsourcing...Computer Science at the CIMAT, Mexico. He is a SEI-authorized Personal Software Process Instructor and Software Engineering Institute-trained Team...from Louisiana State University. Apdo. Postal 402 Guanajuato, Gto., 36000 MEXICO Phone: +52 (473) 732 7155 ext. 49544 E-mail: masv@cimat.mx Carlos

  17. Orbiter subsystem hardware/software interaction analysis. Volume 8: Forward reaction control system

    Science.gov (United States)

    Becker, D. D.

    1980-01-01

    The results of the orbiter hardware/software interaction analysis for the AFT reaction control system are presented. The interaction between hardware failure modes and software are examined in order to identify associated issues and risks. All orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are discussed.

  18. CONTRIBUTION TO THE DEVELOPMENT OF A SIMULATION SOFTWARE PERFORMANCE AND SHARING RATIO IN LIQUID-LIQUID EXTRACTION

    Directory of Open Access Journals (Sweden)

    A. Hadj Seyd

    2015-07-01

The present work develops software to predict the yield and the distribution coefficient in the liquid-liquid extraction of the components of a mixture, from mathematical models expressing these quantities based on the equilibrium equations between the two liquid phases. The software also predicts the conditions under which the extraction operation is favorable, unfavorable, or impossible to realize, by studying how these quantities vary with the parameters that influence the extraction: initial concentrations, solvent ratio, and pH. It handles both simple extraction (of neutral products) and reactive extraction (of acids or bases that form complexes), for one or more components. The programming language used is Delphi, a powerful object-oriented language for Windows.
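The quantities this software predicts are related by a standard equilibrium mass balance. A minimal sketch of how the extraction yield follows from the distribution coefficient and the phase-volume ratio (illustrative only; the authors' implementation is in Delphi and not reproduced in the abstract):

```python
def extraction_yield(D, v_org, v_aq, stages=1):
    """Fraction of solute transferred to the organic phase at equilibrium.

    D is the distribution coefficient [solute]_org / [solute]_aq. For
    `stages` successive crossflow contacts, each with fresh solvent of
    volume v_org, the fraction remaining in the aqueous phase is
    (1 / (1 + D * v_org / v_aq)) ** stages.
    """
    remaining = (1.0 / (1.0 + D * v_org / v_aq)) ** stages
    return 1.0 - remaining

# One stage, D = 4, equal phase volumes: 4/5 of the solute is extracted.
print(extraction_yield(4.0, 100.0, 100.0))  # → 0.8
# Splitting the same solvent into three smaller contacts extracts more.
print(extraction_yield(4.0, 100.0 / 3, 100.0, stages=3) > 0.8)  # → True
```

For reactive extraction of acids or bases, D itself becomes a function of pH, which is why pH appears among the parameters the software varies.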

  19. Effective Extraction Mechanism of Volume-Produced Ions in the NIPPER I Device

    Directory of Open Access Journals (Sweden)

    Henry Ramos

    1993-12-01

A mass spectrometer system is developed to extract and analyze hydrogen ions from a volume plasma hydrogen ion source. A 180° magnetic deflection-type mass analyzer is coupled to NIPPER I (National Institute of Physics Plasma Experimental Rig I), a negative ion source. Hydrogen plasma is produced from a low-pressure gas (10⁻² Torr), with a transition from a glow discharge (254 volts, 75 mA) to an arc plasma (78 volts, 14 amperes) in a few seconds. The usually cylindrical plasma is converted into a sheet configuration using a pair of Sm-Co magnets. This optimizes ion current extraction by reducing (a) the ion loss to the discharge anode and (b) the decay of the ion current produced in the plasma. Negative hydrogen ions (H-) are volume-produced by dissociative attachment of low-energy electrons to highly vibrationally excited hydrogen molecules. The extraction of H- ions from this volume source is optimized by the proper choice of apertures of the limiting electrodes and of the applied bias potential. A proper combination of extraction electrodes gives an optimum H- current extracted without the electrons. When one of the extraction electrodes is biased negatively near the value of the plasma floating potential, a maximum H- current is also obtained. The methods of effective extraction of H- are discussed.

  20. High integrity software for nuclear power plants: Candidate guidelines, technical basis and research needs. Executive summary: Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Seth, S.; Bail, W.; Cleaves, D.; Cohen, H.; Hybertson, D.; Schaefer, C.; Stark, G.; Ta, A.; Ulery, B. [Mitre Corp., McLean, VA (United States)

    1995-06-01

    The work documented in this report was performed in support of the US Nuclear Regulatory Commission to examine the technical basis for candidate guidelines that could be considered in reviewing and evaluating high integrity computer software used in the safety systems of nuclear power plants. The framework for the work consisted of the following software development and assurance activities: requirements specification; design; coding; verification and validation, including static analysis and dynamic testing; safety analysis; operation and maintenance; configuration management; quality assurance; and planning and management. Each activity (framework element) was subdivided into technical areas (framework subelements). The report describes the development of approximately 200 candidate guidelines that span the entire range of software life-cycle activities; the assessment of the technical basis for those candidate guidelines; and the identification, categorization and prioritization of research needs for improving the technical basis. The report has two volumes: Volume 1, Executive Summary, includes an overview of the framework and of each framework element, the complete set of candidate guidelines, the results of the assessment of the technical basis for each candidate guideline, and a discussion of research needs that support the regulatory function; Volume 2 is the main report.

  1. Extracting Three Dimensional Surface Model of Human Kidney from the Visible Human Data Set using Free Software

    CERN Document Server

    P, Kirana Kumara

    2013-01-01

A three-dimensional digital model of a representative human kidney is needed for a surgical simulator capable of simulating laparoscopic surgery involving the kidney. Buying a three-dimensional computer model of a representative human kidney, or reconstructing a human kidney from an image sequence using commercial software, both involve (sometimes significant amounts of) money. In this paper, the author shows that one can obtain a three-dimensional surface model of a human kidney by making use of images from the Visible Human Data Set and a few free software packages (ImageJ, ITK-SNAP, and MeshLab in particular). Neither the images from the Visible Human Data Set nor the software packages used here cost anything. Hence, the practice of extracting the geometry of a representative human kidney for free, as illustrated in the present work, can be a free alternative to the use of expensive commercial software or the purchase of a digital model.

  2. Guidelines for the verification and validation of expert system software and conventional software: Survey and assessment of conventional software verification and validation methods. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Groundwater, E.H.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease of use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit Metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the identified components of knowledge-based and expert systems, as well as for the system as a whole.

  3. Spacelab software development and integration concepts study report. Volume 2: Appendices

    Science.gov (United States)

    1973-01-01

    Software considerations were developed for incorporation in the spacelab systems design, and include management concepts for top-down structured programming, composite designs for modular programs, and team management methods for production programming.

  4. CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 3

    Science.gov (United States)

    2006-03-01

disciplined process affects software quality. It also confirms that programmer ability affects software quality and, more importantly, shows that even top...across PSP classes: (1) changes in the teaching materials used in the PSP class, or (2) differences between instructors. The possibility of a trend...that experience was a significant factor were performed in the 1970s and 1980s, entry-level programmers were relatively unfamiliar with computers

  5. CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 6

    Science.gov (United States)

    2005-06-01

recommends a number of architectural practices that will help programs avoid or mitigate these dangers. by Rich McCabe and Mike Polen Identifying Your...members during multiple iterations Effective Practices for Object-Oriented System Software Architecting Rich McCabe and Mike Polen Systems and...Phone: (703) 742-7289 Fax: (703) 742-7200 E-mail: mccabe@systemsandsoftware.org Michael Polen is a senior member of the technical staff at the

  6. The SIFT hardware/software systems. Volume 1: A detailed description

    Science.gov (United States)

    Palumbo, Daniel L.

    1985-01-01

    This report contains a detailed description of the software implemented fault-tolerant computer's operating system and hardware subsystems. The Software Implemented Fault-Tolerant (SIFT) computer system was developed as an experimental vehicle for fault-tolerant systems research. The SIFT effort began with broad, in-depth studies stating the reliability and processing requirements for digital computers which would, in the aircraft of the 1990's, control flight-critical functions.

  7. CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 4

    Science.gov (United States)

    2005-04-01

mathematics and statistics. by Lee Fischman, Karen McRitchie, and Daniel D. Galorath The Statistically Unreliable Nature of Lines of Code This author...estimating models are evolving to keep pace with industry changes. As Lee Fischman et al. state, "The future of software project estimating has just...double the results a mere decade ago. As one of the first authors to recognize that software engineering differs from traditional engineering, David

  8. CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 7

    Science.gov (United States)

    2005-07-01

Configuration Management Fundamentals This broad overview of configuration management (CM) takes you through the functions of CM to establishing a software...gives a brief but effective overview of important software acquisition and development topics, provides checklists for rapid self-inspection, and...before Ben could get a word out. Ben was not amused at my desertion, and sent me back up the ladder. Once I realized my fears of a gushing pipe were

  9. Guidelines for the verification and validation of expert system software and conventional software: Rationale and description of V&V guideline packages and procedures. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project, which is jointly funded by the U.S. Nuclear Regulatory Commission and the Electric Power Research Institute toward the objective of formulating Guidelines for the V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves are presented in Volume 7, "User's Manual." Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation); (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software); and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity, which together form three Classes). A V&V Guideline package is provided for each of the combinations of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the Guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The Guidelines can apply to conventional procedural software systems as well as all kinds of AI systems.

  10. Development of a Solid Phase Extraction Method for Agricultural Pesticides in Large-Volume Water Samples

    Science.gov (United States)

    An analytical method using solid phase extraction (SPE) and analysis by gas chromatography/mass spectrometry (GC/MS) was developed for the trace determination of a variety of agricultural pesticides and selected transformation products in large-volume high-elevation lake water sa...

  11. Effect of water coffee extract on kidney volume (a stereological study)

    Directory of Open Access Journals (Sweden)

    Farzaneh Dehghani

    2013-09-01

    Full Text Available Background: Coffee is a traditional drink used by many people around the world, and overuse of coffee can produce side effects. In this study, the effect of different doses of coffee extract on kidney volume was studied by a stereological method. Material and Methods: Sixty Sprague-Dawley male rats were divided into 6 groups. The control group was given tap water (0.5 ml) and the experimental groups were given coffee extract orally for 14 days at doses of 0.125, 0.25, 0.5, 1 and 1.5 g/kg, in the same volume as the control group. The rats were then anesthetized with ether and sacrificed, and their right kidneys were removed, fixed, processed and stained with H&E. The 5 µm slides were studied by the Cavalieri principle. Results: Higher doses of water coffee extract were associated with decreased kidney and glomerular volumes, whereas lower doses increased them relative to the control group. Conclusion: It seems that high doses of coffee have side effects on the kidney, reducing the volume of the kidney and its glomeruli. Further studies are required to confirm these results.
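
    The Cavalieri principle mentioned above estimates an organ's volume from systematically sampled sections: volume is the section spacing times the area associated with each counted grid point, summed over sections. A minimal sketch (function name and numbers are illustrative, not from the paper):

    ```python
    def cavalieri_volume(point_counts, section_thickness, area_per_point):
        """Cavalieri estimator: V = t * (a/p) * total points hitting the organ.

        point_counts      -- grid points falling on the structure in each section
        section_thickness -- distance t between consecutive sections
        area_per_point    -- area a/p represented by one grid point
        """
        return section_thickness * area_per_point * sum(point_counts)
    ```

    With 10, 12 and 8 points counted on three sections 0.005 cm apart and 0.01 cm² per point, the estimate is 0.0015 cm³.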

  12. A method for extracting electronic patient record data from practice management software systems used in veterinary practice.

    Science.gov (United States)

    Jones-Diette, Julie S; Brennan, Marnie L; Cobb, Malcolm; Doit, Hannah; Dean, Rachel S

    2016-10-21

    Data extracted from electronic patient records (EPRs) within practice management software systems are increasingly used in veterinary research. The use of real patient data gives the potential to generate research that can readily be applied to clinical practice. The use of veterinary EPRs for research in the United Kingdom is hindered by the number of different Practice Management System (PMS) providers used by practices, as obtaining and combining data from different systems electronically can be problematic. The use of Extensible Markup Language (XML) to extract clinical data for research would potentially resolve the compatibility issues between systems. The aim of this study was to establish and validate a method for the extraction of small animal patient records from a veterinary PMS that could potentially be used across multiple systems. An XML schema was designed to extract clinical information from EPRs. The schema was tested and validated in a test system, and was then tested in a real small animal practice where data were extracted for 16 weeks. A 10% sample of the extracted records was then compared to paper copies provided by the practice. All 21 fields encoded by the XML schema, from all of the records in the test system, were extracted with 100% accuracy. Over the 18-week data collection period 4946 records, from 1279 patients, were extracted from the small animal practice. The 10% of printed records checked and compared with the XML-extracted records demonstrated that all required data were present. No unrequired sensitive information (e.g. costs or services/products, or personal client information) was extracted. This is the first time a method for data extraction from EPRs in veterinary practice using an XML schema has been reported in the United Kingdom. This is an efficient and accurate way of extracting data which could be applied to all PMSs nationally and internationally.
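
    The schema-driven extraction can be sketched with the standard library; note that the record layout and field names below are invented for illustration and do not reflect the study's actual 21-field schema:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical EPR export; the real schema's elements are not published here.
    SAMPLE = """<records>
      <record>
        <patientId>1279</patientId>
        <species>canine</species>
        <consultDate>2016-01-12</consultDate>
      </record>
    </records>"""

    def extract_fields(xml_text, fields):
        """Pull the named child elements out of each <record> element."""
        root = ET.fromstring(xml_text)
        return [{f: rec.findtext(f) for f in fields} for rec in root.iter("record")]
    ```

    Because every PMS would export to the same schema, the downstream extraction code stays identical across providers, which is the compatibility argument made in the abstract.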

  13. CrossTalk. The Journal of Defense Software Engineering. Volume 25, Number 2

    Science.gov (United States)

    2012-04-01

    devices present expanded attack surfaces through sensors such as GPS, accelerometer, camera, microphone, and gyroscope. Recently, Kaspersky Lab...technical staff member at SEI. She is also a Carnegie Mellon adjunct faculty member. Her research interests include software assurance and SoS

  14. CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 8

    Science.gov (United States)

    2005-08-01

    programmer with a higher-level abstraction to the rudimentary socket interface. Much of the code for these networking middleware technologies can...she currently teaches classes in computer security and does research. Taylor’s research interests are in computer security and software engineering

  15. CrossTalk: The Journal of Defense Software Engineering. Volume 13, Number 5, May 2000

    Science.gov (United States)

    2000-05-01

    to have a good understanding of what things are so sacrosanct that they could not change—the degree of stealthiness; performance parameters...low-risk avionics development approach blended with state-of-the-art software development tools and processes has proven successful. Boeing also is

  16. CrossTalk. The Journal of Defense Software Engineering. Volume 15, Number 12, December 2002

    Science.gov (United States)

    2002-12-01

    Article) R. McCabe, M. Polen 10 30 What Is Agile Software Development? J. Highsmith 10 4 Best Practices A Study of Best Practice Adoption by Defense...Miscellaneous Add Decision Analysis to Your COTS Selection Process B. C. Phillips, S. M. Polen 4 21 CIO Update: The Expanding Responsibilities L. J

  17. CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 9

    Science.gov (United States)

    2008-09-01

    issued a security bulletin, MS03-047 [1] that fixed a cross-site scripting (XSS) vulnerability in the Outlook Web Access ( OWA ) front end to...Microsoft’s Exchange 5.5 software. In August 2004, Microsoft issued another bulletin, MS04-026 [2], in the same OWA component that was fixed in MS03-047 to fix

  18. CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 6

    Science.gov (United States)

    2008-06-01

    Honorable John Grimes Kristen Baldwin Jeff Schwalb Phil Perkins Karl Rogers Joe Jarzombek Brent Baxter Kasey Thompson Ken Davies Chelene Fortier...Reading 1. Boehm, Barry W. Software Engineering Economics. Prentice Hall, Englewood Cliffs, NJ; 1981. 2. Crosby, Philip B. Quality Is Free. New

  19. CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 12

    Science.gov (United States)

    2005-12-01

    mission-critical software may result in harm to or loss of human life and/or mission objectives, as in the case of the Therac-25 radiation overdose...accidents [2] and the Ariane-5 maiden launch failure [9]. The Therac-25 software caused severe radiation burns in numerous cancer patients before it

  20. CrossTalk. The Journal of Defense Software Engineering. Volume 25, Number 3

    Science.gov (United States)

    2012-06-01

    Reading, MA: Addison-Wesley, 1995. Print. 12. Philip M. Johnson and Anne M. Disney, “The Personal Software Process: A Cautionary Case Study,” IEEE...systems.” These systems control and manage the enterprise’s functional areas, generically: facilities, regulations, operations, procurement, human

  1. Simple and inexpensive hardware and software method to measure volume changes in Xenopus oocytes expressing aquaporins.

    Science.gov (United States)

    Dorr, Ricardo; Ozu, Marcelo; Parisi, Mario

    2007-04-15

    Water channel (aquaporin) family members have been identified in central nervous system cells. A classic method to measure membrane water permeability and its regulation is to capture and analyse images of Xenopus laevis oocytes expressing them. Laboratories dedicated to the analysis of motion images usually have powerful equipment valued at thousands of dollars. However, some scientists consider that new approaches are needed to reduce costs in scientific labs, especially in developing countries. The objective of this work is to share a very low-cost hardware and software setup based on a well-selected webcam, a hand-made adapter to a microscope, and the use of free software to measure membrane water permeability in Xenopus oocytes. One of the main purposes of this setup is to maintain a high level of image quality at brief capture intervals (shorter than 70 ms). The presented setup helps to economize without sacrificing image analysis requirements.
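
    A common way to turn such frames into a swelling measurement is to threshold each image, count the oocyte's cross-sectional area, and, assuming a near-spherical cell, take volume proportional to area to the 3/2 power. This is a generic approximation, not necessarily the authors' exact pipeline; the threshold value is an assumption:

    ```python
    import numpy as np

    def relative_volume(frame, threshold):
        """Estimate relative oocyte volume from one grayscale frame.

        Pixels darker than `threshold` are taken as the oocyte. For a
        near-spherical cell the volume scales as area ** 1.5, so tracking
        this quantity over frames tracks swelling up to a constant factor.
        """
        area = np.count_nonzero(frame < threshold)
        return area ** 1.5
    ```

    Applied frame by frame, the slope of this series during an osmotic challenge yields the water permeability estimate.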

  2. Technical Reviews and Audits for Systems, Equipment and Computer Software. Volume 1

    Science.gov (United States)

    2009-09-15

    TRA) Deskbook – DUSD(S&T) (May 2005) 17. IMP & IMS Preparation and Use Guide Version 0.9 (21 October 2005) 18. ISO/IEC STD 15939 Software...ordnance, radiated effects on power buses, lightning and surge protection) d. Preliminary EMI and EMC-critical environmental characteristics and...transmitter RFI with vehicle receivers and ordnance, radiated effects on power buses, lightning and surge protection) 4. EMI and EMC critical environmental

  3. CrossTalk: The Journal of Defense Software Engineering. Volume 26, Number 4

    Science.gov (United States)

    2013-07-01

    and which they attack by massive and costly iterations. c. Lower risk: the program split into these phases automatically assures healthy milestones...required to make any software modification on their end as required to conduct the automated test. The VTE in this example can spawn an instance of ATRT...minutes from Salt Lake City • Utah Jazz Basketball • Three Minor League Baseball Teams • One Hour from 12 Ski Resorts • Minutes from Hunting, Fishing

  4. Computer-Aided Design for Built-In-Test (CADBIT) - Software Specification. Volume 3

    Science.gov (United States)

    1989-10-01

    [garbled OCR of a figure; recoverable caption: Figure 3-13, tutorial figure placement in CAD environment]...have software package for reliability calculation A-8 LIBRARY ELEMENT DATA SHEET BIT TECHNIQUE: ON-BOARD ROM CATEGORY: LONG TUTORIAL PAGE 5 of 14

  5. CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 2

    Science.gov (United States)

    2006-02-01

    Information Exchanges Required to Get It Done • Systems that Support the Activities and Information Exchanges • Specific System Capabilities Required...hardware replacement requires system reboot. As with many other issues, the safety-critical Java profile tackles this challenge using a combination of...Air Systems Command (NAVAIR) Software Systems Support Center. The USAF Software Technology Support Center (STSC) is the publisher of CrossTalk

  6. Continuation of research into software for space operations support, volume 1

    Science.gov (United States)

    Collier, Mark D.; Killough, Ronnie; Martin, Nancy L.

    1990-01-01

    A prototype workstation executive called the Hardware Independent Software Development Environment (HISDE) was developed. Software technologies relevant to workstation executives were researched and evaluated, and HISDE was used as a test bed for prototyping efforts. New X Windows software concepts and technology were introduced into workstation executives and related applications. The four research efforts performed included: (1) research into the usability and efficiency of Motif (an X Windows based graphical user interface), which consisted of converting the existing Athena widget based HISDE user interface to Motif, demonstrating the usability of Motif and providing insight into the level of effort required to translate an application from one widget set to another; (2) prototyping a real-time data display widget, which consisted of researching methods for, and prototyping the selected method of, displaying textual values in an efficient manner; (3) an X Windows performance evaluation, which consisted of a series of performance measurements that demonstrated the ability of low-level X Windows to display textual information; (4) conversion of the Display Manager to X Windows/Motif, the application used by NASA for data display during operational mode.

  7. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume II. Software description and listings. [IAEAPU

    Energy Technology Data Exchange (ETDEWEB)

    Ruhter, W.D.

    1984-05-01

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV ²⁴¹Pu and 208-keV ²³⁷U peaks are required for spectral analysis that gives plutonium isotopic ratios and weight percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operation instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings.

  8. Semiautomated three-dimensional segmentation software to quantify carpal bone volume changes on wrist CT scans for arthritis assessment.

    Science.gov (United States)

    Duryea, J; Magalnick, M; Alli, S; Yao, L; Wilson, M; Goldbach-Mansky, R

    2008-06-01

    Rapid progression of joint destruction is an indication of poor prognosis in patients with rheumatoid arthritis. Computed tomography (CT) has the potential to serve as a gold standard for joint imaging since it provides high resolution three-dimensional (3D) images of bone structure. The authors have developed a method to quantify erosion volume changes on wrist CT scans. In this article they present a description and validation of the methodology using multiple scans of a hand phantom and five human subjects. An anthropomorphic hand phantom was imaged with a clinical CT scanner at three different orientations separated by a 30-deg angle. A reader used the semiautomated software tool to segment the individual carpal bones of each CT scan. Reproducibility was measured as the root-mean-square standard deviation (RMSSD) and coefficient of variation (CoV) between multiple measurements of the carpal volumes. Longitudinal erosion progression was studied by inserting simulated erosions in a paired second scan. The change in simulated erosion size was calculated by performing 3D image registration and measuring the volume difference between scans in a region adjacent to the simulated erosion. The RMSSD for the total carpal volumes was 21.0 mm³ (CoV = 1.3%) for the phantom, and 44.1 mm³ (CoV = 3.0%) for the in vivo subjects. Using 3D registration and local volume difference calculations, the RMSSD was 1.0-3.0 mm³. The reader time was approximately 5 min per carpal bone. There was excellent agreement between the measured and simulated erosion volumes. The effect of a poorly measured volume for a single erosion is mitigated by the large number of subjects that would comprise a clinical study and by the many erosions measured per patient. CT promises to be a quantifiable tool to measure erosion volumes and may serve as a gold standard in the validation of other modalities such as magnetic resonance imaging.
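
    Once the two scans are registered, the local volume difference reduces to a voxel count scaled by the voxel size. A minimal sketch of that last step (mask names and voxel size are illustrative, not the authors' code):

    ```python
    import numpy as np

    def volume_change(mask_before, mask_after, voxel_volume_mm3):
        """Volume difference in mm^3 between two registered binary bone masks,
        evaluated over the region of interest adjacent to an erosion.
        A negative result means bone was lost (the erosion grew)."""
        diff = int(mask_after.sum()) - int(mask_before.sum())
        return diff * voxel_volume_mm3
    ```

    In a study, this difference would be computed only inside the registered neighborhood of each erosion rather than over the whole carpal mask.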

  9. MPS Solidification Model. Volume 2: Operating guide and software documentation for the unsteady model

    Science.gov (United States)

    Maples, A. L.

    1981-01-01

    The operation of solidification Model 2 is described and documentation of the software associated with the model is provided. Model 2 calculates the macrosegregation in a rectangular ingot of a binary alloy as a result of unsteady horizontal axisymmetric bidirectional solidification. The solidification program allows interactive modification of calculation parameters as well as selection of graphical and tabular output. In batch mode, parameter values are input in card image form and output consists of printed tables of solidification functions. The operational aspects of Model 2 that differ substantially from Model 1 are described. The global flow diagrams and data structures of Model 2 are included. The primary program documentation is the code itself.

  10. CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 12

    Science.gov (United States)

    2008-12-01

    Koppelman Primavera Systems, Inc. “... the TCPI, when used in conjunction with the CPI...specializing in EV. He has been a consultant to the staff at Primavera Systems, Inc. since 1993. Fleming was on the core team that updated the PMI’s...quentinf.com Joel M. Koppelman is the co-founder and CEO of Primavera Systems, Inc. He co-authored the book “Earned Value Project Management,” published by

  11. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  12. Fast implementation of kernel simplex volume analysis based on modified Cholesky factorization for endmember extraction

    Institute of Scientific and Technical Information of China (English)

    Jing LI; Xiao-run LI; Li-jiao WANG; Liao-ying ZHAO

    2016-01-01

    Endmember extraction is a key step in the hyperspectral image analysis process. The kernel new simplex growing algorithm (KNSGA), recently developed as a nonlinear alternative to the simplex growing algorithm (SGA), has proven a promising endmember extraction technique. However, KNSGA still suffers from two issues limiting its application. First, its random initialization leads to inconsistency in final results; second, excessive computation is caused by the iterations of a simplex volume calculation. To solve the first issue, the spatial pixel purity index (SPPI) method is used in this study to extract the first endmember, eliminating the initialization dependence. A novel approach tackles the second issue by initially using a modified Cholesky factorization to decompose the volume matrix into triangular matrices, in order to avoid directly computing the determinant in the simplex volume formula. Theoretical analysis and experiments on both simulated and real spectral data demonstrate that the proposed algorithm significantly reduces computational complexity, and runs faster than the original algorithm.
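
    The factorization trick can be sketched directly: for a simplex whose edge vectors form matrix M, the Gram matrix G = MᵀM satisfies vol = sqrt(det G)/n!, and the Cholesky factor L of G gives sqrt(det G) as the product of diag(L), so no determinant is evaluated explicitly. Illustrative numpy code, not the KNSGA implementation:

    ```python
    import math
    import numpy as np

    def simplex_volume(vertices):
        """Volume of an n-simplex from its vertex coordinates (one per row).

        Edge vectors v_i - v_0 form the columns of M; with G = M^T M and
        Cholesky factor L (G = L L^T), sqrt(det G) = prod(diag(L)), so
        vol = prod(diag(L)) / n! without a direct determinant computation.
        """
        v = np.asarray(vertices, dtype=float)
        m = (v[1:] - v[0]).T                    # edge vectors as columns
        gram = m.T @ m
        L = np.linalg.cholesky(gram)
        n = v.shape[0] - 1
        return float(np.prod(np.diag(L))) / math.factorial(n)
    ```

    Because G is symmetric positive definite for a non-degenerate simplex, the Cholesky route is both cheaper and numerically steadier than cofactor-style determinant evaluation inside the growing loop.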

  13. Description and characterization of a novel method for partial volume simulation in software breast phantoms.

    Science.gov (United States)

    Chen, Feiyu; Bakic, Predrag R; Maidment, Andrew D A; Jensen, Shane T; Shi, Xiquan; Pokrajac, David D

    2015-10-01

    A modification to our previous simulation of breast anatomy is proposed to improve the quality of simulated x-ray projection images. Image quality is affected by the voxel size of the simulation: large voxels can cause notable spatial quantization artifacts, while small voxels extend the generation time and increase the memory requirements. An improvement in image quality is achievable without reducing voxel size by simulating partial volume averaging, in which voxels containing more than one simulated tissue type are allowed. The linear x-ray attenuation coefficient of such a voxel is, thus, the sum of the tissue linear attenuation coefficients, each weighted by the voxel subvolume occupied by that tissue type. A local planar approximation of the boundary surface is employed. In the two-material case, the partial volume in each voxel is computed by decomposition into up to four simple geometric shapes. In the three-material case, by application of the Gauss-Ostrogradsky theorem, the 3D partial volume problem is converted into a few simpler 2D surface area problems. We illustrate the benefits of the proposed methodology on simulated x-ray projections. An efficient encoding scheme is proposed for the type and proportion of simulated tissues in each voxel. Monte Carlo simulation was used to evaluate the quantitative error of our approximation algorithms.
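
    The weighting rule itself is simple to state in code; the attenuation coefficients below are placeholders, not values from the paper:

    ```python
    def voxel_attenuation(fractions, mu):
        """Linear attenuation of a mixed voxel: each tissue's coefficient
        weighted by the subvolume fraction that tissue occupies.

        fractions -- subvolume fraction per tissue (must sum to 1)
        mu        -- linear attenuation coefficient per tissue
        """
        assert abs(sum(fractions) - 1.0) < 1e-9, "fractions must sum to 1"
        return sum(f * m for f, m in zip(fractions, mu))
    ```

    For a voxel that is 70% one tissue (mu = 0.5) and 30% another (mu = 0.2), the effective coefficient is 0.41; the paper's contribution is the geometric computation of those fractions from the planar boundary approximation.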

  14. High integrity software for nuclear power plants: Candidate guidelines, technical basis and research needs. Main report, Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Seth, S.; Bail, W.; Cleaves, D.; Cohen, H.; Hybertson, D.; Schaefer, C.; Stark, G.; Ta, A.; Ulery, B. [Mitre Corp., McLean, VA (United States)

    1995-06-01

    The work documented in this report was performed in support of the US Nuclear Regulatory Commission to examine the technical basis for candidate guidelines that could be considered in reviewing and evaluating high integrity computer software. The candidate guidelines address the following software development and assurance activities: requirements specification; design; coding; verification and validation, including static analysis and dynamic testing; safety analysis; operation and maintenance; configuration management; quality assurance; and planning and management. Each activity (framework element) was subdivided into technical areas (framework subelements). The report describes the development of approximately 200 candidate guidelines that span the entire range of these activities; the identification, categorization and prioritization of the technical basis for those candidate guidelines; and the identification, categorization and prioritization of research needs for improving the technical basis. The report has two volumes: Volume 1, Executive Summary, includes an overview of the framework and of each framework element, the complete set of candidate guidelines, the results of the assessment of the technical basis for each candidate guideline, and a discussion of research needs that support the regulatory function; this document, Volume 2, is the main report.

  15. Automatic extraction of forward stroke volume using dynamic PET/CT

    DEFF Research Database (Denmark)

    Harms, Hans; Tolbod, Lars Poulsen; Hansson, Nils Henrik;

    Background: Dynamic PET can be used to extract forward stroke volume (FSV) by the indicator dilution principle. The technique employed can be automated and is in theory independent of the tracer used, and may therefore be added to any dynamic cardiac PET protocol. The aim of this study was to validate automated methods for extracting FSV directly from dynamic PET studies for two different tracers and to examine potential scanner hardware bias. Methods: 21 subjects underwent a dynamic 27 min 11C-acetate PET scan on a Siemens Biograph TruePoint 64 PET/CT scanner (scanner I). In addition, 8 subjects underwent a dynamic 6 min 15O-water PET scan followed by a 27 min 11C-acetate PET scan on a GE Discovery ST PET/CT scanner (scanner II). The LV-aortic time-activity curve (TAC) was extracted automatically from dynamic PET data using cluster analysis. The first-pass peak was isolated by automatic...
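
    The indicator dilution (Stewart-Hamilton) calculation behind FSV is generic: cardiac output equals the injected dose divided by the area under the isolated first-pass curve, and dividing by heart rate gives the volume ejected per beat. The sketch below is that textbook relation, not the study's pipeline; variable names and units are assumptions and must be kept consistent by the caller:

    ```python
    import numpy as np

    def forward_stroke_volume(t, c_firstpass, injected_dose, heart_rate_bpm):
        """Stewart-Hamilton: CO = dose / area under first-pass TAC;
        FSV = CO / heart rate (volume ejected per beat).

        t            -- sample times of the first-pass curve (minutes)
        c_firstpass  -- activity concentration at those times
        """
        # trapezoidal area under the isolated first-pass peak
        auc = float(np.sum((c_firstpass[1:] + c_firstpass[:-1]) * np.diff(t)) / 2.0)
        cardiac_output = injected_dose / auc   # units follow the inputs
        return cardiac_output / heart_rate_bpm
    ```

    In the abstract's workflow, `c_firstpass` would be the automatically isolated first-pass portion of the cluster-derived LV-aortic TAC.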

  16. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  17. A high volume extraction and purification method for recovering DNA from human bone.

    Science.gov (United States)

    Marshall, Pamela L; Stoljarova, Monika; Schmedes, Sarah E; King, Jonathan L; Budowle, Bruce

    2014-09-01

    DNA recovery, purity and overall extraction efficiency of a protocol employing a novel silica-based column, Hi-Flow® (Generon Ltd., Maidenhead, UK), were compared with those of a standard organic DNA extraction methodology. The quantities of DNA recovered by each method were compared by real-time PCR, and the quality of the DNA by STR typing using the PowerPlex® ESI 17 Pro System (Promega Corporation, Madison, WI), on DNA from 10 human bone samples. Overall, the Hi-Flow method recovered comparable quantities of DNA, ranging from 0.8 ± 1 ng to 900 ± 159 ng, compared with the organic method, ranging from 0.5 ± 0.9 ng to 855 ± 156 ng. Complete profiles (17/17 loci tested) were obtained for at least one of three replicates for 3/10 samples using the Hi-Flow method and for 2/10 samples with the organic method. All remaining bone samples yielded partial profiles for all replicates with both methods. Compared with a standard organic DNA isolation method, the results indicated that the Hi-Flow method provided equal or improved recovery and quality of DNA without the harmful effects of organic extraction. Moreover, larger extraction volumes (up to 20 mL) can be employed with the Hi-Flow method, which enabled more bone sample to be extracted at one time.

  18. Performance evaluation of automated segmentation software on optical coherence tomography volume data.

    Science.gov (United States)

    Tian, Jing; Varga, Boglarka; Tatrai, Erika; Fanni, Palya; Somfai, Gabor Mark; Smiddy, William E; Debuc, Delia Cabrera

    2016-05-01

    Over the past two decades a significant number of OCT segmentation approaches have been proposed in the literature. Each methodology has been conceived for and/or evaluated using specific datasets that do not reflect the complexities of the retinal features widely observed in clinical settings, and no appropriate OCT dataset with ground truth reflecting those everyday clinical realities exists. While the need for unbiased performance evaluation of automated segmentation algorithms is obvious, validation has usually been performed by comparison with the manual labelings of each individual study, with no common ground truth; a performance comparison of different algorithms using the same ground truth has therefore never been performed. This paper reviews research-oriented tools for automated segmentation of the retinal tissue on OCT images. It also evaluates and compares the performance of these software tools against a common ground truth.
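
    Evaluations of this kind typically report an overlap metric between each tool's output and the common ground truth, most often the Dice coefficient; a minimal sketch (the specific metrics used in the paper are not assumed):

    ```python
    import numpy as np

    def dice(seg, truth):
        """Dice coefficient 2|A∩B| / (|A| + |B|) between two binary masks."""
        seg, truth = np.asarray(seg, bool), np.asarray(truth, bool)
        denom = seg.sum() + truth.sum()
        if denom == 0:
            return 1.0  # both masks empty: perfect agreement by convention
        return 2.0 * np.logical_and(seg, truth).sum() / denom
    ```

    Computing the same metric against the same ground truth is exactly what makes scores from different algorithms comparable.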

  19. Finite volume treatment of pi pi scattering and limits to phase shifts extraction from lattice QCD

    CERN Document Server

    Albaladejo, M; Oset, E; Rios, G; Roca, L

    2012-01-01

    We study theoretically the effects of finite volume for ππ scattering in order to extract physical observables for infinite volume from lattice QCD. We compare three different approaches for ππ scattering (lowest order Bethe-Salpeter approach, N/D and inverse amplitude methods) with the aim of studying the effects of the finite size of the box in the potential of the different theories, especially the left-hand cut contribution through loops in the crossed t,u-channels. We quantify the error made by neglecting these effects in usual extractions of physical observables from lattice QCD spectra. We conclude that for ππ phase shifts in the scalar-isoscalar channel up to 800 MeV this effect is negligible for box sizes bigger than 2.5 m_pi^-1 and of the order of 5% at around 1.5-2 m_pi^-1. For isospin 2 the finite size effects can reach up to 10% for that energy. We also quantify the error made when using the standard Luscher method to extract physical observables from lattice QCD, which is widely used in the lite...

  20. A Novel Method for Extracting Respiration Rate and Relative Tidal Volume from Infrared Thermography

    Science.gov (United States)

    Lewis, Gregory F.; Gatto, Rodolfo G.; Porges, Stephen W.

    2010-01-01

    In psychophysiological research, measurement of respiration has been dependent on transducers having direct contact with the participant. The current study provides empirical data demonstrating that a noncontact technology, infrared video thermography, can accurately estimate breathing rate and relative tidal volume across a range of breathing patterns. Video tracking algorithms were applied to frame-by-frame thermal images of the face to extract time series of nostril temperature and to generate breath-by-breath measures of respiration rate and relative tidal volume. The thermal indices of respiration were contrasted with criterion measures collected with inductance plethysmography. The strong correlations observed between the technologies demonstrate the potential use of facial video thermography as a noncontact technology to monitor respiration. PMID:21214587
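
    Breath-by-breath measures follow from the nostril-temperature series by simple peak detection, since each exhalation warms the nostril. A numpy-only sketch with an intentionally naive local-maximum test (a real pipeline would smooth the trace and enforce a minimum inter-breath interval):

    ```python
    import numpy as np

    def breathing_rate_bpm(temp, fs):
        """Mean breaths per minute from a nostril-temperature trace.

        temp -- temperature samples; one local maximum ~ one exhalation
        fs   -- sampling rate in Hz
        """
        t = np.asarray(temp, float)
        # interior samples strictly above the left neighbour, at least the right
        peaks = np.flatnonzero((t[1:-1] > t[:-2]) & (t[1:-1] >= t[2:])) + 1
        if len(peaks) < 2:
            return 0.0
        mean_period_s = float(np.mean(np.diff(peaks))) / fs
        return 60.0 / mean_period_s
    ```

    The peak-to-peak temperature excursion over the same intervals would give the relative tidal volume proxy described in the abstract.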

  1. Crack surface extraction of industrial CT volume data using FPIT and planelet.

    Science.gov (United States)

    Li, Zongjian; Zeng, Li; Zou, Xiaobing; Xiang, Caibing

    2011-01-01

    As an advanced nondestructive testing (NDT) technology, industrial computed tomography (ICT) has been widely applied in diversified areas. In modern industry, ICT is especially useful for analyzing inner defects of complex, closed work pieces. The common defects of work pieces include gas cavities, slag inclusions, cracks and shrinkage cavities; of these, cracks are the ones typically caused by fatigue. Precisely extracting a crack is important for estimating the remaining secure service time of the work piece. This paper presents a crack surface extraction method for ICT volume data based on the finite plane integral transform (FPIT) and the planelet. FPIT and the planelet, as new methods of multiscale geometric analysis (MGA), have distinct discrimination for different plane singularities. The paper first introduces the definitions of FPIT and the planelet. Second, after analyzing the components and relationships of the planelet at monoscale, a fast implementation of the planelet transform is designed. Third, the steps of the proposed crack surface extraction method are described. In numerical experiments, compared with the 3D facet model, the C-V model and the 3D wavelet method, the proposed method extracts the crack surface fully and continuously and is robust to noise.

  2. Development of automated extraction method of biliary tract from abdominal CT volumes based on local intensity structure analysis

    Science.gov (United States)

    Koga, Kusuto; Hayashi, Yuichiro; Hirose, Tomoaki; Oda, Masahiro; Kitasaka, Takayuki; Igami, Tsuyoshi; Nagino, Masato; Mori, Kensaku

    2014-03-01

In this paper, we propose an automated biliary tract extraction method from abdominal CT volumes. The biliary tract is the path by which bile is transported from the liver to the duodenum. No method has been reported for the automated extraction of the biliary tract from common contrast-enhanced CT volumes. Our method consists of three steps: (1) extraction of extrahepatic bile duct (EHBD) candidate regions, (2) extraction of intrahepatic bile duct (IHBD) candidate regions, and (3) combination of these candidate regions. The IHBD has linear structures, and its intensities are low in CT volumes. We use a dark linear structure enhancement (DLSE) filter, based on a local intensity structure analysis using the eigenvalues of the Hessian matrix, for IHBD candidate region extraction. The EHBD region is extracted using a thresholding process and a connected component analysis. In the combination process, we connect the IHBD candidate regions to each EHBD candidate region and select a bile duct region from the connected candidate regions. We applied the proposed method to 22 CT volumes. The average Dice coefficient of the extraction results was 66.7%.
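The Hessian-eigenvalue enhancement of dark linear structures can be sketched as follows: at a voxel inside a dark tube, the two cross-sectional eigenvalues are large and positive while the along-axis eigenvalue is near zero. This is a generic vesselness-style sketch on a synthetic volume, not the paper's exact DLSE filter; the tube geometry, scale sigma, and response formula are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Synthetic 3D volume: bright background with a dark tube along z
# (a stand-in for a low-intensity bile duct)
vol = np.full((32, 32, 32), 100.0)
zz, yy, xx = np.indices(vol.shape)
vol[(yy - 16) ** 2 + (xx - 16) ** 2 <= 4] = 20.0  # dark tube, radius ~2 voxels
vol += np.random.default_rng(1).normal(0, 1.0, vol.shape)

sigma = 2.0
# Hessian via Gaussian derivative filters, one second derivative per (i, j)
H = np.empty(vol.shape + (3, 3))
for i in range(3):
    for j in range(3):
        order = [0, 0, 0]
        order[i] += 1
        order[j] += 1
        H[..., i, j] = gaussian_filter(vol, sigma, order=order)

# Eigenvalues per voxel, ascending: l1 <= l2 <= l3
lam = np.linalg.eigvalsh(H.reshape(-1, 3, 3)).reshape(vol.shape + (3,))
l2, l3 = lam[..., 1], lam[..., 2]
# Dark tubular voxels: two large positive eigenvalues, one near zero
response = np.where((l2 > 0) & (l3 > 0), l2, 0.0)

center = float(response[16, 16, 16])  # inside the tube
edge = float(response[16, 2, 2])      # background
print(center > edge)
```

A full implementation would sweep several sigmas and keep the per-voxel maximum, since duct diameters vary.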

  3. A Method for Measuring the Volume of Transdermally Extracted Interstitial Fluid by a Three-Electrode Skin Resistance Sensor

    Directory of Open Access Journals (Sweden)

    Dachao Li

    2014-04-01

Full Text Available It is difficult to accurately measure the volume of transdermally extracted interstitial fluid (ISF), which is important for improving blood glucose prediction accuracy. Skin resistance, a good indicator of skin permeability, can be used to determine the volume of extracted ISF. However, it is a challenge to realize in vivo longitudinal skin resistance measurements of microareas. In this study, a three-electrode sensor was presented for measuring single-point skin resistance in vivo, and a method for determining the volume of transdermally extracted ISF using this sensor was proposed. Skin resistance was measured under static and dynamic conditions. The correlation between skin resistance and the permeation rate of transdermally extracted ISF was demonstrated, and the volume of transdermally extracted ISF was determined from skin resistance. Factors affecting the volume prediction accuracy of transdermally extracted ISF were discussed. This method is expected to improve the accuracy of blood glucose prediction and is of great significance for the clinical application of minimally invasive blood glucose measurement.

  4. Interim report on the development and application of environmental mapped data digitization, encoding, analysis, and display software for the ALICE system. Volume II. [MAP, CHAIN, FIX, and DOUT, in FORTRAN IV for PDP-10

    Energy Technology Data Exchange (ETDEWEB)

    Amiot, L.W.; Lima, R.J.; Scholbrock, S.D.; Shelman, C.B.; Wehman, R.H.

    1979-06-01

    Volume I of An Interim Report on the Development and Application of Environmental Mapped Data Digitization, Encoding, Analysis, and Display Software for the ALICE System provided an overall description of the software developed for the ALICE System and presented an example of its application. The scope of the information presented in Volume I was directed both to the users and developers of digitization, encoding, analysis, and display software. Volume II presents information which is directly related to the actual computer code and operational characteristics (keys and subroutines) of the software. Volume II will be of more interest to developers of software than to users of the software. However, developers of software should be aware that the code developed for the ALICE System operates in an environment where much of the peripheral hardware to the PDP-10 is ANL/AMD built. For this reason, portions of the code may have to be modified for implementation on other computer system configurations. 11 tables.

  5. Base excision repair efficiency and mechanism in nuclear extracts are influenced by the ratio between volume of nuclear extraction buffer and nuclei-Implications for comparative studies

    DEFF Research Database (Denmark)

    Akbari, Mansour; Krokan, Hans E

    2012-01-01

…using purified proteins essentially mirror properties of the proteins used, and do not necessarily reflect the mechanism as it occurs in the cell. Nuclear extracts from cultured cells have the capacity to carry out complete BER and can give important information on the mechanism. Furthermore, candidate proteins in extracts can be inhibited or depleted in a controlled way, making defined extracts an important source for mechanistic studies. The major drawback is that there is no standardized method of preparing nuclear extract for BER studies, and it does not appear to be a topic given much attention. Here we have examined the BER activity of nuclear extracts from HeLa cells, using as substrate a circular DNA molecule with either uracil or an AP site in a defined position. We show that the BER activity of nuclear extracts from the same batch of cells varies inversely with the volume of nuclear…

  6. Characteristics of the negative ion beam extracted from an LBL multicusp volume source

    Energy Technology Data Exchange (ETDEWEB)

    Debiak, T.W.; Solensten, L.; Sredniawski, J.J.; Ng, Y.C.; Heuer, R. (Grumman Corporation, Bethpage, New York 11714 (US))

    1990-01-01

    This work encompasses a study of the beam position, profile, and emittance of a Lawrence Berkeley Lab (LBL) multicusp volume source. The study includes a comparison of different extraction geometries with single- and multiple-hole apertures. Our work is currently based on single-gap extraction and acceleration. These experiments are the first of a planned series of studies with various extractor geometries. The beam profile full width at half-maximum ranged from 5.7 to 10.2 mm at a position 69 mm from the emission aperture. Measurements of profile and position in the vertical direction indicate that the beam is significantly bent in the direction expected due to the field of the electron separation magnet. Phase space contour plots in the horizontal plane have been obtained for circular extraction apertures with a diameter of 1.0 and 2.0 mm, and a multiple-hole aperture with an overall diameter of 2.46 mm. Emittances were calculated to be as low as 0.0010 {pi} cm mrad for the 1-mm aperture and 0.0014 {pi} cm mrad for the 2-mm aperture. Emittances are not reported for the multiple-hole aperture due to the shape of the phase space contours; however, analysis of the data is in progress to provide a meaningful comparison of the single-hole and multiple-hole beam characteristics.
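The emittances quoted above follow from the standard statistical (RMS) definition over the measured phase-space distribution. A minimal sketch, using synthetic x-x' samples rather than the actual measured contours:

```python
import numpy as np

# RMS emittance from sampled phase-space coordinates (x in cm, x' in mrad):
# eps = sqrt(<x^2><x'^2> - <x x'>^2), the standard statistical definition.
# The sample values below are synthetic, not taken from the measurements.
rng = np.random.default_rng(2)
n = 10_000
x = rng.normal(0.0, 0.05, n)            # cm
xp = 2.0 * x + rng.normal(0.0, 0.5, n)  # mrad, with some x-x' correlation

def rms_emittance(x, xp):
    x = x - x.mean()
    xp = xp - xp.mean()
    return np.sqrt(np.mean(x ** 2) * np.mean(xp ** 2) - np.mean(x * xp) ** 2)

eps = rms_emittance(x, xp)  # cm·mrad
print(round(float(eps), 4))
```

Reported values of order 0.001 pi cm mrad would come out of the same formula applied to the slit-scan phase-space data, with the pi factored out by convention.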

  7. Ion Exchange and Solvent Extraction: Supramolecular Aspects of Solvent Exchange Volume 21

    Energy Technology Data Exchange (ETDEWEB)

    Gloe, Karsten [Technischen Universität Dresden; Tasker, Peter A [ORNL; Oshima, Tatsuya [University of Miyazaki; Watarai, Hitoshi [Institute for NanoScience Design at Osaka University; Nilsson, Mikael [University of California, Irvine

    2013-01-01

    Preface The theme of supramolecular chemistry (SC), entailing the organization of multiple species through noncovalent interactions, has permeated virtually all aspects of chemical endeavor over the past several decades. Given that the observed behavior of discrete molecular species depends upon their weak interactions with one another and with matrix components, one would have to conclude that SC must indeed form part of the fabric of chemistry itself. A vast literature now serves to categorize SC phenomena within a body of consistent terminology. The word supramolecular itself appears in the titles of dozens of books, several journals, and a dedicated encyclopedia. Not surprisingly, the theme of SC also permeates the field of solvent extraction (SX), inspiring the framework for this volume of Ion Exchange and Solvent Extraction. It is attempted in the six chapters of this volume to identify both how supramolecular behavior occurs and is studied in the context of SX and how SC is influencing the current direction of SX. Researchers and practitioners have long dealt with supramolecular interactions in SX. Indeed, the use of polar extractant molecules in nonpolar media virtually assures that aggregative interactions will dominate the solution behavior of SX. Analytical chemists working in the 1930s to the 1950s with simple mono- and bidentate chelating ligands as extractants noted that extraction of metal ions obeyed complicated mass-action equilibria involving complex stoichiometries. As chemists and engineers developed processes for nuclear and hydrometallurgical applications in the 1950s and 1960s, the preference for aliphatic diluents only enhanced the complexity and supramolecular nature of extraction chemistry. 
Use of physical techniques such as light scattering and vapor-pressure measurements together with various spectroscopic methods revealed organic-phase aggregates from well-defined dimers to small aggregates containing a few extractant molecules to large

  8. Quantitative radiology: automated measurement of polyp volume in computed tomography colonography using Hessian matrix-based shape extraction and volume growing

    Science.gov (United States)

    Epstein, Mark L.; Obara, Piotr R.; Chen, Yisong; Liu, Junchi; Zarshenas, Amin; Makkinejad, Nazanin; Dachman, Abraham H.

    2015-01-01

Background Current measurement of the single longest dimension of a polyp is subjective and varies among radiologists. Our purpose was to develop a computerized measurement of polyp volume in computed tomography colonography (CTC). Methods We developed a 3D automated scheme for measuring polyp volume at CTC. Our scheme consisted of segmentation of the colon wall to confine polyp segmentation to the wall, extraction of a highly polyp-like seed region based on the Hessian matrix, a 3D volume growing technique under the minimum surface expansion criterion for segmentation of polyps, and sub-voxel refinement and surface smoothing for obtaining a smooth polyp surface. Our database consisted of 30 polyp views (15 polyps) in CTC scans from 13 patients. Each patient was scanned in the supine and prone positions. Polyp sizes measured in optical colonoscopy (OC) ranged from 6 to 18 mm with a mean of 10 mm. A radiologist outlined polyps in each slice and calculated volumes by summation of the per-slice volumes. The measurement study was repeated 3 times, at least 1 week apart, to minimize memory-effect bias. We used the mean volume of the three studies as the “gold standard”. Results Our measurement scheme yielded a mean polyp volume of 0.38 cc (range, 0.15-1.24 cc), whereas the mean “gold standard” manual volume was 0.40 cc (range, 0.15-1.08 cc). The “gold-standard” manual and computer volumetrics reached excellent agreement (intra-class correlation coefficient =0.80), with no statistically significant difference [P (F≤f) =0.42]. Conclusions We developed an automated scheme for measuring polyp volume at CTC based on Hessian matrix-based shape extraction and volume growing. Polyp volumes obtained by our automated scheme agreed excellently with “gold standard” manual volumes. Our fully automated scheme can efficiently provide accurate polyp volumes for radiologists; thus, it would help radiologists improve the accuracy and efficiency of polyp volume
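The volume-growing step can be sketched as a flood fill from a seed voxel, with the final volume converted to cc from the voxel spacing. This is a generic 6-connected region grower on a synthetic "polyp", not the paper's minimum-surface-expansion criterion or its sub-voxel refinement; the intensity threshold and voxel spacing are assumed values.

```python
import numpy as np
from collections import deque

# Synthetic CT-like volume: zero background with a spherical "polyp"
vol = np.zeros((40, 40, 40), dtype=float)
zz, yy, xx = np.indices(vol.shape)
sphere = (zz - 20) ** 2 + (yy - 20) ** 2 + (xx - 20) ** 2 <= 8 ** 2
vol[sphere] = 300.0

def grow(vol, seed, lo):
    """6-connected region growing: accept neighbors with intensity >= lo."""
    mask = np.zeros(vol.shape, dtype=bool)
    if vol[seed] < lo:
        return mask
    mask[seed] = True
    q = deque([seed])
    while q:
        z, y, x = q.popleft()
        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            nz, ny, nx = z + dz, y + dy, x + dx
            if (0 <= nz < vol.shape[0] and 0 <= ny < vol.shape[1]
                    and 0 <= nx < vol.shape[2]
                    and not mask[nz, ny, nx] and vol[nz, ny, nx] >= lo):
                mask[nz, ny, nx] = True
                q.append((nz, ny, nx))
    return mask

mask = grow(vol, (20, 20, 20), lo=150.0)
spacing_mm = (0.7, 0.7, 0.7)  # assumed voxel size
volume_cc = mask.sum() * float(np.prod(spacing_mm)) / 1000.0
print(round(volume_cc, 2))
```

In the actual scheme the growth criterion is geometric (minimum surface expansion) rather than a fixed intensity threshold, which is what keeps the segmentation from leaking through the colon wall.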

  9. Achieving Better Buying Power through Acquisition of Open Architecture Software Systems. Volume 2 Understanding Open Architecture Software Systems: Licensing and Security Research and Recommendations

    Science.gov (United States)

    2016-01-06

of Defense (DoD) and military services in free and open source software (OSS) first appeared in the past five or so years [cf. Bol03]. More recently… shared development; collaborative buying; donation; sponsorship; free/open source software (e.g., Government OSS – GOSS); and others [Hanf13… (2007). Free/Open Source Software Development: Recent Research Results and Methods, in M. Zelkowitz (Ed.), Advances in Computers, 69, 243-295.

  10. A high-throughput platform for low-volume high-temperature/pressure sealed vessel solvent extractions

    Energy Technology Data Exchange (ETDEWEB)

    Damm, Markus [Christian Doppler Laboratory for Microwave Chemistry (CDLMC) and Institute of Chemistry, Karl-Franzens-University Graz, Heinrichstrasse 28, A-8010 Graz (Austria); Kappe, C. Oliver, E-mail: oliver.kappe@uni-graz.at [Christian Doppler Laboratory for Microwave Chemistry (CDLMC) and Institute of Chemistry, Karl-Franzens-University Graz, Heinrichstrasse 28, A-8010 Graz (Austria)

    2011-11-30

Highlights: • Parallel low-volume coffee extractions in sealed-vessel HPLC/GC vials. • Extractions are performed at high temperatures and pressures (200 °C/20 bar). • Rapid caffeine determination from the liquid phase. • Headspace analysis of volatiles using solid-phase microextraction (SPME). - Abstract: A high-throughput platform for performing parallel solvent extractions in sealed HPLC/GC vials inside a microwave reactor is described. The system consists of a strongly microwave-absorbing silicon carbide plate with 20 cylindrical wells of appropriate dimensions to be fitted with standard HPLC/GC autosampler vials serving as extraction vessels. Because up to four heating platforms (80 vials) can be heated simultaneously, efficient parallel analytical-scale solvent extractions can be performed using volumes of 0.5-1.5 mL at a maximum temperature/pressure limit of 200 °C/20 bar. Since the extraction and subsequent analysis by either gas or liquid chromatography coupled with mass detection (GC-MS or LC-MS) is performed directly from the autosampler vial, errors caused by sample transfer can be minimized. The platform was evaluated for the extraction and quantification of caffeine from commercial coffee powders, assessing different solvent types, extraction temperatures, and times. For example, 141 ± 11 µg caffeine (from 5 mg coffee powder) were extracted during a single extraction cycle using methanol as extraction solvent, whereas only 90 ± 11 µg were obtained performing the extraction in methylene chloride under the same conditions (90 °C, 10 min). In multiple extraction experiments a total of ~150 µg caffeine was extracted from 5 mg commercial coffee powder. In addition to the quantitative caffeine determination, a comparative qualitative analysis of the liquid phase coffee

  11. Purification of nattokinase by reverse micelles extraction from fermentation broth: effect of temperature and phase volume ratio.

    Science.gov (United States)

    Liu, Jun-Guo; Xing, Jian-Min; Chang, Tian-Shi; Liu, Hui-Zhou

    2006-03-01

Nattokinase is a novel fibrinolytic enzyme that is considered a promising agent for thrombosis therapy. In this study, reverse micelles extraction was applied to purify and concentrate nattokinase from fermentation broth. The effects of temperature and phase volume ratio on the forward and backward extraction steps were examined. The optimal temperatures for forward and backward extraction were 25 °C and 35 °C, respectively. Nattokinase became more thermosensitive during reverse micelles extraction, and it could be enriched eight-fold in the stripping phase during backward extraction. It was found that nattokinase could be purified by AOT reverse micelles with up to 80% activity recovery and a purification factor of 3.9.
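The two figures of merit quoted for such a purification, activity recovery and purification factor, follow from standard definitions (recovered activity over feed activity, and specific activity out over specific activity in). A small sketch with illustrative numbers chosen to mirror the ~80% recovery and 3.9 purification factor; the protein masses are hypothetical:

```python
# Activity recovery and purification factor for an extraction step.
def recovery(act_out, act_in):
    return act_out / act_in

def purification_factor(spec_act_out, spec_act_in):
    return spec_act_out / spec_act_in

feed_activity, feed_protein = 1000.0, 50.0    # units, mg (illustrative)
strip_activity, strip_protein = 800.0, 10.3   # after backward extraction

rec = recovery(strip_activity, feed_activity)
pf = purification_factor(strip_activity / strip_protein,
                         feed_activity / feed_protein)
print(round(rec, 2), round(pf, 1))
```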

  12. PEACE: Pulsar Evaluation Algorithm for Candidate Extraction -- A software package for post-analysis processing of pulsar survey candidates

    CERN Document Server

    Lee, K J; Jenet, F A; Martinez, J; Dartez, L P; Mata, A; Lunsford, G; Cohen, S; Biwer, C M; Rohr, M; Flanigan, J; Walker, A; Banaszak, S; Allen, B; Barr, E D; Bhat, N D R; Bogdanov, S; Brazier, A; Camilo, F; Champion, D J; Chatterjee, S; Cordes, J; Crawford, F; Deneva, J; Desvignes, G; Ferdman, R D; Freire, P; Hessels, J W T; Karuppusamy, R; Kaspi, V M; Knispel, B; Kramer, M; Lazarus, P; Lynch, R; Lyne, A; McLaughlin, M; Ransom, S; Scholz, P; Siemens, X; Spitler, L; Stairs, I; Tan, M; van Leeuwen, J; Zhu, W W

    2013-01-01

Modern radio pulsar surveys produce a large volume of prospective candidates, the majority of which are polluted by human-created radio frequency interference or other forms of noise. Typically, large numbers of candidates need to be visually inspected in order to determine if they are real pulsars. This process can be labor intensive. In this paper, we introduce an algorithm called PEACE (Pulsar Evaluation Algorithm for Candidate Extraction) which improves the efficiency of identifying pulsar signals. The algorithm ranks the candidates based on a score function. Unlike popular machine-learning based algorithms, no prior training data sets are required. This algorithm has been applied to data from several large-scale radio pulsar surveys. Using the human-based ranking results generated by students in the Arecibo Remote Command Center programme, the statistical performance of PEACE was evaluated. It was found that PEACE ranked 68% of the student-identified pulsars within the top 0.17% of sorted candidates, 95% ...

  13. Joint Logistics Commanders’ Workshop on Post Deployment Software Support (PDSS) for Mission-Critical Computer Software. Volume 2. Workshop Proceedings.

    Science.gov (United States)

    1984-06-01

fit into the software life cycle? At what specific points is it employed? What is the relationship of IV&V to test, integration, QA, system… computer. It is recognized that organizational and social factors are an important part of the work environment. These factors must be considered in the… the system and the organizational and social setting (ELZER, P.F., 1979). In industrial engineering there has been considerable work on facilities

  14. Volumetric analysis of lung nodules in computed tomography (CT): comparison of two different segmentation algorithm softwares and two different reconstruction filters on automated volume calculation.

    Science.gov (United States)

    Christe, Andreas; Brönnimann, Alain; Vock, Peter

    2014-02-01

A precise detection of volume change allows for better estimation of the biological behavior of lung nodules. Postprocessing tools with automated detection, segmentation, and volumetric analysis of lung nodules may expedite radiological processes and give additional confidence to radiologists. The purpose was to compare two different postprocessing software algorithms (LMS Lung, Median Technologies; LungCARE®, Siemens) in CT volumetric measurement and to analyze the effect of a soft (B30) and a hard reconstruction filter (B70) on automated volume measurement. Between January 2010 and April 2010, 45 patients with a total of 113 pulmonary nodules were included. The CT exam was performed on a 64-row multidetector CT scanner (Somatom Sensation, Siemens, Erlangen, Germany) with the following parameters: collimation, 24 × 1.2 mm; pitch, 1.15; voltage, 120 kVp; reference tube current-time product, 100 mAs. Automated volumetric measurement of each lung nodule was performed with the two postprocessing algorithms based on the two reconstruction filters (B30 and B70). The average relative volume measurement difference (VME%) and the limits of agreement between the two methods were used for comparison. With soft reconstruction filters, the LMS system produced mean nodule volumes that were 34.1% larger than those of LungCARE® (P < 0.05). The volume measurement difference with soft filters (B30) was significantly larger than with hard filters (B70): 11.2% for LMS and 1.6% for LungCARE®, respectively (both P < 0.05). The inter-software difference was 13.6% for soft and 3.8% for hard filters, respectively (P < 0.05). There is substantial inter-software (LMS/LungCARE®) as well as intra-software variability (B30/B70) in lung nodule volume measurement; therefore, it is mandatory to use the same equipment with the same reconstruction filter for the follow-up of lung nodule volume.
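The comparison statistics used above, the average relative volume measurement difference (VME%) and the limits of agreement, can be computed as in a Bland-Altman analysis. A minimal sketch on synthetic paired volumes (the ~10% bias is an arbitrary choice, not the study's data), with the relative difference taken against the pair mean:

```python
import numpy as np

# Paired volume measurements of the same 113 nodules by two methods
rng = np.random.default_rng(3)
v_a = rng.uniform(50, 500, 113)                       # mm^3, method A
v_b = v_a * (1 + rng.normal(0.10, 0.05, v_a.size))    # method B reads higher

# Relative difference (%) against the pair mean, plus 95% limits of agreement
rel_diff = 100.0 * (v_b - v_a) / ((v_a + v_b) / 2.0)
mean_diff = float(rel_diff.mean())
sd = float(rel_diff.std(ddof=1))
loa = (mean_diff - 1.96 * sd, mean_diff + 1.96 * sd)
print(round(mean_diff, 1), [round(x, 1) for x in loa])
```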

  15. CrossTalk: The Journal of Defense Software Engineering. Volume 28, Number 2, March/April 2015

    Science.gov (United States)

    2015-04-01

Computational Mathematics Division of NIST. His current interests include software testing and evaluation of the uncertainty in outputs of computational… 1999, pp. 207-215. 17. Kemerer, C. F. (1987). “An empirical validation of software cost estimation models.” Communications of the ACM, Vol. 30, No…

  16. Main Trend Extraction Based on Irregular Sampling Estimation and Its Application in Storage Volume of Internet Data Center

    Science.gov (United States)

    Dou, Chao

    2016-01-01

The storage volume of an internet data center is a classical time series, and predicting it is very valuable for business. However, the storage volume series from a data center is always “dirty”: it contains noise, missing data, and outliers, so it is necessary to extract the main trend of the storage volume series before further prediction processing. In this paper, we propose an irregular sampling estimation method to extract the main trend of the time series, in which a Kalman filter is used to remove the “dirty” data; then cubic spline interpolation and an averaging method are used to reconstruct the main trend. The developed method is applied to the storage volume series of an internet data center. The experiment results show that the developed method can estimate the main trend of the storage volume series accurately and makes a great contribution to predicting the future volume value. 
 PMID:28090205
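The two-stage pipeline described above (Kalman filtering to reject "dirty" samples, then spline reconstruction of the trend on a regular grid) can be sketched as follows. The random-walk state model, 3-sigma innovation gate, and noise variances are assumptions for illustration, not the paper's tuned parameters.

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(4)
t = np.sort(rng.uniform(0, 100, 200))   # irregular sample times
true = 500 + 3.0 * t                    # growing storage volume (GB)
y = true + rng.normal(0, 5, t.size)
y[10::25] += 200                        # inject outliers ("dirty" data)

q, r = 25.0, 25.0                       # process / measurement variance (assumed)
x_est, p = y[0], r
kept_t, kept_x = [], []
for ti, yi in zip(t, y):
    p = p + q                           # predict (random-walk model)
    innov = yi - x_est
    if innov ** 2 < 9 * (p + r):        # 3-sigma gate rejects outliers
        k = p / (p + r)                 # Kalman gain
        x_est = x_est + k * innov
        p = (1 - k) * p
        kept_t.append(ti)
        kept_x.append(x_est)

# Reconstruct the main trend on a regular grid from the filtered samples
trend = CubicSpline(kept_t, kept_x)
grid = np.linspace(kept_t[0], kept_t[-1], 101)
main_trend = trend(grid)
print(round(float(main_trend[-1]), 1))
```

The recovered trend should track the underlying linear growth while the injected spikes are gated out before the spline ever sees them.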

  17. Main Trend Extraction Based on Irregular Sampling Estimation and Its Application in Storage Volume of Internet Data Center

    Directory of Open Access Journals (Sweden)

    Beibei Miao

    2016-01-01

Full Text Available The storage volume of an internet data center is a classical time series, and predicting it is very valuable for business. However, the storage volume series from a data center is always “dirty”: it contains noise, missing data, and outliers, so it is necessary to extract the main trend of the storage volume series before further prediction processing. In this paper, we propose an irregular sampling estimation method to extract the main trend of the time series, in which a Kalman filter is used to remove the “dirty” data; then cubic spline interpolation and an averaging method are used to reconstruct the main trend. The developed method is applied to the storage volume series of an internet data center. The experiment results show that the developed method can estimate the main trend of the storage volume series accurately and makes a great contribution to predicting the future volume value.

  18. CrossTalk. The Journal of Defense Software Engineering. Volume 23, Number 6, Nov/Dec 2010

    Science.gov (United States)

    2010-11-01

technological challenges that must be overcome. Challenges aren’t going to stop smart software professionals from developing and delivering quality software… procrastination to an art form) so I typically start writing the column about a day before it is due. There is nothing like sheer stress and a looming

  19. The extraction of bitumen from western oil sands: Volume 1. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Oblad, A.G.; Dahlstrom, D.A.; Deo, M.D.; Fletcher, J.V.; Hanson, F.V.; Miller, J.D.; Seader, J.D.

    1997-11-26

The program is composed of 20 projects, of which 17 are laboratory bench or laboratory pilot scale processes or computer process simulations that are performed in existing facilities on the University of Utah campus in northeast Salt Lake City. These tasks are: (1) coupled fluidized-bed bitumen recovery and coked sand combustion; (2) water-based recovery of bitumen; (3) oil sand pyrolysis in a continuous rotary kiln reactor; (4) oil sand pyrolysis in a large diameter fluidized bed reactor; (5) oil sand pyrolysis in a small diameter fluidized bed reactor; (6) combustion of spent sand in a transport reactor; (7) recovery and upgrading of oil sand bitumen using solvent extraction methods; (8) fixed-bed hydrotreating of Uinta Basin bitumens and bitumen-derived hydrocarbon liquids; (9) ebullated-bed hydrotreating of bitumen and bitumen-derived liquids; (10) bitumen upgrading by hydropyrolysis; (11) evaluation of Utah's major oil sand deposits for the production of asphalt, high-energy jet fuels and other specialty products; (12) characterization of the bitumens and reservoir rocks from the Uinta Basin oil sand deposits; (13) bitumen upgrading pilot plant recommendations; (14) liquid-solid separation and fine tailings thickening; (15) in-situ production of heavy oil from Uinta Basin oil sand deposits; (16) oil sand research and development group analytical facility; and (17) process economics. This volume contains an executive summary and reports for five of these projects. 137 figs., 49 tabs.

  20. The extraction of bitumen from western oil sands: Volume 2. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Oblad, A.G.; Dahlstrom, D.A.; Deo, M.D.; Fletcher, J.V.; Hanson, F.V.; Miller, J.D.; Seader, J.D.

    1997-11-26

The program is composed of 20 projects, of which 17 are laboratory bench or laboratory pilot scale processes or computer process simulations that are performed in existing facilities on the University of Utah campus in northeast Salt Lake City. These tasks are: (1) coupled fluidized-bed bitumen recovery and coked sand combustion; (2) water-based recovery of bitumen; (3) oil sand pyrolysis in a continuous rotary kiln reactor; (4) oil sand pyrolysis in a large diameter fluidized bed reactor; (5) oil sand pyrolysis in a small diameter fluidized bed reactor; (6) combustion of spent sand in a transport reactor; (7) recovery and upgrading of oil sand bitumen using solvent extraction methods; (8) fixed-bed hydrotreating of Uinta Basin bitumens and bitumen-derived hydrocarbon liquids; (9) ebullated-bed hydrotreating of bitumen and bitumen-derived liquids; (10) bitumen upgrading by hydropyrolysis; (11) evaluation of Utah's major oil sand deposits for the production of asphalt, high-energy jet fuels and other specialty products; (12) characterization of the bitumens and reservoir rocks from the Uinta Basin oil sand deposits; (13) bitumen upgrading pilot plant recommendations; (14) liquid-solid separation and fine tailings thickening; (15) in-situ production of heavy oil from Uinta Basin oil sand deposits; (16) oil sand research and development group analytical facility; and (17) process economics. This volume contains reports on nine of these projects, references, and a bibliography. 351 refs., 192 figs., 65 tabs.

  1. The extraction of negative carbon ions from a volume cusp ion source

    Science.gov (United States)

    Melanson, Stephane; Dehnel, Morgan; Potkins, Dave; McDonald, Hamish; Hollinger, Craig; Theroux, Joseph; Martin, Jeff; Stewart, Thomas; Jackle, Philip; Philpott, Chris; Jones, Tobin; Kalvas, Taneli; Tarvainen, Olli

    2017-08-01

Acetylene and carbon dioxide gases are used in a filament-powered volume-cusp ion source to produce negative carbon ions for carbon implantation in gettering applications. The beam was extracted at an energy of 25 keV and its composition was analyzed with a spectrometer system consisting of a 90° dipole magnet and a pair of slits. It is found that acetylene produces mostly C2- ions (up to 92 µA), while carbon dioxide produces mostly O- with only trace amounts of C-. Maximum C2- current was achieved with 400 W of arc power, and the beam current and composition were found to be highly dependent on the pressure in the source. The beam properties as a function of source settings are analyzed, and plasma properties are measured with a Langmuir probe. Finally, we describe testing of a new RF H- ion source, found to produce more than 6 mA of CW H- beam.
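The dipole spectrometer separates the ion species by mass: at fixed energy and field, the non-relativistic bending radius is r = p/(qB) with p = sqrt(2mE). A small sketch for the 25 keV singly charged negative ions involved; the 0.3 T field is an assumed value, not taken from the paper.

```python
import math

U = 1.66054e-27   # kg per unified atomic mass unit
Q = 1.60218e-19   # elementary charge, C
E = 25e3 * Q      # 25 keV in joules
B = 0.3           # dipole field in tesla (assumed)

def radius_m(mass_u):
    """Bending radius r = p/(qB) for a singly charged ion of given mass."""
    p = math.sqrt(2 * mass_u * U * E)   # momentum, kg·m/s
    return p / (Q * B)

for name, m in [("C-", 12), ("O-", 16), ("C2-", 24)]:
    print(name, round(radius_m(m), 3))
```

Since r scales as sqrt(m), C2- (24 u) bends on a radius sqrt(2) times that of C- (12 u), which is what lets the slit pair pick out one species at a time.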

  2. Target volume delimitation with PET-CT in radiotherapy planning: A GDCM and ROOT based software implementation

    OpenAIRE

    Amaya Espinosa, Helman Alirio

    2014-01-01

A computational algorithm based on the Canny edge detector was developed for use in PET-CT and CT image processing. The algorithm is software built with ROOT and GDCM libraries. GDCM and ROOT are frameworks developed at CERN and are licensed as free software. The developed software showed better delimitation of a simulated hyper-uptake region in an agar phantom than the thresholding method, both being applied ...
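The gradient-based core of the Canny approach (Gaussian smoothing, gradient magnitude, thresholding) can be sketched as below. This is a simplified illustration in Python on a synthetic "hyper-uptake" phantom, not the ROOT/GDCM implementation, and it omits Canny's non-maximum suppression and hysteresis linking.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Synthetic phantom: a uniform background with a circular hot region
img = np.zeros((64, 64))
yy, xx = np.indices(img.shape)
img[(yy - 32) ** 2 + (xx - 32) ** 2 <= 12 ** 2] = 1.0
img += np.random.default_rng(5).normal(0, 0.05, img.shape)

# Gaussian-smoothed first derivatives, then gradient magnitude
gx = gaussian_filter(img, 2.0, order=(0, 1))
gy = gaussian_filter(img, 2.0, order=(1, 0))
mag = np.hypot(gx, gy)

# Keep strong edges; they should ring the hot region at radius ~12
strong = mag > 0.5 * mag.max()
edge_radius = np.hypot(yy[strong] - 32, xx[strong] - 32)
print(round(float(edge_radius.mean()), 1))
```

Thresholding the image intensity directly would place the contour wherever the cutoff happens to fall on the uptake profile; an edge detector instead locks onto the gradient maximum, which is the delimitation advantage the abstract reports.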

  3. CrossTalk: The Journal of Defense Software Engineering. Volume 23, Number 5, September/October 2010

    Science.gov (United States)

    2010-10-01

network services that must be provided during a week-long exercise (and the points to be deducted if these services were either not operational or… prioritizing security test cases. • Serving as attack templates for penetration testing and objective persona descriptors for red team penetration… and ser… Two Initiatives for Disseminating Software Assurance Knowledge: Education in software assurance (SwA) is an essential element in the effort

  4. PEACE: pulsar evaluation algorithm for candidate extraction - a software package for post-analysis processing of pulsar survey candidates

    Science.gov (United States)

    Lee, K. J.; Stovall, K.; Jenet, F. A.; Martinez, J.; Dartez, L. P.; Mata, A.; Lunsford, G.; Cohen, S.; Biwer, C. M.; Rohr, M.; Flanigan, J.; Walker, A.; Banaszak, S.; Allen, B.; Barr, E. D.; Bhat, N. D. R.; Bogdanov, S.; Brazier, A.; Camilo, F.; Champion, D. J.; Chatterjee, S.; Cordes, J.; Crawford, F.; Deneva, J.; Desvignes, G.; Ferdman, R. D.; Freire, P.; Hessels, J. W. T.; Karuppusamy, R.; Kaspi, V. M.; Knispel, B.; Kramer, M.; Lazarus, P.; Lynch, R.; Lyne, A.; McLaughlin, M.; Ransom, S.; Scholz, P.; Siemens, X.; Spitler, L.; Stairs, I.; Tan, M.; van Leeuwen, J.; Zhu, W. W.

    2013-07-01

    Modern radio pulsar surveys produce a large volume of prospective candidates, the majority of which are polluted by human-created radio frequency interference or other forms of noise. Typically, large numbers of candidates need to be visually inspected in order to determine if they are real pulsars. This process can be labour intensive. In this paper, we introduce an algorithm called Pulsar Evaluation Algorithm for Candidate Extraction (PEACE) which improves the efficiency of identifying pulsar signals. The algorithm ranks the candidates based on a score function. Unlike popular machine-learning-based algorithms, no prior training data sets are required. This algorithm has been applied to data from several large-scale radio pulsar surveys. Using the human-based ranking results generated by students in the Arecibo Remote Command Center programme, the statistical performance of PEACE was evaluated. It was found that PEACE ranked 68 per cent of the student-identified pulsars within the top 0.17 per cent of sorted candidates, 95 per cent within the top 0.34 per cent and 100 per cent within the top 3.7 per cent. This clearly demonstrates that PEACE significantly increases the pulsar identification rate by a factor of about 50 to 1000. To date, PEACE has been directly responsible for the discovery of 47 new pulsars, 5 of which are millisecond pulsars that may be useful for pulsar timing based gravitational-wave detection projects.
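Score-function ranking of the kind PEACE performs can be sketched as follows: each candidate gets a score from its features, the list is sorted best-first, and performance is reported as the fraction of known pulsars landing in the top slice of the sorted list. The feature set and score function below are toy illustrations, not PEACE's actual quality factors.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000                         # survey candidates
is_pulsar = np.zeros(n, dtype=bool)
is_pulsar[rng.choice(n, 100, replace=False)] = True

# Toy features: pulsars tend to have higher S/N and broader-band signals
snr = rng.exponential(2.0, n) + np.where(is_pulsar, 15.0, 0.0)
broadband = rng.uniform(0, 1, n) + np.where(is_pulsar, 0.8, 0.0)
score = np.log(snr) + broadband     # illustrative score function

order = np.argsort(-score)          # best candidates first
rank_of_pulsars = np.nonzero(is_pulsar[order])[0]

top_frac = 0.005                    # examine only the top 0.5% of the list
found = float((rank_of_pulsars < n * top_frac).mean())
print(round(found, 2))
```

The practical payoff is exactly the abstract's factor-of-50-to-1000 claim: instead of eyeballing all n candidates, a reviewer inspects only the short sorted head that contains nearly all the real pulsars.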

  5. Software Reviews.

    Science.gov (United States)

    Science and Children, 1990

    1990-01-01

    Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor"; and "Geology Search." Cost, quality, hardware, and…

  6. Architecture and data processing alternatives for the TSE computer. Volume 2: Extraction of topological information from an image by the Tse computer

    Science.gov (United States)

    Jones, J. R.; Bodenheimer, R. E.

    1976-01-01

    A simple programmable Tse processor organization and the arithmetic operations necessary for extraction of the desired topological information are described. Hardware additions to this organization are discussed along with trade-offs peculiar to the Tse computing concept. An improved organization is presented along with the complementary software for the various arithmetic operations. The performance of the two organizations is compared in terms of speed, power, and cost. Software routines developed to extract the desired information from an image are included.

  7. Influence of Software Tool and Methodological Aspects of Total Metabolic Tumor Volume Calculation on Baseline [18F]FDG PET to Predict Survival in Hodgkin Lymphoma.

    Directory of Open Access Journals (Sweden)

    Salim Kanoun

    To investigate the respective influence of the software tool and the total metabolic tumor volume (TMTV0) calculation method on the prognostic stratification of baseline 2-deoxy-2-[18F]fluoro-D-glucose positron emission tomography ([18F]FDG-PET) in newly diagnosed Hodgkin lymphoma (HL). 59 patients with newly diagnosed HL were retrospectively included. [18F]FDG-PET was performed before any treatment. Four sets of TMTV0 were calculated with Beth Israel (BI) software: based on an absolute threshold selecting voxels with standardized uptake value (SUV) > 2.5 (TMTV02.5), applying a per-lesion threshold of 41% of the SUVmax (TMTV041), and using a per-patient adapted threshold based on the SUVmax of the liver (>125% and >140% of the liver background; TMTV0125 and TMTV0140). TMTV041 was also determined with commercial software for comparison of software tools. ROC curves were used to determine the optimal threshold for each TMTV0 to predict treatment failure. Median follow-up was 39 months. There was an excellent correlation between TMTV041 determined with BI and with the commercial software (r = 0.96, p < 0.0001). The median TMTV0 values for TMTV041, TMTV02.5, TMTV0125 and TMTV0140 were respectively 160 ml (used as reference), 210 ml ([28;154], p = 0.005), 183 ml ([-4;114], p = 0.06) and 143 ml ([-58;64], p = 0.9). The respective optimal TMTV0 thresholds and areas under the curve (AUC) for prediction of progression-free survival (PFS) were: 313 ml and 0.70, 432 ml and 0.68, 450 ml and 0.68, 330 ml and 0.68. There was no significant difference between ROC curves. A high TMTV0 value was predictive of poor PFS with all methodologies: 4-year PFS was 83% vs 42% (p = 0.006) for TMTV02.5, 83% vs 41% (p = 0.003) for TMTV041, 85% vs 40% (p < 0.001) for TMTV0125 and 83% vs 42% (p = 0.004) for TMTV0140. In newly diagnosed HL, baseline metabolic tumor volume values were significantly influenced by the choice of the method used for volume determination; however, no significant differences were found in their prognostic value.
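The absolute and per-lesion relative thresholding strategies compared above can be sketched as follows (voxel values, field names, and the voxel volume are illustrative placeholders, not study data):

```python
# Total metabolic tumor volume (TMTV) under the two threshold families
# compared above: an absolute SUV cutoff (SUV > 2.5) and a per-lesion
# relative cutoff (41% of each lesion's SUVmax).

VOXEL_ML = 0.064  # hypothetical voxel volume in ml (4 x 4 x 4 mm)

def tmtv_absolute(suv_voxels, cutoff=2.5):
    """Volume of all voxels above an absolute SUV threshold."""
    return sum(1 for v in suv_voxels if v > cutoff) * VOXEL_ML

def tmtv_relative(lesions, fraction=0.41):
    """Sum of per-lesion volumes above fraction * lesion SUVmax."""
    total = 0.0
    for voxels in lesions:
        cutoff = fraction * max(voxels)
        total += sum(1 for v in voxels if v > cutoff) * VOXEL_ML
    return total

lesion1 = [1.0, 2.0, 3.0, 8.0, 10.0]   # SUV per voxel, lesion 1
lesion2 = [0.5, 2.6, 4.0, 5.0]         # SUV per voxel, lesion 2

print(tmtv_absolute(lesion1 + lesion2))   # 6 voxels pass the 2.5 cutoff
print(tmtv_relative([lesion1, lesion2]))  # 5 voxels pass 41% of SUVmax
```

The two methods select different voxel sets from the same data, which is why the abstract reports method-dependent median TMTV0 values even though the prognostic cut-offs behave similarly.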

  8. CrossTalk: The Journal of Defense Software Engineering. Volume 23, Number 3, May/June 2010

    Science.gov (United States)

    2010-06-01

    International Cryptology Conference Santa Barbara, CA www.iacr.org/conferences/crypto2010 COMING EVENTS: Please submit coming events that are of interest to…practice principles to the development and maintenance of software. CrossTalk mainstay Capers Jones—along with other past authors Christof Ebert, Donald J

  9. CrossTalk: The Journal of Defense Software Engineering. Volume 23, Number 2, March/April 2010

    Science.gov (United States)

    2010-04-01

    occurred. Integrity (I) is the assurance that information is not altered in an unauthorized fashion. For example, integrity is violated when an employee…2010west May 24-27 Siemens PLM Connection 2010 Nashville, TN http://event.plmworld.org June 6-10 IBM Rational Software Conference Orlando, FL http://www

  10. CrossTalk: The Journal of Defense Software Engineering. Volume 22, Number 5, July/August 2009

    Science.gov (United States)

    2009-08-01

    University of Southern California, 2008. 4. DoD. MIL-HDBK-881A. Washington, D.C.: DoD, 2005. 5. Primavera Pertmaster Software. “Pertmaster Tutorial.” Primavera Pertmaster, 2008. 6. van Dorp, Johan R., et al. “A Risk Management Procedure for the Washington State Ferries.” Risk Analysis

  11. An Automated Method for Extracting Spatially Varying Time-Dependent Quantities from an ALEGRA Simulation Using VisIt Visualization Software

    Science.gov (United States)

    2014-07-01

    Visualization software such as VisIt presents an alternative method to examine data through the use of EXODUS databases.3 In addition, VisIt...extracting transient quantities that vary spatially from an EXODUS database using a VisIt macro written in the Python programming language. 2...Graphics; Sandia National Laboratories: Albuquerque, NM, September 1991. Revised April 1994. 3 Schoof, L. A.; Yarberry, V. R. EXODUS II: A Finite

  12. CrossTalk: The Journal of Defense Software Engineering. Volume 23, Number 1, Jan/Feb 2010

    Science.gov (United States)

    2010-02-01

    sometimes daily, basis. Lean software development, in particular, is well-defined [1]. On the downside, simultaneous multiple Lean efforts (Kaizen Events…Performance Excellence Find out how a CMMI-based framework—utilizing Lean Thinking, Six Sigma, and the Information Technology Infrastructure Library…integration of the CMMI framework with other improvement approaches. We have now, along with our customers, integrated Lean Thinking, the ITIL framework

  13. CrossTalk: The Journal of Defense Software Engineering. Volume 26, Number 3, May-June 2013

    Science.gov (United States)

    2013-06-01

    University Gauthier Fanmuy, ADN Abstract. Very Small Entities (VSEs) developing systems or software are very important to the military since the…Dublin, Ireland E-mail: roconnor@computing.dcu.ie Gauthier Fanmuy is a Department Director at ADN <http://www.adneurope.com>, a Systems Engineering…Engineering, <http://www.afis.fr> and AFIS representative at AFNOR. He is Associate Technical Director for Industry in INCOSE. ADN Systems Engineering

  14. CrossTalk: The Journal of Defense Software Engineering. Volume 27, Number 5, September/October 2014

    Science.gov (United States)

    2014-10-01

    Importance of Systems of Systems Everything we do these days involves system and software technology: cars, planes, banks, restaurants, stores…most of their services to make the process more convenient for their citizens. It is estimated that more than 60% of Internet users interact with…year [1]. Online shopping has become more prevalent and convenient to customers than ever. In 2011, it is estimated that more than a trillion U.S

  15. GENII (Generation II): The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.

  16. CrossTalk: The Journal of Defense Software Engineering. Volume 25, Number 1, January/February 2012

    Science.gov (United States)

    2012-02-01

    FEB 2012 2. REPORT TYPE 3. DATES COVERED 00-00-2012 to 00-00-2012 4. TITLE AND SUBTITLE CrossTalk: The Journal of Defense Software Engineering…4 15 19 24 31 High Maturity - The Payoff Departments Cover Design by Kent Bingham 3 From the Sponsor 38 Upcoming Events 39 BackTalk CrossTalk…Lake City • Utah Jazz Basketball • Three Minor League Baseball Teams • One Hour from 12 Ski Resorts • Minutes from Hunting, Fishing, Water Skiing

  17. CrossTalk. The Journal of Defense Software Engineering. Volume 25, Number 1, Jan/Feb 2012

    Science.gov (United States)

    2012-01-01

    JAN 2012 2. REPORT TYPE 3. DATES COVERED 00-01-2012 to 00-02-2012 4. TITLE AND SUBTITLE CrossTalk The Journal of Defense Software Engineering…traditional DoD mindset. by Mary Ann Lapham 9 4 15 19 24 31 High Maturity - The Payoff Departments Cover Design by Kent Bingham 3 From the Sponsor…Location, Location, Location: • 25 minutes from Salt Lake City • Utah Jazz Basketball • Three Minor League Baseball Teams • One Hour from 12 Ski Resorts

  18. CrossTalk: The Journal of Defense Software Engineering. Volume 24, Number 2, March/April 2011

    Science.gov (United States)

    2011-04-01

    heights. We’ve seen tangible examples of where “security” has failed but “Rugged” has borne fruit. Buyers are seeking more Rugged infrastructure…lessons learned to their own FST programs. In this way, testers can more readily determine which types of tests are likely to be more fruitful and…dinner, or dessert?) “What is the name of your favorite sports team?” (The last home team to win a championship.) As a software engineer, I’m

  19. Simple and efficient method for region of interest value extraction from picture archiving and communication system viewer with optical character recognition software and macro program.

    Science.gov (United States)

    Lee, Young Han; Park, Eun Hae; Suh, Jin-Suck

    2015-01-01

    The objectives were: 1) to introduce a simple and efficient method for extracting region of interest (ROI) values from a Picture Archiving and Communication System (PACS) viewer using optical character recognition (OCR) software and a macro program, and 2) to evaluate the accuracy of this method on a PACS workstation. The module was designed to extract the ROI values shown on PACS images and was built with open-source OCR software and an open-source macro program. The principal steps are as follows: (1) capture the region showing the ROI values as a graphic file for OCR; (2) recognize the text from the captured image with the OCR software; (3) perform error correction; (4) extract the values, including area, average, standard deviation, maximum, and minimum, from the text; (5) reformat the values into temporary strings with tabs; and (6) paste the temporary strings into the spreadsheet. These steps were repeated for each ROI. The accuracy of the module was evaluated on 1040 recognitions from 280 randomly selected ROIs of magnetic resonance images. The input times of ROIs were compared between the conventional manual method and this module-assisted input method. The module for extracting ROI values operated successfully using the OCR and macro programs. The area, average, standard deviation, maximum, and minimum values could be recognized and error-corrected with the AutoHotkey-coded module. The average input times using the conventional method and the proposed module-assisted method were 34.97 seconds and 7.87 seconds, respectively. A simple and efficient method for ROI value extraction was developed with open-source OCR software and a macro program. The module extracts the various ROI values accurately, and could be applied to the next generation of PACS or to existing PACS installations that have not yet been upgraded.
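Steps (3)-(5) of the module can be sketched as follows; the OCR text, field labels, and the error-correction table are illustrative, since the actual PACS viewer's output format is not shown in the abstract:

```python
import re

# Take the OCR text for one ROI, correct common character-recognition
# errors inside the numbers, extract the named numeric fields, and emit
# a tab-separated row ready to paste into a spreadsheet.

OCR_FIXES = str.maketrans({"O": "0", "l": "1", "S": "5"})  # common OCR confusions

def roi_row(ocr_text):
    values = {}
    for label in ("Area", "Avg", "SD", "Max", "Min"):
        # allow misread characters inside the number, then correct them
        m = re.search(rf"{label}\s*[:=]\s*(-?[0-9OlS]+(?:\.[0-9OlS]+)?)", ocr_text)
        if m:
            values[label] = m.group(1).translate(OCR_FIXES)
    return "\t".join(values.get(k, "") for k in ("Area", "Avg", "SD", "Max", "Min"))

text = "Area: 142.5  Avg: 87.3  SD: 12.1  Max: 134  Min: 5O"  # 'O' misread for '0'
print(roi_row(text))
```

Repeating this per ROI and appending each row reproduces the tab-delimited stream the abstract pastes into the spreadsheet.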

  20. A Study of the Value of Information and the Effect on Value of Intermediary Organizations, Timeliness of Services & Products, and Comprehensiveness of the EDB. Volume 1: The Value of Libraries as an Intermediary Information Service; Volume 2: The Value of the Network Energy Software Center and the Radiation Shielding Information Center; Volume 3: The Effects of Timeliness and Comprehensiveness on Value.

    Science.gov (United States)

    King Research, Inc., Rockville, MD.

    This document reports in three volumes the results of a series of surveys designed to: (1) determine what contribution intermediary information transfer organizations such as libraries and information analysis centers make to the value of information; (2) assess the value of two somewhat different software information analysis centers and the…

  1. Three-dimensional binding sites volume assessment during cardiac pacing lead extraction

    Directory of Open Access Journals (Sweden)

    Bich Lien Nguyen

    2015-07-01

    Conclusions: Real-time 3D binding-site assessment is feasible and improves transvenous lead extraction outcomes. Its role as complementary information requires extensive validation, and it might be beneficial for a tailored strategy.

  2. [An automatic extraction algorithm for individual tree crown projection area and volume based on 3D point cloud data].

    Science.gov (United States)

    Xu, Wei-Heng; Feng, Zhong-Ke; Su, Zhi-Fang; Xu, Hui; Jiao, You-Quan; Deng, Ou

    2014-02-01

    Tree crown projection area and crown volume are important parameters for the estimation of biomass, tridimensional green biomass, and other forestry applications. Conventional measurements of crown projection area and crown volume produce large errors in practical situations involving complicated crown structures or differing morphological characteristics, and their accuracy is difficult to validate by conventional means. To address these problems, and to allow crown projection area and crown volume to be extracted automatically by a computer program, this paper proposes an automatic, non-contact measurement based on a terrestrial three-dimensional laser scanner (FARO Photon 120), using a convex hull algorithm on plane-scattered data points together with a slice segmentation and accumulation algorithm to calculate the tree crown projection area. It was implemented with VC++ 6.0 and Matlab 7.0. The experiments covered 22 common tree species of Beijing, China. The results show that the correlation coefficient of the crown projection area between the new method (Av) and the conventional method (A4) reaches 0.964. From the 3D LIDAR point cloud data of an individual tree, the crown structure was reconstructed rapidly and with high accuracy, and the crown projection area and volume were extracted by this automatic non-contact method, which can provide a reference for tree crown structure studies and is worth popularizing in the field of precision forestry.
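The plane-scattered convex hull step can be sketched as follows: project crown points onto the ground plane, build the 2D convex hull, and take its area. A pure-Python sketch with illustrative points (a real crown would have thousands of LIDAR returns):

```python
# Crown projection area from scattered crown points: 2D convex hull
# (Andrew's monotone chain) followed by the shoelace area formula.

def convex_hull(points):
    """Monotone chain; returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area(hull):
    """Shoelace formula for the area of a simple polygon."""
    n = len(hull)
    s = sum(hull[i][0] * hull[(i + 1) % n][1] - hull[(i + 1) % n][0] * hull[i][1]
            for i in range(n))
    return abs(s) / 2.0

# (x, y) ground-plane projection of crown points, metres
crown_xy = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1.5)]  # interior point is ignored
print(polygon_area(convex_hull(crown_xy)))  # 12.0
```

The crown volume estimate in the paper stacks this idea: the hull area is computed per horizontal slice and the slice volumes are accumulated.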

  3. High-volume extraction of nucleic acids by magnetic bead technology for ultrasensitive detection of bacteria in blood components.

    Science.gov (United States)

    Störmer, Melanie; Kleesiek, Knut; Dreier, Jens

    2007-01-01

    Nucleic acid isolation, the most technically demanding and laborious procedure performed in molecular diagnostics, harbors the potential for improvements in automation. A recent development is the use of magnetic beads covered with nucleic acid-binding matrices. We adapted this technology with a broad-range 23S rRNA real-time reverse transcription (RT)-PCR assay for fast and sensitive detection of bacterial contamination of blood products. We investigated different protocols for an automated high-volume extraction method based on magnetic-separation technology for the extraction of bacterial nucleic acids from platelet concentrates (PCs). We added 2 model bacteria, Staphylococcus epidermidis and Escherichia coli, to a single pool of apheresis-derived, single-donor platelets and assayed the PCs by real-time RT-PCR analysis with an improved primer-probe system and locked nucleic acid technology. Co-amplification of human β2-microglobulin mRNA served as an internal control (IC). We used probit analysis to calculate the minimum concentration of bacteria that would be detected with 95% confidence. For automated magnetic bead-based extraction technology with the real-time RT-PCR, the 95% detection limit was 29 × 10^3 colony-forming units (CFU)/L for S. epidermidis and 22 × 10^3 CFU/L for E. coli. No false-positive results occurred, either due to nucleic acid contamination of reagents or externally during testing of 1030 PCs. High-volume nucleic acid extraction improved the detection limit of the assay. The improvement of the primer-probe system and the integration of an IC make the RT-PCR assay appropriate for bacteria screening of platelets.
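Probit analysis models the hit rate of the assay as a normal CDF of log concentration; the 95% detection limit is the concentration where the fitted curve reaches 0.95. A sketch of that last step (the probit coefficients below are hypothetical, not the study's fit):

```python
from statistics import NormalDist

# Probit model: P(positive result) = Phi(a + b * log10(c)) for bacteria
# concentration c in CFU/L. Given fitted coefficients a and b, invert the
# model at p = 0.95 to obtain the 95% detection limit.

def detection_limit(a, b, p=0.95):
    """Concentration at which the probit model predicts hit rate p."""
    z = NormalDist().inv_cdf(p)   # ~1.645 for p = 0.95
    return 10 ** ((z - a) / b)

a, b = -6.0, 1.7                  # hypothetical probit fit on log10(CFU/L)
lod95 = detection_limit(a, b)
print(f"95% detection limit: {lod95:.0f} CFU/L")
```

The same inversion at p = 0.5 gives the median detectable concentration, which is always below the 95% limit.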

  4. H- extraction from electron-cyclotron-resonance-driven multicusp volume source operated in pulsed mode

    Science.gov (United States)

    Svarnas, P.; Bacal, M.; Auvray, P.; Béchu, S.; Pelletier, J.

    2006-03-01

    H2 microwave (2.45 GHz) pulsed plasma is produced from seven elementary electron cyclotron resonance sources installed in the magnetic multipole chamber "Camembert III" (École Polytechnique, Palaiseau), from which H- extraction takes place. The extracted negative-ion and electron currents are studied through electrical measurements, and the plasma parameters by means of an electrostatic probe, under various experimental conditions. The role of the plasma electrode bias and the discharge duty cycle in the extraction process is emphasized. The gas breakdown at the beginning of every pulse gives rise to variations of the characteristic plasma parameters in comparison with those established later in the pulse, where the electron temperature, the plasma potential, and the floating potential converge to the values obtained for a continuous plasma. The electron density is significantly enhanced in the pulsed mode.

  5. Crest lines extraction in volume 3D medical images : a multi-scale approach

    OpenAIRE

    Monga, Olivier; Lengagne, Richard; Deriche, Rachid

    1994-01-01

    Projet SYNTIM; Recently, we have shown that the differential properties of the surfaces represented by 3D volume images can be recovered using their partial derivatives. For instance, the crest lines can be characterized by the first, second and third partial derivatives of the grey level function $I(x,y,z)$. In this paper, we show that: - the computation of the partial derivatives of an image can be improved using recursive filters which approximate the Gaussian filter, - a multi-scale app…

  6. Automatic extraction of forward stroke volume using dynamic 11C-acetate PET/CT

    DEFF Research Database (Denmark)

    Harms, Hans; Tolbod, Lars Poulsen; Hansson, Nils Henrik;

    …potentially introducing bias if measured with a separate modality. The aim of this study was to develop and validate methods for automatically extracting FSV directly from the dynamic PET used for measuring oxidative metabolism. Methods: 16 subjects underwent a dynamic 27 min PET scan on a Siemens Biograph TruePoint 64 PET/CT scanner after bolus injection of 399±27 MBq of 11C-acetate. The LV-aortic time-activity curve (TAC) was extracted automatically from dynamic PET data using cluster analysis. The first-pass peak was derived by automatic extrapolation of the down-slope of the TAC. FSV was then calculated as the injected dose divided by the product of heart rate and the area under the curve of the first-pass peak. Gold standard FSV was measured in the left ventricular outflow tract by cardiovascular magnetic resonance using phase-contrast velocity mapping within two weeks of PET imaging. Results…
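The FSV relation in the abstract (injected dose divided by heart rate times the first-pass AUC) can be sketched as follows; the TAC samples, dose, and heart rate are illustrative placeholders, not study data:

```python
# Forward stroke volume:
#   FSV = injected dose / (heart rate * AUC of the first-pass peak)
# Units must cancel: dose in MBq, TAC in MBq/ml, time in minutes and
# heart rate in beats/min give FSV in ml per beat.

def trapezoid_auc(times, values):
    """Area under the curve by the trapezoidal rule."""
    return sum((t1 - t0) * (v0 + v1) / 2.0
               for t0, t1, v0, v1 in zip(times, times[1:], values, values[1:]))

def forward_stroke_volume(dose_mbq, heart_rate_bpm, times_min, tac_mbq_per_ml):
    auc = trapezoid_auc(times_min, tac_mbq_per_ml)  # MBq/ml * min
    return dose_mbq / (heart_rate_bpm * auc)        # ml per beat

times = [0.0, 0.1, 0.2, 0.3, 0.4]   # minutes
tac = [0.0, 0.4, 0.8, 0.4, 0.0]     # MBq/ml, isolated first-pass peak
print(forward_stroke_volume(400.0, 60.0, times, tac))  # about 41.7 ml
```

In the study the peak is first isolated by extrapolating the TAC down-slope, so recirculating activity does not inflate the AUC and shrink the FSV estimate.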

  7. Large volume TENAX® extraction of the bioaccessible fraction of sediment-associated organic compounds for a subsequent effect-directed analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schwab, K.; Brack, W. [UFZ - Helmholtz Centre for Environmental Research, Leipzig (Germany). Dept. of Effect-Directed Analysis

    2007-06-15

    Background, Aim and Scope: Effect-directed analysis (EDA) is a powerful tool for the identification of key toxicants in complex environmental samples. In most cases, EDA is based on total extraction of organic contaminants, leading to an erroneous prioritization with regard to hazard and risk. Bioaccessibility-directed extraction aims to discriminate between contaminants that take part in partitioning between sediment and biota in a relevant time frame and those that are enclosed in structures that do not allow rapid desorption. Standard protocols for targeted extraction of the rapidly desorbing, and thus bioaccessible, fraction using TENAX® are based on only small amounts of sediment. In order to obtain sufficient amounts of extract for subsequent biotesting, fractionation, and structure elucidation, a large-volume extraction technique needs to be developed, applying one selected extraction time and excluding toxic procedural blanks. Materials and Methods: The desorption behaviour of sediment contaminants was determined by consecutive solid-solid extraction of sediment using TENAX®, fitting a tri-compartment model to the experimental data. The time needed to remove the rapidly desorbing fraction, t_rap, was calculated in order to select a fixed extraction time for single extraction procedures. Up-scaling by about a factor of 100 provided a large-volume extraction technique for EDA. Reproducibility and comparability to the small-volume approach were proved. Blanks of the respective TENAX® mass were investigated using Scenedesmus vacuolatus and Artemia salina as test organisms. Results: Desorption kinetics showed that 12 to 30% of sediment-associated pollutants are available for rapid desorption. t_rap is compound dependent and covers a range of 2 to 18 h. On that basis, a fixed extraction time of 24 h was selected. Validation of the large-volume approach was done by comparison to the small-volume method and by reproducibility. The large volume showed a good…
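The tri-compartment model fitted above treats the sediment-bound load as three first-order pools (rapidly, slowly, and very slowly desorbing). A minimal sketch with illustrative pool fractions and rate constants (not the study's fitted values):

```python
import math

# Fraction of sediment-associated contaminant remaining at time t when
# three pools each empty first-order, plus the time t_rap needed to
# strip a chosen share of the rapidly desorbing pool.

def fraction_remaining(t, pools):
    """pools: list of (fraction, rate_per_hour); fractions sum to 1."""
    return sum(f * math.exp(-k * t) for f, k in pools)

def t_rap(k_rap, removed=0.99):
    """Time (h) to strip a given share of the rapidly desorbing pool."""
    return -math.log(1.0 - removed) / k_rap

pools = [(0.25, 0.4), (0.35, 0.02), (0.40, 0.001)]  # rapid, slow, very slow
print(round(fraction_remaining(24.0, pools), 3))  # fraction left after 24 h
print(round(t_rap(0.4), 1))  # hours to remove 99% of the rapid pool
```

Picking one fixed extraction time slightly above the largest compound-specific t_rap, as the study does with 24 h, strips essentially the whole rapid pool while leaving the slow pools largely in place.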

  8. PhrasExt: Design of a Work-Study-Based Chunk Extraction System

    Institute of Scientific and Technical Information of China (English)

    熊秋平; 管新潮

    2011-01-01

    This paper applies the "work study" method of classical industrial engineering to the development of chunk-extraction software for a bilingual parallel corpus, and explains how the two basic work-study tools, 5W1H and ECRS, are used in building the chunk extractor. Experimental results show that the extraction software PhrasExt clearly outperforms traditional chunk-extraction software.

  9. Automatic extraction of myocardial mass and volumes using parametric images from dynamic non-gated PET

    DEFF Research Database (Denmark)

    Harms, Hans; Hansson, Nils Henrik Stubkjær; Tolbod, Lars Poulsen;

    2016-01-01

    …non-gated dynamic cardiac PET. METHODS: Thirty-five patients with aortic-valve stenosis and 10 healthy controls (HC) underwent a 27-min 11C-acetate PET/CT scan and cardiac magnetic resonance imaging (CMR). HC were scanned twice to assess repeatability. Parametric images of uptake rate K1 and the blood pool were … LV and WT only and an overestimation for LVEF at lower values. Intra- and inter-observer correlations were >0.95 for all PET measurements. PET repeatability accuracy in HC was comparable to CMR. CONCLUSION: LV mass and volumes are accurately and automatically generated from dynamic 11C-acetate PET without ECG-gating. This method can be incorporated in a standard routine without any additional workload and can, in theory, be extended to other PET tracers.

  10. Flexible Workflow Software enables the Management of an Increased Volume and Heterogeneity of Sensors, and evolves with the Expansion of Complex Ocean Observatory Infrastructures.

    Science.gov (United States)

    Tomlin, M. C.; Jenkyns, R.

    2015-12-01

    Ocean Networks Canada (ONC) collects data from observatories in the northeast Pacific, Salish Sea, Arctic Ocean, Atlantic Ocean, and land-based sites in British Columbia. Data are streamed, collected autonomously, or transmitted via satellite from a variety of instruments. The Software Engineering group at ONC develops and maintains Oceans 2.0, an in-house software system that acquires and archives data from sensors, and makes data available to scientists, the public, government and non-government agencies. The Oceans 2.0 workflow tool was developed by ONC to manage a large volume of tasks and processes required for instrument installation, recovery and maintenance activities. Since 2013, the workflow tool has supported 70 expeditions and grown to include 30 different workflow processes for the increasing complexity of infrastructures at ONC. The workflow tool strives to keep pace with an increasing heterogeneity of sensors, connections and environments by supporting versioning of existing workflows, and allowing the creation of new processes and tasks. Despite challenges in training and gaining mutual support from multidisciplinary teams, the workflow tool has become invaluable in project management in an innovative setting. It provides a collective place to contribute to ONC's diverse projects and expeditions and encourages more repeatable processes, while promoting interactions between the multidisciplinary teams who manage various aspects of instrument development and the data they produce. The workflow tool inspires documentation of terminologies and procedures, and effectively links to other tools at ONC such as JIRA, Alfresco and Wiki. Motivated by growing sensor schemes, modes of collecting data, archiving, and data distribution at ONC, the workflow tool ensures that infrastructure is managed completely from instrument purchase to data distribution. It integrates all areas of expertise and helps fulfill ONC's mandate to offer quality data to users.

  11. Determination of amphetamines in hair by GC/MS after small-volume liquid extraction and microwave derivatization.

    Science.gov (United States)

    Meng, Pinjia; Zhu, Dan; He, Hongyuan; Wang, Yanyan; Guo, Fei; Zhang, Liang

    2009-09-01

    We report here the results of a procedure for the determination of amphetamine drugs in hair. The procedure is simple and sensitive. The results from the procedure using small-volume extraction match perfectly with those from either the derivatization method or selected ion monitoring (SIM) detection. We validated our method using four different amine drugs: amphetamine, methamphetamine, methylenedioxyamphetamine and methylenedioxymethamphetamine. The detection limit for these drugs is about 50 +/- 7.5 pg/mg in hair, and the intra-day and inter-day reproducibility are within 15% at most drug concentrations. Moreover, we also showed the utility of the procedure in analyses of authentic hair samples taken from amphetamine abusers, and demonstrated that the method meets the requirement for the analysis of trace amounts of amphetamines in human hair.

  12. Automatic extraction of forward stroke volume using dynamic 11C-acetate PET/CT

    DEFF Research Database (Denmark)

    Harms, Hans; Tolbod, Lars Poulsen; Hansson, Nils Henrik

    …TruePoint 64 PET/CT scanner after bolus injection of 399±27 MBq of 11C-acetate. The LV-aortic time-activity curve (TAC) was extracted automatically from dynamic PET data using cluster analysis. The first-pass peak was derived by automatic extrapolation of the down-slope of the TAC. FSV was then calculated as the injected dose divided by the product of heart rate and the area under the curve of the first-pass peak. Gold standard FSV was measured in the left ventricular outflow tract by cardiovascular magnetic resonance using phase-contrast velocity mapping within two weeks of PET imaging. Results: … = 0.001). Conclusions: FSV can be obtained automatically and reliably using dynamic 11C-acetate PET/CT and cluster analysis, although a small overestimation is observed when compared to FSV determined from MRI. This method could potentially be generalized to other tracers, although this requires…

  13. EXTRACT

    DEFF Research Database (Denmark)

    Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra

    2016-01-01

    The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, manual sample annotation is a highly labor-intensive process and requires familiarity with the terminologies used. We have … and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15-25% and helps curators to detect terms that would otherwise have been missed. Database URL: https://extract.hcmr.gr/

  14. OAST Space Theme Workshop. Volume 3: Working group summary. 4: Software (E-4). A. Summary. B. Technology needs (form 1). C. Priority assessment (form 2)

    Science.gov (United States)

    1976-01-01

    Only a few efforts are currently underway to develop an adequate technology base for the various themes. Particular attention must be given to software commonality and evolutionary capability, to increased system integrity and autonomy; and to improved communications among the program users, the program developers, and the programs themselves. There is a need for quantum improvement in software development methods and increasing the awareness of software by all concerned. Major thrusts identified include: (1) data and systems management; (2) software technology for autonomous systems; (3) technology and methods for improving the software development process; (4) advances related to systems of software elements including their architecture, their attributes as systems, and their interfaces with users and other systems; and (5) applications of software including both the basic algorithms used in a number of applications and the software specific to a particular theme or discipline area. The impact of each theme on software is assessed.

  15. Automatic extraction of forward stroke volume using dynamic PET/CT

    DEFF Research Database (Denmark)

    Harms, Hans; Tolbod, Lars Poulsen; Hansson, Nils Henrik Stubkjær

    2015-01-01

    …a dynamic 11C-acetate PET scan on a Siemens Biograph TruePoint-64 PET/CT (scanner I). In addition, 10 subjects underwent both dynamic 15O-water PET and 11C-acetate PET scans on a GE Discovery-ST PET/CT (scanner II). The left ventricular (LV)-aortic time-activity curve (TAC) was extracted automatically from PET data using cluster analysis. The first-pass peak was isolated by automatic extrapolation of the downslope of the TAC. FSV was calculated as the injected dose divided by the product of heart rate and the area under the curve of the first-pass peak. Gold standard FSV was measured using phase-contrast … 0.001 for all). FSV based on 11C-acetate and 15O-water correlated highly (r = 0.99, slope = 1.03) with no significant difference between FSV estimates (p = 0.14). Conclusions: FSV can be obtained automatically using dynamic PET/CT and cluster analysis. Results are almost identical for 11C-acetate and 15O…

  16. Framework Programmable Platform for the Advanced Software Development Workstation (FPP/ASDW). Demonstration framework document. Volume 1: Concepts and activity descriptions

    Science.gov (United States)

    Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.

    1992-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into development of advanced technologies for Computer Aided Software Engineering (CASE).

  17. The Effect of Pre-Nutrition of Hydroalcoholic Extracts of Origanum vulgare on Infarct Volume and Neurologic Deficits in a Rat Stroke Model

    Directory of Open Access Journals (Sweden)

    meysam Foroozandeh

    2016-04-01

    Background & objectives: Basic and clinical studies have shown that the production of free radicals is one of the main factors leading to injury after stroke. In this study, we investigated the effect of hydroalcoholic extracts of Origanum vulgare on infarct volume and neurological deficits in a rat stroke model. Methods: In this experimental study, 35 male Wistar rats were randomly divided into 5 groups, each containing 7 animals. The first (control) group received distilled water, while the three treatment groups received oral Origanum vulgare extract by gavage for 30 days (50, 75 and 100 mg/kg/day, respectively). These groups were subjected to 60 min of middle cerebral artery occlusion 2 hours after the last dose of Origanum extract, followed by 24 hrs of reperfusion. After 24 hrs, the infarct volume and neurologic deficits were evaluated in the groups. The sham-operated group (n=7) received neither Marjoram nor brain ischemia. Results: The hydroalcoholic extract of Origanum reduced the infarct volume and neurologic deficits in all treatment groups compared to the control group. Conclusion: It seems that Origanum vulgare extract can exert a neuroprotective effect against stroke damage by reducing infarct volume and neurological disorders.

  18. HPARSER: extracting formal patient data from free text history and physical reports using natural language processing software.

    Science.gov (United States)

    Sponsler, J L

    2001-01-01

    A prototype, HPARSER, processes a patient history and physical report such that specific data are obtained and stored in a patient data record. HPARSER is a recursive transition network (RTN) parser, and includes English and medical grammar rules, a lexicon, and database constraints. Medical grammar rules augment the grammar rule base and specify common phrases seen in patient reports (e.g., "pupils are equal and reactive"). Each database constraint associates a grammar rule with a database table and attribute. Constraint behavior is such that if a rule is satisfied, data is extracted from the parse tree and stored into the database. Control reports guided construction of the grammar and constraint rules. Test reports were processed with the control rules; 85% of test report sentences parsed, and a 60% data capture rate, compared with controls, was achieved. HPARSER demonstrates the use of an RTN to parse patient reports, and of database constraints to transfer formal data from parse trees into a database.
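The constraint mechanism described here (a grammar rule tied to a database table and attribute, with captured data transferred when the rule is satisfied) can be illustrated with a minimal sketch. Regular expressions stand in for HPARSER's recursive transition network, and the rule names, tables, and attributes are hypothetical:

```python
import re

# Hypothetical medical grammar rules and database constraints: when a rule
# matches a sentence, its constraint maps the captured text to a
# (table, attribute) pair in the patient record.
RULES = {
    "pupil_exam": re.compile(r"pupils are (?P<finding>equal and reactive)"),
    "age_phrase": re.compile(r"(?P<age>\d+)[- ]year[- ]old"),
}
CONSTRAINTS = {
    "pupil_exam": ("physical_exam", "pupils"),
    "age_phrase": ("demographics", "age"),
}

def parse_report(text, record):
    """Apply each rule; on a match, store the captured data per its constraint."""
    for rule_name, pattern in RULES.items():
        m = pattern.search(text.lower())
        if m:
            table, attribute = CONSTRAINTS[rule_name]
            value = next(v for v in m.groupdict().values() if v)
            record.setdefault(table, {})[attribute] = value
    return record

record = parse_report(
    "The patient is a 64-year-old male. Pupils are equal and reactive.", {})
# record -> {'physical_exam': {'pupils': 'equal and reactive'},
#            'demographics': {'age': '64'}}
```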

  19. OPTIMIZED DETERMINATION OF TRACE JET FUEL VOLATILE ORGANIC COMPOUNDS IN HUMAN BLOOD USING IN-FIELD LIQUID-LIQUID EXTRACTION WITH SUBSEQUENT LABORATORY GAS CHROMATOGRAPHIC-MASS SPECTROMETRIC ANALYSIS AND ON-COLUMN LARGE VOLUME INJECTION

    Science.gov (United States)

    A practical and sensitive method to assess volatile organic compounds (VOCs) from JP-8 jet fuel in human whole blood was developed by modifying previously established liquid-liquid extraction procedures, optimizing extraction times, solvent volume, specific sample processing te...

  20. On-line micro-volume introduction system developed for lower density than water extraction solvent and dispersive liquid-liquid microextraction coupled with flame atomic absorption spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Anthemidis, Aristidis N., E-mail: anthemid@chem.auth.gr [Laboratory of Analytical Chemistry, Department of Chemistry, Aristotle University, Thessaloniki 54124 (Greece); Mitani, Constantina; Balkatzopoulou, Paschalia; Tzanavaras, Paraskevas D. [Laboratory of Analytical Chemistry, Department of Chemistry, Aristotle University, Thessaloniki 54124 (Greece)

    2012-07-06

    Highlights: A dispersive liquid-liquid microextraction method for lead and copper determination. A micro-volume transportation system for an extraction solvent lighter than water. Analysis of natural water samples. - Abstract: A simple and fast preconcentration/separation dispersive liquid-liquid microextraction (DLLME) method for metal determination, based on the use of an extraction solvent with lower density than water, has been developed. For this purpose a novel micro-volume introduction system was developed, enabling the on-line injection of the organic solvent into flame atomic absorption spectrometry (FAAS). The effectiveness and efficiency of the proposed system were demonstrated for lead and copper preconcentration in environmental water samples using di-isobutyl ketone (DBIK) as the extraction solvent. Under the optimum conditions the enhancement factors for lead and copper were 187 and 310, respectively. For a sample volume of 10 mL, the detection limit (3s) and the relative standard deviation were 1.2 μg L⁻¹ and 3.3% for lead, and 0.12 μg L⁻¹ and 2.9% for copper, respectively. The developed method was evaluated by analyzing certified reference material and was applied successfully to the analysis of environmental water samples.
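The quoted detection limit follows the common 3s criterion: three times the standard deviation of replicate blank measurements divided by the calibration slope. A generic sketch, with invented blank readings and slope rather than the paper's data:

```python
from statistics import stdev

def detection_limit_3s(blank_signals, calibration_slope):
    """LOD by the 3s criterion: 3 * SD(blanks) / slope, in concentration units."""
    return 3.0 * stdev(blank_signals) / calibration_slope

# Hypothetical FAAS blank absorbances and calibration slope (absorbance per ug/L)
blanks = [0.0021, 0.0024, 0.0019, 0.0022, 0.0020, 0.0023]
slope = 0.00047
lod = detection_limit_3s(blanks, slope)  # about 1.2 ug/L with these numbers
```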

  1. Height Extraction and Stand Volume Estimation Based on Fusion Airborne LiDAR Data and Terrestrial Measurements for a Norway Spruce [Picea abies (L.) Karst.] Test Site in Romania

    Directory of Open Access Journals (Sweden)

    Bogdan APOSTOL

    2016-06-01

    The objective of this study was to analyze the efficiency of individual tree identification and stand volume estimation from LiDAR data. The study was located in Norway spruce [Picea abies (L.) Karst.] stands in southwestern Romania and linked airborne laser scanning (ALS) with terrestrial measurements through empirical modelling. The proposed method uses the Canopy Maxima algorithm for individual tree detection together with biometric field measurements and individual tree positioning. Field data were collected using Field-Map real-time GIS-laser equipment, a high-accuracy GNSS receiver and a Vertex IV ultrasound inclinometer. ALS data were collected using a Riegl LMS-Q560 instrument and processed using LP360 and Fusion software to extract digital terrain, surface and canopy height models. For the estimation of tree heights, number of trees and tree crown widths from the ALS data, the Canopy Maxima algorithm was used together with local regression equations relating field-measured tree heights and crown widths at each plot. When compared to LiDAR-detected trees, about 40-61% of the field-measured trees were correctly identified. Such trees represented, in general, predominant, dominant and co-dominant trees from the upper canopy. However, it should be noted that the volume of the correctly identified trees represented 60-78% of the total plot volume. The estimation of stand volume using the LiDAR data was achieved by empirical modelling, taking into account the individual tree heights (as identified from the ALS data) and the corresponding ground reference stem volume. The root mean square error (RMSE) between the individual tree heights measured in the field and the corresponding heights identified in the ALS data was 1.7-2.2 meters. Comparing the ground reference estimated stem volume (at tree level) with the corresponding ALS estimated tree stem volume, an RMSE of 0.5-0.7 m3 was achieved. The RMSE was slightly lower when comparing the ground
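The accuracy figures in this record rest on the root mean square error between paired field-measured and ALS-derived values. A minimal sketch, with hypothetical paired tree heights:

```python
from math import sqrt

def rmse(reference, estimated):
    """Root mean square error between paired field and LiDAR measurements."""
    assert len(reference) == len(estimated)
    return sqrt(sum((r - e) ** 2 for r, e in zip(reference, estimated))
                / len(reference))

# Hypothetical paired tree heights (m): field-measured vs ALS-derived.
field_heights = [28.1, 30.4, 25.9, 27.2]
als_heights = [26.5, 31.9, 24.3, 29.0]
print(round(rmse(field_heights, als_heights), 2))  # prints 1.63
```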

  2. Microwave-assisted extraction and large-volume injection gas chromatography tandem mass spectrometry determination of multiresidue pesticides in edible seaweed.

    Science.gov (United States)

    García-Rodríguez, D; Carro, A M; Cela, R; Lorenzo, R A

    2010-09-01

    A microwave-assisted extraction method followed by clean-up with solid-phase extraction (SPE) combined with large-volume injection gas chromatography-tandem mass spectrometry (LVI-GC-MS/MS) for the analysis of 17 pesticides in wild and aquaculture edible seaweeds has been developed. An experimental central composite design was employed to evaluate the effects of the main variables potentially affecting the extraction (temperature, time, and solvent volume) and to optimize the process. The most effective microwave extraction conditions were achieved at 125 °C and 12 min with 24 mL of hexane/ethyl acetate (80:20). SPE clean-up of the extracts with graphitized carbon and Florisil, optimized by means of the experimental design, proved to be efficient in the removal of matrix interferences. The analytical recoveries were close to 100% for all the analytes, with relative standard deviations lower than 13%. The limits of detection ranged from 0.3 to 23.1 pg g(-1) and the limits of quantification were between 2.3 and 76.9 pg g(-1), far below the maximum residue levels established by the European Union for pesticides in seaweed. The results obtained prove the suitability of the microwave-assisted extraction for the routine analysis of pesticides in aquaculture and wild seaweed samples.

  3. Determination of selected polycyclic aromatic compounds in particulate matter: a validation study of an agitation extraction method for samples with low mass loadings using reduced volumes

    Science.gov (United States)

    García-Alonso, S.; Pérez-Pastor, R. M.; Archilla-Prat, V.; Rodríguez-Maroto, J.; Izquierdo-Díaz, M.; Rojas, E.; Sanz, D.

    2015-12-01

    A simple analytical method using low volumes of solvent for determining selected PAHs and NPAHs in PM samples is presented. The proposed extraction method was compared with pressurized fluid (PFE) and microwave (MC) extraction techniques, and the intermediate precision associated with the analytical measurements was estimated. Extraction by agitation with 8 mL of dichloromethane yielded recoveries above 80% compared to those obtained from PFE extraction. Regarding intermediate precision, values between 10% and 20% were reached, with increased dispersion for compounds of high volatility and low concentration levels. Within the framework of the INTA/CIEMAT research agreement for PM characterization in gas turbine exhaust, the method was applied to the analysis of aluminum foil substrates and quartz filters with mass loadings ranging from 0.02 to 2 mg per sample.

  4. Highly selective solid-phase extraction and large volume injection for the robust gas chromatography-mass spectrometric analysis of TCA and TBA in wines.

    Science.gov (United States)

    Insa, S; Anticó, E; Ferreira, V

    2005-09-30

    A reliable solid-phase extraction (SPE) method for the simultaneous determination of 2,4,6-trichloroanisole (TCA) and 2,4,6-tribromoanisole (TBA) in wines has been developed. In the proposed procedure, 50 mL of wine are extracted in a 1 mL cartridge filled with 50 mg of LiChrolut EN resins. Most wine volatiles are washed out with 12.5 mL of a water:methanol solution (70%, v/v) containing 1% of NaHCO3. Analytes are then eluted with 0.6 mL of dichloromethane. A 40 microL aliquot of this extract is directly injected into a PTV injector operated in the solvent split mode and analysed by gas chromatography (GC)-ion trap mass spectrometry using the selected ion storage mode. The solid-phase extraction, including sample volume and rinsing and elution solvents, and the large-volume GC injection have been carefully evaluated and optimized. The resulting method is precise, and the extract is clean and free from non-volatiles.

  5. Predictors and outcomes of lead extraction requiring a bailout femoral approach: Data from 2 high-volume centers.

    Science.gov (United States)

    El-Chami, Mikhael F; Merchant, Faisal M; Waheed, Anam; Khattak, Furqan; El-Khalil, Jad; Patel, Adarsh; Sayegh, Michael N; Desai, Yaanik; Leon, Angel R; Saba, Samir

    2017-04-01

    Lead extraction (LE) infrequently requires the use of the "bailout" femoral approach. Predictors and outcomes of femoral extraction are not well characterized. The aim of this study was to determine the predictors of the need for femoral LE and its outcomes. Consecutive patients who underwent LE at our centers were identified. Baseline demographic characteristics, procedural outcomes, and clinical outcomes were ascertained by medical record review. Patients were stratified into 2 groups on the basis of the need for femoral extraction. A total of 1080 patients underwent LE, of whom 50 (4.63%) required femoral extraction. Patients requiring femoral extraction were more likely to have leads with longer dwell time (9.5 ± 6.0 years vs 5.7 ± 4.3 years), to have more leads extracted per procedure (2.0 ± 1.0 vs 1.7 ± 0.9; P = .003), and to have infection as an indication for extraction (72% vs 37.2%). Success rates were lower in the femoral extraction group than in the nonfemoral group (58% and 76% vs 94.7% and 97.9%, respectively). Femoral extraction was needed in ~5% of LEs. Longer lead dwell time, higher number of leads extracted per procedure, and the presence of infection predicted the need for femoral extraction. Procedural success of femoral extraction was low, highlighting the fact that this approach is mostly used as a bailout strategy and thus selects for more challenging cases. Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  6. Joint Logistics Commanders’ Biennial Software Workshop (4th) Orlando II: Solving the PDSS (Post Deployment Software Support) Challenge Held in Orlando, Florida on 27-29 January 87. Volume 2. Proceedings

    Science.gov (United States)

    1987-06-01

    enhancements/modifications. These software efforts should be tracked by program element. Additionally, the current literature should be searched to...result. But we can start to reap the benefits today. 4. Tom Clancy, the author of the recent bestselling Cold War thriller, Hunt For Red October

  7. Liquid Metering Centrifuge Sticks (LMCS): A Centrifugal Approach to Metering Known Sample Volumes for Colorimetric Solid Phase Extraction (C-SPE)

    Science.gov (United States)

    Gazda, Daniel B.; Schultz, John R.; Clarke, Mark S.

    2007-01-01

    Phase separation is one of the most significant obstacles encountered during the development of analytical methods for water quality monitoring in spacecraft environments. Removing air bubbles from water samples prior to analysis is a routine task on earth; however, in the absence of gravity, this routine task becomes extremely difficult. This paper details the development and initial ground testing of liquid metering centrifuge sticks (LMCS), devices designed to collect and meter a known volume of bubble-free water in microgravity. The LMCS uses centrifugal force to eliminate entrapped air and reproducibly meter liquid sample volumes for analysis with Colorimetric Solid Phase Extraction (C-SPE). C-SPE is a sorption-spectrophotometric platform that is being developed as a potential spacecraft water quality monitoring system. C-SPE utilizes solid phase extraction membranes impregnated with analyte-specific colorimetric reagents to concentrate and complex target analytes in spacecraft water samples. The mass of analyte extracted from the water sample is determined using diffuse reflectance (DR) data collected from the membrane surface and an analyte-specific calibration curve. The analyte concentration can then be calculated from the mass of extracted analyte and the volume of the sample analyzed. Previous flight experiments conducted in microgravity conditions aboard the NASA KC-135 aircraft demonstrated that the inability to collect and meter a known volume of water using a syringe was a limiting factor in the accuracy of C-SPE measurements. Herein, results obtained from ground based C-SPE experiments using ionic silver as a test analyte and either the LMCS or syringes for sample metering are compared to evaluate the performance of the LMCS. 
These results indicate very good agreement between the two sample metering methods and clearly illustrate the potential of utilizing centrifugal forces to achieve phase separation and metering of water samples in microgravity.
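The C-SPE quantification chain described in this record (a diffuse-reflectance reading converted to extracted analyte mass via an analyte-specific calibration curve, then divided by the metered sample volume) can be sketched as follows; the linear calibration form and all numbers are assumptions for illustration:

```python
def analyte_mass_from_dr(dr_signal, slope, intercept):
    """Convert a diffuse-reflectance (DR) reading to extracted analyte mass (ug)
    via a linear, analyte-specific calibration curve (assumed form)."""
    return (dr_signal - intercept) / slope

def concentration_ug_per_ml(extracted_mass_ug, metered_volume_ml):
    """Analyte concentration = extracted mass / metered sample volume."""
    return extracted_mass_ug / metered_volume_ml

# Hypothetical reading and calibration constants for a silver membrane.
mass = analyte_mass_from_dr(dr_signal=0.82, slope=0.4, intercept=0.02)  # 2.0 ug
conc = concentration_ug_per_ml(mass, metered_volume_ml=5.0)             # 0.4 ug/mL
```

The accuracy of `conc` depends directly on the metered volume, which is why reproducible metering (the role of the LMCS) matters as much as the colorimetric chemistry.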

  8. SU-E-J-241: Wavelet-Based Temporal Feature Extraction From DCE-MRI to Identify Sub-Volumes of Low Blood Volume in Head-And-Neck Cancer

    Energy Technology Data Exchange (ETDEWEB)

    You, D; Aryal, M; Samuels, S; Eisbruch, A; Cao, Y [University of Michigan, Ann Arbor, MI (United States)

    2015-06-15

    Purpose: A previous study showed that large sub-volumes of tumor with low blood volume (BV) (poorly perfused) in head-and-neck (HN) cancers are significantly associated with local-regional failure (LRF) after chemoradiation therapy, and could be targeted with intensified radiation doses. This study aimed to develop an automated and scalable model to extract voxel-wise contrast-enhanced temporal features of dynamic contrast-enhanced (DCE) MRI in HN cancers for predicting LRF. Methods: Our model development consists of training and testing stages. The training stage includes preprocessing of individual-voxel DCE curves from tumors for intensity normalization and temporal alignment, temporal feature extraction from the curves, feature selection, and training classifiers. For feature extraction, a multiresolution Haar discrete wavelet transformation is applied to each DCE curve to capture temporal contrast-enhanced features. The wavelet coefficients are selected as feature vectors. Support vector machine classifiers are trained to classify tumor voxels as having either low or high BV, for which a BV threshold of 7.6% was previously established and used as ground truth. The model is tested on a new dataset. The voxel-wise DCE curves for training and testing were from 14 and 8 patients, respectively. A posterior probability map of the low BV class was created to examine the tumor sub-volume classification. Voxel-wise classification accuracy was computed to evaluate performance of the model. Results: Average classification accuracies were 87.2% for training (10-fold cross-validation) and 82.5% for testing. The lowest and highest accuracies (patient-wise) were 68.7% and 96.4%, respectively. Posterior probability maps of the low BV class showed that the sub-volumes extracted by our model were similar to those defined by the BV maps, with most misclassifications occurring near the sub-volume boundaries. Conclusion: This model could be valuable to support adaptive clinical trials with further
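The feature-extraction step described in this record (multiresolution Haar wavelet coefficients computed from each voxel's DCE curve) can be sketched in plain Python. The preprocessing (normalization, temporal alignment) and the SVM training step are omitted, and the level count is an assumption:

```python
def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.

    Returns (approximation, detail) coefficient lists; the input length
    must be even.
    """
    s2 = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    return approx, detail

def haar_features(dce_curve, levels=2):
    """Multiresolution Haar coefficients from a voxel's DCE curve.

    Detail coefficients from every level plus the final approximation are
    concatenated into one feature vector, which could then be fed to an SVM
    classifier (e.g. sklearn.svm.SVC) against the low/high-BV labels.
    Curve length must be divisible by 2**levels.
    """
    features, approx = [], list(dce_curve)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        features.extend(detail)
    features.extend(approx)
    return features

# A flat 4-point curve has no detail energy; only the final approximation
# coefficient carries signal.
features = haar_features([1.0, 1.0, 1.0, 1.0], levels=2)
```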

  9. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  10. Software interface system for Geophysical Data Access and Management System (GPDAMS-CD)

    Digital Repository Service at National Institute of Oceanography (India)

    Kunte, P.D.

    -friendly access to large volumes of data and the means to visualize and extract selected data as needed. The software requires a minimum of computing expertise, as it is controlled by a system of `pull-down' menus, backed up by a context-sensitive system...

  11. The influence of scan mode and circle fitting on tree stem detection, stem diameter and volume extraction from terrestrial laser scans

    Science.gov (United States)

    Pueschel, Pyare; Newnham, Glenn; Rock, Gilles; Udelhoven, Thomas; Werner, Willy; Hill, Joachim

    2013-03-01

    Terrestrial laser scanning (TLS) has been used to estimate a number of biophysical and structural vegetation parameters. Of these, stem diameter is a primary input to traditional forest inventory. While many experimental studies have confirmed the potential for TLS to successfully extract stem diameter, the estimation accuracies differ strongly among these studies, owing to differences in experimental design, data processing and test plot characteristics. In order to provide consistency and maximize estimation accuracy, a systematic study of the impact of these variables is required. To contribute to such an approach, 12 scans were acquired with a FARO Photon 120 at two test plots (Beech, Douglas fir) to assess the effects of scan mode and circle fitting on the extraction of stem diameter and volume. An automated tree stem detection algorithm based on the range images of single scans was developed and applied to the data. Extraction of stem diameter was achieved by slicing the point cloud and fitting circles to the slices using three different algorithms (Lemen, Pratt and Taubin), resulting in diameter profiles for each detected tree. Diameter at breast height (DBH) was determined using both the single value for the diameter fitted at the nominal breast height and a linear fit of the vertical stem diameter profile. The latter is intended to reduce the influence of outliers and of errors in the determination of ground level. TLS-extracted DBH was compared to tape-measured DBH. Results show that tree stems with an unobstructed view to the scanner can be successfully extracted automatically from range images of the TLS data, with detection rates of 94% for Beech and 96% for Douglas fir. If occlusion of trees is accounted for, stem detection rates decrease to 85% (Beech) and 84% (Douglas fir). As far as DBH estimation is concerned, both DBH extraction methods yield estimates which agree with the reference measurements; however, the linear-fit-based approach proved to be more
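The circle-fitting step can be illustrated with a Kasa-style algebraic least-squares fit, a simpler relative of the Pratt and Taubin fits named in the abstract; the stem-slice coordinates below are synthetic:

```python
import numpy as np

def fit_circle_kasa(x, y):
    """Algebraic least-squares circle fit (Kasa method).

    Solves x^2 + y^2 = 2*a*x + 2*b*y + c for the centre (a, b); the radius
    follows from r^2 = c + a^2 + b^2.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return a, b, np.sqrt(c + a ** 2 + b ** 2)

# Synthetic stem slice: points on a 0.30 m diameter stem centred at (1, 2).
theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
xs = 1.0 + 0.15 * np.cos(theta)
ys = 2.0 + 0.15 * np.sin(theta)
cx, cy, r = fit_circle_kasa(xs, ys)
dbh_cm = 2 * r * 100  # slice diameter in centimetres, ~30.0 here
```

Repeating the fit on successive slices yields the per-tree diameter profile to which the linear fit described above can then be applied.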

  12. Revisiting software ecosystems research

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2016-01-01

    'Software ecosystems' is argued to have first appeared as a concept more than 10 years ago, and software ecosystem research started to take off in 2010. We conduct a systematic literature study, based on the most extensive literature review in the field to date, with two primary aims: (a) to provide...... an updated overview of the field and (b) to document evolution in the field. In total, we analyze 231 papers from 2007 until 2014 and provide an overview of the research in software ecosystems. Our analysis reveals a field that is rapidly growing both in volume and empirical focus while becoming more mature...... from evolving. We propose means for future research and the community to address them. Finally, our analysis shapes the view of the field as having evolved outside the existing definitions of software ecosystems and thus proposes the update of the definition of software ecosystems....

  13. Performance of new automated transthoracic three-dimensional echocardiographic software for left ventricular volumes and function assessment in routine clinical practice: Comparison with 3 Tesla cardiac magnetic resonance.

    Science.gov (United States)

    Levy, Franck; Dan Schouver, Elie; Iacuzio, Laura; Civaia, Filippo; Rusek, Stephane; Dommerc, Carinne; Marechaux, Sylvestre; Dor, Vincent; Tribouilloy, Christophe; Dreyfus, Gilles

    2017-05-26

    Three-dimensional (3D) transthoracic echocardiography (TTE) is superior to the two-dimensional Simpson's method for assessment of left ventricular (LV) volumes and LV ejection fraction (LVEF). Nevertheless, 3D TTE is not incorporated into everyday practice, as current LV chamber quantification software products are time-consuming. To evaluate the feasibility, accuracy and reproducibility of new fully automated fast 3D TTE software (HeartModel(A.I.); Philips Healthcare, Andover, MA, USA) for quantification of LV volumes and LVEF in routine practice; to compare the 3D LV volumes and LVEF obtained with a cardiac magnetic resonance (CMR) reference; and to optimize automated default border settings with CMR as reference. Sixty-three consecutive patients, who had comprehensive 3D TTE and CMR examinations within 24 hours, were eligible for inclusion. Nine patients (14%) were excluded because of insufficient echogenicity in the 3D TTE. Thus, 54 patients (40 men; mean age 63 ± 13 years) were prospectively included in the study. The inter- and intraobserver reproducibilities of 3D TTE were excellent (coefficient of variation < 10%) for end-diastolic volume (EDV), end-systolic volume (ESV) and LVEF. Despite a slight underestimation of EDV using 3D TTE compared with CMR (bias = -22 ± 34 mL; P < 0.0001), a significant correlation was found between the two measurements (r = 0.93; P = 0.0001). Enlarging default border detection settings leads to frequent volume overestimation in the general population, but improved agreement with CMR in patients with LVEF ≤ 50%. Correlations between 3D TTE and CMR for ESV and LVEF were excellent (r = 0.93 and r = 0.91, respectively; P < 0.0001). 3D TTE using new-generation fully automated software is a feasible, fast, reproducible and accurate imaging modality for LV volumetric quantification in routine practice. Optimization of border detection settings may increase agreement with CMR for EDV assessment in dilated ventricles. Copyright © 2017 Elsevier Masson

  14. OPEN SOURCE SOFTWARE, FREE SOFTWARE?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    The enactment of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: using open source software. The use of open source software is spreading in step with current global issues in Information and Communication Technology (ICT). Some organizations and companies have begun to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software that is unlicensed. The claims surrounding open source software are not entirely true, so the concept needs to be properly introduced, covering its history, licenses and how to choose a license, as well as the considerations involved in choosing among the available open source software. Keywords: License, Open Source, HAKI

  15. Software Technology for Adaptable, Reliable Systems (STARS): US40 - STARS Reuse Concept of Operations. Volume 1. Version 0.5 - Draft

    Science.gov (United States)

    1991-08-27

    Development The goal of application generator development is to provide a capability that allows a reuser or application developer to create software (sub...by Role Access to asset library services should be restricted by library role. For example, an asset reuser should not be allowed to modify the

  16. Software Reviews.

    Science.gov (United States)

    Smith, Richard L., Ed.

    1985-01-01

    Reviews software packages by providing extensive descriptions and discussions of their strengths and weaknesses. Software reviewed includes (1) "VISIFROG: Vertebrate Anatomy" (grade seven-adult); (2) "Fraction Bars Computer Program" (grades three to six); and (3) four telecommunications utilities. (JN)

  17. A method for rapid in situ extraction and laboratory determination of Th, Pb, and Ra isotopes from large volumes of seawater

    Science.gov (United States)

    Baskaran, M.; Murphy, David J.; Santschi, Peter H.; Orr, James C.; Schink, David R.

    1993-04-01

    An in situ pump and six-channel extraction system, operating typically at 35 l min-1, can not only filter seawater samples but also extract dissolved Th, Pb and Ra isotopes from large volumes of seawater at six different depths on a single lowering. By gamma counting the ash residues of three cartridges (one particle extractor and two nuclide extractors), we can determine 234Th and 210Pb concentrations in the particulate and dissolved phases. Extraction efficiencies from the dissolved phase average 93 ± 5% for 234Th and ≥90% for 210Pb. At the same time we can determine radium isotopic ratios precisely; at present, however, radium concentrations can only be established by the determination of 226Ra in a 30-l cohort sample. This system can be deployed easily from ships having an A-frame. It eliminates the problems of transporting and processing large volumes of samples and involves relatively little manual labor and analytical time. We have tested and used this device repeatedly in the Gulf of Mexico. Our preliminary results agree well with values of 234Th and 210Pb concentrations and 228Ra/226Ra activity ratios reported in the literature for the Gulf of Mexico.

  18. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  19. Evaluation of automated decision making methodologies and development of an integrated robotic system simulation. Volume 2, Part 1. Appendix a: software documentation

    Energy Technology Data Exchange (ETDEWEB)

    Lowrie, J.W.; Fermelia, A.J.; Haley, D.C.; Gremban, K.D.; Vanbaalen, J.

    1982-09-01

    Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.

  20. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  1. Automatic extraction of forward stroke volume using dynamic PET/CT: a dual-tracer and dual-scanner validation in patients with heart valve disease

    OpenAIRE

    Harms, Hendrik Johannes; Tolbod, Lars Poulsen; Hansson, Nils Henrik Stubkjær; Kero, Tanja; Örndahl, Lovisa Holm; Kim, Won Yong; Bjerner, Tomas; Bouchelouche, Kirsten; Wiggers, Henrik; Frøkiær, Jørgen; Sörensen, Jens

    2015-01-01

    BACKGROUND: The aim of this study was to develop and validate an automated method for extracting forward stroke volume (FSV) using indicator dilution theory directly from dynamic positron emission tomography (PET) studies for two different tracers and scanners. METHODS: 35 subjects underwent a dynamic (11)C-acetate PET scan on a Siemens Biograph TruePoint-64 PET/CT (scanner I). In addition, 10 subjects underwent both dynamic (15)O-water PET and (11)C-acetate PET scans on a GE Discovery-ST PET...
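The indicator dilution theory the abstract refers to rests on the Stewart-Hamilton relation: cardiac output equals the injected dose divided by the area under the first-pass concentration curve, and forward stroke volume is cardiac output per beat. A minimal sketch of that arithmetic (illustrative only, not the authors' validated pipeline; the function name, units and synthetic curve are assumptions):

```python
def forward_stroke_volume(dose_mbq, times_s, conc_mbq_per_ml, heart_rate_bpm):
    """Stewart-Hamilton sketch: CO = dose / area under the first-pass
    concentration curve; FSV = CO / heart rate (mL per beat)."""
    area = 0.0  # trapezoidal integration of the curve, MBq*s/mL
    for i in range(1, len(times_s)):
        area += 0.5 * (conc_mbq_per_ml[i] + conc_mbq_per_ml[i - 1]) \
                    * (times_s[i] - times_s[i - 1])
    co_ml_per_s = dose_mbq / area            # cardiac output, mL/s
    return co_ml_per_s * 60.0 / heart_rate_bpm
```

In practice the first-pass curve would be sampled from a region over the left ventricle or aorta in the dynamic PET series; the sketch only shows how the numbers combine.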

  2. Base excision repair efficiency and mechanism in nuclear extracts are influenced by the ratio between volume of nuclear extraction buffer and nuclei-Implications for comparative studies

    DEFF Research Database (Denmark)

    Akbari, Mansour; Krokan, Hans E

    2012-01-01

The base excision repair (BER) pathway corrects many different DNA base lesions and is important for genomic stability. The mechanism of BER cannot easily be investigated in intact cells and therefore in vitro methods that reflect the in vivo processes are in high demand. Reconstitution of BER...... using purified proteins essentially mirrors properties of the proteins used, and does not necessarily reflect the mechanism as it occurs in the cell. Nuclear extracts from cultured cells have the capacity to carry out complete BER and can give important information on the mechanism. Furthermore...

  3. The effect of late-phase contrast enhancement on semi-automatic software measurements of CT attenuation and volume of part-solid nodules in lung adenocarcinomas

    NARCIS (Netherlands)

    Cohen, J.G.; Goo, J.M.; Yoo, R.E.; Park, S.B.; Ginneken, B. van; Ferretti, G.R.; Lee, C.H.; Park, C.M.

    2016-01-01

    OBJECTIVES: To evaluate the differences in semi-automatic measurements of CT attenuation and volume of part-solid nodules (PSNs) between unenhanced and enhanced CT scans. MATERIALS AND METHODS: CT scans including unenhanced and enhanced phases (slice thickness 0.625 and 1.25mm, respectively) for 53

  4. Assessment of the Combat Developer’s Role in Post-Deployment Software Support (PDSS) 30 June 1980 - 28 February 1981. Volume IV.

    Science.gov (United States)

    1981-01-31

automation of individual battlefield systems is presenting increasing requirements for larger volumes of traffic, especially digital traffic, at higher...time" requirements of manual command and control in which time is measured in minutes to hours. These trends in digital data, time-compression, traffic...operations research, systems analysis, mathematics, electrical and electronic engineering, computer science, communications and Army communications

  5. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  6. Critical-fluid extraction of organics from water. Volume II. Experimental. Final report, 1 October 1979-30 November 1983

    Energy Technology Data Exchange (ETDEWEB)

    Abboud, O.K.; de Filippi, R.P.; Goklen, K.E.; Moses, J.M.

    1984-06-01

    Critical fluid extraction has been tested at the pilot plant scale as a method of separating organics from water. The process employed resembles a liquid-liquid extraction in which the solvent is near-critical carbon dioxide and the feed is an organic in aqueous solution. Carbon dioxide's solvent and other thermodynamic properties, and the effective utilization of a vapor recompression cycle in the process design have significantly reduced the energy required for these separations. This process is an energy-conserving alternative to the distillation processes which are currently employed. The objectives of this portion of the project were to demonstrate the feasibility of this technology and to gather the engineering data required to evaluate the process. Three alcohols were tested in these experiments - ethanol, isopropanol and sec-butanol - and were all successfully extracted.

  7. Software engineering design theory and practice

    CERN Document Server

    Otero, Carlos

    2012-01-01

    … intended for use as a textbook for an advanced course in software design. Each chapter ends with review questions and references. … provides an overview of the software development process, something that would not be out of line in a course on software engineering including such topics as software process, software management, balancing conflicting values of stakeholders, testing, quality, and ethics. The author has principally focused on software design though, extracting the design phase from the surrounding software development lifecycle. … Software design strategies are addressed

  8. Software piracy

    OpenAIRE

    Kráčmer, Stanislav

    2011-01-01

    The objective of the present thesis is to clarify the term of software piracy and to determine responsibility of individual entities as to actual realization of software piracy. First, the thesis focuses on a computer programme, causes, realization and pitfalls of its inclusion under copyright protection. Subsequently, it observes methods of legal usage of a computer programme. This is the point of departure for the following attempt to define software piracy, accompanied with methods of actu...

  9. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  10. Optimized determination of trace jet fuel volatile organic compounds in human blood using in-field liquid-liquid extraction with subsequent laboratory gas chromatographic-mass spectrometric analysis and on-column large-volume injection.

    Science.gov (United States)

    Liu, S; Pleil, J D

    2001-03-05

    A practical and sensitive method to assess volatile organic compounds (VOCs) from JP-8 jet fuel in human whole blood was developed by modifying previously established liquid-liquid extraction procedures, optimizing extraction times, solvent volume, specific sample processing techniques, and a new on-column large-volume injection method for GC-MS analysis. With the optimized methods, the extraction efficiency was improved by 4.3 to 20.1 times and the detection sensitivity increased up to 660 times over the standard method. Typical detection limits in the parts-per-trillion (ppt) level range were achieved for all monitored JP-8 constituents; this is sufficient for assessing human fuels exposures at trace environmental levels as well as occupational exposure levels. The sample extractions are performed in the field and only solvent extracts need to be shipped to the laboratory. The method is implemented with standard biological laboratory equipment and a modest bench-top GC-MS system.

  11. Evaluation of an automated high-volume extraction method for viral nucleic acids in comparison to a manual procedure with preceding enrichment.

    Science.gov (United States)

    Hourfar, M K; Schmidt, M; Seifried, E; Roth, W K

    2005-08-01

    Nucleic acid extraction still harbours the potential for improvements in automation and sensitivity of nucleic acid amplification technology (NAT) testing. This study evaluates the feasibility of a novel automated high-volume extraction protocol for NAT minipool testing in a blood bank setting. The chemagic Viral DNA/RNA Kit special for automated purification of viral nucleic acids from 9.6 ml of plasma by using the chemagic Magnetic Separation Module I was investigated. Analytical sensitivity for hepatitis C virus (HCV), human immunodeficiency virus-1 (HIV-1), hepatitis B virus (HBV), hepatitis A virus (HAV) and parvovirus B19 (B19) was compared to our present manual procedure that involves virus enrichment by centrifugation. Chemagic technology allows automation of the viral DNA/RNA extraction process. Viral nucleic acids were bound directly to magnetic beads from 9.6-ml minipools. By combining the automated magnetic beads-based extraction technology with our in-house TaqMan polymerase chain reaction (PCR) assays, 95% detection limits were 280 IU/ml for HCV, 4955 IU/ml for HIV-1, 249 IU/ml for HBV, 462 IU/ml for HAV and 460 IU/ml for B19, calculated for an individual donation in a pool of 96 donors. The detection limits of our present method were 460 IU/ml for HCV, 879 IU/ml for HIV-1, 90 IU/ml for HBV, 203 IU/ml for HAV and 314 IU/ml for B19. The 95% detection limits obtained by using the chemagic method were within the regulatory requirements for blood donor screening. The sensitivities detected for HCV, HBV, HAV and B19 were found to be in a range similar to that of the manual purification method. Sensitivity for HIV-1, however, was found to be inferior for the chemagic method in this study.
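The 95% detection limits above are expressed per individual donation in a 96-donor minipool. Under simple dilution scaling (an assumption of this sketch, not a statement of the authors' exact calculation), a sensitivity measured at the pool level converts to a per-donation limit as follows:

```python
def per_donation_limit(pool_limit_iu_ml, pool_size):
    """A single positive donation is diluted 1:pool_size in the minipool,
    so the titre it must reach scales with the pool size (assumed model)."""
    return pool_limit_iu_ml * pool_size
```

For example, under this model a pool-level sensitivity of 5 IU/mL in a pool of 96 corresponds to a per-donation limit of 480 IU/mL.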

  12. Multimodal evaluation of 2-D and 3-D ultrasound, computed tomography and magnetic resonance imaging in measurements of the thyroid volume using universally applicable cross-sectional imaging software: a phantom study.

    Science.gov (United States)

    Freesmeyer, Martin; Wiegand, Steffen; Schierz, Jan-Henning; Winkens, Thomas; Licht, Katharina

    2014-07-01

A precise estimate of thyroid volume is necessary for making adequate therapeutic decisions and planning, as well as for monitoring therapy response. The goal of this study was to compare the precision of different volumetry methods. Thyroid-shaped phantoms were subjected to volumetry via 2-D and 3-D ultrasonography (US), computed tomography (CT) and magnetic resonance imaging (MRI). The 3-D US scans were performed using sensor navigation and mechanical sweeping methods. Volume calculations used the conventional ellipsoid model and the manual tracing method. The study confirmed the superiority of manual tracing with CT and MRI volumetry of the thyroid, but extended this knowledge also to the superiority of the 3-D US method, regardless of whether sensor navigation or mechanical sweeping is used. A novel aspect was successful use of the same universally applicable cross-imaging software for all modalities. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
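The conventional ellipsoid model mentioned above approximates each thyroid lobe as V = π/6 · length · width · depth. A minimal sketch (the lobe dimensions below are hypothetical, chosen only to show the arithmetic):

```python
import math

def ellipsoid_volume_ml(length_cm, width_cm, depth_cm):
    """Ellipsoid model: V = pi/6 * L * W * D; 1 cm^3 corresponds to 1 mL."""
    return math.pi / 6.0 * length_cm * width_cm * depth_cm

# Total thyroid volume as the sum of both lobes (hypothetical dimensions)
total_ml = ellipsoid_volume_ml(4.0, 1.5, 1.5) + ellipsoid_volume_ml(4.2, 1.6, 1.4)
```

The manual tracing method, by contrast, sums the traced cross-sectional areas slice by slice and so does not rely on the ellipsoid shape assumption.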

  13. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  14. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

      Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application.  Traditional forms of technical education pay little attention to creativity - often encouraging overly...... rationalistic ways of thinking which stifle the ability to innovate. Professional software developers are often drowned in commercial drudgery and overwhelmed by work pressure and deadlines. The topic that will both ensure success in the market and revitalize their work lives is never addressed. This book sets...... out the new field of software innovation. It organizes the existing scientific research into eight simple heuristics - guiding principles for organizing a system developer's work-life so that it focuses on innovation....

  15. Software Reviews.

    Science.gov (United States)

    Classroom Computer Learning, 1990

    1990-01-01

    Reviewed are three computer software packages including "Martin Luther King, Jr.: Instant Replay of History,""Weeds to Trees," and "The New Print Shop, School Edition." Discussed are hardware requirements, costs, grade levels, availability, emphasis, strengths, and weaknesses. (CW)

  16. Software Reviews.

    Science.gov (United States)

    Wulfson, Stephen, Ed.

    1987-01-01

    Reviews seven computer software programs that can be used in science education programs. Describes courseware which deals with muscles and bones, terminology, classifying animals without backbones, molecular structures, drugs, genetics, and shaping the earth's surface. (TW)

  17. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  18. Reusable Software.

    Science.gov (United States)

    1984-03-01

overseeing reusable software, the Reusable Software Organization (RUSO). This author does not feel at this time that establishment of such a specific...49] have not been accompanied by establishment of RUSO-like activities. There is need, however, for assurance that functions which a RUSO might be...assurance 6. establishment and maintenance of reuse archival facilities and activities. Actual establishment of a RUSO is best dictated by size of the

  19. Software Epistemology

    Science.gov (United States)

    2016-03-01

comprehensive approach for determining software epistemology which significantly advances the state of the art in automated vulnerability discovery...straightforward. First, internet-based repositories of open source software (e.g., FreeBSD ports, GitHub, SourceForge, etc.) are mined...the fix delta, we attempted to perform the same process to determine if the firmware release present in an Internet-of-Things (IoT) streaming camera

  20. Comparative Analysis of DEM Extraction Accuracy from ASTER Data by Different Software

    Institute of Scientific and Technical Information of China (English)

    何兆培; 杨斌

    2013-01-01

Three different software packages were used to extract DEMs from ASTER 15-m resolution stereo image pairs covering the middle segment of the Longmen Mountains in Sichuan province, and the accuracy of the results was evaluated preliminarily. DEM accuracy depends on the accuracy, distribution and number of ground control points, and is also influenced by the control precision of the software during production. Stereo measurement and interferometric methods were used to extract the DEMs, and the results were compared using the check-point method and the profile-line method. The results show that the DEM extracted with the interferometric method in ERDAS performs best, with an elevation accuracy of up to 30 m, which is of practical value for subsequent data mining and higher-level terrain analysis.
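The check-point method referred to above reduces to a root-mean-square elevation error between extracted DEM heights and surveyed control heights. A minimal sketch of that accuracy figure (illustrative only; the paper's exact evaluation procedure is not reproduced):

```python
import math

def dem_rmse_m(extracted_m, reference_m):
    """Check-point method sketch: RMSE of extracted DEM elevations
    against surveyed reference elevations, in meters."""
    diffs = [e - r for e, r in zip(extracted_m, reference_m)]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

An elevation accuracy "up to 30 m" then corresponds to an RMSE of at most 30 m over the check points.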

  1. Extracting Metrics for Three-dimensional Root Systems: Volume and Surface Analysis from In-soil X-ray Computed Tomography Data

    Energy Technology Data Exchange (ETDEWEB)

    Suresh, Niraj; Stephens, Sean A.; Adams, Lexor; Beck, Anthon N.; McKinney, Adriana L.; Varga, Tamas

    2016-01-01

Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as processes with important implications to climate change and forest management. Quantitative size information on roots in their native environment is invaluable for studying root growth and environmental processes involving the plant. X-ray computed tomography (XCT) has been demonstrated to be an effective tool for in situ root scanning and analysis. Our group at the Environmental Molecular Sciences Laboratory (EMSL) has developed an XCT-based tool to image and quantitatively analyze plant root structures in their native soil environment. XCT data collected on a Prairie dropseed (Sporobolus heterolepis) specimen was used to visualize its root structure. The open-source packages RooTrak and DDV were employed to segment the root from the soil and to calculate its isosurface, respectively. Our own computer script named 3DRoot-SV was developed and used to calculate root volume and surface area from a triangular mesh. The process utilizing a unique combination of tools, from imaging to quantitative root analysis, including the 3DRoot-SV computer script, is described.
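Computing volume and surface area from a triangular mesh, as 3DRoot-SV is described to do, is commonly done with the divergence-theorem identity (volume as a sum of signed tetrahedra to the origin) and per-triangle cross products. A generic sketch under the assumption of a closed, consistently oriented mesh (this is not the published 3DRoot-SV code):

```python
def mesh_volume_and_area(vertices, triangles):
    """Volume via signed tetrahedra to the origin (divergence theorem)
    and surface area via triangle cross products. Assumes a closed,
    consistently oriented triangular mesh."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    volume = area = 0.0
    for i, j, k in triangles:
        p, q, r = vertices[i], vertices[j], vertices[k]
        n = cross(sub(q, p), sub(r, p))          # face normal (unnormalized)
        area += 0.5 * dot(n, n) ** 0.5           # triangle area
        volume += dot(p, cross(q, r)) / 6.0      # signed tetrahedron volume
    return abs(volume), area
```

On a unit tetrahedron this returns a volume of 1/6, matching the analytic value, which is a convenient sanity check before running it on a segmented root isosurface.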

  2. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    This article presents MIAWARE, a software for Medical Image Analysis With Automated Reporting Engine, which was designed and developed for doctor/radiologist assistance. It allows to analyze an image stack from computed axial tomography scan of lungs (thorax) and, at the same time, to mark all...... pathologies on images and report their characteristics. The reporting process is normalized - radiologists cannot describe pathological changes with their own words, but can only use some terms from a specific vocabulary set provided by the software. Consequently, a normalized radiological report...... is automatically generated. Furthermore, MIAWARE software is accompanied with an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through the specially developed ontology. As a result...

  3. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  4. Software method for extracting photoplethysmography in pulse oximeter

    Institute of Scientific and Technical Information of China (English)

    张虹; 孙卫新; 金捷

    2001-01-01

Correct extraction and detection of the photoplethysmogram is one of the key steps in measuring oxygen saturation with a pulse oximeter. To overcome the drawbacks of conventional designs, in which hardware circuits perform the extraction and detection of the pulse wave and make the overall system complex and poor in stability and repeatability, the authors studied a software method for extracting and detecting the photoplethysmogram, laying the foundation for a digital design of pulse oximetry systems.
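A software detection stage of the kind the abstract describes can be sketched as smoothing followed by refractory peak picking. This is a minimal illustrative sketch, not the authors' algorithm; the moving-average window length, refractory gap, and sampling rate are assumptions:

```python
import math

def detect_ppg_peaks(signal, fs, min_gap_s=0.4):
    """Smooth the photoplethysmogram with a causal moving average
    (~100 ms), then pick local maxima separated by a refractory gap
    so at most one peak is detected per cardiac cycle."""
    win = max(1, int(0.1 * fs))
    smooth = []
    for i in range(len(signal)):
        lo = max(0, i - win + 1)
        smooth.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    peaks, gap = [], int(min_gap_s * fs)
    for i in range(1, len(smooth) - 1):
        if smooth[i - 1] < smooth[i] >= smooth[i + 1]:
            if not peaks or i - peaks[-1] >= gap:
                peaks.append(i)
    return peaks

# Synthetic 1.2 Hz pulse wave sampled at 100 Hz for 5 s -> 6 beats
wave = [math.sin(2 * math.pi * 1.2 * n / 100) for n in range(500)]
```

Beat-to-beat intervals from the returned indices then give the pulse rate; the pulsatile amplitudes at the red and infrared wavelengths feed the oxygen saturation calculation.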

  5. [Software version and medical device software supervision].

    Science.gov (United States)

    Peng, Liang; Liu, Xiaoyan

    2015-01-01

The importance of software versioning in medical device software supervision has not received enough attention to date. This paper first discusses the role the software version plays in medical device software supervision, then analyzes why version control is necessary, addressing common misunderstandings about software versions. Finally, it proposes concrete suggestions on software version naming rules, on version supervision for software embedded in medical devices, and on a version supervision scheme.

  6. Educational Software.

    Science.gov (United States)

    Northwest Regional Educational Lab., Portland, OR.

    The third session of IT@EDU98 consisted of five papers on educational software and was chaired by Tran Van Hao (University of Education, Ho Chi Minh City, Vietnam). "Courseware Engineering" (Nguyen Thanh Son, Ngo Ngoc Bao Tran, Quan Thanh Tho, Nguyen Hong Lam) briefly describes the use of courseware. "Machine Discovery Theorems in Geometry: A…

  7. Software Patents.

    Science.gov (United States)

    Burke, Edmund B.

    1994-01-01

    Outlines basic patent law information that pertains to computer software programs. Topics addressed include protection in other countries; how to obtain patents; kinds of patents; duration; classes of patentable subject matter, including machines and processes; patentability searches; experimental use prior to obtaining a patent; and patent…

  8. Software Systems

    Institute of Scientific and Technical Information of China (English)

    崔涛; 周淼

    1996-01-01

The information used with computers is known as software and includes programs and data. Programs are sets of instructions telling the computer what operations have to be carried out and in what order they should be done. Specialised programs which enable the computer to be used for particular purposes are called applications programs. A collection of these programs kept

  9. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game management. (CW)

  10. Sample Preparation and Extraction in Small Sample Volumes Suitable for Pediatric Clinical Studies: Challenges, Advances, and Experiences of a Bioanalytical HPLC-MS/MS Method Validation Using Enalapril and Enalaprilat

    Directory of Open Access Journals (Sweden)

    Bjoern B. Burckhardt

    2015-01-01

Full Text Available In the USA and Europe, medicines agencies are pushing the development of child-appropriate medications and intend to increase the availability of information on pediatric use. This calls for bioanalytical methods that can deal with small sample volumes, as the trial-related blood loss permitted in children is very restricted. The broadly used HPLC-MS/MS, while able to cope with small volumes, is susceptible to matrix effects, which restrain precise drug quantification by, for example, causing signal suppression. Sophisticated sample preparation and purification utilizing solid-phase extraction was applied to reduce and control matrix effects. A scale-up from a vacuum manifold to a positive-pressure manifold was conducted to meet the demands of high throughput within a clinical setting. The challenges faced, advances, and experiences in solid-phase extraction are presented on the basis of the bioanalytical method development and validation for low-volume samples (50 μL serum). Enalapril, enalaprilat, and benazepril served as sample drugs. The applied sample preparation and extraction successfully reduced the absolute and relative matrix effects to comply with international guidelines. Recoveries ranged from 77 to 104% for enalapril and from 93 to 118% for enalaprilat. The bioanalytical method comprising sample extraction by solid-phase extraction was fully validated according to FDA and EMA bioanalytical guidelines and was used in a Phase I study in 24 volunteers.
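Matrix effect and recovery figures of the kind reported above are commonly computed from three sample sets: neat standard in solvent, blank matrix spiked after extraction, and matrix spiked before extraction. A sketch using the widely used Matuszewski-style ratios (an assumption; the paper's exact formulas are not reproduced here):

```python
def matrix_effect_pct(postextraction_spiked_area, neat_standard_area):
    """ME% = 100 * B/A, where B is the response in blank matrix spiked
    after extraction and A the neat standard response. Values below 100
    indicate ion suppression, above 100 ion enhancement."""
    return 100.0 * postextraction_spiked_area / neat_standard_area

def recovery_pct(extracted_spiked_area, postextraction_spiked_area):
    """RE% = 100 * C/B, where C is the response of matrix spiked before
    extraction: the recovery of the solid-phase extraction step itself."""
    return 100.0 * extracted_spiked_area / postextraction_spiked_area
```

A recovery of 77%, for instance, means the extracted-spiked response reached 77% of the post-extraction-spiked response at the same nominal concentration.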

  11. SAT-Based Software Certification

    Science.gov (United States)

    2006-02-01

Lecture Notes in Computer Science. Paris, France, July 18–22, 2001. New York, NY: Springer-Verlag, 2001. [Balaban 05] Balaban, I...Model Checking, and Abstract Interpretation (VMCAI ’05), Volume 3385 of Lecture Notes in Computer Science. Paris, France, January 17–19, 2005. New York...Proceedings of the 8th International SPIN Workshop on Model Checking of Software (SPIN ’01), Volume 2057 of Lecture Notes in Computer Science

  12. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

The support of the EPIQR method is a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  13. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology-Lausanne (EPFL), Solar Energy and Building Physics Laboratory (LESO-PB), Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Institute of Meteorology and Physics of Atmospheric Environment, Group Energy Conservation, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Division of Energy and Indoor Environment, Hoersholm, (Denmark)

    2000-07-01

    The support of the EPIQR method is a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (author)

  14. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, communication switchboards). Extensive knowledge and practical experience with digital long-term preservation technologies have also been acquired. This wide spectrum of activities puts us in the position to discuss an often overlooked aspect of digital preservation - the preservation of software programs. There are many resources dedicated to the digital preservation of digital data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - dynamic response to inputs - render computer programs rich compared to documents or linear multimedia. The article opens questions about the beginning of the path toward permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ..., where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  15. Software Engineering to Professionalize Software Development

    Directory of Open Access Journals (Sweden)

    Juan Miguel Alonso

    2011-12-01

Full Text Available The increasingly important role that software plays in systems with widespread effects presents new challenges for the education of software engineers. Not only is society's dependence on software increasing, but the character of software development is also changing, and with it the demand for certified software developers. This paper proposes some challenges and aspirations that guide Software Engineering learning processes and help to identify the need to train professionals in software development.

  16. Rapid and sensitive solid phase extraction-large volume injection-gas chromatography for the analysis of mineral oil saturated and aromatic hydrocarbons in cardboard and dried foods.

    Science.gov (United States)

    Moret, Sabrina; Barp, Laura; Purcaro, Giorgia; Conte, Lanfranco S

    2012-06-22

A rapid off-line solid phase extraction-large volume injection-gas chromatography-flame ionisation detection (SPE-LVI-GC-FID) method, based on the use of silver silica gel and low solvent consumption, was developed for mineral oil saturated hydrocarbon (MOSH) and mineral oil aromatic hydrocarbon (MOAH) determination in cardboard and dried foods packaged in cardboard. The SPE method was validated using LVI with a conventional on-column injector and the retention gap technique (which allowed up to 50 μL of the sample to be injected). Detector response was linear over the whole concentration range tested (0.5-250 μg/mL), recoveries were practically quantitative, repeatability was good (coefficients of variation lower than 7%) and the limit of quantification was adequate to quantify the envisioned limit of 0.15 mg/kg proposed in Germany for MOAH analysis in food samples packaged in recycled cardboard. Rapid heating of the GC oven allowed sample throughput to be increased (3-4 samples per hour) and sensitivity to be enhanced. The proposed method was used for MOSH and MOAH determination in selected food samples usually commercialised in cardboard packaging. The most contaminated was a tea sample (102.2 and 7.9 mg/kg of MOSH and MOAH below n-C25, respectively), followed by a rice and a sugar powder sample, all packaged in recycled cardboard.
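Quantifying against a linear external-standard FID calibration over the range stated above (0.5-250 μg/mL) amounts to a least-squares line fit and its inversion. An illustrative sketch, not the authors' exact data handling (the calibration points below are hypothetical):

```python
def fit_line(x, y):
    """Ordinary least squares for response = a * concentration + b,
    as used for an external-standard calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

def quantify(peak_area, a, b):
    """Invert the calibration line to convert a peak area into a
    concentration (same units as the standards, e.g. ug/mL)."""
    return (peak_area - b) / a
```

In use, `fit_line` would be run once per batch on the standards, and `quantify` applied to each sample's integrated MOSH or MOAH hump area.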

  17. MEDEX Processing System. Volume 2. Software

    Science.gov (United States)

    1974-10-21


  18. Formal Verification of Mathematical Software. Volume 2

    Science.gov (United States)

    1990-05-01

    the greater is the possibility for performing many operations in parallel. Ercegovac and Watanuki [WE81] have developed an on-line version, using the ... suggestion for a fast multiplier. IEEE Transactions on Electronic Computers, EC-13:14-17, February 1964. [WE81] O. Watanuki and M. D. Ercegovac

  19. Space Software

    Science.gov (United States)

    1990-01-01

    Xontech, Inc.'s software package, XonVu, simulates the missions of Voyager 1 at Jupiter and Saturn, Voyager 2 at Jupiter, Saturn, Uranus and Neptune, and Giotto in close encounter with Comet Halley. With the program, the user can generate scenes of the planets, moons, stars or Halley's nucleus and tail as seen by Giotto, all graphically reproduced with high accuracy in wireframe representation. Program can be used on a wide range of computers, including PCs. User friendly and interactive, with many options, XonVu can be used by a space novice or a professional astronomer. With a companion user's manual, it sells for $79.

  20. Software architecture

    CERN Document Server

    Vogel, Oliver; Chughtai, Arif

    2011-01-01

    As a software architect you work in a wide-ranging and dynamic environment. You have to understand the needs of your customer, design architectures that satisfy both functional and non-functional requirements, and lead development teams in implementing the architecture. And it is an environment that is constantly changing: trends such as cloud computing, service orientation, and model-driven procedures open up new architectural possibilities. This book will help you to develop a holistic architectural awareness and knowledge base that extends beyond concrete methods, techniques, and technologi

  1. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to stipulate the requirements for successful global software engineering so that organisations can ensure its successful implementation.

  2. T2’-Imaging to Assess Cerebral Oxygen Extraction Fraction in Carotid Occlusive Disease: Influence of Cerebral Autoregulation and Cerebral Blood Volume

    Science.gov (United States)

    Deichmann, Ralf; Pfeilschifter, Waltraud; Hattingen, Elke; Singer, Oliver C.; Wagner, Marlies

    2016-01-01

    Purpose Quantitative T2'-mapping detects regional changes in the relation of oxygenated and deoxygenated hemoglobin (Hb) by using their different magnetic properties in gradient echo imaging, and might therefore be a surrogate marker of increased oxygen extraction fraction (OEF) in cerebral hypoperfusion. Since elevations of cerebral blood volume (CBV) with consecutive accumulation of Hb might also increase the fraction of deoxygenated Hb and thereby decrease the T2'-values in these patients, we evaluated the relationship between T2'-values and CBV in patients with unilateral high-grade large-artery stenosis. Materials and Methods Data from 16 patients (13 male, 3 female; mean age 53 years) with unilateral symptomatic or asymptomatic high-grade internal carotid artery (ICA) or middle cerebral artery (MCA) stenosis/occlusion were analyzed. MRI included perfusion-weighted imaging and high-resolution T2'-mapping. Representative relative (r)CBV values were analyzed in areas of decreased T2' with different degrees of perfusion delay and compared to corresponding contralateral areas. Results No significant elevations in cerebral rCBV were detected within areas with significantly decreased T2'-values. In contrast, rCBV was significantly decreased in areas with perfusion delay and decreased T2'. Furthermore, no significant correlation between T2'- and rCBV-values was found. Conclusions rCBV is not significantly increased in areas of decreased T2' or in areas of restricted perfusion in patients with unilateral high-grade stenosis. T2' should therefore only be influenced by changes of oxygen metabolism, in our patient collective especially by an increase of the OEF. T2'-mapping is suitable to detect altered oxygen consumption in chronic cerebrovascular disease. PMID:27560515
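
    The T2' value itself is derived from separately measured T2* and T2 maps via the standard relation 1/T2' = 1/T2* − 1/T2. A minimal sketch, with illustrative relaxation times rather than patient data:

```python
def t2_prime(t2_star_ms, t2_ms):
    """T2' (ms) from measured T2* and T2 via 1/T2' = 1/T2* - 1/T2.
    Deoxygenated Hb shortens T2* relative to T2, so T2' shrinks as
    the deoxy-Hb fraction (and hence OEF) rises."""
    r2_prime = 1.0 / t2_star_ms - 1.0 / t2_ms   # R2' = R2* - R2
    return 1.0 / r2_prime

# Illustrative gray-matter-like values at 3 T (assumed, not from the study)
tp = t2_prime(t2_star_ms=50.0, t2_ms=90.0)
```

    In practice the subtraction is done voxel-wise on the two quantitative maps to produce the T2' map that is then compared with rCBV.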

  3. Visual querying and analysis of large software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    2009-01-01

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on

  4. SOFTWARE METRICS VALIDATION METHODOLOGIES IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    K.P. Srinivasan

    2014-12-01

    Full Text Available In software measurement, validating software metrics is a very difficult task due to the lack of both theoretical and empirical methodology [41, 44, 45]. In recent years, a number of researchers have addressed the issue of validating software metrics. At present, software metrics are validated theoretically using properties of measures. Software measurement plays an important role in understanding and controlling software development practices and products. The major requirement in software measurement is that the measures must accurately represent the attributes they purport to quantify, and validation is critical to the success of software measurement. Validation is a collection of analysis and testing activities across the full life cycle; it complements the efforts of other quality engineering functions and is a critical task in any engineering project. Its objective is to discover defects in a system and to assess whether or not the system is useful and usable in an operational situation. In software engineering, validation is one of the disciplines that help build quality into software. The major objective of the software validation process is to determine that the software performs its intended functions correctly and to provide information about its quality and reliability. This paper discusses the validation methodology, techniques and different properties of measures that are used for software metrics validation. In most cases, theoretical and empirical validations are conducted for software metrics validation in software engineering [1-50].
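
    An empirical validation of the kind surveyed here typically correlates a candidate metric with an external quality attribute; a minimal sketch with hypothetical module data (the complexity/defect pairing and all values are assumptions for illustration):

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between a candidate metric
    and an externally observed quality attribute."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-module data: a complexity metric vs. defects found in test
complexity = [3, 5, 8, 12, 20, 25]
defects    = [0, 1, 1, 3, 5, 6]
r = pearson(complexity, defects)
```

    A strong, repeatable correlation is evidence (not proof) that the metric tracks the attribute it purports to quantify; theoretical validation against properties of measures is still needed alongside it.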

  5. Software not as a service

    Science.gov (United States)

    Teal, Tracy

    2017-01-01

    With the expansion in the variety, velocity and volume of data being produced, computing and software development have become a crucial element of astronomy research. However, while we value the research, we place less importance on the development of the software itself, viewing software as a service to research. By viewing software as a service, we undervalue the effort and expertise it takes to produce it, and the training required, for effective research computing. We also don't provide support for the people doing the development, often expecting individual developers to provide systems administration, user support and training and to produce documentation and user interfaces. With our increased reliance on research computing, accurate and reproducible research requires that software not be separate from the act of conducting research, but an integral component: a part of, rather than a service to, research. Shifts in how we provide data skills and software development training, integrate development into research programs and academic departments, and value software as a product can have an impact on the quality, creativity and types of research we can conduct.

  6. Rapid and Semi-Automated Extraction of Neuronal Cell Bodies and Nuclei from Electron Microscopy Image Stacks

    Science.gov (United States)

    Holcomb, Paul S.; Morehead, Michael; Doretto, Gianfranco; Chen, Peter; Berg, Stuart; Plaza, Stephen; Spirou, George

    2016-01-01

    Connectomics—the study of how neurons wire together in the brain—is at the forefront of modern neuroscience research. However, many connectomics studies are limited by the time and precision needed to correctly segment large volumes of electron microscopy (EM) image data. We present here a semi-automated segmentation pipeline using freely available software that can significantly decrease segmentation time for extracting both nuclei and cell bodies from EM image volumes. PMID:27259933

  7. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  8. Lessons from 30 Years of Flight Software

    Science.gov (United States)

    McComas, David C.

    2015-01-01

    This presentation takes a brief historical look at flight software over the past 30 years, extracts lessons learned and shows how many of the lessons learned are embodied in the Flight Software product line called the core Flight System (cFS). It also captures the lessons learned from developing and applying the cFS.

  9. Advances in software science and technology

    CERN Document Server

    Hikita, Teruo; Kakuda, Hiroyasu

    1993-01-01

    Advances in Software Science and Technology, Volume 4 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems.Organized into two parts encompassing 10 chapters, this volume begins with an overview of the historical survey of programming languages for vector/parallel computers in Japan and describes compiling methods for supercomputers in Japan. This text then explains the model of a Japanese software factory, which is presented by the logical configuration that has been satisfied by

  10. Software Metrics to Estimate Software Quality using Software Component Reusability

    Directory of Open Access Journals (Sweden)

    Prakriti Trivedi

    2012-03-01

    Full Text Available Today most applications are developed using existing libraries, code, open-source projects, etc. A piece of code accessed in a program is represented as a software component; Java beans and .NET ActiveX controls, for example, are software components. These components are ready-to-use programming code or controls that accelerate code development. A component-based software system builds on the concept of software reusability. When using such components, the main question that arises is whether using a given component is beneficial or not. In this proposed work we try to answer that question. We present a set of software metrics that assess the interconnection between a software component and the application; the strength of this relation indicates the software quality after the component is used. The overall metrics return the final result in terms of how tightly the component is bound to the application.
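
    One way to sketch an interconnection metric of this kind is as the fraction of a component's public API that the application actually uses; the metric definition and all names below are illustrative assumptions, not the authors' formulation:

```python
def coupling_ratio(component_api, application_calls):
    """Fraction of the component's public API actually invoked by the
    application. A high ratio means the application is tightly bound
    to the component; a low ratio suggests a loose dependence."""
    used = component_api & application_calls
    return len(used) / len(component_api)

# Hypothetical component API and the calls an application makes
api = {"open", "read", "write", "close", "seek", "flush"}
calls = {"open", "read", "close", "log"}   # 'log' is not part of the component
ratio = coupling_ratio(api, calls)
```

    In a fuller metric suite this would be combined with measures such as call frequency and shared data structures before judging whether reusing the component is beneficial.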

  11. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  12. Reaction Wheel Disturbance Model Extraction Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Reaction wheel mechanical noise is one of the largest sources of disturbance forcing on space-based observatories. Such noise arises from mass imbalance, bearing...

  13. Reaction Wheel Disturbance Model Extraction Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Reaction wheel disturbances are some of the largest sources of noise on sensitive telescopes. Such wheel-induced mechanical noises are not well characterized....

  14. Software and Network Engineering

    CERN Document Server

    2012-01-01

    The series "Studies in Computational Intelligence" (SCI) publishes new developments and advances in the various areas of computational intelligence – quickly and with a high quality. The intent is to cover the theory, applications, and design methods of computational intelligence, as embedded in the fields of engineering, computer science, physics and life science, as well as the methodologies behind them. The series contains monographs, lecture notes and edited volumes in computational intelligence spanning the areas of neural networks, connectionist systems, genetic algorithms, evolutionary computation, artificial intelligence, cellular automata, self-organizing systems, soft computing, fuzzy systems, and hybrid intelligent systems. Critical to both contributors and readers are the short publication time and world-wide distribution - this permits a rapid and broad dissemination of research results.   The purpose of the first ACIS International Symposium on Software and Network Engineering held on Decembe...

  15. Extraction and processing of circulating DNA from large sample volumes using methylation on beads for the detection of rare epigenetic events.

    Science.gov (United States)

    Keeley, Brian; Stark, Alejandro; Pisanic, Thomas R; Kwak, Ruby; Zhang, Yi; Wrangle, John; Baylin, Stephen; Herman, James; Ahuja, Nita; Brock, Malcolm V; Wang, Tza-Huei

    2013-10-21

    The use of methylated tumor-specific circulating DNA has shown great promise as a potential cancer biomarker. Nonetheless, the relative scarcity of tumor-specific circulating DNA presents a challenge for traditional DNA extraction and processing techniques. Here we demonstrate a single tube extraction and processing technique dubbed "methylation on beads" that allows for DNA extraction and bisulfite conversion for up to 2 ml of plasma or serum. In comparison to traditional techniques including phenol chloroform and alcohol extraction, methylation on beads yields a 1.5- to 5-fold improvement in extraction efficiency. The technique results in far less carryover of PCR inhibitors yielding analytical sensitivity improvements of over 25-fold. The combination of improved recovery and sensitivity make possible the detection of rare epigenetic events and the development of high sensitivity epigenetic diagnostic assays. © 2013 Elsevier B.V. All rights reserved.

  16. Amalgamation of Personal Software Process in Software ...

    African Journals Online (AJOL)

    evolutionary series of personal software engineering techniques that an engineer learns and ... began to realize that software process, plans and methodologies for ..... Executive Strategy. Addison-Wesley ...

  17. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  18. Ontologies for software engineering and software technology

    CERN Document Server

    Calero, Coral; Piattini, Mario

    2006-01-01

    Covers two applications of ontologies in software engineering and software technology: sharing knowledge of the problem domain and using a common terminology among all stakeholders; and filtering the knowledge when defining models and metamodels. This book is of benefit to software engineering researchers in both academia and industry.

  19. Software Vulnerability Taxonomy Consolidation

    Energy Technology Data Exchange (ETDEWEB)

    Polepeddi, Sriram S. [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    2004-12-07

    In today's environment, computers and networks are increasingly exposed to a number of software vulnerabilities. Information about these vulnerabilities is collected and disseminated via various large publicly available databases such as BugTraq, OSVDB and ICAT. None of these databases, individually, covers all aspects of a vulnerability, and they lack a standard format among them, making it difficult for end-users to easily compare various vulnerabilities. A central database of vulnerabilities has not been available until today for a number of reasons, such as the non-uniform methods by which current vulnerability database providers receive information, disagreement over which features of a particular vulnerability are important and how best to present them, and the non-utility of the information presented in many databases. The goal of this software vulnerability taxonomy consolidation project is to address the need for a universally accepted vulnerability taxonomy that classifies vulnerabilities in an unambiguous manner. A consolidated vulnerability database (CVDB) was implemented that coalesces and organizes vulnerability data from disparate data sources. Based on the work done in this paper, there is strong evidence that a consolidated taxonomy encompassing and organizing all relevant data can be achieved. However, three primary obstacles remain: the lack of a common "primary key" for referencing, unstructured and free-form descriptions of necessary vulnerability data, and the lack of data on all aspects of a vulnerability. This work has only considered data that can be unambiguously extracted from various data sources by straightforward parsers. It is felt that even with the use of more advanced information-mining tools, which can wade through the sea of unstructured vulnerability data, this current integration methodology would still provide repeatable, unambiguous, and exhaustive results. Though the goal of coalescing all available data
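
    The coalescing step can be sketched as a merge of per-source records keyed on a shared identifier; the record layout, precedence rule, and the `consolidate` helper are illustrative assumptions, not the CVDB's actual schema:

```python
def consolidate(*sources):
    """Coalesce vulnerability records from several databases into one
    record per key. Each source maps a shared identifier (e.g. a CVE id)
    to a dict of fields; later sources only fill in fields the earlier
    ones lack, so source order expresses trust precedence."""
    merged = {}
    for source in sources:
        for vuln_id, fields in source.items():
            record = merged.setdefault(vuln_id, {})
            for field, value in fields.items():
                record.setdefault(field, value)
    return merged

# Hypothetical records from two databases describing the same vulnerability
bugtraq = {"CVE-2003-0001": {"title": "TCP/IP info leak", "severity": "medium"}}
icat    = {"CVE-2003-0001": {"severity": "low", "cvss": 5.0},
           "CVE-2003-0002": {"title": "Other flaw"}}
db = consolidate(bugtraq, icat)
```

    The sketch also makes the paper's first obstacle concrete: it only works when every source can be parsed down to the same "primary key".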

  20. Controlling Software Piracy.

    Science.gov (United States)

    King, Albert S.

    1992-01-01

    Explains what software manufacturers are doing to combat software piracy, recommends how managers should deal with this problem, and provides a role-playing exercise to help students understand the issues in software piracy. (SR)

  1. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  2. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  3. Software and systems traceability

    CERN Document Server

    Cleland-Huang, Jane; Zisman, Andrea

    2012-01-01

    "Software and Systems Traceability" provides a comprehensive description of the practices and theories of software traceability across all phases of the software development lifecycle. The term software traceability is derived from the concept of requirements traceability. Requirements traceability is the ability to track a requirement all the way from its origins to the downstream work products that implement that requirement in a software system. Software traceability is defined as the ability to relate the various types of software artefacts created during the development of software syst

  4. Maximizing ROI on software development

    CERN Document Server

    Sikka, Vijay

    2004-01-01

    A brief review of software development history. Software complexity crisis. Software development ROI. The case for global software development and testing. Software quality and test ROI. How do you implement global software development and testing. Case studies.

  5. Technology Foundations for Computational Evaluation of Software Security Attributes

    Science.gov (United States)

    2006-12-01

    Technology Foundations for Computational Evaluation of Software Security Attributes. Gwendolyn H. Walton, Thomas A. Longstaff, Richard C. ... security attributes to the functional behavior of the software. The emergence of CERT's new function extraction (FX) technology, unavailable to previous ... software meets security requirements if they have been specified in behavioral terms. FX technology prescribes effective means to create and record

  6. Computational intelligence and quantitative software engineering

    CERN Document Server

    Succi, Giancarlo; Sillitti, Alberto

    2016-01-01

    In a down-to-earth manner, the volume lucidly presents how the fundamental concepts, methodology, and algorithms of Computational Intelligence are efficiently exploited in Software Engineering, and opens up a novel and promising avenue for comprehensive analysis and advanced design of software artifacts. It shows how the paradigm and the best practices of Computational Intelligence can be creatively explored to carry out comprehensive software requirement analysis and to support design, testing, and maintenance. Software Engineering is an intensive knowledge-based endeavor of inherently human-centric nature, which profoundly relies on acquiring semiformal knowledge and then processing it to produce a running system. The knowledge spans a wide variety of artifacts, from requirements, captured in the interaction with customers, to design practices, testing, and code management strategies, which rely on the knowledge of the running system. This volume consists of contributions written by widely acknowledged experts ...

  7. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  8. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    Full Text Available At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  10. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  11. Software distribution using xnetlib

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science]|[Oak Ridge National Lab., TN (US); Rowan, T.H. [Oak Ridge National Lab., TN (US); Wade, R.C. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.

  12. Image Processing Software

    Science.gov (United States)

    Bosio, M. A.

    1990-11-01

    ABSTRACT: A brief description of astronomical image processing software is presented. This software was developed on a Digital Micro VAX II computer system. DATA ANALYSIS - IMAGE PROCESSING

  14. Software productivity improvement through software engineering technology

    Science.gov (United States)

    Mcgarry, F. E.

    1985-01-01

    It has been estimated that NASA expends anywhere from 6 to 10 percent of its annual budget on the acquisition, implementation and maintenance of computer software. Although researchers have produced numerous software engineering approaches over the past 5-10 years, each claiming to be more effective than the others, there is very limited quantitative information verifying the measurable impact that any of these technologies may have in a production environment. At NASA/GSFC, an extended research effort aimed at identifying and measuring software techniques that favorably impact the productivity of software development has been active over the past 8 years. Specific, measurable software development technologies have been applied and measured in a production environment. The resulting software development approaches have been shown to be effective in improving both quality and productivity in this one environment.

  15. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis, shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  16. Software Engineering for Practiced Software Enhancement

    Directory of Open Access Journals (Sweden)

    Rashmi Yadav

    2011-03-01

    Full Text Available The software development scenario, particularly in the IT industry, is highly competitive and demands development with minimum resources. Software development started, and prevailed to an extent in industry, without the use of software engineering practices, which were perceived as overhead. This approach causes overuse of resources such as money, man-hours and hardware components. This paper attempts to present the causes of these inefficiencies in an almost exhaustive way. Further, an attempt has been made to elaborate software engineering methods as remedies against the listed causes of development inefficiency.

  17. Software Metrics for Identifying Software Size in Software Development Projects

    Directory of Open Access Journals (Sweden)

    V.S.P Vidanapathirana

    2015-11-01

    Full Text Available Measurements are fundamental to any engineering discipline. They indicate the amount, extent, dimension or capacity of an attribute or a product in a quantitative manner. The analyzed results of the measured data form the basis of metrics: a metric is a quantitative representation of the degree to which a system, component, or process possesses a given attribute. When it comes to software, metrics span a wide scope of measurements of computer programming. Size-oriented metrics play a central role, since they can be used as the key to better estimation, to improved trust and confidence, and to better control over software products. Software professionals have traditionally measured the size of software applications using several methods. In this paper the researchers discuss software size metrics for identifying software size, focusing mainly on software development projects in today's Information Technology (IT) industry.
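
    A size-oriented metric as simple as physical source lines of code (SLOC) can be computed directly; a minimal sketch, in which the comment-handling rule is a deliberate simplification of what real SLOC counters do:

```python
def sloc(source_text):
    """Count physical source lines of code: non-blank lines that are
    not pure '#' comments. Inline trailing comments still count,
    which is one of several conventions a real counter must pin down."""
    count = 0
    for line in source_text.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("#"):
            count += 1
    return count

sample = """# module header
def add(a, b):
    return a + b

# helper
def sub(a, b):
    return a - b
"""
size = sloc(sample)
```

    The point of pinning down such conventions is comparability: two teams reporting "size" must count blank lines, comments and generated code the same way before their numbers can feed an estimation model.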

  18. Software Cost Estimation Review

    OpenAIRE

    Ongere, Alphonce

    2013-01-01

    Software cost estimation is the process of predicting the effort, the time, and the cost required to complete a software project successfully. It involves measuring the size of the software project to be produced, estimating and allocating the effort, drawing up the project schedule, and finally estimating the overall cost of the project. Accurate estimation of software project cost is an important factor for business and for the welfare of the software organization in general. If cost and effort estimat...
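The size-to-effort step the abstract describes can be sketched with the classic Basic COCOMO model (Boehm, 1981) for an "organic" project. The constants below are COCOMO's published ones, not the author's; a real estimate would also cover effort allocation, scheduling, and cost conversion.

```python
# Basic COCOMO, organic mode (Boehm, 1981): effort and schedule from size.
# A sketch of one well-known size-driven estimation model.

def basic_cocomo_organic(kloc: float) -> tuple[float, float]:
    """Return (effort in person-months, schedule in elapsed months)."""
    effort = 2.4 * kloc ** 1.05        # person-months
    schedule = 2.5 * effort ** 0.38    # elapsed months
    return effort, schedule

effort, months = basic_cocomo_organic(32.0)   # e.g. a 32 KLOC project
```

The exponents above 1.0 encode the observation that effort grows slightly faster than size, which is why accurate size measurement matters so much to cost estimation.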

  19. Software Partitioning Technologies

    Science.gov (United States)

    2001-05-29

    Software Partitioning Technologies. Tim Skutt, Smiths Aerospace, 3290 Patterson Ave. SE, Grand Rapids, MI 49512-1991, (616) 241-8645. 12 pages. Agenda: software partitioning overview; Smiths software partitioning technology (partition-level OS, core module-level OS, timers, MMU, I/O, API-layer partitioning services).

  20. A novel ionic liquid/micro-volume back extraction procedure combined with flame atomic absorption spectrometry for determination of trace nickel in samples of nutritional interest

    Energy Technology Data Exchange (ETDEWEB)

    Dadfarnia, Shayesteh, E-mail: sdadfarnia@yazduni.ac.ir [Department of Chemistry, Faculty of Science, Yazd University, 89195/74, Yazd (Iran, Islamic Republic of); Haji Shabani, Ali Mohammad; Shirani Bidabadi, Mahboubeh; Jafari, Abbas Ali [Department of Chemistry, Faculty of Science, Yazd University, 89195/74, Yazd (Iran, Islamic Republic of)

    2010-01-15

    A simple, highly sensitive, and environmentally friendly method for the determination of trace amounts of nickel ions in different matrices is proposed. In the preconcentration step, nickel from 10 mL of an aqueous solution was extracted into 500 µL of the ionic liquid 1-butyl-3-methylimidazolium hexafluorophosphate [C4MIM][PF6] containing PAN as complexing agent. Subsequently, the PAN complex was back-extracted into 250 µL of nitric acid solution, and 100 µL of this was analyzed by flow injection flame atomic absorption spectrometry (FI-FAAS). The main parameters influencing the extraction and determination of nickel, such as pH, concentration of PAN, extraction time and temperature, ionic strength, and concentration of the stripping acid solution, were optimized. An enhancement factor of 40.2 was achieved with a 25 mL sample. The limits of detection (LOD) and quantification obtained under the optimum conditions were 12.5 and 41.0 µg L⁻¹, respectively. To validate the proposed method, two certified reference materials, 681-I and BCR No. 288, were analyzed and the results were in good agreement with the certified values. The proposed method was successfully applied to the determination of nickel in water samples, rice flour, and black tea.

  1. A novel ionic liquid/micro-volume back extraction procedure combined with flame atomic absorption spectrometry for determination of trace nickel in samples of nutritional interest.

    Science.gov (United States)

    Dadfarnia, Shayesteh; Shabani, Ali Mohammad Haji; Bidabadi, Mahboubeh Shirani; Jafari, Abbas Ali

    2010-01-15

    A simple, highly sensitive, and environmentally friendly method for the determination of trace amounts of nickel ions in different matrices is proposed. In the preconcentration step, nickel from 10 mL of an aqueous solution was extracted into 500 µL of the ionic liquid 1-butyl-3-methylimidazolium hexafluorophosphate [C(4)MIM][PF(6)] containing PAN as complexing agent. Subsequently, the PAN complex was back-extracted into 250 µL of nitric acid solution, and 100 µL of this was analyzed by flow injection flame atomic absorption spectrometry (FI-FAAS). The main parameters influencing the extraction and determination of nickel, such as pH, concentration of PAN, extraction time and temperature, ionic strength, and concentration of the stripping acid solution, were optimized. An enhancement factor of 40.2 was achieved with a 25 mL sample. The limits of detection (LOD) and quantification obtained under the optimum conditions were 12.5 and 41.0 µg L⁻¹, respectively. To validate the proposed method, two certified reference materials, 681-I and BCR No. 288, were analyzed and the results were in good agreement with the certified values. The proposed method was successfully applied to the determination of nickel in water samples, rice flour, and black tea.
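The preconcentration arithmetic behind such back-extraction methods can be sketched as follows: the theoretical enrichment is set by the ratio of sample volume to the final acceptor (stripping acid) volume, while the measured enhancement factor additionally folds in extraction recovery and any matrix effect on sensitivity. This is a general-purpose sketch, not the authors' calculation.

```python
# Theoretical volume-ratio enrichment for a liquid-liquid back extraction.
# The measured enhancement factor also depends on recovery and sensitivity,
# which are not modelled here.

def theoretical_enrichment(sample_volume_ml: float, acceptor_volume_ul: float) -> float:
    """Enrichment factor = sample volume / final acceptor volume."""
    return sample_volume_ml * 1000.0 / acceptor_volume_ul

# 10 mL of aqueous sample stripped into 250 uL of nitric acid, as in the abstract
ef = theoretical_enrichment(10.0, 250.0)   # -> 40.0
```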

  2. International Conference on Harmonisation; Guidance on Q4B Evaluation and Recommendation of Pharmacopoeial Texts for Use in the International Conference on Harmonisation Regions; Annex on Test for Extractable Volume of Parenteral Preparations General Chapter; availability. Notice.

    Science.gov (United States)

    2009-01-09

    The Food and Drug Administration (FDA) is announcing the availability of a guidance entitled "Q4B Evaluation and Recommendation of Pharmacopoeial Texts for Use in the ICH Regions; Annex 2: Test for Extractable Volume of Parenteral Preparations General Chapter." The guidance was prepared under the auspices of the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH). The guidance provides the results of the ICH Q4B evaluation of the Test for Extractable Volume of Parenteral Preparations General Chapter harmonized text from each of the three pharmacopoeias (United States, European, and Japanese) represented by the Pharmacopoeial Discussion Group (PDG). The guidance conveys recognition of the three pharmacopoeial methods by the three ICH regulatory regions and provides specific information regarding the recognition. The guidance is intended to recognize the interchangeability between the local regional pharmacopoeias, thus avoiding redundant testing in favor of a common testing strategy in each regulatory region. In the Federal Register of February 21, 2008 (73 FR 9575), FDA made available a guidance on the Q4B process entitled "Q4B Evaluation and Recommendation of Pharmacopoeial Texts for Use in the ICH Regions."

  3. Advances in software science and technology

    CERN Document Server

    Ohno, Yoshio; Kamimura, Tsutomu

    1991-01-01

    Advances in Software Science and Technology, Volume 2 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems.Organized into four parts encompassing 12 chapters, this volume begins with an overview of categorical frameworks that are widely used to represent data types in computer science. This text then provides an algorithm for generating vertices of a smoothed polygonal line from the vertices of a digital curve or polygonal curve whose position contains a certain amount of error. O

  4. Japan society for software science and technology

    CERN Document Server

    Nakajima, Reiji; Hagino, Tatsuya

    1990-01-01

    Advances in Software Science and Technology, Volume 1 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems.Organized into three parts encompassing 13 chapters, this volume begins with an overview of the phase structure grammar for Japanese called JPSG, and a parser based on this grammar. This text then explores the logic-based knowledge representation called Uranus, which uses a multiple world mechanism. Other chapters consider the optimal file segmentation techniques for multi-at

  5. Advances in software science and technology

    CERN Document Server

    Kakuda, Hiroyasu; Ohno, Yoshio

    1992-01-01

    Advances in Software Science and Technology, Volume 3 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems.Organized into two parts encompassing 11 chapters, this volume begins with an overview of the development of a system of writing tools called SUIKOU that analyzes a machine-readable Japanese document textually. This text then presents the conditioned attribute grammars (CAGs) and a system for evaluating them that can be applied to natural-language processing. Other chapters c

  6. Advances in software science and technology

    CERN Document Server

    Kamimura, Tsutomu

    1994-01-01

    This serial is a translation of the original works within the Japan Society of Software Science and Technology. A key source of information for computer scientists in the U.S., the serial explores the major areas of research in software and technology in Japan. These volumes are intended to promote worldwide exchange of ideas among professionals.This volume includes original research contributions in such areas as Augmented Language Logic (ALL), distributed C language, Smalltalk 80, and TAMPOPO, an evolutionary learning machine based on the principles of Realtime Minimum Skyline Detection.

  7. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  8. Payload software technology: Software technology development plan

    Science.gov (United States)

    1977-01-01

    Programmatic requirements for the advancement of software technology are identified for meeting the space flight requirements in the 1980 to 1990 time period. The development items are described, and software technology item derivation worksheets are presented along with the cost/time/priority assessments.

  9. Turning the volume down on heavy metals using tuned diatomite. A review of diatomite and modified diatomite for the extraction of heavy metals from water.

    Science.gov (United States)

    Danil de Namor, Angela F; El Gamouz, Abdelaziz; Frangie, Sofia; Martinez, Vanina; Valiente, Liliana; Webb, Oliver A

    2012-11-30

    Contamination of water by heavy metals is a global problem to which an inexpensive and simple solution is required. Within this context, the unique properties of diatomite and its abundance in many regions of the world have led to the current widespread interest in this material for water purification purposes. Articles published on the use of raw and modified diatomite for the removal of heavy metal pollutants from water are critically reviewed in defined sections. The capabilities of the materials as extracting agents for individual species and mixtures of heavy metals are considered in terms of kinetics, thermodynamics, and recyclability, for both the pollutant and the extracting material. The concept of 'selectivity', for the enrichment of naturally occurring materials such as diatomite through the introduction of suitable functionalities in their structure to target a given pollutant, is emphasised. Suggestions for further research in this area are given. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. Turning the volume down on heavy metals using tuned diatomite. A review of diatomite and modified diatomite for the extraction of heavy metals from water

    Energy Technology Data Exchange (ETDEWEB)

    Danil de Namor, Angela F., E-mail: A.Danil-De-Namor@surrey.ac.uk [Instituto Nacional de Tecnologia Industrial, Parque Tecnologico Industrial Miguelete, Buenos Aires (Argentina); Department of Chemistry, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom); El Gamouz, Abdelaziz [Department of Chemistry, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom); Frangie, Sofia; Martinez, Vanina; Valiente, Liliana [Instituto Nacional de Tecnologia Industrial, Parque Tecnologico Industrial Miguelete, Buenos Aires (Argentina); Webb, Oliver A. [Department of Chemistry, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom)

    2012-11-30

    Highlights: • Critical assessment of published work on raw and modified diatomites. • Counter-ion effect on the extraction of heavy metal species by diatomite. • Selection of the counter-ion by the use of existing thermodynamic data. • Enrichment of diatomites by attaching heavy-metal-selective functionalities. • Supramolecular chemistry for conferring selectivity to diatomites. - Abstract: Contamination of water by heavy metals is a global problem to which an inexpensive and simple solution is required. Within this context, the unique properties of diatomite and its abundance in many regions of the world have led to the current widespread interest in this material for water purification purposes. Articles published on the use of raw and modified diatomite for the removal of heavy metal pollutants from water are critically reviewed in defined sections. The capabilities of the materials as extracting agents for individual species and mixtures of heavy metals are considered in terms of kinetics, thermodynamics, and recyclability, for both the pollutant and the extracting material. The concept of 'selectivity', for the enrichment of naturally occurring materials such as diatomite through the introduction of suitable functionalities in their structure to target a given pollutant, is emphasised. Suggestions for further research in this area are given.

  11. Optimisation of a selective method for the determination of organophosphorous triesters in outdoor particulate samples by pressurised liquid extraction and large-volume injection gas chromatography-positive chemical ionisation-tandem mass spectrometry.

    Science.gov (United States)

    Quintana, José Benito; Rodil, Rosario; López-Mahía, Purificación; Muniategui-Lorenzo, Soledad; Prada-Rodríguez, Darío

    2007-07-01

    A selective analytical method for the determination of nine organophosphate triesters and triphenylphosphine oxide (TPPO) in outdoor particulate matter is presented. It involves a fully automated pressurised liquid extraction (PLE) step integrating an alumina clean-up process, with subsequent determination by large-volume injection gas chromatography-positive chemical ionisation-tandem mass spectrometry (LVI-GC-PCI-MS/MS). The extraction variables (solvent, amount of adsorbent, temperature, time, and number of cycles) were optimised using a multicriteria strategy implementing a desirability function that maximises both extraction and clean-up efficiencies while searching for the best-compromise PLE conditions. The final method affords quantification limits of between 0.01 and 0.3 µg g⁻¹ and recoveries of >80%, with the exception of the most polar analytes, TCEP and TPPO (~65%), for both urban dust and PM10 samples. Moreover, the method permitted the levels of these compounds in dust deposited outdoors (between the LOD and 0.5 µg g⁻¹, for TEHP) and in PM10 samples (between the LOD and 2.4 µg m⁻³, for TiBP) to be measured and reported for the first time.
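The multicriteria strategy the authors describe can be sketched with a Derringer-type desirability function: each response (here, extraction and clean-up efficiency) is mapped onto a desirability between 0 and 1, and the overall desirability, their geometric mean, is what the optimiser maximises over the PLE conditions. The targets and response values below are hypothetical, not the paper's data.

```python
import math

# Derringer-type desirability optimisation, sketched with made-up responses.

def desirability_maximise(y: float, low: float, high: float, weight: float = 1.0) -> float:
    """'Larger is better' desirability: 0 below `low`, 1 above `high`."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return ((y - low) / (high - low)) ** weight

def overall_desirability(ds: list[float]) -> float:
    """Geometric mean of the individual desirabilities."""
    if any(d == 0.0 for d in ds):
        return 0.0
    return math.exp(sum(math.log(d) for d in ds) / len(ds))

d_extraction = desirability_maximise(85.0, low=60.0, high=100.0)  # % recovery (hypothetical)
d_cleanup = desirability_maximise(90.0, low=50.0, high=100.0)     # % matrix removed (hypothetical)
D = overall_desirability([d_extraction, d_cleanup])
```

The geometric mean is the usual choice because a single unacceptable response (d = 0) forces the overall desirability to zero, which is exactly the "best-compromise" behaviour wanted here.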

  12. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets......, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation...

  13. Optimization of ultrasonic extraction of 23 elements from cotton.

    Science.gov (United States)

    Rezić, I

    2009-01-01

    Optimization of the ultrasonic extraction of 23 elements from cotton was performed with different solvent volume ratios. For this purpose, nitric acid, hydrochloric acid, and water were mixed and applied as a mixture for the extraction of elements adsorbed on cotton material. The elements chosen for the extraction procedure (Al, As, Be, Bi, Ca, Cd, Co, Cr, Cu, Fe, Hg, K, Mg, Mn, Mo, Na, Ni, Pb, Sb, Si, Sn, Tl, and Zn) were those that are important in textile processing; some of them cause problems during fiber processing, dyeing, or bleaching. The removal of elements from the processed fabric can be done successfully by ultrasonic extraction in an ultrasonic bath. The extraction procedure was optimized with the software package Design Expert 6 (DX6), and the optimum ultrasonic extraction was achieved with the mixture 1M HCl-1M HNO3-H2O = 3.32/2.83/93.85 (v/v). Ultrasonic extraction proved a fast and efficient extraction procedure, easily applied to cotton textile material.
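In a mixture design of the kind run in Design Expert, the component proportions are constrained to sum to 100 % (v/v); any candidate solvent blend the optimiser proposes must satisfy that constraint. A minimal sketch, checking the reported optimum against it (the helper below is illustrative, not part of the DX6 workflow):

```python
# Mixture-design constraint check: volume fractions of a solvent blend
# must sum to 100 % (v/v). `is_valid_mixture` is a hypothetical helper.

def is_valid_mixture(parts: dict[str, float], tol: float = 0.01) -> bool:
    """True if the volume fractions form a valid mixture (sum to 100 %)."""
    return abs(sum(parts.values()) - 100.0) <= tol

# The optimum reported in the abstract, in % v/v
optimum = {"1M HCl": 3.32, "1M HNO3": 2.83, "H2O": 93.85}
ok = is_valid_mixture(optimum)   # -> True
```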

  14. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets......, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation......, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  15. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  16. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square-feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  17. ATLAS software packaging

    CERN Document Server

    Rybkin, G

    2012-01-01

    Software packaging is an indispensable part of the build process and a prerequisite for deployment. The full ATLAS software stack consists of the TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present the tools, in the package PackDist, developed and used to package all this software except for the TDAQ project. PackDist is based on and driven by CMT, the ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is the CMT project. Each CMT project is packaged as several packages: platform-dependent (one per available platform), source code excluding header files, other platform-independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, the distribution kit, also includes configuration packages and contains some 120 packages for one platform. Also packaged are physics analysis pro...
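The recursive packaging mentioned in the abstract amounts to walking the project dependency graph and packaging every dependency before its dependents. A sketch with a hypothetical graph (the project names are illustrative, not the actual CMT project layout):

```python
# Recursive packaging order via a post-order walk of the dependency graph:
# every dependency appears in the list before the projects that need it.

def package_order(project: str, deps: dict[str, list[str]]) -> list[str]:
    """Return a packaging order with dependencies before dependents."""
    order, seen = [], set()

    def visit(p: str) -> None:
        if p in seen:
            return
        seen.add(p)
        for d in deps.get(p, []):
            visit(d)
        order.append(p)

    visit(project)
    return order

# Hypothetical project graph, for illustration only
deps = {
    "Offline": ["Core", "Externals"],
    "Core": ["Externals"],
    "Externals": [],
}
print(package_order("Offline", deps))   # ['Externals', 'Core', 'Offline']
```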

  18. Commercial Data Mining Software

    Science.gov (United States)

    Zhang, Qingyu; Segall, Richard S.

    This chapter discusses selected commercial software for data mining, supercomputing data mining, text mining, and web mining. The selected software packages are compared by their features and are also applied to available data sets. The software for data mining are SAS Enterprise Miner, Megaputer PolyAnalyst 5.0, PASW (formerly SPSS Clementine), IBM Intelligent Miner, and BioDiscovery GeneSight. The software for supercomputing are Avizo by Visualization Science Group and JMP Genomics from SAS Institute. The software for text mining are SAS Text Miner and Megaputer PolyAnalyst 5.0. The software for web mining are Megaputer PolyAnalyst and SPSS Clementine. Background on related literature and software is presented. Screen shots of each of the selected software packages are presented, as are conclusions and future directions.

  19. Building Software with Gradle

    CERN Document Server

    CERN. Geneva; Studer, Etienne

    2014-01-01

    In this presentation, we will give an overview of the key concepts and main features of Gradle, the innovative build system that has become the de-facto standard in the enterprise. We will cover task declaration and task graph execution, incremental builds, multi-project builds, dependency management, applying plugins, extracting reusable build logic, bootstrapping a build, and using the Gradle daemon. By the end of this talk, you will have a good understanding of what makes Gradle so powerful yet easy to use. You will also understand why companies like Pivotal, LinkedIn, Google, and other giants with complex builds count on Gradle. About the speakers Etienne is leading the Tooling Team at Gradleware. He has been working as a developer, architect, project manager, and CTO over the past 15 years. He has spent most of his time building software products from the ground up and successfully shipping them to happy customers. He had ...

  20. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  1. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic...... impact in software production. As is also apparent from the call for papers these perspectives focus on qualities such as reuse, adaptability, and maintainability....

  2. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concerns in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and qualit

  3. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic...... impact in software production. As is also apparent from the call for papers these perspectives focus on qualities such as reuse, adaptability, and maintainability....

  4. Software engineering measurement

    CERN Document Server

    Munson, PhD, John C

    2003-01-01

    By demonstrating how to develop simple experiments for the empirical validation of theoretical research and showing how to convert measurement data into meaningful and valuable information, this text fosters more precise use of software measurement in the computer science and software engineering literature. Software Engineering Measurement shows you how to convert your measurement data to valuable information that can be used immediately for software process improvement.

  5. Software variability management

    NARCIS (Netherlands)

    Bosch, J; Nord, RL

    2004-01-01

    During recent years, the amount of variability that has to be supported by a software artefact is growing considerably and its management is evolving into a major challenge during development, usage, and evolution of software artefacts. Successful management of variability in software leads to

  6. Software Language Evolution

    NARCIS (Netherlands)

    Vermolen, S.D.

    2012-01-01

    Software plays a critical role in our daily life. Vast amounts of money are spent on more and more complex systems. All software, regardless if it controls a plane or the game on your phone is never finished. Software changes when it contains bugs or when new functionality is added. This process of

  7. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  8. Java for flight software

    Science.gov (United States)

    Benowitz, E.; Niessner, A.

    2003-01-01

    This work involves developing representative mission-critical spacecraft software using the Real-Time Specification for Java (RTSJ). It currently leverages actual flight software from NASA's Deep Space 1 (DS1) mission, which flew in 1998.

  9. Software Language Evolution

    NARCIS (Netherlands)

    Vermolen, S.D.

    2012-01-01

    Software plays a critical role in our daily life. Vast amounts of money are spent on more and more complex systems. All software, regardless if it controls a plane or the game on your phone is never finished. Software changes when it contains bugs or when new functionality is added. This process of

  10. Software Engineering for Portability.

    Science.gov (United States)

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programing; and suggestions for further research and…

  11. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  12. Software Maintenance Success Recipes

    CERN Document Server

    Reifer, Donald J

    2011-01-01

    Dispelling much of the folklore surrounding software maintenance, Software Maintenance Success Recipes identifies actionable formulas for success based on in-depth analysis of more than 200 real-world maintenance projects. It details the set of factors that are usually present when effective software maintenance teams do their work and instructs on the methods required to achieve success. Donald J. Reifer--an award winner for his contributions to the field of software engineering and whose experience includes managing the DoD Software Initiatives Office--provides step-by-step guidance on how t

  13. Funding Research Software Development

    Science.gov (United States)

    Momcheva, Ivelina G.

    2017-01-01

    Astronomical software is used by each and every member of our scientific community. Purpose-built software is becoming ever more critical as we enter the regime of large datasets and simulations of increasing complexity. However, financial investments in building, maintaining, and renovating the software infrastructure have been uneven. In this talk I will summarize past and current funding sources for astronomical software development, discuss other models of funding, and introduce a new initiative for supporting community software at STScI. The purpose of this talk is to prompt discussion about how we allocate resources to this vital infrastructure.

  14. MDP challenges from a software provider's perspective

    Science.gov (United States)

    Ohara, Shuichiro

    2014-10-01

    This industry faces new challenges every day, and they get tougher as process nodes shrink and data complexity and volume increase. We are a mask data preparation (MDP) software provider and have been supplying MDP systems to mask shops since 1990; like the rest of the industry, MDP software providers have faced new challenges over time. In this paper we discuss such MDP challenges and their solutions from an MDP software provider's perspective. Data volume increases continuously as the process node shrinks. In addition, resolution enhancement techniques (RET) such as optical proximity correction (OPC) and inverse lithography technique (ILT) add data complexity, which contributes considerably to the growth in data volume. The growth of data volume and complexity brings challenges to MDP systems, such as computing speed, shot count, and mask process correction (MPC). New tools (especially mask writers) also bring new challenges: variable-shaped e-beam (VSB) mask writers demand fracturing with fewer slivers, for CD accuracy, and lower figure counts, for write time. Multi-beam mask writers are now under development and will certainly bring new challenges of their own.

  15. Comparison of semipermeable membrane device (SPMD) and large-volume solid-phase extraction techniques to measure water concentrations of 4,4'-DDT, 4,4'-DDE, and 4,4'-DDD in Lake Chelan, Washington.

    Science.gov (United States)

    Ellis, Steven G; Booij, Kees; Kaputa, Mike

    2008-07-01

    Semipermeable membrane devices (SPMDs) spiked with the performance reference compound PCB29 were deployed 6.1 m above the sediments of Lake Chelan, Washington, for a period of 27 d, to estimate the dissolved concentrations of 4,4'-DDT, 4,4'-DDE, and 4,4'-DDD. Water concentrations were estimated using methods proposed in 2002 and newer equations published in 2006 to determine how the application of the newer equations affects historical SPMD data that used the older method. The estimated concentrations of DDT, DDE, and DDD calculated using the older method were 1.5-2.9 times higher than the newer method. SPMD estimates from both methods were also compared to dissolved and particulate DDT concentrations measured directly by processing large volumes of water through a large-volume solid-phase extraction device (Infiltrex 300). SPMD estimates of DDD+DDE+DDT (SigmaDDT) using the older and newer methods were lower than Infiltrex concentrations by factors of 1.1 and 2.3, respectively. All measurements of DDT were below the Washington State water quality standards for the protection of human health (0.59 ng l(-1)) and aquatic life (1.0 ng l(-1)).
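In the linear-uptake regime that underlies passive-sampler estimates like these, the dissolved concentration follows from the analyte mass accumulated in the sampler, the compound's sampling rate, and the deployment time. A minimal sketch of that arithmetic, with purely illustrative numbers (the mass and sampling rate below are hypothetical, not the study's data):

```python
# Linear-uptake SPMD estimate: Cw = N / (Rs * t).
# All numbers are illustrative and not taken from the Lake Chelan study.
N_ng = 50.0          # analyte mass accumulated in the SPMD (ng), hypothetical
Rs_l_per_day = 4.0   # PRC-corrected sampling rate (litres/day), hypothetical
t_days = 27.0        # deployment time (d), as in the study

Cw_ng_per_l = N_ng / (Rs_l_per_day * t_days)  # dissolved concentration (ng/l)
print(f"Cw = {Cw_ng_per_l:.3f} ng/l")  # → Cw = 0.463 ng/l
```

The performance reference compound enters this calculation through the sampling rate: its measured loss during deployment is used to correct Rs for site-specific conditions.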

  16. Human-Centered Software Engineering: Software Engineering Architectures, Patterns, and Models for Human Computer Interaction

    Science.gov (United States)

    Seffah, Ahmed; Vanderdonckt, Jean; Desmarais, Michel C.

    The Computer-Human Interaction and Software Engineering (CHISE) series of edited volumes originated from a number of workshops and discussions over the latest research and developments in the field of Human Computer Interaction (HCI) and Software Engineering (SE) integration, convergence and cross-pollination. A first volume in this series (CHISE Volume I - Human-Centered Software Engineering: Integrating Usability in the Development Lifecycle) aims at bridging the gap between the field of SE and HCI, and addresses specifically the concerns of integrating usability and user-centered systems design methods and tools into the software development lifecycle and practices. This has been done by defining techniques, tools and practices that can fit into the entire software engineering lifecycle as well as by defining ways of addressing the knowledge and skills needed, and the attitudes and basic values that a user-centered development methodology requires. The first volume has been edited as Vol. 8 in the Springer HCI Series (Seffah, Gulliksen and Desmarais, 2005).

  17. Volume Entropy

    CERN Document Server

    Astuti, Valerio; Rovelli, Carlo

    2016-01-01

    Building on a technical result by Brunnemann and Rideout on the spectrum of the Volume operator in Loop Quantum Gravity, we show that the space of the quadrivalent states (with finite-volume individual nodes) describing a region with total volume smaller than $V$ has finite dimension, bounded by $V \log V$. This allows us to introduce the notion of "volume entropy": the von Neumann entropy associated to the measurement of volume.

  18. A Bisimulation-based Hierarchical Framework for Software Development Models

    Directory of Open Access Journals (Sweden)

    Ping Liang

    2013-08-01

    Full Text Available Software development models have matured since the emergence of software engineering: the waterfall model, the V-model, the spiral model, etc. To support their successful implementation, various metrics and frameworks for software products and development processes have been developed alongside them, such as CMMI, software metrics, and process re-engineering. These help keep the quality of software products and processes as consistent as possible, so that the abstract integrity of a software product can be achieved. In reality, however, the cost of maintaining software products remains high, and grows even higher as the software evolves, because of inconsistencies introduced by changes and by inherent errors in the products. It is better to build a robust software product that can sustain as many changes as possible. This paper therefore proposes a process-algebra-based hierarchical framework that extracts an abstract equivalent of the deliverable at the end of each phase of a software product from its software development models. The process algebra equivalent of the deliverable is developed hierarchically along with the development of the software product, applying bisimulation to test-run the phase deliverables and guarantee the consistency and integrity of the software development and product in a straightforwardly mathematical way. An algorithm is also given to carry out the assessment of the phase deliverable in process algebra.
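The equivalence check at the heart of such a framework is bisimulation between labelled transition systems: two states are bisimilar when every move of one can be matched by the other, recursively. A minimal sketch of a naive bisimilarity check (the example systems and state names are hypothetical, not taken from the paper):

```python
def bisimilar(lts1, lts2, s1, s2):
    """Naive fixpoint check for strong bisimilarity.

    Each LTS maps a state to a set of (action, successor) pairs. We start
    from the full relation and repeatedly discard pairs that violate the
    transfer condition; what remains is the largest bisimulation.
    """
    rel = {(p, q) for p in lts1 for q in lts2}

    def matches(p, q):
        # Every move of p must be matched by q with the same action, and vice versa.
        for (a, p2) in lts1[p]:
            if not any(a == b and (p2, q2) in rel for (b, q2) in lts2[q]):
                return False
        for (b, q2) in lts2[q]:
            if not any(a == b and (p2, q2) in rel for (a, p2) in lts1[p]):
                return False
        return True

    changed = True
    while changed:
        changed = False
        for pair in list(rel):
            if not matches(*pair):
                rel.discard(pair)
                changed = True
    return (s1, s2) in rel

# Hypothetical phase deliverable (impl) checked against its abstraction (spec).
spec = {"s0": {("design", "s1")}, "s1": {("review", "s0")}}
impl = {"t0": {("design", "t1")}, "t1": {("review", "t0")}}
print(bisimilar(spec, impl, "s0", "t0"))  # → True
```

Real process-algebra tools use partition refinement rather than this quadratic fixpoint, but the transfer condition being tested is the same.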

  19. New method for rapid solid-phase extraction of large-volume water samples and its application to non-target screening of North Sea water for organic contaminants by gas chromatography-mass spectrometry.

    Science.gov (United States)

    Weigel, S; Bester, K; Hühnerfuss, H

    2001-03-30

    A method has been developed that allows the solid-phase extraction of microorganic compounds from large volumes of water (10 l) for non-target analysis of filtered seawater. The filtration-extraction system is operated with glass fibre filter candles and the polymeric styrene-divinylbenzene sorbent SDB-1 at flow-rates as high as 500 ml/min. Recovery studies carried out for a couple of model substances covering a wide range of polarity and chemical classes revealed a good performance of the method. Especially for polar compounds (log Kow 3.3-0.7) quantitative recovery was achieved. Limits of detection were between 0.1 and 0.7 ng/l in the full scan mode of the MS. The suitability of the method for the analysis of marine water samples is demonstrated by the non-target screening of water from the German Bight for the presence of organic contaminants. In the course of this screening a large variety of substances was identified including pesticides, industrial chemicals and pharmaceuticals. For some of the identified compounds their occurrence in marine ecosystems has not been reported before, such as dichloropyridines, carbamazepine, propyphenazone and caffeine.

  20. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-06-01

    Jun 1, 2013 ... ... the test of time. Keywords: Software, software maintenance, software evolution, reverse engineering, ... area of human endeavour be it automobile, software, etc. at .... greater efficiency and productivity in the maintenance ...

  1. Software metrics a guide to planning, analysis, and application

    CERN Document Server

    Pandian, C Ravindranath

    2003-01-01

    Software Metrics: A Guide to Planning, Analysis, and Application simplifies software measurement and explains its value as a pragmatic tool for management. Ideas and techniques presented in this book are derived from best practices. The ideas are field-proven, down to earth, and straightforward, making this volume an invaluable resource for those striving for process improvement.

  2. An Approach for Constructing Reusable Software Components in Ada

    Science.gov (United States)

    1990-09-01

    Cognitive Issues in Reusing Software Artifacts. Software Reusability, Volume 11: Applications and Experience, ed. Ted J. Biggerstaff and Alan J...Terms with Terminology for Software Reuse. Pittsburgh, PA: position paper for the Reuse in Practice Workshop, SEI. Prieto-Diaz87a Prieto-Diaz, Rubén...Novak GE/SEI Carnegie Mellon University Pittsburgh, PA 15213-3890 Frank Poslajko SDC MS CSSD-SP 206 Wynn Drive Huntsville, AL 35807 Ruben Prieto-Diaz SPC

  3. A new method to address unmet needs for extracting individual cell migration features from a large number of cells embedded in 3D volumes.

    Directory of Open Access Journals (Sweden)

    Ivan Adanja

    Full Text Available BACKGROUND: In vitro cell observation has been widely used by biologists and pharmacologists for screening molecule-induced effects on cancer cells. Computer-assisted time-lapse microscopy enables automated live cell imaging in vitro, enabling cell behavior characterization through image analysis, in particular regarding cell migration. In this context, 3D cell assays in transparent matrix gels have been developed to provide more realistic in vitro 3D environments for monitoring cell migration (fundamentally different from the cell motility behavior observed in 2D), which is related to the spread of cancer and metastases. METHODOLOGY/PRINCIPAL FINDINGS: In this paper we propose an improved automated tracking method that is designed to robustly and individually follow a large number of unlabeled cells observed under phase-contrast microscopy in 3D gels. The method automatically detects and tracks individual cells across a sequence of acquired volumes, using a template matching filtering method that in turn allows for robust detection and mean-shift tracking. The robustness of the method results from detecting and managing the cases where two cell (mean-shift) trackers converge to the same point. The resulting trajectories quantify cell migration through statistical analysis of 3D trajectory descriptors. We manually validated the method and observed efficient cell detection and a low tracking error rate (6%). We also applied the method in a real biological experiment where the pro-migratory effects of hyaluronic acid (HA) were analyzed on brain cancer cells. Using collagen gels with increased HA proportions, we were able to evidence a dose-response effect on cell migration abilities. CONCLUSIONS/SIGNIFICANCE: The developed method enables biomedical researchers to automatically and robustly quantify the pro- or anti-migratory effects of different experimental conditions on unlabeled cell cultures in a 3D environment.
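The mean-shift step at the core of such trackers moves a window toward the weighted centroid of a matching-score map until it settles on a local mode. A self-contained 2D sketch of that step (this is a generic illustration, not the authors' implementation; the synthetic Gaussian blob stands in for a cell's template-matching response):

```python
import numpy as np

def mean_shift(score, start, radius=3, iters=20):
    """Iteratively move a window centre to the weighted centroid of the
    matching-score map beneath it, until it stops moving (a local mode)."""
    y, x = start
    for _ in range(iters):
        y0, y1 = max(0, y - radius), min(score.shape[0], y + radius + 1)
        x0, x1 = max(0, x - radius), min(score.shape[1], x + radius + 1)
        win = score[y0:y1, x0:x1]
        if win.sum() == 0.0:
            break  # no evidence under the window; give up
        ys, xs = np.mgrid[y0:y1, x0:x1]
        ny = int(round(float((ys * win).sum() / win.sum())))
        nx = int(round(float((xs * win).sum() / win.sum())))
        if (ny, nx) == (y, x):
            break  # converged
        y, x = ny, nx
    return y, x

# Synthetic score map: a Gaussian blob at (12, 18) mimics a cell's match peak.
yy, xx = np.mgrid[0:32, 0:32]
score = np.exp(-((yy - 12) ** 2 + (xx - 18) ** 2) / 8.0)
print(mean_shift(score, (9, 15)))  # converges onto the blob: (12, 18)
```

The collision handling the abstract describes would sit on top of this: after each update, trackers whose centres coincide are detected and merged or re-seeded.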

  4. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performances....... This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper......, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. These issues we discuss along with use case scenarios. Here in this paper we aim to identify challenges...

  5. Trace Software Pipelining

    Institute of Scientific and Technical Information of China (English)

    王剑; Andreas Krall; et al.

    1995-01-01

    Global software pipelining is a complex but efficient compilation technique to exploit instruction-level parallelism for loops with branches. This paper presents a novel global software pipelining technique, called trace software pipelining, targeted to instruction-level parallel processors such as Very Long Instruction Word (VLIW) and superscalar machines. Trace software pipelining applies a global code scheduling technique to compact the original loop body. The resulting loop is called a trace software pipelined (TSP) code. The trace software pipelined code can be directly executed with special architectural support or can be transformed into a globally software pipelined loop for current VLIW and superscalar processors. Thus, exploiting parallelism across all iterations of a loop can be accomplished by compacting the original loop body with any global code scheduling technique. This makes our new technique very promising for practical compilers. Finally, we also present preliminary experimental results to support our new approach.

  6. COTS software selection process.

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, William M. (Strike Wire Technologies, Louisville, CO); Lin, Han Wei; McClelland, Kelly (U.S. Security Associates, Livermore, CA); Ullrich, Rebecca Ann; Khanjenoori, Soheil; Dalton, Karen; Lai, Anh Tri; Kuca, Michal; Pacheco, Sandra; Shaffer-Gant, Jessica

    2006-05-01

    Today's need for rapid software development has generated great interest in employing Commercial-Off-The-Shelf (COTS) software products as a way of managing cost, development time, and effort. With an abundance of COTS software packages to choose from, the problem now is how to systematically evaluate, rank, and select a COTS product that best meets the software project requirements and at the same time can leverage the current corporate information technology architectural environment. This paper describes a systematic process for decision support in evaluating and ranking COTS software. Performed right after the requirements analysis, this process provides the evaluators with more concise, structured, and step-by-step activities for determining the best COTS software product with manageable risk. In addition, the process is presented in phases that are flexible to allow for customization or tailoring to meet various projects' requirements.
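The evaluate-and-rank step in a process like this is commonly realized as a weighted-scoring matrix: each candidate is scored against weighted criteria and the candidates are sorted by weighted total. A hedged sketch of that idea (the criteria, weights, and products below are hypothetical, not taken from the paper):

```python
# Hypothetical evaluation criteria, with weights summing to 1.0.
criteria = {"requirements fit": 0.40, "vendor support": 0.20,
            "architecture fit": 0.25, "licence cost": 0.15}

# Raw 1-5 scores from the evaluation team (illustrative only).
scores = {
    "Product A": {"requirements fit": 4, "vendor support": 3,
                  "architecture fit": 5, "licence cost": 2},
    "Product B": {"requirements fit": 5, "vendor support": 2,
                  "architecture fit": 3, "licence cost": 5},
}

def rank(scores, criteria):
    """Weighted sum per product, best first."""
    totals = {p: sum(criteria[c] * s[c] for c in criteria)
              for p, s in scores.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for product, total in rank(scores, criteria):
    print(f"{product}: {total:.2f}")
# → Product B: 3.90
# → Product A: 3.75
```

In practice the weights would come out of the requirements analysis the paper anchors the process to, and risk factors would be scored alongside functional fit.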

  7. Social software in global software development

    DEFF Research Database (Denmark)

    2010-01-01

    Social software (SoSo) is defined by Farkas as tools that (1) allow people to communicate, collaborate, and build community online (2) can be syndicated, shared, reused or remixed and (3) let people learn easily from and capitalize on the behavior and knowledge of others. [1]. SoSo include a wide...... variety of tools such as: instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, virtual worlds. Though normally rather belonging to the private realm, the use of social software in corporate context has been reported, e.g. as a way...

  9. Software Quality Assurance in Software Projects: A Study of Pakistan

    Directory of Open Access Journals (Sweden)

    Faisal Shafique Butt

    2013-05-01

    Full Text Available Software quality is the specific property that tells what kind of standard a software product should meet. In a software project, quality is a key factor in the success or decline of a software organization. Much research has been done regarding software quality. Software organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good-quality software. Quality is divided into three main layers: Software Quality Assurance (SQA), Software Quality Plan (SQP) and Software Quality Control (SQC). In this study, we discuss the quality standards and principles of software projects in the Pakistani software industry and how these implemented quality standards are measured and managed. We examine how many software firms follow the rules of CMMI to create software, how many reach international standards, and how many measure the quality of their projects. The results show that some of the companies are using software quality assurance techniques in Pakistan.

  10. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model; Software Technologies; Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models; Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept

  11. Real World Software Engineering

    Science.gov (United States)

    1994-07-15

    semester addresses the remaining principles of a complete, mature software development process [Humphrey 88]. In order to provide an instructional...Software Innovations Technology, 1083 Mandarin Drive N.E., Palm Bay FL 32905-4706 [Humphrey 88] W. S. Humphrey, "Characterizing the Software Process: A...Copies of all the forms mentioned are available via electronic mail from the authors. 40 [1] Doris Carver, "Comparison of Techniques In Project-Based

  12. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  13. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners....... It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice with a focus on the software process policymaking and process control aspects of improvement efforts...

  14. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu

    2004-01-01

    This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), an XML-based metaprogramming technique. As software evolves, a large number of variants may arise, especially when the evolution spans multiple platforms, as shown in our...... case study. Handling variants and tracing the impact of variants across the development lifecycle is a challenge. This chapter shows how we can maintain different versions of software in a reuse-based way....

  15. Software systems for astronomy

    CERN Document Server

    Conrad, Albert R

    2014-01-01

    This book covers the use and development of software for astronomy. It describes the control systems used to point the telescope and operate its cameras and spectrographs, as well as the web-based tools used to plan those observations. In addition, the book also covers the analysis and archiving of astronomical data once it has been acquired. Readers will learn about existing software tools and packages, develop their own software tools, and analyze real data sets.

  16. Lean software development

    OpenAIRE

    Hefnerová, Lucie

    2011-01-01

    The main goal of this bachelor thesis is to produce a clear Czech-language text on the concept of Lean Software Development, which has recently been gaining significant attention in the field of software development. Another goal of this thesis is to summarize possible approaches to categorizing the concept and to defining the relationship between Lean and Agile software development. The detailed categorization of the tools potentia...

  17. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  19. Solar Asset Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Iverson, Aaron [Ra Power Management, Inc., Oakland, CA (United States); Zviagin, George [Ra Power Management, Inc., Oakland, CA (United States)

    2016-09-30

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  20. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. Explains two maintenance standards: IEEE/EIA 1219 and ISO/IEC 14764. Discusses several commercial reverse and domain engineering toolkits. Slides for instructors are available online. Information is based on the IEEE SWEBOK (Software Engineering Body of Knowledge).

  1. Software Architecture Technology Initiative

    Science.gov (United States)

    2008-04-01

    Software Architecture Technology Initiative, SATURN 2008. © 2008 Carnegie Mellon University. Presented at the SEI Software Architecture Technology User Network (SATURN) Workshop, 30 Apr - 1 May 2008, Pittsburgh, PA.

  2. Gammasphere software development

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1993-01-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an object-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere).

  3. Parallel Software Model Checking

    Science.gov (United States)

    2015-01-08

    Parallel Software Model Checking. Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA, 8 January 2015. Team members: Sagar Chaki, Arie Gurfinkel.

  5. SOFTWARE FOR REGIONS OF INTEREST RETRIEVAL ON MEDICAL 3D IMAGES

    Directory of Open Access Journals (Sweden)

    G. G. Stromov

    2014-01-01

    Full Text Available Background. This article describes the implementation of software for retrieving regions of interest (ROIs) in 3D medical images; it has been tested against a large volume of model MRIs. Material and methods. We tested the software against normal and pathological (severe multiple sclerosis) model MRIs from the BrainWeb resource. The technological stack is based on open-source, cross-platform solutions. We implemented the storage system on MariaDB (an open-source fork of MySQL with P/SQL extensions). Python 2.7 scripting was used to automate the extract-transform-load operations. The computational core is written in Java 7 with the Spring framework 3. MongoDB was used as a cache in the cluster of workstations. Maven 3 was chosen as the dependency manager and build system, and the project is hosted on Github. Results. As testing on SSMU's LAN showed, the software quite efficiently retrieves ROIs that match the morphological substratum on pathological MRIs. Conclusion. Automating the diagnostic process in medical imaging reduces the subjective component in decision making and increases the availability of high-tech medicine. The software shown in this article is a complete solution for the ROI retrieval and segmentation process on model medical images in fully automated mode. We would like to thank Robert Vincent for his great help with consulting on the use of the BrainWeb resource.

  6. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

    This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives, Product, Project, Process, and People, to identify an outlook for software innovation. The paper then describes a new facility, the Software Innovation Research Lab (SIRL......), and a new method concept for software innovation, Essence, based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research.

  7. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than...... and experience reported at the IEEE International Conference on Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage...

  8. Agile software development

    CERN Document Server

    Dingsoyr, Torgeir; Moe, Nils Brede

    2010-01-01

    Agile software development has become an umbrella term for a number of changes in how software developers plan and coordinate their work, how they communicate with customers and external stakeholders, and how software development is organized in small, medium, and large companies, from the telecom and healthcare sectors to games and interactive media. Still, after a decade of research, agile software development is the source of continued debate due to its multifaceted nature and insufficient synthesis of research results. Dingsoyr, Dyba, and Moe now present a comprehensive snapshot of the kno

  9. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  10. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines...... essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...... the organization and management of (software development) projects and process improvements projects....

  11. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than...... and experience reported at the IEEE International Conference on Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage...

  12. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  13. MYOB software for dummies

    CERN Document Server

    Curtis, Veechi

    2012-01-01

    Your complete guide to MYOB® AccountRight softwareNow in its seventh edition, MYOB® Software For Dummies walks you through everything you need to know, from starting your MYOB® file from scratch and recording payments and receipts, to tracking profit and analysing sales. This new edition includes all the information you need on the new generation of MYOB® AccountRight software, including the new cloud computing features. Set up MYOB® software - understand how to make it work the first time Keep track of purchases and sales - monitor customer accounts and ensure you get pai

  14. Accuracy of MRI volume measurements of breast lesions: comparison between automated, semiautomated and manual assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rominger, Marga B.; Fournell, Daphne; Nadar, Beenarose Thanka; Figiel, Jens H.; Keil, Boris; Heverhagen, Johannes T. [Philipps University, Department of Radiology, Marburg (Germany); Behrens, Sarah N.M. [MeVis GmbH, Bremen (Germany)

    2009-05-15

The aim of this study was to investigate the efficacy of a dedicated software tool for automated and semiautomated volume measurement in contrast-enhanced (CE) magnetic resonance mammography (MRM). Ninety-six breast lesions with histopathological workup (27 benign, 69 malignant) were re-evaluated by different volume measurement techniques. Volumes of all lesions were extracted automatically (AVM) and semiautomatically (SAVM) from CE 3D MRM and compared with manual 3D contour segmentation (manual volume measurement, MVM, reference measurement technique) and volume estimates based on maximum diameter measurement (MDM). Compared with MVM as the reference method, MDM, AVM and SAVM underestimated lesion volumes by 63.8%, 30.9% and 21.5%, respectively, with significantly different accuracy for benign (102.4%, 18.4% and 11.4%) and malignant (54.9%, 33.0% and 23.1%) lesions (p<0.05). Inter- and intraobserver reproducibility was best for AVM (mean difference ±2SD, 1.0±9.7% and 1.8±12.1%) followed by SAVM (4.3±25.7% and 4.3±7.9%), MVM (2.3±38.2% and 8.6±31.8%) and MDM (33.9±128.4% and 9.3±55.9%). SAVM is more accurate for volume assessment of breast lesions than MDM and AVM. Volume measurement is less accurate for malignant than benign lesions. (orig.)
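The accuracy and reproducibility figures in this record come down to simple percent-difference arithmetic. A minimal Python sketch of those two computations, using made-up volumes rather than the study's data:

```python
# Percent underestimation of a measured lesion volume relative to the
# reference (manual) volume, plus mean +/- 2SD agreement limits for
# repeated measurements. All numeric inputs are illustrative.

def underestimation_pct(measured_ml, reference_ml):
    """Positive result means the technique underestimates the reference."""
    return 100.0 * (reference_ml - measured_ml) / reference_ml

def agreement_limits(differences_pct):
    """Mean difference and +/- 2SD limits, as used for reproducibility."""
    n = len(differences_pct)
    mean = sum(differences_pct) / n
    var = sum((d - mean) ** 2 for d in differences_pct) / (n - 1)  # sample variance
    sd = var ** 0.5
    return mean, mean - 2 * sd, mean + 2 * sd

print(round(underestimation_pct(0.79, 1.0), 1))  # -> 21.0
mean, lo, hi = agreement_limits([1.0, -2.0, 3.5, 0.5, -1.0])
print(round(mean, 2), round(lo, 2), round(hi, 2))
```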

  15. Teaching Social Software with Social Software

    Science.gov (United States)

    Mejias, Ulises

    2006-01-01

    Ulises Mejias examines how social software--information and communications technologies that facilitate the collaboration and exchange of ideas--enables students to participate in distributed research, an approach to learning in which knowledge is collectively constructed and shared. During Fall 2005, Mejias taught a graduate seminar that provided…

  16. ATLAS software packaging

    Science.gov (United States)

    Rybkin, Grigory

    2012-12-01

Software packaging is an indispensable part of the build process and a prerequisite for deployment. The full ATLAS software stack consists of TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present the tools, the package PackDist, developed and used to package all this software except for the TDAQ project. PackDist is based on and driven by CMT, the ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is the CMT project. Each CMT project is packaged as several packages—platform dependent (one per platform available), source code excluding header files, other platform-independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, the distribution kit, also includes configuration packages and contains some 120 packages for one platform. Also packaged are physics analysis projects (currently 6) used by particular physics groups on top of the full release. The tools provide an installation test for the full distribution kit. Packaging is done in two formats for use with the Pacman and RPM package managers. The tools are functional on the platforms supported by ATLAS—GNU/Linux and Mac OS X. The packaged software is used for software deployment on all ATLAS computing resources, from the detector and trigger computing farms, collaboration laboratories' computing centres, grid sites, to physicist laptops, and CERN VMFS, and covers the use cases of running all applications as well as of software development.
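The packaging scheme this abstract describes (one platform-dependent package per available platform, shared source/documentation packages, and recursion over project dependencies) can be sketched in a few lines; the project names, platform tag, and package naming below are illustrative assumptions, not PackDist's actual behavior:

```python
# Sketch of recursive, per-platform packaging of CMT-style projects.
# Each project yields one binary package per platform plus shared
# source / platform-independent / documentation packages, and packaging
# recurses over dependencies. All names here are hypothetical.

PROJECTS = {
    "Offline": {"deps": ["External"], "platforms": ["x86_64-slc6-gcc48"]},
    "External": {"deps": [], "platforms": ["x86_64-slc6-gcc48"]},
}

def package(project, seen=None):
    """Return package names for a project and, recursively, its dependencies."""
    seen = set() if seen is None else seen
    if project in seen:          # package each project at most once
        return []
    seen.add(project)
    info = PROJECTS[project]
    pkgs = []
    for dep in info["deps"]:     # dependencies are packaged first
        pkgs += package(dep, seen)
    for plat in info["platforms"]:
        pkgs.append(f"{project}_{plat}")      # platform-dependent package
    pkgs += [f"{project}_src", f"{project}_noarch", f"{project}_doc"]
    return pkgs

print(package("Offline"))
```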

  17. TAPSOFT'95: Theory and Practice of Software Development

    DEFF Research Database (Denmark)

    theoretical computer scientists and software engineers (researchers and practitioners) with a view to discussing how formal methods can usefully be applied in software development. The volume contains seven invited papers, among them one by Vaugham Pratt on the recently revealed bug in the Pentium chip......This volume presents the proceedings of the Sixth International Joint Conference on the Theory and Practice of Software Engineering, TAPSOFT '95, held in Aarhus, Denmark in May 1995. TAPSOFT '95 celebrates the 10th anniversary of this conference series started in Berlin in 1985 to bring together...

  18. Some Future Software Engineering Opportunities and Challenges

    Science.gov (United States)

    Boehm, Barry

    This paper provides an update and extension of a 2006 paper, “Some Future Trends and Implications for Systems and Software Engineering Processes,” Systems Engineering, Spring 2006. Some of its challenges and opportunities are similar, such as the need to simultaneously achieve high levels of both agility and assurance. Others have emerged as increasingly important, such as the challenges of dealing with ultralarge volumes of data, with multicore chips, and with software as a service. The paper is organized around eight relatively surprise-free trends and two “wild cards” whose trends and implications are harder to foresee. The eight surprise-free trends are:

  19. Software for multistate analysis

    NARCIS (Netherlands)

    Willekens, Frans; Putter, H.

    2014-01-01

    Background: The growing interest in pathways, the increased availability of life-history data, innovations in statistical and demographic techniques, and advances in software technology have stimulated the development of software packages for multistate modeling of life histories. Objective: In the

  20. Software for multistate analysis

    NARCIS (Netherlands)

    Willekens, Frans; Putter, H.

    2014-01-01

    Background: The growing interest in pathways, the increased availability of life-history data, innovations in statistical and demographic techniques, and advances in software technology have stimulated the development of software packages for multistate modeling of life histories.Objective: In the

  1. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu

    2004-01-01

This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), which is an XML-based metaprogramming technique. As the software evolves, a large number of variants may arise, especially when such kinds of evolutions are related to multiple platforms as shown in o...

  2. Software Project Management

    Science.gov (United States)

    1989-07-01

on software management obstacles and ways to cope with them are presented. Standardization is... algorithmic models used to estimate software costs (SLIM, COCOMO, Function Points

  3. Software Quality Metrics

    Science.gov (United States)

    1991-07-01

    March 1979, pp. 121-128. Gorla, Narasimhaiah, Alan C. Benander, and Barbara A. Benander, "Debugging Effort Estimation Using Software Metrics", IEEE...Society, IEEE Guide for the Use of IEEE Standard Dictionary of Measures to Produce Reliable Software, IEEE Std 982.2-1988, June 1989. Jones, Capers

  4. Software Engineering Education Directory

    Science.gov (United States)

    1989-02-01

    The C Programming Language by Kernighan, Brian W. and Ritchie, Dennis M. Compilers: C Computers: NCR Tower 32/600 running UNIX System V...Sun Microsystems, Ada Eiffel 3B2) Software Testing CS 429 U P E O 10 Textbooks: Software Testing Techniques by Beizer, Boris Systems

5. Marketing Mix of Software.

    Directory of Open Access Journals (Sweden)

    Yudith del Carmen Rodríguez Pérez

    2006-03-01

This paper therefore defines the concept of the software product, characterizes it, and presents its quality attributes. It also addresses the marketing mix that software requires, which differs from that of other products, so that it can succeed in the market.

  6. Software engineering ethics

    Science.gov (United States)

    Bown, Rodney L.

    1991-01-01

Software engineering ethics is reviewed. The following subject areas are covered: lack of a system viewpoint; arrogance of PC DOS software vendors; violation of upward compatibility; internet worm; internet worm revisited; student cheating and company hiring interviews; computing practitioners and the commodity market; new projects and old programming languages; schedule and budget; and recent public domain comments.

  7. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the...for specific projects. L5: Analyze assurance technologies and contribute to the development of new ones. Assured Software Development L1

  8. Threats to Bitcoin Software

    OpenAIRE

    Kateraas, Christian H

    2014-01-01

Collect and analyse threat models to the Bitcoin ecosystem and its software. Then create misuse cases, attack trees, and sequence diagrams of the threats, and build a malicious client from the gathered threat models. Once the development of the client is complete, test the client and evaluate its performance. From this, assess the security of the Bitcoin software.

  9. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.

    1992-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be
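Several of these records name algorithmic cost models (SLIM, COCOMO, Function Points) as the standard answer to the estimation questions raised here. As a concrete illustration, a minimal Python sketch of the basic COCOMO formulas (Boehm, 1981); the project size and mode used below are made-up inputs:

```python
# Basic COCOMO: effort = a * KLOC^b (person-months),
# schedule = c * effort^d (elapsed months), with standard
# coefficients per project class.

COEFFS = {  # (a, b, c, d)
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode="organic"):
    """Estimate effort (person-months) and schedule (months) from size."""
    a, b, c, d = COEFFS[mode]
    effort = a * kloc ** b
    schedule = c * effort ** d
    return effort, schedule

effort, months = basic_cocomo(32, "organic")  # hypothetical 32 KLOC project
print(f"{effort:.1f} person-months over {months:.1f} months")
```

Note how weakly the schedule grows with effort (exponent d well below 1) — one reason overruns of planned durations are so common when effort is underestimated.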

  10. Cactus: Software Priorities

    Science.gov (United States)

    Hyde, Hartley

    2009-01-01

    The early eighties saw a period of rapid change in computing and teachers lost control of how they used computers in their classrooms. Software companies produced computer tools that looked so good that teachers forgot about writing their own classroom materials and happily purchased software--that offered much more than teachers needed--from…

  11. Selecting the Right Software.

    Science.gov (United States)

    Shearn, Joseph

    1987-01-01

    Selection of administrative software requires analyzing present needs and, to meet future needs, choosing software that will function with a more powerful computer system. Other important factors to include are a professional system demonstration, maintenance and training, and financial considerations that allow leasing or renting alternatives.…

  12. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what...... makes these countries an attractive host for software companies. Often, high-quality engineering and excellent quality of products, e.g., machinery and equipment, are mentioned. Yet, the question is: Can such arguments be also found for the software industry? We aim at investigating the degree...... of professionalism and systematization of software development to draw a map of strengths and weaknesses. To this end, we conducted as a first step an exploratory survey in Germany, presented in this paper. In this survey, we focused on the perceived importance of the two general software engineering process areas...

  13. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution....... The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss...... one particular framework named Tran SAT, which addresses the above problems of software architecture evolution. Tran SAT provides a new element in the software architecture descriptions language, called an architectural aspect, for describing new concerns and their integration into an existing...

  14. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  15. Software Requirements Management

    Directory of Open Access Journals (Sweden)

    Ali Altalbe

    2015-04-01

    Full Text Available Requirements are defined as the desired set of characteristics of a product or a service. In the world of software development, it is estimated that more than half of the failures are attributed towards poor requirements management. This means that although the software functions correctly, it is not what the client requested. Modern software requirements management methodologies are available to reduce the occur-rence of such incidents. This paper performs a review on the available literature in the area while tabulating possible methods of managing requirements. It also highlights the benefits of following a proper guideline for the requirements management task. With the introduction of specific software tools for the requirements management task, better software products are now been developed with lesser resources.

  16. Software licenses: Stay honest!

    CERN Document Server

    Computer Security Team

    2012-01-01

    Do you recall our article about copyright violation in the last issue of the CERN Bulletin, “Music, videos and the risk for CERN”? Now let’s be more precise. “Violating copyright” not only means the illegal download of music and videos, it also applies to software packages and applications.   Users must respect proprietary rights in compliance with the CERN Computing Rules (OC5). Not having legitimately obtained a program or the required licenses to run that software is not a minor offense. It violates CERN rules and puts the Organization at risk! Vendors deserve credit and compensation. Therefore, make sure that you have the right to use their software. In other words, you have bought the software via legitimate channels and use a valid and honestly obtained license. This also applies to “Shareware” and software under open licenses, which might also come with a cost. Usually, only “Freeware” is complete...

  17. Software safety hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  18. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Full Text Available Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is through the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third part discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  19. DIVERSIFICATION IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    Er.Kirtesh Jailia,

    2010-06-01

Full Text Available In this paper we examine the factors that have promoted the diversification of software process models. The intention is to understand more clearly the problem-solving process in software engineering and to find an efficient way to manage risk. A review of software process modeling is given first, followed by a discussion of process evaluation techniques. A taxonomy for categorizing process models, based on establishing decision criteria, is identified that can guide selection of the appropriate model from a set of alternatives on the basis of model characteristics and software project needs. We propose a model in this paper for dealing with diversification in software engineering.

  20. Trends in software testing

    CERN Document Server

    Mohanty, J; Balakrishnan, Arunkumar

    2017-01-01

    This book is focused on the advancements in the field of software testing and the innovative practices that the industry is adopting. Considering the widely varied nature of software testing, the book addresses contemporary aspects that are important for both academia and industry. There are dedicated chapters on seamless high-efficiency frameworks, automation on regression testing, software by search, and system evolution management. There are a host of mathematical models that are promising for software quality improvement by model-based testing. There are three chapters addressing this concern. Students and researchers in particular will find these chapters useful for their mathematical strength and rigor. Other topics covered include uncertainty in testing, software security testing, testing as a service, test technical debt (or test debt), disruption caused by digital advancement (social media, cloud computing, mobile application and data analytics), and challenges and benefits of outsourcing. The book w...

  1. AUTOMATED SOFTWARE DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    J.J. Strasheim

    2012-01-01

    Full Text Available

ENGLISH ABSTRACT: Automated distribution of computer software via electronic means in large corporate networks is growing in popularity. The relative importance of personal computer software, in financial and logistical terms, is described and the developing need for automated software distribution explained. An actual comparative example of alternative software distribution strategies is presented and discussed, proving the viability of Electronic Software Distribution.

AFRIKAANSE OPSOMMING (translated): Automated distribution of computer software by electronic means in large corporate networks is increasingly popular. The relative importance of personal computer software in financial and logistical terms is discussed, and the growing need for automated software distribution is explained. An actual comparative example of alternative software distribution strategies is presented and discussed, which demonstrates the viability of Electronic Software Distribution.

  2. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Full Text Available Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is through the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third part discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  3. Scientific Software Component Technology

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, S.; Dykman, N.; Kumfert, G.; Smolinski, B.

    2000-02-16

    We are developing new software component technology for high-performance parallel scientific computing to address issues of complexity, re-use, and interoperability for laboratory software. Component technology enables cross-project code re-use, reduces software development costs, and provides additional simulation capabilities for massively parallel laboratory application codes. The success of our approach will be measured by its impact on DOE mathematical and scientific software efforts. Thus, we are collaborating closely with library developers and application scientists in the Common Component Architecture forum, the Equation Solver Interface forum, and other DOE mathematical software groups to gather requirements, write and adopt a variety of design specifications, and develop demonstration projects to validate our approach. Numerical simulation is essential to the science mission at the laboratory. However, it is becoming increasingly difficult to manage the complexity of modern simulation software. Computational scientists develop complex, three-dimensional, massively parallel, full-physics simulations that require the integration of diverse software packages written by outside development teams. Currently, the integration of a new software package, such as a new linear solver library, can require several months of effort. Current industry component technologies such as CORBA, JavaBeans, and COM have all been used successfully in the business domain to reduce software development costs and increase software quality. However, these existing industry component infrastructures will not scale to support massively parallel applications in science and engineering. In particular, they do not address issues related to high-performance parallel computing on ASCI-class machines, such as fast in-process connections between components, language interoperability for scientific languages such as Fortran, parallel data redistribution between components, and massively

  4. An Introduction to the Special Volume on

    Directory of Open Access Journals (Sweden)

    Micah Altman

    2011-08-01

Full Text Available This special volume of the Journal of Statistical Software on political methodology includes 14 papers, with wide-ranging software contributions of political scientists to their own field, and more generally to statistical data analysis in the social sciences and beyond. Special emphasis is given to software that is written in or can cooperate with the R system for statistical computing.

  5. Contrast study on the effect of orthodontic extraction versus non-extraction treatment on oropharyngeal airway volume in Class I malocclusion patients

    Institute of Scientific and Technical Information of China (English)

    何珍; 叶珊珊

    2015-01-01

Objective To compare the effect of orthodontic extraction and non-extraction treatment on oropharyngeal airway volume in Class I malocclusion patients. Methods From May 2010 to March 2013, 97 Class I malocclusion patients who received orthodontic correction in our hospital were included as the study subjects. According to the treatment method, the cases were divided into an experimental group (extraction treatment) and a control group (non-extraction treatment), and post-treatment clinical and imaging data were collected retrospectively. Results After treatment, central length and mandibular length in both groups were similar to the pre-treatment values. Upper and lower incisor inclination decreased after extraction treatment and increased after non-extraction treatment, and these before/after differences were statistically significant (P < 0.05). Before treatment, the imaging indices did not differ significantly between the two groups (P > 0.05). After treatment, central length and mandibular length were shorter in the extraction group, upper incisor inclination and tooth angle were smaller in the non-extraction group than in the extraction group, and oropharyngeal airway volume and the area of the narrowest section were greater than in the extraction group; these between-group differences were statistically significant (P < 0.05). Conclusion In the treatment of Class I malocclusion, extraction treatment has a more positive effect on oropharyngeal airway volume

  6. Data structure and software engineering challenges and improvements

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Data structure and software engineering is an integral part of computer science. This volume presents new approaches and methods to knowledge sharing, brain mapping, data integration, and data storage. The author describes how to manage an organization's business process and domain data and presents new software and hardware testing methods. The book introduces a game development framework used as a learning aid in a software engineering at the university level. It also features a review of social software engineering metrics and methods for processing business information. It explains how to

  7. Cortical cartography and Caret software.

    Science.gov (United States)

    Van Essen, David C

    2012-08-15

    Caret software is widely used for analyzing and visualizing many types of fMRI data, often in conjunction with experimental data from other modalities. This article places Caret's development in a historical context that spans three decades of brain mapping--from the early days of manually generated flat maps to the nascent field of human connectomics. It also highlights some of Caret's distinctive capabilities. This includes the ease of visualizing data on surfaces and/or volumes and on atlases as well as individual subjects. Caret can display many types of experimental data using various combinations of overlays (e.g., fMRI activation maps, cortical parcellations, areal boundaries), and it has other features that facilitate the analysis and visualization of complex neuroimaging datasets. Copyright © 2011 Elsevier Inc. All rights reserved.

  8. Platform for Visualization Software Based on Software Architecture

    Institute of Scientific and Technical Information of China (English)

    胡华; 林昌东

    2003-01-01

A platform for visualization software is a kind of large and complex software system, and designing and evolving such a platform is itself complex work. This paper proposes using software architecture to analyze and evolve a large visualization platform based on plane data into a visualization platform able to process both plane and volume data. The result of this paper demonstrates the soundness of our method.

  9. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    SOFTWARE, LIKE ALL industry products, is the result of complex multinational supply chains with many partners from concept to development to production and maintenance. Global software engineering (GSE), IT outsourcing, and business process outsourcing during the past decade have shown growth...... rates of 10 to 20 percent per year. This instalment of Practitioner’s Digest summarizes experiences and guidance from industry to facilitate knowledge and technology transfer for GSE. It’s based on industry feedback from the annual IEEE International Conference on Global Software Engineering, which had...

  10. Management systems and software.

    Science.gov (United States)

    Levin, R P

    2001-02-01

    To ensure that your software optimizes your practice management systems, design systems that allow you and your team to achieve your goals and provide high levels of quality dentistry and customer service to your patients. Then use your current software system or purchase a new practice management software program that will allow your practice to operate within the guidelines of the systems which you have established. You can be certain that taking these steps will allow you to practice dentistry with maximum profitability and minimum stress for the remainder of your career.

  11. Advanced fingerprint verification software

    Science.gov (United States)

    Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.

    2016-05-01

    We have developed a fingerprint software package that can be used in a wide range of applications from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door- locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine1. Solid and robust performance is achieved in the presence of misplaced and low quality fingerprints.

  12. Software takes command

    CERN Document Server

    Manovich, Lev

    2013-01-01

    Software has replaced a diverse array of physical, mechanical, and electronic technologies used before the 21st century to create, store, distribute, and interact with cultural artifacts. It has become our interface to the world, to others, to our memory and our imagination - a universal language through which the world speaks, and a universal engine on which the world runs. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. Offering the first theoretical and historical account of software for media authoring and its effects on the prac

  13. CNEOST Control Software System

    Science.gov (United States)

    Wang, X.; Zhao, H. B.; Xia, Y.; Lu, H.; Li, B.

    2015-03-01

    In 2013, CNEOST (China Near Earth Object Survey Telescope) adapted its hardware system for the new CCD camera. Based on the new system architecture, the control software was re-designed and implemented. The software system adopts a message-passing mechanism via the WebSocket protocol, improving its flexibility, expansibility, and scalability. The user interface, built with responsive web design, enables remote operation from both desktop and mobile devices. The stable operation of the software system has greatly enhanced operating efficiency while reducing complexity, and has also been a successful trial for the future system design of the telescope and telescope cloud.

  14. Speakeasy software development

    Science.gov (United States)

    Baskinger, Patricia J.; Ozarow, Larry; Chruscicki, Mary C.

    1993-08-01

    The Speakeasy Software Development Project had three primary objectives. The first objective was to perform Independent Verification and Validation (IV & V) of the software and documentation associated with the signal processor being developed by Hazeltine and TRW under the Speakeasy program. The IV & V task also included an analysis and assessment of the ability of the signal processor software to provide LPI communications functions. The second objective was to assist in the enhancement and modification of an existing Rome Lab signal processor workstation. Finally, TASC developed project management support tools and provided program management support to the Speakeasy Program Office.

  15. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  16. Guide to software export

    CERN Document Server

    Philips, Roger A

    2014-01-01

    An ideal reference source for CEOs, marketing and sales managers, sales consultants, and students of international marketing, Guide to Software Export provides a step-by-step approach to initiating or expanding international software sales. It teaches you how to examine critically your candidate product for exportability; how to find distributors, agents, and resellers abroad; how to identify the best distribution structure for export; and much, much more! Not content with providing just the guidelines for setting up, expanding, and managing your international sales channels, Guide to Software

  17. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  18. Calidad de componentes software

    OpenAIRE

    2010-01-01

    In recent years there has been a growing tendency for organizations to develop their software systems by combining components, rather than building those systems from scratch. This trend is due to several factors, notably: organizations' need to reduce the cost and time devoted to developing software systems; the growth of the software component market; and the reduction of the distance bet...

  19. Beginning software engineering

    CERN Document Server

    Stephens, Rod

    2015-01-01

    Beginning Software Engineering demystifies the software engineering methodologies and techniques that professional developers use to design and build robust, efficient, and consistently reliable software. Free of jargon and assuming no previous programming, development, or management experience, this accessible guide explains important concepts and techniques that can be applied to any programming language. Each chapter ends with exercises that let you test your understanding and help you elaborate on the chapter's main concepts. Everything you need to understand waterfall, Sashimi, agile, RAD, Scrum, Kanban, Extreme Programming, and many other development models is inside!

  20. Antidiarrheal and thrombolytic effects of methanol extract of Wikstroemia indica (L.) C. A. Mey leaves

    Directory of Open Access Journals (Sweden)

    Md Khalilur Rahman

    2015-01-01

    Full Text Available Context: Medicinal plants are potential sources of therapeutic agents. Wikstroemia indica, a traditional medicinal plant, has long been used as an anti-inflammatory, antiviral, antimalarial, anti-mitotic, antitumor, and anti-HIV agent in different parts of the world. Aims: The aim was to investigate the antidiarrheal and thrombolytic effects of W. indica leaf extract. Settings and Design: Sample collection, identification, solvent extraction, and crude extract preparation were carried out to evaluate the antidiarrheal effect in an in vivo model and the thrombolytic effect in an in vitro model. Materials and Methods: Castor oil-induced diarrhea and enteropooling assays and gastrointestinal motility tests were used to examine the in vivo antidiarrheal activity in Wistar albino rats. An in vitro clot lysis model was used to investigate the thrombolytic action of the extract. Data were analyzed using statistical software (Statistical Package for Social Science, SPSS, version 19.0, SPSS Inc., USA). Results: The diarrheal episode was inhibited by 18.64% and 28.96% by the methanol extract at doses of 200 and 400 mg/kg, respectively. The extract significantly (P < 0.05) reduced intestinal volume and intestinal transit in comparison to control. The extract also reduced the rate of defecation, accumulation of fluid, and transit of the charcoal meal. The extract showed a moderate thrombolytic effect compared to the reference control. Conclusion: The methanol extract of W. indica might trigger novel drug discovery in the future owing to its antidiarrheal effect in the animal model.

  1. "IBSAR" Software 4.0

    Directory of Open Access Journals (Sweden)

    2004-06-01

    Full Text Available A review of Arabic software entitled "IBSAR", designed to help blind users operate the computer. The software speaks aloud the commands and the contents of screens and applications browsed by the user. This review includes a general introduction to the software, its components and commands, system requirements, and its functions with the Windows operating system and Microsoft Word.

  2. A Novel Software Evolution Model Based on Software Networks

    Science.gov (United States)

    Pan, Weifeng; Li, Bing; Ma, Yutao; Liu, Jing

    Many published papers have analyzed the forming mechanisms and evolution laws of OO software systems from the perspectives of software reuse, software patterns, etc. So far, however, few models have been built solely on software components such as methods and classes and their interactions. In this paper, a novel Software Evolution Model based on Software Networks (called SEM-SN) is proposed. It uses a software network at the class level to represent software systems, and uses the software network's dynamical generating process to simulate activities in the real software development process, such as the dynamical creation of new classes and their dynamical interactions with already existing classes. It also introduces the concept of node/edge ageing to describe the decaying of classes with time. Empirical results on eight open-source Object-Oriented (OO) software systems demonstrate that SEM-SN roughly describes the evolution process of software systems and the emergence of their complex network characteristics.
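
    The growth process described in this abstract (new class nodes attaching to existing ones, with ageing weakening old nodes) can be sketched as a toy simulation. This is an illustrative guess at the model's flavour, not the authors' actual SEM-SN implementation; the `decay` parameter, starting topology, and degree-times-age attachment rule are assumptions.

    ```python
    import random

    def grow_software_network(steps, decay=0.95, seed=0):
        """Toy SEM-SN-style growth: each step adds one new 'class' node that
        links to an existing node chosen with probability proportional to
        degree * age_weight; every node's age_weight decays each step,
        modelling the ageing of classes described in the abstract."""
        rng = random.Random(seed)
        degree = [1, 1]            # start from two linked classes
        weight = [1.0, 1.0]        # ageing weight per node (assumption)
        edges = [(0, 1)]
        for _ in range(steps):
            scores = [d * w for d, w in zip(degree, weight)]
            total = sum(scores)
            r = rng.uniform(0, total)
            target, acc = 0, 0.0
            for i, s in enumerate(scores):
                acc += s
                if r <= acc:
                    target = i
                    break
            new_node = len(degree)
            degree.append(1)
            weight.append(1.0)
            degree[target] += 1
            edges.append((new_node, target))
            weight = [w * decay for w in weight]   # all classes age one step
        return degree, edges
    ```

    With ageing (decay < 1), old high-degree classes gradually lose attractiveness, so the resulting degree distribution is less winner-takes-all than pure preferential attachment.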

  3. A Review of Software Maintenance Technology.

    Science.gov (United States)

    1980-02-01

    is a concept that must be formalized. Class differentiation in this context implies that there are varying maintenance requirements among the classes... has been developed using a number of fundamental equations which relate failures experienced, present MTTF, MTTF objective, and time required to meet... "Scheduled Maintenance of Applications Software," Datamation, Volume 19, Number 5, May 1973, pp. 64-67. Liskov, B. and Zilles, S., "Specification

  4. Estimation of Apple Volume and Its Shape Indentation Using Image Processing Technique and Neural Network

    Directory of Open Access Journals (Sweden)

    M Jafarlou

    2014-04-01

    Full Text Available Physical properties of agricultural products such as volume are the most important parameters influencing grading and packaging systems, and they should be measured accurately when designing any such system. Image processing and neural network techniques are non-destructive and useful methods that have recently been used for this purpose. In this study, images of apples were captured from a constant distance and then processed in MATLAB software, and the edges of the apple images were extracted. The interior area of the apple image was divided into thin trapezoidal elements perpendicular to the longitudinal axis. The total volume of the apple was estimated by summing the incremental volumes of these elements revolved around the apple's longitudinal axis. A picture of a half-cut apple was also captured in order to obtain the volume of the apple shape's indentation, which was subtracted from the previously estimated total volume. The real volume of the apples was measured using the water displacement method, and the relation between the real volume and the estimated volume was obtained. The t-test and Bland-Altman analysis indicated that the difference between the real volume and the estimated volume was not significant (p>0.05); the mean difference was 1.52 cm3 and the accuracy of measurement was 92%. Utilizing a neural network with input variables of dimension and mass increased the accuracy to 97% and decreased the difference between the mean volumes to 0.7 cm3.
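
    The slicing-and-revolving step in this abstract can be sketched as a solid-of-revolution sum; each thin element, revolved about the axis, is a conical frustum. The radius profile, element thickness, and frustum formulation below are illustrative assumptions, not the paper's MATLAB code.

    ```python
    import math

    def volume_of_revolution(radii, dx):
        """Estimate the volume of a solid of revolution from a profile of
        radii sampled every dx along the longitudinal axis. Each thin
        trapezoidal element, revolved about the axis, becomes a conical
        frustum; summing the frustum volumes approximates the total."""
        volume = 0.0
        for r1, r2 in zip(radii, radii[1:]):
            # Frustum volume: (pi * dx / 3) * (r1^2 + r1*r2 + r2^2)
            volume += math.pi * dx / 3.0 * (r1 * r1 + r1 * r2 + r2 * r2)
        return volume

    # Hypothetical radius profile (cm) read off an apple silhouette
    profile = [0.0, 1.8, 2.9, 3.4, 3.5, 3.3, 2.7, 1.6, 0.0]
    print(f"estimated volume: {volume_of_revolution(profile, dx=1.0):.1f} cm^3")
    ```

    The indentation volume from the half-cut image would be computed the same way and subtracted from this total.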

  5. Project Portfolio Management Software

    OpenAIRE

    Paul POCATILU

    2006-01-01

    In order to design a methodology for the development of project portfolio management (PPM) applications, the existing applications have to be studied. This paper describes the main characteristics of the leading project portfolio management software applications.

  6. Project Portfolio Management Software

    Directory of Open Access Journals (Sweden)

    Paul POCATILU

    2006-01-01

    Full Text Available In order to design a methodology for the development of project portfolio management (PPM applications, the existing applications have to be studied. This paper describes the main characteristics of the leading project portfolio management software applications.

  7. Test af Software

    DEFF Research Database (Denmark)

    This document constitutes the final report of the network collaboration "Testnet", carried out in the period 1 April 2006 to 31 December 2008. The network deals primarily with topics within testing of embedded and technical software, but a number of examples of problems and solutions connected with the testing of...... administrative software are also included. The report is divided into the following 3 parts: Overview, in which we give a summary of the network's purpose, activities, and results, outline the state of the art of software testing, and mention that CISS and the network are taking new initiatives. The Network: purpose, participants, and topics treated at...... a range of Danish software, electronics, and IT companies.

  8. Tier2 Submit Software

    Science.gov (United States)

    Download this tool for Windows or Mac, which helps facilities prepare a Tier II electronic chemical inventory report. The data can also be exported into the CAMEOfm (Computer-Aided Management of Emergency Operations) emergency planning software.

  9. ACS: ALMA Common Software

    Science.gov (United States)

    Chiozzi, Gianluca; Šekoranja, Matej

    2013-02-01

    ALMA Common Software (ACS) provides a software infrastructure common to all ALMA partners and consists of a documented collection of common patterns and components which implement those patterns. The heart of ACS is based on a distributed Component-Container model, with ACS Components implemented as CORBA objects in any of the supported programming languages. ACS provides common CORBA-based services such as logging, error and alarm management, configuration database, and lifecycle management. Although designed for ALMA, ACS can be and is being used in other control systems and distributed software projects, since it implements proven design patterns using state-of-the-art, reliable technology. Through the use of well-known standard constructs and components, it also allows team members who are not authors of ACS to easily understand the architecture of software modules, making maintenance affordable even on a very large project.

  10. Spreadsheet Auditing Software

    CERN Document Server

    Nixon, David

    2010-01-01

    It is now widely accepted that errors in spreadsheets are both common and potentially dangerous. Further research has taken place to investigate how frequently these errors occur, what impact they have, how the risk of spreadsheet errors can be reduced by following spreadsheet design guidelines and methodologies, and how effective auditing of a spreadsheet is in the detection of these errors. However, little research exists to establish the usefulness of software tools in the auditing of spreadsheets. This paper documents and tests office software tools designed to assist in the audit of spreadsheets. The test was designed to identify the success of software tools in detecting different types of errors, to identify how the software tools assist the auditor and to determine the usefulness of the tools.

  12. Software For Genetic Algorithms

    Science.gov (United States)

    Wang, Lui; Bayer, Steve E.

    1992-01-01

    SPLICER computer program is genetic-algorithm software tool used to solve search and optimization problems. Provides underlying framework and structure for building genetic-algorithm application program. Written in Think C.

  13. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  14. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suited for automatically developing ultra reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  15. Collaborative software development

    NARCIS (Netherlands)

    Jonge, M. de; Visser, E.; Visser, J.M.W.

    2001-01-01

    We present an approach to collaborative software development where obtaining components and contributing components across organizational boundaries are explicit phases in the development process. A lightweight generative infrastructure supports this approach with an online package base, and several

  16. Core Flight Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The mission of the CFS project is to provide reusable software in support of human space exploration programs.   The top-level technical approach to...

  17. TMT common software update

    Science.gov (United States)

    Gillies, Kim; Brighton, Allan; Buur, Hanne

    2016-08-01

    TMT Common Software (CSW) consists of software services and library code that is used by developers to create the subsystems and components that participate in the software system. CSW also defines the types of components that can be constructed and their functional roles in the software system. TMT CSW has recently passed its preliminary design review. The unique features of CSW include its use of multiple open-source products as the basis for services, and an approach that works to reduce the amount of CSW-provided infrastructure code. Considerable prototyping was completed during this phase to mitigate risk, with results that demonstrate the validity of this design approach and the selected service implementation products. This paper describes the latest design of TMT CSW, its key features, and results from the prototyping effort.

  18. Advanced Software Protection Now

    CERN Document Server

    Bendersky, Diego; Notarfrancesco, Luciano; Sarraute, Carlos; Waissbein, Ariel

    2010-01-01

    Software digital rights management is a pressing need for the software development industry that remains unmet, as no practical solution has been acclaimed successful by the industry. We introduce a novel software-protection method, fully implemented with today's technologies, that provides traitor tracing and license enforcement and requires neither additional hardware nor inter-connectivity. Our work benefits from the use of secure triggers, a cryptographic primitive that is secure assuming the existence of an IND-CPA secure block cipher. Using our framework, developers may insert license checks and fingerprints, and obfuscate the code using secure triggers. As a result, this raises the cost for software analysis tools to detect and modify protection mechanisms, thus raising the complexity of cracking this system.

  19. Banking Software Applications Security

    Directory of Open Access Journals (Sweden)

    Ioan Alexandru Bubu

    2015-03-01

    Full Text Available Computer software products are among the most complex artifacts, if not the most complex artifacts, mankind has created. Securing those artifacts against intelligent attackers who try to exploit flaws in software design and construction is a great challenge too. The purpose of this paper is to introduce a secure alternative to banking software applications that are currently in use. This new application aims to cover most of the well-known vulnerabilities that plague the majority of current software. First, we will take a quick look at current security methods that are in use and a few known vulnerabilities. After this, we will discuss the security measures implemented in my application, and finally, we will present the results of implementing them.

  20. Software citation principles

    Directory of Open Access Journals (Sweden)

    Arfon M. Smith

    2016-09-01

    Full Text Available Software is a critical part of modern research and yet there is little support across the scholarly ecosystem for its acknowledgement and citation. Inspired by the activities of the FORCE11 working group focused on data citation, this document summarizes the recommendations of the FORCE11 Software Citation Working Group and its activities between June 2015 and April 2016. Based on a review of existing community practices, the goal of the working group was to produce a consolidated set of citation principles that may encourage broad adoption of a consistent policy for software citation across disciplines and venues. Our work is presented here as a set of software citation principles, a discussion of the motivations for developing the principles, reviews of existing community practice, and a discussion of the requirements these principles would place upon different stakeholders. Working examples and possible technical solutions for how these principles can be implemented will be discussed in a separate paper.

  1. eSoftwareList

    Data.gov (United States)

    US Agency for International Development — USAID Software Database reporting tool created in Oracle Application Express (APEX). This version provides read only access to a database view of the JIRA SAR...

  2. Astronomical Software Directory Service

    Science.gov (United States)

    Hanisch, Robert J.; Payne, Harry; Hayes, Jeffrey

    1997-01-01

    With the support of NASA's Astrophysics Data Program (NRA 92-OSSA-15), we have developed the Astronomical Software Directory Service (ASDS): a distributed, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching. Users are performing about 400 searches per month. A new aspect of our service is the inclusion of telescope and instrumentation manuals, which prompted us to change the name to the Astronomical Software and Documentation Service. ASDS was originally conceived to serve two purposes: to provide a useful Internet service in an area of expertise of the investigators (astronomical software), and as a research project to investigate various architectures for searching through a set of documents distributed across the Internet. Two of the co-investigators were then installing and maintaining astronomical software as their primary job responsibility. We felt that a service which incorporated our experience in this area would be more useful than a straightforward listing of software packages. The original concept was for a service based on the client/server model, which would function as a directory/referral service rather than as an archive. For performing the searches, we began our investigation with a decision to evaluate the Isite software from the Center for Networked Information Discovery and Retrieval (CNIDR). This software was intended as a replacement for Wide-Area Information Service (WAIS), a client/server technology for performing full-text searches through a set of documents. Isite had some additional features that we considered attractive, and we enjoyed the cooperation of the Isite developers, who were happy to have ASDS as a demonstration project. 
We ended up staying with the software throughout the project, making modifications to take advantage of new features as they came along, as well as
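
    The full-text search service described above (a WAIS-style engine indexing documents for keyword queries) can be illustrated with a minimal inverted index. This is a generic sketch of the technique, not the Isite implementation; the document ids, tokenization, and AND-query semantics are illustrative assumptions.

    ```python
    import re
    from collections import defaultdict

    def build_index(docs):
        """Build a minimal inverted index mapping each lowercased word to
        the set of document ids that contain it."""
        index = defaultdict(set)
        for doc_id, text in docs.items():
            for word in re.findall(r"[a-z0-9]+", text.lower()):
                index[word].add(doc_id)
        return index

    def search(index, query):
        """AND-query: return the ids of documents containing every term."""
        terms = [t.lower() for t in query.split()]
        if not terms:
            return set()
        result = index.get(terms[0], set()).copy()
        for t in terms[1:]:
            result &= index.get(t, set())
        return result
    ```

    A production engine such as Isite adds ranking, stemming, and distributed collection of the documents, but the core lookup structure is the same.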

  3. Global software development

    DEFF Research Database (Denmark)

    Matthiesen, Stina

    2016-01-01

    This overview presents the mid stages of my doctoral research-based on ethnographic work conducted in IT companies in India and in Denmark-on collaborative work within global software development (GSD). In the following I briefly introduce how this research seeks to spark a debate in CSCW...... by challenging contemporary ideals about software development outsourcing through the exploration of the multiplicities and asymmetric dynamics inherent in the collaborative work of GSD....

  4. Biological Imaging Software Tools

    Science.gov (United States)

    Eliceiri, Kevin W.; Berthold, Michael R.; Goldberg, Ilya G.; Ibáñez, Luis; Manjunath, B.S.; Martone, Maryann E.; Murphy, Robert F.; Peng, Hanchuan; Plant, Anne L.; Roysam, Badrinath; Stuurman, Nico; Swedlow, Jason R.; Tomancak, Pavel; Carpenter, Anne E.

    2013-01-01

    Few technologies are more widespread in modern biological laboratories than imaging. Recent advances in optical technologies and instrumentation are providing hitherto unimagined capabilities. Almost all these advances have required the development of software to enable the acquisition, management, analysis, and visualization of the imaging data. We review each computational step that biologists encounter when dealing with digital images, the challenges in that domain, and the overall status of available software for bioimage informatics, focusing on open source options. PMID:22743775

  5. Engineering and Software Engineering

    Science.gov (United States)

    Jackson, Michael

    The phrase ‘software engineering' has many meanings. One central meaning is the reliable development of dependable computer-based systems, especially those for critical applications. This is not a solved problem. Failures in software development have played a large part in many fatalities and in huge economic losses. While some of these failures may be attributable to programming errors in the narrowest sense—a program's failure to satisfy a given formal specification—there is good reason to think that most of them have other roots. These roots are located in the problem of software engineering rather than in the problem of program correctness. The famous 1968 conference was motivated by the belief that software development should be based on “the types of theoretical foundations and practical disciplines that are traditional in the established branches of engineering.” Yet after forty years of currency the phrase ‘software engineering' still denotes no more than a vague and largely unfulfilled aspiration. Two major causes of this disappointment are immediately clear. First, too many areas of software development are inadequately specialised, and consequently have not developed the repertoires of normal designs that are the indispensable basis of reliable engineering success. Second, the relationship between structural design and formal analytical techniques for software has rarely been one of fruitful synergy: too often it has defined a boundary between competing dogmas, at which mutual distrust and incomprehension deprive both sides of advantages that should be within their grasp. This paper discusses these causes and their effects. Whether the common practice of software development will eventually satisfy the broad aspiration of 1968 is hard to predict; but an understanding of past failure is surely a prerequisite of future success.

  6. 3-Dimensional Right Ventricular Volume Assessment

    NARCIS (Netherlands)

    Jainandunsing, Jayant S.; Matyal, Robina; Shahul, Sajid S.; Wang, Angela; Woltersom, Bozena; Mahmood, Feroze

    Purpose: The purpose of this review was to evaluate new computer software available for 3-dimensional right ventricular (RV) volume estimation. Description: Based on 2-dimensional echocardiography, various algorithms have been used for RV volume estimation. These are complex, time-consuming

  7. The Application Value of T1-vibe MRI Imaging Combined with Argus Software in Measuring the Knee Osteoarthritis Patella Cartilage Volume%T1-vibe MRI成像结合Argus软件测量膝骨关节炎髌骨软骨体积的应用价值

    Institute of Scientific and Technical Information of China (English)

    董玉茹; 王宏; 刘腾; 尹媛媛

    2015-01-01

    Objective: To study the value of T1-vibe MRI imaging combined with Argus software for measuring patellar cartilage volume in knee osteoarthritis. Methods: 88 patients treated for knee osteoarthritis at our hospital between June 2013 and June 2014 were selected as subjects and randomly divided, using a random number table, into an observation group (n=44) and a control group (n=44). In the observation group, patellar cartilage volume was measured using T1-vibe MRI imaging combined with Argus software; in the control group, T1-vibe MRI imaging was combined with conventional manual measurement. The measurement results of the two groups were compared to assess the application value. Results: The observation group required significantly less operating time than the control group (P<0.05). Comparing repeatability errors, the observation group's errors were 6.2%, 1.6%, (0.7-2.0)%, and 22%; apart from inter-individual variation, these were all significantly lower than those of the control group (P<0.05). Patellar volume measurements did not differ notably between the two groups (P>0.05); the systematic paired error was -3.8% and the random paired error was (4.7±2.6)%. Conclusion: Measuring patellar cartilage volume in knee osteoarthritis using T1-vibe MRI imaging combined with Argus software greatly reduces the time required for measurement, offers high repeatability and good safety, and is recommended for clinical use.

  8. Towards research on software cybernetics

    OpenAIRE

    2002-01-01

    Software cybernetics is a newly proposed area in software engineering. It makes better use of the interplay between control theory/engineering and software engineering. In this paper, we look into the research potentials of this emerging area.

  9. Geopressured geothermal bibliography. Volume 1 (citation extracts)

    Energy Technology Data Exchange (ETDEWEB)

    Hill, T.R.; Sepehrnoori, K.

    1981-08-01

    This bibliography was compiled by the Center for Energy Studies at The University of Texas at Austin to serve as a tool for researchers in the field of geopressured geothermal energy resources. The bibliography represents citations of papers on geopressured geothermal energy resources over the past eighteen years. Topics covered in the bibliography range from the technical aspects of geopressured geothermal reservoirs to social, environmental, and legal aspects of tapping those reservoirs for their energy resources. The bibliography currently contains more than 750 entries. For quick reference to a given topic, the citations are indexed into five divisions: author, category, conference title, descriptor, and sponsor. These indexes are arranged alphabetically and cross-referenced by page number.

  10. Software Engineering Reviews and Audits

    CERN Document Server

    Summers, Boyd L

    2011-01-01

    Accurate software engineering reviews and audits have become essential to the success of software companies and military and aerospace programs. These reviews and audits define the framework and specific requirements for verifying software development efforts. Authored by an industry professional with three decades of experience, Software Engineering Reviews and Audits offers authoritative guidance for conducting and performing software first article inspections, and functional and physical configuration software audits. It prepares readers to answer common questions for conducting and perform

  11. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine…

  12. Nano-electromembrane extraction

    DEFF Research Database (Denmark)

    Payán, María D Ramos; Li, Bin; Petersen, Nickolaj J.

    2013-01-01

    The present work has for the first time described nano-electromembrane extraction (nano-EME). In nano-EME, five basic drug substances were extracted as model analytes from 200 μL acidified sample solution, through a supported liquid membrane (SLM) of 2-nitrophenyl octyl ether (NPOE), and into approximately 8 nL phosphate buffer (pH 2.7) as acceptor phase. The driving force for the extraction was an electrical potential sustained over the SLM. The acceptor phase was located inside a fused silica capillary, and this capillary was also used for the final analysis of the acceptor phase by capillary … as extraction selectivity. Compared with conventional EME, the acceptor phase volume in nano-EME was down-scaled by a factor of more than 1000. This resulted in a very high enrichment capacity. With loperamide as an example, an enrichment factor exceeding 500 was obtained in only 5 min of extraction…
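    The volume ratio quoted in this record bounds the achievable enrichment, which can be sketched with simple arithmetic (a minimal illustration; the function name and the 100%-recovery ceiling are our assumptions, not the paper's):

```python
# Hypothetical enrichment-factor arithmetic for nano-EME, based on the
# volumes quoted in the record (200 uL sample, ~8 nL acceptor phase).

def enrichment_factor(recovery, sample_vol, acceptor_vol):
    """Enrichment factor = recovery * (sample volume / acceptor volume)."""
    return recovery * sample_vol / acceptor_vol

V_SAMPLE = 200e-6    # 200 uL, in litres
V_ACCEPTOR = 8e-9    # ~8 nL, in litres

# Theoretical ceiling at 100% recovery: the volume ratio itself.
max_ef = enrichment_factor(1.0, V_SAMPLE, V_ACCEPTOR)
print(round(max_ef))  # 25000

# The reported factor of >500 after 5 min implies roughly 2% recovery so far.
implied_recovery = 500 / (V_SAMPLE / V_ACCEPTOR)
print(round(implied_recovery, 3))  # 0.02
```

    This makes the record's point concrete: down-scaling the acceptor volume by a factor of more than 1000 raises the enrichment ceiling by the same factor.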

  13. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but it also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand the management of distributed software projects, but it also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques into a framework for managing risks in distributed contexts. This framework was developed iteratively…

  14. The Ettention software package

    Energy Technology Data Exchange (ETDEWEB)

    Dahmen, Tim, E-mail: Tim.Dahmen@dfki.de [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany); Marsalek, Lukas [Eyen SE, Na Nivách 1043/16, 141 00 Praha 4 (Czech Republic); Saarland University, 66123 Saarbrücken (Germany); Marniok, Nico [Saarland University, 66123 Saarbrücken (Germany); Turoňová, Beata [Saarland University, 66123 Saarbrücken (Germany); IMPRS-CS, Max-Planck Institute for Informatics, Campus E 1.4, 66123 Saarbrücken (Germany); Bogachev, Sviatoslav [Saarland University, 66123 Saarbrücken (Germany); Trampert, Patrick; Nickels, Stefan [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Slusallek, Philipp [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany)

    2016-02-15

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.

  15. Software Activation Using Multithreading

    Directory of Open Access Journals (Sweden)

    Jianrui Zhang

    2012-11-01

    Full Text Available Software activation is an anti-piracy technology designed to verify that software products have been legitimately licensed. Activation should be quick and simple while simultaneously being secure and protecting customer privacy. The most common form of software activation is for the user to enter a legitimate product serial number. However, software activation based on serial numbers appears to be weak, since cracks for many programs are readily available on the Internet. Users can employ such cracks to bypass software activation. Serial number verification logic usually executes sequentially in a single thread. Such an approach is relatively easy to break since attackers can trace the code to understand how the logic works. In this paper, we develop a practical multi-threaded verification design. Our results show that by proper use of multi-threading, the amount of traceable code in a debugger can be reduced to a low percentage of the total, and the traceable code can differ from run to run. This makes it significantly more difficult for an attacker to reverse engineer the code as a means of bypassing a security check. Finally, we attempt to quantify the increased effort needed to break our verification logic.
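    The general idea can be sketched as follows. This is a minimal illustration, not the paper's actual design: the chunking, the toy checksum, and all names are our assumptions.

```python
# Sketch: split serial-number verification across threads so that no single
# thread's execution path contains the whole check. A debugger tracing one
# thread sees only a fraction of the verification logic.
import threading

def make_checker(results, idx, expected):
    def check(chunk):
        # Each thread validates one chunk with a toy checksum (illustrative only).
        results[idx] = (sum(ord(c) for c in chunk) % 97 == expected)
    return check

def verify_serial(serial, expected_digests):
    chunks = [serial[i:i + 4] for i in range(0, len(serial), 4)]
    results = [False] * len(chunks)
    threads = [threading.Thread(target=make_checker(results, i, e), args=(c,))
               for i, (c, e) in enumerate(zip(chunks, expected_digests))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # Activation succeeds only if every partial check passed.
    return all(results)

serial = "ABCDEFGH"
digests = [sum(ord(c) for c in serial[i:i + 4]) % 97 for i in (0, 4)]
print(verify_serial(serial, digests))  # True
```

    A real design would also randomize thread scheduling and interleave decoy work, which is what makes the traceable code differ between runs.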

  16. Plume Ascent Tracker: Interactive Matlab software for analysis of ascending plumes in image data

    Science.gov (United States)

    Valade, S. A.; Harris, A. J. L.; Cerminara, M.

    2014-05-01

    This paper presents Matlab-based software designed to track and analyze an ascending plume as it rises above its source, in image data. It reads data recorded in various formats (video files, image files, or web-camera image streams), and at various wavelengths (infrared, visible, or ultra-violet). Using a set of filters which can be set interactively, the plume is first isolated from its background. A user-friendly interface then allows tracking of plume ascent and various parameters that characterize plume evolution during emission and ascent. These include records of plume height, velocity, acceleration, shape, volume, ash (fine-particle) loading, spreading rate, entrainment coefficient and inclination angle, as well as axial and radial profiles for radius and temperature (if data are radiometric). Image transformations (dilatation, rotation, resampling) can be performed to create new images with a vent-centered metric coordinate system. Applications may interest both plume observers (monitoring agencies) and modelers. For the first group, the software is capable of providing quantitative assessments of plume characteristics from image data, for post-event analysis or in near real-time analysis. For the second group, extracted data can serve as benchmarks for plume ascent models, and as inputs for cloud dispersal models. We here describe the software's tracking methodology and main graphical interfaces, using thermal infrared image data of an ascending volcanic ash plume at Santiaguito volcano.
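    The basic isolate-then-measure step described above can be illustrated with a toy sketch. This is plain Python, not the Matlab tool itself; the threshold, pixel scale, and the assumption that the vent sits at the bottom of the frame are ours.

```python
# Sketch: isolate a plume from its background by brightness thresholding,
# then record plume height as the topmost row containing plume pixels.

def plume_height(frame, threshold, metres_per_pixel):
    """frame: 2D list of brightness values, row 0 = top of image."""
    for row_idx, row in enumerate(frame):
        if any(v > threshold for v in row):
            # Height above the bottom of the image (the vent, by assumption).
            return (len(frame) - row_idx) * metres_per_pixel
    return 0.0

# Toy 5x4 thermal frame: hot plume pixels reach the second row from the top.
frame = [
    [10, 10, 10, 10],
    [10, 80, 10, 10],
    [10, 90, 85, 10],
    [10, 95, 92, 10],
    [10, 99, 97, 11],
]
print(plume_height(frame, threshold=50, metres_per_pixel=20.0))  # 80.0
```

    Tracking this height per frame yields the velocity and acceleration records the paper describes; the other parameters (volume, spreading rate, entrainment) require the full 2D plume mask rather than just its top row.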

  17. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  18. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  20. The Solid* toolset for software visual analytics of program structure and metrics comprehension : From research prototype to product

    NARCIS (Netherlands)

    Reniers, Dennie; Voinea, Lucian; Ersoy, Ozan; Telea, Alexandru

    2014-01-01

    Software visual analytics (SVA) tools combine static program analysis and fact extraction with information visualization to support program comprehension. However, building efficient and effective SVA tools is highly challenging, as it involves extensive software development in program analysis, gra

  1. Parallel Feature Extraction System

    Institute of Scientific and Technical Information of China (English)

    MA Huimin; WANG Yan

    2003-01-01

    Very high-speed image processing is needed in some applications, especially for weapons. In this paper, a high-speed image feature extraction system with a parallel structure was implemented in a Complex Programmable Logic Device (CPLD); it can perform image feature extraction in several microseconds, almost with no delay. The system design is presented through the application instance of a flying plane, whose infrared image includes two kinds of features: geometric shape features in the binary image and temperature features in the gray image. Feature extraction is performed accordingly on these two kinds of features. Edges and areas are the two most important features of an image. Angles often occur where different parts of the target's image connect, indicating that one area ends and another begins. These three key features can form the whole representation of an image. The parallel feature extraction system therefore includes three processing modules: edge extraction, angle extraction, and area extraction. The parallel structure is realized by a group of processors: every detector is followed by one processing route, every route has the same circuit form, and all routes work simultaneously under a common clock to realize feature extraction. The extraction system has a simple structure, small volume, high speed, and good stability against noise. It can be used in battlefield recognition systems.
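    Two of the three feature channels (area and edge) can be sketched sequentially, as a purely illustrative stand-in for what the CPLD computes in parallel; the function names and the 4-neighbour boundary rule are our assumptions, not the paper's circuit.

```python
# Sketch: area (count of set pixels) and edge (boundary pixels) features
# of a binary image, computed sequentially for illustration.

def extract_features(img):
    h, w = len(img), len(img[0])
    area = sum(sum(row) for row in img)

    def on(y, x):
        return 0 <= y < h and 0 <= x < w and img[y][x] == 1

    edges = 0
    for y in range(h):
        for x in range(w):
            # A set pixel with at least one missing 4-neighbour is a boundary pixel.
            if img[y][x] and not all(on(y + dy, x + dx)
                                     for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))):
                edges += 1
    return {"area": area, "edge_pixels": edges}

img = [[0, 0, 0, 0],
       [0, 1, 1, 0],
       [0, 1, 1, 0],
       [0, 0, 0, 0]]
print(extract_features(img))  # {'area': 4, 'edge_pixels': 4}
```

    Angle extraction would additionally look for direction changes along the boundary chain, which is omitted here for brevity.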

  2. libdrdc: software standards library

    Science.gov (United States)

    Erickson, David; Peng, Tie

    2008-04-01

    This paper presents the libdrdc software standards library including internal nomenclature, definitions, units of measure, coordinate reference frames, and representations for use in autonomous systems research. This library is a configurable, portable C-function wrapped C++ / Object Oriented C library developed to be independent of software middleware, system architecture, processor, or operating system. It is designed to use the automatically-tuned linear algebra suite (ATLAS) and Basic Linear Algebra Suite (BLAS) and port to firmware and software. The library goal is to unify data collection and representation for various microcontrollers and Central Processing Unit (CPU) cores and to provide a common Application Binary Interface (ABI) for research projects at all scales. The library supports multi-platform development and currently works on Windows, Unix, GNU/Linux, and Real-Time Executive for Multiprocessor Systems (RTEMS). This library is made available under LGPL version 2.1 license.

  3. Lecture 2: Software Security

    CERN Document Server

    CERN. Geneva

    2013-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development, testing and deployment. Sebastian Lopienski is CERN’s deputy Computer Security Officer. He works on security strategy and policies; offers internal consultancy and audit services; develops and ...

  4. Astronomers as Software Developers

    Science.gov (United States)

    Pildis, Rachel A.

    2016-01-01

    Astronomers know that their research requires writing, adapting, and documenting computer software. Furthermore, they often have to learn new computer languages and figure out how existing programs work without much documentation or guidance and with extreme time pressure. These are all skills that can lead to a software development job, but recruiters and employers probably won't know that. I will discuss all the highly useful experience that astronomers may not know that they already have, and how to explain that knowledge to others when looking for non-academic software positions. I will also talk about some of the pitfalls I have run into while interviewing for jobs and working as a developer, and encourage you to embrace the curiosity employers might have about your non-standard background.

  5. Secure software practices among Malaysian software practitioners: An exploratory study

    Science.gov (United States)

    Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina

    2016-08-01

    Secure software practices are increasingly gaining importance among software practitioners and researchers due to the rise of computer crimes in the software industry. They have become one of the determinant factors for producing high-quality software. Even though their importance has been revealed, their current practice in the software industry is still scarce, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was utilized for data collection, whilst statistical methods such as frequency, mean, and cross tabulation were used for data analysis. Outcomes from this study reveal that software practitioners are becoming increasingly aware of the importance of secure software practices; however, they lack appropriate implementation, which could affect the quality of the produced software.

  6. Methods, software and datasets to verify DVH calculations against analytical values: Twenty years late(r)

    Energy Technology Data Exchange (ETDEWEB)

    Nelms, Benjamin [Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Stambaugh, Cassandra [Department of Physics, University of South Florida, Tampa, Florida 33612 (United States); Hunt, Dylan; Tonner, Brian; Zhang, Geoffrey; Feygelman, Vladimir, E-mail: vladimir.feygelman@moffitt.org [Department of Radiation Oncology, Moffitt Cancer Center, Tampa, Florida 33612 (United States)

    2015-08-15

    Purpose: The authors designed data, methods, and metrics that can serve as a standard, independent of any software package, to evaluate dose-volume histogram (DVH) calculation accuracy and detect limitations. The authors use simple geometrical objects at different orientations combined with dose grids of varying spatial resolution with linear 1D dose gradients; when combined, ground truth DVH curves can be calculated analytically in closed form to serve as the absolute standards. Methods: DICOM RT structure sets containing a small sphere, cylinder, and cone were created programmatically with axial plane spacing varying from 0.2 to 3 mm. Cylinders and cones were modeled in two different orientations with respect to the IEC 1217 Y axis. The contours were designed to stringently but methodically test voxelation methods required for DVH. Synthetic RT dose files were generated with 1D linear dose gradient and with grid resolution varying from 0.4 to 3 mm. Two commercial DVH algorithms—PINNACLE (Philips Radiation Oncology Systems) and PlanIQ (Sun Nuclear Corp.)—were tested against analytical values using custom, noncommercial analysis software. In Test 1, axial contour spacing was constant at 0.2 mm while dose grid resolution varied. In Tests 2 and 3, the dose grid resolution was matched to varying subsampled axial contours with spacing of 1, 2, and 3 mm, and difference analysis and metrics were employed: (1) histograms of the accuracy of various DVH parameters (total volume, D{sub max}, D{sub min}, and doses to % volume: D99, D95, D5, D1, D0.03 cm{sup 3}) and (2) volume errors extracted along the DVH curves were generated and summarized in tabular and graphical forms. Results: In Test 1, PINNACLE produced 52 deviations (15%) while PlanIQ produced 5 (1.5%). In Test 2, PINNACLE and PlanIQ differed from analytical by >3% in 93 (36%) and 18 (7%) times, respectively. Excluding D{sub min} and D{sub max} as least clinically relevant would result in 32 (15%) vs 5 (2
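    For the sphere case, the closed-form ground truth the authors rely on is a spherical-cap volume, which can be sketched as follows (illustrative only; the function name and parameter values are ours):

```python
import math

# Sketch: for a sphere in a linear 1D dose gradient, the cumulative DVH has
# a closed form (a spherical-cap volume fraction), usable as an analytical
# standard against which voxel-based DVH code can be checked.

def sphere_dvh_fraction(dose, radius, d_center, grad):
    """Fraction of sphere volume receiving >= dose.
    Dose field: D(z) = d_center + grad * z, sphere centred at z = 0, grad > 0."""
    z0 = (dose - d_center) / grad          # plane where D(z) equals the query dose
    if z0 <= -radius:
        return 1.0                         # whole sphere is above the isodose plane
    if z0 >= radius:
        return 0.0                         # whole sphere is below it
    h = radius - z0                        # height of the cap above the plane
    cap = math.pi * h * h * (3 * radius - h) / 3.0
    full = 4.0 / 3.0 * math.pi * radius ** 3
    return cap / full

# At the centre dose, exactly half the sphere lies above the isodose plane.
print(round(sphere_dvh_fraction(60.0, radius=1.0, d_center=60.0, grad=10.0), 6))  # 0.5
```

    Sweeping `dose` over the gradient's range traces the full analytical DVH curve, against which a voxelized calculation's deviations can be measured.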

  7. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software verification. Methods of software verification are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. These methods vary both in their operation process and in the way they achieve results. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described, and the pros and cons of each particular method are emphasized. The article considers a classification of test techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring, and profiling are presented and analyzed, along with some kinds of tools that can be applied to software when using dynamic analysis methods. Based on this work, a conclusion is drawn describing the most relevant problems of these analysis techniques and methods for their solution…
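    The symbolic-execution idea mentioned in this record can be illustrated with a toy path enumerator (ours, not from the article): instead of running the program on concrete inputs, each branch contributes a constraint, and every feasible combination of constraints describes one execution path.

```python
# Toy symbolic execution of:  if x > 5: if x < 10: BUG
# Enumerate path constraints rather than concrete runs. Purely illustrative.

def symbolic_paths():
    paths = []
    for cond1 in (True, False):
        constraints = ["x > 5" if cond1 else "x <= 5"]
        if cond1:
            # The inner branch is only reachable when the outer condition holds.
            for cond2 in (True, False):
                paths.append(constraints + (["x < 10"] if cond2 else ["x >= 10"]))
        else:
            paths.append(constraints)
    return paths

for p in symbolic_paths():
    print(" and ".join(p))
# x > 5 and x < 10   <- the path that reaches BUG
# x > 5 and x >= 10
# x <= 5
```

    A real symbolic-execution engine would hand each constraint set to an SMT solver to produce a concrete witness input for every path.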

  8. Processing Optimization with Canon's Software

    DEFF Research Database (Denmark)

    Precht, Helle

    2009-01-01

    … Possibilities in software optimization were studied in relation to optimal image quality and control exposures, to investigate whether it was possible to accept diagnostic image quality and thereby take the ALARA principle as the starting point. Method and materials: A quantitative experimental study based on experiments with a technical and a human phantom. The CD Rad phantom was used as the technical phantom; its images were analyzed with the CD Rad software, yielding an objective IQF value. The human phantom was a lamb pelvis with femur, which according to the NRPB is comparable in absorption to a five-year-old child. The human test images were…

  9. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality...... requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...

  10. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-06-01

    Full Text Available The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, books have been widely published whose titles refer to it as an art, role, or process. But because software complexity increases every year, this paper proposes a new approach, conceiving testing as a science. This is because the process by which tests are applied follows the steps of the scientific method: inputs, processes, outputs. This paper examines the similarities between the characteristics of testing and those of a science.

  11. Agile software development

    CERN Document Server

    Stober, Thomas

    2009-01-01

    Software Development is moving towards a more agile and more flexible approach. It turns out that the traditional 'waterfall' model is not supportive in an environment where technical, financial and strategic constraints are changing almost every day. But what is agility? What are today's major approaches? And especially: What is the impact of agile development principles on the development teams, on project management and on software architects? How can large enterprises become more agile and improve their business processes, which have been existing since many, many years? What are the limit

  12. Software Safety and Security

    CERN Document Server

    Nipkow, T; Hauptmann, B

    2012-01-01

    Recent decades have seen major advances in methods and tools for checking the safety and security of software systems. Automatic tools can now detect security flaws not only in programs of the order of a million lines of code, but also in high-level protocol descriptions. There has also been something of a breakthrough in the area of operating system verification. This book presents the lectures from the NATO Advanced Study Institute on Tools for Analysis and Verification of Software Safety and Security; a summer school held at Bayrischzell, Germany, in 2011. This Advanced Study Institute was

  13. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using APT (Automatically Programmed Tool) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for the control of metal-cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  14. Software product quality control

    CERN Document Server

    Wagner, Stefan

    2013-01-01

    Quality is not a fixed or universal property of software; it depends on the context and goals of its stakeholders. Hence, when you want to develop a high-quality software system, the first step must be a clear and precise specification of quality. Yet even if you get it right and complete, you can be sure that it will become invalid over time. So the only solution is continuous quality control: the steady and explicit evaluation of a product's properties with respect to its updated quality goals.This book guides you in setting up and running continuous quality control in your environment. Star

  15. TIA Software User's Manual

    Science.gov (United States)

    Cramer, K. Elliott; Syed, Hazari I.

    1995-01-01

    This user's manual describes the installation and operation of TIA, the Thermal-Imaging acquisition and processing Application, developed by the Nondestructive Evaluation Sciences Branch at NASA Langley Research Center, Hampton, Virginia. TIA is a user friendly graphical interface application for the Macintosh 2 and higher series computers. The software has been developed to interface with the Perceptics/Westinghouse Pixelpipe(TM) and PixelStore(TM) NuBus cards and the GW Instruments MacADIOS(TM) input-output (I/O) card for the Macintosh for imaging thermal data. The software is also capable of performing generic image-processing functions.

  16. Workstation software framework

    Science.gov (United States)

    Andolfato, L.; Karban, R.

    2008-08-01

    The Workstation Software Framework (WSF) is a state machine model driven development toolkit designed to generate event driven applications based on ESO VLT software. State machine models are used to generate executables. The toolkit provides versatile code generation options and it supports Mealy, Moore and hierarchical state machines. Generated code is readable and maintainable since it combines well known design patterns such as the State and the Template patterns. WSF promotes a development process that is based on model reusability through the creation of a catalog of state machine patterns.
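    The State pattern that WSF's generated code combines with the Template pattern can be sketched in miniature. This is illustrative only; WSF itself generates code for ESO VLT software from state machine models, and all names below are ours.

```python
# Minimal State-pattern sketch: each state object decides the transition
# for an incoming event, so the event loop stays generic.

class State:
    def on_event(self, event):
        return self  # default: ignore unknown events

class Idle(State):
    def on_event(self, event):
        return Running() if event == "start" else self

class Running(State):
    def on_event(self, event):
        return Idle() if event == "stop" else self

class Machine:
    def __init__(self):
        self.state = Idle()

    def handle(self, event):
        # Event-driven dispatch: the current state object picks the next state.
        self.state = self.state.on_event(event)
        return type(self.state).__name__

m = Machine()
print(m.handle("start"))  # Running
print(m.handle("stop"))   # Idle
```

    A model-driven toolkit like WSF generates the `State` subclasses and their transition tables from the model, which is what keeps the generated code readable and the patterns consistent.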

  17. Green in software engineering

    CERN Document Server

    Calero Munoz, Coral

    2015-01-01

    This is the first book that presents a comprehensive overview of sustainability aspects in software engineering. Its format follows the structure of the SWEBOK and covers the key areas involved in the incorporation of green aspects in software engineering, encompassing topics from requirement elicitation to quality assurance and maintenance, while also considering professional practices and economic aspects. The book consists of thirteen chapters, which are structured in five parts. First the "Introduction" gives an overview of the primary general concepts related to Green IT, discussing wha

  18. Six Sigma software development

    CERN Document Server

    Tayntor, Christine B

    2002-01-01

    Since Six Sigma has had marked success in improving quality in other settings, and since the quality of software remains poor, it seems a natural evolution to apply the concepts and tools of Six Sigma to system development and the IT department. Until now, however, there were no books available that applied these concepts to the system development process. Six Sigma Software Development fills this void and illustrates how Six Sigma concepts can be applied to all aspects of the evolving system development process. It includes the traditional waterfall model and the support of legacy systems,

  19. INTEGRATING CRM SOFTWARE APPLICATIONS

    OpenAIRE

    2008-01-01

    Scientists, end users of CRM applications, and producers of CRM software all come to an agreement when talking about the idea of CRM, the CRM strategy, or the term CRM. The main aspect is that CRM can be analyzed from two different points of view: CRM – the marketing strategy and CRM – the software. The first term refers to establishing personalized relationships with the customers that can afterwards be easily managed. This way, it can determine at any time the past client relations, the...

  20. Future Trends of Software Technology and Applications: Software Architecture

    Science.gov (United States)

    2006-01-01

    Sponsored by the U.S. Department of Defense. © 2006 by Carnegie Mellon University, Pittsburgh, PA 15213-3890. Future Trends of Software Technology and Applications: Software Architecture. Paul Clements, Software Engineering Institute, Carnegie Mellon University.

  1. Determination of Acrylamide in Water by Liquid-Liquid Small Volume Extraction Combined with Gas Chromatography%液液小体积萃取气相色谱法测定水中丙烯酰胺

    Institute of Scientific and Technical Information of China (English)

    陈蓓蓓

    2013-01-01

    A method for the determination of acrylamide in water by liquid-liquid small-volume extraction combined with gas chromatography (GC-ECD) was proposed. The preprocessing was optimized, shortening the processing time and reducing the amounts of reagent and sample water used. H2SO4, KBrO3 and KBr were added into 20 mL of water and allowed to react for at least 30 min at 4 °C, during which acrylamide reacted with the nascent bromine to form 2,3-dibromopropionamide. Na2S2O3 was added to remove the excess bromine, and anhydrous Na2SO4 was dissolved at room temperature for salting out. The sample was extracted with 2.0 mL of ethyl acetate and the extract was analyzed by GC-ECD. A calibration curve was established from the response peak area of 2,3-dibromopropionamide versus the concentration of acrylamide in water, with a correlation coefficient of 0.999 47. The method is applicable to the determination of acrylamide in drinking water and surface water, with a detection limit of 0.026 μg/L, a relative standard deviation of 2.70%, and average recoveries of 80.67% to 100.43%.
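The calibration-curve step described above can be illustrated with a small least-squares sketch. The standard concentrations, peak areas, and sample reading below are made-up numbers for illustration only, not the paper's data.

```python
# Fit a linear calibration curve (peak area vs. concentration) by ordinary
# least squares, then read an unknown sample's concentration off the curve.

def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

# Hypothetical standards: concentration (ug/L) -> derivative peak area
conc = [0.5, 1.0, 2.0, 5.0, 10.0]
area = [102, 198, 405, 1010, 1985]
slope, intercept = linear_fit(conc, area)

# Invert the curve for an unknown sample's peak area
sample_area = 560
sample_conc = (sample_area - intercept) / slope
```

The correlation coefficient reported in the paper (0.999 47) would be computed from the same fit residuals.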

  2. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

    Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test management. This book examines the principles, concepts, and processes that are fundamental to the software testing function. This book is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c

  3. The Art of Software Testing

    CERN Document Server

    Myers, Glenford J; Badgett, Tom

    2011-01-01

    The classic, landmark work on software testing The hardware and software of computing have changed markedly in the three decades since the first edition of The Art of Software Testing, but this book's powerful underlying analysis has stood the test of time. Whereas most books on software testing target particular development techniques, languages, or testing methods, The Art of Software Testing, Third Edition provides a brief but powerful and comprehensive presentation of time-proven software testing approaches. If your software development project is mission critical, this book is an investme

  4. Evaluation & Optimization of Software Engineering

    Directory of Open Access Journals (Sweden)

    Asaduzzaman Noman

    2016-06-01

    Full Text Available The term is made up of two words, software and engineering. Software is more than just program code. A program is an executable code, which serves some computational purpose. Software is considered to be a collection of executable programming code, associated libraries and documentation. Software, when made for a specific requirement, is called a software product. Engineering, on the other hand, is all about developing products using well-defined scientific principles and methods. The outcome of software engineering is an efficient and reliable software product. IEEE defines software engineering as: the application of a systematic, disciplined, quantifiable approach to the development, operation and maintenance of software; that is, the application of engineering to software.

  5. Mining dynamic noteworthy functions in software execution sequences.

    Science.gov (United States)

    Zhang, Bing; Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong

    2017-01-01

    As the quality of crucial entities can directly affect that of software, their identification and protection become an important premise for effective software development, management, maintenance and testing, which thus contribute to improving the software quality and its attack-defending ability. Most analyses and evaluations of important entities, such as code-based static-structure analysis, disregard the actual running of the software. In this paper, from the perspective of the software execution process, we proposed an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, through software decompiling and tracking of stack changes, execution traces composed of a series of function addresses were acquired. These traces were then modeled as execution sequences and simplified so as to obtain simplified sequences (SFS), followed by the extraction of patterns from the SFS with a pattern-extraction (PE) algorithm. After that, the evaluating indicators inner-importance and inter-importance were designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, these functions were sorted by their noteworthiness. Comparison and contrast were conducted on the experimental results against two traditional complex-network-based node-mining methods, namely PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely.
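As a loose, illustrative analogue of the pipeline described above (not the authors' DNFM algorithm), one can rank functions in execution traces by how often they occur inside repeated call patterns. The trace format, pattern length, and scoring rule below are all assumptions made for illustration.

```python
# Rank functions in execution traces: count fixed-length call patterns,
# keep only the repeated ones, and credit each member function with the
# pattern's frequency (a crude stand-in for an "inner-importance" score).
from collections import Counter

def rank_functions(traces, pattern_len=2):
    score = Counter()
    pattern_counts = Counter()
    for t in traces:
        for i in range(len(t) - pattern_len + 1):
            pattern_counts[tuple(t[i:i + pattern_len])] += 1
    for pattern, n in pattern_counts.items():
        if n > 1:                 # keep only repeated (frequent) patterns
            for f in pattern:
                score[f] += n
    return [f for f, _ in score.most_common()]

# Hypothetical traces of function names recovered from a call stack
traces = [["main", "parse", "eval", "parse", "eval"],
          ["main", "parse", "eval", "print"]]
ranking = rank_functions(traces)
```

Here "parse" ranks first because it appears in the most frequent patterns, which is the intuition behind mining noteworthy functions from dynamic behavior rather than static structure.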

  6. Mining dynamic noteworthy functions in software execution sequences

    Science.gov (United States)

    Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong

    2017-01-01

    As the quality of crucial entities can directly affect that of software, their identification and protection become an important premise for effective software development, management, maintenance and testing, which thus contribute to improving the software quality and its attack-defending ability. Most analyses and evaluations of important entities, such as code-based static-structure analysis, disregard the actual running of the software. In this paper, from the perspective of the software execution process, we proposed an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, through software decompiling and tracking of stack changes, execution traces composed of a series of function addresses were acquired. These traces were then modeled as execution sequences and simplified so as to obtain simplified sequences (SFS), followed by the extraction of patterns from the SFS with a pattern-extraction (PE) algorithm. After that, the evaluating indicators inner-importance and inter-importance were designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, these functions were sorted by their noteworthiness. Comparison and contrast were conducted on the experimental results against two traditional complex-network-based node-mining methods, namely PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely. PMID:28278276

  7. Software for multistate analysis

    Directory of Open Access Journals (Sweden)

    Frans J. Willekens

    2014-08-01

    Full Text Available Background: The growing interest in pathways, the increased availability of life-history data, innovations in statistical and demographic techniques, and advances in software technology have stimulated the development of software packages for multistate modeling of life histories. Objective: In the paper we list and briefly discuss several software packages for multistate analysis of life-history data. The packages cover the estimation of multistate models (transition rates and transition probabilities), multistate life tables, multistate population projections, and microsimulation. Methods: Brief description of software packages in a historical and comparative perspective. Results: During the past 10 years the advances in multistate modeling software have been impressive. New computational tools accompany the development of new methods in statistics and demography. The statistical theory of counting processes is the preferred method for the estimation of multistate models, and R is the preferred programming platform. Conclusions: Innovations in method, data, and computer technology have removed the traditional barriers to multistate modeling of life histories and the computation of informative life-course indicators. The challenge ahead of us is to model and predict individual life histories.
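One core task of the packages surveyed above, estimating transition probabilities from observed life-history sequences, can be sketched in a few lines. The state names and histories below are illustrative and do not come from any of the surveyed packages.

```python
# Estimate a discrete-time transition probability matrix from observed
# state sequences: count transitions, then normalize by origin-state totals.
from collections import Counter

def transition_probs(sequences):
    counts = Counter()   # (from_state, to_state) -> transition count
    totals = Counter()   # from_state -> total outgoing transitions
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[(a, b)] += 1
            totals[a] += 1
    return {(a, b): c / totals[a] for (a, b), c in counts.items()}

# Two hypothetical individual life histories
histories = [
    ["single", "married", "married", "divorced"],
    ["single", "single", "married", "married"],
]
P = transition_probs(histories)
```

A continuous-time package would instead estimate transition rates (e.g. via counting processes) and derive probabilities from them, but the bookkeeping of origins and destinations is the same.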

  8. Open Source Software Acquisition

    DEFF Research Database (Denmark)

    Holck, Jesper; Kühn Pedersen, Mogens; Holm Larsen, Michael

    2005-01-01

    Lately we have seen a growing interest from both public and private organisations to adopt Open Source Software (OSS), not only for a few specific applications but also on a more general level throughout the organisation. As a consequence, the organisations' decisions on adoption of OSS are becoming...

  9. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...

  10. Global Software Development

    DEFF Research Database (Denmark)

    Søderberg, Anne-Marie; Krishna, S.; Bjørn, Pernille

    2013-01-01

    accounts of close collaboration processes in two large and complex projects, where off-shoring of software development is moved to a strategic level, we found that the vendor was able to establish a strategic partnership through long-term engagement with the field of banking and insurance as well...

  11. Improving Agile Software Practice

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte

    2006-01-01

    Software process improvement in small and agile organizations is often problematic, but achieving good SPI assessments can still be necessary to stay in the market or to meet the demands of multinational owners. The traditional norm-driven, centralized and control-centered improvement approaches have...

  12. AOFlagger: RFI Software

    Science.gov (United States)

    Offringa, A. R.

    2010-10-01

    The RFI software presented here can automatically flag data and can be used to analyze the data in a measurement. The purpose of flagging is to mark samples that are affected by interfering sources such as radio stations, airplanes, electrical fences or other transmitting interferers. The tools in the package are meant for offline use. The software package contains a graphical interface ("rfigui") that can be used to visualize a measurement set and analyze mitigation techniques. It also contains a console flagger ("rficonsole") that can execute a script of mitigation functions without the overhead of a graphical environment. All tools were written in C++. The software has been tested extensively on low radio frequencies (150 MHz or lower) produced by the WSRT and LOFAR telescopes. LOFAR is the Low Frequency Array that is built in and around the Netherlands. Higher frequencies should work as well. Some of the methods implemented are the SumThreshold, the VarThreshold and the singular value decomposition (SVD) method. Included also are several surface fitting algorithms. The software is published under the GNU General Public License version 3.
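The SumThreshold idea mentioned above can be sketched in a much-simplified form: a run of M consecutive samples is flagged when its summed power exceeds a threshold that decreases with M, so weak but persistent interference is caught as well as strong spikes. The threshold schedule, decay factor, and data below are assumptions for illustration, not AOFlagger's actual parameters.

```python
# Simplified SumThreshold sketch over a 1-D power series. Already-flagged
# samples are counted as exactly the threshold value, so one strong spike
# cannot drag a whole window over the limit by itself.

def sum_threshold(data, base_threshold, window_sizes=(1, 2, 4)):
    flags = [False] * len(data)
    for m in window_sizes:
        # per-sample threshold decays as the window grows (assumed schedule)
        thr = base_threshold / (1.5 ** (m.bit_length() - 1))
        new_flags = flags[:]
        for i in range(len(data) - m + 1):
            s = sum(thr if flags[j] else data[j] for j in range(i, i + m))
            if s > m * thr:
                for j in range(i, i + m):
                    new_flags[j] = True
        flags = new_flags
    return flags

# Strong spike at index 3, weak persistent run around indices 5-6
data = [1.0, 1.1, 0.9, 9.0, 1.0, 2.6, 2.7, 1.0]
flags = sum_threshold(data, base_threshold=4.0)
```

The real implementation runs this in both the time and frequency directions of a dynamic spectrum and tunes the threshold schedule against the noise statistics.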

  13. Writing testable software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Knirk, D. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This tutorial identifies common problems in analyzing requirements in the problem and constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.

  14. Improving Agile Software Practice

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte

    2006-01-01

    Software process improvement in small and agile organizations is often problematic, but achieving good SPI assessments can still be necessary to stay in the market or to meet the demands of multinational owners. The traditional norm-driven, centralized and control-centered improvement approaches have...

  15. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter

    2016-01-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...

  16. Software Defined Networking

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius

    resources are limited. Hence, to counteract this trend, current QoS mechanisms must become simpler to deploy and operate, in order to motivate NSPs to employ QoS techniques instead of overprovisioning. Software Defined Networking (SDN) represents a paradigm shift in the way telecommunication and data...

  17. Complexity, Systems, and Software

    Science.gov (United States)

    2014-08-14

    Slide excerpts: "complex (hidden issues; dumbs down operator)". Complexity, Systems, and Software, Sarah Sheard, August 14, 2014, © 2014 Carnegie Mellon University. Addressing Complexity in SoSs (source: SEBoK wiki).

  18. Green Software Products

    NARCIS (Netherlands)

    Jagroep, Erik Arijender

    2017-01-01

    The rising energy consumption of the ICT industry has triggered a quest for more green, energy efficient ICT solutions. The role of software as the true consumer of power and its potential contribution to reach sustainability goals has increasingly been acknowledged. At the same time, it is shown to

  19. The FARE Software

    Science.gov (United States)

    Pitarello, Adriana

    2015-01-01

    This article highlights the importance of immediate corrective feedback in tutorial software for language teaching in an academic learning environment. We aim to demonstrate that, rather than simply reporting on the performance of the foreign language learner, this feedback can act as a mediator of students' cognitive and metacognitive activity.…

  20. JSATS Decoder Software Manual

    Energy Technology Data Exchange (ETDEWEB)

    Flory, Adam E.; Lamarche, Brian L.; Weiland, Mark A.

    2013-05-01

    The Juvenile Salmon Acoustic Telemetry System (JSATS) Decoder is a software application that converts a digitized acoustic signal (a waveform stored in the .bwm file format) into a list of potential JSATS Acoustic MicroTransmitter (AMT) tag codes along with other data about the signal, including time of arrival and signal-to-noise ratios (SNR). This software is capable of decoding single files and directories, and of viewing raw acoustic waveforms. When coupled with the JSATS Detector, the Decoder is capable of decoding in real time and can also provide statistical information about acoustic beacons placed within receive range of hydrophones within a JSATS array. This document details the features and functionality of the software. The document begins with software installation instructions (section 2), followed in order by instructions for decoder setup (section 3), decoding process initiation (section 4), then monitoring of beacons (section 5) using real-time decoding features. The last section in the manual describes the beacon, beacon statistics, and results file formats. This document does not consider the raw binary waveform file format.

  1. Software Architecture Evolution

    Science.gov (United States)

    2013-12-01

    Clinical Linguistics & Phonetics 16(5): 299–316. doi:10.1080/ 02699200210135901 [162] G. C. Murphy, D. Notkin, K. J. Sullivan (2001). “Software...same connector in the model. One way of understanding such examples is as instances of a very common linguistic phenomenon called synecdoche, in which a

  2. High Assurance Software

    Science.gov (United States)

    2013-10-22

    Conference on Software Engineering. 2013. Fingerprinting Malware Using Bioinformatics Tools Building a Classifier for the Zeus Virus, Pedersen, J...file is a nucleotide representation of the original artifact file. This process can be reversed to obtain the original file from the FASTA file

  3. Limits of Software Reuse

    NARCIS (Netherlands)

    Holenderski, L.

    2006-01-01

    Software reuse is considered one of the main techniques to increase software productivity. We present two simple mathematical arguments that show some theoretical limits of reuse. It turns out that the increase of productivity due to internal reuse is at most linear, far from the needed exponential gr

  4. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.

  5. Generalized Software Security Framework

    Directory of Open Access Journals (Sweden)

    Smriti Jain

    2011-01-01

    Full Text Available Security of information has become a major concern in today's digitized world. As a result, effective techniques to secure information are required. The most effective way is to incorporate security in the development process itself, thereby resulting in a secured product. In this paper, we propose a framework that enables security to be included in the software development process. The framework consists of three layers, namely the control layer, the aspect layer and the development layer. The control layer illustrates the managerial control of the entire software development process with the help of governance, whereas the aspect layer recognizes the security mechanisms that can be incorporated during software development to identify the various security features. The development layer helps to integrate the various security aspects, as well as the controls identified in the above layers, during the development process. The layers were further verified by a survey amongst IT professionals. The professionals concluded that the developed framework is easy to use due to its layered architecture and can be customized for various types of software.

  6. Iterative software kernels

    Energy Technology Data Exchange (ETDEWEB)

    Duff, I.

    1994-12-31

    This workshop focuses on kernels for iterative software packages. Specifically, the three speakers discuss various aspects of sparse BLAS kernels. Their topics are: 'Current status of user-level sparse BLAS'; 'Current status of the sparse BLAS toolkit'; and 'Adding matrix-matrix and matrix-matrix-matrix multiply to the sparse BLAS toolkit'.
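As a reminder of what such kernels compute, the workhorse operation behind iterative solvers is the sparse matrix-vector product. The sketch below uses the CSR (compressed sparse row) layout in plain Python; it illustrates the data structure only and is not the sparse BLAS toolkit's interface.

```python
# y = A @ x for a matrix A stored in CSR form:
#   values  - nonzero entries, row by row
#   col_idx - column index of each nonzero
#   row_ptr - row_ptr[r]..row_ptr[r+1] delimits row r's nonzeros

def csr_matvec(values, col_idx, row_ptr, x):
    y = []
    for r in range(len(row_ptr) - 1):
        s = 0.0
        for k in range(row_ptr[r], row_ptr[r + 1]):
            s += values[k] * x[col_idx[k]]
        y.append(s)
    return y

# A = [[2, 0, 1],
#      [0, 3, 0]]
values  = [2.0, 1.0, 3.0]
col_idx = [0, 2, 1]
row_ptr = [0, 2, 3]
print(csr_matvec(values, col_idx, row_ptr, [1.0, 1.0, 1.0]))  # [3.0, 3.0]
```

Production sparse BLAS kernels add blocking, vectorization, and format variants on top of exactly this loop structure.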

  7. Method for analyzing solvent extracted sponge core

    Energy Technology Data Exchange (ETDEWEB)

    Ellington, W.E.; Calkin, C.L.

    1988-11-22

    For use in solvent-extracted sponge core measurements of the oil saturation of earth formations, a method is described for quantifying the volume of oil in the fluids resulting from such extraction. The method consists of: (a) separating the solvent/oil mixture from the water in the extracted fluids, (b) distilling at least a portion of the solvent from the solvent/oil mixture substantially without co-distillation or loss of the light hydrocarbons in the mixture, (c) determining the volume contribution of the solvent remaining in the mixture, and (d) determining the volume of oil removed from the sponge by subtracting the determined remaining solvent volume.
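Steps (c) and (d) above amount to simple volume bookkeeping; a trivial sketch with made-up volumes:

```python
# Oil recovered from the sponge = measured solvent/oil mixture volume
# minus the solvent volume still present after partial distillation.

def oil_volume(mixture_ml, remaining_solvent_ml):
    if remaining_solvent_ml > mixture_ml:
        raise ValueError("solvent cannot exceed total mixture volume")
    return mixture_ml - remaining_solvent_ml

print(oil_volume(12.5, 9.2))  # about 3.3 mL of oil
```

The accuracy of the result therefore hinges entirely on step (b): distilling without losing light hydrocarbons, so that the subtraction attributes the right residue to oil.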

  8. Generic Kalman Filter Software

    Science.gov (United States)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains code for a generic Kalman-filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions, and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data.
The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on
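In the scalar case, the state- and covariance-update and -propagation steps that the GKF generalizes reduce to a few lines. The sketch below is a generic textbook linear filter with made-up noise values, not the GKF's C API.

```python
# Minimal 1-D Kalman filter step (identity dynamics):
#   propagate: state unchanged, uncertainty grows by process noise q
#   update:    blend prediction with measurement z (measurement noise r)

def kf_step(x, p, z, q, r):
    x_pred, p_pred = x, p + q          # propagation
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_pred + k * (z - x_pred)  # state update
    p_new = (1 - k) * p_pred           # covariance update
    return x_new, p_new

# Track a constant value near 1.0 through noisy measurements
x, p = 0.0, 1.0
for z in [1.2, 0.9, 1.1, 1.0]:
    x, p = kf_step(x, p, z, q=0.01, r=0.25)
```

A generic library like the GKF factors the application-specific parts (dynamics, measurement models) into user-supplied subfunctions while keeping exactly this update/propagation skeleton reusable.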

  9. Software redundancy: what, where, how

    OpenAIRE

    Mattavelli, Andrea; Pezzè, Mauro; Carzaniga, Antonio

    2017-01-01

    Software systems have become pervasive in everyday life and are the core component of many crucial activities. An inadequate level of reliability may determine the commercial failure of a software product. Still, despite the commitment and the rigorous verification processes employed by developers, software is deployed with faults. To increase the reliability of software systems, researchers have investigated the use of various forms of redundancy. Informally, a software system is redunda...

  10. Numerical software: science or alchemy

    Energy Technology Data Exchange (ETDEWEB)

    Gear, C.W.

    1979-06-01

    This is a summary of the Forsythe lecture presented at the Computer Science Conference, Dayton, Ohio, in February 1979. It examines the activity called Numerical Software, first to see what distinguishes numerical software from any other form of software and why numerical software is so much more difficult. Then it examines the scientific basis of such software and discusses what is lacking in that basis.

  11. Flight Software Math Library

    Science.gov (United States)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter orders, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation, and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance, Navigation, and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the libraries to be maintained with the same strategy used in their initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as in other environments like the GN&C analysts' simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.

  12. Refinement of Research Surveying in Software Methodologies by Analogy: finding your patch

    Directory of Open Access Journals (Sweden)

    Eugene Doroshenko

    1999-05-01

    Full Text Available To enhance research surveying in software methodologies, a model is introduced that can indicate field maturity based on vocabulary and relevant literature. This model is developed by drawing analogies with software methodologies. Two analogies are used: software models and software life cycles or processes. How this model can reduce research surveying problems for researchers is described using extracts from application results as examples. Although the model does support research surveying activities, it cannot choose the subject for the researcher.

  13. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality of software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing properly to understand, or design, the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software's overall testability.

  14. Software Design Document PVD CSCI (3). Volume 2, Appendices

    Science.gov (United States)

    1991-06-01

    save-laser-info in lase.c, check-lasing-duration in lase.c ... BBN Systems and Technologies, Plan View Display CSCI. FUNCTION: fstat(fd) ... FUNCTION: create-triangle() calledBy: init-catc in catc.c, zoom-icon in icon.c. FUNCTION: draw-laser(laser-packet) calledBy: lase in lase.c. FUNCTION: save-laser-info(laser-x, laser-y, muzzle-x, muzzle-y) calledBy: draw-laser in lase.c

  15. Strategy for a DOD Software Initiative. Volume 2. Appendices

    Science.gov (United States)

    1983-10-01

    Office of Naval Research commissioned a panel from industry and academia to assess the current state of the art and to make recommendations for research ... Batchelder, Merton J.; Bate, Roger R.; Batz, Joseph C.; Belady, Laszlo A.; Bellas, R. J.; Berglass, Gilbert R.; Bergstrom, Deane F.; Betz, Gene A.; Blackmon, J. ... management are just appearing [Glass, 81; Perry, 81]. The state of the art is still relatively primitive, and much work remains to be done in this area

  16. Automating the Transformational Development of Software. Volume 2. Appendices.

    Science.gov (United States)

    1983-03-01


  17. Automating the Transformational Development of Software. Volume 1.

    Science.gov (United States)

    1983-03-01


  18. Inertial Navigation System Standardized Software Development. Volume IV. Program Listings

    Science.gov (United States)

    1976-06-01


  19. Software Design Document Vehicle Simulation CSCI (5). Volume 4, Appendices

    Science.gov (United States)

    1991-06-01

    findlocation in libimage.c, kinematics-calc-velocity in calc-v.c, kinematics-update-p in update-p.c ... BBN Systems and Technologies, Vehicles CSCI: cig-stop in cig-stop.c, cig-prepare-no-op in cig-no-op.c ... FUNCTION: network-set-commo-kill(kill-status). FUNCTION: network-set-mobility-kill(kill-status). FUNCTION: network-set-firepower-kill(kill-status). FILE: dust

  20. Dynamic Displays for Tactical Planning. Volume 3. Software Documentation

    Science.gov (United States)

    1979-12-01
