WorldWideScience

Sample records for saphire software performs

  1. SAPHIRE 8 Software Project Plan

    Energy Technology Data Exchange (ETDEWEB)

    Curtis L. Smith; Ted S. Wood

    2010-03-01

    This project is being conducted at the request of the DOE and the NRC. The INL has been requested by the NRC to improve and maintain the Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) tool set concurrent with the changing needs of the user community as well as staying current with new technologies. Successful completion will be marked by NRC-approved release of all software and accompanying documentation in a timely fashion. This project will enhance the SAPHIRE tool set for the user community (NRC, Nuclear Power Plant operations, Probabilistic Risk Analysis (PRA) model developers) by providing improved Common Cause Failure (CCF), External Events, Level 2, and Significance Determination Process (SDP) analysis capabilities. The SAPHIRE development team at the Idaho National Laboratory is responsible for successful completion of this project. The project is under the supervision of Curtis L. Smith, PhD, Technical Lead for the SAPHIRE application. All current capabilities from SAPHIRE version 7 will be maintained in SAPHIRE 8. The following additional capabilities will be incorporated: • Incorporation of SPAR models for the SDP interface. • Improved quality assurance activities for PRA calculations of SAPHIRE Version 8. • Continuation of the current activities for code maintenance, documentation, and user support.

  2. SAPHIRE 8 Software Configuration Management Plan

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith

    2010-01-01

    The INL software developers use version control for both the formally released SAPHIRE versions and the source code. For each formal release of the software, the developers perform an acceptance test: the software must pass a suite of automated tests prior to official release. Each official release of SAPHIRE is assigned a unique version identifier. The release is bundled into a standard installation package for easy and consistent set-up by individual users. Included in the release is a list of bug fixes and new features for the current release, as well as a history of those items for past releases. Each formal release of SAPHIRE will have passed an acceptance test. In addition to assignment of a unique version identifier for an official software release, each source code file is kept in a controlled library. Source code is a collection of all the computer instructions written by developers to create the finished product. The library is kept on a server, where backups are regularly made. This document describes the configuration management approach used as part of the SAPHIRE development.
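
    The release discipline described above (automated acceptance tests gating an official, uniquely versioned release) can be pictured with a short sketch. The abstract does not name the version-control or test tooling, so the git, pytest, and packaging commands below are purely illustrative assumptions.

```python
# Hypothetical release gate: run the automated acceptance suite and, only if it
# passes, tag the build with a unique version identifier. The abstract does not
# name the version-control tool or test framework; git and pytest are assumed
# here purely for illustration.
import subprocess
import sys

def run(cmd):
    """Run a shell command, echoing it first; raise if it fails."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def release(version: str) -> None:
    # 1. The software must pass the automated acceptance tests before release.
    result = subprocess.run(["pytest", "acceptance_tests/"])
    if result.returncode != 0:
        sys.exit(f"Acceptance tests failed; {version} will not be released.")
    # 2. Assign a unique version identifier to the official release.
    run(["git", "tag", "-a", f"v{version}", "-m", f"SAPHIRE release {version}"])
    # 3. Bundle a standard installation package (placeholder build step).
    run(["python", "-m", "build"])

if __name__ == "__main__":
    release("8.0.0")
```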

  3. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.

  4. SAPHIRE models and software for ASP evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B.; Schroeder, J.A.; Russell, K.D. [Idaho National Engineering Lab., Idaho Falls, ID (United States)] [and others]

    1995-04-01

    The Idaho National Engineering Laboratory (INEL) over the past year has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of conditional core damage probability (CCDP) evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both NRR and AEOD. This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events.
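
    The CCDP evaluations mentioned above amount to requantifying the model with the observed precursor conditions applied. Below is a minimal sketch using the rare-event approximation; the cut sets, basic-event probabilities, and the assumed diesel-generator failure are invented for illustration and are not taken from the ASP models.

```python
# Minimal sketch of a conditional core damage probability (CCDP) style
# requantification in the spirit of the ASP module described above, using the
# rare-event approximation (sum of cut set probabilities). All names and
# numbers are invented for illustration.
from typing import Dict, FrozenSet, List

def rare_event(cut_sets: List[FrozenSet[str]], p: Dict[str, float]) -> float:
    """Rare-event approximation: P(top) ~= sum over cut sets of the product of event probabilities."""
    total = 0.0
    for cs in cut_sets:
        p_cs = 1.0
        for event in cs:
            p_cs *= p[event]
        total += p_cs
    return total

cut_sets = [frozenset({"DG-A", "DG-B"}), frozenset({"DG-A", "OFFSITE-PWR"})]
p = {"DG-A": 1e-2, "DG-B": 1e-2, "OFFSITE-PWR": 1e-3}

baseline = rare_event(cut_sets, p)                      # nominal core damage probability
ccdp = rare_event(cut_sets, dict(p, **{"DG-A": 1.0}))   # precursor: diesel A observed failed

print(f"baseline CDP = {baseline:.2e}, CCDP given DG-A failure = {ccdp:.2e}")
```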

  5. Saphire models and software for ASP evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B. [Idaho National Engineering Lab., Idaho Falls, ID (United States)]

    1997-02-01

    The Idaho National Engineering Laboratory (INEL) over the past three years has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.

  6. SAPHIRE 8 Software Independent Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Rae J. Nims; Kent M. Norris

    2010-02-01

    SAPHIRE 8 is being developed with a phased or cyclic iterative rapid application development methodology. Due to this approach, a similar approach is being taken for the IV&V activities on each vital software object. The IV&V plan is structured around NUREG/BR-0167, “Software Quality Assurance Program and Guidelines,” February 1993. The Nuclear Regulatory Research (RES) Office Instruction No. PRM-12, “Software Quality Assurance for RES Sponsored Codes,” March 26, 2007, specifies that RES-sponsored software is to be evaluated against NUREG/BR-0167. Per the guidance in NUREG/BR-0167, SAPHIRE is classified as “Level 1.” Level 1 software corresponds to technical application software used in a safety decision.

  7. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-09-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes, and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).

  8. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes, and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).

  9. Independent Verification and Validation Of SAPHIRE 8 Software Acceptance Test Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE 8 Software Acceptance Test Plan is to assess the approach to be taken for intended testing activities. The plan typically identifies the items to be tested, the requirements being tested, the testing to be performed, test schedules, personnel requirements, reporting requirements, evaluation criteria, and any risks requiring contingency planning. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  10. SAPHIRE 8 New Features and Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) software performs probabilistic risk assessment (PRA) calculations. SAPHIRE is used in support of NRC’s risk-informed programs such as the Accident Sequence Precursor (ASP) program, Management Directive 8.3, “NRC Incident Investigation Program,” or the Significance Determination Process (SDP). It is also used to develop and run the Standardized Plant Analysis Risk (SPAR) models. SAPHIRE Version 8 is a new version of the software with an improved interface and capabilities to support risk-informed programs. SAPHIRE Version 8 is designed to easily handle larger and more complex models. Applications of previous SAPHIRE versions indicated the need to build and solve models with a large number of sequences. Risk assessments that include end-state evaluations for core damage frequency and large early release frequency have greatly increased the number of sequences required. In addition, the complexity of the models has increased since risk assessments evaluate both potential internal and external events, as well as different plant operational states. Special features of SAPHIRE 8 help create and run integrated models which may be composed of different model types. SAPHIRE 8 includes features and capabilities that are new or improved over the current Version 7 to address the new requirements for risk-informed programs and SPAR models. These include: • Improved User Interfaces • Model development • Methods • General Support Features
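
    The growth in sequence counts noted above follows directly from event-tree quantification: each sequence frequency is the initiating-event frequency multiplied by the split fractions along its branch path, and the number of sequences grows with the number of top events. The toy example below illustrates the arithmetic with invented initiator and top-event values; it is not a SPAR model.

```python
# A toy event-tree quantification illustrating why sequence counts grow quickly:
# with n top events there are 2**n sequences per initiator, each with frequency
# equal to the initiator frequency times the split fractions along its path.
# All names, probabilities, and the end-state rule are hypothetical.
from itertools import product

init_freq = 1e-2          # initiating events per year (assumed)
top_events = {"RPS": 1e-5, "ECCS": 1e-3, "LONG-TERM-COOLING": 1e-2}  # failure probabilities

cdf = 0.0
for outcome in product([False, True], repeat=len(top_events)):
    freq = init_freq
    for (name, p_fail), failed in zip(top_events.items(), outcome):
        freq *= p_fail if failed else (1.0 - p_fail)
    if any(outcome):      # crude illustrative rule: any mitigation failure ends in core damage
        cdf += freq

print(f"core damage frequency from this single initiator: {cdf:.2e} /yr")
```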

  11. Independent Verification and Validation Of SAPHIRE 8 Software Configuration Management Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-10-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE configuration management is to assess the activities that result in the process of identifying and defining the baselines associated with the SAPHIRE software product; controlling the changes to baselines and release of baselines throughout the life cycle; recording and reporting the status of baselines and the proposed and actual changes to the baselines; and verifying the correctness and completeness of baselines. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  12. Independent Verification and Validation Of SAPHIRE 8 Software Configuration Management Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-02-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE configuration management is to assess the activities that result in the process of identifying and defining the baselines associated with the SAPHIRE software product; controlling the changes to baselines and release of baselines throughout the life cycle; recording and reporting the status of baselines and the proposed and actual changes to the baselines; and verifying the correctness and completeness of baselines. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  13. SAPHIRE 8 Volume 6 - Quality Assurance

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; R. Nims; K. J. Kvarfordt

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Version 8 is a software application developed for performing a complete probabilistic risk assessment using a personal computer running the Microsoft Windows™ operating system. SAPHIRE 8 is funded by the U.S. Nuclear Regulatory Commission (NRC). The role of the INL in this project is that of software developer and tester. This development takes place using formal software development procedures and is subject to quality assurance (QA) processes. The purpose of this document is to describe how the SAPHIRE software QA is performed for Version 8, what constitutes its parts, and limitations of those processes. In addition, this document describes the Independent Verification and Validation that was conducted for Version 8 as part of an overall QA process.

  14. Independent Verification and Validation Of SAPHIRE 8 Software Design and Interface Design Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-10-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE software design and interface design is to assess the activities that result in the development, documentation, and review of a software design that meets the requirements defined in the software requirements documentation. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP) design specification.

  15. Independent Verification and Validation Of SAPHIRE 8 Software Design and Interface Design Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE software design and interface design is to assess the activities that result in the development, documentation, and review of a software design that meets the requirements defined in the software requirements documentation. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP) design specification.

  16. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Quality Assurance Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; R. Nims; K. J. Kvarfordt; C. Wharton

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment using a personal computer running the Microsoft Windows operating system. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC). The role of the INL in this project is that of software developer and tester. This development takes place using formal software development procedures and is subject to quality assurance (QA) processes. The purpose of this document is to describe how the SAPHIRE software QA is performed for Version 6 and 7, what constitutes its parts, and limitations of those processes.

  17. SAPHIRE 8 Volume 1 - Overview and Summary

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; S. T. Wood

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events and quantify associated consequential outcome frequencies. Specifically, for nuclear power plant applications, SAPHIRE 8 can identify important contributors to core damage (Level 1 PRA) and containment failure during a severe accident which leads to releases (Level 2 PRA). It can be used for a PRA where the reactor is at full power, low power, or at shutdown conditions. Furthermore, it can be used to analyze both internal and external initiating events and has special features for managing models such as flooding and fire. It can also be used in a limited manner to quantify risk in terms of release consequences to the public and environment (Level 3 PRA). In SAPHIRE 8, the act of creating a model has been separated from the analysis of that model in order to improve the quality of both the model (e.g., by avoiding inadvertent changes) and the analysis. Consequently, in SAPHIRE 8, the analysis of models is performed by using what are called Workspaces. Currently, there are Workspaces for three types of analyses: (1) the NRC’s Accident Sequence Precursor program, where the workspace is called “Events and Condition Assessment (ECA);” (2) the NRC’s Significance Determination

  18. Coupling CFAST fire modeling and SAPHIRE probabilistic assessment software for internal fire safety evaluation of a typical TRIGA research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Safaei Arshi, Saiedeh [School of Engineering, Shiraz University, 71348-51154 Shiraz (Iran, Islamic Republic of)]; Nematollahi, Mohammadreza, E-mail: nema@shirazu.ac.i [School of Engineering, Shiraz University, 71348-51154 Shiraz (Iran, Islamic Republic of); Safety Research Center of Shiraz University, 71348-51154 Shiraz (Iran, Islamic Republic of)]; Sepanloo, Kamran [Safety Research Center of Shiraz University, 71348-51154 Shiraz (Iran, Islamic Republic of)]

    2010-03-15

    Due to the significant threat that internal fires pose to the safe operation of nuclear reactors, presumed fire scenarios with potential hazards for loss of typical research reactor safety functions are analyzed by coupling the CFAST fire modeling and SAPHIRE probabilistic assessment software. The investigations show that fire hazards associated with electrical cable insulation, lubricating oils, diesel, electrical equipment, and carbon filters may lead to unsafe situations called core damage states. Using system-specific event trees, the occurrence frequency of core damage states after the occurrence of each possible fire scenario in critical fire compartments is evaluated. The probability that a fire ignited in a given fire compartment will burn long enough to cause the extent of damage defined by each fire scenario is calculated by means of a detection-suppression event tree. As part of the detection-suppression event tree quantification, and also to generate the necessary input data for evaluating the frequency of core damage states with the SAPHIRE 7.0 software, the CFAST fire modeling software is applied. The results provide a probabilistic measure of the quality of existing fire protection systems in order to maintain the reactor at a reasonable safety level.
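
    The screening arithmetic implied by this approach multiplies, for each fire scenario, the compartment ignition frequency, the non-suppression probability from the detection-suppression event tree, and the conditional core damage probability given the postulated damage. The sketch below uses invented numbers, not CFAST or SAPHIRE outputs.

```python
# Sketch of the fire-scenario screening arithmetic: core damage state frequency
# per scenario = ignition frequency * P(non-suppression) * CCDP given damage.
# Compartment names and all values are illustrative assumptions.
scenarios = [
    # (compartment, ignition freq /yr, P(non-suppression), CCDP given damage)
    ("cable spreading room", 2.0e-3, 0.15, 5.0e-3),
    ("diesel generator room", 1.0e-3, 0.30, 2.0e-3),
    ("carbon filter area", 5.0e-4, 0.10, 1.0e-2),
]

total = 0.0
for name, f_ign, p_ns, ccdp in scenarios:
    f_cd = f_ign * p_ns * ccdp
    total += f_cd
    print(f"{name:>22s}: core damage frequency = {f_cd:.2e} /yr")

print(f"{'all fire scenarios':>22s}: {total:.2e} /yr")
```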

  19. SAPHIRE 8 Volume 7 - Data Loading

    Energy Technology Data Exchange (ETDEWEB)

    K. J. Kvarfordt; S. T. Wood; C. L. Smith; S. R. Prescott

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission and developed by the Idaho National Laboratory. This report is intended to assist the user in entering PRA data into the SAPHIRE program using the built-in MAR-D ASCII-text file data transfer process. Towards this end, a small sample database is constructed and utilized for demonstration. Where applicable, the discussion includes how the data processes for loading the sample database relate to the actual processes used to load larger PRA models. The procedures described herein were developed for use with SAPHIRE Version 8. The guidance specified in this document will allow a user to have sufficient knowledge to both understand the data format used by SAPHIRE and to carry out the transfer of data between different PRA projects.
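
    The MAR-D transfer described above is essentially a flat ASCII interchange of PRA records. The snippet below sketches the general idea of parsing such a file; the comma-separated layout shown is a hypothetical stand-in, not the actual MAR-D record format, which is documented in the report itself.

```python
# Hypothetical flat ASCII interchange in the spirit of the MAR-D transfer
# described above (this is NOT the real MAR-D record layout): one basic-event
# record per line, comma separated.
import csv
import io

sample = io.StringIO(
    "DG-A,1.0e-2,Diesel generator A fails to start\n"
    "DG-B,1.0e-2,Diesel generator B fails to start\n"
    "OFFSITE-PWR,1.0e-3,Loss of offsite power not recovered\n"
)

basic_events = {}
for name, prob, description in csv.reader(sample):
    basic_events[name] = {"prob": float(prob), "desc": description}

print(f"loaded {len(basic_events)} basic events; DG-A prob = {basic_events['DG-A']['prob']}")
```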

  20. SAPHIRE 8 Volume 3 - Users' Guide

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. Vedros; K. J. Kvarfordt

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users comprised of a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). This reference guide will introduce the SAPHIRE Version 8.0 software. A brief discussion of the purpose and history of the software is included along with general information such as installation instructions, starting and stopping the program, and some pointers on how to get around inside the program. Next, database concepts and structure are discussed. Following that discussion are nine sections, one for each of the menu options on the SAPHIRE main menu, wherein the purpose and general capabilities for each option are

  1. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users comprised of a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with

  2. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users comprised of a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with

  3. Independent Verification and Validation Of SAPHIRE 8 System Test Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-02-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE System Test Plan is to assess the approach to be taken for intended testing activities associated with the SAPHIRE software product. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  4. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Data Loading Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory. This report is intended to assist the user in entering PRA data into the SAPHIRE program using the built-in MAR-D ASCII-text file data transfer process. Towards this end, a small sample database is constructed and utilized for demonstration. Where applicable, the discussion includes how the data processes for loading the sample database relate to the actual processes used to load larger PRA models. The procedures described herein were developed for use with SAPHIRE Version 6.0 and Version 7.0. In general, the data transfer procedures for Versions 6 and 7 are the same, but where deviations exist, the differences are noted. The guidance specified in this document will allow a user to have sufficient knowledge to both understand the data format used by SAPHIRE and to carry out the transfer of data between different PRA projects.

  5. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Summary Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events and quantify associated consequential outcome frequencies. Specifically, for nuclear power plant applications, SAPHIRE can identify important contributors to core damage (Level 1 PRA) and containment failure during a severe accident which lead to releases (Level 2 PRA). It can be used for a PRA where the reactor is at full power, low power, or at shutdown conditions. Furthermore, it can be used to analyze both internal and external initiating events and has special features for transforming an internal events model to a model for external events, such as flooding and fire analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to the public and environment (Level 3 PRA). SAPHIRE also includes a separate module called the Graphical Evaluation Module (GEM). GEM is a special user interface linked to SAPHIRE that automates the SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events (for example, to calculate a conditional core damage probability) very efficiently and expeditiously. This report provides an overview of the functions

  6. Independent Verification and Validation Of SAPHIRE 8 Software Project Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Carl Wharton; Kent Norris

    2009-12-01

    This report provides an evaluation of the Project Plan. The Project Plan is intended to provide the high-level direction that documents the required software activities to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  7. Independent Verification and Validation Of SAPHIRE 8 Software Project Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Carl Wharton

    2009-10-01

    This report provides an evaluation of the Project Plan. The Project Plan is intended to provide the high-level direction that documents the required software activities to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  8. Independent Verification and Validation Of SAPHIRE 8 Software Project Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Carl Wharton; Kent Norris

    2010-03-01

    This report provides an evaluation of the Project Plan. The Project Plan is intended to provide the high-level direction that documents the required software activities to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  9. Investigation of spatial resolution and temporal performance of SAPHIRE (scintillator avalanche photoconductor with high resolution emitter readout) with integrated electrostatic focusing

    Science.gov (United States)

    Scaduto, David A.; Lubinsky, Anthony R.; Rowlands, John A.; Kenmotsu, Hidenori; Nishimoto, Norihito; Nishino, Takeshi; Tanioka, Kenkichi; Zhao, Wei

    2014-03-01

    We have previously proposed SAPHIRE (scintillator avalanche photoconductor with high resolution emitter readout), a novel detector concept with potentially superior spatial resolution and low-dose performance compared with existing flat-panel imagers. The detector comprises a scintillator that is optically coupled to an amorphous selenium photoconductor operated with avalanche gain, known as high-gain avalanche rushing photoconductor (HARP). High resolution electron beam readout is achieved using a field emitter array (FEA). This combination of avalanche gain, allowing for very low-dose imaging, and electron emitter readout, providing high spatial resolution, offers potentially superior image quality compared with existing flat-panel imagers, with specific applications to fluoroscopy and breast imaging. Through the present collaboration, a prototype HARP sensor with integrated electrostatic focusing and nano-Spindt FEA readout technology has been fabricated. The integrated electron-optic focusing approach is more suitable for fabricating large-area detectors. We investigate the dependence of spatial resolution on sensor structure and operating conditions, and compare the performance of electrostatic focusing with previous technologies. Our results show a clear dependence of spatial resolution on electrostatic focusing potential, with performance approaching that of the previous design with external mesh-electrode. Further, temporal performance (lag) of the detector is evaluated and the results show that the integrated electrostatic focusing design exhibits comparable or better performance compared with the mesh-electrode design. This study represents the first technical evaluation and characterization of the SAPHIRE concept with integrated electrostatic focusing.

  10. SAPHIR, how it ended

    Energy Technology Data Exchange (ETDEWEB)

    Brogli, R.; Hammer, J.; Wiezel, L.; Christen, R.; Heyck, H.; Lehmann, E. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)]

    1995-10-01

    On May 16th, 1994, PSI decided to discontinue its efforts to retrofit the SAPHIR reactor for operation at 10 MW. This decision was made because the effort and time for the retrofit work in progress had proven to be more complex than was anticipated. In view of the start-up of the new spallation-neutron source SINQ in 1996, the useful operating time between the eventual restart of SAPHIR and the start-up of SINQ became less than two years, which was regarded by PSI as too short a period to warrant the large retrofit effort. Following the decision of PSI not to re-use SAPHIR as a neutron source, several options for the further utilization of the facility were open. However, none of them appeared promising in comparison with other possibilities; it was therefore decided that SAPHIR should be decommissioned. A concerted effort was initiated to consolidate the nuclear and conventional safety for the post-operational period. (author) 3 figs., 3 tab.

  11. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) GEM Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; J. Schroeder; S. T. Beck

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer running the Microsoft Windows operating system. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer and tester. A complementary program called GEM uses the SAPHIRE analysis engine and relational database. GEM has been designed to simplify the use of existing PRA analyses for activities such as the NRC’s Accident Sequence Precursor program. In this report, the theoretical framework behind GEM-type calculations is discussed, in addition to providing guidance and examples for performing evaluations when using the GEM software. As part of this analysis framework, the two types of GEM analysis are outlined: initiating event assessments (where an initiator occurs) and condition assessments (where a component is failed for some length of time).
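
    A compact way to see the two GEM assessment types is as follows: an initiating-event assessment reports a conditional core damage probability for an event that actually occurred, while a condition assessment reports the increase in core damage probability accumulated over the time a component was unavailable. The sketch below uses invented baseline and conditional values; in practice they come from requantifying the SAPHIRE model with the observed conditions applied.

```python
# Minimal sketch of the two GEM-style assessments described above.
# All numeric values are assumed for illustration only.

# Initiating-event assessment: an initiator occurred, so the measure of
# interest is the conditional core damage probability for that event.
ccdp_given_initiator = 3.0e-5          # assumed result of a conditional requantification

# Condition assessment: a component is failed (unavailable) for some length of
# time; the measure of interest is the increase in core damage probability
# over the exposure period.
baseline_cdf = 2.0e-5                  # per year, assumed
conditional_cdf = 8.0e-5               # per year with the component failed, assumed
exposure_hours = 72.0

delta_cdp = (conditional_cdf - baseline_cdf) * exposure_hours / 8760.0
print(f"initiating-event CCDP = {ccdp_given_initiator:.1e}")
print(f"condition assessment importance (delta CDP) = {delta_cdp:.1e}")
```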

  12. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Tutorial

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; S. T. Beck; S. T. Wood

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of computer programs that were developed to create and analyze probabilistic risk assessments (PRAs). This volume is the tutorial manual for the SAPHIRE system. In this document, a series of lessons are provided that guide the user through basic steps common to most analyses performed with SAPHIRE. The tutorial is divided into two major sections covering both basic and advanced features. The section covering basic topics contains lessons that lead the reader through development of a hypothetical probabilistic problem involving a vehicle accident, highlighting the program’s most fundamental features. The advanced features section contains additional lessons that expand on fundamental analysis features of SAPHIRE and provide insights into more complex analysis techniques. Together, these two elements provide an overview into the operation and capabilities of the SAPHIRE software.

  13. Systems Analysis Programs for Hands-On Integrated Reliability Evaluations (SAPHIRE) Technical Reference

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; W. J. Galyean; S. T. Beck

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. Herein, information is provided on the principles used in the construction and operation of Versions 6.0 and 7.0 of the SAPHIRE system. This report summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. This volume then describes the algorithms used to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that apply for various assumptions concerning reparability and mission time. It defines the measures of basic event importance that SAPHIRE can calculate. This volume gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin Hypercube sampling, and states the algorithms used by this program to generate random basic event probabilities from various distributions. Also covered are enhanced capabilities such as seismic analysis, cut set "recovery," end state manipulation, and use of "compound events."
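
    As an illustration of the kinds of formulas this reference covers, the sketch below quantifies a made-up two-cut-set fault tree with the minimal cut set upper bound and computes two common basic-event importance measures (Fussell-Vesely and Birnbaum). The component names and probabilities are assumptions for the example only, not SAPHIRE data.

```python
# Top-event quantification from minimal cut sets, plus two importance measures.
# The two-train fault tree below is invented for illustration.
from typing import Dict, FrozenSet, List
import math

CutSets = List[FrozenSet[str]]

def mcub(cut_sets: CutSets, p: Dict[str, float]) -> float:
    """Minimal cut set upper bound: P(top) ~= 1 - prod_i (1 - P(MCS_i))."""
    prod = 1.0
    for cs in cut_sets:
        prod *= 1.0 - math.prod(p[e] for e in cs)
    return 1.0 - prod

def fussell_vesely(event: str, cut_sets: CutSets, p: Dict[str, float]) -> float:
    """Fraction of the top-event probability coming from cut sets containing the event."""
    return mcub([cs for cs in cut_sets if event in cs], p) / mcub(cut_sets, p)

def birnbaum(event: str, cut_sets: CutSets, p: Dict[str, float]) -> float:
    """P(top | event failed) - P(top | event perfect)."""
    return mcub(cut_sets, dict(p, **{event: 1.0})) - mcub(cut_sets, dict(p, **{event: 0.0}))

cut_sets = [frozenset({"PUMP-A", "PUMP-B"}), frozenset({"PUMP-A", "VALVE-C"})]
p = {"PUMP-A": 1e-3, "PUMP-B": 1e-3, "VALVE-C": 5e-4}

print(f"top event probability (MCUB) = {mcub(cut_sets, p):.2e}")
print(f"Fussell-Vesely of PUMP-A     = {fussell_vesely('PUMP-A', cut_sets, p):.2f}")
print(f"Birnbaum of PUMP-A           = {birnbaum('PUMP-A', cut_sets, p):.2e}")
```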

  14. Systems Analysis Programs for Hands-On Integrated Reliability Evaluations (SAPHIRE) Technical Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; W. J. Galyean; S. T. Beck

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. Herein, information is provided on the principles used in the construction and operation of Versions 6.0 and 7.0 of the SAPHIRE system. This report summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. This volume then describes the algorithms used to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that apply for various assumptions concerning reparability and mission time. It defines the measures of basic event importance that SAPHIRE can calculate. This volume gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin Hypercube sampling, and states the algorithms used by this program to generate random basic event probabilities from various distributions. Also covered are enhanced capabilities such as seismic analysis, cut set "recovery," end state manipulation, and use of "compound events."

  15. Understanding Software Development Team Performance

    OpenAIRE

    G.P. SUDHAKAR

    2010-01-01

    This paper gives the popular definitions of team, essential characteristics of teams and team development stages. It discusses the previous empirical studies undertaken to investigate the software development team performance. The factors affecting the software development team performance have been explained. It reviews the past research done in finding the performance of software development teams. It also discusses some of the representative research done on team performance in non-softw...

  16. Quality assessment and assimilation of Megha-Tropiques SAPHIR radiances into WRF assimilation system

    Science.gov (United States)

    Singh, Randhir; Ojha, Satya P.; Kishtawal, C. M.; Pal, P. K.

    2013-07-01

    This study presents an initial assessment of the quality of radiances measured from SAPHIR (Sounder for Probing Vertical Profiles of Humidity) on board Megha-Tropiques (Indo-French joint satellite), launched by the Indian Space Research Organisation on 12 October 2011. The radiances measured from SAPHIR are compared with those simulated by the radiative transfer model (RTM) using radiosonde measurements, Atmospheric Infrared Sounder retrievals, and National Centers for Environmental Prediction (NCEP) analyzed fields over the Indian subcontinent, during January to November 2012. The radiances from SAPHIR are also compared with similar measurements available from the Microwave Humidity Sounder (MHS) on board the MetOp-A and NOAA-18/19 satellites, during January to November 2012. A limited comparison is also carried out between SAPHIR-measured and RTM-computed radiances using European Centre for Medium-Range Weather Forecasts analyzed fields, during May and November 2012. The comparison of SAPHIR-measured radiances with RTM-simulated and MHS-observed radiances reveals that SAPHIR observations are of good quality. After the initial assessment of the quality of the SAPHIR radiances, these radiances have been assimilated within the Weather Research and Forecasting (WRF) three-dimensional variational data assimilation system. Analysis/forecast cycling experiments with and without SAPHIR radiances are performed over the Indian region during the entire month of May 2012. The assimilation of SAPHIR radiances shows considerable improvements (with moisture analysis error reduction up to 30%) in the tropospheric analyses and forecasts of moisture, temperature, and winds when compared to NCEP analyses and radiance measurements obtained from MHS, the Advanced Microwave Sounding Unit-A, and the High Resolution Infrared Sounder. Assimilation of SAPHIR radiances also resulted in substantial improvement in the precipitation forecast skill when compared with satellite-derived rain. Overall
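
    A first-look radiance quality check of the kind described above typically reduces to channel-by-channel statistics of observed-minus-simulated brightness temperatures. The sketch below computes bias and standard deviation from synthetic departures; the channel labels and values are assumptions for illustration, not SAPHIR data.

```python
# Sketch of an observed-minus-simulated ("O - B") departure check for a
# microwave humidity sounder. Channel labels and departure values are
# synthetic; a real check would use collocated profiles run through an RTM.
import statistics

departures = {
    # channel label: observed-minus-simulated brightness temperatures (K)
    "channel 1 (near 183 GHz line center, assumed)": [0.4, -0.1, 0.7, 0.2, -0.3],
    "channel 6 (far wing of 183 GHz line, assumed)": [1.1, 0.8, 1.5, 0.6, 0.9],
}

for channel, d in departures.items():
    bias = statistics.mean(d)
    spread = statistics.stdev(d)
    print(f"{channel}: bias = {bias:+.2f} K, std = {spread:.2f} K")
```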

  17. Independent Verification and Validation SAPHIRE Version 8 Final Report Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-04-01

    This report provides an evaluation of the SAPHIRE version 8 software product. SAPHIRE version 8 is being developed with a phased or cyclic iterative rapid application development methodology. Due to this approach, a similar approach has been taken for the IV&V activities on each vital software object. IV&V and Software Quality Assurance (SQA) activities occur throughout the entire development life cycle and therefore, will be required through the full development of SAPHIRE version 8. Later phases of the software life cycle, the operation and maintenance phases, are not applicable in this effort since the IV&V is being done prior to releasing Version 8.

  18. DSN system performance test software

    Science.gov (United States)

    Martin, M.

    1978-01-01

    The system performance test software is currently being modified to include additional capabilities and enhancements. Additional software programs are currently being developed for the Command Store and Forward System and the Automatic Total Recall System. The test executive is the main program. It controls the input and output of the individual test programs by routing data blocks and operator directives to those programs. It also processes data block dump requests from the operator.

  19. Software Complexity Threatens Performance Portability

    Energy Technology Data Exchange (ETDEWEB)

    Gamblin, T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2015-09-11

    Modern HPC software packages are rarely self-contained. They depend on a large number of external libraries, and many spend large fractions of their runtime in external subroutines. Performance portability depends not only on the effort of application teams, but also on the availability of well-tuned libraries. At most sites, the burden of maintaining libraries is shared by code teams and facilities. Facilities typically provide well-tuned default versions, but code teams frequently build with bleeding-edge compilers to achieve high performance. For this reason, HPC has no “standard” software stack, unlike other domains where performance is not critical. Incompatibilities among compilers and software versions force application teams and facility staff to re-build custom versions of libraries for each new toolchain. Because the number of potential configurations is combinatorial, and because HPC software is notoriously difficult to port to new machines [3, 7, 8], the tuning effort required to support and maintain performance-portable libraries outstrips the available manpower at most sites. Software complexity is a growing obstacle to performance portability for HPC.

  20. Systems analysis programs for Hands-on integrated reliability evaluations (SAPHIRE) Version 5.0: Verification and validation (V&V) manual. Volume 9

    Energy Technology Data Exchange (ETDEWEB)

    Jones, J.L.; Calley, M.B.; Capps, E.L.; Zeigler, S.L.; Galyean, W.J.; Novack, S.D.; Smith, C.L.; Wolfram, L.M. [Lockheed Idaho Technologies Co., Idaho Falls, ID (United States)]

    1995-03-01

    A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) Version 5.0. SAPHIRE is a set of four computer programs that NRC developed for performing probabilistic risk assessments. They allow an analyst to perform many of the functions necessary to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs are the Integrated Reliability and Risk Analysis System (IRRAS), System Analysis and Risk Assessment (SARA), Models And Results Database (MAR-D), and the Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. The intent of this program is to perform a V&V of successive versions of SAPHIRE. Previous efforts have been the V&V of SAPHIRE Version 4.0. The SAPHIRE 5.0 V&V plan is based on the SAPHIRE 4.0 V&V plan with revisions to incorporate lessons learned from the previous effort. Also, the SAPHIRE 5.0 vital and nonvital test procedures are based on the test procedures from SAPHIRE 4.0 with revisions to include the new SAPHIRE 5.0 features as well as to incorporate lessons learned from the previous effort. Most results from the testing were acceptable; however, some discrepancies between expected code operation and actual code operation were identified. Modifications made to SAPHIRE are identified.

  1. Independent Verification and Validation Of SAPHIRE 8 Software Quality Assurance Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-02-01

    This report provides an evaluation of the Software Quality Assurance Plan. The Software Quality Assurance Plan is intended to ensure that all actions necessary for the software life cycle (verification and validation activities; documentation and deliverables; project management; configuration management, nonconformance reporting, and corrective action; and quality assessment and improvement) have been planned, and that a systematic pattern of all actions necessary to provide adequate confidence that a software product conforms to established technical requirements is in place, in order to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  2. Independent Verification and Validation Of SAPHIRE 8 Software Quality Assurance Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-03-01

    This report provides an evaluation of the Software Quality Assurance Plan. The Software Quality Assurance Plan is intended to ensure that all actions necessary for the software life cycle (verification and validation activities; documentation and deliverables; project management; configuration management, nonconformance reporting, and corrective action; and quality assessment and improvement) have been planned, and that a systematic pattern of all actions necessary to provide adequate confidence that a software product conforms to established technical requirements is in place, in order to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  3. Megha-Tropiques/SAPHIR measurements of humidity profiles: validation with AIRS and global radiosonde network

    Science.gov (United States)

    Subrahmanyam, K. V.; Kumar, K. K.

    2013-12-01

    The vertical profiles of humidity measured by SAPHIR (Sondeur Atmosphérique du Profil d'Humidité Intertropicale par Radiométrie) on board the Megha-Tropiques satellite are validated using Atmosphere Infrared Sounder (AIRS) and ground-based radiosonde observations during July-September 2012. SAPHIR provides humidity profiles at six pressure layers, viz., 1000-850 (level 1), 850-700 (level 2), 700-550 (level 3), 550-400 (level 4), 400-250 (level 5), and 250-100 (level 6) hPa. Segregated AIRS observations over land and oceanic regions are used to assess the performance of SAPHIR quantitatively. The regression analysis over the oceanic region (125° W-180° W; 30° S-30° N) reveals that the SAPHIR measurements agree very well with the AIRS measurements at levels 3, 4, 5, and 6, with correlation coefficients of 0.79, 0.88, 0.87, and 0.78, respectively. However, at level 6 SAPHIR seems to be systematically underestimating the AIRS measurements. At level 2, the agreement is reasonably good, with a correlation coefficient of 0.52, and at level 1 the agreement is very poor, with a correlation coefficient of 0.17. The regression analysis over the land region (10° W-30° E; 8° N-30° N) revealed an excellent correlation between AIRS and SAPHIR at all six levels, with correlation coefficients of 0.80, 0.78, 0.84, 0.84, 0.86, and 0.65, respectively. However, again at levels 5 and 6, SAPHIR seems to be underestimating the AIRS measurements. After carrying out the quantitative comparison between SAPHIR and AIRS separately over land and ocean, the ground-based global radiosonde network observations of humidity profiles over three distinct geographical locations (East Asia, the tropical belt of South and North America, and the South Pacific) are then used to further validate the SAPHIR observations, as AIRS has its own limitations. The SAPHIR observations within a radius of 50 km around the radiosonde stations are averaged and then the regression analysis is carried out at the first five levels of SAPHIR. The comparison is not carried out at sixth
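
    The layer-by-layer validation reported above boils down to regressing collocated SAPHIR and AIRS humidity values and reporting the correlation coefficient for each pressure layer. The sketch below shows that calculation on five synthetic collocations; the numbers are not Megha-Tropiques or AIRS retrievals.

```python
# Layer-wise correlation/regression between two humidity data sets, as used in
# the validation described above. The five collocated values are synthetic.
# Requires Python 3.10+ for statistics.correlation / linear_regression.
import statistics

saphir = [62.0, 48.5, 71.2, 55.0, 66.3]   # layer-mean relative humidity (%) from SAPHIR, assumed
airs   = [60.1, 50.2, 69.8, 57.4, 64.0]   # collocated AIRS relative humidity (%), assumed

r = statistics.correlation(saphir, airs)              # Pearson correlation coefficient
slope = statistics.linear_regression(airs, saphir).slope

print(f"correlation = {r:.2f}, regression slope = {slope:.2f}")
```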

  4. High performance in software development

    CERN Document Server

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever tried. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or with other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from the distributed storage and large-scale organization of computation and data down to the lowest level of processor and data bus behavior. Integrating performance behavior over these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures due to the deep memory hierarchies of modern general-purpose computers. As a r...

  5. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.
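
    The key idea enabling the speed-up described above is that a smooth volumetric deformation moves existing grid nodes rather than regenerating the mesh. The sketch below illustrates that concept with a simple Gaussian-weighted blend of control-point displacements; it is only a conceptual stand-in and not Optimal Solutions' proprietary ASD algorithm.

```python
# Conceptual sketch of smooth volumetric deformation: every grid node is
# displaced by a smooth field driven by a few control-point motions, so the
# existing mesh connectivity is reused instead of remeshing. The Gaussian
# weighting is an illustrative assumption only.
import math

def deform(nodes, controls, sigma=0.5):
    """Displace each (x, y, z) node by a distance-weighted blend of control-point motions."""
    deformed = []
    for x, y, z in nodes:
        dx = dy = dz = 0.0
        for (cx, cy, cz), (mx, my, mz) in controls:
            w = math.exp(-((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2) / (2 * sigma ** 2))
            dx += w * mx
            dy += w * my
            dz += w * mz
        deformed.append((x + dx, y + dy, z + dz))
    return deformed

# A tiny row of grid nodes and one control point pulled outward by 0.2 units.
nodes = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
controls = [((1.0, 0.0, 0.0), (0.0, 0.2, 0.0))]
print(deform(nodes, controls))
```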

  6. Independent Verification and Validation Of SAPHIRE 8 Volume 3 Users' Guide Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE 8 Volume 3 Users’ Guide is to assess the user documentation for its completeness, correctness, and consistency with respect to requirements for user interface and for any functionality that can be invoked by the user. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  7. Evaluation of high-performance computing software

    Energy Technology Data Exchange (ETDEWEB)

    Browne, S.; Dongarra, J. [Univ. of Tennessee, Knoxville, TN (United States)]; Rowan, T. [Oak Ridge National Lab., TN (United States)]

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations that are difficult to find elsewhere.

  8. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  9. New developments in the Saphire computer codes

    Energy Technology Data Exchange (ETDEWEB)

    Russell, K.D.; Wood, S.T.; Kvarfordt, K.J. [Idaho Engineering Lab., Idaho Falls, ID (United States)] [and others]

    1996-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a suite of computer programs that were developed to create and analyze a probabilistic risk assessment (PRA) of a nuclear power plant. Many recent enhancements to this suite of codes have been made. This presentation will provide an overview of these features and capabilities. The presentation will include a discussion of the new GEM module. This module greatly reduces and simplifies the work necessary to use the SAPHIRE code in event assessment applications. An overview of the features provided in the new Windows version will also be provided. This version is a full Windows 32-bit implementation and offers many new and exciting features. [A separate computer demonstration was held to allow interested participants to get a preview of these features.] The new capabilities that have been added since version 5.0 will be covered. Some of these major new features include the ability to store an unlimited number of basic events, gates, systems, sequences, etc.; the addition of improved reporting capabilities to allow the user to generate and "scroll" through custom reports; the addition of multi-variable importance measures; and the simplification of the user interface. Although originally designed as a PRA Level 1 suite of codes, capabilities have recently been added to SAPHIRE to allow the user to apply the code in Level 2 analyses. These features will be discussed in detail during the presentation. The modifications and capabilities added to this version of SAPHIRE significantly extend the code in many important areas. Together, these extensions represent a major step forward in PC-based risk analysis tools. This presentation provides an up-to-date status of these important PRA analysis tools.

  10. ATLAS Offline Software Performance Monitoring and Optimization

    CERN Document Server

    Chauhan, N; Kittelmann, T; Langenberg, R; Mandrysch, R; Salzburger, A; Seuster, R; Ritsch, E; Stewart, G; van Eldik, N; Vitillo, R

    2014-01-01

    In a complex multi-developer, multi-package software environment, such as the ATLAS offline Athena framework, tracking the performance of the code can be a non-trivial task in itself. In this paper we describe improvements in the instrumentation of ATLAS offline software that have given considerable insight into the performance of the code and helped to guide optimisation. Code can be instrumented firstly using the PAPI tool, which is a programming interface for accessing hardware performance counters. PAPI events can count floating point operations, cycles, instructions and cache accesses. Triggering PAPI to start/stop counting for each algorithm and processed event gives a good understanding of the overall algorithm-level performance of ATLAS code. Further data can be obtained using Pin, a dynamic binary instrumentation tool. Pintools can be used to obtain similar statistics as PAPI, but advantageously without requiring recompilation of the code. Fine grained routine and instruction level instrumentation is...

  11. ATLAS Offline Software Performance Monitoring and Optimization

    CERN Document Server

    Chauhan, N; The ATLAS collaboration; Kittelmann, T; Langenberg, R; Mandrysch, R; Salzburger, A; Seuster, R; Ritsch, E; Stewart, G; van Eldik, N; Vitillo, R

    2013-01-01

    In a complex multi-developer, multi-package software environment, such as the ATLAS offline Athena framework, tracking the performance of the code can be a non-trivial task in itself. In this paper we describe improvements in the instrumentation of ATLAS offline software that have given considerable insight into the performance of the code and helped to guide optimisation. Code can be instrumented firstly using the PAPI tool, which is a programming interface for accessing hardware performance counters. PAPI events can count floating point operations, cycles, instructions and cache accesses. Triggering PAPI to start/stop counting for each algorithm and processed event gives a good understanding of the overall algorithm-level performance of ATLAS code. Further data can be obtained using Pin, a dynamic binary instrumentation tool. Pintools can be used to obtain similar statistics as PAPI, but advantageously without requiring recompilation of the code. Fine grained routine and instruction level instrumentation is...

  12. Comparative Evaluation of Software Features and Performances.

    Science.gov (United States)

    Cecconi, Daniela

    2016-01-01

    Analysis of two-dimensional gel images is a crucial step for the determination of changes in the protein expression, but at present, it still represents one of the bottlenecks in 2-DE studies. Over the years, different commercial and academic software packages have been developed for the analysis of 2-DE images. Each of these shows different advantageous characteristics in terms of quality of analysis. In this chapter, the characteristics of the different commercial software packages are compared in order to evaluate their main features and performances.

  13. SAPHIRE 8 Volume 2 - Technical Reference

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; S. T. Wood; W. J. Galyean; J. A. Schroeder; M. B. Sattison

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of computer programs that were developed to create and analyze probabilistic risk assessments (PRAs). Herein information is provided on the principles used in the construction and operation of Version 8.0 of the SAPHIRE system. This report summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. This volume then describes the algorithms used to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that apply for various assumptions concerning reparability and mission time. It defines the measures of basic event importance that SAPHIRE can calculate. This volume gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin Hypercube sampling, and states the algorithms used by this program to generate random basic event probabilities from various distributions. Also covered are enhanced capabilities such as seismic analysis, Workspace algorithms, cut set "recovery," end state manipulation, and use of "compound events."
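
    The top-event quantification step summarized here can be illustrated with a short sketch that applies the standard minimal cut set upper bound, P(top) = 1 - prod(1 - P(Ci)), with each cut set probability taken as the product of its basic event probabilities, followed by simple Monte Carlo sampling of lognormal basic event uncertainties. The fault tree and parameter values below are hypothetical, and the sketch is not the SAPHIRE implementation itself.

      # Sketch: top-event quantification from minimal cut sets using the minimal cut set
      # upper bound, plus a simple Monte Carlo uncertainty loop. The fault tree, basic
      # event probabilities and error factor below are hypothetical.
      import numpy as np

      basic_events = {"PUMP-A": 1e-3, "PUMP-B": 1e-3, "VALVE-C": 5e-4, "POWER-D": 2e-4}
      minimal_cut_sets = [("PUMP-A", "PUMP-B"), ("VALVE-C",), ("PUMP-A", "POWER-D")]

      def cut_set_prob(cut_set, probs):
          p = 1.0
          for event in cut_set:
              p *= probs[event]
          return p

      def top_event_prob(cut_sets, probs):
          # Minimal cut set upper bound: P(top) = 1 - prod_i (1 - P(C_i))
          q = 1.0
          for cs in cut_sets:
              q *= 1.0 - cut_set_prob(cs, probs)
          return 1.0 - q

      print("point estimate:", top_event_prob(minimal_cut_sets, basic_events))

      # Monte Carlo sampling: draw each basic event probability from a lognormal
      # distribution whose median is the point estimate (error factor 3, hypothetical).
      rng = np.random.default_rng(1)
      sigma = np.log(3.0) / 1.645
      samples = [
          top_event_prob(
              minimal_cut_sets,
              {e: p * rng.lognormal(0.0, sigma) for e, p in basic_events.items()},
          )
          for _ in range(10000)
      ]
      print("mean:", np.mean(samples), "95th percentile:", np.percentile(samples, 95))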

  14. Intercalibrating and Validating Saphir and Atms Observations

    Science.gov (United States)

    Moradi, I.; Ferraro, R. R.

    2014-12-01

    We present the results of evaluating observations from microwave instruments aboard the Suomi National Polar-orbiting Partnership (NPP, ATMS instrument) and Megha-Tropiques (SAPHIR instrument) satellites. ATMS is a cross-track microwave sounder currently flying on the Suomi National Polar-orbiting Partnership (S-NPP) satellite, launched in October 2011, which is in a Sun-synchronous orbit with an ascending equatorial crossing time of 01:30 a.m. Megha-Tropiques, launched in November 2011, is a low-inclination satellite, meaning that it only visits the tropical band between 30° S and 30° N. SAPHIR is a microwave humidity sounder with 6 channels operating at frequencies close to the water vapor absorption line at 183 GHz. Megha-Tropiques revisits the tropical regions several times a day and provides a great capability for inter-calibrating the observations with those of polar-orbiting satellites. The study includes inter-comparison and inter-calibration of observations of similar channels from the two instruments, evaluation of the satellite data using high-quality radiosonde data from the Atmospheric Radiation Measurement Program and GPS radio occultation observations from the COSMIC mission, as well as geolocation error correction. The results of this study are valuable for generating climate data records from these instruments as well as for extending current climate data records from similar instruments such as AMSU-B and MHS to the ATMS and SAPHIR instruments.

  15. Antecedents and Moderators of Software Professionals’ Performance

    Directory of Open Access Journals (Sweden)

    Shiva Prasad H. C.

    2014-02-01

    Full Text Available Software professionals’ (SPs') performance is often understood narrowly in terms of input–output productivity. This study approaches performance from a broader perspective and examines whether the emotional intelligence competencies (EICs) of SPs, the leadership style of team leaders, social capital among team members, and human resource management (HRM) practices of software firms affect performance of SPs. It also tests whether the value of and opportunities for knowledge sharing moderate such relationships. Data were collected from 441 Indian SPs in a questionnaire survey. Fifty-five team leaders assessed the performance of SPs, and SPs assessed the other constructs. Results revealed that EICs, transformational leadership style, social capital, and HRM practices positively affect performance. EICs are the most important predictors of performance. Under high (low) value of and high (low) opportunities for knowledge sharing, the antecedents influencing performance are strengthened (attenuated) or nullified. The value of and opportunities for knowledge sharing are quasi-moderators. These findings have significant implications for organizing effective work teams.

  16. Software performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2009-01-01

    Praise from the Reviewers: "The practicality of the subject in a real-world situation distinguishes this book from others available on the market." (Professor Behrouz Far, University of Calgary) "This book could replace the computer organization texts now in use that every CS and CpE student must take. . . . It is much needed, well written, and thoughtful." (Professor Larry Bernstein, Stevens Institute of Technology) A distinctive, educational text on software performance and scalability. This is the first book to take a quantitative approach to the subject of software performance and scalability

  17. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... COMMISSION Metal Fatigue Analysis Performed by Computer Software AGENCY: Nuclear Regulatory Commission... applicants' analyses and methodologies using the computer software package, WESTEMS TM , to demonstrate... by Computer Software Addressees All holders of, and applicants for, a power reactor operating...

  18. Performance Engineering Technology for Scientific Component Software

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D.

    2007-05-08

    Large-scale, complex scientific applications are beginning to benefit from the use of component software design methodology and technology for software development. Integral to the success of component-based applications is the ability to achieve high-performing code solutions through the use of performance engineering tools for both intra-component and inter-component analysis and optimization. Our work on this project aimed to develop performance engineering technology for scientific component software in association with the DOE CCTTSS SciDAC project (active during the contract period) and the broader Common Component Architecture (CCA) community. Our specific implementation objectives were to extend the TAU performance system and Program Database Toolkit (PDT) to support performance instrumentation, measurement, and analysis of CCA components and frameworks, and to develop performance measurement and monitoring infrastructure that could be integrated in CCA applications. These objectives have been met in the completion of all project milestones and in the transfer of the technology into the continuing CCA activities as part of the DOE TASCS SciDAC2 effort. In addition to these achievements, over the past three years, we have been an active member of the CCA Forum, attending all meetings and serving in several working groups, such as the CCA Toolkit working group, the CQoS working group, and the Tutorial working group. We have contributed significantly to CCA tutorials since SC'04, hosted two CCA meetings, participated in the annual ACTS workshops, and were co-authors on the recent CCA journal paper [24]. There are four main areas where our project has delivered results: component performance instrumentation and measurement, component performance modeling and optimization, performance database and data mining, and online performance monitoring. This final report outlines the achievements in these areas for the entire project period. The submitted progress

  1. Strangeness Photoproduction with the Saphir Detector

    CERN Document Server

    Menze, D W

    1997-01-01

    Statistically improved data of total cross sections and of angular distributions for differential cross sections and hyperon recoil polarizations of the reactions \\gamma p --> K^+ \\Lambda and \\gamma p --> K^+ \\Sigma^0 have been collected with the SAPHIR detector at photon energies between threshold and 2.0 GeV. Here total cross section data up to 1.5 GeV are presented. The opposite sign of \\Lambda and \\Sigma polarization and the change of sign between forward and backward direction could be confirmed by higher statistics. A steep threshold behaviour of the K^+ \\Lambda total cross section is observed.

  2. Can SAPHIR Instrument Onboard MEGHATROPIQUES Retrieve Hydrometeors and Rainfall Characteristics ?

    Science.gov (United States)

    Goyal, J. M.; Srinivasan, J.; Satheesh, S. K.

    2014-12-01

    MEGHATROPIQUES (MT) is an Indo-French satellite launched in 2011 with the main intention of understanding the water cycle in the tropical region, and is a part of the GPM constellation. MADRAS was the primary instrument on board MT to estimate rainfall characteristics, but unfortunately its scanning mechanism failed, obscuring the primary goal of the mission. So an attempt has been made to retrieve rainfall and different hydrometeors using the other instrument, SAPHIR, onboard MT. The most important advantage of using MT is its orbitography, which is specifically designed for tropical regions and can reach up to 6 passes per day, more than any other satellite currently in orbit. Although SAPHIR is a humidity sounder with six channels centred around the 183 GHz channel, it still operates in the microwave region, which directly interacts with rainfall, especially in the wing channels, and thus can pick up rainfall signatures. Initial analysis using radiative transfer models also establishes this fact. To get more conclusive results using observations, SAPHIR level 1 brightness temperature (BT) data were compared with different rainfall products, utilizing the benefits of each product. SAPHIR BT comparison with TRMM 3B42 for one pass clearly showed that channels 5 and 6 have a considerable sensitivity towards rainfall. Following this, a large database of more than 300000 raining pixels of spatially and temporally collocated 3B42 rainfall and corresponding SAPHIR BT for an entire month was created to include all kinds of rainfall events; to attain higher temporal resolution, a collocated database was also created for SAPHIR BT and rainfall from the infrared sensor on the geostationary satellite Kalpana-1. These databases were used to understand the response of the various SAPHIR channels to different rainfall regimes. The TRMM 2A12 rainfall product was also used to identify the capability of SAPHIR to retrieve cloud and ice water path, which also gave significant correlation. Conclusively, we have shown that SAPHIR has

  3. 75 FR 33162 - Airworthiness Directives; Microturbo Saphir 20 Model 095 Auxiliary Power Units (APUs)

    Science.gov (United States)

    2010-06-11

    ...-21-AD; Amendment 39-16332; AD 2010-13-01] RIN 2120-AA64 Airworthiness Directives; Microturbo Saphir..., of the SAPHIR 20 Model 095 APU is a life-limited part. Microturbo had determined through ``fleet...-015-03, of the SAPHIR 20 Model 095 APU is a life-limited part. Microturbo had determined...

  4. Teaching Software Developers to Perform UX Tasks

    DEFF Research Database (Denmark)

    Øvad, Tina; Bornoe, Nis; Larsen, Lars Bo

    2015-01-01

    Good UX design is becoming important within the industry when developing new products. This entails that UX skills have to be available in the development processes. This paper investigates the opportunities of using software developers as a UX work resource in the day-to-day working practice...

  5. Predicting SMT Solver Performance for Software Verification

    Directory of Open Access Journals (Sweden)

    Andrew Healy

    2017-01-01

    Full Text Available The Why3 IDE and verification system facilitates the use of a wide range of Satisfiability Modulo Theories (SMT solvers through a driver-based architecture. We present Where4: a portfolio-based approach to discharge Why3 proof obligations. We use data analysis and machine learning techniques on static metrics derived from program source code. Our approach benefits software engineers by providing a single utility to delegate proof obligations to the solvers most likely to return a useful result. It does this in a time-efficient way using existing Why3 and solver installations - without requiring low-level knowledge about SMT solver operation from the user.
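
    A rough sketch of the portfolio idea described above, assuming hypothetical per-solver timing data and a handful of numeric program metrics; it only illustrates predicting runtimes and delegating to the solver with the best prediction, and is not the Where4 code.

      # Sketch: a portfolio-style choice of SMT solver from static metrics. Solver names,
      # metrics and runtimes are synthetic; this is not the Where4 implementation.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      solvers = ["Alt-Ergo", "Z3", "CVC4"]
      rng = np.random.default_rng(2)
      X_train = rng.random((200, 5))                              # static metrics per obligation
      runtimes = {s: rng.gamma(2.0, 1.0, 200) for s in solvers}   # observed runtimes (seconds)

      # One runtime-prediction model per solver.
      models = {s: RandomForestRegressor(n_estimators=50, random_state=0) for s in solvers}
      for s in solvers:
          models[s].fit(X_train, runtimes[s])

      def choose_solver(metrics):
          # Delegate to the solver with the lowest predicted runtime.
          predicted = {s: float(models[s].predict(metrics.reshape(1, -1))[0]) for s in solvers}
          return min(predicted, key=predicted.get), predicted

      best, predicted = choose_solver(rng.random(5))
      print("delegate to:", best, predicted)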

  6. Design and performance test of spacecraft test and operation software

    Science.gov (United States)

    Wang, Guohua; Cui, Yan; Wang, Shuo; Meng, Xiaofeng

    2011-06-01

    Main test processor (MTP) software is the key element of the Electrical Ground Support Equipment (EGSE) for spacecraft test and operation, used in the Chinese Academy of Space Technology (CAST) for years without innovation. With the increasing demand for more efficient and agile MTP software, new MTP software was developed. It adopts a layered and plug-in based software architecture, whose core runtime server provides message queue management, shared memory management and process management services and forms the framework for a configurable and open architecture system. To investigate the MTP software's performance, test cases for network response time, test sequence management capability and data-processing capability were introduced in detail. Test results show that the MTP software is generic and has higher performance than the legacy one.
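
    A toy sketch of the layered, plug-in style described above: a small runtime core that owns a message queue and dispatches messages to registered plug-ins. The class, message types and payloads are illustrative only, not the actual MTP design.

      # Sketch: a minimal plug-in runtime with a central message queue, loosely in the spirit
      # of the layered, plug-in based architecture described above (illustrative names only).
      import queue
      import threading

      class RuntimeCore:
          def __init__(self):
              self.handlers = {}             # message type -> plug-in callback
              self.messages = queue.Queue()  # central message queue

          def register(self, msg_type, handler):
              self.handlers[msg_type] = handler

          def post(self, msg_type, payload):
              self.messages.put((msg_type, payload))

          def run(self):
              while True:
                  msg_type, payload = self.messages.get()
                  if msg_type == "shutdown":
                      break
                  self.handlers.get(msg_type, lambda _: None)(payload)

      core = RuntimeCore()
      core.register("telemetry", lambda frame: print("decode frame:", frame))
      core.register("command", lambda cmd: print("send command:", cmd))

      worker = threading.Thread(target=core.run)
      worker.start()
      core.post("telemetry", {"apid": 101, "data": b"\x01\x02"})
      core.post("command", "SWITCH_ON_HEATER")
      core.post("shutdown", None)
      worker.join()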

  7. Performance Optimization of Deployed Software-as-a-Service Applications

    NARCIS (Netherlands)

    Bezemer, C.-P.; Zaidman, A.

    2013-01-01

    The goal of performance maintenance is to improve the performance of a software system after delivery. As the performance of a system is often characterized by unexpected combinations of metric values, manual analysis of performance is hard in complex systems. In thi

  8. Determinants of business model performance in software firms

    OpenAIRE

    Rajala, Risto

    2009-01-01

    The antecedents and consequences of business model design have gained increasing interest among information system (IS) scholars and business practitioners alike. Based on an extensive literature review and empirical research, this study investigates the factors that drive business model design and the performance effects generated by the different kinds of business models in software firms. The main research question is: “What are the determinants of business model performance in the softwar...

  9. Haptic interfaces: Hardware, software and human performance

    Science.gov (United States)

    Srinivasan, Mandayam A.

    1995-01-01

    Virtual environments are computer-generated synthetic environments with which a human user can interact to perform a wide variety of perceptual and motor tasks. At present, most of the virtual environment systems engage only the visual and auditory senses, and not the haptic sensorimotor system that conveys the sense of touch and feel of objects in the environment. Computer keyboards, mice, and trackballs constitute relatively simple haptic interfaces. Gloves and exoskeletons that track hand postures have more interaction capabilities and are available in the market. Although desktop and wearable force-reflecting devices have been built and implemented in research laboratories, the current capabilities of such devices are quite limited. To realize the full promise of virtual environments and teleoperation of remote systems, further developments of haptic interfaces are critical. In this paper, the status and research needs in human haptics, technology development and interactions between the two are described. In particular, the excellent performance characteristics of Phantom, a haptic interface recently developed at MIT, are highlighted. Realistic sensations of single point of contact interactions with objects of variable geometry (e.g., smooth, textured, polyhedral) and material properties (e.g., friction, impedance) in the context of a variety of tasks (e.g., needle biopsy, switch panels) achieved through this device are described and the associated issues in haptic rendering are discussed.

  10. Impact of Megha-Tropiques SAPHIR radiance assimilation on the simulation of tropical cyclones over Bay of Bengal

    Science.gov (United States)

    Dhanya, M.; Gopalakrishnan, Deepak; Chandrasekar, Anantharaman; Singh, Sanjeev Kumar; Prasad, V. S.

    2016-05-01

    The impact of SAPHIR radiance assimilation on the simulation of tropical cyclones over the Indian region has been investigated using the Weather Research and Forecasting (WRF) model. Three cyclones that formed over the Bay of Bengal have been considered in the present study. The assimilation methodology used here is the three-dimensional variational (3DVar) scheme within the WRF model. With the initial and boundary conditions from Global Forecast System (GFS) analyses from the National Centers for Environmental Prediction (NCEP), a control run (CTRL) without assimilation of any data and a 3DVar run with the assimilation of SAPHIR radiance have been performed. Both model simulations have been compared with observations from the India Meteorological Department (IMD), the Tropical Rainfall Measuring Mission (TRMM), and analysis fields from GFS. Detailed analysis reveals that the SAPHIR radiance assimilation has led to significant improvement in the simulation of all three cyclones in terms of cyclone track, intensity, and accumulated rainfall. The simulated warm core structure and relative vorticity profile of each cyclone in the 3DVar run are found to be closer to the GFS analyses than those of the CTRL run.
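
    For readers unfamiliar with 3DVar, the analysis that such a scheme produces can be sketched with small matrices; for a linear observation operator, minimizing the 3DVar cost function gives x_a = x_b + B H^T (H B H^T + R)^-1 (y - H x_b). The numbers below are hypothetical and the sketch is not the WRF/WRFDA implementation.

      # Sketch: a toy linear 3DVar analysis step with tiny matrices (not the WRF/WRFDA code).
      # x_b is the background state, y the observations, H the observation operator, and
      # B and R the background and observation error covariances; all values are hypothetical.
      import numpy as np

      x_b = np.array([280.0, 275.0, 270.0])    # background state (e.g., layer temperatures)
      B = np.diag([1.0, 1.0, 1.0])             # background error covariance
      H = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])          # observe the first two state elements
      y = np.array([281.5, 274.0])             # observations
      R = np.diag([0.5, 0.5])                  # observation error covariance

      # Analysis equivalent to minimizing the 3DVar cost function
      # J(x) = 0.5*(x - x_b)^T B^-1 (x - x_b) + 0.5*(y - H x)^T R^-1 (y - H x)
      K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
      x_a = x_b + K @ (y - H @ x_b)
      print("analysis state:", x_a)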

  11. Performance testing of 3D point cloud software

    Science.gov (United States)

    Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2013-10-01

    LiDAR systems have been used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirement involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available in the market, including open source suites; however, users often lack methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as the commercial suites in the working set and commit size tests.
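
    A minimal sketch of the kind of measurement used in such a test, timing how long a point cloud takes to load and sampling the process memory afterwards; the file name, format and Python loader are placeholders, since the actual study exercised desktop suites rather than a script.

      # Sketch: measure loading time and resident memory for a point-cloud load, the kind of
      # metric used in such performance tests. The ASCII x-y-z file and NumPy loader are
      # placeholders for illustration only.
      import time
      import numpy as np
      import psutil

      # Create a synthetic point cloud file so the sketch is self-contained.
      rng = np.random.default_rng(0)
      np.savetxt("cloud.xyz", rng.random((100000, 3)))

      def benchmark_load(path):
          process = psutil.Process()
          start = time.perf_counter()
          points = np.loadtxt(path)                 # columns: x y z (placeholder format)
          elapsed = time.perf_counter() - start
          rss_mb = process.memory_info().rss / 1e6  # resident memory after loading
          return points.shape[0], elapsed, rss_mb

      n_points, seconds, rss = benchmark_load("cloud.xyz")
      print(f"{n_points} points loaded in {seconds:.2f} s, resident memory {rss:.0f} MB")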

  12. Syntax Editing for Mark 4-A System Performance Test Software

    Science.gov (United States)

    Jacobson, G. N.

    1983-01-01

    This article describes the syntax editing concepts used by the Operations Sustaining Engineering Section in implementing System Performance Test software for the Mark 4-A era. The processing functions are discussed, as well as the necessary data structures and table generation macros used in implementing those functions. In addition, the procedural and software interfaces which have been developed for users of the syntax editor are described, including the forms required for establishing directive and parameter characteristics.

  13. CRPC research into linear algebra software for high performance computers

    Energy Technology Data Exchange (ETDEWEB)

    Choi, J.; Walker, D.W. [Oak Ridge National Lab., TN (United States). Mathematical Sciences Section]; Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science; Oak Ridge National Lab., TN (United States). Mathematical Sciences Section]; Pozo, R. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science]; Sorensen, D.C. [Rice Univ., Houston, TX (United States). Dept. of Computational and Applied Mathematics]

    1994-12-31

    In this paper the authors look at a number of approaches being investigated in the Center for Research on Parallel Computation (CRPC) to develop linear algebra software for high-performance computers. These approaches are exemplified by the LAPACK, templates, and ARPACK projects. LAPACK is a software library for performing dense and banded linear algebra computations, and was designed to run efficiently on high-performance computers. The authors focus on the design of the distributed-memory version of LAPACK, and on an object-oriented interface to LAPACK.

  14. Performance Evaluation of 3d Modeling Software for Uav Photogrammetry

    Science.gov (United States)

    Yanagi, H.; Chikatsu, H.

    2016-06-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAVs and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modeling software are black-box algorithms. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motive, Smart3DCapture and Pix4Dmapper were downloaded from the Internet, and the commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from a UAV.
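
    The accuracy evaluation described here reduces to comparing reconstructed 3D coordinates against surveyed check points; a minimal sketch with hypothetical coordinates follows.

      # Sketch: accuracy check of a reconstructed model against surveyed 3D check points.
      # Coordinates are hypothetical; real check points would come from GNSS or total station
      # surveys, and reconstructed coordinates from the 3D modeling software under test.
      import numpy as np

      surveyed = np.array([[10.000, 20.000, 5.000],
                           [15.000, 22.000, 5.500],
                           [12.500, 18.000, 4.800]])
      reconstructed = np.array([[10.012, 19.991, 5.020],
                                [15.008, 22.015, 5.470],
                                [12.488, 18.006, 4.830]])

      residuals = reconstructed - surveyed
      rmse_xyz = np.sqrt(np.mean(residuals ** 2, axis=0))          # per-axis RMSE
      rmse_3d = np.sqrt(np.mean(np.sum(residuals ** 2, axis=1)))   # 3D point RMSE
      print("RMSE x, y, z:", rmse_xyz, "3D RMSE:", rmse_3d)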

  15. Performance evaluation software moving object detection and tracking in videos

    CERN Document Server

    Karasulu, Bahadir

    2013-01-01

    Performance Evaluation Software: Moving Object Detection and Tracking in Videos introduces a software approach for the real-time evaluation and performance comparison of the methods specializing in moving object detection and/or tracking (D&T) in video processing. Digital video content analysis is an important item for multimedia content-based indexing (MCBI), content-based video retrieval (CBVR) and visual surveillance systems. There are some frequently-used generic algorithms for video object D&T in the literature, such as Background Subtraction (BS), Continuously Adaptive Mean-shift (CMS),

  16. Performance testing of LiDAR exploitation software

    Science.gov (United States)

    Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2013-04-01

    Mobile LiDAR systems have been used widely in recent years for many applications in the field of geoscience. One of the most important limitations of this technology is the large computational requirement involved in data processing. Several software solutions for data processing are available in the market, but users often do not know the methodologies to verify their performance accurately. In this work a methodology for LiDAR software performance testing is presented and six different suites are studied: QT Modeler, AutoCAD Civil 3D, Mars 7, Fledermaus, Carlson and TopoDOT (all of them in x64). Results show that QT Modeler, TopoDOT and AutoCAD Civil 3D allow the loading of large datasets, while Fledermaus, Mars 7 and Carlson do not achieve this level of performance. AutoCAD Civil 3D needs a long loading time in comparison with the most powerful suites such as QT Modeler and TopoDOT. The Carlson suite shows the poorest results among all the software under study: point clouds larger than 5 million points cannot be loaded, and loading time is very long in comparison with the other suites even for the smaller datasets. AutoCAD Civil 3D, Carlson and TopoDOT use more threads than the other suites such as QT Modeler, Mars 7 and Fledermaus.

  17. All-sky radiance simulation of Megha-Tropiques SAPHIR microwave sensor using multiple scattering radiative transfer model for data assimilation applications

    Indian Academy of Sciences (India)

    A Madhulatha; John P George; E N Rajagopal

    2017-03-01

    Incorporation of cloud- and precipitation-affected radiances from microwave satellite sensors in data assimilation system has a great potential in improving the accuracy of numerical model forecasts over the regions of high impact weather. By employing the multiple scattering radiative transfer model RTTOV-SCATT, all-sky radiance (clear sky and cloudy sky) simulation has been performed for six channel microwave SAPHIR (Sounder for Atmospheric Profiling of Humidity in the Inter-tropics by Radiometry) sensors of Megha-Tropiques (MT) satellite. To investigate the importance of cloud-affected radiance data in severe weather conditions, all-sky radiance simulation is carried out for the severe cyclonic storm ‘Hudhud’ formed over Bay of Bengal. Hydrometeors from NCMRWF unified model (NCUM) forecasts are used as input to the RTTOV model to simulate cloud-affected SAPHIR radiances. Horizontal and vertical distribution of all-sky simulated radiances agrees reasonably well with the SAPHIR observed radiances over cloudy regions during different stages of cyclone development. Simulated brightness temperatures of six SAPHIR channels indicate that the three dimensional humidity structure of tropical cyclone is well represented in all-sky computations. Improved correlation and reduced bias and root mean square error against SAPHIR observations are apparent. Probability distribution functions reveal that all-sky simulations are able to produce the cloud-affected lower brightness temperatures associated with cloudy regions. The density scatter plots infer that all-sky radiances are more consistent with observed radiances. Correlation between different types of hydrometeors and simulated brightness temperatures at respective atmospheric levels highlights the significance of inclusion of scattering effects from different hydrometeors in simulating the cloud-affected radiances in all-sky simulations. The results are promising and suggest that the inclusion of multiple scattering

  18. All-sky radiance simulation of Megha-Tropiques SAPHIR microwave sensor using multiple scattering radiative transfer model for data assimilation applications

    Science.gov (United States)

    Madhulatha, A.; George, John P.; Rajagopal, E. N.

    2017-03-01

    Incorporation of cloud- and precipitation-affected radiances from microwave satellite sensors in data assimilation system has a great potential in improving the accuracy of numerical model forecasts over the regions of high impact weather. By employing the multiple scattering radiative transfer model RTTOV-SCATT, all-sky radiance (clear sky and cloudy sky) simulation has been performed for six channel microwave SAPHIR (Sounder for Atmospheric Profiling of Humidity in the Inter-tropics by Radiometry) sensors of Megha-Tropiques (MT) satellite. To investigate the importance of cloud-affected radiance data in severe weather conditions, all-sky radiance simulation is carried out for the severe cyclonic storm `Hudhud' formed over Bay of Bengal. Hydrometeors from NCMRWF unified model (NCUM) forecasts are used as input to the RTTOV model to simulate cloud-affected SAPHIR radiances. Horizontal and vertical distribution of all-sky simulated radiances agrees reasonably well with the SAPHIR observed radiances over cloudy regions during different stages of cyclone development. Simulated brightness temperatures of six SAPHIR channels indicate that the three dimensional humidity structure of tropical cyclone is well represented in all-sky computations. Improved correlation and reduced bias and root mean square error against SAPHIR observations are apparent. Probability distribution functions reveal that all-sky simulations are able to produce the cloud-affected lower brightness temperatures associated with cloudy regions. The density scatter plots infer that all-sky radiances are more consistent with observed radiances. Correlation between different types of hydrometeors and simulated brightness temperatures at respective atmospheric levels highlights the significance of inclusion of scattering effects from different hydrometeors in simulating the cloud-affected radiances in all-sky simulations. The results are promising and suggest that the inclusion of multiple scattering

  19. Conference on High Performance Software for Nonlinear Optimization

    CERN Document Server

    Murli, Almerico; Pardalos, Panos; Toraldo, Gerardo

    1998-01-01

    This book contains a selection of papers presented at the conference on High Performance Software for Nonlinear Optimization (HPSNO97), which was held in Ischia, Italy, in June 1997. The rapid progress of computer technologies, including new parallel architectures, has stimulated a large amount of research devoted to building software environments and defining algorithms able to fully exploit this new computational power. In some sense, numerical analysis has to conform itself to the new tools. The impact of parallel computing in nonlinear optimization, which had a slow start at the beginning, seems now to increase at a fast rate, and it is reasonable to expect an even greater acceleration in the future. As with the first HPSNO conference, the goal of the HPSNO97 conference was to supply a broad overview of the more recent developments and trends in nonlinear optimization, emphasizing the algorithmic and high performance software aspects. Bringing together new computational methodologies with theoretical...

  20. Mitigating the controller performance bottlenecks in Software Defined Networks

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius; Soler, José

    2016-01-01

    The centralization of the control plane decision logic in Software Defined Networking (SDN) has raised concerns regarding the performance of the SDN Controller (SDNC) when the network scales up. A number of solutions have been proposed in the literature to address these concerns. This paper propo...

  1. Component-based software for high-performance scientific computing

    Science.gov (United States)

    Alexeev, Yuri; Allan, Benjamin A.; Armstrong, Robert C.; Bernholdt, David E.; Dahlgren, Tamara L.; Gannon, Dennis; Janssen, Curtis L.; Kenny, Joseph P.; Krishnan, Manojkumar; Kohl, James A.; Kumfert, Gary; Curfman McInnes, Lois; Nieplocha, Jarek; Parker, Steven G.; Rasmussen, Craig; Windus, Theresa L.

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  2. Software life cycle dynamic simulation model: The organizational performance submodel

    Science.gov (United States)

    Tausworthe, Robert C.

    1985-01-01

    The submodel structure of a software life cycle dynamic simulation model is described. The software process is divided into seven phases, each with product, staff, and funding flows. The model is subdivided into an organizational response submodel, a management submodel, a management influence interface, and a model analyst interface. The concentration here is on the organizational response model, which simulates the performance characteristics of a software development subject to external and internal influences. These influences emanate from two sources: the model analyst interface, which configures the model to simulate the response of an implementing organization subject to its own internal influences, and the management submodel that exerts external dynamic control over the production process. A complete characterization is given of the organizational response submodel in the form of parameterized differential equations governing product, staffing, and funding levels. The parameter values and functions are allocated to the two interfaces.
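
    As a toy illustration of the kind of parameterized rate equations mentioned above (not the actual submodel), the following sketch integrates coupled staffing and product-completion rates with explicit Euler steps.

      # Sketch: a toy pair of coupled rate equations for staffing and product completion,
      # integrated with explicit Euler steps. The equations, parameters and units are
      # illustrative only and are not the submodel described above.
      import numpy as np

      dt = 0.1                 # time step (weeks)
      productivity = 0.5       # product units per staff-week
      hiring_rate = 0.2        # fraction of the staffing shortfall closed per week
      target_staff = 10.0
      product_goal = 400.0

      staff, product = 2.0, 0.0
      for t in np.arange(0.0, 200.0, dt):
          d_staff = hiring_rate * (target_staff - staff)   # staffing approaches its target
          d_product = productivity * staff                 # output grows with current staff
          staff += dt * d_staff
          product += dt * d_product
          if product >= product_goal:
              print(f"goal reached at t = {t:.1f} weeks with {staff:.1f} staff")
              break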

  3. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Full Text Available Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge to execute such analysis is not trivial and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided basic requirements for the proposed environment. We implemented the SPEAKER environment integrating supporting tools for the execution of activities and tasks of performance analysis and the knowledge necessary to execute them, in order to meet the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER’s modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify if it attains its goal of assisting in the execution of process performance analysis by non-specialist people.

  4. Rain detection and measurement from Megha-Tropiques microwave sounder—SAPHIR

    Science.gov (United States)

    Varma, Atul Kumar; Piyush, D. N.; Gohil, B. S.; Pal, P. K.; Srinivasan, J.

    2016-08-01

    The Megha-Tropiques, an Indo-French satellite, carries on board a microwave sounder, Sondeur Atmosphérique du Profil d'Humidité Intertropical par Radiométrie (SAPHIR), and a microwave radiometer, Microwave Analysis and Detection of Rain and Atmospheric Structures (MADRAS), along with two other instruments. Being a Global Precipitation Measurement constellation satellite, MT-MADRAS was an important sensor to study convective clouds and rainfall. Due to the non-functioning of MADRAS, the possibility of detecting and estimating rain from SAPHIR is explored. Using near-concurrent observations from SAPHIR and the precipitation radar (PR) onboard the Tropical Rainfall Measuring Mission (TRMM), the rain effect on the SAPHIR channels is examined. All six channels of SAPHIR are used to calculate the average rain probability for each SAPHIR pixel. Further, an exponential rain retrieval algorithm is developed. This algorithm yields a correlation of 0.72, an RMS error of 0.75 mm/h, and a bias of 0.04 mm/h. When the rain identification and retrieval algorithms are applied together, they yield a correlation of 0.69 with an RMS error of 0.47 mm/h and a bias of 0.01 mm/h. On applying the algorithm to an independent SAPHIR data set and comparing with TRMM 3B42 rain on a monthly scale, it yields a correlation of 0.85 and an RMS error of 0.09 mm/h. Further, the distribution of the rain difference between SAPHIR and other rain products is presented on the global scale as well as for the climatic zones. For examining the capability of SAPHIR to measure intense rain, instantaneous rain over cyclone Phailin from SAPHIR is compared with other standard satellite-based rain products such as 3B42, Global Satellite Mapping of Precipitation, and Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks.
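
    A hedged sketch of fitting an exponential relation between a brightness-temperature depression and rain rate, in the spirit of the exponential retrieval mentioned above; the data are synthetic and the functional form and coefficients are assumptions, not the authors' actual retrieval.

      # Sketch: fit an exponential rain-rate relation R = a * exp(b * dTb), where dTb is a
      # brightness-temperature depression. The synthetic data and functional form are
      # illustrative; they are not the coefficients or channels of the retrieval above.
      import numpy as np
      from scipy.optimize import curve_fit

      def exponential_retrieval(dTb, a, b):
          return a * np.exp(b * dTb)

      rng = np.random.default_rng(3)
      dTb = rng.uniform(0.0, 40.0, 300)                               # depressions (K)
      rain = 0.1 * np.exp(0.08 * dTb) * rng.lognormal(0.0, 0.3, 300)  # synthetic rain (mm/h)

      params, _ = curve_fit(exponential_retrieval, dTb, rain, p0=(0.1, 0.05))
      predicted = exponential_retrieval(dTb, *params)
      r = np.corrcoef(rain, predicted)[0, 1]
      rmse = np.sqrt(np.mean((rain - predicted) ** 2))
      print(f"a = {params[0]:.3f}, b = {params[1]:.3f}, r = {r:.2f}, rmse = {rmse:.2f} mm/h")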

  5. Hardware support for software controlled fast reconfiguration of performance counters

    Science.gov (United States)

    Salapura, Valentina; Wisniewski, Robert W.

    2013-06-18

    Hardware support for software controlled reconfiguration of performance counters may include a plurality of performance counters collecting one or more counts of one or more selected activities. A storage element stores a data value representing a time interval, and a timer element reads the data value and detects expiration of the time interval based on the data value and generates a signal. A plurality of configuration registers stores a set of performance counter configurations. A state machine receives the signal and selects a configuration register from the plurality of configuration registers for reconfiguring the one or more performance counters.

  6. Hardware support for software controlled fast reconfiguration of performance counters

    Science.gov (United States)

    Salapura, Valentina; Wisniewski, Robert W

    2013-09-24

    Hardware support for software controlled reconfiguration of performance counters may include a plurality of performance counters collecting one or more counts of one or more selected activities. A storage element stores a data value representing a time interval, and a timer element reads the data value and detects expiration of the time interval based on the data value and generates a signal. A plurality of configuration registers stores a set of performance counter configurations. A state machine receives the signal and selects a configuration register from the plurality of configuration registers for reconfiguring the one or more performance counters.

  7. Large Scale and Performance tests of the ATLAS Online Software

    Institute of Scientific and Technical Information of China (English)

    Alexandrov; H. Wolters; et al.

    2001-01-01

    One of the sub-systems of the Trigger/DAQ system of the future ATLAS experiment is the Online Software system. It encompasses the functionality needed to configure, control and monitor the DAQ. Its architecture is based on a component structure described in the ATLAS Trigger/DAQ technical proposal. Regular integration tests ensure its smooth operation in test beam setups during its evolutionary development towards the final ATLAS online system. Feedback is received and returned into the development process. Studies of the system behavior have been performed on a set of up to 111 PCs in a configuration which is getting closer to the final size. Large scale and performance tests of the integrated system were performed on this setup with emphasis on investigating the aspects of the inter-dependence of the components and the performance of the communication software. Of particular interest were the run control state transitions in various configurations of the run control hierarchy. For the purpose of the tests, the software from other Trigger/DAQ sub-systems has been emulated. This paper presents a brief overview of the online system structure, its components and the large scale integration tests and their results.

  8. Software for evaluation of EPR-dosimetry performance.

    Science.gov (United States)

    Shishkina, E A; Timofeev, Yu S; Ivanov, D V

    2014-06-01

    Electron paramagnetic resonance (EPR) with tooth enamel is a method extensively used for retrospective external dosimetry. Different research groups apply different equipment, sample preparation procedures and spectrum processing algorithms for EPR dosimetry. A uniform algorithm for description and comparison of performances was designed and implemented in a new computer code. The aim of the paper is to introduce the new software 'EPR-dosimetry performance'. The computer code is a user-friendly tool for providing a full description of method-specific capabilities of EPR tooth dosimetry, from metrological characteristics to practical limitations in applications. The software, designed for scientists and engineers, has several applications, including support of method calibration by evaluation of calibration parameters, evaluation of the critical value and detection limit for registration of the radiation-induced signal amplitude, estimation of the critical value and detection limit for dose evaluation, estimation of the minimal detectable value for anthropogenic dose assessment, and description of method uncertainty.
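
    The critical value and detection limit mentioned here are commonly derived from the variability of the signal in unexposed (blank) samples using Currie-type formulas; the sketch below makes that assumption with illustrative numbers and is not the code described in the paper.

      # Sketch: Currie-style critical value and detection limit for a radiation-induced EPR
      # signal, estimated from repeated measurements of unexposed (blank) enamel samples.
      # The amplitudes, simple linear calibration and 95% quantile are illustrative
      # assumptions, not the algorithm of the software described above.
      import numpy as np

      blank_amplitudes = np.array([0.8, 1.1, 0.9, 1.3, 0.7, 1.0, 1.2, 0.9])  # arbitrary units
      sigma0 = np.std(blank_amplitudes, ddof=1)  # standard deviation of the blank signal

      k = 1.645                                  # one-sided 95% quantile of N(0, 1)
      critical_value = k * sigma0                # net amplitude above which "signal detected"
      detection_limit = 2.0 * k * sigma0         # net amplitude detectable with ~95% probability

      calibration = 0.02                         # signal amplitude per mGy (hypothetical)
      min_detectable_dose = detection_limit / calibration
      print(f"critical value: {critical_value:.2f}, detection limit: {detection_limit:.2f}")
      print(f"minimum detectable dose: {min_detectable_dose:.0f} mGy")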

  9. High-Level Synthesis: Productivity, Performance, and Software Constraints

    Directory of Open Access Journals (Sweden)

    Yun Liang

    2012-01-01

    Full Text Available FPGAs are an attractive platform for applications with high computation demand and low energy consumption requirements. However, design effort for FPGA implementations remains high—often an order of magnitude larger than design effort using high-level languages. Instead of this time-consuming process, high-level synthesis (HLS) tools generate hardware implementations from algorithm descriptions in languages such as C/C++ and SystemC. Such tools reduce design effort: high-level descriptions are more compact and less error prone. HLS tools promise hardware development abstracted from software designer knowledge of the implementation platform. In this paper, we present an unbiased study of the performance, usability and productivity of HLS using AutoPilot (a state-of-the-art HLS tool). In particular, we first evaluate AutoPilot using the popular embedded benchmark kernels. Then, to evaluate the suitability of HLS on real-world applications, we perform a case study of stereo matching, an active area of computer vision research that uses techniques also common for image denoising, image retrieval, feature matching, and face recognition. Based on our study, we provide insights on current limitations of mapping general-purpose software to hardware using HLS and some future directions for HLS tool development. We also offer several guidelines for hardware-friendly software design. For popular embedded benchmark kernels, the designs produced by HLS achieve 4X to 126X speedup over the software version. The stereo matching algorithms achieve between 3.5X and 67.9X speedup over software (but still less than manual RTL design), with a fivefold reduction in design effort versus manual RTL design.

  10. EPIQR software. [Energy Performance, Indoor Environmental Quality, Retrofit

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. (Federal Institute of Technology, Lausanne (Switzerland)); Droutsa, K. (National Observatory of Athens, Athens (Greece)); Wittchen, K.B. (Danish Building Research Institute, Hoersholm (Denmark))

    1999-01-01

    The EPIQR method is supported by a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  11. Improving Software Performance in the Compute Unified Device Architecture

    Directory of Open Access Journals (Sweden)

    Alexandru PIRJAN

    2010-01-01

    Full Text Available This paper analyzes several aspects regarding the improvement of software performance for applications written in the Compute Unified Device Architecture (CUDA). We address an issue of great importance when programming a CUDA application: the Graphics Processing Unit’s (GPU’s) memory management through transpose kernels. We also benchmark and evaluate the performance of progressively optimizing a matrix transpose application in CUDA. One particular interest was to research how well the optimization techniques, applied to software applications written in CUDA, scale to the latest generation of general-purpose graphic processors units (GPGPU), like the Fermi architecture implemented in the GTX480 and the previous architecture implemented in the GTX280. Lately, there has been a lot of interest in the literature for this type of optimization analysis, but none of the works so far (to our best knowledge) tried to validate whether the optimizations apply to a GPU from the latest Fermi architecture and how well the Fermi architecture scales to these software performance improving techniques.

  12. Investigation of Monoterpene Degradation in the Atmospheric Simulation Chamber SAPHIR

    Science.gov (United States)

    Kaminski, Martin; Acir, Ismail-Hakki; Bohn, Birger; Brauers, Theo; Dorn, Hans-Peter; Fuchs, Hendrik; Haeseler, Rolf; Hofzumahaus, Andreas; Li, Xin; Lutz, Anna; Nehr, Sascha; Rohrer, Franz; Tillmann, Ralf; Wegener, Robert; Wahner, Andreas

    2013-04-01

    Monoterpenes are the volatile organic compound (VOC) species with the highest emission rates on a global scale besides isoprene. In the atmosphere these compounds are rapidly oxidized. Due to their high reactivity towards hydroxyl radicals (OH), they determine the radical chemistry under biogenic conditions if the monoterpene concentration is higher than the isoprene concentration. Recent field campaigns showed large discrepancies between measured and modeled OH concentration at low NOx conditions together with high reactivity of VOC towards OH (Hofzumahaus et al. 2009), especially in tropical forest areas (Lelieveld et al. 2008). These discrepancies were partly explained by new reaction pathways in the isoprene degradation mechanism (Whalley et al. 2011). However, even an additional recycling rate of 2.7 was insufficient to explain the measured OH concentration. So other VOC species could be involved in a nonclassical OH recycling. Since the discrepancies in OH also occurred in the morning hours when the OH chemistry was mainly dominated by monoterpenes, it was assumed that the degradation of monoterpenes may also lead to OH recycling in the absence of NO (Whalley et al. 2011). The photochemical degradation of four monoterpene species was studied under high VOC reactivity and low NOx conditions in a dedicated series of experiments in the atmospheric simulation chamber SAPHIR from August to September 2012 to overcome the lack of mechanistic information for monoterpene degradation schemes. α-Pinene, β-pinene and limonene were chosen as the most prominent representatives of this substance class. Moreover, the degradation of myrcene was investigated due to its structural analogy to isoprene. The SAPHIR chamber was equipped with instrumentation to measure all important OH precursors (O3, HONO, HCHO), the parent VOC and their main oxidation products, radicals (OH, HO2, RO2), the total OH reactivity, and photolysis frequencies to investigate the degradation mechanism of monoterpenes in

  13. Performance Analysis of the ATLAS Second Level Trigger Software

    CERN Document Server

    Bogaerts, J A C; Li, W; Middleton, R P; Werner, P; Wickens, F J; Zobernig, H

    2002-01-01

    In this paper we analyse the performance of the prototype software developed for the ATLAS Second Level Trigger. The OO framework written in C++ has been used to implement a distributed system which collects (simulated) detector data on which it executes event selection algorithms. The software has been used on testbeds of up to 100 nodes with various interconnect technologies. The final system will have to sustain traffic of ~40 Gbit/s and will require an estimated ~750 processors. Timing measurements are crucial for issues such as trigger decision latency, assessment of required CPU and network capacity, scalability, and load-balancing. In addition, final architectural and technological choices, code optimisation and system tuning require a detailed understanding of both CPU utilisation and trigger decision latency. In this paper we describe the instrumentation used to disentangle effects due to such factors as OS system intervention, blocking on interlocks (applications are multi-threaded)...

  14. Software Systems for High-performance Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; Britt, Keith A [ORNL

    2016-01-01

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  15. Performance Evaluation of Communication Software Systems for Distributed Computing

    Science.gov (United States)

    Fatoohi, Rod

    1996-01-01

    In recent years there has been an increasing interest in object-oriented distributed computing since it is better equipped to deal with complex systems while providing extensibility, maintainability, and reusability. At the same time, several new high-speed network technologies have emerged for local and wide area networks. However, the performance of networking software is not improving as fast as the networking hardware and the workstation microprocessors. This paper gives an overview and evaluates the performance of the Common Object Request Broker Architecture (CORBA) standard in a distributed computing environment at NASA Ames Research Center. The environment consists of two testbeds of SGI workstations connected by four networks: Ethernet, FDDI, HiPPI, and ATM. The performance results for three communication software systems are presented, analyzed and compared. These systems are: the BSD socket programming interface, IONA's Orbix, an implementation of the CORBA specification, and the PVM message passing library. The results show that high-level communication interfaces, such as CORBA and PVM, can achieve reasonable performance under certain conditions.

  16. Analysis of Performance of Stereoscopic-Vision Software

    Science.gov (United States)

    Kim, Won; Ansar, Adnan; Steele, Robert; Steinke, Robert

    2007-01-01

    A team of JPL researchers has analyzed stereoscopic vision software and produced a document describing its performance. This software is of the type used in maneuvering exploratory robotic vehicles on Martian terrain. The software in question utilizes correlations between portions of the images recorded by two electronic cameras to compute stereoscopic disparities, which, in conjunction with camera models, are used in computing distances to terrain points to be included in constructing a three-dimensional model of the terrain. The analysis included effects of correlation-window size, a pyramidal image down-sampling scheme, vertical misalignment, focus, maximum disparity, stereo baseline, and range ripples. Contributions of sub-pixel interpolation, vertical misalignment, and foreshortening to stereo correlation error were examined theoretically and experimentally. It was found that camera-calibration inaccuracy contributes to both down-range and cross-range error but stereo correlation error affects only the down-range error. Experimental data for quantifying the stereo disparity error were obtained by use of reflective metrological targets taped to corners of bricks placed at known positions relative to the cameras. For the particular 1,024-by-768-pixel cameras of the system analyzed, the standard deviation of the down-range disparity error was found to be 0.32 pixel.
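
    As a worked illustration of how the quoted 0.32-pixel disparity error translates into down-range error, the short sketch below applies the standard pinhole stereo relation z = f*B/d with first-order error propagation. The focal length and baseline are hypothetical placeholders, not the values of the cameras analyzed.

        # Hedged sketch: standard stereo range-error propagation, sigma_z = (z**2 / (f*B)) * sigma_d.
        # Only the 0.32-pixel disparity std comes from the text; f and B below are assumed.
        focal_px = 1000.0      # assumed focal length in pixels
        baseline_m = 0.20      # assumed stereo baseline in metres
        sigma_d = 0.32         # disparity error standard deviation (pixels), from the analysis

        for z in (2.0, 5.0, 10.0):                               # candidate ranges in metres
            d = focal_px * baseline_m / z                        # disparity at that range (pixels)
            sigma_z = (z ** 2 / (focal_px * baseline_m)) * sigma_d
            print(f"range {z:5.1f} m -> disparity {d:6.1f} px, down-range error ~ {sigma_z * 100:.1f} cm")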

  17. Plans for performance and model improvements in the LISE++ software

    Science.gov (United States)

    Kuchera, M. P.; Tarasov, O. B.; Bazin, D.; Sherrill, B. M.; Tarasova, K. V.

    2016-06-01

    The LISE++ software for fragment separator simulations is undergoing a major update. LISE++ is the standard software used at in-flight separator facilities for predicting beam intensity and purity. The code simulates nuclear physics experiments where fragments are produced and then selected with a fragment separator. A set of modifications to improve the functionality of the code is discussed in this work. These modifications include migration to a modern graphics framework and updated compilers to aid the performance and sustainability of the code. To accommodate the diversity of our users' computer platform preferences, we extend the software from Windows to a cross-platform application. The calculations of beam transport and isotope production are becoming more computationally intense with the new large-scale facilities. Planned new features include new types of optimization, for example optimization of ion optics, improvements in reaction models, and new event generator options. In addition, interfaces between LISE++ and control systems are planned. Computational improvements as well as the schedule for updating this large package will be discussed.

  18. Software Tools for High-Performance Computing: Survey and Recommendations

    Directory of Open Access Journals (Sweden)

    Bill Appelbe

    1996-01-01

    Full Text Available Applications programming for high-performance computing is notoriously difficult. Although parallel programming is intrinsically complex, the principal reason why high-performance computing is difficult is the lack of effective software tools. We believe that the lack of tools in turn is largely due to market forces rather than our inability to design and build such tools. Unfortunately, the poor availability and utilization of parallel tools hurt the entire supercomputing industry and the U.S. high performance computing initiative which is focused on applications. A disproportionate amount of resources is being spent on faster hardware and architectures, while tools are being neglected. This article introduces a taxonomy of tools, analyzes the major factors that contribute to this situation, and suggests ways that the imbalance could be redressed and the likely evolution of tools.

  19. Quality Assurance in Software Development: An Exploratory Investigation in Software Project Failures and Business Performance

    Science.gov (United States)

    Ichu, Emmanuel A.

    2010-01-01

    Software quality is perhaps one of the most sought-after attributes in product development, however; this goal is unattained. Problem factors in software development and how these have affected the maintainability of the delivered software systems requires a thorough investigation. It was, therefore, very important to understand software…

  1. Performance analysis of software for identification of intestinal parasites

    Directory of Open Access Journals (Sweden)

    Andressa P. Gomes

    2015-08-01

    Full Text Available Introduction: Intestinal parasites are among the most frequent diagnoses worldwide. An accurate clinical diagnosis of human parasitic infections depends on laboratory confirmation for specific differentiation of the infectious agent. Objectives: To create technological solutions to help parasitological diagnosis, through the construction and use of specific software. Material and method: From the images obtained from the sediment, the software compares the morphometry, area, perimeter and circularity, and uses information on specific morphological and staining characteristics of parasites to allow their potential identification. Results: The results demonstrate satisfactory performance: from a total of 204 images analyzed, 81.86% had the parasite correctly identified by the computer system, and 18.13% could not be identified due to the large amount of fecal debris in the sample evaluated. Discussion: Currently the techniques used in the parasitology area are predominantly manual and are probably affected by variables such as the attention and experience of the professional. Therefore, computerization in this sector can improve the performance of parasitological analysis. Conclusions: This work contributes to the computerization of the healthcare area and benefits both health professionals and their patients, in addition to providing a more efficient, accurate and secure diagnosis.

  2. Performance improvement of software component with partial evaluation

    Institute of Scientific and Technical Information of China (English)

    MAO Hong-yan; HUANG Lin-peng; LI Ming-lu

    2008-01-01

    To avoid the complexity and inefficiency of the current software architecture for specific applications, a novel approach using partial evaluation is proposed to improve the running performance of components. The generic program is specialized into a domain-specific realization for the known knowledge and environments. The syntax and semantics were analyzed based on byte code instruction sequences, and partial evaluation rules describe how to perform the specialization. Partial evaluation for object-oriented programs was implemented. The experimental results show that partial evaluation is effective in speeding up running efficiency. Greater generality and scalability can be obtained by integrating partial evaluation with favorable design mechanisms and compiler optimization technology.
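
    A minimal Python analogue of the specialization idea described above (not the byte-code-level partial evaluator of the paper): when part of a routine's input is known in advance, the routine can be specialized into a residual function with the known work folded away.

        # Hedged sketch: partial evaluation of a generic power function with respect to a known exponent.
        def generic_power(x, n):
            result = 1
            for _ in range(n):
                result *= x
            return result

        def specialize_power(n):
            # Specialization time: the loop over the known exponent is unrolled into
            # straight-line code, leaving a residual function of x alone.
            expr = " * ".join(["x"] * n) if n > 0 else "1"
            return eval(f"lambda x: {expr}")

        cube = specialize_power(3)                    # residual program for the known input n = 3
        assert cube(5) == generic_power(5, 3) == 125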

  3. SAPHIRE: A New Flat-Panel Digital Mammography Detector With Avalanche Photoconductor and High-Resolution Field Emitter Readout

    Science.gov (United States)

    2006-06-01

    Award Number: W81XWH-04-1-0554. TITLE: SAPHIRE: A New Flat-Panel Digital Mammography Detector with Avalanche Photoconductor and High-Resolution Field Emitter Readout. The proposed detector couples an x-ray scintillator, cesium iodide (CsI), to an avalanche photoconductor and forms a charge image that is read out by a high-resolution field emitter array (FEA). We call the proposed detector SAPHIRE (Scintillator Avalanche Photoconductor with HIgh Resolution Emitter readout).

  4. High Performance Orbital Propagation Using a Generic Software Architecture

    Science.gov (United States)

    Möckel, M.; Bennett, J.; Stoll, E.; Zhang, K.

    2016-09-01

    Orbital propagation is a key element in many fields of space research. Over the decades, scientists have developed numerous orbit propagation algorithms, often tailored to specific use cases that vary in available input data, desired output as well as demands of execution speed and accuracy. Conjunction assessments, for example, require highly accurate propagations of a relatively small number of objects while statistical analyses of the (untracked) space debris population need a propagator that can process large numbers of objects in a short time with only medium accuracy. Especially in the latter case, a significant increase of computation speed can be achieved by using graphics processors, devices that are designed to process hundreds or thousands of calculations in parallel. In this paper, an analytical propagator is introduced that uses graphics processing to reduce the run time for propagating a large space debris population from several hours to minutes with only a minor loss of accuracy. A second performance analysis is conducted on a parallelised version of the popular SGP4 algorithm. It is discussed how these modifications can be applied to more accurate numerical propagators. Both programs are implemented using a generic, plugin-based software architecture designed for straightforward integration of propagators into other software tools. It is shown how this architecture can be used to easily integrate, compare and combine different orbital propagators, both CPU and GPU-based.
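
    The plugin-based architecture mentioned above lends itself to a small interface sketch. The Python below is illustrative only; the class and function names are assumptions, not the authors' API, and the propagation step is a placeholder.

        # Hedged sketch of a plugin registry: CPU- and GPU-based propagators share one interface
        # so a host tool can integrate, compare and combine them.
        from abc import ABC, abstractmethod

        class Propagator(ABC):
            @abstractmethod
            def propagate(self, state, t0, t1):
                """Advance an orbital state from epoch t0 to epoch t1 and return the new state."""

        REGISTRY = {}

        def register(name):
            def wrap(cls):
                REGISTRY[name] = cls
                return cls
            return wrap

        @register("analytical-cpu")
        class AnalyticalCpuPropagator(Propagator):
            def propagate(self, state, t0, t1):
                # Placeholder: a real plugin would wrap SGP4 or a numerical integrator here.
                return state

        def run(name, state, t0, t1):
            return REGISTRY[name]().propagate(state, t0, t1)

        print(run("analytical-cpu", (7000.0, 0.0, 0.0, 0.0, 7.5, 0.0), 0.0, 60.0))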

  5. Methods improvements incorporated into the SAPHIRE ASP models

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B.; Blackman, H.S.; Novack, S.D. [Idaho National Engineering Lab., Idaho Falls, ID (United States)] [and others]

    1995-04-01

    The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methods, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements.

  6. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  7. Modeling TCP/IP software implementation performance and its application for software routers

    OpenAIRE

    Lepe Aldama, Oscar Iván

    2002-01-01

    There are numerous works that study or address the software implementation of the TCP/IP communication protocols used for Internet access. However, we are not aware of any that models the execution of this software in a general and precise way. This thesis provides a detailed characterization of the execution of the software implementation of these protocols on a personal computer running a UNIX operating system. This characterization shows how the performance of the...

  8. Evaluation of SAPHIR / Megha-Tropiques observations - CINDY/DYNAMO Campaign

    Science.gov (United States)

    Clain, Gaelle; Brogniez, Hélène; John, Viju; Payne, Vivienne; Luo, Ming

    2014-05-01

    The SAPHIR sounder (Sondeur Atmosphérique du Profil d'Humidité Intertropicale par Radiométrie) onboard the Megha-Tropiques (MT) platform observes the microwave radiation emitted by the Earth system in the strong absorption line of water vapor at 183.31 GHz. It is a multi-channel microwave humidity sounder with 6 channels in the 183.31 GHz water vapor absorption band, a maximum scan angle of 42.96° around nadir, a 1700 km wide swath and a footprint resolution of 10 km at nadir. A comparison between the sensor L1A2 observations and radiative transfer calculations using in situ measurements from radiosondes as input is performed in order to validate the satellite observations at the brightness temperature (BT) level. The radiosonde humidity observations chosen as reference were performed during the CINDY/DYNAMO campaign (September 2011 to March 2012) with Vaisala RS92-SGPD probes and match a spatio-temporal co-location with MT satellite overpasses. Although several sonde systems were used during the campaign, all of the sites selected for this study used the Vaisala RS92-SGPD system and were chosen in order to avoid discrepancies in data quality and biases. This work investigates the difference - or bias - between the BTs observed by the sensor and BT simulations from a radiative transfer model, RTTOV-10. The bias amplitude is characterized by a temperature dependent pattern, increasing from nearly 0 Kelvin for the 183.31 ± 0.2 GHz channel to a range of 2 K for the 183.31 ± 11 GHz channel. However, the comparison between the sensor data and the radiative transfer simulations is not straightforward, and uncertainties associated with the data processing must be propagated throughout the evaluation. Therefore this work documents an evaluation of the uncertainties and errors that can impact the BT bias. These can be linked to the radiative transfer model input and design, the radiosonde observations, the methodology chosen for the comparison and the SAPHIR instrument itself.

  9. Performance Analysis of Software Effort Estimation Models Using Neural Networks

    Directory of Open Access Journals (Sweden)

    P.Latha

    2013-08-01

    Full Text Available Software effort estimation involves estimating the effort required to develop software. Cost overruns and schedule overruns occur in software development due to wrong estimates made during the initial stage of development, so proper estimation is essential for successful completion of a software project. Many estimation techniques are available, among which neural network based techniques play a prominent role. The back-propagation network is the most widely used architecture. The Elman neural network, a recurrent network, can be used on par with the back-propagation network. For a good predictor system, the difference between estimated effort and actual effort should be as low as possible. Data from historical NASA projects are used for training and testing. The experimental results confirm that the back-propagation algorithm is more efficient than the Elman neural network.
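
    As a rough illustration of the back-propagation style estimator discussed above (synthetic data standing in for the historical NASA projects, and scikit-learn in place of the paper's own implementations), the sketch below fits effort against project size and reports the test error.

        # Hedged sketch: feed-forward regressor on synthetic (size, effort) pairs.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.metrics import mean_absolute_error

        rng = np.random.default_rng(0)
        kloc = rng.uniform(2, 100, size=60)                              # project size (KLOC), synthetic
        effort = 3.2 * kloc ** 1.05 * rng.normal(1.0, 0.1, size=60)      # effort (person-months), synthetic

        X_train, X_test = kloc[:45, None], kloc[45:, None]
        y_train, y_test = effort[:45], effort[45:]

        model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
        model.fit(X_train, y_train)
        print("test MAE (person-months):", mean_absolute_error(y_test, model.predict(X_test)))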

  10. Engineering bioinformatics: building reliability, performance and productivity into bioinformatics software.

    Science.gov (United States)

    Lawlor, Brendan; Walsh, Paul

    2015-01-01

    There is a lack of software engineering skills in bioinformatic contexts. We discuss the consequences of this lack, examine existing explanations and remedies to the problem, point out their shortcomings, and propose alternatives. Previous analyses of the problem have tended to treat the use of software in scientific contexts as categorically different from the general application of software engineering in commercial settings. In contrast, we describe bioinformatic software engineering as a specialization of general software engineering, and examine how it should be practiced. Specifically, we highlight the difference between programming and software engineering, list elements of the latter and present the results of a survey of bioinformatic practitioners which quantifies the extent to which those elements are employed in bioinformatics. We propose that the ideal way to bring engineering values into research projects is to bring engineers themselves. We identify the role of Bioinformatic Engineer and describe how such a role would work within bioinformatic research teams. We conclude by recommending an educational emphasis on cross-training software engineers into life sciences, and propose research on Domain Specific Languages to facilitate collaboration between engineers and bioinformaticians.

  11. Inter-calibration and validation of observations from SAPHIR and ATMS instruments

    Science.gov (United States)

    Moradi, I.; Ferraro, R. R.

    2015-12-01

    We present the results of evaluating observations from microwave instruments aboard the Suomi National Polar-orbiting Partnership (NPP, ATMS instrument) and Megha-Tropiques (SAPHIR instrument) satellites. The study includes inter-comparison and inter-calibration of observations of similar channels from the two instruments, evaluation of the satellite data using high-quality radiosonde data from the Atmospheric Radiation Measurement Program and GPS radio occultation observations from the COSMIC mission, as well as geolocation error correction. The results of this study are valuable for generating climate data records from these instruments as well as for extending current climate data records from similar instruments such as AMSU-B and MHS to the ATMS and SAPHIR instruments. Reference: Moradi et al., Intercalibration and Validation of Observations From ATMS and SAPHIR Microwave Sounders. IEEE Transactions on Geoscience and Remote Sensing. 01/2015; DOI: 10.1109/TGRS.2015.2427165

  12. Technical Note: Intercomparison of formaldehyde measurements at the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    A. Wisthaler

    2007-11-01

    Full Text Available The atmosphere simulation chamber SAPHIR at the Research Centre Jülich was used to test the suitability of state-of-the-art analytical instruments for the measurement of gas-phase formaldehyde (HCHO) in air. Five analyzers based on four different sensing principles were deployed: a differential optical absorption spectrometer (DOAS), cartridges for 2,4-dinitro-phenyl-hydrazine (DNPH) derivatization followed by off-line high pressure liquid chromatography (HPLC) analysis, two different types of commercially available wet chemical sensors based on Hantzsch fluorimetry, and a proton-transfer-reaction mass spectrometer (PTR-MS). A new optimized mode of operation was used for the PTR-MS instrument which significantly enhanced its performance for on-line HCHO detection at low absolute humidities.

    The instruments were challenged with typical ambient levels of HCHO ranging from zero to several ppb. Synthetic air of high purity and particulate-filtered ambient air were used as sample matrices in the atmosphere simulation chamber onto which HCHO was spiked under varying levels of humidity and ozone. Measurements were compared to mixing ratios calculated from the chamber volume and the known amount of HCHO injected into the chamber; measurements were also compared between the different instruments. The formal and blind intercomparison exercise was conducted under the control of an independent referee. A number of analytical problems associated with the experimental set-up and with individual instruments were identified, the overall agreement between the methods was good.

  13. Technical Note: Intercomparison of formaldehyde measurements at the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Wisthaler, A.; Apel, E. C.; Bossmeyer, J.; Hansel, A.; Junkermann, W.; Koppmann, R.; Meier, R.; Müller, K.; Solomon, S. J.; Steinbrecher, R.; Tillmann, R.; Brauers, T.

    2008-04-01

    The atmosphere simulation chamber SAPHIR at the Research Centre Jülich was used to test the suitability of state-of-the-art analytical instruments for the measurement of gas-phase formaldehyde (HCHO) in air. Five analyzers based on four different sensing principles were deployed: a differential optical absorption spectrometer (DOAS), cartridges for 2,4-dinitrophenylhydrazine (DNPH) derivatization followed by off-line high pressure liquid chromatography (HPLC) analysis, two different types of commercially available wet chemical sensors based on Hantzsch fluorimetry, and a proton-transfer-reaction mass spectrometer (PTR-MS). A new optimized mode of operation was used for the PTR-MS instrument which significantly enhanced its performance for online HCHO detection at low absolute humidities. The instruments were challenged with typical ambient levels of HCHO ranging from zero to several ppb. Synthetic air of high purity and particulate-filtered ambient air were used as sample matrices in the atmosphere simulation chamber onto which HCHO was spiked under varying levels of humidity and ozone. Measurements were compared to mixing ratios calculated from the chamber volume and the known amount of HCHO injected into the chamber; measurements were also compared between the different instruments. The formal and blind intercomparison exercise was conducted under the control of an independent referee. A number of analytical problems associated with the experimental set-up and with individual instruments were identified, the overall agreement between the methods was fair.

  14. Technical Note: Intercomparison of formaldehyde measurements at the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    A. Wisthaler

    2008-04-01

    Full Text Available The atmosphere simulation chamber SAPHIR at the Research Centre Jülich was used to test the suitability of state-of-the-art analytical instruments for the measurement of gas-phase formaldehyde (HCHO) in air. Five analyzers based on four different sensing principles were deployed: a differential optical absorption spectrometer (DOAS), cartridges for 2,4-dinitro-phenyl-hydrazine (DNPH) derivatization followed by off-line high pressure liquid chromatography (HPLC) analysis, two different types of commercially available wet chemical sensors based on Hantzsch fluorimetry, and a proton-transfer-reaction mass spectrometer (PTR-MS). A new optimized mode of operation was used for the PTR-MS instrument which significantly enhanced its performance for online HCHO detection at low absolute humidities.

    The instruments were challenged with typical ambient levels of HCHO ranging from zero to several ppb. Synthetic air of high purity and particulate-filtered ambient air were used as sample matrices in the atmosphere simulation chamber onto which HCHO was spiked under varying levels of humidity and ozone. Measurements were compared to mixing ratios calculated from the chamber volume and the known amount of HCHO injected into the chamber; measurements were also compared between the different instruments. The formal and blind intercomparison exercise was conducted under the control of an independent referee. A number of analytical problems associated with the experimental set-up and with individual instruments were identified, the overall agreement between the methods was fair.

  15. Comparison of OH reactivity instruments in the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Fuchs, Hendrik

    2016-04-01

    OH reactivity measurements have become important for constraining the total OH loss frequency in field experiments. Different techniques have been developed by various groups. They can be based on flow-tube or pump-and-probe techniques, which include direct OH detection by fluorescence, or on a comparative method, in which the OH loss of a reference species competes with the OH loss of trace gases in the sampled air. In order to ensure that these techniques deliver equivalent results, a comparison exercise was performed under controlled conditions. Nine OH reactivity instruments measured together in the atmosphere simulation chamber SAPHIR (volume 270 m3) during ten daylong experiments in October 2015 at ambient temperature (5 to 10 °C) and pressure (990-1010 hPa). The chemical complexity of the air mixtures in these experiments varied from CO in pure synthetic air to emissions from real plants and VOC/NOx mixtures representative of urban atmospheres. Potential differences between measurements were systematically investigated by changing the amount of reactants (including isoprene, monoterpenes and sesquiterpenes), water vapour, and nitrogen oxides. Some of the experiments also included the oxidation of reactants with ozone or hydroxyl radicals, in order to determine whether the presence of oxidation products leads to systematic differences between measurements of different instruments. Here we present first results of this comparison exercise.

  16. Locating Performance Improvement Opportunities in an Industrial Software-as-a-Service Application

    NARCIS (Netherlands)

    Bezemer, C.P.; Zaidman, A.E.; Van der Hoeven, A.; Van de Graaf, A.; Wiertz, M.; Weijers, R.

    2012-01-01

    Preprint of paper published in: ICSM 2012 - Proceedings of the IEEE International Conference on Software Maintenance, 23-28 September 2012; doi:10.1109/ICSM.2012.6405319 The goal of performance maintenance is to improve the performance of a software system after delivery. As the performance of a sy

  17. Improving Performance of Software Implemented Floating Point Addition

    DEFF Research Database (Denmark)

    Hindborg, Andreas Erik; Karlsson, Sven

    2011-01-01

    We outline and evaluate hardware extensions to an integer processor pipeline which allow IEEE 754 floating point (FP) addition to be efficiently implemented in software. With a very moderate increase in hardware resources, our performance evaluation shows that, for a benchmark that executes 12.5% FP addition instructions, our approach exhibits a relative slowdown of 3.38 to 15.15 as compared to dedicated hardware. This is a significant improvement over pure software emulation, which leads to relative slowdowns of up to 45.33.

  18. SAPHIRE (scintillator avalanche photoconductor with high resolution emitter readout) for low dose x-ray imaging: spatial resolution.

    Science.gov (United States)

    Li, Dan; Zhao, Wei

    2008-07-01

    An indirect flat panel imager (FPI) with programmable avalanche gain and field emitter array (FEA) readout is being investigated for low-dose and high resolution x-ray imaging. It is made by optically coupling a structured x-ray scintillator, e.g., thallium (Tl) doped cesium iodide (CsI), to an amorphous selenium (a-Se) avalanche photoconductor called high-gain avalanche rushing amorphous photoconductor (HARP). The charge image created by the scintillator/HARP (SHARP) combination is read out by the electron beams emitted from the FEA. The proposed detector is called scintillator avalanche photoconductor with high resolution emitter readout (SAPHIRE). The programmable avalanche gain of HARP can improve the low dose performance of indirect FPI while the FEA can be made with pixel sizes down to 50 μm. Because of the avalanche gain, a high resolution type of CsI (Tl), which has not been widely used in indirect FPI due to its lower light output, can be used to improve the high spatial frequency performance. The purpose of the present article is to investigate the factors affecting the spatial resolution of SAPHIRE. Since the resolution performance of the SHARP combination has been well studied, the focus of the present work is on the inherent resolution of the FEA readout method. The lateral spread of the electron beam emitted from a 50 μm x 50 μm pixel FEA was investigated with two different electron-optical designs: mesh-electrode-only and electrostatic focusing. Our results showed that electrostatic focusing can limit the lateral spread of electron beams to within the pixel size of down to 50 μm. Since electrostatic focusing is essentially independent of signal intensity, it will provide excellent spatial uniformity.

  19. Fuzzy Logic Based Group Maturity Rating for Software Performance Prediction

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Driven by market requirements, software services organizations have adopted various software engineering process models (such as capability maturity model (CMM), capability maturity model integration (CMMI), ISO 9001:2000, etc.) and practice of the project management concepts defined in the project management body of knowledge. While this has definitely helped organizations to bring some methods into the software development madness, there always exists a demand for comparing various groups within the organization in terms of the practice of these defined process models. Even though there exist many metrics for comparison, considering the variety of projects in terms of technology, life cycle, etc., finding a single metric that caters to this is a difficult task. This paper proposes a model for arriving at a rating on group maturity within the organization. Considering the linguistic or imprecise and uncertain nature of software measurements, fuzzy logic approach is used for the proposed model. Without the barriers like technology or life cycle difference, the proposed model helps the organization to compare different groups within it with reasonable precision.

  20. NREL Software Models Performance of Wind Plants (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2015-01-01

    This NREL Highlight, developed for the February 2015 Alliance S&T Meeting, describes NREL's Simulator for Offshore Wind Farm Applications (SOWFA) software, used in collaboration with Norway-based Statoil to optimize layouts and controls of wind plant arrays.

  1. Performance Evaluation of a New Method for Service-Oriented Developed Testing Software

    Directory of Open Access Journals (Sweden)

    Ibrahim Azhdari pour

    2016-02-01

    Full Text Available Today, the use of software is increasing significantly, and the incidence of human errors in the process of producing software is inevitable, so testing software during its production is very important. Software testing is the process of assessing software to ensure its correct performance under different conditions. To avoid complexity in software production, an architectural style is adopted. There are different architectural styles for software production; one of these is the service-oriented architecture. Software produced with the service-oriented architecture is called service-oriented software, or service-oriented developed software. In this article, we compare different testing methods and the International Software Testing Qualification Board (ISTQB) testing framework. Then, using simulation, we conclude that our suggested testing method is nearly two times more capable of detecting errors in the software than other methods, which leads to greater satisfaction of users and customers.

  2. Software engineering group work - Personality, patterns and performance

    OpenAIRE

    Bell, David G.; Hall, Tracy; Hannay, Jo Erskine; Pfahl, Dietmar; Acuña, Silvia T.

    2010-01-01

    This is the author's version of the work. The definitive Version of Record was published in SIGMIS-CPR '10, Proceedings of the 2010 Special Interest Group on Management Information Systems' 48th Annual Conference on Computer Personnel Research (Vancouver, BC, Canada), http://dx.doi.org/10.1145/1796900.1796921. Software Engineering has been a fundamental part of many computing undergraduate cour...

  3. Design, Implementation, and Performance of CREAM Data Acquisition Software

    CERN Document Server

    Zinn, S Y; Bagliesi, M G; Beatty, J J; Childers, J T; Coutu, S; Duvernois, M A; Ganel, O; Kim, H J; Lee, M H; Lutz, L; Malinine, A; Maestro, P; Marrocchesi, P S; Park, I H; Seo, E S; Song, C; Swordy, S; Wu, J

    2005-01-01

    Cosmic Ray Energetics and Mass (CREAM) is a balloon-borne experiment scheduled for launching from Antarctica in late 2004. Its aim is to measure the energy spectrum and composition of cosmic rays from proton to iron nuclei at ultra high energies from 1 to 1,000 TeV. Ultra long duration balloons are expected to fly about 100 days. One special feature of the CREAM data acquisition software (CDAQ) is the telemetric operation of the instrument using satellites. During a flight the science event and housekeeping data are sent from the instrument to a ground facility. Likewise, commands for controlling both the hardware and the software are uploaded from the ground facility. This requires a robust, reliable, and fast software system. CDAQ has been developed and tested during three beam tests at CERN in July, September, and November 2003. Recently the interfaces to the transition radiation detector (TRD) and to the timing-based charge detector (TCD) have been added. These new additions to CDAQ will be checked at a t...

  4. Assimilation of SAPHIR radiance: impact on hyperspectral radiances in 4D-VAR

    Science.gov (United States)

    Indira Rani, S.; Doherty, Amy; Atkinson, Nigel; Bell, William; Newman, Stuart; Renshaw, Richard; George, John P.; Rajagopal, E. N.

    2016-04-01

    Assimilation of a new observation dataset in an NWP system may affect the quality of an existing observation dataset against the model background (short forecast), which in turn influences the use of the existing observations in the NWP system. The effect of the use of one dataset on the use of another can be quantified as positive, negative or neutral. The impact of adding a new dataset is defined as positive if the number of assimilated observations of an existing observation type increases, and the bias and standard deviation decrease compared to the control (without the new dataset) experiment. Recently a new dataset, Megha-Tropiques SAPHIR radiances, which provides atmospheric humidity information, was added to the Unified Model 4D-VAR assimilation system. In this paper we discuss the impact of SAPHIR on the assimilation of hyperspectral radiances like AIRS, IASI and CrIS. Though SAPHIR is a microwave instrument, its impact can be clearly seen in the use of hyperspectral radiances in the 4D-VAR data assimilation system, in addition to other microwave and infrared observations. SAPHIR assimilation decreased the standard deviation of the spectral channels of wave numbers 650-1600 cm-1 in all three hyperspectral radiances. A similar impact on the hyperspectral radiances can be seen due to the assimilation of other microwave radiances, such as those from AMSR2 and the SSMIS imager.

  5. Frequency Estimator Performance for a Software-Based Beacon Receiver

    Science.gov (United States)

    Zemba, Michael J.; Morse, Jacquelynne Rose; Nessel, James A.; Miranda, Felix

    2014-01-01

    As propagation terminals have evolved, their design has trended more toward a software-based approach that facilitates convenient adjustment and customization of the receiver algorithms. One potential improvement is the implementation of a frequency estimation algorithm, through which the primary frequency component of the received signal can be estimated with a much greater resolution than with a simple peak search of the FFT spectrum. To select an estimator for usage in a QV-band beacon receiver, analysis of six frequency estimators was conducted to characterize their effectiveness as they relate to beacon receiver design.
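
    One estimator of the kind such an analysis typically covers is parabolic interpolation around the FFT peak; the sketch below (with illustrative signal parameters, not the actual beacon receiver settings) shows how it refines a plain peak search well beyond the FFT bin spacing.

        # Hedged sketch: log-parabolic interpolation of a windowed FFT peak.
        import numpy as np

        fs, n = 10_000.0, 4096                         # sample rate (Hz) and FFT length, assumed
        f_true = 1234.56                               # tone to recover (Hz), illustrative
        t = np.arange(n) / fs
        x = np.cos(2 * np.pi * f_true * t) + 0.05 * np.random.default_rng(1).standard_normal(n)

        spec = np.abs(np.fft.rfft(x * np.hanning(n)))
        k = int(np.argmax(spec))                       # coarse peak-search estimate (bin index)
        alpha, beta, gamma = np.log(spec[k - 1:k + 2])
        delta = 0.5 * (alpha - gamma) / (alpha - 2 * beta + gamma)   # sub-bin offset

        print("peak-search estimate :", k * fs / n, "Hz")
        print("interpolated estimate:", (k + delta) * fs / n, "Hz")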

  6. The new CERN tape software - getting ready for total performance

    CERN Document Server

    Cano, E; Kruse, D F; Kotlyar, V; Côme, D

    2015-01-01

    CASTOR (the CERN Advanced STORage system) is used to store the custodial copy of all of the physics data collected from the CERN experiments, both past and present. CASTOR is a hierarchical storage management system that has a disk-based front-end and a tape-based back-end. The software responsible for controlling the tape back-end has been redesigned and redeveloped over the last year and was put in production at the beginning of 2015. This paper summarises the motives behind the redesign, describes in detail the redevelopment work and concludes with the short and long-term benefits.

  7. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    DEFF Research Database (Denmark)

    Jauregui-Becker, Juan M.; Tosello, Guido; van Houten, Fred J.A.M.

    2013-01-01

    This paper presents a software tool for automating the design of cooling systems for injection moulding and a validation of its performance. Cooling system designs were automatically generated by the proposed software tool and by applying a best practice tool engineering design approach. The two...

  8. The Effect of Firm Strategy and Corporate Performance on Software Market Growth in Emerging Regions

    Science.gov (United States)

    Mertz, Sharon A.

    2013-01-01

    The purpose of this research is to evaluate the impact of firm strategies and corporate performance on enterprise software market growth in emerging regions. The emerging regions of Asia Pacific, Eastern Europe, the Middle East and Africa, and Latin America, currently represent smaller overall markets for software vendors, but exhibit high growth…

  10. Retrieval of cloud ice water path using SAPHIR on board Megha-Tropiques over the tropical ocean

    Science.gov (United States)

    Piyush, Durgesh Nandan; Goyal, Jayesh; Srinivasan, J.

    2017-04-01

    The SAPHIR sensor onboard Megha-Tropiques (MT) measures the Earth-emitted radiation at frequencies near the water vapor absorption band. SAPHIR operates in six channels ranging from 183.31 ± 0.2 to 183.31 ± 11 GHz. These frequencies have been used to retrieve cloud ice water path (IWP) at a very high resolution. A method to retrieve IWP over the Indian Ocean region is attempted in this study. The study has two parts: in the first part, a radiative transfer based simulation is carried out to give insight into using SAPHIR frequency channels for IWP retrieval; in the second part, real observations from SAPHIR and TRMM-TMI were used for IWP retrieval. The concurrent observations of SAPHIR brightness temperatures (Tbs) and TRMM TMI IWP were used in the development of the retrieval algorithm. An eigenvector analysis was done to identify the weight of each channel in retrieving IWP; following this, a two-channel regression based algorithm was developed. The SAPHIR channels which are away from the water vapor absorption band were used to avoid possible water vapor contamination. When the retrieval is compared with an independent test dataset, it gives a correlation of 0.80 and an RMSE of 3.5%. SAPHIR-derived IWP has been compared with other available global IWP products such as TMI, MSPPS, CloudSat and GPM-GMI, both qualitatively and quantitatively. A PDF comparison of SAPHIR-derived IWP shows good agreement with CloudSat. A zonal mean comparison with the recently launched GMI shows the strength of this algorithm.
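
    The two-channel regression step lends itself to a compact sketch. The numbers below are synthetic stand-ins (not the TMI-matched training data, and the channel assignment is an assumption); the point is only the form of the fit: a linear model in two off-line-centre brightness temperatures, solved by least squares.

        # Hedged sketch: least-squares fit of log(IWP) against two brightness temperatures.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 500
        tb5 = rng.uniform(220, 280, n)      # stand-in for a wing channel such as 183.31 +/- 6.8 GHz (K)
        tb6 = rng.uniform(230, 285, n)      # stand-in for the 183.31 +/- 11 GHz channel (K)
        log_iwp = 8.0 - 0.02 * tb5 - 0.015 * tb6 + rng.normal(0, 0.05, n)   # synthetic "truth"

        A = np.column_stack([np.ones(n), tb5, tb6])
        coeffs, *_ = np.linalg.lstsq(A, log_iwp, rcond=None)

        pred = A @ coeffs
        corr = np.corrcoef(pred, log_iwp)[0, 1]
        print("fitted coefficients:", coeffs, "  correlation:", round(corr, 3))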

  11. Hardware support for software controlled fast multiplexing of performance counters

    Science.gov (United States)

    Salapura, Valentina; Wisniewski, Robert W.

    2013-01-01

    Performance counters may be operable to collect one or more counts of one or more selected activities, and registers may be operable to store a set of performance counter configurations. A state machine may be operable to automatically select a register from the registers for reconfiguring the one or more performance counters in response to receiving a first signal. The state machine may be further operable to reconfigure the one or more performance counters based on a configuration specified in the selected register. The state machine yet further may be operable to copy data in selected one or more of the performance counters to a memory location, or to copy data from the memory location to the counters, in response to receiving a second signal. The state machine may be operable to store or restore the counter values and state machine configuration in response to a context switch event.

  12. Atmospheric photochemistry of aromatic hydrocarbons: OH budgets during SAPHIR chamber experiments

    Science.gov (United States)

    Nehr, S.; Bohn, B.; Dorn, H.-P.; Fuchs, H.; Häseler, R.; Hofzumahaus, A.; Li, X.; Rohrer, F.; Tillmann, R.; Wahner, A.

    2014-07-01

    Current photochemical models developed to simulate the atmospheric degradation of aromatic hydrocarbons tend to underestimate OH radical concentrations. In order to analyse OH budgets, we performed experiments with benzene, toluene, p-xylene and 1,3,5-trimethylbenzene in the atmosphere simulation chamber SAPHIR. Experiments were conducted under low-NO conditions (typically 0.1-0.2 ppb) and high-NO conditions (typically 7-8 ppb), and starting concentrations of 6-250 ppb of aromatics, dependent on OH rate constants. For the OH budget analysis a steady-state approach was applied in which OH production and destruction rates (POH and DOH) have to be equal. The POH were determined from measurements of HO2, NO, HONO, and O3 concentrations, considering OH formation by photolysis and recycling from HO2. The DOH were calculated from measurements of the OH concentrations and total OH reactivities. The OH budgets were determined from DOH/POH ratios. The accuracy and reproducibility of the approach were assessed in several experiments using CO as a reference compound where an average ratio DOH/POH = 1.13 ± 0.19 was obtained. In experiments with aromatics, these ratios ranged within 1.1-1.6 under low-NO conditions and 0.9-1.2 under high-NO conditions. The results indicate that OH budgets during photo-oxidation experiments with aromatics are balanced within experimental accuracies. Inclusion of a further, recently proposed OH production via HO2 + RO2 reactions led to improvements under low-NO conditions but the differences were small and insignificant within the experimental errors.
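
    The steady-state budget check described above can be written out in a few lines. The values below are made-up but plausible chamber numbers (only the rate constant is an approximate literature value, and the ozone-photolysis source term is omitted for brevity); a D_OH/P_OH ratio near one means the budget is closed.

        # Hedged sketch: OH production from HONO photolysis and HO2+NO recycling versus
        # OH destruction from the measured total reactivity.
        j_hono = 1.5e-3            # HONO photolysis frequency (1/s), assumed
        k_ho2_no = 8.1e-12         # rate constant HO2 + NO -> OH + NO2 (cm^3 s^-1), approx. literature value
        hono = 2.5e9               # [HONO] (molecules cm^-3), assumed
        ho2 = 3.0e9                # [HO2]  (molecules cm^-3), assumed
        no = 2.5e9                 # [NO]   (molecules cm^-3), assumed low-NO case (~0.1 ppb)
        oh = 4.0e6                 # [OH]   (molecules cm^-3), assumed
        k_oh_total = 15.0          # measured total OH reactivity (1/s), assumed

        p_oh = j_hono * hono + k_ho2_no * ho2 * no    # OH production rate (cm^-3 s^-1)
        d_oh = k_oh_total * oh                        # OH destruction rate (cm^-3 s^-1)
        print("D_OH / P_OH =", round(d_oh / p_oh, 2))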

  13. An indirect flat-panel detector with avalanche gain for low dose x-ray imaging: SAPHIRE (scintillator avalanche photoconductor with high resolution emitter readout)

    Science.gov (United States)

    Zhao, Wei; Li, Dan; Rowlands, J. A.; Egami, N.; Takiguchi, Y.; Nanba, M.; Honda, Y.; Ohkawa, Y.; Kubota, M.; Tanioka, K.; Suzuki, K.; Kawai, T.

    2008-03-01

    An indirect flat-panel imager with programmable avalanche gain and field emitter array (FEA) readout is being investigated for low-dose x-ray imaging with high resolution. It is made by optically coupling a structured x-ray scintillator, CsI (Tl), to an amorphous selenium (a-Se) avalanche photoconductor called HARP (high-gain avalanche rushing photoconductor). The charge image created by HARP is read out by electron beams generated by the FEA. The proposed detector is called SAPHIRE (Scintillator Avalanche Photoconductor with HIgh Resolution Emitter readout). The avalanche gain of HARP depends on both the a-Se thickness and the applied electric field E_Se. At E_Se > 80 V/μm, the avalanche gain can enhance the signal at low dose (e.g. fluoroscopy) and make the detector x-ray quantum noise limited down to a single x-ray photon. At high exposure (e.g. radiography), the avalanche gain can be turned off by decreasing E_Se to < 70 V/μm. In this paper the imaging characteristics of the FEA readout method, including the spatial resolution and noise, were investigated experimentally using a prototype optical HARP-FEA image sensor. The potential x-ray imaging performance of SAPHIRE, especially the aspect of programmable gain to ensure wide dynamic range and x-ray quantum noise limited performance at the lowest exposures in fluoroscopy, was investigated.

  14. The role of star performers in software design teams

    OpenAIRE

    Volmer, Judith; Sonnentag, Sabine

    2011-01-01

    Purpose - This study seeks to extend previous research on experts, conducted mainly with ad-hoc groups in laboratory settings, to a field setting. Specifically, this study aims to investigate experts' relative importance in team performance. Expertise is differentiated into two categories (task functions and team functions) and the paper aims to investigate whether experts in task and team functions predict team performance over and above the team's average expertise level. Design/methodology/approach - L...

  15. The effect of construction cost estimating (CCE) software on job performance: An improvement plan

    Directory of Open Access Journals (Sweden)

    Mohd Mukelas M.F.

    2014-01-01

    Full Text Available This paper presents comprehensive statistical research on the effect of construction cost estimating software features on estimating job performance. The objectives of this study are to identify cost estimating software features, to analyze the significant relation of these features to job performance, to explore the problems faced during implementation and, lastly, to propose a plan to improve cost estimating software usage among contractors in Malaysia. The study statistically reveals four features of cost estimating software that significantly impact cost estimating job performance. These features were refined through interviews with a focus group of respondents to observe the actual problems encountered during implementation. Eventually, the proposed improvement plan was validated by the focus group of respondents to enhance cost estimating software implementation among contractors in Malaysia.

  16. Performance measurement of autonomous grasping software in a simulated orbital environment

    Science.gov (United States)

    Norsworthy, Robert S.

    1993-12-01

    The EVAHR (extravehicular activity helper/retriever) robot is being developed to perform a variety of navigation and manipulation tasks under astronaut supervision. The EVAHR is equipped with a manipulator and dexterous end-effector for capture and a laser range imager with pan/tilt for target perception. Perception software has been developed to perform target pose estimation, tracking, and motion estimation for rigid, freely rotating, polyhedral objects. Manipulator grasp planning and trajectory control software has also been developed to grasp targets while avoiding collisions. A software simulation of the EVAHR hardware, orbital dynamics, collision detection, and grasp impact dynamics has been developed to test and measure the performance of the integrated software. Performance measurements include grasp success/failure % and time-to-grasp for a variety of targets, initial target states, and simulated pose estimation computing resources.

  17. Mitigating the controller performance bottlenecks in Software Defined Networks

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius; Soler, José

    2016-01-01

    proposes a new approach for addressing the performance bottlenecks that arise from limited computational resources at the SDNC. The proposed approach is based on optimally configuring the operating parameters of the components residing inside the SDNC (network control functions such as monitoring, routing...

  18. Importance of Explicit Vectorization for CPU and GPU Software Performance

    CERN Document Server

    Dickson, Neil G; Hamze, Firas

    2010-01-01

    Much of the current focus in high-performance computing is on multi-threading, multi-computing, and graphics processing unit (GPU) computing. However, vectorization and non-parallel optimization techniques, which can often be employed additionally, are less frequently discussed. In this paper, we present an analysis of several optimizations done on both central processing unit (CPU) and GPU implementations of a particular computationally intensive Metropolis Monte Carlo algorithm. Explicit vectorization on the CPU and the equivalent, explicit memory coalescing, on the GPU are found to be critical to achieving good performance of this algorithm in both environments. The fully-optimized CPU version achieves a 9x to 12x speedup over the original CPU version, in addition to speedup from multi-threading. This is 2x faster than the fully-optimized GPU version.

  19. Remote software upload techniques in future vehicles and their performance analysis

    Science.gov (United States)

    Hossain, Irina

    could benefit from it. However, like the unicast RSU, the security requirements of multicast communication, i.e., authenticity, confidentiality and integrity of the software transmitted and access control of the group members is challenging. In this thesis, an infrastructure-based mobile multicasting for RSU in vehicle ECUs is proposed where an ECU receives the software from a remote software distribution center using the road side BSs as gateways. The Vehicular Software Distribution Network (VSDN) is divided into small regions administered by a Regional Group Manager (RGM). Two multicast Group Key Management (GKM) techniques are proposed based on the degree of trust on the BSs named Fully-trusted (FT) and Semi-trusted (ST) systems. Analytical models are developed to find the multicast session establishment latency and handover latency for these two protocols. The average latency to perform mutual authentication of the software vendor and a vehicle, and to send the multicast session key by the software provider during multicast session initialization, and the handoff latency during multicast session is calculated. Analytical and simulation results show that the link establishment latency per vehicle of our proposed schemes is in the range of few seconds and the ST system requires few ms higher time than the FT system. The handoff latency is also in the range of few seconds and in some cases ST system requires less handoff time than the FT system. Thus, it is possible to build an efficient GKM protocol without putting too much trust on the BSs.

  20. Abnormity control design and performance analysis of real-time data exchange software based on Petri net

    Institute of Scientific and Technical Information of China (English)

    Zhang Weimin

    2005-01-01

    In many spaceflight measure and control software systems, various kinds of measurement data are exchanged between different software components. The quality of measure and control software systems is greatly influenced by the performance of the data exchange software. Many problems that appear while real-time measure and control software is running, and that are difficult to locate, are caused by the data exchange software. It is therefore necessary to analyze the performance of the data exchange software when designing measure and control software systems. In this article, a Petri net model of the real-time data exchange software is established first. Then the model is simplified and analyzed. The design of abnormity control for buffer overflow is given. Finally, using the Petri net method, the performance of the real-time data exchange software is analyzed and discussed.

  1. Le disque à saphir dans l’édition phonographique – Première partie

    OpenAIRE

    Sébald, Bruno

    2010-01-01

    The expression "sapphire disc" (disque à saphir) used in this article is a generic term describing flat discs originally read by means of a spherical sapphire stylus. This technique relies on a vertical-cut recording process and is thus distinct from the needle disc, whose reading stylus differs and whose recording is cut laterally. It is also physically distinguished by the pearled texture that covers the surface of the disc. Historically, the ...

  2. Software Design Document for the AMP Nuclear Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Philip, Bobby [ORNL; Clarno, Kevin T [ORNL; Cochran, Bill [ORNL

    2010-03-01

    The purpose of this document is to describe the design of the AMP nuclear fuel performance code. It provides an overview of the decomposition into separable components, an overview of what those components will do, and the strategic basis for the design. The primary components of a computational physics code include a user interface, physics packages, material properties, mathematics solvers, and computational infrastructure. Some capability from established off-the-shelf (OTS) packages will be leveraged in the development of AMP, but the primary physics components will be entirely new. The material properties required by these physics operators include many highly non-linear properties, which will be replicated from FRAPCON and LIFE where applicable, as well as some computationally-intensive operations, such as gap conductance, which depends upon the plenum pressure. Because there is extensive capability in off-the-shelf leadership class computational solvers, AMP will leverage the Trilinos, PETSc, and SUNDIALS packages. The computational infrastructure includes a build system, mesh database, and other building blocks of a computational physics package. The user interface will be developed through a collaborative effort with the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Capability Transfer program element as much as possible and will be discussed in detail in a future document.

  3. Retrieval and Validation of Upper Tropospheric Humidity from SAPHIR aboard Megha-Tropiques

    Science.gov (United States)

    Mathew, Nizy; Krishna Moorthy, K.; Raju C, Suresh; Pillai Renju, Ramachandran; Oommen John, Viju

    Upper tropospheric humidity (UTH) has been derived from brightness temperatures of the SAPHIR payload aboard the Megha-Tropiques (MT) mission. The channels of SAPHIR are very close to the water vapor absorption peak at 183.31 GHz. The first three channels, at 183.31±0.2 GHz, 183.31±1.1 GHz and 183.31±2.8 GHz, are used for upper tropospheric humidity (UTH) studies. The channel at 183.31±0.2 GHz enables retrieval of humidity up to the highest altitude possible with present nadir-looking microwave humidity sounders. Transformation coefficients for the first three channels for all incidence angles have been derived using simulated brightness temperatures and Jacobians, with the Chevallier data set as input to the radiative transfer model ARTS. These coefficients are used to convert brightness temperatures from the different channels to upper tropospheric humidity. A stringent deep convective cloud screening has been done using the brightness temperatures of SAPHIR itself. The retrieved UTH has been validated against the Jacobian-weighted UTH derived from collocated radiosonde observations and also against humidity profiles derived from ground-based microwave radiometer data. UTH variation over the inter-tropical region has been studied on a global basis for one year, taking advantage of the first humidity product with high spatial and temporal resolution over the tropical belt that is unbiased with respect to specific local times of the satellite pass. This data set has been used to address the seasonal and spatial variability of humidity in the tropical upper troposphere and the humidity variability during the Indian monsoon. The details of the MT-SAPHIR characteristics, methodology and results will be presented.
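
    Transformations from brightness temperature to UTH are commonly expressed as a log-linear relation of the form UTH = exp(a + b * Tb). The hedged sketch below applies such a relation; the coefficients are placeholders chosen only to give plausible magnitudes and are not the MT-SAPHIR transformation coefficients derived in the study.

        # Hedged sketch of a log-linear brightness-temperature-to-UTH transformation,
        # UTH = exp(a + b * Tb).  The coefficients a and b are placeholders, not the
        # MT-SAPHIR transformation coefficients derived in the study.
        import numpy as np

        def uth_from_tb(tb_kelvin, a=31.5, b=-0.12):
            """Return upper-tropospheric humidity (%) from brightness temperature (K)."""
            return np.exp(a + b * np.asarray(tb_kelvin))

        print(uth_from_tb([235.0, 245.0, 255.0]))   # drier air -> warmer Tb -> lower UTH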

  4. Performance of student software development teams: the influence of personality and identifying as team members

    Science.gov (United States)

    Monaghan, Conal; Bizumic, Boris; Reynolds, Katherine; Smithson, Michael; Johns-Boast, Lynette; van Rooy, Dirk

    2015-01-01

    One prominent approach in the exploration of the variations in project team performance has been to study two components of the aggregate personalities of the team members: conscientiousness and agreeableness. A second line of research, known as self-categorisation theory, argues that identifying as team members and the team's performance norms should substantially influence the team's performance. This paper explores the influence of both these perspectives in university software engineering project teams. Eighty students worked to complete a piece of software in small project teams during 2007 or 2008. To reduce limitations in statistical analysis, Monte Carlo simulation techniques were employed to extrapolate from the results of the original sample to a larger simulated sample (2043 cases, within 319 teams). The results emphasise the importance of taking into account personality (particularly conscientiousness), and both team identification and the team's norm of performance, in order to cultivate higher levels of performance in student software engineering project teams.

  5. Study on Software Architecture Performance Evaluation (软件体系结构性能评价研究)

    Institute of Scientific and Technical Information of China (English)

    赵会群; 孙晶; 王国仁; 高远

    2003-01-01

    A software architecture is a design of an application system, and evaluating the performance of a software architecture during the early stages of development is therefore attractive. This paper proposes a new method for software architecture performance modeling. To achieve this, it adds a new calculus to stochastic process algebra (SPA); the extended formalism is called extended stochastic process algebra (ESPA). With ESPA, performance evaluation and software architecture modeling can be combined naturally. The paper defines several performance measures for software architectures using a reward structure derived from ESPA, and designs an experiment to illustrate these measures.

  6. JMorph: Software for performing rapid morphometric measurements on digital images of fossil assemblages

    Science.gov (United States)

    Lelièvre, Peter G.; Grey, Melissa

    2017-08-01

    Quantitative morphometric analyses of form are widely used in palaeontology, especially for taxonomic and evolutionary research. These analyses can involve several measurements performed on hundreds or even thousands of samples. Performing measurements of size and shape on large assemblages of macro- or microfossil samples is generally infeasible or impossible with traditional instruments such as vernier calipers. Instead, digital image processing software is required to perform measurements via suitable digital images of samples. Many software packages exist for morphometric analyses but there is not much available for the integral stage of data collection, particularly for the measurement of the outlines of samples. Some software exists to automatically detect the outline of a fossil sample from a digital image. However, automatic outline detection methods may perform inadequately when samples have incomplete outlines or images contain poor contrast between the sample and staging background. Hence, a manual digitization approach may be the only option. We are not aware of any software packages that are designed specifically for efficient digital measurement of fossil assemblages with numerous samples, especially for the purposes of manual outline analysis. Throughout several previous studies, we have developed a new software tool, JMorph, that is custom-built for that task. JMorph provides the means to perform many different types of measurements, which we describe in this manuscript. We focus on JMorph's ability to rapidly and accurately digitize the outlines of fossils. JMorph is freely available from the authors.
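
    As a hedged illustration of the kind of measurement such a manual-digitization tool produces (this is not JMorph code), the sketch below computes area, perimeter, and a maximum diameter from a digitized outline given as (x, y) points; the shoelace formula and the assumption of consistent units are the only ingredients.

        # Hedged sketch (not JMorph code): basic size measurements from a manually
        # digitized fossil outline given as (x, y) points in consistent units.
        import numpy as np

        def outline_measurements(points):
            pts = np.asarray(points, dtype=float)
            closed = np.vstack([pts, pts[:1]])                 # close the polygon
            x, y = closed[:-1, 0], closed[:-1, 1]
            x2, y2 = closed[1:, 0], closed[1:, 1]
            area = 0.5 * abs(np.sum(x * y2 - x2 * y))          # shoelace formula
            perimeter = np.sum(np.hypot(x2 - x, y2 - y))
            # Maximum Feret-like diameter: largest pairwise distance between outline points.
            diffs = pts[:, None, :] - pts[None, :, :]
            max_diameter = np.sqrt((diffs ** 2).sum(-1)).max()
            return {"area": area, "perimeter": perimeter, "max_diameter": max_diameter}

        # Rectangle 4 x 3: area 12, perimeter 14, diagonal 5.
        print(outline_measurements([(0, 0), (4, 0), (4, 3), (0, 3)]))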

  7. Atmospheric photochemistry of aromatic hydrocarbons: Analysis of OH budgets during SAPHIR chamber experiments and evaluation of MCMv3.2

    Science.gov (United States)

    Nehr, S.; Bohn, B.; Brauers, T.; Dorn, H.; Fuchs, H.; Häseler, R.; Hofzumahaus, A.; Li, X.; Lu, K.; Rohrer, F.; Tillmann, R.; Wahner, A.

    2012-12-01

    Aromatic hydrocarbons, almost exclusively originating from anthropogenic sources, comprise a significant fraction of volatile organic compounds observed in urban air. The photo-oxidation of aromatics results in the formation of secondary pollutants and impacts air quality in cities, industrialized areas, and districts of dense traffic. Up-to-date photochemical oxidation schemes of the Master Chemical Mechanism (MCMv3.2) exhibit moderate performance in simulating aromatic compound degradation observed during previous environmental chamber studies. To obtain a better understanding of aromatic photo-oxidation mechanisms, we performed experiments with a number of aromatic hydrocarbons in the outdoor atmosphere simulation chamber SAPHIR located in Jülich, Germany. These chamber studies were designed to derive OH turnover rates exclusively based on experimental data. Simultaneous measurements of NOx (= NO + NO2), HOx (= OH + HO2), and the total OH loss rate constant k(OH) facilitate a detailed analysis of the OH budgets during photo-oxidation experiments. The OH budget analysis was complemented by numerical model simulations using MCMv3.2. Despite MCM's tendency to overestimate k(OH) and to underpredict radical concentrations, the OH budgets are reasonably balanced for all investigated aromatics. However, the results leave some scope for OH producing pathways that are not considered in the current MCMv3.2. An improved reaction mechanism, derived from MCMv3.2 sensitivity studies, is presented. The model performance is basically improved by changes of the mechanistic representation of ring fragmentation channels.

  8. Development of high performance casting analysis software by coupled parallel computation

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    To date, a wide range of casting analysis software has been developed to provide new ways of approaching real casting processes. These include melt flow analysis, heat transfer analysis for solidification calculation, mechanical property prediction and microstructure prediction. Such efforts have succeeded in producing results that agree well with real situations, so that CAE technologies have become indispensable for designing and developing new casting processes. In manufacturing, however, CAE technologies are not used as frequently, because the software is difficult to use and computing performance is insufficient. To introduce CAE technologies to the manufacturing field, high-performance analysis is essential to shorten the gap between product design time and prototyping time. Software code optimization can help, but it is not enough, because the codes developed by software experts are already well optimized. As an alternative route to high-performance computation, parallel computing technologies are being applied to CAE to shorten analysis time. In this research, SMP (Shared Memory Processing) and MPI (Message Passing Interface) parallelization methods were applied to the commercial software "Z-Cast" to calculate casting processes. During code parallelization, network stabilization and core optimization were also carried out on the Microsoft Windows platform, and the performance and results were compared with those of the normal linear (sequential) analysis codes.

  9. Development of high performance casting analysis software by coupled parallel computation

    Directory of Open Access Journals (Sweden)

    Sang Hyun CHO

    2007-08-01

    Full Text Available To date, a wide range of casting analysis software has been developed to provide new ways of approaching real casting processes. These include melt flow analysis, heat transfer analysis for solidification calculation, mechanical property prediction and microstructure prediction. Such efforts have succeeded in producing results that agree well with real situations, so that CAE technologies have become indispensable for designing and developing new casting processes. In manufacturing, however, CAE technologies are not used as frequently, because the software is difficult to use and computing performance is insufficient. To introduce CAE technologies to the manufacturing field, high-performance analysis is essential to shorten the gap between product design time and prototyping time. Software code optimization can help, but it is not enough, because the codes developed by software experts are already well optimized. As an alternative route to high-performance computation, parallel computing technologies are being applied to CAE to shorten analysis time. In this research, SMP (Shared Memory Processing) and MPI (Message Passing Interface) parallelization methods were applied to the commercial software "Z-Cast" to calculate casting processes. During code parallelization, network stabilization and core optimization were also carried out on the Microsoft Windows platform, and the performance and results were compared with those of the normal linear (sequential) analysis codes.
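
    To make the message-passing style of parallelization concrete, here is a hedged sketch of MPI-style domain decomposition for a 1-D explicit heat-diffusion step using mpi4py; the grid size, boundary value, and update scheme are assumptions for illustration and are unrelated to the actual "Z-Cast" implementation.

        # Hedged sketch of MPI domain decomposition for a 1-D explicit heat step.
        # Illustrative only; not the parallelization used in "Z-Cast".
        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        n_global = 1000
        n_local = n_global // size
        field = np.full(n_local + 2, 300.0)          # local cells plus two ghost cells
        if rank == 0:
            field[1] = 1600.0                        # a hot boundary on the first rank

        alpha = 0.4
        for _ in range(100):
            # Exchange ghost cells with neighbouring ranks before each update.
            if rank > 0:
                comm.Sendrecv(field[1:2], dest=rank - 1, recvbuf=field[0:1], source=rank - 1)
            if rank < size - 1:
                comm.Sendrecv(field[-2:-1], dest=rank + 1, recvbuf=field[-1:], source=rank + 1)
            field[1:-1] += alpha * (field[:-2] - 2.0 * field[1:-1] + field[2:])

        local_max = field[1:-1].max()
        global_max = comm.reduce(local_max, op=MPI.MAX, root=0)
        if rank == 0:
            print("max temperature:", global_max)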

  10. Performance of Different Analytical Software Packages in Quantification of DNA Methylation by Pyrosequencing

    Science.gov (United States)

    Grasso, Chiara; Trevisan, Morena; Fiano, Valentina; Tarallo, Valentina; De Marco, Laura; Sacerdote, Carlotta; Richiardi, Lorenzo; Merletti, Franco; Gillio-Tos, Anna

    2016-01-01

    Background Pyrosequencing has emerged as an alternative method of nucleic acid sequencing, well suited for many applications which aim to characterize single nucleotide polymorphisms, mutations, microbial types and CpG methylation in the target DNA. The commercially available pyrosequencing systems can harbor two different types of software which allow analysis in AQ or CpG mode, respectively, both widely employed for DNA methylation analysis. Objective Aim of the study was to assess the performance for DNA methylation analysis at CpG sites of the two pyrosequencing software which allow analysis in AQ or CpG mode, respectively. Despite CpG mode having been specifically generated for CpG methylation quantification, many investigations on this topic have been carried out with AQ mode. As proof of equivalent performance of the two software for this type of analysis is not available, the focus of this paper was to evaluate if the two modes currently used for CpG methylation assessment by pyrosequencing may give overlapping results. Methods We compared the performance of the two software in quantifying DNA methylation in the promoter of selected genes (GSTP1, MGMT, LINE-1) by testing two case series which include DNA from paraffin embedded prostate cancer tissues (PC study, N = 36) and DNA from blood fractions of healthy people (DD study, N = 28), respectively. Results We found discrepancy in the two pyrosequencing software-based quality assignment of DNA methylation assays. Compared to the software for analysis in the AQ mode, less permissive criteria are supported by the Pyro Q-CpG software, which enables analysis in CpG mode. CpG mode warns the operators about potential unsatisfactory performance of the assay and ensures a more accurate quantitative evaluation of DNA methylation at CpG sites. Conclusion The implementation of CpG mode is strongly advisable in order to improve the reliability of the methylation analysis results achievable by pyrosequencing.

  11. Performance of Different Analytical Software Packages in Quantification of DNA Methylation by Pyrosequencing.

    Directory of Open Access Journals (Sweden)

    Chiara Grasso

    Full Text Available Pyrosequencing has emerged as an alternative method of nucleic acid sequencing, well suited for many applications which aim to characterize single nucleotide polymorphisms, mutations, microbial types and CpG methylation in the target DNA. The commercially available pyrosequencing systems can harbor two different types of software which allow analysis in AQ or CpG mode, respectively, both widely employed for DNA methylation analysis. Aim of the study was to assess the performance for DNA methylation analysis at CpG sites of the two pyrosequencing software which allow analysis in AQ or CpG mode, respectively. Despite CpG mode having been specifically generated for CpG methylation quantification, many investigations on this topic have been carried out with AQ mode. As proof of equivalent performance of the two software for this type of analysis is not available, the focus of this paper was to evaluate if the two modes currently used for CpG methylation assessment by pyrosequencing may give overlapping results. We compared the performance of the two software in quantifying DNA methylation in the promoter of selected genes (GSTP1, MGMT, LINE-1) by testing two case series which include DNA from paraffin embedded prostate cancer tissues (PC study, N = 36) and DNA from blood fractions of healthy people (DD study, N = 28), respectively. We found discrepancy in the two pyrosequencing software-based quality assignment of DNA methylation assays. Compared to the software for analysis in the AQ mode, less permissive criteria are supported by the Pyro Q-CpG software, which enables analysis in CpG mode. CpG mode warns the operators about potential unsatisfactory performance of the assay and ensures a more accurate quantitative evaluation of DNA methylation at CpG sites. The implementation of CpG mode is strongly advisable in order to improve the reliability of the methylation analysis results achievable by pyrosequencing.

  12. Performance improvement of the software development project using the Value Management approach

    CERN Document Server

    Salem-Mhamdia, Amel Ben Hadj

    2011-01-01

    Improving performance and delivering value for customers have become central themes in business. The software industry has become an increasingly important sector for economic growth in Tunisia. This study aims to show how using Value Management for project analysis in the Tunisian software industry gives new insight into true project value and performance. This new approach is considered an appropriate tool for guiding the decision-making process. It offers tools to analyze service value from the customer and organization perspectives. The results showed that VM leads to better performance in software development projects by linking customer satisfaction and cost analysis. The present case shows service managers how they can benchmark project functions to reduce their costs and improve resource allocation, taking into consideration what customers consider important during their overall service experience. It can identify best professional practices, orient decisions to ...

  13. Analysing sensory panel performance in a proficiency test using the PanelCheck software

    DEFF Research Database (Denmark)

    Tomic, O.; Luciano, G.; Nilsen, A.

    2010-01-01

    Using the PanelCheck software, a workflow is proposed that guides the user through the data analysis process. This allows practitioners and non-statisticians to get an overview of panel performance in a rapid manner, without the need to be familiar with details of the statistical methods. Visualisation of data analysis results plays an important role, as this provides a time-saving and efficient way of screening and investigating sensory panel performances. Most of the statistical methods used in this paper are available in the open source software PanelCheck, which may be downloaded and used for free.

  14. Maximizing Use of Extension Beef Cattle Benchmarks Data Derived from Cow Herd Appraisal Performance Software

    Science.gov (United States)

    Ramsay, Jennifer M.; Hanna, Lauren L. Hulsman; Ringwall, Kris A.

    2016-01-01

    One goal of Extension is to provide practical information that makes a difference to producers. Cow Herd Appraisal Performance Software (CHAPS) has provided beef producers with production benchmarks for 30 years, creating a large historical data set. Many such large data sets contain useful information but are underutilized. Our goal was to create…

  15. A four-alternative forced choice (4AFC) software for observer performance evaluation in radiology

    Science.gov (United States)

    Zhang, Guozhi; Cockmartin, Lesley; Bosmans, Hilde

    2016-03-01

    The four-alternative forced choice (4AFC) test is a psychophysical method that can be adopted for observer performance evaluation in radiological studies. While the concept of this method is well established, difficulties in handling large image data, performing unbiased sampling, and keeping track of the choices made by the observer have restricted its application in practice. In this work, we propose an easy-to-use software package that can help perform 4AFC tests with DICOM images. The software suits any experimental design that follows the 4AFC approach. It has a powerful image viewing system that closely simulates the clinical reading environment. The graphical interface allows the observer to adjust various viewing parameters and make the selection with very simple operations. The sampling process involved in 4AFC, as well as the speed and accuracy of the choices made by the observer, is precisely monitored in the background and can be easily exported for test analysis. The software also has a defensive mechanism for data management and operation control that minimizes the possibility of mistakes by the user during the test. This software can largely facilitate the use of the 4AFC approach in radiological observer studies and is expected to have widespread applicability.
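
    A hedged sketch of the core bookkeeping in a 4AFC procedure is shown below: one of four randomly chosen positions holds the signal, the observer picks one, and the proportion of correct choices is recorded. The toy observer model is an assumption for illustration; this is not the proposed software.

        # Hedged sketch of a four-alternative forced choice trial loop: one of four
        # positions holds the signal, the observer picks one, and the proportion of
        # correct picks estimates detectability.  Not the actual software.
        import random

        def run_4afc_trials(n_trials, observer):
            correct = 0
            for _ in range(n_trials):
                signal_position = random.randrange(4)        # unbiased placement of the target
                choice = observer(signal_position)           # observer returns an index 0..3
                correct += (choice == signal_position)
            return correct / n_trials

        # A toy observer that finds the target 60% of the time and guesses otherwise.
        def toy_observer(signal_position):
            return signal_position if random.random() < 0.6 else random.randrange(4)

        print("proportion correct:", run_4afc_trials(10000, toy_observer))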

  17. Distributed control software of high-performance control-loop algorithm

    CERN Document Server

    Blanc, D

    1999-01-01

    The majority of industrial cooling and ventilation plants require the control of complex processes. All these processes are highly important for the operation of the machines. The stability and reliability of these processes are leading factors in the quality of the service provided. The control system architecture, as well as the software structure, is required to have high dynamic performance and robust behaviour. Intelligent systems based on PID or RST controllers are used for their high level of stability and accuracy. The design and tuning of these complex controllers require the dynamic model of the plant to be known (generally obtained by identification) and the desired performance of the various control loops to be specified in order to achieve good performance. The concept of a distributed control algorithm software provides full automation facilities with well-adapted functionality and good performance, giving the methodology, means and tools to master the dynamic process optimization an...
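
    As a hedged illustration of the discrete control-loop arithmetic such PID-based systems rely on (not the distributed software described above), the sketch below implements a textbook PID step and drives a toy first-order cooling plant towards a setpoint; the gains and the plant model are assumptions.

        # Hedged sketch of a discrete PID control step; gains and the toy plant below
        # are illustrative assumptions, not values from the paper.
        class PID:
            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_error = 0.0

            def step(self, setpoint, measurement):
                error = setpoint - measurement
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                return self.kp * error + self.ki * self.integral + self.kd * derivative

        # Toy first-order plant: the control output acts as heating (+) or cooling (-) power.
        pid, temperature = PID(kp=2.0, ki=0.1, kd=0.1, dt=1.0), 25.0
        for _ in range(50):
            power = pid.step(18.0, temperature)
            temperature += 0.05 * power          # simplistic plant response per time step
        print(round(temperature, 2))             # settles towards the 18 degC setpoint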

  18. Representation of the Physiological Factors Contributing to Postflight Changes in Functional Performance Using Motion Analysis Software

    Science.gov (United States)

    Parks, Kelsey

    2010-01-01

    Astronauts experience changes in multiple physiological systems due to exposure to the microgravity conditions of space flight. To understand how changes in physiological function influence functional performance, a testing procedure has been developed that evaluates both astronaut postflight functional performance and related physiological changes. Astronauts complete seven functional and physiological tests. The objective of this project is to use motion tracking and digitizing software to visually display the postflight decrement in the functional performance of the astronauts. The motion analysis software will be used to digitize astronaut data videos into stick figure videos to represent the astronauts as they perform the Functional Tasks Tests. This project will benefit NASA by allowing NASA scientists to present data of their neurological studies without revealing the identities of the astronauts.

  19. A Unified Component Modeling Approach for Performance Estimation in Hardware/Software Codesign

    DEFF Research Database (Denmark)

    Grode, Jesper Nicolai Riis; Madsen, Jan

    1998-01-01

    This paper presents an approach for abstract modeling of hardware/software architectures using Hierarchical Colored Petri Nets. The approach is able to capture complex behavioral characteristics often seen in software and hardware architectures, thus it is suitable for high level codesign issues...... such as performance estimation. In this paper, the development of a model of the ARM7 processor [5] is described to illustrate the full potential of the modeling approach. To further illustrate the approach, a cache model is also described. The approach and related tools are currently being implemented in the LYCOS...

  20. Software performance in segmenting ground-glass and solid components of subsolid nodules in pulmonary adenocarcinomas.

    Science.gov (United States)

    Cohen, Julien G; Goo, Jin Mo; Yoo, Roh-Eul; Park, Chang Min; Lee, Chang Hyun; van Ginneken, Bram; Chung, Doo Hyun; Kim, Young Tae

    2016-12-01

    To evaluate the performance of software in segmenting ground-glass and solid components of subsolid nodules in pulmonary adenocarcinomas. Seventy-three pulmonary adenocarcinomas manifesting as subsolid nodules were included. Two radiologists measured the maximal axial diameter of the ground-glass components on lung windows and that of the solid components on lung and mediastinal windows. Nodules were segmented using software by applying five (-850 HU to -650 HU) and nine (-130 HU to -500 HU) attenuation thresholds. We compared the manual and software measurements of ground-glass and solid components with pathology measurements of tumour and invasive components. Segmentation of ground-glass components at a threshold of -750 HU yielded mean differences of +0.06 mm (p = 0.83, 95 % limits of agreement, 4.51 to 4.67) and -2.32 mm (p software (at -350 HU) and pathology measurements and between the manual (lung and mediastinal windows) and pathology measurements were -0.12 mm (p = 0.74, -5.73 to 5.55]), 0.15 mm (p = 0.73, -6.92 to 7.22), and -1.14 mm (p Software segmentation of ground-glass and solid components in subsolid nodules showed no significant difference with pathology. • Software can effectively segment ground-glass and solid components in subsolid nodules. • Software measurements show no significant difference with pathology measurements. • Manual measurements are more accurate on lung windows than on mediastinal windows.

  1. A new plant chamber facility, PLUS, coupled to the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Hohaus, T.; Kuhn, U.; Andres, S.; Kaminski, M.; Rohrer, F.; Tillmann, R.; Wahner, A.; Wegener, R.; Yu, Z.; Kiendler-Scharr, A.

    2016-03-01

    A new PLant chamber Unit for Simulation (PLUS) for use with the atmosphere simulation chamber SAPHIR (Simulation of Atmospheric PHotochemistry In a large Reaction Chamber) has been built and characterized at the Forschungszentrum Jülich GmbH, Germany. The PLUS chamber is an environmentally controlled flow-through plant chamber. Inside PLUS the natural blend of biogenic emissions of trees is mixed with synthetic air and transferred to the SAPHIR chamber, where the atmospheric chemistry and the impact of biogenic volatile organic compounds (BVOCs) can be studied in detail. In PLUS all important environmental parameters (e.g., temperature, photosynthetically active radiation (PAR), soil relative humidity (RH)) are well controlled. The gas exchange volume of 9.32 m3 which encloses the stem and the leaves of the plants is constructed such that gases are exposed to only fluorinated ethylene propylene (FEP) Teflon film and other Teflon surfaces to minimize any potential losses of BVOCs in the chamber. Solar radiation is simulated using 15 light-emitting diode (LED) panels, which have an emission strength up to 800 µmol m-2 s-1. Results of the initial characterization experiments are presented in detail. Background concentrations, mixing inside the gas exchange volume, and transfer rate of volatile organic compounds (VOCs) through PLUS under different humidity conditions are explored. Typical plant characteristics such as light- and temperature- dependent BVOC emissions are studied using six Quercus ilex trees and compared to previous studies. Results of an initial ozonolysis experiment of BVOC emissions from Quercus ilex at typical atmospheric concentrations inside SAPHIR are presented to demonstrate a typical experimental setup and the utility of the newly added plant chamber.

  2. A new plant chamber facility PLUS coupled to the atmospheric simulation chamber SAPHIR

    Science.gov (United States)

    Hohaus, T.; Kuhn, U.; Andres, S.; Kaminski, M.; Rohrer, F.; Tillmann, R.; Wahner, A.; Wegener, R.; Yu, Z.; Kiendler-Scharr, A.

    2015-11-01

    A new PLant chamber Unit for Simulation (PLUS) for use with the atmosphere simulation chamber SAPHIR (Simulation of Atmospheric PHotochemistry In a large Reaction Chamber) has been built and characterized at the Forschungszentrum Jülich GmbH, Germany. The PLUS chamber is an environmentally controlled flow-through plant chamber. Inside PLUS the natural blend of biogenic emissions of trees is mixed with synthetic air and transferred to the SAPHIR chamber, where the atmospheric chemistry and the impact of biogenic volatile organic compounds (BVOC) can be studied in detail. In PLUS all important environmental parameters (e.g. temperature, PAR, soil RH, etc.) are well controlled. The gas exchange volume of 9.32 m3, which encloses the stem and the leaves of the plants, is constructed such that gases are exposed to FEP Teflon film and other Teflon surfaces only, to minimize any potential losses of BVOCs in the chamber. Solar radiation is simulated using 15 LED panels which have an emission strength up to 800 μmol m-2 s-1. Results of the initial characterization experiments are presented in detail. Background concentrations, mixing inside the gas exchange volume, and the transfer rate of volatile organic compounds (VOC) through PLUS under different humidity conditions are explored. Typical plant characteristics such as light- and temperature-dependent BVOC emissions are studied using six Quercus ilex trees and compared to previous studies. Results of an initial ozonolysis experiment of BVOC emissions from Quercus ilex at typical atmospheric concentrations inside SAPHIR are presented to demonstrate a typical experimental setup and the utility of the newly added plant chamber.

  3. AN ENHANCED MODEL TO ESTIMATE EFFORT, PERFORMANCE AND COST OF THE SOFTWARE PROJECTS

    Directory of Open Access Journals (Sweden)

    M. Pauline

    2013-04-01

    Full Text Available The authors have proposed a model that first captures the fundamentals of software metrics in phase 1, consisting of three primitive primary software engineering metrics: person-months (PM), function points (FP), and lines of code (LOC). Phase 2 consists of the proposed function point, which is obtained by grouping the adjustment factors to simplify the adjustment process and to ensure more consistency in the adjustments. In the proposed method, fuzzy logic is used for quantifying the quality of requirements, which is added as one of the adjustment factors; thus a fuzzy-based approach to the Enhanced General System Characteristics for estimating the effort of software projects using productivity has been obtained. Phase 3 takes the calculated function point from this work and gives it as input to the static single-variable models (i.e. to Intermediate COCOMO and COCOMO II) for cost estimation. The authors have tailored the cost factors in Intermediate COCOMO, and both cost and scale factors are tailored in COCOMO II, to suit the individual development environment, which is very important for the accuracy of the cost estimates. The software performance indicators, namely project duration, schedule predictability, requirements completion ratio and post-release defect density, are also measured for the software projects in this work. A comparative study of effort, performance measurement and cost estimation of the software project is carried out between the existing model and the authors' proposed work. Thus this work analyzes the interactional process through which the estimation tasks were collectively accomplished.
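
    To make the estimation chain concrete, the hedged sketch below uses the widely published intermediate-COCOMO form Effort = a * KLOC^b * EAF fed by a function-point-to-LOC conversion; the coefficients and the conversion factor are textbook illustrative values, not the tailored factors proposed by the authors.

        # Hedged sketch of a function-point -> LOC -> intermediate-COCOMO effort chain.
        # Coefficients below are commonly published textbook defaults, not the tailored
        # cost/scale factors the paper derives for individual environments.
        def cocomo_effort(function_points, loc_per_fp=50, a=3.0, b=1.12, eaf=1.0):
            """Return estimated effort in person-months (semi-detached mode defaults)."""
            kloc = function_points * loc_per_fp / 1000.0
            return a * (kloc ** b) * eaf

        effort_pm = cocomo_effort(function_points=300, eaf=0.9)
        print(f"estimated effort: {effort_pm:.1f} person-months")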

  4. Revisioning Theoretical Framework of Electronic Performance Support Systems (EPSS) within the Software Application Examples

    Directory of Open Access Journals (Sweden)

    Dr. Servet BAYRAM,

    2004-04-01

    Full Text Available EPSS provides electronic support to learners in achieving a performance objective; a feature which makes it universally and consistently available on demand any time, any place, regardless of situation, without unnecessary intermediaries involved in the process. The aim of this review is to develop a set of theoretical constructs that provide descriptive power for the explanation of EPSS, its roots and its features, within software application examples (i.e., Microsoft SharePoint Server "v2.0" Beta 2, IBM Lotus Notes 6 & Domino 6, Oracle 9i Collaboration Suite, and Mac OS X v10.2). From the educational and training point of view, the paper visualizes a pentagon model for the interrelated domains of the theoretical framework of EPSS. These domains are: learning theories, information processing theories, developmental theories, instructional theories, and acceptance theories. This descriptive framework explains which outcomes occur under given theoretical conditions for a given EPSS model within the software examples. It summarizes some of the theoretical concepts supporting the EPSS-related features and explains how such concepts share features with the example software programs in education and job training.

  5. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  6. Independent Verification and Validation Of SAPHIRE 8 Risk Management Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-11-01

    This report provides an evaluation of the risk management activities for the SAPHIRE 8 project. Risk management is intended to provide a methodology for conducting risk management planning, identification, analysis, response, and monitoring and control activities associated with the SAPHIRE project work, and to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  7. Isotope effect in the formation of H2 from H2CO studied at the atmospheric simulation chamber SAPHIR

    NARCIS (Netherlands)

    Röckmann, T.; Walter, S.; Bohn, B.; Wegener, R.; Spahn, H.; Brauers, T.; Tillmann, R.; Schlosser, E.; Koppmann, R.; Rohrer, F.

    2010-01-01

    Formaldehyde of known, near-natural isotopic composition was photolyzed in the SAPHIR atmosphere simulation chamber under ambient conditions. The isotopic composition of the product H2 was used to determine the isotope effects in formaldehyde photolysis. The experiments are sensitive to the molecula

  8. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    Energy Technology Data Exchange (ETDEWEB)

    Maile, Tobias; Bazjanac, Vladimir; O' Donnell, James; Garr, Matthew

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation, one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data-processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points to a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
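
    A minimal, hedged sketch of the kind of measured-versus-simulated comparison the tool automates is given below (this is not SEE IT itself); the column names, values and hourly resolution are assumptions.

        # Hedged sketch: align measured and simulated hourly energy use and summarize the gap.
        # Column names and values are placeholders, not the SEE IT data model.
        import pandas as pd

        index = pd.date_range("2011-07-01", periods=4, freq="h")
        measured = pd.Series([52.0, 60.5, 75.2, 80.1], index=index, name="measured_kWh")
        simulated = pd.Series([50.0, 58.0, 70.0, 78.5], index=index, name="simulated_kWh")

        comparison = pd.concat([measured, simulated], axis=1)
        comparison["difference_kWh"] = comparison["measured_kWh"] - comparison["simulated_kWh"]
        comparison["difference_pct"] = 100.0 * comparison["difference_kWh"] / comparison["simulated_kWh"]
        print(comparison)
        print("mean bias (kWh):", comparison["difference_kWh"].mean())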

  9. Intercomparison of NO3 radical detection instruments in the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    J. Thieser

    2013-01-01

    Full Text Available The detection of atmospheric NO3 radicals is still challenging owing to its low mixing ratios (≈ 1 to 300 pptv) in the troposphere. While long-path differential optical absorption spectroscopy (DOAS) has been a well-established NO3 detection approach for over 25 yr, new, sensitive techniques have been developed in the past decade. This publication outlines the results of the first comprehensive intercomparison of seven instruments developed for the spectroscopic detection of tropospheric NO3. Four instruments were based on cavity ring-down spectroscopy (CRDS), two utilised open-path cavity enhanced absorption spectroscopy (CEAS), and one applied "classical" long-path DOAS. The intercomparison campaign "NO3Comp" was held at the atmosphere simulation chamber SAPHIR in Jülich (Germany) in June 2007. Twelve experiments were performed in the well mixed chamber for variable concentrations of NO3, N2O5, NO2, hydrocarbons, and water vapour, in the absence and in the presence of inorganic or organic aerosol. The overall precision of the cavity instruments varied between 0.5 and 5 pptv for integration times of 1 s to 5 min; that of the DOAS instrument was 9 pptv for an acquisition time of 1 min. The NO3 data of all instruments correlated excellently with the NOAA-CRDS instrument, which was selected as the common reference because of its superb sensitivity, high time resolution, and most comprehensive data coverage. The median of the coefficient of determination (r2) over all experiments of the campaign (60 correlations) is r2 = 0.981 (25th/75th percentiles: 0.949/0.994; min/max: 0.540/0.999). The linear regression analysis of the campaign data set yielded very small intercepts (1.2 ± 5.3 pptv), and the average slope of the regression lines was close to unity (1.02; min: 0.72; max: 1.36). The deviation of individual regression slopes from unity was always within the combined accuracies of each instrument pair. The very good correspondence between the NO3 measurements

  10. Intercomparison of NO3 radical detection instruments in the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    H.-P. Dorn

    2013-05-01

    Full Text Available The detection of atmospheric NO3 radicals is still challenging owing to its low mixing ratios (≈ 1 to 300 pptv) in the troposphere. While long-path differential optical absorption spectroscopy (DOAS) has been a well-established NO3 detection approach for over 25 yr, new, sensitive techniques have been developed in the past decade. This publication outlines the results of the first comprehensive intercomparison of seven instruments developed for the spectroscopic detection of tropospheric NO3. Four instruments were based on cavity ring-down spectroscopy (CRDS), two utilised open-path cavity-enhanced absorption spectroscopy (CEAS), and one applied "classical" long-path DOAS. The intercomparison campaign "NO3Comp" was held at the atmosphere simulation chamber SAPHIR in Jülich (Germany) in June 2007. Twelve experiments were performed in the well-mixed chamber for variable concentrations of NO3, N2O5, NO2, hydrocarbons, and water vapour, in the absence and in the presence of inorganic or organic aerosol. The overall precision of the cavity instruments varied between 0.5 and 5 pptv for integration times of 1 s to 5 min; that of the DOAS instrument was 9 pptv for an acquisition time of 1 min. The NO3 data of all instruments correlated excellently with the NOAA-CRDS instrument, which was selected as the common reference because of its superb sensitivity, high time resolution, and most comprehensive data coverage. The median of the coefficient of determination (r2) over all experiments of the campaign (60 correlations) is r2 = 0.981 (quartile 1 (Q1): 0.949; quartile 3 (Q3): 0.994; min/max: 0.540/0.999). The linear regression analysis of the campaign data set yielded very small intercepts (median: 1.1 pptv; Q1/Q3: −1.1/2.6 pptv; min/max: −14.1/28.0 pptv), and the slopes of the regression lines were close to unity (median: 1.01; Q1/Q3: 0.92/1.10; min/max: 0.72/1.36). The deviation of individual regression slopes from unity was always within the combined

  11. Intercomparison of NO3 radical detection instruments in the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Dorn, H.-P.; Apodaca, R. L.; Ball, S. M.; Brauers, T.; Brown, S. S.; Crowley, J. N.; Dubé, W. P.; Fuchs, H.; Häseler, R.; Heitmann, U.; Jones, R. L.; Kiendler-Scharr, A.; Labazan, I.; Langridge, J. M.; Meinen, J.; Mentel, T. F.; Platt, U.; Pöhler, D.; Rohrer, F.; Ruth, A. A.; Schlosser, E.; Schuster, G.; Shillings, A. J. L.; Simpson, W. R.; Thieser, J.; Tillmann, R.; Varma, R.; Venables, D. S.; Wahner, A.

    2013-05-01

    The detection of atmospheric NO3 radicals is still challenging owing to its low mixing ratios (≈ 1 to 300 pptv) in the troposphere. While long-path differential optical absorption spectroscopy (DOAS) has been a well-established NO3 detection approach for over 25 yr, newly sensitive techniques have been developed in the past decade. This publication outlines the results of the first comprehensive intercomparison of seven instruments developed for the spectroscopic detection of tropospheric NO3. Four instruments were based on cavity ring-down spectroscopy (CRDS), two utilised open-path cavity-enhanced absorption spectroscopy (CEAS), and one applied "classical" long-path DOAS. The intercomparison campaign "NO3Comp" was held at the atmosphere simulation chamber SAPHIR in Jülich (Germany) in June 2007. Twelve experiments were performed in the well-mixed chamber for variable concentrations of NO3, N2O5, NO2, hydrocarbons, and water vapour, in the absence and in the presence of inorganic or organic aerosol. The overall precision of the cavity instruments varied between 0.5 and 5 pptv for integration times of 1 s to 5 min; that of the DOAS instrument was 9 pptv for an acquisition time of 1 min. The NO3 data of all instruments correlated excellently with the NOAA-CRDS instrument, which was selected as the common reference because of its superb sensitivity, high time resolution, and most comprehensive data coverage. The median of the coefficient of determination (r2) over all experiments of the campaign (60 correlations) is r2 = 0.981 (quartile 1 (Q1): 0.949; quartile 3 (Q3): 0.994; min/max: 0.540/0.999). The linear regression analysis of the campaign data set yielded very small intercepts (median: 1.1 pptv; Q1/Q3: -1.1/2.6 pptv; min/max: -14.1/28.0 pptv), and the slopes of the regression lines were close to unity (median: 1.01; Q1/Q3: 0.92/1.10; min/max: 0.72/1.36). The deviation of individual regression slopes from unity was always within the combined accuracies of each
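
    The slope, intercept and r2 statistics quoted in these intercomparison records can be reproduced with a few lines of analysis code; the hedged sketch below uses synthetic NO3 series (not campaign data) purely to show the calculation.

        # Hedged sketch of the regression statistics quoted above (slope, intercept, r^2)
        # for two instruments' NO3 time series; the numbers are synthetic, not campaign data.
        import numpy as np

        rng = np.random.default_rng(1)
        reference_pptv = rng.uniform(1.0, 300.0, size=200)            # stand-in for the reference instrument
        test_pptv = 1.02 * reference_pptv + 1.2 + rng.normal(0.0, 3.0, size=200)

        slope, intercept = np.polyfit(reference_pptv, test_pptv, 1)
        r_squared = np.corrcoef(reference_pptv, test_pptv)[0, 1] ** 2
        print(f"slope={slope:.3f}, intercept={intercept:.2f} pptv, r^2={r_squared:.4f}")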

  12. Interactive Software System Developed to Study How Icing Affects Airfoil Performance (Phase 1 Results)

    Science.gov (United States)

    Choo, Yung K.; Vickerman, Mary B.

    2000-01-01

    SmaggIce (Surface Modeling and Grid Generation for Iced Airfoils), which is being developed at the NASA Glenn Research Center at Lewis Field, is an interactive software system for data probing, boundary smoothing, domain decomposition, and structured grid generation and refinement. All these steps are required for aerodynamic performance prediction using structured, grid-based computational fluid dynamics (CFD). SmaggIce provides the underlying computations to perform these functions, as well as a graphical user interface to control and interact with them, and graphics to display the results.

  13. Optimizing the Performance of Radionuclide Identification Software in the Hunt for Nuclear Security Threats

    Energy Technology Data Exchange (ETDEWEB)

    Fotion, Katherine A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-08-18

    The Radionuclide Analysis Kit (RNAK), my team’s most recent nuclide identification software, is entering the testing phase. A question arises: will removing rare nuclides from the software’s library improve its overall performance? An affirmative response indicates fundamental errors in the software’s framework, while a negative response confirms the effectiveness of the software’s key machine learning algorithms. After thorough testing, I found that the performance of RNAK cannot be improved with the library choice effect, thus verifying the effectiveness of RNAK’s algorithms—multiple linear regression, Bayesian network using the Viterbi algorithm, and branch and bound search.

  14. New Software Performance with Balanced Score Card Assessment: Case Study at LPGI Jakarta

    Directory of Open Access Journals (Sweden)

    Brata Wibawa Djojo

    2011-09-01

    Full Text Available Implementation of information technology (IT), especially new software applications, needs to be evaluated for its impact on the organization's business performance related to its strategic goals. The measurement and evaluation of the impact of a new software implementation in LPGI Jakarta uses Balanced Scorecard (BSC) analysis, making a comparison of three years of data. The analysis involves the four perspectives of the BSC: (1) the financial aspect, with the growth of gross premium written (GPW), net premium written (NPW), and underwriting profit; (2) the internal business aspect: the frequency of policies issued and the average production per policy; (3) people, or learning and growth, which consists of human error and system error; (4) the customer aspect, with external endorsement and renewal ratio. This research measures and evaluates the impact of the implementation of the new software application on business performance as a Marginal and Fair contribution. At the end of this paper the writer suggests that LPGI Jakarta increase its sales activities to reach the target, which is related directly to the financial aspect and the internal business process aspect.

  15. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  16. Secondary organic aerosols - formation and ageing studies in the SAPHIR chamber

    Science.gov (United States)

    Spindler, Christian; Müller, Lars; Trimborn, Achim; Mentel, Thomas; Hoffmann, Thorsten

    2010-05-01

    Secondary organic aerosol (SOA) formation from oxidation products of biogenic volatile organic compounds (BVOC) constitutes an important coupling between vegetation, atmospheric chemistry, and climate change. Such secondary organic aerosol components play an important role in particle formation in Boreal regions (Laaksonen et al., 2008), where biogenic secondary organic aerosols contribute to an overall negative radiative forcing, and thus a negative feedback between vegetation and climate warming (Spracklen et al., 2008). Within the EUCAARI project we investigated SOA formation from mixtures of monoterpenes (and sesquiterpenes) as typically emitted from Boreal tree species in Southern Finland. The experiments were performed in the large photochemical reactor SAPHIR in Juelich at natural light and oxidant levels. Oxidation of the BVOC mixtures and SOA formation was induced by OH radicals and O3. The SOA was formed on the first day and then aged for another day. The resulting SOA was characterized by HR-ToF-AMS, APCI-MS, and filter samples with subsequent H-NMR, GC-MS and HPLC-MS analysis. The chemical evolution of the SOA is characterized by a fast increase of the O/C ratio during the formation process on the first day, a stable O/C ratio during the night, and a distinct increase of the O/C ratio on the second day. The increase of the O/C ratio on the second day is highly correlated with the OH dose and is accompanied by condensational growth of the particles. We will present simultaneous factor analysis of AMS time series (PMF; Ulbrich et al., 2009) and direct measurements of individual chemical species. We found that four factors were needed to represent the time evolution of the SOA composition (in the mass spectra) if oxidation by OH plays a major role. Corresponding to these factors we observed individual, representative molecules with very similar time behaviour. The correlation between tracers and AMS factors is astonishingly good as the molecular tracers

  17. The Impact of Strategy for Building Sustainability on Performance of Software Development Business in Thailand

    Directory of Open Access Journals (Sweden)

    Karun Pratoom

    2011-02-01

    Full Text Available In the present business environment, balancing the needs of a business enterprise and those of its stakeholders is recognized as a critical strategy for the success and long-term survival of any firm. However, understanding the effect of a sustainability strategy on firm performance remains a key challenge for academia and management alike. The purpose of this study is to examine the effect of a strategy for building sustainability on performance. Data were collected from 122 managers of software development companies in Thailand. Results show that the strategy for building sustainability positively affected Capability Maturity Model Integration (CMMI) level, financial performance, corporate image, and stakeholder satisfaction. Furthermore, results also show that the strategy for building sustainability had indirect effects on corporate image through the CMMI level, and that stakeholder satisfaction had direct effects on financial performance.

  18. Impact Analysis of Generalized Audit Software (GAS) Utilization to Auditor Performances

    Directory of Open Access Journals (Sweden)

    Aries Wicaksono

    2016-09-01

    Full Text Available This study aimed to understand whether the use of Generalized Audit Software (GAS) in the audit process has an impact on auditor performance, and to reach conclusions, in the form of an evaluation, on whether a GAS-based audit process has a positive impact on auditor performance. The models used to evaluate the impact of GAS were Quantity of Work, Quality of Work, Job Knowledge, Creativeness, Cooperation, Dependability, Initiative, and Personal Qualities. The method used in this research was a qualitative method, descriptive and evaluative in its analysis, examining the impact of the GAS implementation on the components of the user's performance. The results indicate that the use of GAS has a positive impact on the user's performance components.

  19. A Framework for Performing V&V within Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1996-01-01

    Verification and validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In order to provide early detection of errors, V&V is conducted in parallel with system development, often beginning with the concept phase. In reuse-based software engineering, however, decisions on the requirements, design and even implementation of domain assets can be made prior to beginning development of a specific system. In this case, V&V must be performed during domain engineering in order to have an impact on system development. This paper describes a framework for performing V&V within architecture-centric, reuse-based software engineering. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  20. Scalable, high-performance 3D imaging software platform: system architecture and application to virtual colonoscopy.

    Science.gov (United States)

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli; Brett, Bevin

    2012-01-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable fast turn-around time, which is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth, due to the massive amount of data that need to be processed. In this work, we have developed a software platform that is designed to support high-performance 3D medical image processing for a wide range of applications using increasingly available and affordable commodity computing systems: multi-core, cluster, and cloud computing systems. To achieve scalable, high-performance computing, our platform (1) employs size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D image processing algorithms; (2) supports task scheduling for efficient load distribution and balancing; and (3) consists of layered parallel software libraries that allow a wide range of medical applications to share the same functionalities. We evaluated the performance of our platform by applying it to an electronic cleansing system in virtual colonoscopy, with initial experimental results showing a 10-fold performance improvement on an 8-core workstation over the original sequential implementation of the system.
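
    A hedged sketch of the block-decomposition idea is given below: a 3-D volume is split into sub-blocks that are processed in parallel and the per-block results are combined. The block size, the toy thresholding operation, and the use of a process pool are assumptions for illustration, not the platform's actual data structure or scheduler.

        # Hedged sketch of block decomposition of a 3-D volume with per-block parallel processing.
        import numpy as np
        from concurrent.futures import ProcessPoolExecutor
        from itertools import product

        def blocks(volume, size):
            """Yield slice tuples covering the volume in blocks of at most `size` per axis."""
            for zs, ys, xs in product(*(range(0, n, size) for n in volume.shape)):
                yield (slice(zs, zs + size), slice(ys, ys + size), slice(xs, xs + size))

        def process_block(block):
            # Placeholder per-block operation (e.g. a thresholding step); illustrative only.
            return (block > 0.5).sum()

        if __name__ == "__main__":
            volume = np.random.rand(128, 128, 128).astype(np.float32)
            slabs = [volume[s] for s in blocks(volume, 64)]
            with ProcessPoolExecutor() as pool:
                counts = list(pool.map(process_block, slabs))
            print("voxels above threshold:", sum(counts))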

  1. Investigation of the formaldehyde differential absorption cross section at high and low spectral resolution in the simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    T. Brauers

    2007-07-01

    Full Text Available The results from a simulation chamber study on the formaldehyde (HCHO) absorption cross section in the UV spectral region are presented. We performed 4 experiments at ambient HCHO concentrations with simultaneous measurements of two DOAS instruments in the atmosphere simulation chamber SAPHIR in Jülich. The two instruments differ in their spectral resolution, one working at 0.2 nm (broad-band, BB-DOAS), the other at 2.7 pm (high-resolution, HR-DOAS). Both instruments use dedicated multi-reflection cells to achieve long light path lengths of 960 m and 2240 m, respectively, inside the chamber. During two experiments HCHO was injected into the clean chamber by thermolysis of well-defined amounts of para-formaldehyde, reaching mixing ratios of at most 30 ppbV. The HCHO concentration calculated from the injection and the chamber volume agrees with the BB-DOAS measured value when the absorption cross section of Meller and Moortgat (2000) and the temperature coefficient of Cantrell (1990) were used for data evaluation. In two further experiments we produced HCHO in-situ from the ozone + ethene reaction, which was intended to provide an independent way of HCHO calibration through the measurements of ozone and ethene. However, we found an unexpected deviation from the current understanding of the ozone + ethene reaction when CO was added to suppress possible oxidation of ethene by OH radicals. The reaction of the Criegee intermediate with CO could be 240 times slower than currently assumed. Based on the BB-DOAS measurements, we could deduce a high-resolution cross section for HCHO, which had not been measured directly before.

  2. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    Science.gov (United States)

    Pal, P; Kumar, R; Srivastava, N; Chowdhury, J

    2014-02-01

    A Visual Basic simulation software package (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and displays the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  3. Parameter definition using vibration prediction software leads to significant drilling performance improvements

    Energy Technology Data Exchange (ETDEWEB)

    Amorim, Dalmo; Hanley, Chris Hanley; Fonseca, Isaac; Santos, Juliana [National Oilwell Varco, Houston TX (United States); Leite, Daltro J.; Borella, Augusto; Gozzi, Danilo [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    The understanding and mitigation of downhole vibration has been a heavily researched subject in the oil industry because it results in more expensive drilling operations: vibrations significantly diminish the amount of effective drilling energy available to the bit and generate forces that can push the bit or the Bottom Hole Assembly (BHA) off its concentric axis of rotation, producing high-magnitude impacts with the borehole wall. In order to drill ahead, a sufficient amount of energy must be supplied by the rig to overcome the resistance of the drilling system, including the reactive torque of the system, drag forces, fluid pressure losses and energy dissipated by downhole vibrations, thereby providing the bit with the energy required to fail the rock. If the drill string enters resonant modes of vibration, not only does it decrease the amount of available energy to drill, but it also increases the potential for catastrophic downhole equipment and drilling bit failures. In this sense, the mitigation of downhole vibrations will result in faster, smoother, and cheaper drilling operations. A software tool using Finite Element Analysis (FEA) has been developed to provide a better understanding of downhole vibration phenomena in drilling environments. The software tool calculates the response of the drilling system at various input conditions, based on the design of the wellbore along with the geometry of the Bottom Hole Assembly (BHA) and the drill string. It identifies where undesired levels of resonant vibration will be driven by certain combinations of specific drilling parameters, and also which combinations of drilling parameters will result in lower levels of vibration, so that the fewest shocks, the highest penetration rate and the lowest cost per foot can be achieved. With the growing performance of personal computers, complex software systems modeling drilling vibrations using FEA have become accessible to a wider audience of field users, further complemented with real time

  4. Scalable, High-performance 3D Imaging Software Platform: System Architecture and Application to Virtual Colonoscopy

    OpenAIRE

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli; Brett, Bevin

    2012-01-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around time, which is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. In this work, we have developed a software platform that is designed to support high-performance 3D medical image processing for a wide range of applications using increasingl...

  5. Performance Estimation for Hardware/Software codesign using Hierarchical Colored Petri Nets

    DEFF Research Database (Denmark)

    Grode, Jesper Nicolai Riis; Madsen, Jan; Jerraya, Ahmed

    1998-01-01

    This paper presents an approach for abstract modeling of the functional behavior of hardware architectures using Hierarchical Colored Petri Nets (HCPNs). Using HCPNs as architectural models has several advantages, such as higher estimation accuracy, higher flexibility, and the need for only one estimation tool. This makes the approach very useful for designing component models used for performance estimation in Hardware/Software Codesign frameworks such as the LYCOS system. The paper presents the methodology and rules for designing component models using HCPNs. Two examples of architectural models...

  6. ConfocalCheck - A Software Tool for the Automated Monitoring of Confocal Microscope Performance

    Science.gov (United States)

    Hng, Keng Imm; Dormann, Dirk

    2013-01-01

    Laser scanning confocal microscopy has become an invaluable tool in biomedical research but regular quality testing is vital to maintain the system’s performance for diagnostic and research purposes. Although many methods have been devised over the years to characterise specific aspects of a confocal microscope like measuring the optical point spread function or the field illumination, only very few analysis tools are available. Our aim was to develop a comprehensive quality assurance framework ranging from image acquisition to automated analysis and documentation. We created standardised test data to assess the performance of the lasers, the objective lenses and other key components required for optimum confocal operation. The ConfocalCheck software presented here analyses the data fully automatically. It creates numerous visual outputs indicating potential issues requiring further investigation. By storing results in a web browser compatible file format the software greatly simplifies record keeping allowing the operator to quickly compare old and new data and to spot developing trends. We demonstrate that the systematic monitoring of confocal performance is essential in a core facility environment and how the quantitative measurements obtained can be used for the detailed characterisation of system components as well as for comparisons across multiple instruments. PMID:24224017

  7. ConfocalCheck--a software tool for the automated monitoring of confocal microscope performance.

    Directory of Open Access Journals (Sweden)

    Keng Imm Hng

    Full Text Available Laser scanning confocal microscopy has become an invaluable tool in biomedical research but regular quality testing is vital to maintain the system's performance for diagnostic and research purposes. Although many methods have been devised over the years to characterise specific aspects of a confocal microscope like measuring the optical point spread function or the field illumination, only very few analysis tools are available. Our aim was to develop a comprehensive quality assurance framework ranging from image acquisition to automated analysis and documentation. We created standardised test data to assess the performance of the lasers, the objective lenses and other key components required for optimum confocal operation. The ConfocalCheck software presented here analyses the data fully automatically. It creates numerous visual outputs indicating potential issues requiring further investigation. By storing results in a web browser compatible file format the software greatly simplifies record keeping allowing the operator to quickly compare old and new data and to spot developing trends. We demonstrate that the systematic monitoring of confocal performance is essential in a core facility environment and how the quantitative measurements obtained can be used for the detailed characterisation of system components as well as for comparisons across multiple instruments.

  8. Observation of the positive-strangeness pentaquark $\\Theta^+$ in photoproduction with the SAPHIR detector at ELSA

    CERN Document Server

    Barth, J; Ernst, J; Glander, K H; Hannappel, J; Jöpen, N; Kalinowsky, H; Klein, F; Klempt, E; Lawall, R; Link, J; Menze, D W; Neuerburg, W; Ostrick, M; Paul, E; Van Pee, H; Schulday, I; Schwille, W J; Wiegers, B; Wieland, F W; Wisskirchen, J; Wu, C

    2003-01-01

    The positive--strangeness baryon resonance $\\Theta^+$ is observed in photoproduction of the $\\rm nK^+K^0_s$ final state with the SAPHIR detector at the Bonn ELectron Stretcher Accelerator ELSA. It is seen as a peak in the $\\rm nK^+$ invariant mass distribution with a $4.8\\sigma$ confidence level. We find a mass $\\rm M_{\\Theta^+} = 1540\\pm 4\\pm 2$ MeV and an upper limit of the width $\\rm \\Gamma_{\\Theta^+} < 25$ MeV at 90% c.l. The photoproduction cross section for $\\rm\\bar K^0\\Theta^+$ is in the order of 300 nb. From the absence of a signal in the $\\rm pK^+$ invariant mass distribution in $\\rm\\gamma p\\to pK^+K^-$ at the expected strength we conclude that the $\\Theta^+$ must be isoscalar.

  9. SAPHIR: a physiome core model of body fluid homeostasis and blood pressure regulation.

    Science.gov (United States)

    Thomas, S Randall; Baconnier, Pierre; Fontecave, Julie; Françoise, Jean-Pierre; Guillaud, François; Hannaert, Patrick; Hernández, Alfredo; Le Rolle, Virginie; Mazière, Pierre; Tahi, Fariza; White, Ronald J

    2008-09-13

    We present the current state of the development of the SAPHIR project (a Systems Approach for PHysiological Integration of Renal, cardiac and respiratory function). The aim is to provide an open-source multi-resolution modelling environment that will permit, at a practical level, a plug-and-play construction of integrated systems models using lumped-parameter components at the organ/tissue level while also allowing focus on cellular- or molecular-level detailed sub-models embedded in the larger core model. Thus, an in silico exploration of gene-to-organ-to-organism scenarios will be possible, while keeping computation time manageable. As a first prototype implementation in this environment, we describe a core model of human physiology targeting the short- and long-term regulation of blood pressure, body fluids and homeostasis of the major solutes. In tandem with the development of the core models, the project involves database implementation and ontology development.

  10. GNSS software receiver sampling noise and clock jitter performance and impact analysis

    Science.gov (United States)

    Chen, Jian Yun; Feng, XuZhe; Li, XianBin; Wu, GuangYao

    2015-02-01

    The design of multi-frequency, multi-constellation GNSS software-defined radio receivers is becoming more and more popular due to their simple architecture, flexible configuration and good coherence in multi-frequency signal processing. Such receivers play an important role in navigation signal processing and signal quality monitoring. In particular, driving the sampling clock of the analogue-to-digital converter (ADC) from an FPGA implies that a more flexible radio transceiver design is possible. According to the concept of software-defined radio (SDR), the ideal is to digitize as close to the antenna as possible. However, since the carrier frequency of a GNSS signal is of the order of GHz, converting at this frequency is expensive and consumes more power. The band-sampling method is a cheaper, more effective alternative: it allows an RF signal to be sampled at roughly twice the bandwidth of the signal. Unfortunately, as the other side of the coin, the introduction of the SDR concept and the band-sampling method has a negative influence on the performance of GNSS receivers. ADCs suffer larger sampling clock jitter generated by the FPGA, and a low sampling frequency introduces more noise into the receiver, so the influence of sampling noise cannot be neglected. The paper analyzes the sampling noise, presents its influence on the carrier-to-noise ratio, and derives the ranging error by calculating the synchronization error of the delay-locked loop. Simulations addressing each impact factor of sampling-noise-induced ranging error are performed. Simulation and experiment results show that if the target ranging accuracy is at the centimetre level, the quantization length should be no less than 8 bits and the sampling clock jitter should not exceed 30 ps.
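
    The trade-off between quantization length and sampling-clock jitter discussed above can be illustrated with the standard textbook SNR limits (an illustrative sketch, not the simulation model of the record; the intermediate frequency is an assumed value):

      # Back-of-the-envelope sketch of ADC SNR limits versus quantization length and
      # sampling-clock jitter (textbook relations; not the record's simulation).
      import math

      def snr_quantization_db(n_bits):
          """Ideal quantization-limited SNR for an N-bit ADC (full-scale sine input)."""
          return 6.02 * n_bits + 1.76

      def snr_jitter_db(f_in_hz, jitter_s):
          """Jitter-limited SNR for input frequency f_in and RMS clock jitter."""
          return -20.0 * math.log10(2.0 * math.pi * f_in_hz * jitter_s)

      if __name__ == "__main__":
          f_if = 50e6            # assumed intermediate frequency after band sampling
          for bits in (4, 8, 12):
              for jitter in (10e-12, 30e-12, 100e-12):
                  # combine the two limits by adding the corresponding noise powers
                  combined = -10 * math.log10(
                      10 ** (-snr_quantization_db(bits) / 10)
                      + 10 ** (-snr_jitter_db(f_if, jitter) / 10))
                  print(f"{bits:2d} bits, {jitter*1e12:5.0f} ps jitter -> "
                        f"combined SNR ~ {combined:5.1f} dB")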

  11. Enabling Diverse Software Stacks on Supercomputers using High Performance Virtual Clusters.

    Energy Technology Data Exchange (ETDEWEB)

    Younge, Andrew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brightwell, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    While large-scale simulations have been the hallmark of the High Performance Computing (HPC) community for decades, Large Scale Data Analytics (LSDA) workloads are gaining attention within the scientific community not only as a processing component to large HPC simulations, but also as standalone scientific tools for knowledge discovery. With the path towards Exascale, new HPC runtime systems are also emerging in a way that differs from classical distributed computing models. However, system software for such capabilities on the latest extreme-scale DOE supercomputers needs to be enhanced to more appropriately support these types of emerging software ecosystems. In this paper, we propose the use of Virtual Clusters on advanced supercomputing resources to enable systems to support not only HPC workloads, but also emerging big data stacks. Specifically, we have deployed the KVM hypervisor within Cray's Compute Node Linux on an XC-series supercomputer testbed. We also use libvirt and QEMU to manage and provision VMs directly on compute nodes, leveraging Ethernet-over-Aries network emulation. To our knowledge, this is the first known use of KVM on a true MPP supercomputer. We investigate the overhead of our solution using HPC benchmarks, evaluating both single-node performance and weak scaling of a 32-node virtual cluster. Overall, we find that single-node performance of our solution using KVM on a Cray is very efficient, with near-native performance. However, overhead increases by up to 20% as the virtual cluster size increases, due to limitations of the Ethernet-over-Aries bridged network. Furthermore, we deploy Apache Spark with large data analysis workloads in a Virtual Cluster, effectively demonstrating how diverse software ecosystems can be supported by High Performance Virtual Clusters.

  12. Perprof-py: A Python Package for Performance Profile of Mathematical Optimization Software

    Directory of Open Access Journals (Sweden)

    Abel Soares Siqueira

    2016-04-01

    Full Text Available A very important area of research in the field of Mathematical Optimization is the benchmarking of optimization packages to compare solvers. During benchmarking, one usually collects a large amount of information like CPU time, number of function evaluations, number of iterations, and much more. This information, if presented as tables, can be difficult to analyze and compare due to the large amount of data. Therefore, tools to better process and understand optimization benchmark data have been developed. One of the most widespread tools is the performance profile graphic proposed by Dolan and Moré [2]. In this context, this paper describes perprof-py, a free/open source software package that creates 'Performance Profile' graphics. This software produces graphics in PDF using LaTeX with the PGF/TikZ [22] and PGFPLOTS [4] packages, in PNG using matplotlib [9], and in HTML using Bokeh [1]. Perprof-py can also be easily extended to be used with other plotting libraries. It is implemented in Python 3 with support for internationalization, and is under the General Public License Version 3 (GPLv3).
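
    The underlying Dolan and Moré construction that perprof-py plots can be sketched as follows (an illustrative re-implementation, not perprof-py itself; the solver timing data below are synthetic):

      # Minimal sketch of a Dolan-Moré performance profile (illustrative only).
      import numpy as np
      import matplotlib.pyplot as plt

      def performance_profile(costs, tau):
          """costs: (n_problems, n_solvers) array of e.g. CPU times.
          Returns rho[s](tau): the fraction of problems solver s solves within a
          factor tau of the best solver on each problem."""
          best = costs.min(axis=1, keepdims=True)          # best cost per problem
          ratios = costs / best                            # performance ratios r_{p,s}
          return (ratios[:, :, None] <= tau[None, None, :]).mean(axis=0)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          times = rng.lognormal(mean=0.0, sigma=1.0, size=(50, 3))  # 50 problems, 3 solvers
          tau = np.linspace(1, 10, 200)
          rho = performance_profile(times, tau)
          for s in range(rho.shape[0]):
              plt.step(tau, rho[s], where="post", label=f"solver {s}")
          plt.xlabel("performance ratio tau")
          plt.ylabel("fraction of problems solved")
          plt.legend()
          plt.show()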

  13. Performance of Object-Oriented Real-Time Control and Acquisition Software

    Science.gov (United States)

    Collins, Andrew

    2010-11-01

    The dead-time of the Object-oriented Real-time Control and Acquisition data acquisition software, orca, was quantitatively determined for a VME-based system utilizing a single, peak-sensing CAEN 785N analog-to-digital converter and two scaler modules. A single-board computer in the VME crate controls rapid read-out of the modules and the data are then transferred via TCP/IP to the orca control program, running on MacOSX, where the data can be filtered based on desired criteria, saved in an open format, and displayed on-line in histograms. A graphical interface allows the system to be configured via a ``drag and drop'' method. The performance tests were performed on orca and two other data acquisition systems used at Triangle Universities Nuclear Laboratory, coda and SpecTcl, to compare the systems' data collection capabilities and determine whether the new system is a worthy competitor of the existing systems.

  14. Comparing On-Orbit and Ground Performance for an S-Band Software-Defined Radio

    Science.gov (United States)

    Chelmins, David T.; Welch, Bryan W.

    2014-01-01

    NASA's Space Communications and Navigation Testbed was installed on an external truss of the International Space Station in 2012. The testbed contains several software-defined radios (SDRs), including the Jet Propulsion Laboratory (JPL) SDR, which underwent performance testing throughout 2013 with NASA's Tracking and Data Relay Satellite System (TDRSS). On-orbit testing of the JPL SDR was conducted at S-band with the Glenn Goddard TDRSS waveform and compared against an extensive dataset collected on the ground prior to launch. This paper will focus on the development of a waveform power estimator on the ground post-launch and discuss the performance challenges associated with operating the power estimator in space.

  15. Performance evaluation of automated segmentation software on optical coherence tomography volume data.

    Science.gov (United States)

    Tian, Jing; Varga, Boglarka; Tatrai, Erika; Fanni, Palya; Somfai, Gabor Mark; Smiddy, William E; Debuc, Delia Cabrera

    2016-05-01

    Over the past two decades a significant number of OCT segmentation approaches have been proposed in the literature. Each methodology has been conceived for and/or evaluated using specific datasets that do not reflect the complexities of the majority of widely available retinal features observed in clinical settings. In addition, there does not exist an appropriate OCT dataset with ground truth that reflects the realities of everyday retinal features observed in clinical settings. While the need for unbiased performance evaluation of automated segmentation algorithms is obvious, the validation of segmentation algorithms has usually been performed by comparison with manual labelings from each individual study, and a common ground truth has been lacking. Therefore, a performance comparison of different algorithms using the same ground truth has never been performed. This paper reviews research-oriented tools for automated segmentation of the retinal tissue on OCT images. It also evaluates and compares the performance of these software tools against a common ground truth.
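
    One common way to score an automated segmentation against a ground-truth labeling of the kind discussed above is an overlap metric such as the Dice coefficient; the sketch below assumes binary masks and is not the evaluation protocol of any particular reviewed tool:

      # Minimal sketch of comparing an automated segmentation with a ground-truth mask
      # using the Dice similarity coefficient (binary masks assumed; illustrative only).
      import numpy as np

      def dice_coefficient(pred, truth):
          """Dice = 2|A∩B| / (|A|+|B|) for boolean masks pred and truth."""
          pred = pred.astype(bool)
          truth = truth.astype(bool)
          intersection = np.logical_and(pred, truth).sum()
          denom = pred.sum() + truth.sum()
          return 2.0 * intersection / denom if denom else 1.0

      if __name__ == "__main__":
          truth = np.zeros((256, 256), dtype=bool)
          truth[100:150, 80:200] = True                 # "true" retinal layer region
          pred = np.zeros_like(truth)
          pred[105:155, 85:205] = True                  # automated segmentation result
          print(f"Dice = {dice_coefficient(pred, truth):.3f}")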

  16. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  17. A software package for evaluating the performance of a star sensor operation

    Science.gov (United States)

    Sarpotdar, Mayuresh; Mathew, Joice; Sreejith, A. G.; K., Nirmal; Ambily, S.; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant

    2017-01-01

    We have developed a low-cost off-the-shelf component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from star patterns it records in the images. The star sensor implements a centroiding algorithm to find the centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe the software package to evaluate the performance of these algorithms as a single star sensor operating system. We simulate the ideal case, where sky background and instrument errors are omitted, and a more realistic case, where noise and camera parameters are added to the simulated images. We evaluate such performance parameters of the algorithms as attitude accuracy, calculation time, required memory, star catalog size, sky coverage, etc., and estimate the errors introduced by each algorithm. This software package is written for use in MATLAB. The testing is parametrized for different hardware parameters, such as the focal length of the imaging setup, the field of view (FOV) of the camera, angle measurement accuracy, distortion effects, etc., and can therefore be applied to evaluate the performance of such algorithms in any star sensor. For its hardware implementation on our StarSense, we are currently porting the code in the form of functions written in C. This is done keeping in view its easy implementation on any star sensor electronics hardware.
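
    The centroiding step mentioned above amounts to an intensity-weighted centre of mass over thresholded star images; a minimal sketch of that single step (in Python rather than the authors' MATLAB package, with an assumed detection threshold) is:

      # Minimal sketch of star centroiding by intensity-weighted centre of mass
      # (illustrative; the record's package also includes pattern identification and
      # attitude determination, which are not shown here).
      import numpy as np
      from scipy import ndimage

      def find_star_centroids(image, threshold):
          """Return (row, col) centroids of connected bright regions above threshold."""
          mask = image > threshold
          labels, n = ndimage.label(mask)
          return ndimage.center_of_mass(image, labels, list(range(1, n + 1)))

      if __name__ == "__main__":
          img = np.random.normal(10, 1, (64, 64))   # background noise
          img[20:23, 40:43] += 200                  # a bright "star"
          print(find_star_centroids(img, threshold=50))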

  18. A software package for evaluating the performance of a star sensor operation

    Science.gov (United States)

    Sarpotdar, Mayuresh; Mathew, Joice; Sreejith, A. G.; Nirmal, K.; Ambily, S.; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant

    2017-02-01

    We have developed a low-cost off-the-shelf component star sensor ( StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from star patterns it records in the images. The star sensor implements a centroiding algorithm to find centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe the software package to evaluate the performance of these algorithms as a star sensor single operating system. We simulate the ideal case where sky background and instrument errors are omitted, and a more realistic case where noise and camera parameters are added to the simulated images. We evaluate such performance parameters of the algorithms as attitude accuracy, calculation time, required memory, star catalog size, sky coverage, etc., and estimate the errors introduced by each algorithm. This software package is written for use in MATLAB. The testing is parametrized for different hardware parameters, such as the focal length of the imaging setup, the field of view (FOV) of the camera, angle measurement accuracy, distortion effects, etc., and therefore, can be applied to evaluate the performance of such algorithms in any star sensor. For its hardware implementation on our StarSense, we are currently porting the codes in form of functions written in C. This is done keeping in view its easy implementation on any star sensor electronics hardware.

  19. High-Quality Random Number Generation Software for High-Performance Computing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Random number (RN) generation is the key software component that permits random sampling. Software for parallel RN generation (RNG) should be based on RNGs that are...

  20. Parameter definition using vibration prediction software leads to significant drilling performance improvements

    Energy Technology Data Exchange (ETDEWEB)

    Amorim, Dalmo; Hanley, Chris Hanley; Fonseca, Isaac; Santos, Juliana [National Oilwell Varco, Houston TX (United States); Leite, Daltro J.; Borella, Augusto; Gozzi, Danilo [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    The understanding and mitigation of downhole vibration has been a heavily researched subject in the oil industry because it results in more expensive drilling operations: vibrations significantly diminish the amount of effective drilling energy available to the bit and generate forces that can push the bit or the Bottom Hole Assembly (BHA) off its concentric axis of rotation, producing high-magnitude impacts with the borehole wall. In order to drill ahead, a sufficient amount of energy must be supplied by the rig to overcome the resistance of the drilling system, including the reactive torque of the system, drag forces, fluid pressure losses and energy dissipated by downhole vibrations, thereby providing the bit with the energy required to fail the rock. If the drill string enters resonant modes of vibration, not only does it decrease the amount of available energy to drill, but it also increases the potential for catastrophic downhole equipment and drilling bit failures. In this sense, the mitigation of downhole vibrations will result in faster, smoother, and cheaper drilling operations. A software tool using Finite Element Analysis (FEA) has been developed to provide a better understanding of downhole vibration phenomena in drilling environments. The software tool calculates the response of the drilling system at various input conditions, based on the design of the wellbore along with the geometry of the Bottom Hole Assembly (BHA) and the drill string. It identifies where undesired levels of resonant vibration will be driven by certain combinations of specific drilling parameters, and also which combinations of drilling parameters will result in lower levels of vibration, so that the fewest shocks, the highest penetration rate and the lowest cost per foot can be achieved. With the growing performance of personal computers, complex software systems modeling drilling vibrations using FEA have become accessible to a wider audience of field users, further complemented with real time

  1. ATLAS High Level Calorimeter Trigger Software Performance for Cosmic Ray Events

    CERN Document Server

    Oliveira Damazio, Denis; The ATLAS collaboration

    2009-01-01

    The ATLAS detector is undergoing an intense commissioning effort with cosmic rays in preparation for the first LHC collisions next spring. Combined runs with all of the ATLAS subsystems are being taken in order to evaluate the detector performance. This is also a unique opportunity for the trigger system to be studied with different detector operation modes, such as different event rates and detector configurations. The ATLAS trigger starts with a hardware-based system which tries to identify detector regions where interesting physics objects may be found (e.g. large energy depositions in the calorimeter system). An accepted event will be further processed by more complex software algorithms at the second level, where detailed features are extracted (full-granularity detector data for small portions of the detector are available). Events accepted at this level will be further processed at the so-called event filter level. Full detector data at full granularity are available for offline-like processing with complete calib...

  2. Study of the performance of the data acquisition chain for BCM1F software upgrade

    Energy Technology Data Exchange (ETDEWEB)

    Hempel, Maria

    2011-05-15

    BCM1F, the Fast Beam Conditions Monitor, is a sub-detector of the CMS experiment at LHC. It monitors the beam halo and the collision product rates inside the CMS experiment. The data acquisition of BCM1F is independent from CMS. Major components of the BCM1F back-end are discriminators, ADCs, TDCs, look-up tables and a Veto module. In the thesis the performance of several components is investigated. For the TDC two different readout modes are compared, and the impact of a Ring Buffer in the readout software was investigated. For one discriminator the thresholds of all channels are investigated and offsets of about 10 mV are found. Data taken in the LHC runs with the TDC are presented and discussed. Also the application of BCM1F as a luminosity monitor is studied. (orig.)

  3. Estimation of Characteristics of a Software Team for Implementing Effective Inspection Process through Inspection Performance Metric

    CERN Document Server

    Nair, T R Gopalakrishnan

    2011-01-01

    The continued existence of any software industry depends on its capability to develop nearly zero-defect products, which is achievable through effective defect management. Inspection has proven to be one of the most promising techniques of defect management. The introduction of metrics like Depth of Inspection (DI, a process metric) and the Inspection Performance Metric (IPM, a people metric) enables an appropriate measurement of the inspection technique. This article elucidates a mathematical approach to estimate the IPM value without depending on a shop-floor defect count every time. By applying a multiple linear regression model, a set of characteristic coefficients of the team is evaluated. These coefficients are calculated from empirical projects that are sampled from teams in product-based and service-based IT industries. A sample of three verification projects indicates a close match between the IPM values obtained from the defect count (IPMdc) and the IPM values obtained using the team coefficients usi...
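
    The team-coefficient idea described above can be illustrated with a minimal multiple-linear-regression sketch (the feature set, data values and variable names below are hypothetical and not taken from the article):

      # Minimal sketch of fitting team characteristic coefficients by multiple linear
      # regression, so that an IPM value can be estimated without a fresh defect count.
      import numpy as np

      # Hypothetical per-project features (not the article's data): preparation effort
      # [person-hours], inspection rate [LOC/hour], team size.
      X = np.array([[12.0, 150.0, 4.0],
                    [ 8.0, 220.0, 3.0],
                    [15.0, 130.0, 5.0],
                    [10.0, 180.0, 4.0],
                    [ 9.0, 200.0, 3.0]])
      ipm_dc = np.array([0.71, 0.52, 0.80, 0.63, 0.55])    # IPM from actual defect counts

      A = np.hstack([X, np.ones((X.shape[0], 1))])          # design matrix with intercept
      coeffs, *_ = np.linalg.lstsq(A, ipm_dc, rcond=None)   # team characteristic coefficients

      # Estimate IPM for a new project without counting defects first.
      new_project = np.array([11.0, 170.0, 4.0, 1.0])
      print("estimated IPM:", float(new_project @ coeffs))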

  4. Automated load balancing in the ATLAS high-performance storage software

    CERN Document Server

    Le Goff, Fabrice; The ATLAS collaboration

    2017-01-01

    The ATLAS experiment collects proton-proton collision events delivered by the LHC accelerator at CERN. The ATLAS Trigger and Data Acquisition (TDAQ) system selects, transports and eventually records event data from the detector at several gigabytes per second. The data are recorded on transient storage before being delivered to permanent storage. The transient storage consists of high-performance direct-attached storage servers accounting for about 500 hard drives. The transient storage operates dedicated software in the form of a distributed multi-threaded application. The workload includes both CPU-demanding and IO-oriented tasks. This paper presents the original application threading model for this particular workload, discussing the load-sharing strategy among the available CPU cores. The limitations of this strategy were reached in 2016 due to changes in the trigger configuration involving a new data distribution pattern. We then describe a novel data-driven load-sharing strategy, designed to automatical...

  5. GTE: a new software for gravitational terrain effect computation: theory and performances

    Science.gov (United States)

    Sampietro, D.; Capponi, M.; Triglione, D.; Mansi, A. H.; Marchetti, P.; Sansò, F.

    2016-07-01

    The computation of the vertical attraction due to the topographic masses, the so-called terrain correction, is a fundamental step in geodetic and geophysical applications: it is required in high-precision geoid estimation by means of the remove-restore technique, and it is used to isolate the gravitational effect of anomalous masses in geophysical exploration. The increasing resolution of recently developed digital terrain models, the increasing number of observation points due to the extensive use of airborne gravimetry in geophysical exploration, and the increasing accuracy of gravity data nowadays represent major issues for terrain correction computation. Classical methods such as prism or point-mass approximations are indeed too slow, while Fourier-based techniques are usually too approximate for the required accuracy. In this work a new software package, called Gravity Terrain Effects (GTE), developed to guarantee high accuracy and fast computation of terrain corrections, is presented. GTE has been designed expressly for geophysical applications, allowing the computation not only of the effect of topographic and bathymetric masses but also of those due to sedimentary layers or to the Earth crust-mantle discontinuity (the so-called Moho). In the present contribution, after recalling the main classical algorithms for the computation of the terrain correction, we summarize the basic theory of the software and its practical implementation. Some tests to prove its performance are also described, showing GTE's capability to compute highly accurate terrain corrections in a very short time: computation times obtained for a real airborne survey with GTE range between a few hours and a few minutes, according to the GTE profile used, with differences with respect to both planar and spherical computations (performed by prisms and tesseroids, respectively) of the order of 0.02 mGal even when using the fastest profiles.
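
    For orientation, the classical point-mass approximation that GTE is compared against can be sketched as follows (an illustrative toy computation with an assumed rock density and a synthetic DEM, not the GTE algorithm):

      # Minimal sketch of the classical point-mass approximation for terrain effects:
      # each DEM cell is replaced by a point mass at the centre of its topographic
      # column. Illustrative only; GTE itself uses prisms/tesseroids and is much faster.
      import numpy as np

      G = 6.674e-11    # gravitational constant [m^3 kg^-1 s^-2]
      RHO = 2670.0     # assumed rock density [kg m^-3]

      def terrain_effect_point_mass(xp, yp, zp, dem_x, dem_y, dem_h, cell_area):
          """Downward vertical attraction [mGal] at (xp, yp, zp) from topography dem_h."""
          masses = RHO * cell_area * dem_h                # mass of each topographic column
          zc = dem_h / 2.0                                # height of each column centre
          dx, dy, dz = dem_x - xp, dem_y - yp, zp - zc    # dz > 0: mass below the observer
          r = np.sqrt(dx**2 + dy**2 + dz**2)
          gz = G * np.sum(masses * dz / r**3)             # vertical component [m s^-2]
          return gz * 1e5                                 # 1 m s^-2 = 1e5 mGal

      if __name__ == "__main__":
          x, y = np.meshgrid(np.arange(0.0, 5000.0, 100.0), np.arange(0.0, 5000.0, 100.0))
          h = 200.0 + 100.0 * np.exp(-((x - 2500.0)**2 + (y - 2500.0)**2) / 1.0e6)  # a hill
          print(terrain_effect_point_mass(2500.0, 2500.0, 400.0, x, y, h, 100.0 * 100.0))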

  6. A versatile simulation software for performance analysis of DIAL system for the detection of toxic agents

    Science.gov (United States)

    Jindal, Mukesh K.; Veerabuthiran, S.; Dudeja, Jai Paul; Dubey, Deepak K.

    2006-12-01

    Simulation studies have been carried out to analyze the performance of a Differential Absorption Lidar (DIAL) system for the remote detection of a large variety of toxic agents in the 2-5 μm and 9-11 μm spectral bands. Stand-alone Graphical User Interface (GUI) software has been developed on the MATLAB platform to perform the simulation operations. It takes various system inputs from the user and computes the required laser energy to be transmitted, the backscattered signal strengths, the signal-to-noise ratio and the minimum detectable concentrations for various agents at different ranges for the given system parameters. It has the flexibility of varying any of the system parameters in order to provide inputs for the required design of the proposed DIAL system. This software has the advantage of optimizing system parameters in the design of a lidar system. As a case study, a DIAL system with the specified pulse energy of an OPO-based laser transmitter (2-5 μm) and a TEA CO2 laser transmitter (9-11 μm) has been considered. The proposed system further consists of a 500-mm diameter Newtonian telescope, a 0.5-mm diameter detector and a 10-MHz digitizer. A toxic agent cloud with a given thickness and concentration has been assumed to be detected under ambient atmospheric conditions at various ranges between 0.2 and 5 km. For a given set of system parameters, the required energy of the laser transmitter, the power levels of the return signals, the signal-to-noise ratio and the minimum detectable concentrations at different ranges have been calculated for each of these toxic agents.
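
    The core retrieval that such a DIAL simulation exercises is the two-wavelength ratio of range-binned return powers; a minimal sketch (not the MATLAB GUI tool of the record, with an assumed cross section and range resolution) is:

      # Minimal sketch of the standard DIAL number-density retrieval from simulated
      # on/off-resonance return powers (illustrative only).
      import numpy as np

      def dial_number_density(p_on, p_off, delta_sigma_cm2, delta_r_m):
          """Range-resolved absorber number density [molecules cm^-3].

          p_on, p_off     : return powers per range bin at on/off wavelengths
          delta_sigma_cm2 : differential absorption cross section [cm^2]
          delta_r_m       : range-bin width [m]
          """
          ratio = (p_off[1:] * p_on[:-1]) / (p_on[1:] * p_off[:-1])
          delta_r_cm = delta_r_m * 100.0
          return np.log(ratio) / (2.0 * delta_sigma_cm2 * delta_r_cm)

      if __name__ == "__main__":
          r = np.arange(200.0, 5000.0, 30.0)                 # range bins [m]
          true_n = np.full(r.size, 1.0e12)                   # molecules cm^-3 (example)
          sigma = 1.0e-18                                    # cm^2 (assumed value)
          tau = np.cumsum(true_n * sigma * 30.0 * 100.0)     # one-way optical depth
          p_off = 1.0 / r**2                                 # geometric 1/R^2 falloff only
          p_on = np.exp(-2.0 * tau) / r**2                   # plus two-way absorption
          print(dial_number_density(p_on, p_off, sigma, 30.0)[:3])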

  7. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    Science.gov (United States)

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the system performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to agree well with the extensive experimental investigations, which were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d = 0.981) reflect a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
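
    The two goodness-of-fit measures quoted above can be computed as in the following sketch (formulas taken from the general literature; the sample data are invented and the exact definitions used by PWWT.VB are not given in the abstract):

      # Minimal sketch of relative error and Willmott's index of agreement d
      # (illustrative only; data values are made up).
      import numpy as np

      def relative_error(pred, obs):
          """Mean absolute relative error between predictions and observations."""
          pred, obs = np.asarray(pred, float), np.asarray(obs, float)
          return np.mean(np.abs(pred - obs) / np.abs(obs))

      def willmott_d(pred, obs):
          """Willmott's index of agreement (1 = perfect agreement)."""
          pred, obs = np.asarray(pred, float), np.asarray(obs, float)
          obar = obs.mean()
          return 1.0 - np.sum((pred - obs) ** 2) / np.sum(
              (np.abs(pred - obar) + np.abs(obs - obar)) ** 2)

      if __name__ == "__main__":
          obs = np.array([0.82, 0.75, 0.64, 0.58, 0.49])    # e.g. measured removal fractions
          pred = np.array([0.80, 0.77, 0.66, 0.56, 0.50])   # model-predicted values
          print(f"RE = {relative_error(pred, obs):.3f}, d = {willmott_d(pred, obs):.3f}")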

  8. Performance of Cost Assessment on Reusable Components for Software Development using Genetic Programming

    Directory of Open Access Journals (Sweden)

    T.Tejaswini

    2015-08-01

    Full Text Available Reusability is the quality of a piece of software which enables it to be used again, be it partial, modified or complete. A wide range of modeling techniques have been proposed and applied for software quality prediction. Complexity and size metrics have been used to predict the number of defects in software components. Estimation of cost is important during the process of software development. There are two main types of cost estimation approaches: algorithmic methods and non-algorithmic methods. In this work, using genetic programming, which is a branch of evolutionary algorithms, a new algorithmic method is presented for software development cost estimation. Using the implementation of this method, new formulas were obtained for software development cost estimation in which the reusability of components is given priority. After evaluation of these formulas, the mean and standard deviation of the magnitude of relative error are better than those of related algorithmic methods such as the COCOMO formulas.
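
    The evaluation measure named above, the magnitude of relative error, can be summarised as in the following sketch (illustrative only; the GP-derived cost formulas themselves are not reproduced here and the effort values are invented):

      # Minimal sketch of the magnitude of relative error (MRE) of a cost-estimation
      # model, summarised by its mean and standard deviation (illustrative data).
      import numpy as np

      def mre(actual, estimated):
          """Magnitude of relative error per project."""
          actual, estimated = np.asarray(actual, float), np.asarray(estimated, float)
          return np.abs(actual - estimated) / actual

      if __name__ == "__main__":
          actual_effort = np.array([120.0, 300.0, 85.0, 540.0])      # person-months (example)
          estimated_effort = np.array([110.0, 330.0, 95.0, 500.0])   # from some cost model
          errors = mre(actual_effort, estimated_effort)
          print(f"MMRE = {errors.mean():.3f}, std = {errors.std(ddof=1):.3f}")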

  9. A software package for evaluating the performance of a star sensor operation

    CERN Document Server

    Sarpotdar, Mayuresh; Sreejith, A G; Nirmal, K; Ambily, S; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant

    2016-01-01

    We have developed a low-cost off-the-shelf component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from star patterns it records in the images. The star sensor implements a centroiding algorithm to find centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe the software package to evaluate the performance of these algorithms as a star sensor single operating system. We simulate the ideal case where sky background and instrument errors are omitted, and a more realistic case where noise and camera parameters are added to the simulated images. We evaluate such performance parameters of the algorithms as attitude accuracy, calculation time, required memory, star catalog size, sky coverage, etc., and estimate the errors introduced by each...

  10. PaRSEC: A Software Framework for Performance and Productivity on Hybrid, Manycore Platforms

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, Jack [Univ. of Tennessee, Knoxville, TN (United States)

    2016-06-30

    As the era of computer architectures dominated by serial processors ends, the convergence of several unprecedented challenges suggests that closing the longstanding "application–architecture performance gap" will become more challenging than ever. To address this problem, the Parallel Runtime Scheduling and Execution Control (PaRSEC) project created a modular software framework that achieved two major objectives: first, it built a task-based runtime capable of delivering portable performance to a wide range of science and engineering applications at all levels of the platform pyramid, including the upcoming 100 Pflop/s systems and then exascale; and second, it supported and facilitated the work of developers in migrating their legacy codes and writing entirely new ones for the emerging hybrid and massively parallel manycore processor system designs. PaRSEC will support multiple domain-specific languages capable of increasing the developers' productivity while also providing the runtime with the constructs and flexibility necessary to exploit the maximal parallelism from parallel applications. Extensive preliminary research in dense linear algebra showed convincingly that a parameterized task graph representation that symbolically describes the algorithm content can achieve the project's twofold objective within that domain. The research also strongly suggested that this powerful method could be generalized to a far-wider variety of applications.

  11. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    Energy Technology Data Exchange (ETDEWEB)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing ‘blinds’, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those

  12. On the Performance of Fault Screeners in Software Development and Deployment

    NARCIS (Netherlands)

    Abreu, R.F.; Gonzalez, A.; Zoeteweij, P.; Van Gemund, A.J.C.

    2008-01-01

    Preprint of paper published in: ENASE 2008 - Proceedings of the 3rd International Conference on Evaluation of Novel Approaches to Software Engineering, 4-7 May 2008 Fault screeners are simple software (or hardware) constructs that detect variable value errors based on unary invariant checking. In

  13. Performance Test of Openflow Agent on Openflow Software-Based Mikrotik RB750 Switch

    Directory of Open Access Journals (Sweden)

    Rikie Kartadie

    2016-11-01

    Full Text Available A network is usually built from several devices such as routers, switches, etc. Every device forwards and manipulates data packets using complicated protocols implemented in its hardware. An operator is responsible for the configuration, managing both the rules and the applications applied in the network. Human error may occur when device configuration is performed manually by the operator. Some well-known vendors, one of them being MikroTik, have also been implementing OpenFlow in their products, providing an implementation of the SDN/OpenFlow architecture at affordable cost. The second-phase research results showed that the OF software-based MikroTik switch produced a higher latency value than both mininet and the OF software-based OpenWRT switch. The average gap value of the OF software-based MikroTik switch is 2012 kbps lower than the value of the OF software-based OpenWRT switch. The average UDP-throughput bandwidth gap of the OF software-based MikroTik switch is 3.6176 kBps lower than that of the OF software-based OpenWRT switch and 8.68 kBps lower than that of mininet. The average UDP-throughput jitter gap of the OF software-based MikroTik switch is 0.0103 ms lower than that of the OF software-based OpenWRT switch and 0.0093 ms lower than that of mininet.

  14. On the Performance of Fault Screeners in Software Development and Deployment

    NARCIS (Netherlands)

    Abreu, R.F.; Gonzalez, A.; Zoeteweij, P.; Van Gemund, A.J.C.

    2008-01-01

    Preprint of paper published in: ENASE 2008 - Proceedings of the 3rd International Conference on Evaluation of Novel Approaches to Software Engineering, 4-7 May 2008 Fault screeners are simple software (or hardware) constructs that detect variable value errors based on unary invariant checking. In

  15. Performance Test of Openflow Agent on Openflow Software-Based Mikrotik RB750 Switch

    Directory of Open Access Journals (Sweden)

    Rikie Kartadie

    2016-11-01

    Full Text Available A network is usually built from several devices such as routers, switches, etc. Every device forwards and manipulates data packets using complicated protocols implemented in its hardware. An operator is responsible for the configuration, managing both the rules and the applications applied in the network. Human error may occur when device configuration is performed manually by the operator. Some well-known vendors, one of them being MikroTik, have also been implementing OpenFlow in their products, providing an implementation of the SDN/OpenFlow architecture at affordable cost. The second-phase research results showed that the OF software-based MikroTik switch produced a higher latency value than both mininet and the OF software-based OpenWRT switch. The average gap value of the OF software-based MikroTik switch is 2012 kbps lower than the value of the OF software-based OpenWRT switch. The average UDP-throughput bandwidth gap of the OF software-based MikroTik switch is 3.6176 kBps lower than that of the OF software-based OpenWRT switch and 8.68 kBps lower than that of mininet. The average UDP-throughput jitter gap of the OF software-based MikroTik switch is 0.0103 ms lower than that of the OF software-based OpenWRT switch and 0.0093 ms lower than that of mininet.

  16. Investigation of isoprene oxidation in the atmosphere simulation chamber SAPHIR at low NO concentrations

    Science.gov (United States)

    Fuchs, H.; Rohrer, F.; Hofzumahaus, A.; Bohn, B.; Brauers, T.; Dorn, H.; Häseler, R.; Holland, F.; Li, X.; Lu, K.; Nehr, S.; Tillmann, R.; Wahner, A.

    2012-12-01

    During recent field campaigns, hydroxyl radical (OH) concentrations that were measured by laser-induced fluorescence spectroscopy (LIF) were up to a factor of ten larger than predicted by current chemical models for conditions of high OH reactivity and low nitrogen monoxide (NO) concentrations. These discrepancies were observed in the Pearl-River-Delta, China, which is an urban-influenced rural area, in rainforests, and forested areas in North America and Europe. Isoprene contributed significantly to the total OH reactivity in these field studies, so that potential explanations for the missing OH focused on new reaction pathways in the isoprene degradation mechanism. These pathways regenerate OH without oxidation of NO and thus without ozone production. In summer 2011, a series of experiments was carried out in the atmosphere simulation chamber SAPHIR in Juelich, Germany, in order to investigate the photochemical degradation of isoprene at low NO concentrations (NOSAPHIR by established chemical models like the Master Chemical Mechanism (MCM). Moreover, OH concentration measurements of two independent instruments (LIF and DOAS) agreed during all chamber experiments. Here, we present the results of the experiments and compare measurements with model predictions using the MCM. Furthermore, the validity of newly proposed reaction pathways in the isoprene degradation is evaluated by comparison with observations.

  17. Total OH reactivity study from VOC photochemical oxidation in the SAPHIR chamber

    Science.gov (United States)

    Yu, Z.; Tillmann, R.; Hohaus, T.; Fuchs, H.; Novelli, A.; Wegener, R.; Kaminski, M.; Schmitt, S. H.; Wahner, A.; Kiendler-Scharr, A.

    2015-12-01

    It is well known that hydroxyl radicals (OH) act as the dominant reactive species in the degradation of VOCs in the atmosphere. In recent field studies, directly measured total OH reactivity often showed poor agreement with the OH reactivity calculated from VOC measurements (e.g. Nölscher et al., 2013; Lu et al., 2012a). This "missing OH reactivity" is attributed to unaccounted-for biogenic VOC emissions and/or oxidation products. The comparison of total OH reactivity measured directly and calculated from single-component measurements of VOCs and their oxidation products gives us a further understanding of the source of unmeasured reactive species in the atmosphere. It also allows the determination of the magnitude of the contribution of primary VOC emissions and their oxidation products to the missing OH reactivity. A series of experiments was carried out in the atmosphere simulation chamber SAPHIR in Jülich, Germany, to explore in detail the photochemical degradation of VOCs (isoprene, ß-pinene, limonene, and D6-benzene) by OH. The total OH reactivity was determined from the measurement of VOCs and their oxidation products by a Proton Transfer Reaction Time of Flight Mass Spectrometer (PTR-TOF-MS) with a GC/MS/FID system, and was directly measured at the same time by laser-induced fluorescence (LIF). The comparison between these two total OH reactivity measurements showed an increase of missing OH reactivity in the presence of oxidation products of VOCs, indicating a strong contribution to the missing OH reactivity from uncharacterized oxidation products.
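
    The 'calculated' total OH reactivity referred to above is the sum of the OH rate constants of the measured species weighted by their concentrations; a minimal sketch (with approximate, assumed room-temperature rate constants) is:

      # Minimal sketch of the "calculated" total OH reactivity used for comparison
      # with the directly measured value: k_OH = sum_i k_i * [VOC_i].
      N_AIR = 2.46e19   # molecules cm^-3 at ~298 K, 1 atm

      # Approximate OH rate constants [cm^3 molecule^-1 s^-1]; illustrative values only.
      K_OH = {"isoprene": 1.0e-10, "beta-pinene": 7.8e-11, "limonene": 1.6e-10}

      def calculated_oh_reactivity(mixing_ratios_ppb):
          """Total OH reactivity [s^-1] from per-species mixing ratios in ppb."""
          total = 0.0
          for species, ppb in mixing_ratios_ppb.items():
              concentration = ppb * 1e-9 * N_AIR        # convert ppb to molecules cm^-3
              total += K_OH[species] * concentration
          return total

      if __name__ == "__main__":
          print(f"{calculated_oh_reactivity({'isoprene': 4.0, 'limonene': 1.0}):.1f} s^-1")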

  18. Gerasimov-Drell-Hearn Sum Rule and the Discrepancy between the New CLAS and SAPHIR Data

    CERN Document Server

    Mart, T

    2008-01-01

    Contribution of the K^+\\Lambda channel to the Gerasimov-Drell-Hearn (GDH) sum rule has been calculated by using the models that fit the recent SAPHIR or CLAS differential cross section data. It is shown that the two data sets yield quite different contributions. Contribution of this channel to the forward spin polarizability of the proton has been also calculated. It is also shown that the inclusion of the recent CLAS C_x and C_z data in the fitting data base does not significantly change the result of the present calculation. Results of the fit, however, reveal the role of the S_{11}(1650), P_{11}(1710), P_{13}(1720), and P_{13}(1900) resonances for the description of the C_x and C_z data. A brief discussion on the importance of these resonances is given. Measurements of the polarized total cross section \\sigma_{TT'} by the CLAS, LEPS, and MAMI collaborations are expected to verify this finding.
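
    For reference (quoted from the general literature, not from the record itself), the Gerasimov-Drell-Hearn sum rule for a spin-1/2 target such as the proton reads $\int_{\nu_{\rm thr}}^{\infty} [\sigma_{3/2}(\nu)-\sigma_{1/2}(\nu)]\, d\nu/\nu = 2\pi^2\alpha\,\kappa^2/M^2$, where $\sigma_{3/2}$ and $\sigma_{1/2}$ are the helicity-dependent total photoabsorption cross sections, $\kappa$ is the anomalous magnetic moment and $M$ is the target mass; the record evaluates the $K^+\Lambda$ photoproduction contribution to the integral on the left-hand side.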

  19. Impact of horizontal and vertical localization scales on microwave sounder SAPHIR radiance assimilation

    Science.gov (United States)

    Krishnamoorthy, C.; Balaji, C.

    2016-05-01

    In the present study, the effect of horizontal and vertical localization scales on the assimilation of direct SAPHIR radiances is studied. An Artificial Neural Network (ANN) has been used as a surrogate for the forward radiative transfer calculations. The training input dataset for the ANN consists of vertical layers of atmospheric pressure, temperature, relative humidity and other hydrometeor profiles, with the six channel Brightness Temperatures (BTs) as output. The best neural network architecture has been arrived at by a neuron independence study. Since vertical localization of radiance data requires weighting functions, an ANN has also been trained for this purpose. The radiances were ingested into the NWP model using the Ensemble Kalman Filter (EnKF) technique. The horizontal localization is taken care of by using a Gaussian localization function centered on the observed coordinates. Similarly, the vertical localization is accomplished by assuming a function which depends on the weighting function of the channel to be assimilated. The effect of both horizontal and vertical localization has been studied in terms of the ensemble spread in precipitation. Additionally, improvements in the 24 h forecast resulting from the assimilation are also reported.
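
    A minimal sketch of the Gaussian horizontal-localization idea described above, applied to a scalar-observation ensemble Kalman gain (illustrative only; the study's actual localization scales and vertical weighting are not reproduced):

      # Gaussian horizontal localization applied to an ensemble-estimated Kalman gain
      # for a single scalar observation (illustrative sketch).
      import numpy as np

      def gaussian_localization(dist_km, length_scale_km):
          """Weight in [0, 1] that damps covariances with distance from the observation."""
          return np.exp(-0.5 * (dist_km / length_scale_km) ** 2)

      def localized_kalman_gain(ens_states, ens_obs, obs_error_var, dist_km, L_km):
          """Scalar-observation EnKF gain with element-wise (Schur-product) localization."""
          x_pert = ens_states - ens_states.mean(axis=1, keepdims=True)   # (n_state, n_ens)
          y_pert = ens_obs - ens_obs.mean()                              # (n_ens,)
          n_ens = ens_obs.size
          cov_xy = x_pert @ y_pert / (n_ens - 1)                         # state-obs covariance
          var_y = y_pert @ y_pert / (n_ens - 1)                          # obs-space variance
          rho = gaussian_localization(dist_km, L_km)                     # per-grid-point weight
          return rho * cov_xy / (var_y + obs_error_var)

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          states = rng.normal(size=(100, 40))                  # 100 grid points, 40 members
          obs_equiv = states[50] + rng.normal(scale=0.1, size=40)
          distances = np.abs(np.arange(100) - 50) * 25.0       # km from the observation
          K = localized_kalman_gain(states, obs_equiv, 0.04, distances, L_km=200.0)
          print(K.shape, K[50], K[0])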

  20. LEMKEN Saphir-7 series seed drill

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The LEMKEN Saphir-7 series seed drill is a mechanically driven seeding machine designed to be used with large tractors. The machine is mounted at the rear of the tractor for seeding operations; the seeding depth is adjusted by means of the three-point hitch and a front-mounted control device, and the machine is suitable for large-area, shallow-tillage cereal sowing. Main features: it can optionally be fitted with single- or double-disc coulters as well as hoe-type coulters, giving good adaptability to different soils; it uses seed metering wheels and an oil-bath gearbox, allowing stepless adjustment of the seeding rate for accurate seed rates and seed savings; and it can be combined with a power harrow or cultivating implement to form a unit for combined, one-pass operations.

  1. Investigation of MACR oxidation by OH in the atmosphere simulation chamber SAPHIR at low NO concentrations.

    Science.gov (United States)

    Fuchs, Hendrik; Acir, Ismail-Hakki; Bohn, Birger; Brauers, Theo; Dorn, Hans-Peter; Häseler, Rolf; Hofzumahaus, Andreas; Holland, Frank; Li, Xin; Lu, Keding; Lutz, Anna; Kaminski, Martin; Nehr, Sascha; Rohrer, Franz; Tillmann, Ralf; Wegener, Robert; Wahner, Andreas

    2013-04-01

    During recent field campaigns, hydroxyl radical (OH) concentrations were up to a factor of ten larger than predicted by current chemical models for conditions of high OH reactivity and low nitrogen monoxide (NO) concentrations. These discrepancies were observed in forests, where isoprene oxidation turnover rates were large. Methacrolein (MACR) is one of the major first-generation products of isoprene oxidation, so that MACR was also an important reactant for OH. Here, we present a detailed investigation of the MACR oxidation mechanism including a full set of accurate and precise radical measurements in the atmosphere simulation chamber SAPHIR in Juelich, Germany. The conditions during the chamber experiments were comparable to those during field campaigns with respect to radical and trace gas concentrations. In particular, OH reactivity was as high as 15 per second and NO mixing ratios were as low as 200 pptv. Results of the experiments were compared to model predictions using the Master Chemical Mechanism, in order to identify so far unknown reaction pathways which potentially recycle OH radicals without reactions with NO.

  2. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao;

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... of system properties, and producing inputs to be fed into these engines, interfacing with standard (SystemC) simulation platforms for HW/SW co-simulation, customisable source-code generation towards respecting coding standards and conventions and software performance-tuning optimisation through automated...

  3. Performance verification of network function virtualization in software defined optical transport networks

    Science.gov (United States)

    Zhao, Yongli; Hu, Liyazhou; Wang, Wei; Li, Yajie; Zhang, Jie

    2017-01-01

    With the continuous opening of resource acquisition and application, there is a large variety of network hardware appliances deployed as the communication infrastructure. Launching a new network application often implies replacing obsolete devices and providing the space and power to accommodate the new equipment, which increases the energy and capital investment. Network function virtualization (NFV) aims to address these problems by consolidating many types of network equipment onto industry-standard elements such as servers, switches and storage. Many types of IT resources have been deployed to run Virtual Network Functions (vNFs), such as virtual switches and routers. How to deploy NFV in optical transport networks is therefore a problem of great importance. This paper focuses on this problem and gives an implementation architecture for NFV-enabled optical transport networks based on Software Defined Optical Networking (SDON), with the procedure for vNF call and return. In particular, an implementation solution for an NFV-enabled optical transport node is designed, and a parallel processing method for NFV-enabled OTN nodes is proposed. To verify the performance of NFV-enabled SDON, the protocol interaction procedures of control function virtualization and node function virtualization are demonstrated on an SDON testbed. Finally, the benefits and challenges of the parallel processing method for NFV-enabled OTN nodes are simulated and analyzed.

  4. A secure and high-performance multi-controller architecture for software-defined networking

    Institute of Scientific and Technical Information of China (English)

    Huan-zhao WANG; Peng ZHANG; Lei XIONG; Xin LIU; Cheng-chen HU

    2016-01-01

    Controllers play a critical role in software-defined networking (SDN). However, existing single-controller SDN architectures are vulnerable to single-point failures, where a controller's capacity can be saturated by flooded flow requests. In addition, due to the complicated interactions between applications and controllers, the flow setup latency is relatively large. To address the above security and performance issues of current SDN controllers, we propose distributed rule store (DRS), a new multi-controller architecture for SDNs. In DRS, the controller caches the flow rules calculated by applications, and distributes these rules to multiple controller instances. Each controller instance holds only a subset of all rules, and periodically checks the consistency of flow rules with each other. Requests from switches are distributed among multiple controllers, in order to mitigate controller capacity saturation attack. At the same time, when rules at one controller are maliciously modified, they can be detected and recovered in time. We implement DRS based on Floodlight and evaluate it with extensive emulation. The results show that DRS can effectively maintain a consistently distributed rule store, and at the same time can achieve a shorter flow setup time and a higher processing throughput, compared with ONOS and Floodlight.
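
    A minimal sketch of the rule-partitioning idea described above, assuming a simple hash-based assignment of rules to controller instances; the class and method names are illustrative and not taken from the DRS implementation.

```python
import hashlib

class RuleStore:
    """Toy distributed rule store: each controller instance holds a subset of all rules."""

    def __init__(self, num_controllers):
        self.stores = [dict() for _ in range(num_controllers)]

    def _owner(self, rule_id):
        # Deterministic hash-based assignment of a rule to one controller instance.
        digest = int(hashlib.sha256(rule_id.encode()).hexdigest(), 16)
        return digest % len(self.stores)

    def put(self, rule_id, match, action):
        self.stores[self._owner(rule_id)][rule_id] = (match, action)

    def get(self, rule_id):
        return self.stores[self._owner(rule_id)].get(rule_id)

    def check_consistency(self, reference):
        """Return the ids of rules whose stored value differs from a trusted reference copy."""
        return [rid for rid, val in reference.items() if self.get(rid) != val]

store = RuleStore(num_controllers=3)
store.put("flow-1", match={"dst": "10.0.0.2"}, action="forward:port2")
print(store.get("flow-1"))
print(store.check_consistency({"flow-1": ({"dst": "10.0.0.2"}, "forward:port2")}))
```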

  5. Design of the Jet Performance Software for the ATLAS Experiment at LHC

    CERN Document Server

    Doglioni, C; The ATLAS collaboration; Loch, P; Perez, K; Vitillo, RA

    2011-01-01

    This paper describes the design and implementation of the JetFramework, a software tool developed for the data analysis of the ATLAS experiment at CERN. JetFramework is based on Athena, an object-oriented framework for data processing. The JetFramework Athena package implements a configurable data-flow graph (DFG) to represent an analysis. Each node of the graph can perform some computation on one or more particle collections in input. A standard set of nodes to retrieve, filter, sort and plot collections is provided. Users can also implement their own computation units inheriting from a generic interface. The analysis graph can be declared and configured in an Athena options file. To provide the requested flexibility to configure nodes from a configuration file, a simple expression language permits specifying selection and plotting criteria. Viewing an analysis as an explicit DFG permits end-users to avoid writing code for repetitive tasks and to reuse user-defined computation units in other analysis...
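
    As a rough illustration of the configurable data-flow-graph idea (not the actual Athena/JetFramework API; the node classes and the jet dictionaries below are invented for the sketch), an analysis can be expressed as a chain of nodes, each transforming a particle collection.

```python
class Node:
    """Generic computation unit: takes a collection and returns a transformed collection."""

    def run(self, collection):
        raise NotImplementedError

class FilterNode(Node):
    def __init__(self, predicate):
        self.predicate = predicate

    def run(self, collection):
        return [obj for obj in collection if self.predicate(obj)]

class SortNode(Node):
    def __init__(self, key, reverse=True):
        self.key, self.reverse = key, reverse

    def run(self, collection):
        return sorted(collection, key=self.key, reverse=self.reverse)

# Wire a tiny graph: keep jets with pt > 25 GeV, then order them by pt (values illustrative).
jets = [{"pt": 42.0, "eta": 0.3}, {"pt": 18.5, "eta": 1.2}, {"pt": 31.0, "eta": -0.7}]
graph = [FilterNode(lambda j: j["pt"] > 25.0), SortNode(key=lambda j: j["pt"])]
for node in graph:
    jets = node.run(jets)
print(jets)
```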

  6. Minerva: using a software program to improve resident performance during independent call

    Science.gov (United States)

    Itri, Jason N.; Redfern, Regina O.; Cook, Tessa; Scanlon, Mary H.

    2010-03-01

    We have developed an application called Minerva that allows tracking of resident discrepancy rates and missed cases. Minerva mines the radiology information system (RIS) for preliminary interpretations provided by residents during independent call and copies both the preliminary and final interpretations to a database. Both versions are displayed for direct comparison by Minerva and classified as 'in agreement', 'minor discrepancy' or 'major discrepancy' by the residency program director. Minerva compiles statistics comparing minor, major and total discrepancy rates for individual residents relative to the overall group. Discrepant cases are categorized according to date, modality and body part and reviewed for trends in missed cases. The rates of minor, major and total discrepancies for residents on call at our institution were similar to rates previously published, including a 2.4% major discrepancy rate for second-year radiology residents in the DePICTORS study and a 2.6% major discrepancy rate for residents at a community hospital. Trend analysis of missed cases was used to generate a topic-specific resident missed-case conference on acromioclavicular (AC) joint separation injuries, which resulted in a 75% decrease in the number of missed cases related to AC separation subsequent to the conference. Using a software program to track minor and major discrepancy rates for residents taking independent call, using modified RadPeer scoring guidelines, provides a competency-based metric to determine resident performance. Topic-specific conferences using the cases identified by Minerva can result in a decrease in missed cases.
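
    The per-resident statistics described above can be reproduced in a few lines. The records below are illustrative; only the three classification categories are taken from the description above.

```python
from collections import Counter

# Illustrative (resident, classification) records as a discrepancy tracker might store them.
cases = [
    ("res_A", "in agreement"), ("res_A", "minor discrepancy"), ("res_A", "in agreement"),
    ("res_B", "major discrepancy"), ("res_B", "in agreement"), ("res_B", "in agreement"),
]

def discrepancy_rates(records):
    """Per-resident minor, major and total discrepancy rates as fractions of all cases."""
    rates = {}
    for resident in {r for r, _ in records}:
        labels = [c for r, c in records if r == resident]
        counts = Counter(labels)
        minor = counts["minor discrepancy"] / len(labels)
        major = counts["major discrepancy"] / len(labels)
        rates[resident] = {"minor": minor, "major": major, "total": minor + major}
    return rates

print(discrepancy_rates(cases))
```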

  7. Evaluation of the performance of drug-drug interaction screening software in community and hospital pharmacies.

    Science.gov (United States)

    Abarca, Jacob; Colon, Lisa R; Wang, Victoria S; Malone, Daniel C; Murphy, John E; Armstrong, Edward P

    2006-06-01

    Computerized drug-drug interaction (DDI) screening is widely used to identify potentially harmful drug combinations in the inpatient and outpatient settings. The objective was to evaluate the performance of DDI screening software in identifying select clinically significant DDIs in pharmacy computer systems in community and hospital pharmacies. Ten community pharmacies and 10 hospital pharmacies in the Tucson metropolitan area were invited to participate in the study in 2004. To test the performance of each of the systems used by the pharmacies, 25 medications were used to create 6 mock patient profiles containing 37 drug-drug pairs, 16 of which are clinically meaningful DDIs that pose a potential risk to patient safety. Each profile was entered into the pharmacy computer system, and the system response in terms of the presence or absence of a DDI alert was recorded for each drug pair. The percentage of correct responses and the sensitivity, specificity, positive predictive value, and negative predictive value of each system in correctly classifying each drug pair as a DDI or not were calculated. Summary statistics of these measures were calculated separately for community and hospital pharmacies. Eight community pharmacies and 5 hospital pharmacies in the Tucson metropolitan area agreed to participate in the study. The median sensitivity and median specificity for community pharmacies were 0.88 (range, 0.81-0.94) and 0.91 (range, 0.67-1.00), respectively. For hospital pharmacies, the median sensitivity and median specificity were 0.38 (range, 0.15-0.94) and 0.95 (range, 0.81-0.95), respectively. Based on this convenience sample of 8 community pharmacies and 5 hospital pharmacies in 1 metropolitan area, the performance of community pharmacy computer systems in screening DDIs appears to have improved over the last several years compared with research published previously in 2001. However, significant variation remains in the performance of hospital pharmacy computer
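
    The four performance measures reported above follow directly from the alert/no-alert outcome for each drug pair; a small sketch (the drug pairs below are invented, not the study's test set) shows the calculation.

```python
def screening_metrics(alerts, truth):
    """Sensitivity, specificity, PPV and NPV from per-drug-pair alert flags.

    `alerts` and `truth` map a drug pair to True/False: alert raised /
    pair is a clinically meaningful DDI.
    """
    tp = sum(alerts[p] and truth[p] for p in truth)
    tn = sum(not alerts[p] and not truth[p] for p in truth)
    fp = sum(alerts[p] and not truth[p] for p in truth)
    fn = sum(not alerts[p] and truth[p] for p in truth)
    return {
        "sensitivity": tp / (tp + fn) if tp + fn else None,
        "specificity": tn / (tn + fp) if tn + fp else None,
        "ppv": tp / (tp + fp) if tp + fp else None,
        "npv": tn / (tn + fn) if tn + fn else None,
    }

# Tiny illustrative example with two drug pairs (not the study's 37-pair profiles).
truth = {("warfarin", "fluconazole"): True, ("amoxicillin", "ibuprofen"): False}
alerts = {("warfarin", "fluconazole"): True, ("amoxicillin", "ibuprofen"): True}
print(screening_metrics(alerts, truth))
```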

  8. Relative humidity distribution from SAPHIR experiment on board Megha-Tropiques satellite mission: Comparison with global radiosonde and other satellite and reanalysis data sets

    Science.gov (United States)

    Venkat Ratnam, M.; Basha, Ghouse; Krishna Murthy, B. V.; Jayaraman, A.

    2013-09-01

    For a better understanding of the life cycle of convective systems and their interactions with the environment, a joint Indo-French satellite mission named Megha-Tropiques was launched in October 2011 into a low-inclination (20°) orbit. In the present study, we show the first results on the comparison of relative humidity (RH) obtained using a six-channel microwave sounder, covering from the surface to 100 hPa, from one of the payloads, SAPHIR (Sounder for Atmospheric Profiling of Humidity in the Inter-tropical Regions). The RH observations from SAPHIR illustrate the numerous scales of variability in the atmosphere, both vertically and horizontally. As a part of its validation, we compare SAPHIR RH with simultaneous observations from a network of radiosondes distributed across the world (±30° latitude), other satellites (Atmospheric Infrared Sounder, Infrared Atmospheric Sounder Interferometer, Constellation Observation System for Meteorology Ionosphere and Climate (COSMIC)), and various reanalysis (National Center for Environmental Prediction (NCEP), European Center for Medium-Range Weather Forecasts reanalysis (ERA)-Interim, Modern-Era Retrospective Analysis for Research and Application (MERRA)) products. Being in a low-inclination orbit, SAPHIR is able to provide better coverage than any other existing satellite in the tropical region, where some important weather processes take place. A very good correlation is noticed with the RH obtained from the global radiosonde network, particularly in the altitude range corresponding to 850-250 hPa, thus providing a valuable data set for investigating convective processes. Among the satellite data sets, SAPHIR RH compares well with COSMIC RH. Among the reanalysis products, NCEP shows the smallest difference from SAPHIR, followed by ERA-Interim, while the MERRA products show large differences in the middle and upper troposphere.

  9. Anukalpana 2.0: A Performance Evaluation Software Package for Akash Surface to Air Missile System

    Directory of Open Access Journals (Sweden)

    G.S. Raju

    1997-07-01

    Full Text Available Abstract: "An air defence system is a complex dynamic system comprising sensors, control centres, launchers and missiles. Practical evaluation of such a complex system is almost impossible and very expensive. Further, during development of the system, there is a necessity to evaluate certain design characteristics before it is implemented. Consequently, need arises for a comprehensive simulation package which will simulate various subsystems of the air defence weapon system, so that performance of the system can be evaluated. With the above objectives in mind, a software package, called Anukalpana 2.0, has been developed. The first version of the package was developed at the Indian Institute of Science, Bangalore. This program has been subsequently updated. The main objectives of this package are: (i) evaluation of the performance of Akash air defence system and other similar air defence systems against any specified aerial threat, (ii) investigation of effectiveness of the deployment tactics and operational logic employed at the firing batteries and refining them, (iii) provision of aid for refining standard operating procedures (SOPs) for the multitarget defence, and (iv) exploring the possibility of using it as a user training tool at the level of Air Defence Commanders. The design specification and the simulation/modelling philosophy adopted for the development of this package are discussed at length. Since Akash air defence system has many probabilistic events, Monte Carlo method of simulation is used for both threat and defence. Implementation details of the package are discussed in brief. These include: data flow diagrams and interface details. Analysis of results for certain input cases is also covered."

  10. Integrating Multi-Vendor Software Analysis into the Lifecycle for Reliability, Productivity, and Performance Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the proposed work is to create new ways to manage, visualize, and share data produced by multiple software analysis tools, and to create a framework for...

  11. Spectral Graph Theory Analysis of Software-Defined Networks to Improve Performance and Security

    Science.gov (United States)

    2015-09-01

    Software-defined networks are revolutionizing networking by providing unprecedented visibility into and control over data communication networks... The focus of this work is to develop a method to extract network features, develop a closed-loop control framework for a software-defined network
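
    The report's code is not reproduced here, but one spectral feature commonly extracted from a network graph is the algebraic connectivity (the second-smallest eigenvalue of the graph Laplacian). A minimal sketch with an invented four-node topology:

```python
import numpy as np

def algebraic_connectivity(adjacency):
    """Second-smallest eigenvalue of the graph Laplacian (the Fiedler value)."""
    a = np.asarray(adjacency, dtype=float)
    laplacian = np.diag(a.sum(axis=1)) - a
    return np.sort(np.linalg.eigvalsh(laplacian))[1]

# Toy 4-node topology; a larger value indicates a better-connected network.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 1],
                [1, 1, 0, 1],
                [0, 1, 1, 0]])
print(algebraic_connectivity(adj))
```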

  12. The Use of Commercial Flight Simulation Software as a Psychometrically Sound, Ecologically Valid Measure of Fatigued Performance

    Science.gov (United States)

    2011-08-12

    workstation individually. All software was developed with Microsoft Visual Studio 2008 using a combination of C, C++ (for the plug-in) and C# (for the ... [Figure A1: code for calculating statistics on performance in maintaining elevation.]

  13. DATA MINING FOR PREDICTION OF HUMAN PERFORMANCE CAPABILITY IN THE SOFTWARE-INDUSTRY

    Directory of Open Access Journals (Sweden)

    Gaurav Singh Thakur

    2015-03-01

    Full Text Available The recruitment of new personnel is one of the most essential business processes affecting the quality of human capital within any company. It is highly essential for companies to ensure the recruitment of the right talent to maintain a competitive edge over others in the market. However, IT companies often face a problem while recruiting new people for their ongoing projects due to the lack of a proper framework that defines criteria for the selection process. In this paper we aim to develop a framework that would allow any project manager to take the right decision when selecting new talent by correlating performance parameters with other domain-specific attributes of the candidates. Another important motivation behind this project is to check the validity of the selection procedure often followed by various big companies in both the public and private sectors, which focuses only on academic scores, GPA/grades of students from colleges and other academic backgrounds. We test whether such a decision produces optimal results in the industry or whether there is a need for a change that offers a more holistic approach to the recruitment of new talent in software companies. The scope of this work extends beyond the IT domain, and a similar procedure can be adopted to develop a recruitment framework in other fields as well. Data-mining techniques provide useful information from historical projects on which the hiring manager can base decisions for recruiting a high-quality workforce. This study aims to bridge this hiatus by developing a data-mining framework based on an ensemble-learning technique to refocus on the criteria for personnel selection. The results from this research clearly demonstrate that there is a need to refocus on the selection criteria for quality objectives.
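
    The abstract names an ensemble-learning technique but not a specific algorithm; as a hedged sketch, a random forest (one common ensemble learner) trained on a few hypothetical candidate attributes could look like this. All feature names and numbers are invented for illustration.

```python
from sklearn.ensemble import RandomForestClassifier

# Hypothetical candidate attributes: [GPA, years of experience, aptitude-test score],
# with a binary performance label taken from historical projects (all values made up).
X = [[8.1, 2, 71], [6.3, 5, 88], [9.0, 1, 65], [7.2, 4, 90], [5.9, 3, 60], [8.5, 6, 93]]
y = [0, 1, 0, 1, 0, 1]

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(model.predict([[7.8, 3, 85]]))      # predicted performance class for a new candidate
print(model.feature_importances_)         # which attributes drive the prediction
```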

  14. AGSuite: Software to conduct feature analysis of artificial grammar learning performance.

    Science.gov (United States)

    Cook, Matthew T; Chubala, Chrissy M; Jamieson, Randall K

    2017-06-08

    To simplify the problem of studying how people learn natural language, researchers use the artificial grammar learning (AGL) task. In this task, participants study letter strings constructed according to the rules of an artificial grammar and subsequently attempt to discriminate grammatical from ungrammatical test strings. Although the data from these experiments are usually analyzed by comparing the mean discrimination performance between experimental conditions, this practice discards information about the individual items and participants that could otherwise help uncover the particular features of strings associated with grammaticality judgments. However, feature analysis is tedious to compute, often complicated, and ill-defined in the literature. Moreover, the data violate the assumption of independence underlying standard linear regression models, leading to Type I error inflation. To solve these problems, we present AGSuite, a free Shiny application for researchers studying AGL. The suite's intuitive Web-based user interface allows researchers to generate strings from a database of published grammars, compute feature measures (e.g., Levenshtein distance) for each letter string, and conduct a feature analysis on the strings using linear mixed effects (LME) analyses. The LME analysis solves the inflation of Type I errors that afflicts more common methods of repeated measures regression analysis. Finally, the software can generate a number of graphical representations of the data to support an accurate interpretation of results. We hope the ease and availability of these tools will encourage researchers to take full advantage of item-level variance in their datasets in the study of AGL. We moreover discuss the broader applicability of the tools for researchers looking to conduct feature analysis in any field.
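
    One of the feature measures named above, the Levenshtein distance between letter strings, is easy to express outside the Shiny application; a plain-Python version (independent of AGSuite) is sketched below.

```python
def levenshtein(a, b):
    """Edit distance between two strings (insertions, deletions, substitutions)."""
    previous = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        current = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            current.append(min(previous[j] + 1,          # deletion
                               current[j - 1] + 1,       # insertion
                               previous[j - 1] + cost))  # substitution
        previous = current
    return previous[-1]

# Distance between a hypothetical training string and a test string from an artificial grammar.
print(levenshtein("MXVT", "MXXVT"))  # -> 1
```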

  15. Performance Assessment of a Gnss-Based Troposphere Path Delay Estimation Software

    Science.gov (United States)

    Mariotti, Gilles; Avanzi, Alessandro; Graziani, Alberto; Tortora, Paolo

    2013-04-01

    perform the differentiation. The code relies on several IGS products, like SP3 precise orbits and SINEX positions available for the master stations in order to remove several error components, while the phase ambiguities (both wide and narrow lane) are resolved using the modified LAMBDA (MLAMBDA) method. The double-differenced data are then processed by a Kalman Filter that estimates the contingent positioning error of the rover station, its Zenith Wet Delay (ZWD) and the residual phase ambiguities. On the other hand, the Zenith Hydrostatic Delay (ZHD) is preliminarily computed using a mathematical model, based on surface meteorological measurements. The final product of the developed code is an output file containing the estimated ZWD and ZHD time-series in a format compatible with the major orbit determination software, e.g. the CSP card format (TRK-2-23) used by NASA JPL's Orbit Determination Program.
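
    The abstract states that the ZHD is computed from a mathematical model based on surface meteorological measurements but does not name the model; one widely used choice for such a computation is the Saastamoinen hydrostatic model, sketched below with illustrative inputs (the choice of model is an assumption, not a statement about this software).

```python
import math

def saastamoinen_zhd(pressure_hpa, latitude_deg, height_m):
    """Zenith hydrostatic delay in metres from surface pressure (Saastamoinen model)."""
    phi = math.radians(latitude_deg)
    denom = 1.0 - 0.00266 * math.cos(2.0 * phi) - 0.00028 * (height_m / 1000.0)
    return 0.0022768 * pressure_hpa / denom

# Example: 1013.25 hPa at 44 deg N, 50 m height gives roughly 2.3 m of zenith delay.
print(saastamoinen_zhd(1013.25, 44.0, 50.0))
```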

  16. Development of User-Friendly Software to Design Dairy Heat Exchanger and Performance Evaluation

    Directory of Open Access Journals (Sweden)

    Dipankar Mandal

    2015-02-01

    Full Text Available The paper proposes a calculation algorithm and the development of software in Visual Basic (Visual Studio 2012 Express Desktop) used in heat transfer studies when different heat exchangers are involved (e.g. helical-type triple-tube heat exchanger, plate-type heat exchanger). It includes the easy calculation of the heat transfer coefficient, followed by the design and simulation of heat exchanger design parameters by inputting generally known parameters of a heat exchanger into the developed software, "DAIRY-HE". A parametric study is conducted using the software interface to determine the length of tubes or the dimensions of the heat exchanger.
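
    A typical sizing step that such a tool performs is the log-mean temperature difference (LMTD) calculation, Q = U·A·ΔT_lm. The sketch below is generic (not taken from DAIRY-HE), and the duty, overall coefficient and terminal temperature differences are assumed values.

```python
import math

def lmtd(dt_in, dt_out):
    """Log-mean temperature difference for a counter-flow arrangement."""
    if math.isclose(dt_in, dt_out):
        return dt_in
    return (dt_in - dt_out) / math.log(dt_in / dt_out)

def required_area(duty_w, overall_u, dt_in, dt_out):
    """Heat-transfer area A from Q = U * A * LMTD."""
    return duty_w / (overall_u * lmtd(dt_in, dt_out))

# Milk heated by hot water; duty, U and terminal temperature differences are illustrative.
print(required_area(duty_w=120e3, overall_u=1500.0, dt_in=20.0, dt_out=8.0))  # area in m^2
```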

  17. Cross Sectional Study of Agile Software Development Methods and Project Performance

    Science.gov (United States)

    Lambert, Tracy

    2011-01-01

    Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…

  19. Free software for performing physical analysis of systems for digital radiography and mammography

    Energy Technology Data Exchange (ETDEWEB)

    Donini, Bruno; Lanconelli, Nico, E-mail: nico.lanconelli@unibo.it [Alma Mater Studiorum, Department of Physics and Astronomy, University of Bologna, Bologna 40127 (Italy); Rivetti, Stefano [Fisica Medica, Ospedale di Sassuolo S.p.A., Sassuolo 41049 (Italy); Bertolini, Marco [Medical Physics Unit, Azienda Ospedaliera ASMN, Istituto di Ricovero e Cura a Carattere Scientifico, Reggio Emilia 42123 (Italy)

    2014-05-15

    Purpose: In this paper, the authors present free software for assisting users in the physical characterization of x-ray digital systems and in image quality checks. Methods: The program was developed as a plugin of the well-known public-domain suite ImageJ. The software can assist users in calculating various physical parameters such as the response curve (also termed signal transfer property), modulation transfer function (MTF), noise power spectra (NPS), and detective quantum efficiency (DQE). It also includes the computation of some image quality checks: defective pixel analysis, uniformity, dark analysis, and lag. Results: The software was made available in 2009 and has been used during the last couple of years by many users, who gave us valuable feedback for improving its usability. It was tested for the physical characterization of several clinical systems for digital radiography and mammography. Various published papers made use of the outcomes of the plugin. Conclusions: This software is potentially beneficial to a variety of users: physicists working in hospitals and staff working in radiological departments, such as medical physicists, physicians, and engineers. The plugin, together with a brief user manual, is freely available and can be found online ( http://www.medphys.it/downloads.htm ). With our plugin, users can estimate all three of the most important parameters used for physical characterization (MTF, NPS, and also DQE). The plugin can run on any operating system equipped with the ImageJ suite. The authors validated the software by comparing MTF and NPS curves on a common set of images with those obtained with other dedicated programs, achieving a very good agreement.
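
    The abstract does not reproduce the plugin's formulas; a standard relation connecting the three quantities it computes is DQE(f) = MTF(f)² / (q · NNPS(f)), where NNPS is the noise power spectrum normalized by the squared mean signal and q is the incident photon fluence. A hedged numerical sketch with illustrative values:

```python
import numpy as np

def dqe(mtf, nps, mean_signal, photon_fluence):
    """DQE(f) = MTF(f)^2 / (q * NNPS(f)), with NNPS = NPS / mean_signal^2."""
    mtf = np.asarray(mtf, dtype=float)
    nnps = np.asarray(nps, dtype=float) / mean_signal ** 2
    return mtf ** 2 / (photon_fluence * nnps)

# Illustrative values at a few spatial frequencies (not real detector data).
mtf = [1.00, 0.72, 0.45, 0.21]
nps = [5.7, 5.9, 6.3, 6.8]            # (pixel value)^2 * mm^2, assumed
print(dqe(mtf, nps, mean_signal=1000.0, photon_fluence=2.5e5))  # photons/mm^2, assumed
```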

  20. Spectral analysis software improves confidence in plant and soil water stable isotope analyses performed by isotope ratio infrared spectroscopy (IRIS).

    Science.gov (United States)

    West, A G; Goldsmith, G R; Matimati, I; Dawson, T E

    2011-08-30

    Previous studies have demonstrated the potential for large errors to occur when analyzing waters containing organic contaminants using isotope ratio infrared spectroscopy (IRIS). In an attempt to address this problem, IRIS manufacturers now provide post-processing spectral analysis software capable of identifying samples with the types of spectral interference that compromises their stable isotope analysis. Here we report two independent tests of this post-processing spectral analysis software on two IRIS systems, OA-ICOS (Los Gatos Research Inc.) and WS-CRDS (Picarro Inc.). Following a similar methodology to a previous study, we cryogenically extracted plant leaf water and soil water and measured the δ(2)H and δ(18)O values of identical samples by isotope ratio mass spectrometry (IRMS) and IRIS. As an additional test, we analyzed plant stem waters and tap waters by IRMS and IRIS in an independent laboratory. For all tests we assumed that the IRMS value represented the "true" value against which we could compare the stable isotope results from the IRIS methods. Samples showing significant deviations from the IRMS value (>2σ) were considered to be contaminated and representative of spectral interference in the IRIS measurement. Over the two studies, 83% of plant species were considered contaminated on OA-ICOS and 58% on WS-CRDS. Post-analysis, spectra were analyzed using the manufacturer's spectral analysis software, in order to see if the software correctly identified contaminated samples. In our tests the software performed well, identifying all the samples with major errors. However, some false negatives indicate that user evaluation and testing of the software are necessary. Repeat sampling of plants showed considerable variation in the discrepancies between IRIS and IRMS. As such, we recommend that spectral analysis of IRIS data must be incorporated into standard post-processing routines. Furthermore, we suggest that the results from spectral analysis be
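
    The contamination criterion described above (an IRIS value deviating from the IRMS value by more than 2σ) is straightforward to apply; a small sketch with invented δ18O values and an assumed precision follows.

```python
import numpy as np

def flag_contaminated(delta_iris, delta_irms, sigma):
    """Mark samples whose IRIS value deviates from the IRMS value by more than 2 sigma."""
    diff = np.abs(np.asarray(delta_iris) - np.asarray(delta_irms))
    return diff > 2.0 * sigma

# Illustrative delta-18O values (permil) and an assumed analytical precision of 0.2 permil.
iris = [-5.1, -4.2, -7.9]
irms = [-5.0, -5.0, -8.0]
print(flag_contaminated(iris, irms, sigma=0.2))  # [False  True False]
```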

  1. The Performance Evaluation of Multi-Image 3d Reconstruction Software with Different Sensors

    Science.gov (United States)

    Mousavi, V.; Khosravi, M.; Ahmadi, M.; Noori, N.; Naveh, A. Hosseini; Varshosaz, M.

    2015-12-01

    Today, multi-image 3D reconstruction is an active research field, and generating three-dimensional models of objects is one of the most discussed issues in Photogrammetry and Computer Vision; it can be accomplished using range-based or image-based methods. The very accurate and dense point clouds generated by range-based methods such as structured light systems and laser scanners have established them as reliable tools in industry. Image-based 3D digitization methodologies offer the option of reconstructing an object from a set of unordered images that depict it from different viewpoints. As their hardware requirements are narrowed down to a digital camera and a computer system, they compose an attractive 3D digitization approach; consequently, although range-based methods are generally very accurate, image-based methods are low-cost and can be easily used by non-professional users. One of the factors affecting the accuracy of the obtained model in image-based methods is the software and algorithm used to generate the three-dimensional model. These algorithms are provided in the form of commercial software, open source software and web-based services. Another important factor in the accuracy of the obtained model is the type of sensor used. Due to the availability of mobile sensors to the public, the popularity of professional sensors and the advent of stereo sensors, a comparison of these three sensors plays an effective role in evaluating and finding the optimized method to generate three-dimensional models. Much research has been done to identify a suitable software and algorithm to achieve an accurate and complete model, however little attention has been paid to the type of sensor used and its effects on the quality of the final model. The purpose of this paper is deliberation on and the introduction of an appropriate combination of a sensor and software to provide a complete model with the highest accuracy. To do this, different software packages, used in previous studies, were compared and

  2. High performance computing software package for multitemporal Remote-Sensing computations

    Directory of Open Access Journals (Sweden)

    Asaad Chahboun

    2010-10-01

    Full Text Available With the huge amount of satellite data currently stored, multitemporal remote sensing study is nowadays one of the most challenging fields of computer science. Multicore hardware support and multithreading can play an important role in speeding up algorithm computations. In the present paper, a software package, called Multitemporal Software Package for Satellite Remote sensing data (MSPSRS), has been developed for the multitemporal treatment of satellite remote sensing images in a standard format. For portability, the interface was developed using the Qt application framework and the core was developed by integrating C++ classes. MSPSRS can run under different operating systems (i.e., Linux, Mac OS X, Windows, Embedded Linux, Windows CE, etc.). Final benchmark results, using multiple remote sensing biophysical indices, show a speed-up of up to 6X on a quad-core i7 personal computer.

  3. High Performance Embedded Computing Software Initiative (HPEC-SI) Program Facilitation of VSIPL++ Standardization

    Science.gov (United States)

    2008-04-01

    parallel VSIPL++, and other parallel computing systems. The cluster is a fifty-five node Beowulf-style cluster with 116 compute processors of varying types... consoles, which GTRI inserted into the parallel software testbed. A computer that is used as a compute node in a Beowulf-style cluster requires a... Beowulf-style cluster. GTRI also participated in technical advisory planning for the HPEC-SI program.

  4. The effect of regional differences on the performance of software firms in the Netherlands

    OpenAIRE

    Weterings, Anet; Boschma, Ron

    2004-01-01

    In this paper, we concentrate on how evolutionary economics contributes to a better understanding of the spatial evolution of newly emerging industries. Inspired by evolutionary thinking, four types of explanations are discussed and tested in an empirical analysis of the spatial pattern of the software sector in the Netherlands. Traditionally, agglomeration economies provide an explanation for the spatial concentration of an industry. Firms located in a cluster of similar or related sectors b...

  5. A methodology for evaluating the performance of software, using gamma spectrometry, for determining the isotopic composition of plutonium

    Energy Technology Data Exchange (ETDEWEB)

    Granier, G. [CEA, DEN, CETAMA, Marcoule F-30207 Bagnols-sur-Ceze (France); Porcher, J. B.; Payan, E. [CEA, DEN, Nuclear Measurement Laboratory, F-13108 St. Paul-lez-Durance (France); Pepin, N. [IRSN, DEND, F-92262 Fontenay aux Roses (France); Simon, A-C. [CEA, LIST, Service Systemes et Technologies Pour la Mesure, F-91191 Gif-sur-Yvette (France); Benezech, B. [CEA, DEN, Nuclear Measurement Laboratory, F-13108 St. Paul-lez-Durance (France); AREVA MSIS (France); Veyer, C. [Veyer Consultant, 21, rue du May, 59570 Saint Waast la Vallee (France)

    2009-07-01

    This paper presents the progress of an ongoing study regarding the performance of software used for determining the isotopic composition of plutonium and uranium, by means of gamma spectrometry, in the presence of 'disturbing' radioactive emitters and of various matrices. The 'disturbing' radio-emitters are some minor actinides (Am-242, Am-243, Np-237, Cm-243, etc.), fission products (Cs-137, Sb-125, Eu-154, etc.) and/or activation products (Co-60, etc.). All these radionuclides can be found in waste from the nuclear industry. Matrices can also vary (metal, vinyl...) and have a disturbing impact on the results given by the software for determining the isotopic composition (IC). The original aim of the study was to determine the field of applicability of each IC determination software tool when considering various types of package, radiological contents, measurement conditions and configuration and equipment. The proposed method involved the 3 following steps. First step: the acquisition of elementary mono-isotopic or mono-elemental spectra on mock-up packages with standard sources in well-defined measurement conditions in order to establish a common database. Second step: the computation of virtual spectra by summing these 'elementary' spectra on each channel (in the same conditions). Third step: testing these virtual spectra with the various software tools

  6. Enhancing performance of LCoS-SLM as adaptive optics by using computer-generated holograms modulation software

    Science.gov (United States)

    Tsai, Chun-Wei; Lyu, Bo-Han; Wang, Chen; Hung, Cheng-Chieh

    2017-05-01

    We have already developed multi-function and easy-to-use modulation software based on the LabVIEW system. There are four main functions in this modulation software: computer-generated hologram (CGH) generation, CGH reconstruction, image trimming, and special phase distribution. Based on the above development of CGH modulation software, we can enhance the performance of the liquid crystal on silicon spatial light modulator (LCoS-SLM) so that it behaves similarly to a diffractive optical element (DOE), and use it in various adaptive optics (AO) applications. Through the development of special phase distributions, we are going to use the LCoS-SLM with the CGH modulation software in AO technology, such as optical microscope systems. When the LCoS-SLM panel is integrated in an optical microscope system, it can be placed on the illumination path or on the image-forming path. The LCoS-SLM provides a program-controllable liquid crystal array for the optical microscope; it dynamically changes the amplitude or phase of light and gives the obvious advantage of flexibility to the system

  7. Modeling and performance analysis for composite network–compute service provisioning in software-defined cloud environments

    Directory of Open Access Journals (Sweden)

    Qiang Duan

    2015-08-01

    Full Text Available The crucial role of networking in Cloud computing calls for a holistic vision of both networking and computing systems that leads to composite network–compute service provisioning. Software-Defined Network (SDN is a fundamental advancement in networking that enables network programmability. SDN and software-defined compute/storage systems form a Software-Defined Cloud Environment (SDCE that may greatly facilitate composite network–compute service provisioning to Cloud users. Therefore, networking and computing systems need to be modeled and analyzed as composite service provisioning systems in order to obtain thorough understanding about service performance in SDCEs. In this paper, a novel approach for modeling composite network–compute service capabilities and a technique for evaluating composite network–compute service performance are developed. The analytic method proposed in this paper is general and agnostic to service implementation technologies; thus is applicable to a wide variety of network–compute services in SDCEs. The results obtained in this paper provide useful guidelines for federated control and management of networking and computing resources to achieve Cloud service performance guarantees.

  8. Software Epistemology

    Science.gov (United States)

    2016-03-01

    comprehensive approach for determining software epistemology which significantly advances the state of the art in automated vulnerability discovery... straightforward. First, internet-based repositories of open source software (e.g., FreeBSD ports, GitHub, SourceForge, etc.) are mined... the fix delta, we attempted to perform the same process to determine if the firmware release present in an Internet-of-Things (IoT) streaming camera

  9. Comparison of N2O5 mixing ratios during NO3Comp 2007 in SAPHIR

    Directory of Open Access Journals (Sweden)

    A. W. Rollins

    2012-07-01

    Full Text Available N2O5 detection in the atmosphere has been accomplished using techniques which have been developed during the last decade. Most techniques use a heated inlet to thermally decompose N2O5 to NO3, which can be detected by either cavity-based absorption at 662 nm or by laser-induced fluorescence. In summer 2007, a large set of instruments capable of measuring NO3 mixing ratios were simultaneously deployed in the atmosphere simulation chamber SAPHIR in Jülich, Germany. Some of these instruments measured N2O5 mixing ratios either simultaneously or alternatively. Experiments focussed on the investigation of potential interferences from e.g. water vapor or aerosol and on the investigation of the oxidation of biogenic volatile organic compounds by NO3. The comparison of N2O5 mixing ratios shows an excellent agreement between measurements of instruments applying different techniques (3 cavity ring-down (CRDS) instruments, 2 laser-induced fluorescence (LIF) instruments). Data sets are highly correlated, as indicated by the square of the linear correlation coefficients, R2, whose values are larger than 0.96 for the entire data sets. N2O5 mixing ratios agree well within the combined accuracy of the measurements. Slopes of the linear regression range between 0.87 and 1.26 and intercepts are negligible. The most critical aspect of N2O5 measurements by cavity ring-down instruments is the determination of the inlet and filter transmission efficiency. Measurements here show that the N2O5 inlet transmission efficiency can decrease in the presence of high aerosol loads, and that frequent filter/inlet changing is necessary to quantitatively sample N2O5 in some environments. The analysis of data also demonstrates that a general correction for degrading filter transmission is not applicable for all conditions encountered during this campaign. Besides the effect of a gradual degradation of the inlet transmission efficiency with aerosol exposure, no other interference

  10. OH regeneration from methacrolein oxidation investigated in the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Fuchs, H.; Acir, I.-H.; Bohn, B.; Brauers, T.; Dorn, H.-P.; Häseler, R.; Hofzumahaus, A.; Holland, F.; Kaminski, M.; Li, X.; Lu, K.; Lutz, A.; Nehr, S.; Rohrer, F.; Tillmann, R.; Wegener, R.; Wahner, A.

    2014-08-01

    Hydroxyl radicals (OH) are the most important reagent for the oxidation of trace gases in the atmosphere. OH concentrations measured during recent field campaigns in isoprene-rich environments were unexpectedly large. A number of studies showed that unimolecular reactions of organic peroxy radicals (RO2) formed in the initial reaction step of isoprene with OH play an important role for the OH budget in the atmosphere at low mixing ratios of nitrogen monoxide (NO) of less than 100 pptv. It has also been suggested that similar reactions potentially play an important role for RO2 from other compounds. Here, we investigate the oxidation of methacrolein (MACR), one major oxidation product of isoprene, by OH in experiments in the simulation chamber SAPHIR under controlled atmospheric conditions. The experiments show that measured OH concentrations are approximately 50% larger than calculated by the Master Chemical Mechanism (MCM) for conditions of the experiments (NO mixing ratio of 90 pptv). The analysis of the OH budget reveals an OH source that is not accounted for in MCM, which is correlated with the production rate of RO2 radicals from MACR. In order to balance the measured OH destruction rate, 0.77 OH radicals (1σ error: ± 0.31) need to be additionally reformed from each reaction of OH with MACR. The strong correlation of the missing OH source with the production of RO2 radicals is consistent with the concept of OH formation from unimolecular isomerization and decomposition reactions of RO2. The comparison of observations with model calculations gives a lower limit of 0.03 s-1 for the reaction rate constant if the OH source is attributed to an isomerization reaction of MACR-1-OH-2-OO and MACR-2-OH-2-OO formed in the MACR + OH reaction as suggested in the literature (Crounse et al., 2012). This fast isomerization reaction would be a competitor to the reaction of this RO2 species with a minimum of 150 pptv NO. The isomerization reaction would be the dominant

  11. Comparison of N2O5 mixing ratios during NO3Comp 2007 in SAPHIR

    Science.gov (United States)

    Fuchs, H.; Simpson, W. R.; Apodaca, R. L.; Brauers, T.; Cohen, R. C.; Crowley, J. N.; Dorn, H.-P.; Dubé, W. P.; Fry, J. L.; Häseler, R.; Kajii, Y.; Kiendler-Scharr, A.; Labazan, I.; Matsumoto, J.; Mentel, T. F.; Nakashima, Y.; Rohrer, F.; Rollins, A. W.; Schuster, G.; Tillmann, R.; Wahner, A.; Wooldridge, P. J.; Brown, S. S.

    2012-11-01

    N2O5 detection in the atmosphere has been accomplished using techniques which have been developed during the last decade. Most techniques use a heated inlet to thermally decompose N2O5 to NO3, which can be detected by either cavity-based absorption at 662 nm or by laser-induced fluorescence. In summer 2007, a large set of instruments capable of measuring NO3 mixing ratios were simultaneously deployed in the atmosphere simulation chamber SAPHIR in Jülich, Germany. Some of these instruments measured N2O5 mixing ratios either simultaneously or alternatively. Experiments focused on the investigation of potential interferences from, e.g., water vapour or aerosol and on the investigation of the oxidation of biogenic volatile organic compounds by NO3. The comparison of N2O5 mixing ratios shows an excellent agreement between measurements of instruments applying different techniques (3 cavity ring-down (CRDS) instruments, 2 laser-induced fluorescence (LIF) instruments). Datasets are highly correlated, as indicated by the square of the linear correlation coefficients, R2, whose values were larger than 0.96 for the entire datasets. N2O5 mixing ratios agree well within the combined accuracy of measurements. Slopes of the linear regression range between 0.87 and 1.26 and intercepts are negligible. The most critical aspect of N2O5 measurements by cavity ring-down instruments is the determination of the inlet and filter transmission efficiency. Measurements here show that the N2O5 inlet transmission efficiency can decrease in the presence of high aerosol loads, and that frequent filter/inlet changing is necessary to quantitatively sample N2O5 in some environments. The analysis of data also demonstrates that a general correction for degrading filter transmission is not applicable for all conditions encountered during this campaign. Besides the effect of a gradual degradation of the inlet transmission efficiency with aerosol exposure, no other interference for N2O5

  12. SOFTWARE METRICS VALIDATION METHODOLOGIES IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    K.P. Srinivasan

    2014-12-01

    Full Text Available In software measurement validation, assessing the validity of software metrics in software engineering is a very difficult task due to the lack of both theoretical and empirical methodology [41, 44, 45]. During recent years, a number of researchers have addressed the issue of validating software metrics. At present, software metrics are validated theoretically using properties of measures. Further, software measurement plays an important role in understanding and controlling software development practices and products. The major requirement in software measurement is that the measures must represent accurately those attributes they purport to quantify, and validation is critical to the success of software measurement. Normally, validation is a collection of analysis and testing activities across the full life cycle that complements the efforts of other quality engineering functions; it is a critical task in any engineering project. The objective of validation is to discover defects in a system and assess whether or not the system is useful and usable in an operational situation. In the case of software engineering, validation is one of the disciplines that help build quality into software. The major objective of the software validation process is to determine that the software performs its intended functions correctly and to provide information about its quality and reliability. This paper discusses the validation methodology, techniques and different properties of measures that are used for software metrics validation. In most cases, theoretical and empirical validations are conducted for software metrics validation in software engineering [1-50].

  13. Performance Assessment of Three Rendering Engines in 3D Computer Graphics Software

    Directory of Open Access Journals (Sweden)

    Žan Vidmar

    2015-03-01

    Full Text Available The aim of the research was the determination of testing conditions and the visual and numerical evaluation of renderings made with three different rendering engines in Maya software, which is widely used for educational and computer art purposes. In the theoretical part, an overview of light phenomena and their simulation in virtual space is presented. This is followed by a detailed presentation of the main rendering methods and the results and limitations of their application to 3D objects. At the end of the theoretical part, the importance of a proper testing scene, and especially the role of the Cornell box, is explained. In the experimental part, the terms and conditions as well as the hardware and software used for the research are presented. This is followed by a description of the procedures, where we focused on rendering quality and time, which enabled the comparison of settings of different render engines and the determination of conditions for further rendering of testing scenes. The experimental part continued with rendering a variety of simple virtual scenes, including the Cornell box and a virtual object with different materials and colours. Apart from the visual evaluation, which was the starting point for the comparison of renderings, a procedure for the numerical estimation of colour deviations of renderings using selected regions of interest in the final images is presented.

  14. Isotope effect in the formation of H2 from H2CO studied at the atmospheric simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    R. Koppmann

    2010-06-01

    Full Text Available Formaldehyde of known, near-natural isotopic composition was photolyzed in the SAPHIR atmosphere simulation chamber under ambient conditions. The isotopic composition of the product H2 was used to determine the isotope effects in formaldehyde photolysis. The experiments are sensitive to the molecular photolysis channel, and the radical channel has only an indirect effect and cannot be effectively constrained. The molecular channel kinetic isotope effect KIEmol, the ratio of photolysis frequencies j(HCHO→CO+H2)/j(HCDO→CO+HD) at surface pressure, is determined to be KIEmol = 1.63 (+0.038/−0.046). This is similar to the kinetic isotope effect for the total removal of HCHO from a recent relative rate experiment (KIEtot = 1.58 ± 0.03), which indicates that the KIEs in the molecular and radical photolysis channels at surface pressure (≈100 kPa) may not be as different as described previously in the literature.

  15. Performance of an implantable automatic atrial fibrillation detection device: impact of software adjustments and relevance of manual episode analysis.

    Science.gov (United States)

    Eitel, Charlotte; Husser, Daniela; Hindricks, Gerhard; Frühauf, Manuela; Hilbert, Sebastian; Arya, Arash; Gaspar, Thomas; Wetzel, Ulrike; Bollmann, Andreas; Piorkowski, Christopher

    2011-04-01

    Implantable loop recorders (ILRs) with specific atrial fibrillation (AF) detection algorithms (ILR-AF) have been developed for continuous AF monitoring. We sought to analyse the clinical value of a new AF monitoring device and to compare it to serial 7-day Holter. Sixty-four consecutive patients suffering from paroxysmal AF were included in this prospective analysis and received an ILR-AF. Manual electrogram analysis was performed for each automatically detected episode and each was categorized into one of three possible diagnoses: 'no AF', 'definite AF', and 'possible AF' (non-diagnostic). Analysis was performed separately before and after a software upgrade that was introduced during the course of the study. A subgroup of patients (51 of 64) underwent AF catheter ablation with subsequent serial 7-day Holter in comparison with the ILR-AF. A total of 333 interrogations were performed (203 before and 130 after software upgrade). The number of patients with AF misdetection was significantly reduced from 72 to 44% following the software upgrade (P = 0.001). The number of patients with non-diagnostic interrogations went from 38 to 16% (P = 0.001). Compared with serial 7-day Holter, the ILR-AF had a tendency to detect a higher number of patients with AF recurrences (31 vs. 24%; P = 0.125). The rate of AF detection on ILR-AF may be higher compared with standard AF monitoring. However, false-positive AF recordings hamper the clinical value. Developments in device technology and device handling are necessary to minimize non-diagnostic interrogations.

  16. Performance of Student Software Development Teams: The Influence of Personality and Identifying as Team Members

    Science.gov (United States)

    Monaghan, Conal; Bizumic, Boris; Reynolds, Katherine; Smithson, Michael; Johns-Boast, Lynette; van Rooy, Dirk

    2015-01-01

    One prominent approach in the exploration of the variations in project team performance has been to study two components of the aggregate personalities of the team members: conscientiousness and agreeableness. A second line of research, known as self-categorisation theory, argues that identifying as team members and the team's performance norms…

  18. Evaluating Performance of Water Hammer Control Equipment using Hytran Software in Hasanlu Dam Pumping Station

    Directory of Open Access Journals (Sweden)

    Parisa Nazari

    2016-09-01

    Full Text Available Unsteady flows start from one steady state and end in another steady-state condition. In water lines, unsteady flows occur mainly due to the closure of valves and sudden pump stops or starts. To prevent the resulting damage, the main measures that can be used are pressure valves, air tanks and surge tanks. All methods of controlling water hammer pursue a common goal: to balance the pressure arising from water hammer so that the pressure in the network stays within an acceptable range. In this paper, unsteady hydraulic flow control methods, including protective measures such as the use of check valves and the installation of air valves, air chambers and surge tanks, are investigated and compared. The existing 1400 mm pipeline of the Hasanlu dam pumping station was simulated using Hytran software, and the minimum and maximum pressures due to different chokings in the throat connecting to the main route were evaluated. The results show that, in the present case study, the use of a check valve with a built-in soft starter reduces the positive and negative pressures caused by the water hammer phenomenon as far as possible.
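
    Hytran solves the full transient equations, but a quick upper-bound estimate of the surge from a sudden velocity change is often made with the Joukowsky relation, ΔP = ρ·a·ΔV. The sketch below is generic; the wave speed and velocity change are assumed values, not data from this study.

```python
def joukowsky_surge(density, wave_speed, velocity_change):
    """Pressure rise in Pa for an instantaneous velocity change: dP = rho * a * dV."""
    return density * wave_speed * velocity_change

# Water in a steel pipe: wave speed and velocity change assumed for illustration.
d_p = joukowsky_surge(density=1000.0, wave_speed=1000.0, velocity_change=1.5)
print(d_p / 1e5, "bar")  # about 15 bar for a sudden 1.5 m/s change
```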

  19. Performance Evaluation and Software Design for EVA Robotic Assistant Stereo Vision Heads

    Science.gov (United States)

    DiPaolo, Daniel

    2003-01-01

    The purpose of this project was to aid the EVA Robotic Assistant project by evaluating and designing the necessary interfaces for two stereo vision heads - the TracLabs Biclops pan-tilt-verge head, and the Helpmate Zebra pan-tilt-verge head. The first half of the project consisted of designing the necessary software interface so that the other modules of the EVA Robotic Assistant had proper access to all of the functionalities offered by each of the stereovision heads. This half took most of the project time, due to a lack of ready-made CORBA drivers for either of the heads. Once this was overcome, the evaluation stage of the project began. The second half of the project was to take these interfaces and to evaluate each of the stereo vision heads in terms of usefulness to the project. In the key project areas such as stability and reliability, the Zebra pan-tilt-verge head came out on top. However, the Biclops did have many more advantages over the Zebra, such as: lower power consumption, faster communications, and a simpler, cleaner API. Overall, the Biclops pan-tilt-verge head outperformed the Zebra pan-tilt-verge head.

  20. Development of expert system software to improve performance of high-voltage arresters in substations

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Andre Nunes de; Oltremari, Anderson; Zago, Maria Goretti; Silva, Paulo Sergio da; Costa Junior, Pedro da; Ferraz, Kleber [Sao Paulo State Univ. (UNESP), Bauru, SP (Brazil). Lab. of Power Systems and Intelligent Techniques], E-mail: andrejau@feb.unesp.br; Gusmao, Euripedes Silva; Prado, Jose Martins [ELETRONORTE, MT (Brazil)], E-mail: euripedes.gusmao@eln.gov.br

    2007-07-01

    One of the main causes of interruptions and power outages in the energy distribution system in Brazil is lightning, which is also mainly responsible for the reduction of service life and the destruction of consumers' and utilities' equipment. As a means of improving the protection of the energy distribution system, the utilities have focused on establishing maintenance techniques, both preventive and predictive, for the high-voltage arresters in substations. Currently, one of the main ways to obtain the characteristics of installed arresters involves the use of high-cost equipment, such as leakage current meters. This paper therefore aims to meet the need for reliable results with lower-cost equipment, proposing expert system software for diagnosis and decision support based on intelligent techniques, which makes it possible to monitor service life and identify aged arresters, allowing a reliable schedule to be established for the removal of equipment, whether for maintenance or for replacement. (author)

  1. Optimizing Performance of Scientific Visualization Software to Support Frontier-Class Computations

    Science.gov (United States)

    2015-08-01

    assistance with accessing graphics processing unit (GPU)-enabled nodes on the HPC utility server systems via the Portable Batch System (PBS) batch job... GPU-enabled and large-memory compute nodes. The EnSight client will run on the first allocated node (which is the graphics...

  2. Performance of a sound card as data acquisition system and a lock-in emulated by software in capillary electrophoresis.

    Science.gov (United States)

    Mandaji, Marcos; Buckup, Tiago; Rech, Rafael; Correia, Ricardo Rego Bordalo; Kist, Tarso Ledur

    2007-03-30

    The performance of fluorescence detectors in capillary electrophoresis is maximized when the excitation light intensity is modulated in time at optimal frequencies. This is especially true when photomultiplier tubes are used to detect the fluorescent light. The amplified raw output signal of the photomultiplier tube can in principle be captured directly by a personal computer sound card (PCSC) and processed by a lock-in emulated by software. This possibility is demonstrated in the present work, and the performance of this new setup is compared with a traditional data acquisition system. The results obtained with this "PCSC and lock-in emulated by software" were of the same quality as, or even better than, those obtained by conventional time integrators (boxcars) and data acquisition boards. With the PCSC, the limits of detection (LOD) found for both naphthalene-2,3-dicarboxaldehyde-derivatized tyrosine and alanine were 3.3 and 3.5 fmol (injection of 5 nL of samples at 0.66 and 0.70 µmol/L), respectively. This is at least three times better than conventional systems when light emitting diodes (LEDs) are used as the excitation source in fluorescence detectors. The PCSC linear response range was also larger than that of conventional data acquisition boards. This scheme proved to be a practical and convenient alternative for data acquisition and signal processing in detection systems used in capillary electrophoresis.
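
    The core of such a setup is a dual-phase lock-in implemented entirely in software: the digitized photomultiplier signal is multiplied by quadrature references at the modulation frequency and then low-pass filtered. The sketch below illustrates the idea with NumPy; the sampling rate, modulation frequency and filter bandwidth are illustrative assumptions, not the values used by the authors.

      import numpy as np

      def software_lockin(signal, fs, f_ref, bandwidth=10.0):
          """Dual-phase lock-in: mix with quadrature references at f_ref,
          low-pass filter, and return the demodulated amplitude envelope."""
          t = np.arange(len(signal)) / fs
          i = signal * np.cos(2 * np.pi * f_ref * t)      # in-phase product
          q = signal * np.sin(2 * np.pi * f_ref * t)      # quadrature product
          n = max(1, int(fs / bandwidth))                  # ~1/bandwidth moving average
          kernel = np.ones(n) / n
          i_lp = np.convolve(i, kernel, mode="same")
          q_lp = np.convolve(q, kernel, mode="same")
          return 2.0 * np.hypot(i_lp, q_lp)

      # Toy example: a 5 kHz modulated fluorescence signal buried in noise,
      # sampled at 44.1 kHz as a consumer sound card would do.
      fs, f_mod = 44100, 5000.0
      t = np.arange(0, 1.0, 1 / fs)
      envelope = 0.1 * (1 + np.sin(2 * np.pi * 2 * t))    # slowly varying "peak"
      raw = envelope * np.sin(2 * np.pi * f_mod * t) + np.random.normal(0, 0.5, t.size)
      demodulated = software_lockin(raw, fs, f_mod, bandwidth=20.0)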

  3. Observer performance in estimating upper arm elevation angles under ideal viewing conditions when assisted by posture matching software.

    Science.gov (United States)

    Jackson, Jennie A; Mathiassen, Svend Erik; Liv, Per

    2016-07-01

    Selecting a suitable body posture measurement method requires performance indices of candidate tools. Such data are lacking for observational assessments made at a high degree of resolution. The aim of this study was to determine the performance (bias and between- and within-observer variance) of novice observers estimating upper arm elevation postures assisted by posture matching software to the nearest degree from still images taken under ideal conditions. Estimates were minimally biased from true angles: the mean error across observers was less than 2°. Variance between observers was minimal. Considerable variance within observers, however, underlined the risk of relying on single observations. Observers were more proficient at estimating 0° and 90° postures, and less proficient at 60°. Thus, under ideal visual conditions observers, on average, proved proficient at high resolution posture estimates; further investigation is required to determine how non-optimal image conditions, as would be expected from occupational data, impact proficiency.

  4. Problem solving performance and learning strategies of undergraduate students who solved microbiology problems using IMMEX educational software

    Science.gov (United States)

    Ebomoyi, Josephine Itota

    The objectives of this study were as follows: (1) determine the relationship between learning strategies and performance in problem solving, (2) explore the role of a student's declared major on performance in problem solving, and (3) understand the decision-making process of high and low achievers during problem solving. Participants (N = 65) solved problems using the Interactive Multimedia Exercise (IMMEX) software. All participants solved not only "Microquest," which focuses on cellular processes and the mode of action of antibiotics, but also "Creeping Crud," which focuses on the cause, origin and transmission of diseases. Participants also responded to the Motivated Strategies for Learning Questionnaire (MSLQ). Hierarchical multiple regression was used for analysis with GPA (grade point average) as a control. There were 49 participants (78.6%) who successfully solved "Microquest" while 52 (82.5%) successfully solved "Creeping Crud". Metacognitive self-regulation strategy was significantly (p ... self-esteem problems. The educational implications and relevance to real-life situations are discussed.

  5. Hardware Architecture of Polyphase Filter Banks Performing Embedded Resampling for Software-Defined Radio Front-Ends

    DEFF Research Database (Denmark)

    Awan, Mehmood-Ur-Rehman; Le Moullec, Yannick; Koch, Peter

    2012-01-01

    In this paper, we describe resource-efficient hardware architectures for software-defined radio (SDR) front-ends. These architectures are made efficient by using a polyphase channelizer that performs arbitrary sample rate changes, frequency selection, and bandwidth control. We discuss area, time, and power optimization for field-programmable gate array (FPGA) based architectures in an M-path polyphase filter bank with a modified N-path polyphase filter. Such systems allow resampling by arbitrary ratios while simultaneously performing baseband aliasing from center frequencies at Nyquist zones... A case study covering maximally decimated, under-decimated, over-decimated, and combined up- and down-sampled scenarios is presented, and an analysis of area, time, and power for the corresponding FPGA architectures is given. For resource-optimized SDR front-ends, RA is superior for reducing operating clock rates and dynamic...
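
    For readers unfamiliar with the underlying operation, the sketch below performs rational-ratio resampling with a polyphase filter in software using SciPy; it only illustrates the concept of embedded resampling and is not the FPGA architecture described in the paper, and the input rate and ratio are arbitrary assumptions.

      import numpy as np
      from scipy.signal import resample_poly

      fs_in = 10_000_000            # assumed 10 MS/s stream from the ADC
      x = np.random.randn(100_000)  # placeholder baseband samples

      # Resample by the rational ratio 3/4: an anti-aliasing FIR is designed
      # internally and applied in polyphase form, so each output sample costs
      # only the taps of a single sub-filter rather than the whole filter.
      y = resample_poly(x, up=3, down=4)
      fs_out = fs_in * 3 / 4        # 7.5 MS/s output rate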

  6. Reuse without Compromising Performance: Industrial Experience from RPG Software Product Line for Mobile Devices

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan

    2005-01-01

    It is often believed that reusable solutions, being generic, must necessarily compromise performance. In this paper, we consider a family of Role-Playing Games (RPGs). We analyzed similarities and differences among four RPGs. By applying a reuse technique of XVCL, we built an RPG product line architecture (RPG-PLA) from which we could derive any of the four RPGs. We built into the RPG-PLA a number of performance optimization strategies that could benefit any of the four (and possibly other similar) RPGs. By comparing the original vs. the new RPGs derived from the RPG-PLA, we demonstrated that reuse allowed us to achieve improved performance, both speed and memory utilization, as compared to each game developed individually. At the same time, our solution facilitated rapid development of new games, for new mobile devices, as well as ease of evolving with new features the RPG-PLA and custom games...

  7. A Tool for Optimizing the Build Performance of Large Software Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Voinea, Lucian; Kontogiannis, K; Tjortjis, C; Winter, A

    2008-01-01

    We present Build Analyzer, a tool that helps developers optimize the build performance of huge systems written in C. Due to complex C header dependencies, even small code changes can cause extremely long rebuilds, which are problematic when code is shared and modified by teams of hundreds of individuals.

  8. Performance of Student Software Development Teams: The Influence of Personality and Identifying as Team Members

    Science.gov (United States)

    Monaghan, Conal; Bizumic, Boris; Reynolds, Katherine; Smithson, Michael; Johns-Boast, Lynette; van Rooy, Dirk

    2015-01-01

    One prominent approach in the exploration of the variations in project team performance has been to study two components of the aggregate personalities of the team members: conscientiousness and agreeableness. A second line of research, known as self-categorisation theory, argues that identifying as team members and the team's performance…

  9. A blueprint for system-level performance modeling of software-intensive embedded systems

    NARCIS (Netherlands)

    Hendriks, M.; Basten, T.; Verriet, J.; Brassé, M.; Somers, L.

    2014-01-01

    Exploration of design alternatives and estimation of their key performance metrics such as latency and energy consumption is essential for making the proper design decisions in the early phases of system development. Often, high-level models of the dynamic behavior of the system are used for the analysis...

  10. Reuse without Compromising Performance: Industrial Experience from RPG Software Product Line for Mobile Devices

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan

    2005-01-01

    allowed us to achieve improved performance, both speed and memory utilization, as compared to each game developed individually. At the same time, our solution facilitated rapid development of new games, for new mobile devices, as well as ease of evolving with new features the RPG-PLA and custom games...

  11. A blueprint for system-level performance modeling of software-intensive embedded systems

    NARCIS (Netherlands)

    Hendriks, M.; Basten, T.; Verriet, J.; Brassé, M.; Somers, L.

    2016-01-01

    Exploration of design alternatives and estimation of their key performance metrics such as latency and energy consumption is essential for making the proper design decisions in the early phases of system development. Often, high-level models of the dynamic behavior of the system are used for the

  12. A blueprint for system-level performance modeling of software-intensive embedded systems

    NARCIS (Netherlands)

    Hendriks, M.; Basten, T.; Verriet, J.; Brassé, M.; Somers, L.

    2014-01-01

    Exploration of design alternatives and estimation of their key performance metrics such as latency and energy consumption is essential for making the proper design decisions in the early phases of system development. Often, high-level models of the dynamic behavior of the system are used for the

  13. A blueprint for system-level performance modeling of software-intensive embedded systems

    NARCIS (Netherlands)

    Hendriks, M.; Basten, T.; Verriet, J.; Brassé, M.; Somers, L.

    2014-01-01

    Exploration of design alternatives and estimation of their key performance metrics such as latency and energy consumption is essential for making the proper design decisions in the early phases of system development. Often, high-level models of the dynamic behavior of the system are used for the analysis...

  14. A blueprint for system-level performance modeling of software-intensive embedded systems

    NARCIS (Netherlands)

    Hendriks, M.; Basten, T.; Verriet, J.; Brassé, M.; Somers, L.

    2016-01-01

    Exploration of design alternatives and estimation of their key performance metrics such as latency and energy consumption is essential for making the proper design decisions in the early phases of system development. Often, high-level models of the dynamic behavior of the system are used for the analysis...

  15. A Tool for Optimizing the Build Performance of Large Software Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Voinea, Lucian; Kontogiannis, K; Tjortjis, C; Winter, A

    2008-01-01

    We present Build Analyzer, a tool that helps developers optimize the build performance of huge systems written in C. Due to complex C header dependencies, even small code changes can cause extremely long rebuilds, which are problematic when code is shared and modified by teams of hundreds of individuals.

  16. Performance of High-Reliability Space-Qualified Processors Implementing Software Defined Radios

    Science.gov (United States)

    2014-03-01

    ...exceptions to this generalization have emerged recently. The U.S. Government has been sponsoring the OPERA project through which the Boeing Corp. has... Data-collection sink tile; compute tiles X_k, 0 ≤ k < N − 1... We used the system interfaces to the underlying tile-to-tile communications functions

  17. Combustion LES Software for Improved Emissions Predictions of High Performance Gas Turbine Combustors

    Science.gov (United States)

    2005-09-01

    Low emissions of CO, NOx, and unburned hydrocarbons (UHC) are a difficult... NOx, UHC, and smoke are becoming a requirement for today's and future military gas turbine engines. Advanced, high-performance gas turbines will... range, and operating pressure. Low emissions of pollutants, including CO, NOx, UHC, and smoke, are becoming a requirement

  18. Programming implementation of performance testing of low light level ICCD camera based on LabVIEW software

    Science.gov (United States)

    Ni, Li; Ye, Qiong; Qian, Yunsheng

    2016-10-01

    Low light level (LLL) imaging technology plays a major role at night and in other low-illumination settings: through a variety of low light level image intensifiers and charge-coupled devices (CCD), image information on the target is acquired, photoelectrically converted, enhanced, stored and displayed. In order to comprehensively test parameters such as the signal-to-noise ratio (SNR) and dynamic range of an intensified charge-coupled device (ICCD), this paper uses the Laboratory Virtual Instrument Engineering Workbench (LabVIEW) software for programming. Data acquisition is the core of the program and is divided by function into three parts: a) initializing the acquisition cards; b) collecting and storing the useful data; c) closing the acquisition cards. An NI PXIe-5122 analog acquisition card and a PXIe-1435 digital acquisition card were used to capture images from PAL cameras and Camera Link cameras, covering both the analog and the digital interface of the ICCD under test. After the data are obtained, camera performance is analyzed by computing the programmed parameters from the acquired data. In the experimental tests, a half-moon test target was used for the signal-to-noise ratio and dynamic range measurements and a uniformity test target for uniformity. To increase the practicality of the program, a database module was also added. LabSQL is a free, multi-database, cross-platform database access toolkit for LabVIEW. Using LabSQL, almost any type of database can be accessed and a variety of queries and record operations performed. With only simple programming, database access can be achieved in LabVIEW.
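
    As a rough illustration of the post-processing such a test program performs, the sketch below estimates SNR and dynamic range from stacks of captured frames; it is a generic NumPy example rather than the authors' LabVIEW code, and the definitions used (temporal noise, dark-frame noise floor) are assumptions.

      import numpy as np

      def snr_and_dynamic_range(frames, dark_frames):
          """Estimate SNR and dynamic range (both in dB) from frame stacks of
          shape (n_frames, height, width): one stack with the test target
          illuminated and one with the lens capped."""
          signal = frames.mean(axis=0) - dark_frames.mean(axis=0)   # dark-subtracted mean image
          noise = frames.std(axis=0)                                # temporal noise per pixel
          snr_db = 20 * np.log10(signal.mean() / noise.mean())
          # dynamic range: largest usable signal over the dark temporal noise floor
          dr_db = 20 * np.log10(signal.max() / dark_frames.std(axis=0).mean())
          return snr_db, dr_db

      # Synthetic example: 32 frames of a flat scene plus 32 dark frames
      rng = np.random.default_rng(0)
      frames = rng.normal(loc=200.0, scale=5.0, size=(32, 64, 64))
      darks = rng.normal(loc=10.0, scale=2.0, size=(32, 64, 64))
      print(snr_and_dynamic_range(frames, darks))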

  19. The effects of exercise reminder software program on office workers' perceived pain level, work performance and quality of life.

    Science.gov (United States)

    Irmak, A; Bumin, G; Irmak, R

    2012-01-01

    With current technological developments, computer usage in the workplace has increased while the need for an office worker to leave the desk to photocopy a document or to send or receive an e-mail has decreased. As a result, office workers stay in the same postures for long periods of keyboard use. In recent years, several exercise reminder software programs have been developed with the intent of reducing the incidence of work-related musculoskeletal disorders. The purpose of this study is to evaluate the effectiveness of an exercise reminder software program on office workers' perceived pain level, work performance and quality of life. Thirty-nine healthy office workers agreed to take part in the study. Participants were randomly split into two groups, a control group (n = 19) and an intervention group (n = 20). A Visual Analogue Scale (VAS) to evaluate perceived pain was administered to all participants at the beginning and at the end of the study. The intervention group used the program for 10 weeks. Findings showed that the control group VAS scores remained the same, but the intervention group VAS scores decreased in a statistically significant way (p ... office workers. Further long-term studies with more subjects are needed to describe the effects of these programs and the mechanisms underlying them.

  20. Performance evaluation of time-aware enhanced software defined networking (TeSDN) for elastic data center optical interconnection.

    Science.gov (United States)

    Yang, Hui; Zhang, Jie; Zhao, Yongli; Ji, Yuefeng; Li, Hui; Lin, Yi; Li, Gang; Han, Jianrui; Lee, Young; Ma, Teng

    2014-07-28

    Data center interconnection with elastic optical networks is a promising scenario to meet the high burstiness and high-bandwidth requirements of data center services. We previously implemented enhanced software defined networking over elastic optical network for data center application [Opt. Express 21, 26990 (2013)]. On the basis of it, this study extends to consider the time-aware data center service scheduling with elastic service time and service bandwidth according to the various time sensitivity requirements. A novel time-aware enhanced software defined networking (TeSDN) architecture for elastic data center optical interconnection has been proposed in this paper, by introducing a time-aware resources scheduling (TaRS) scheme. The TeSDN can accommodate the data center services with required QoS considering the time dimensionality, and enhance cross stratum optimization of application and elastic optical network stratums resources based on spectrum elasticity, application elasticity and time elasticity. The overall feasibility and efficiency of the proposed architecture is experimentally verified on our OpenFlow-based testbed. The performance of TaRS scheme under heavy traffic load scenario is also quantitatively evaluated based on TeSDN architecture in terms of blocking probability and resource occupation rate.

  1. Optimized Architectural Approaches in Hardware and Software Enabling Very High Performance Shared Storage Systems

    CERN Document Server

    CERN. Geneva

    2004-01-01

    There are issues encountered in high performance storage systems that normally lead to compromises in architecture. Compute clusters tend to have compute phases followed by an I/O phase that must move data from the entire cluster in one operation. That data may then be shared by a large number of clients creating unpredictable read and write patterns. In some cases the aggregate performance of a server cluster must exceed 100 GB/s to minimize the time required for the I/O cycle thus maximizing compute availability. Accessing the same content from multiple points in a shared file system leads to the classical problems of data "hot spots" on the disk drive side and access collisions on the data connectivity side. The traditional method for increasing apparent bandwidth usually includes data replication which is costly in both storage and management. Scaling a model that includes replicated data presents additional management challenges as capacity and bandwidth expand asymmetrically while the system is scaled. ...

  2. Developing a High Performance Software Library with MPI and CUDA for Matrix Computations

    Directory of Open Access Journals (Sweden)

    Bogdan Oancea

    2014-04-01

    Full Text Available Nowadays, the paradigm of parallel computing is changing. CUDA is now a popular programming model for general-purpose computation on GPUs, and a great number of applications have been ported to CUDA, obtaining speedups of orders of magnitude compared to optimized CPU implementations. Hybrid approaches that combine the message passing model with the shared memory model for parallel computing are a solution for very large applications. We considered a heterogeneous cluster that combines CPU and GPU computation using MPI and CUDA to develop a high-performance linear algebra library. Our library deals with solvers for large linear systems because they are a common problem in the fields of science and engineering. Direct methods for computing the solution of such systems can be very expensive due to high memory requirements and computational cost. An efficient alternative is iterative methods, which compute only an approximation of the solution. In this paper we present an implementation of a library that uses a hybrid model of computation with MPI and CUDA and implements both direct and iterative linear system solvers. Our library implements LU and Cholesky factorization based solvers and some of the non-stationary iterative methods using the MPI/CUDA combination. We compared the performance of our MPI/CUDA implementation with classic programs written to run on a single CPU.
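
    As background for the iterative-solver side of such a library, the sketch below shows the conjugate gradient method on a single CPU with NumPy; it is only an illustration of a non-stationary iterative solver in general, not the authors' MPI/CUDA implementation, and the test matrix is a made-up symmetric positive-definite example.

      import numpy as np

      def conjugate_gradient(A, b, tol=1e-8, max_iter=1000):
          """Solve A x = b iteratively for symmetric positive-definite A,
          never factorizing A (only matrix-vector products are needed)."""
          x = np.zeros_like(b)
          r = b - A @ x
          p = r.copy()
          rs_old = r @ r
          for _ in range(max_iter):
              Ap = A @ p
              alpha = rs_old / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              rs_new = r @ r
              if np.sqrt(rs_new) < tol:
                  break
              p = r + (rs_new / rs_old) * p
              rs_old = rs_new
          return x

      # Made-up SPD test system
      n = 500
      M = np.random.rand(n, n)
      A = M @ M.T + n * np.eye(n)
      b = np.random.rand(n)
      x = conjugate_gradient(A, b)
      print(np.linalg.norm(A @ x - b))   # residual norm, should be tiny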

  3. The Software Architecture for Performing Scientific Computation with the JLAPACK Libraries in ScalaLab

    Directory of Open Access Journals (Sweden)

    Stergios Papadimitriou

    2012-01-01

    Full Text Available Although LAPACK is a powerful library its utilization is difficult. JLAPACK, a Java translation obtained automatically from the Fortran LAPACK sources, retains exactly the same difficult to use interface of LAPACK routines. The MTJ library implements an object oriented Java interface to JLAPACK that hides many complicated details. ScalaLab exploits the flexibility of the Scala language to present an even more friendly and convenient interface to the powerful but complicated JLAPACK library. The article describes the interfacing of the low-level JLAPACK routines within the ScalaLab environment. This is performed rather easily by exploiting well suited features of the Scala language. Also, the paper demonstrates the convenience of using JLAPACK routines for linear algebra operations from within ScalaLab.

  4. SafetyBarrierManager, a software tool to perform risk analysis using ARAMIS's principles

    DEFF Research Database (Denmark)

    Duijm, Nijs Jan

    The ARAMIS project resulted in a number of methodologies, dealing with among others: the development of standard fault trees and "bowties"; the identification and classification of safety barriers; and including the quality of safety management into the quantified risk assessment. After conclusion of the ARAMIS project, Risø National Laboratory started developing a tool that could implement these methodologies, leading to SafetyBarrierManager. The tool is based on the principles of "safety-barrier diagrams", which are very similar to "bowties", with the possibility of performing quantitative analysis. The tool allows constructing comprehensive fault trees, event trees and safety-barrier diagrams. The tool implements the ARAMIS idea of a set of safety barrier types, to which a number of safety management issues can be linked. By rating the quality of these management issues, the operational probability...
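
    To make the quantitative side concrete, the sketch below evaluates a toy barrier chain with independent events using basic AND/OR gate probability algebra; the numbers are invented and the calculation is a generic fault-tree/bowtie illustration, not SafetyBarrierManager's actual engine or the ARAMIS weighting of safety-management quality.

      from functools import reduce

      def p_or(probs):
          """Probability that at least one of several independent events occurs."""
          return 1 - reduce(lambda acc, p: acc * (1 - p), probs, 1.0)

      def p_and(probs):
          """Probability that all of several independent events occur."""
          return reduce(lambda acc, p: acc * p, probs, 1.0)

      # Toy scenario: an initiating event must defeat two independent barriers,
      # each of which fails if either of its two components fails.
      p_initiator = 1e-2
      p_barrier_1 = p_or([1e-3, 5e-4])   # component failure probabilities
      p_barrier_2 = p_or([2e-3, 1e-3])
      p_top = p_and([p_initiator, p_barrier_1, p_barrier_2])
      print(f"Probability of the unwanted outcome: {p_top:.2e}")   # ~4.5e-08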

  5. Subpixelic Measurement of Large 1D Displacements: Principle, Processing Algorithms, Performances and Software

    Directory of Open Access Journals (Sweden)

    Valérian Guelpa

    2014-03-01

    Full Text Available This paper presents a visual measurement method able to sense 1D rigid body displacements with very high resolutions, large ranges and high processing rates. Sub-pixelic resolution is obtained thanks to a structured pattern placed on the target. The pattern is made of twin periodic grids with slightly different periods. The periodic frames are suited for Fourier-like phase calculations—leading to high resolution—while the period difference allows the removal of phase ambiguity and thus a high range-to-resolution ratio. The paper presents the measurement principle as well as the processing algorithms (source files are provided as supplementary materials). The theoretical and experimental performances are also discussed. The processing time is around 3 µs for a line of 780 pixels, which means that the measurement rate is mostly limited by the image acquisition frame rate. A 3-σ repeatability of 5 nm is experimentally demonstrated which has to be compared with the 168 µm measurement range.

  6. Subpixelic measurement of large 1D displacements: principle, processing algorithms, performances and software.

    Science.gov (United States)

    Guelpa, Valérian; Laurent, Guillaume J; Sandoz, Patrick; Zea, July Galeano; Clévy, Cédric

    2014-03-12

    This paper presents a visual measurement method able to sense 1D rigid body displacements with very high resolutions, large ranges and high processing rates. Sub-pixelic resolution is obtained thanks to a structured pattern placed on the target. The pattern is made of twin periodic grids with slightly different periods. The periodic frames are suited for Fourier-like phase calculations-leading to high resolution-while the period difference allows the removal of phase ambiguity and thus a high range-to-resolution ratio. The paper presents the measurement principle as well as the processing algorithms (source files are provided as supplementary materials). The theoretical and experimental performances are also discussed. The processing time is around 3 µs for a line of 780 pixels, which means that the measurement rate is mostly limited by the image acquisition frame rate. A 3-σ repeatability of 5 nm is experimentally demonstrated which has to be compared with the 168 µm measurement range.
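
    A minimal NumPy sketch of the core idea, recovering a sub-pixel shift from the Fourier phase of a periodic grid, is given below; it handles a single grid only, so the shift is known modulo one period, whereas the twin-grid pattern described above removes that ambiguity. The grid period and shift values are illustrative assumptions.

      import numpy as np

      def phase_displacement(line_ref, line_moved, period_px):
          """Estimate the lateral shift (in pixels, sub-pixel resolution) between
          two intensity profiles of the same periodic grid of known period."""
          n = len(line_ref)
          k = int(round(n / period_px))                        # FFT bin of the grid frequency
          phi_ref = np.angle(np.fft.rfft(line_ref)[k])
          phi_mov = np.angle(np.fft.rfft(line_moved)[k])
          dphi = np.angle(np.exp(1j * (phi_ref - phi_mov)))    # wrapped phase difference
          return dphi / (2 * np.pi) * period_px                # shift modulo one grid period

      # Synthetic test: a 780-pixel line, 20-pixel grid period, true shift 0.37 px
      n, period, shift = 780, 20.0, 0.37
      x = np.arange(n)
      ref = 1 + np.cos(2 * np.pi * x / period)
      moved = 1 + np.cos(2 * np.pi * (x - shift) / period)
      print(phase_displacement(ref, moved, period))            # ~0.37

    With two grids of slightly different periods p1 and p2, the two per-grid phase estimates only realign after roughly p1*p2/|p1 - p2| pixels, which is how a twin-grid pattern extends the unambiguous range far beyond a single period.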

  7. Investigation on the Performance of CIGS/TiO2 Heterojunction Using SCAPS Software for Highly Efficient Solar Cells

    Science.gov (United States)

    Chihi, A.; Boujmil, M. F.; Bessais, B.

    2017-08-01

    In this study, Cu (In, Ga) Se2 (CIGS) material with the non-toxic titanium dioxide TiO2 as an n-type buffer layer and indium tin oxide as the window layer are numerically simulated using a solar cell capacitance simulation software package. This numerical analysis has been carried out with the aim of boosting the performances of CIGS/TiO2 solar cells by tuning the defect density and band gap energy of the ordered vacancy compound (OVC) layer and by using a Back-electron reflector (EBR) layer, namely Al2O3. Solar cell performance is investigated as a function of absorber thickness. It is found that there exists an optimal thickness. The effect of OVC compounds on the performance of the device structure are discussed leading to an optimal band gap energy and defect density of about 1.17 eV and 4.97 × 1013 cm-3, respectively. The matching solar cell conversion efficiency reached a maximum value of 12.38% by introducing the OVC layer. It is also shown that, in spite of a decrease in thickness, the external quantum efficiency (EQE) of ultrathin CIGS solar cells can be enhanced owing to the employment of EBR. The significant improvement of EQE, mainly in the near-infrared part of the solar spectrum, can be ascribed to the low parasitic absorption loss in the ultrathin CIGS layer (˜570 nm).

  8. Performance evaluation of linear algebra software in parallel architectures. [FPS, CDC, Cray, Burroughs, ICL, and TI computers

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, T.L.

    1979-10-01

    Performance data of parallel computers on several of the problems of linear algebra using direct methods are provided. The computers considered include software pipeline, hardware pipeline, single-instruction multiple-data, and multiple-instruction multiple-data. Special features of each architecture are considered. Factors such as start-up time, scalar-vector break-even points, consistency in operation count, parallel steps required, and speed-up and efficiency of the hardware are discussed. A reasonably broad comparison is given for LU factorization without pivoting. A less extensive comparison is given for LU factorization with pivoting. Also various intracomputer comparisons are presented to show the performance of different implementations of a particular algorithm as well as the performance of different algorithms for solving the same problem. Data were collected for the linear algebraic problems of matrix multiplication, regular sparse systems (including tridiagonal systems and dissection techniques), and random sparse systems. The eigenvalue problem is not addressed. 15 figures, 7 tables.

  9. Experimental performance evaluation of software defined networking (SDN) based data communication networks for large scale flexi-grid optical networks.

    Science.gov (United States)

    Zhao, Yongli; He, Ruiying; Chen, Haoran; Zhang, Jie; Ji, Yuefeng; Zheng, Haomian; Lin, Yi; Wang, Xinbo

    2014-04-21

    Software defined networking (SDN) has become the focus in the current information and communication technology area because of its flexibility and programmability. It has been introduced into various network scenarios, such as datacenter networks, carrier networks, and wireless networks. Optical transport network is also regarded as an important application scenario for SDN, which is adopted as the enabling technology of data communication networks (DCN) instead of general multi-protocol label switching (GMPLS). However, the practical performance of SDN based DCN for large scale optical networks, which is very important for the technology selection in the future optical network deployment, has not been evaluated up to now. In this paper we have built a large scale flexi-grid optical network testbed with 1000 virtual optical transport nodes to evaluate the performance of SDN based DCN, including network scalability, DCN bandwidth limitation, and restoration time. A series of network performance parameters including blocking probability, bandwidth utilization, average lightpath provisioning time, and failure restoration time have been demonstrated under various network environments, such as with different traffic loads and different DCN bandwidths. The demonstration in this work can be taken as a proof for the future network deployment.

  10. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  11. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

    The support of the EPIQR method is a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  12. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology-Lausanne (EPFL), Solar Energy and Building Physics Laboratory (LESO-PB), Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Institute of Meteorology and Physics of Atmospheric Environment, Group Energy Conservation, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Division of Energy and Indoor Environment, Hoersholm, (Denmark)

    2000-07-01

    The support of the EPIQR method is a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (author)

  13. Intercomparison of measurements of NO2 concentrations in the atmosphere simulation chamber SAPHIR during the NO3Comp campaign

    Directory of Open Access Journals (Sweden)

    H. Fuchs

    2010-01-01

    Full Text Available NO2 concentrations were measured by various instruments during the NO3Comp campaign at the atmosphere simulation chamber SAPHIR at Forschungszentrum Jülich, Germany, in June 2007. Analytical methods included photolytic conversion with chemiluminescence (PC-CLD), broadband cavity ring-down spectroscopy (BBCRDS), pulsed cavity ring-down spectroscopy (CRDS), incoherent broadband cavity-enhanced absorption spectroscopy (IBBCEAS), and laser-induced fluorescence (LIF). All broadband absorption spectrometers were optimized for the detection of the main target species of the campaign, NO3, but were also capable of detecting NO2 simultaneously with reduced sensitivity. NO2 mixing ratios in the chamber were within a range characteristic of polluted, urban conditions, with a maximum mixing ratio of approximately 75 ppbv. The overall agreement between measurements of all instruments was excellent. Linear fits of the combined data sets resulted in slopes that differ from unity only within the stated uncertainty of each instrument. Possible interferences from species such as water vapor and ozone were negligible under the experimental conditions.

  14. Intercomparison of measurements of NO2 concentrations in the atmosphere simulation chamber SAPHIR during the NO3Comp campaign

    Directory of Open Access Journals (Sweden)

    R. M. Varma

    2009-10-01

    Full Text Available NO2 concentrations were measured by various instruments during the NO3Comp campaign at the atmosphere simulation chamber SAPHIR at Forschungszentrum Jülich, Germany, in June 2007. Analytic methods included photolytic conversion with chemiluminescence (PC-CLD), broadband cavity ring-down spectroscopy (BBCRDS), pulsed cavity ring-down spectroscopy (CRDS), incoherent broadband cavity-enhanced absorption spectroscopy (IBBCEAS), and laser-induced fluorescence (LIF). All broadband absorption spectrometers were optimized for the detection of the main target species of the campaign, NO3, but were also capable of detecting NO2 simultaneously with reduced sensitivity. NO2 mixing ratios in the chamber were within a range characteristic of polluted, urban conditions, with a maximum mixing ratio of approximately 75 ppbv. The overall agreement between measurements of all instruments was excellent. Linear fits of the combined data sets resulted in slopes that differ from unity only within the stated uncertainty of each instrument. Possible interferences from species such as water vapor and ozone were negligible under the experimental conditions.

  15. SAPHIR - a multi-scale, multi-resolution modeling environment targeting blood pressure regulation and fluid homeostasis.

    Science.gov (United States)

    Thomas, S; Abdulhay, Enas; Baconnier, Pierre; Fontecave, Julie; Francoise, Jean-Pierre; Guillaud, Francois; Hannaert, Patrick; Hernandez, Alfredo; Le Rolle, Virginie; Maziere, Pierre; Tahi, Fariza; Zehraoui, Farida

    2007-01-01

    We present progress on a comprehensive, modular, interactive modeling environment centered on overall regulation of blood pressure and body fluid homeostasis. We call the project SAPHIR, for "a Systems Approach for PHysiological Integration of Renal, cardiac, and respiratory functions". The project uses state-of-the-art multi-scale simulation methods. The basic core model will give succinct input-output (reduced-dimension) descriptions of all relevant organ systems and regulatory processes, and it will be modular, multi-resolution, and extensible, in the sense that detailed submodules of any process(es) can be "plugged-in" to the basic model in order to explore, e.g., system-level implications of local perturbations. The goal is to keep the basic core model compact enough to ensure fast execution time (in view of eventual use in the clinic) and yet to allow elaborate detailed modules of target tissues or organs in order to focus on the problem area while maintaining the system-level regulatory compensations.

  16. Computational environment and software configuration management of the 1996 performance assessment for the Waste Isolation Pilot Plant

    Energy Technology Data Exchange (ETDEWEB)

    FROEHLICH,GARY K.; WILLIAMSON,CHARLES MICHAEL; OGDEN,HARVEY C.

    2000-05-23

    The US Department of Energy (DOE) Waste Isolation Pilot Plant (WIPP), located in southeast New Mexico, is a deep geologic repository for the permanent disposal of transuranic waste generated by DOE defense-related activities. Sandia National Laboratories (SNL), in its role as scientific advisor to the DOE, is responsible for evaluating the long-term performance of the WIPP. This risk-based Performance Assessment (PA) is accomplished in part through the use of numerous scientific modeling codes, which rely for some of their inputs on data gathered during characterization of the site. The PA is subject to formal requirements set forth in federal regulations. In particular, the components of the calculation fall under the configuration management and software quality assurance aegis of the American Society of Mechanical Engineers (ASME) Nuclear Quality Assurance (NQA) requirements. This paper describes SNL's implementation of the NQA requirements regarding configuration management. The complexity of the PA calculation is described, and the rationale for developing a flexible, robust run-control process is discussed. The run-control implementation is described, and its integration with the configuration-management system is then explained, to show how a calculation requiring 37,000 CPU-hours, and involving 225,000 output files totaling 95 Gigabytes, was accomplished in 5 months by 2 individuals, with full traceability and reproducibility.

  17. Software Engineering Reviews and Audits

    CERN Document Server

    Summers, Boyd L

    2011-01-01

    Accurate software engineering reviews and audits have become essential to the success of software companies and military and aerospace programs. These reviews and audits define the framework and specific requirements for verifying software development efforts. Authored by an industry professional with three decades of experience, Software Engineering Reviews and Audits offers authoritative guidance for conducting and performing software first article inspections, and functional and physical configuration software audits. It prepares readers to answer common questions for conducting and performing them.

  18. Software Engineering for Portability.

    Science.gov (United States)

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programming; and suggestions for further research and…

  19. EFFECT OF SOYBEAN OIL BIOFUEL BLENDING ON THE PERFORMANCE AND EMISSIONS OF DIESEL ENGINE USING DIESEL-RK SOFTWARE

    Directory of Open Access Journals (Sweden)

    Mohamed F. Al-Dawody,

    2011-06-01

    Full Text Available The scope of the technology is to provide utility and comfort with no damage to the user or to the surroundings. For many years now, petroleum products and other fossil fuels have given us utility and comfort in a variety of areas, but they cause environmental problems which threaten wild and human life. In this study, the performance and emissions of a single-cylinder, four-stroke, direct-injection diesel engine operating on diesel oil and different Soybean Methyl Ester (SME) blends have been investigated theoretically using the simulation software Diesel-RK. Based on the computed modeling results, reductions of 41.3%, 53.2% and 62.6% in the Bosch smoke number are obtained with B20% SME, B40% SME and B100% SME respectively, compared to pure diesel operation. In addition, a reduction in PM emissions of 47.2%, 60% and 68% is observed for B20% SME, B40% SME and B100% SME respectively. On an average basis there is a reduction in the thermal efficiency, power, and SFC for all SME blends of 2%, 3%, and 12% respectively compared to pure diesel fuel. All SME blends produce NOx emissions more than 28% higher compared with pure diesel fuel. A parametric study of the effects of retarding the injection timing and varying the engine speed and compression ratio has been performed. It is observed that retarding the injection timing can reduce the increase in NOx emissions to a great extent. Among all tested fuels, B20% SME was the best, giving the same performance results with a good reduction in emissions as compared to pure diesel operation. Very good agreement was obtained between the results and the available theoretical and experimental results of other researchers.

  20. A Study of Performance and Effort Expectancy Factors among Generational and Gender Groups to Predict Enterprise Social Software Technology Adoption

    Science.gov (United States)

    Patel, Sunil S.

    2013-01-01

    Social software technology has gained considerable popularity over the last decade and has had a great impact on hundreds of millions of people across the globe. Businesses have also expressed their interest in leveraging its use in business contexts. As a result, software vendors and business consumers have invested billions of dollars to use…

  1. A Study of Performance and Effort Expectancy Factors among Generational and Gender Groups to Predict Enterprise Social Software Technology Adoption

    Science.gov (United States)

    Patel, Sunil S.

    2013-01-01

    Social software technology has gained considerable popularity over the last decade and has had a great impact on hundreds of millions of people across the globe. Businesses have also expressed their interest in leveraging its use in business contexts. As a result, software vendors and business consumers have invested billions of dollars to use…

  2. Novel, Highly-Parallel Software for the Online Storage System of the ATLAS Experiment at CERN: Design and Performances

    CERN Document Server

    Colombo, T; The ATLAS collaboration

    2012-01-01

    The ATLAS experiment observes proton-proton collisions delivered by the LHC accelerator at CERN. The ATLAS Trigger and Data Acquisition (TDAQ) system selects interesting events on-line in a three-level trigger system in order to store them at a budgeted rate of several hundred Hz, for an average event size of ~1.5 MB. This paper focuses on the TDAQ data-logging system and in particular on the implementation and performance of a novel software design, reporting on the effort of exploiting the full power of multi-core hardware. In this respect, the main challenge presented by the data-logging workload is the conflict between the largely parallel nature of the event processing, including the recently introduced on-line event-compression, and the constraint of sequential file writing and checksum evaluation. This is further complicated by the necessity of operating in a fully data-driven mode, to cope with continuously evolving trigger and detector configurations. In this paper we will briefly discuss...

  3. A New High-Performance Digital FM Modulator and Demodulator for Software-Defined Radio and Its FPGA Implementation

    Directory of Open Access Journals (Sweden)

    Indranil Hatai

    2011-01-01

    Full Text Available This paper deals with an FPGA implementation of a high-performance FM modulator and demodulator for a software-defined radio (SDR) system. The individual components of the proposed FM modulator and demodulator have been optimized so that the overall design combines high speed, optimized area and low power. The modulator and demodulator contain an optimized direct digital frequency synthesizer (DDFS) based on the quarter-wave symmetry technique for generating the carrier frequency, with a spurious-free dynamic range (SFDR) of more than 64 dB. The FM modulator uses a pipelined version of the DDFS to support up-conversion in the digital domain. The proposed FM modulator and demodulator have been implemented and tested using an XC2VP30-7ff896 FPGA as the target device and can operate at maximum frequencies of 334.5 MHz and 131 MHz, involving around 1.93 K and 6.4 K equivalent gates for the FM modulator and FM demodulator respectively. The power was calculated using XPower after applying a 10 kHz triangular wave input and setting the system clock frequency to 100 MHz. The FM modulator consumes 107.67 mW while the FM demodulator consumes 108.67 mW for the same input running at the same data rate.
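
    To illustrate the building blocks mentioned above, the sketch below models a phase-accumulator FM modulator with a quarter-wave sine lookup table in floating-point Python; it is a behavioural illustration only, not the paper's fixed-point FPGA design, and the table size, accumulator width, carrier frequency and deviation are assumed values.

      import numpy as np

      FS = 100e6          # system clock, assumed 100 MHz as in the power measurement above
      LUT_BITS = 10       # quarter-wave sine table with 2**10 entries
      ACC_BITS = 32       # phase-accumulator width

      quarter = np.sin(np.pi / 2 * np.arange(2**LUT_BITS) / 2**LUT_BITS)

      def sine_from_acc(acc):
          """Map accumulator values to sin(phase) using only a quarter-wave table:
          two quadrant bits select mirroring/negation, the remaining bits index the table."""
          top = (acc >> (ACC_BITS - LUT_BITS - 2)).astype(np.int64)
          quadrant, idx = top >> LUT_BITS, top & (2**LUT_BITS - 1)
          idx = np.where(quadrant % 2 == 1, 2**LUT_BITS - 1 - idx, idx)   # mirror odd quadrants
          val = quarter[idx]
          return np.where(quadrant >= 2, -val, val)                       # negate second half-cycle

      def fm_modulate(message, f_carrier, f_dev):
          """Phase-accumulator FM: the tuning word tracks the message sample by sample."""
          tuning = ((f_carrier + f_dev * message) / FS * 2**ACC_BITS).astype(np.uint64)
          acc = np.cumsum(tuning, dtype=np.uint64) & np.uint64(2**ACC_BITS - 1)
          return sine_from_acc(acc)

      # 10 kHz triangular test tone on an assumed 5 MHz carrier with 75 kHz deviation
      t = np.arange(0, 1e-3, 1 / FS)
      msg = 2 * np.abs(2 * ((10e3 * t) % 1) - 1) - 1
      tx = fm_modulate(msg, f_carrier=5e6, f_dev=75e3)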

  4. New insights into the degradation of terpenoids with OH: a study of the OH budget in the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Kaminski, Martin; Fuchs, Hendrik; Acir, Ismail-Hakki; Bohn, Birger; Brauers, Theo; Dorn, Hans-Peter; Häseler, Rolf; Hofzumahaus, Andreas; Li, Xin; Lutz, Anna; Nehr, Sascha; Rohrer, Franz; Tillmann, Ralf; Wegener, Robert; Kiendler-Scharr, Astrid; Wahner, Andreas

    2014-05-01

    The hydroxyl radical (OH) is the main oxidation agent in the atmosphere during daytime. Recent field campaigns studying the radical chemistry in forested areas showed large discrepancies between measured and modeled OH concentrations under low-NOx conditions and when OH reactivity was dominated by VOC. These observations were only partially explained by the evidence for new efficient hydroxyl radical regeneration pathways in the isoprene oxidation mechanism. The question arises whether other reactive VOCs with high global emission rates are also capable of additional OH recycling. Besides isoprene, monoterpenes and 2-methyl-3-buten-2-ol (MBO) are the volatile organic compounds (VOC) with the highest global emission rates. Due to their high reactivity towards OH, monoterpenes and MBO can dominate the radical chemistry of the atmosphere in forested areas under certain conditions. In the present study the photochemical degradation mechanism of α-pinene, β-pinene, limonene, myrcene and MBO was investigated in the Jülich atmosphere simulation chamber SAPHIR. The focus of this study was in particular on the investigation of the OH budget in the degradation process. The photochemical degradation of these terpenoids was studied in a dedicated series of experiments in the years 2012 and 2013. The SAPHIR chamber was equipped with instrumentation to measure radicals (OH, HO2, RO2), the total OH reactivity, all important OH precursors (O3, HONO, HCHO), the parent VOC, its main oxidation products and photolysis frequencies to investigate the radical budget in the SAPHIR chamber. All experiments were carried out under low NOx conditions (≤ 2 ppb) and atmospheric terpenoid concentrations (≤ 5 ppb) with and without addition of ozone into the SAPHIR chamber. For the investigation of the OH budget all measured OH production terms were compared to the measured OH destruction. Within the limits of accuracy of the instruments the OH budget was balanced in all cases. Consequently unaccounted...

  5. Performance analysis of the Early-Est software within the tsunami early warning system installed at the INGV

    Science.gov (United States)

    Bernardi, Fabrizio; Lauciani, Valentino; Lomax, Anthony; Lorito, Stefano; Michelini, Alberto; Piatanesi, Alessio

    2014-05-01

    Fast, accurate and reliable earthquake source parameters (epicenter, depth and magnitude) are crucial for seismologically based tsunami early warning procedures. These parameters should be obtained within a few minutes after event origin time when coastlines in the near-field of the seismic source are potentially threatened. Thus there is no time for a detailed analysis and accurate revision of the automatic solution, and only a quick validation/rejection of the results may be performed in most of the cases by a seismologist. Within this context it is important to have a reliable estimate of the uncertainties of the earthquake epicenter location, depth and magnitude. Early-Est (EE) is a software currently installed at the recently established Centro Allerta Tsunami (CAT), the operational segment of the Italian National Tsunami Warning Centre (It-NTWC), in the seismic monitoring centre of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) in Rome (Italy). EE operates on continuous-realtime seismic waveform data to perform trace processing and picking, phase association, event detection, hypocenter location, and event characterization. This characterization includes mb and Mwp magnitudes, and the determination of duration, T0, large earthquake magnitude, Mwpd, and assessment of tsunamigenic potential using Td and T50Ex. In order to test the performance of the fully automatic EE solutions for tsunami early warning, we first compare the hypocenters and magnitudes provided at global scale by different agencies (NEIC, GFZ, CSEM, GCMT) for events with magnitude Mw ≥ 5.5. We then compare the empirical uncertainties we obtain in this way with EE solution and with the differences between the EE system and the reference catalogues. Our analysis shows that EE is suitable for the purpose of the CAT since it generally provides fully automatic reliable locations and magnitudes within the uncertainties expected from statistical analysis of the manually revised reference

  6. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T. [Universities Space Research Association, Washington, DC (United States); Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States); Chen, M. [Yale Univ., New Haven, CT (United States)] [and others

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  7. Order Effects of Learning with Modeling and Simulation Software on Field-Dependent and Field-Independent Children's Cognitive Performance: An Interaction Effect

    Science.gov (United States)

    Angeli, Charoula; Valanides, Nicos; Polemitou, Eirini; Fraggoulidou, Elena

    2014-01-01

    The study examined the interaction between field dependence-independence (FD/I) and learning with modeling software and simulations, and their effect on children's performance. Participants were randomly assigned into two groups. Group A first learned with a modeling tool and then with simulations. Group B learned first with simulations and then…

  8. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  9. DNA Commission of the International Society for Forensic Genetics: Recommendations on the validation of software programs performing biostatistical calculations for forensic genetics applications.

    Science.gov (United States)

    Coble, M D; Buckleton, J; Butler, J M; Egeland, T; Fimmers, R; Gill, P; Gusmão, L; Guttman, B; Krawczak, M; Morling, N; Parson, W; Pinto, N; Schneider, P M; Sherry, S T; Willuweit, S; Prinz, M

    2016-11-01

    The use of biostatistical software programs to assist in data interpretation and calculate likelihood ratios is essential to forensic geneticists and part of the daily casework flow for both kinship and DNA identification laboratories. Previous recommendations issued by the DNA Commission of the International Society for Forensic Genetics (ISFG) covered the application of biostatistical evaluations for STR typing results in identification and kinship cases, and this is now being expanded to provide best practices regarding validation and verification of the software required for these calculations. With larger multiplexes, more complex mixtures, and increasing requests for extended family testing, laboratories are relying more than ever on specific software solutions, and sufficient validation, training and extensive documentation are of utmost importance. Here, we present recommendations for the minimum requirements to validate biostatistical software to be used in forensic genetics. We distinguish between developmental validation and the responsibilities of the software developer or provider, and the internal validation studies to be performed by the end user. Recommendations for the software provider address, for example, the documentation of the underlying models used by the software, validation data expectations, version control, implementation and training support, as well as continuity and user notifications. For the internal validations the recommendations include: creating a validation plan, requirements for the range of samples to be tested, Standard Operating Procedure development, and internal laboratory training and education. To ensure that all laboratories have access to a wide range of samples for validation and training purposes the ISFG DNA commission encourages collaborative studies and public repositories of STR typing results. Published by Elsevier Ireland Ltd.
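
    As a simple example of the kind of hand calculation an internal validation might use to cross-check a software result, the sketch below computes a likelihood ratio for a single-source STR match under Hardy-Weinberg assumptions; the loci, genotypes and allele frequencies are invented, and real casework calculations (mixtures, kinship, subpopulation corrections) are considerably more involved.

      def single_source_lr(allele_freqs, genotype):
          """LR for a matching single-source profile at one autosomal locus:
          1 / P(genotype in the population) under Hardy-Weinberg equilibrium."""
          a, b = genotype
          p, q = allele_freqs[a], allele_freqs[b]
          genotype_prob = p * p if a == b else 2 * p * q
          return 1.0 / genotype_prob

      # Toy two-locus example with made-up allele frequencies
      freqs = {"D3S1358": {"15": 0.25, "16": 0.22}, "FGA": {"22": 0.19, "24": 0.14}}
      lr = (single_source_lr(freqs["D3S1358"], ("15", "16"))
            * single_source_lr(freqs["FGA"], ("22", "24")))
      print(f"Combined LR across the two loci: {lr:,.0f}")   # ~171 with these frequencies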

  10. Particle Loss Calculator – a new software tool for the assessment of the performance of aerosol inlet systems

    Directory of Open Access Journals (Sweden)

    S.-L. von der Weiden

    2009-09-01

    Full Text Available Most aerosol measurements require an inlet system to transport aerosols from a select sampling location to a suitable measurement device through some length of tubing. Such inlet systems must be optimized to minimize aerosol sampling artifacts and maximize sampling efficiency. In this study we introduce a new multifunctional software tool (Particle Loss Calculator, PLC) that can be used to quickly determine aerosol sampling efficiency and particle transport losses due to passage through arbitrary tubing systems. The software employs relevant empirical and theoretical relationships found in established literature and accounts for the most important sampling and transport effects that might be encountered during deployment of typical, ground-based ambient aerosol measurements through a constant-diameter sampling probe. The software treats non-isoaxial and non-isokinetic aerosol sampling, aerosol diffusion and sedimentation as well as turbulent inertial deposition and inertial deposition in bends and contractions of tubing. This software was validated through comparison with experimentally determined particle losses for several tubing systems bent to create various diffusion, sedimentation and inertial deposition properties. As long as the tube geometries are not "too extreme", agreement is satisfactory. We discuss the conclusions of these experiments, the limitations of the software and present three examples of the use of the Particle Loss Calculator in the field.
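
    To give a flavour of the kind of relationship such a tool evaluates, the sketch below computes the Stokes settling velocity and a deliberately crude plug-flow estimate of gravitational loss in a horizontal tube; the plug-flow formula and all dimensions are illustrative assumptions and are not the more rigorous expressions used by the Particle Loss Calculator.

      import numpy as np

      def settling_velocity(d_p, rho_p=1000.0, mu=1.81e-5, g=9.81):
          """Stokes terminal settling velocity (m/s) of a sphere of diameter d_p (m);
          reasonable for super-micron particles where slip correction is small."""
          return rho_p * d_p**2 * g / (18 * mu)

      def sedimentation_loss_plug_flow(d_p, tube_d, tube_len, flow_lpm):
          """Crude plug-flow estimate of gravitational loss in a horizontal tube:
          fraction of particles that settle more than one tube diameter in transit."""
          area = np.pi * (tube_d / 2)**2
          u = (flow_lpm / 1000 / 60) / area          # mean flow velocity, m/s
          t_res = tube_len / u                       # residence time, s
          return min(1.0, settling_velocity(d_p) * t_res / tube_d)

      # 5 µm particle, 4 mm ID tube, 2 m long, 1 L/min sample flow
      print(sedimentation_loss_plug_flow(5e-6, 4e-3, 2.0, 1.0))   # ~0.28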

  11. Particle Loss Calculator – a new software tool for the assessment of the performance of aerosol inlet systems

    Directory of Open Access Journals (Sweden)

    S.-L. von der Weiden

    2009-04-01

    Full Text Available Most aerosol measurements require an inlet system to transport aerosols from a select sampling location to a suitable measurement device through some length of tubing. Such inlet systems must be optimized to minimize aerosol sampling artifacts and maximize sampling efficiency. In this study we introduce a new multifunctional software tool (Particle Loss Calculator, PLC) that can be used to quickly determine aerosol sampling efficiency and particle transport losses due to passage through arbitrary tubing systems. The software employs relevant empirical and theoretical relationships found in established literature and accounts for the most important sampling and transport effects that might be encountered during deployment of typical, ground-based ambient aerosol measurements. The software treats non-isoaxial and non-isokinetic aerosol sampling, aerosol diffusion and sedimentation as well as turbulent inertial deposition and inertial deposition in bends and contractions of tubing. This software was validated through comparison with experimentally determined particle losses for several tubing systems bent to create various diffusion, sedimentation and inertial deposition properties. As long as the tube geometries are not "too extreme", agreement is satisfactory. We discuss the conclusions of these experiments, the limitations of the software and present three examples of the use of the Particle Loss Calculator in the field.
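
    The loss mechanisms listed above can be illustrated with a highly simplified calculation. The sketch below is not the Particle Loss Calculator's algorithm; it uses the textbook Stokes settling velocity (slip correction neglected) and a crude plug-flow estimate to bound gravitational sedimentation loss in a horizontal sampling tube.

```python
import math

def stokes_settling_velocity(d_p, rho_p=1000.0, mu=1.81e-5, g=9.81):
    """Terminal settling velocity (m/s) of a sphere in air via Stokes' law.
    d_p: particle diameter (m); rho_p: particle density (kg/m^3);
    mu: dynamic viscosity of air (Pa*s). Slip correction is neglected."""
    return rho_p * d_p**2 * g / (18.0 * mu)

def sedimentation_loss_fraction(d_p, tube_diam, tube_len, flow_lpm):
    """Crude upper-bound estimate of the fraction of particles lost to
    gravitational settling in a horizontal tube, assuming plug flow:
    fraction ~ settling distance during residence time / tube diameter."""
    area = math.pi * (tube_diam / 2.0) ** 2          # tube cross-section (m^2)
    u = (flow_lpm / 1000.0 / 60.0) / area            # mean flow velocity (m/s)
    t_res = tube_len / u                             # residence time (s)
    return min(1.0, stokes_settling_velocity(d_p) * t_res / tube_diam)

# 5 um particle, 4 mm ID tube, 2 m long, 1 L/min sample flow
loss = sedimentation_loss_fraction(5e-6, 4e-3, 2.0, 1.0)
print(f"estimated sedimentation loss: {loss:.0%}")
```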

  12. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  13. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: the use of open source software. The use of open source software is spreading along with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many notions of what open source software is, ranging from software that is free of charge to software without a license. These notions are not entirely correct, so the concept of open source software needs to be introduced, covering its history, licenses and how to choose a license, as well as considerations when selecting among the available open source software. Keywords: License, Open Source, HAKI

  14. Software Reviews.

    Science.gov (United States)

    Smith, Richard L., Ed.

    1985-01-01

    Reviews software packages by providing extensive descriptions and discussions of their strengths and weaknesses. Software reviewed includes (1) "VISIFROG: Vertebrate Anatomy" (grade seven-adult); (2) "Fraction Bars Computer Program" (grades three to six); and (3) four telecommunications utilities. (JN)

  15. Software Project Performance Evaluation Based on Fuzzy Neural Network%基于FNN的软件项目绩效评价模型研究

    Institute of Scientific and Technical Information of China (English)

    于本海; 张金隆; 吴恒亮; 郑丽伟

    2011-01-01

    Analyzing the shortcomings of existing software project performance evaluation - an incomplete indicator system, informal evaluation methods, and models that consider too few factors - an indicator system covering the software organization's status and the software project's own characteristics is established using statistical analysis theory. The connotation of software project performance is defined through a literature review. A new network topology design method is proposed, and a software project performance evaluation model based on a fuzzy neural network is established; an improved particle swarm optimization learning algorithm is introduced to determine the evaluation model's connection weights accurately and efficiently. Empirical research indicates that the model can effectively evaluate software project performance and identify project risk factors, providing decision support for software organizations in formulating risk-avoidance strategies and improving project performance.

  16. Parallel Software Model Checking

    Science.gov (United States)

    2015-01-08

    Report documentation page residue (standard report form fields). Recoverable details: report dated January 2015; performing organization Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213; team members Sagar Chaki and Arie Gurfinkel.

  17. Quantitative and qualitative assessment of diurnal variability in tropospheric humidity using SAPHIR on-board Megha-Tropiques

    Science.gov (United States)

    Uma, K. N.; Das, Siddarth Shankar

    2016-08-01

    The global diurnal variability of relative humidity (RH) from August 2012 to May 2014 is discussed for the first time using the 'Sounder for Atmospheric Profiling of Humidity in the Inter-tropical Regions (SAPHIR)', a microwave humidity sounder onboard Megha-Tropiques (MT). It is superior to other microwave satellite humidity sounders in terms of its higher repetitive cycle in the tropics, owing to its low-inclination orbit and the availability of six dedicated humidity sounding channels. The six layers obtained are 1000-850, 850-700, 700-550, 550-400, 400-250 and 250-100 hPa. Three-hourly data over a month have been combined using equivalent-day analysis to obtain a composite profile of the complete diurnal cycle in each grid (2.5°×2.5°). A distinct diurnal variation is obtained over the continental and the oceanic regions at all layers. The magnitudes of the lower tropospheric humidity (LTH), middle tropospheric humidity (MTH) and upper tropospheric humidity (UTH) show larger variability over the continental regions than over the oceans. The monthly variability of the diurnal variation over the years has also been discussed by segregating the data into five continental and four oceanic regions. Afternoon peaks dominate the LTH over the land and desert regions. The MTH is found to vary between the evening and early morning hours over different geographical regions and is not as consistent as the LTH. The UTH maximum is generally observed during the early morning hours over the continents. Interestingly, the oceanic regions show a dominant afternoon maximum in the LTH, similar to the continents, an evening maximum in the MTH and an early morning maximum in the UTH. The underlying mechanisms involved in the variability of humidity over different regions are also discussed. The study reveals the complexity involved in understanding the diurnal variability over the continents and open oceans.
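
    The compositing step described above (binning observations by local time on a regular grid) is easy to picture in code. The sketch below is only an illustration of that binning; the actual SAPHIR data format, the equivalent-day analysis details and quality control are not reproduced here.

```python
import numpy as np

def diurnal_composite(lon, lat, utc_hour, rh, grid_res=2.5, bin_hours=3):
    """Composite relative humidity into local-time bins on a lat/lon grid.
    lon, lat in degrees, utc_hour in [0, 24), rh in percent (1-D arrays)."""
    local_hour = (np.asarray(utc_hour) + np.asarray(lon) / 15.0) % 24.0
    n_bins = 24 // bin_hours
    nlat, nlon = int(180 / grid_res), int(360 / grid_res)
    sums = np.zeros((n_bins, nlat, nlon))
    counts = np.zeros_like(sums)

    t = (local_hour // bin_hours).astype(int)
    i = np.clip(((np.asarray(lat) + 90.0) // grid_res).astype(int), 0, nlat - 1)
    j = np.clip(((np.asarray(lon) % 360.0) // grid_res).astype(int), 0, nlon - 1)
    np.add.at(sums, (t, i, j), rh)
    np.add.at(counts, (t, i, j), 1.0)

    with np.errstate(invalid="ignore"):
        return sums / counts          # NaN where a grid box has no samples

# toy usage with synthetic observations
rng = np.random.default_rng(0)
n = 10000
comp = diurnal_composite(rng.uniform(60, 100, n), rng.uniform(-30, 30, n),
                         rng.uniform(0, 24, n), rng.uniform(20, 90, n))
print(comp.shape)   # (8, 72, 144): eight 3-hour bins on a 2.5-degree grid
```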

  18. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth models.

  19. MieLab: A Software Tool to Perform Calculations on the Scattering of Electromagnetic Waves by Multilayered Spheres

    Directory of Open Access Journals (Sweden)

    Ovidio Peña-Rodríguez

    2011-01-01

    Full Text Available In this paper, we present MieLab, a free computational package for simulating the scattering of electromagnetic radiation by multilayered spheres or an ensemble of particles with a normal size distribution. It has been designed as a virtual laboratory, including a friendly graphical user interface (GUI), an optimization algorithm (to fit the simulations to experimental results) and scripting capabilities. The paper is structured in five sections: the introduction offers a perspective on the importance of the software for the study of light scattering. In the second section, various approaches used for modeling the scattering of electromagnetic radiation by small particles are discussed. The third and fourth sections are devoted to providing an overview of MieLab and to describing the main features of its architectural model and functional behavior, respectively. Finally, several examples are provided to illustrate the main characteristics of the software.
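
    One of the features mentioned above, averaging over an ensemble of particles with a normal size distribution, reduces to weighting a single-particle quantity by a Gaussian in radius. The snippet below sketches only that averaging step; the single-particle efficiency used here is a smooth placeholder function, not a Mie solution, and none of MieLab's actual code is reproduced.

```python
import numpy as np

def ensemble_average(q_single, mean_radius, sigma, n_points=201):
    """Average a single-particle quantity q_single(r) over a normal size
    distribution N(mean_radius, sigma), truncated at positive radii."""
    r = np.linspace(max(mean_radius - 4 * sigma, 1e-12),
                    mean_radius + 4 * sigma, n_points)
    w = np.exp(-0.5 * ((r - mean_radius) / sigma) ** 2)   # Gaussian weights
    return float(np.sum(q_single(r) * w) / np.sum(w))

# placeholder efficiency curve (NOT a Mie calculation), used only to
# demonstrate the ensemble-averaging step
wavelength = 550e-9
toy_q = lambda r: 2.0 * (1.0 - np.cos(4.0 * np.pi * r / wavelength) /
                         (1.0 + (2.0 * np.pi * r / wavelength) ** 2))

print(f"ensemble-averaged value: {ensemble_average(toy_q, 100e-9, 20e-9):.3f}")
```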

  20. Measurement of the reaction {gamma}p{yields}K{sup 0}{sigma}{sup +} for photon energies up to 2.65 GeV with the SAPHIR detector at ELSA; Messung der Reaktion {gamma}p {yields} K{sup 0}{sigma}{sup +} fuer Photonenergien bis 2.65 GeV mit dem SAPHIR-Detektor an ELSA

    Energy Technology Data Exchange (ETDEWEB)

    Lawall, R.

    2004-01-01

    The reaction {gamma}p {yields} K{sup 0}{sigma}{sup +} was measured with the SAPHIR-detector at ELSA during the run periods 1997 and 1998. Results were obtained for cross sections in the photon energy range from threshold up to 2.65 GeV for all production angles and for the {sigma}{sup +}-polarization. Emphasis has been put on the determination and reduction of the contributions of background reactions and the comparison with other measurements and theoretical predictions. (orig.)

  1. Threats to Bitcoin Software

    OpenAIRE

    Kateraas, Christian H

    2014-01-01

    Collect and analyse threat models to the Bitcoin ecosystem and its software. Then create misuse cases, attack trees, and sequence diagrams of the threats. Create a malicious client from the gathered threat models. Once the development of the client is complete, test the client and evaluate its performance. From this, assess the security of the Bitcoin software.

  2. COTS software selection process.

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, William M. (Strike Wire Technologies, Louisville, CO); Lin, Han Wei; McClelland, Kelly (U.S. Security Associates, Livermore, CA); Ullrich, Rebecca Ann; Khanjenoori, Soheil; Dalton, Karen; Lai, Anh Tri; Kuca, Michal; Pacheco, Sandra; Shaffer-Gant, Jessica

    2006-05-01

    Today's need for rapid software development has generated a great interest in employing Commercial-Off-The-Shelf (COTS) software products as a way of managing cost, development time, and effort. With an abundance of COTS software packages to choose from, the problem now is how to systematically evaluate, rank, and select a COTS product that best meets the software project requirements and at the same time can leverage the current corporate information technology architectural environment. This paper describes a systematic process for decision support in evaluating and ranking COTS software. Performed right after the requirements analysis, this process provides the evaluators with more concise, structured, and step-by-step activities for determining the best COTS software product with manageable risk. In addition, the process is presented in phases that are flexible to allow for customization or tailoring to meet various projects' requirements.
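
    The evaluation-and-ranking step of such a process is, at its core, a weighted scoring exercise. The sketch below is a generic illustration with invented criteria, weights and scores; it is not the specific decision-support process described in the paper.

```python
# Criteria weights (summing to 1) and per-candidate scores on a 1-5 scale;
# all names and values are invented for illustration.
weights = {"functional fit": 0.35, "cost": 0.20, "vendor support": 0.15,
           "interoperability": 0.20, "security": 0.10}

candidates = {
    "Package A": {"functional fit": 4, "cost": 3, "vendor support": 5,
                  "interoperability": 4, "security": 3},
    "Package B": {"functional fit": 5, "cost": 2, "vendor support": 3,
                  "interoperability": 3, "security": 4},
    "Package C": {"functional fit": 3, "cost": 5, "vendor support": 4,
                  "interoperability": 4, "security": 4},
}

def weighted_score(scores):
    """Weighted sum of a candidate's criterion scores."""
    return sum(weights[c] * scores[c] for c in weights)

for name in sorted(candidates, key=lambda n: weighted_score(candidates[n]),
                   reverse=True):
    print(f"{name}: {weighted_score(candidates[name]):.2f}")
```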

  3. HeatmapGenerator: high performance RNAseq and microarray visualization software suite to examine differential gene expression levels using an R and C++ hybrid computational pipeline.

    Science.gov (United States)

    Khomtchouk, Bohdan B; Van Booven, Derek J; Wahlestedt, Claes

    2014-01-01

    The graphical visualization of gene expression data using heatmaps has become an integral component of modern-day medical research. Heatmaps are used extensively to plot quantitative differences in gene expression levels, such as those measured with RNAseq and microarray experiments, to provide qualitative large-scale views of the transcriptomic landscape. Creating high-quality heatmaps is a computationally intensive task, often requiring considerable programming experience, particularly for customizing features to a specific dataset at hand. Software to create publication-quality heatmaps is developed with the R programming language, C++ programming language, and OpenGL application programming interface (API) to create industry-grade high performance graphics. We create a graphical user interface (GUI) software package called HeatmapGenerator for Windows OS and Mac OS X as an intuitive, user-friendly alternative for researchers with minimal prior coding experience, allowing them to create publication-quality heatmaps using R graphics without sacrificing their desired level of customization. The simplicity of HeatmapGenerator is that it only requires the user to upload a preformatted input file and download the publicly available R software language, among a few other operating system-specific requirements. Advanced features such as color, text labels, scaling, legend construction, and even database storage can be easily customized with no prior programming knowledge. We provide an intuitive and user-friendly software package, HeatmapGenerator, to create high-quality, customizable heatmaps generated using the high-resolution color graphics capabilities of R. The software is available for Microsoft Windows and Apple Mac OS X. HeatmapGenerator is released under the GNU General Public License and publicly available at: http://sourceforge.net/projects/heatmapgenerator/. The Mac OS X direct download is available at: http://sourceforge.net/projects
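
    HeatmapGenerator itself is an R/C++ GUI application; purely as an illustration of the underlying task (row-scaled expression values rendered as a colour matrix), here is a minimal, analogous sketch in Python with matplotlib using synthetic data. The file name, labels and colour map are arbitrary choices, not part of the package described above.

```python
import numpy as np
import matplotlib.pyplot as plt

# synthetic genes x samples expression matrix standing in for RNAseq/microarray data
rng = np.random.default_rng(1)
genes = [f"gene_{i}" for i in range(20)]
samples = [f"sample_{j}" for j in range(6)]
expr = rng.normal(size=(len(genes), len(samples)))

# row-wise z-score, a common scaling before plotting expression heatmaps
z = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(axis=1, keepdims=True)

fig, ax = plt.subplots(figsize=(4, 6))
im = ax.imshow(z, aspect="auto", cmap="RdBu_r")
ax.set_xticks(range(len(samples)))
ax.set_xticklabels(samples, rotation=90)
ax.set_yticks(range(len(genes)))
ax.set_yticklabels(genes, fontsize=6)
fig.colorbar(im, ax=ax, label="row z-score")
fig.tight_layout()
fig.savefig("heatmap.png", dpi=200)
```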

  4. Software piracy

    OpenAIRE

    Kráčmer, Stanislav

    2011-01-01

    The objective of the present thesis is to clarify the term of software piracy and to determine responsibility of individual entities as to actual realization of software piracy. First, the thesis focuses on a computer programme, causes, realization and pitfalls of its inclusion under copyright protection. Subsequently, it observes methods of legal usage of a computer programme. This is the point of departure for the following attempt to define software piracy, accompanied with methods of actu...

  5. Comparison of Relative Humidity obtained from SAPHIR on board Megha-Tropiques and Ground based Microwave Radiometer Profiler over an equatorial station

    Science.gov (United States)

    Renju, Ramachandran Pillai; Uma, K. N.; Krishna Moorthy, K.; Mathew, Nizy; Raju C, Suresh

    A comparison has been made between SAPHIR on board Megha-Tropiques (MT) derived relative humidity (RH, %) and that derived from a ground-based multi-frequency Microwave Radiometer Profiler (MRP) over the equatorial station Thiruvananthapuram (8.5°N, 76.9°E) for a one-year period. As a first step, the MRP was validated against radiosondes for two years (2010 and 2011) during the Indian monsoon period July-September. This analysis shows a wet bias below 6 km and a dry bias above. The comparison between the MRP and MT derived RH has been made at five different altitude levels (0.75, 2.25, 4.0, 6.25 and 9.2 km), strictly under clear-sky conditions. The regression analysis between the two reveals very good correlation (>0.8) in the altitude range 2.25 to 6.25 km. The differences between the two observations have also been explained in terms of percentage of occurrence between MT and the MRP at each altitude layer. About 70-80% of the time, the difference in RH is found to be below 10% at the first three layers. An RMSE of 2% is observed at almost all height layers. The differences have been attributed to the different measurement and retrieval techniques involved in the ground-based and satellite-based measurements. Since the MRP frequency channels are not sensitive to small water vapor variabilities above 6 km, large differences are observed there. Radiative transfer computations for the channels of both MRP and SAPHIR will be carried out to understand the variabilities.
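
    The layer-wise agreement statistics quoted above (bias, RMSE, correlation) can be reproduced for any pair of coincident RH series with a few lines of code. The sketch below uses synthetic numbers, not the MRP or SAPHIR data.

```python
import numpy as np

def layer_stats(rh_sat, rh_mrp):
    """Bias, RMSE and Pearson correlation between two coincident RH series
    (e.g. satellite vs. ground-based radiometer) for one altitude layer."""
    a, b = np.asarray(rh_sat, float), np.asarray(rh_mrp, float)
    ok = np.isfinite(a) & np.isfinite(b)        # drop missing samples
    a, b = a[ok], b[ok]
    bias = np.mean(a - b)
    rmse = np.sqrt(np.mean((a - b) ** 2))
    corr = np.corrcoef(a, b)[0, 1]
    return bias, rmse, corr

# toy coincident samples for one layer
rng = np.random.default_rng(2)
truth = rng.uniform(30, 90, 200)
sat = truth + rng.normal(0, 5, 200)
mrp = truth + rng.normal(2, 5, 200)             # simulated wet bias
print("bias=%.2f%%  rmse=%.2f%%  r=%.2f" % layer_stats(sat, mrp))
```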

  6. 基于AHP与云模型的软件过程绩效评价方法%Software process performance evaluation based on AHP and cloud model

    Institute of Scientific and Technical Information of China (English)

    刘东飞

    2013-01-01

    To improve software quality and enhance an organization's software process capability, a software process performance evaluation method based on AHP and the cloud model is proposed. After analyzing the organization's process goals, a mapping from processes to sub-processes to measurement items is established. Historical organizational data are collected and, based on project and product measurements, a process baseline is constructed. The AHP method is used to quantitatively determine the weight of each level relative to the goal. On this basis, the cloud model is applied to the project and product measurement data to evaluate process performance qualitatively and to improve software process capability in a continuous, quantitative way. Finally, a practical application example is given, and the results demonstrate the feasibility of the evaluation method.
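
    The AHP weighting step mentioned above has a standard formulation: the weights are the normalized principal eigenvector of a pairwise comparison matrix, checked with a consistency ratio. The sketch below shows only that step with an invented comparison matrix; the cloud-model part of the method is not reproduced.

```python
import numpy as np

# Saaty's random consistency index (RI) for matrix orders 1..9
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(pairwise):
    """Priority weights from a pairwise comparison matrix via the principal
    eigenvector (AHP), together with the consistency ratio CR."""
    A = np.asarray(pairwise, float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ci = (eigvals[k].real - n) / (n - 1) if n > 1 else 0.0
    cr = ci / RI[n] if RI.get(n, 0.0) > 0 else 0.0
    return w, cr

# invented comparison of three measurement items for one process goal
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w, cr = ahp_weights(A)
print("weights:", np.round(w, 3), " CR:", round(cr, 3))   # CR < 0.1 is acceptable
```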

  7. CONTRIBUTION TO THE DEVELOPMENT OF A SIMULATION SOFTWARE PERFORMANCE AND SHARING RATIO IN LIQUID-LIQUID EXTRACTION

    Directory of Open Access Journals (Sweden)

    A. Hadj Seyd

    2015-07-01

    Full Text Available The present work develops software to predict the yield and the distribution coefficient in the liquid-liquid extraction of components of a mixture, from mathematical models expressing these quantities based on liquid-liquid phase equilibrium equations, and to predict the conditions under which the extraction operation is favorable, unfavorable or impossible to realize, by studying the variation of these quantities with the parameters influencing the extraction, namely initial concentrations, solvent ratio and pH, in the case of a simple extraction (extraction of neutral products) or a reactive one (extraction of complex acids or bases) for one or more components. The programming language used is "Delphi", a very powerful object-oriented programming environment under Windows.
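
    The kind of relationship such software evaluates can be seen in the standard single-stage equilibrium expression: for a distribution coefficient K_D = C_org/C_aq and a given solvent-to-feed volume ratio, the extracted fraction follows directly. The snippet below is a generic textbook calculation, not the Delphi program described above.

```python
def extraction_yield(kd, v_org, v_aq, n_stages=1):
    """Fraction of solute extracted after n successive equilibrium stages,
    each with fresh solvent: E = 1 - (1 + KD*Vorg/Vaq)^(-n)."""
    ratio = kd * v_org / v_aq
    return 1.0 - (1.0 + ratio) ** (-n_stages)

# distribution coefficient 4, aqueous feed 50 mL: one stage with 50 mL of
# solvent versus three successive stages of ~16.7 mL each
print(f"single stage : {extraction_yield(4.0, 50.0, 50.0):.1%}")
print(f"three stages : {extraction_yield(4.0, 50.0 / 3, 50.0, n_stages=3):.1%}")
```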

  8. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  9. Technical Note: Development and performance of a software tool for quality assurance of online replanning with a conventional Linac or MR-Linac.

    Science.gov (United States)

    Chen, Guang-Pei; Ahunbay, Ergun; Li, X Allen

    2016-04-01

    To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from the treatment planning system (TPS) to the record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without the presence of a magnetic field from an MR-Linac, and validating the delivery record's consistency with the plan. The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose-volume histograms. Beams in the TPS and R&V system are matched based on geometry configurations. To consider the effect of the 1.5 T transverse magnetic field of an MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. ArtQA has been used clinically and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency of the plan check, including plan quality, data transfer, and delivery checks, can be improved by at least 60%. The newly developed independent MU calculation tool for the MR-Linac reduces the difference between the planned and calculated MUs by 10%. The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with a conventional Linac or MR-Linac and is an essential tool for online replanning where the QA check needs to be performed rapidly.
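
    The data-transfer verification part of such a tool boils down to a field-by-field comparison of the plan as stored in the TPS and as stored in the R&V database. The sketch below illustrates only that comparison step; the dictionaries, field names and tolerances are hypothetical and do not reflect ArtQA's internal data model.

```python
# Hypothetical beam records as they might look after being read from a TPS
# export and an R&V database query (field names invented for this sketch).
tps_beam = {"beam_id": "B1", "gantry": 180.0, "collimator": 10.0,
            "energy_mv": 6, "mu": 124.7}
rv_beam  = {"beam_id": "B1", "gantry": 180.0, "collimator": 10.0,
            "energy_mv": 6, "mu": 124.9}

def compare_beam(tps, rv, tolerances):
    """Report fields whose TPS and R&V values disagree beyond a tolerance."""
    issues = []
    for field, tol in tolerances.items():
        a, b = tps.get(field), rv.get(field)
        if isinstance(a, (int, float)) and isinstance(b, (int, float)):
            if abs(a - b) > tol:
                issues.append(f"{field}: TPS={a} vs R&V={b}")
        elif a != b:
            issues.append(f"{field}: TPS={a!r} vs R&V={b!r}")
    return issues

tolerances = {"beam_id": 0, "gantry": 0.1, "collimator": 0.1,
              "energy_mv": 0, "mu": 0.5}
for problem in compare_beam(tps_beam, rv_beam, tolerances):
    print("MISMATCH:", problem)
print("check complete")
```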

  10. nu-TRLan User Guide Version 1.0: A High-Performance Software Package for Large-Scale Hermitian Eigenvalue Problems

    Energy Technology Data Exchange (ETDEWEB)

    Yamazaki, Ichitaro; Wu, Kesheng; Simon, Horst

    2008-10-27

    The original software package TRLan, [TRLan User Guide], page 24, implements the thick restart Lanczos method, [Wu and Simon 2001], page 24, for computing eigenvalues {lambda} and their corresponding eigenvectors v of a symmetric matrix A: Av = {lambda}v. Its effectiveness in computing the exterior eigenvalues of a large matrix has been demonstrated, [LBNL-42982], page 24. However, its performance strongly depends on the user-specified dimension of a projection subspace. If the dimension is too small, TRLan suffers from slow convergence. If it is too large, the computational and memory costs become expensive. Therefore, to balance the solution convergence and costs, users must select an appropriate subspace dimension for each eigenvalue problem at hand. To free users from this difficult task, nu-TRLan, [LNBL-1059E], page 23, adjusts the subspace dimension at every restart such that optimal performance in solving the eigenvalue problem is automatically obtained. This document provides a user guide to the nu-TRLan software package. The original TRLan software package was implemented in Fortran 90 to solve symmetric eigenvalue problems using static projection subspace dimensions. nu-TRLan was developed in C and extended to solve Hermitian eigenvalue problems. It can be invoked using either a static or an adaptive subspace dimension. In order to simplify its use for TRLan users, nu-TRLan has interfaces and features similar to those of TRLan: (1) Solver parameters are stored in a single data structure called trl-info, Chapter 4 [trl-info structure], page 7. (2) Most of the numerical computations are performed by BLAS, [BLAS], page 23, and LAPACK, [LAPACK], page 23, subroutines, which allow nu-TRLan to achieve optimized performance across a wide range of platforms. (3) To solve eigenvalue problems on distributed memory systems, the message passing interface (MPI), [MPI forum], page 23, is used. The rest of this document is organized as follows. In Chapter 2 [Installation
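
    nu-TRLan itself is a C library with its own thick-restart implementation; as a rough point of comparison only, the snippet below solves a small dense Hermitian eigenvalue problem with SciPy's Lanczos-based eigsh, where the ncv parameter plays a role loosely analogous to the projection-subspace dimension discussed above.

```python
import numpy as np
from scipy.sparse.linalg import eigsh

# build a small random Hermitian matrix A = B + B^H
rng = np.random.default_rng(3)
B = rng.normal(size=(500, 500)) + 1j * rng.normal(size=(500, 500))
A = B + B.conj().T

# five largest-algebraic eigenpairs via a Lanczos-type iteration; a small ncv
# slows convergence, a large ncv costs memory and time
vals, vecs = eigsh(A, k=5, which="LA", ncv=20)
print(np.sort(vals))

residual = np.linalg.norm(A @ vecs[:, 0] - vals[0] * vecs[:, 0])
print("residual of first computed pair:", residual)
```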

  11. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performance. This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. We discuss these issues along with use case scenarios. Here in this paper we aim to identify challenges

  12. Software Requirements Management

    Directory of Open Access Journals (Sweden)

    Ali Altalbe

    2015-04-01

    Full Text Available Requirements are defined as the desired set of characteristics of a product or a service. In the world of software development, it is estimated that more than half of the failures are attributed to poor requirements management. This means that although the software functions correctly, it is not what the client requested. Modern software requirements management methodologies are available to reduce the occurrence of such incidents. This paper performs a review of the available literature in the area while tabulating possible methods of managing requirements. It also highlights the benefits of following a proper guideline for the requirements management task. With the introduction of specific software tools for the requirements management task, better software products are now being developed with fewer resources.

  13. Software safety hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  14. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cycle.

  15. Computational Software to Fit Seismic Data Using Epidemic-Type Aftershock Sequence Models and Modeling Performance Comparisons

    Science.gov (United States)

    Chu, A.

    2016-12-01

    Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work implements three of the homogeneous ETAS models described in Ogata (1998). Using a model's log-likelihood function, my software finds the maximum-likelihood estimates (MLEs) of the model's parameters to estimate the homogeneous background rate and the temporal and spatial parameters that govern triggering effects. The EM algorithm is employed for its advantages of stability and robustness (Veen and Schoenberg, 2008). My work also presents comparisons among the three models in robustness, convergence speed, and implementation from theory to computing practice. Up-to-date regional seismic data of seismically active areas such as Southern California and Japan are used to demonstrate the comparisons. Data analysis has been done using the computer languages Java and R. Java has the advantages of being strongly typed and giving easier control over memory resources, while R has the advantage of numerous built-in functions for statistical computing. Comparisons are also made between the two programming languages in convergence and stability, computational speed, and ease of implementation. Issues that may affect convergence, such as spatial shapes, are discussed.
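
    Ogata's full space-time ETAS likelihood is involved; as a compact illustration of the type of objective function being maximized, the sketch below evaluates a purely temporal Hawkes-type log-likelihood with an exponential triggering kernel. This is a simplified stand-in, not one of the three models implemented in the software described above.

```python
import numpy as np

def hawkes_loglik(params, times, t_end):
    """Log-likelihood of a temporal Hawkes process with conditional intensity
    lambda(t) = mu + alpha*beta * sum_{t_i < t} exp(-beta*(t - t_i)).
    The exponential kernel is a simplified stand-in for the Omori-law
    triggering kernel used in ETAS."""
    mu, alpha, beta = params
    times = np.sort(np.asarray(times, float))
    ll, decay, prev = 0.0, 0.0, None
    for t in times:
        if prev is not None:
            decay = (decay + 1.0) * np.exp(-beta * (t - prev))  # recursive sum
        ll += np.log(mu + alpha * beta * decay)
        prev = t
    # integral of the intensity over the observation window [0, t_end]
    integral = mu * t_end + alpha * np.sum(1.0 - np.exp(-beta * (t_end - times)))
    return ll - integral

# toy catalogue of event times (days) in a 100-day observation window
events = [1.2, 1.3, 5.0, 5.1, 5.15, 20.0, 40.0, 40.2, 90.0]
print(hawkes_loglik((0.05, 0.5, 2.0), events, t_end=100.0))
```

    Maximizing such a function over (mu, alpha, beta), for example with scipy.optimize.minimize applied to its negative, yields the kind of MLEs discussed above.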

  16. Scientific Software Component Technology

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, S.; Dykman, N.; Kumfert, G.; Smolinski, B.

    2000-02-16

    We are developing new software component technology for high-performance parallel scientific computing to address issues of complexity, re-use, and interoperability for laboratory software. Component technology enables cross-project code re-use, reduces software development costs, and provides additional simulation capabilities for massively parallel laboratory application codes. The success of our approach will be measured by its impact on DOE mathematical and scientific software efforts. Thus, we are collaborating closely with library developers and application scientists in the Common Component Architecture forum, the Equation Solver Interface forum, and other DOE mathematical software groups to gather requirements, write and adopt a variety of design specifications, and develop demonstration projects to validate our approach. Numerical simulation is essential to the science mission at the laboratory. However, it is becoming increasingly difficult to manage the complexity of modern simulation software. Computational scientists develop complex, three-dimensional, massively parallel, full-physics simulations that require the integration of diverse software packages written by outside development teams. Currently, the integration of a new software package, such as a new linear solver library, can require several months of effort. Current industry component technologies such as CORBA, JavaBeans, and COM have all been used successfully in the business domain to reduce software development costs and increase software quality. However, these existing industry component infrastructures will not scale to support massively parallel applications in science and engineering. In particular, they do not address issues related to high-performance parallel computing on ASCI-class machines, such as fast in-process connections between components, language interoperability for scientific languages such as Fortran, parallel data redistribution between components, and massively

  17. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

    Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application. Traditional forms of technical education pay little attention to creativity - often encouraging overly rationalistic ways of thinking which stifle the ability to innovate. Professional software developers are often drowned in commercial drudgery and overwhelmed by work pressure and deadlines. The topic that will both ensure success in the market and revitalize their work lives is never addressed. This book sets out the new field of software innovation. It organizes the existing scientific research into eight simple heuristics - guiding principles for organizing a system developer's work-life so that it focuses on innovation.

  18. Software Reviews.

    Science.gov (United States)

    Classroom Computer Learning, 1990

    1990-01-01

    Reviewed are three computer software packages including "Martin Luther King, Jr.: Instant Replay of History,""Weeds to Trees," and "The New Print Shop, School Edition." Discussed are hardware requirements, costs, grade levels, availability, emphasis, strengths, and weaknesses. (CW)

  19. Software Reviews.

    Science.gov (United States)

    Wulfson, Stephen, Ed.

    1987-01-01

    Reviews seven computer software programs that can be used in science education programs. Describes courseware which deals with muscles and bones, terminology, classifying animals without backbones, molecular structures, drugs, genetics, and shaping the earth's surface. (TW)

  20. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  1. Reusable Software.

    Science.gov (United States)

    1984-03-01

    Fragmentary abstract (extraction residue). The recoverable content concerns an organization for overseeing reusable software, the Reusable Software Organization (RUSO): the author does not feel at this time that establishment of such a specific organization is required, notes the need for assurance of the functions a RUSO might perform, including establishment and maintenance of reuse archival facilities and activities, and observes that actual establishment of a RUSO is best dictated by the size of the effort.

  2. The Impact of Internal and External Resources, and Strategic Actions in Business Networks on Firm Performance in the Software Industry

    OpenAIRE

    Anggraeni, E.

    2014-01-01

    Understanding the variance in firm performance has been an important topic in the strategic management literature. In the last two decades it has become particularly interesting as business networks increasingly have become an integrated part of a firm's environment. Besides the internal resources, the less-controlled external resources in the firm's business networks affect its performance too. The uncertainty associated with the lower levels of control over external resources implies tha...

  4. Performance of automated software in the assessment of segmental left ventricular function in cardiac CT: Comparison with cardiac magnetic resonance

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Rui [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Capital Medical University, Department of Radiology, Beijing Anzhen Hospital, Beijing (China); Meinel, Felix G. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Ludwig-Maximilians-University Hospital, Institute for Clinical Radiology, Munich (Germany); Schoepf, U.J. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiology, Department of Medicine, Charleston, SC (United States); Canstein, Christian [Siemens Medical Solutions USA, Malvern, PA (United States); Spearman, James V. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); De Cecco, Carlo N. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Rome ' ' Sapienza' ' , Departments of Radiological Sciences, Oncology and Pathology, Latina (Italy)

    2015-12-15

    To evaluate the accuracy, reliability and time saving potential of a novel cardiac CT (CCT)-based, automated software for the assessment of segmental left ventricular function compared to visual and manual quantitative assessment of CCT and cardiac magnetic resonance (CMR). Forty-seven patients with suspected or known coronary artery disease (CAD) were enrolled in the study. Wall thickening was calculated. Segmental LV wall motion was automatically calculated and shown as a colour-coded polar map. Processing time for each method was recorded. Mean wall thickness in both systolic and diastolic phases on polar map, CCT, and CMR was 9.2 ± 0.1 mm and 14.9 ± 0.2 mm, 8.9 ± 0.1 mm and 14.5 ± 0.1 mm, 8.3 ± 0.1 mm and 13.6 ± 0.1 mm, respectively. Mean wall thickening was 68.4 ± 1.5 %, 64.8 ± 1.4 % and 67.1 ± 1.4 %, respectively. Agreement for the assessment of LV wall motion between CCT, CMR and polar maps was good. Bland-Altman plots and ICC indicated good agreement between CCT, CMR and automated polar maps of the diastolic and systolic segmental wall thickness and thickening. The processing time using polar map was significantly decreased compared with CCT and CMR. Automated evaluation of segmental LV function with polar maps provides similar measurements to manual CCT and CMR evaluation, albeit with substantially reduced analysis time. (orig.)
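
    The agreement statistics mentioned above are straightforward to reproduce for any pair of measurement methods. The sketch below computes the Bland-Altman bias and 95% limits of agreement from synthetic paired values; it uses invented numbers, not the study's measurements.

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement between two methods."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# synthetic paired wall-thickness measurements (mm) from two methods
rng = np.random.default_rng(4)
cct = rng.normal(9.0, 1.0, 47)
cmr = cct + rng.normal(-0.3, 0.5, 47)      # simulated systematic offset
bias, low, high = bland_altman(cct, cmr)
print(f"bias = {bias:.2f} mm, 95% limits of agreement = [{low:.2f}, {high:.2f}] mm")
```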

  5. Gammasphere software development. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information.

  6. Advanced fingerprint verification software

    Science.gov (United States)

    Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.

    2016-05-01

    We have developed a fingerprint software package that can be used in a wide range of applications from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine [1]. Solid and robust performance is achieved in the presence of misplaced and low quality fingerprints.

  7. Speakeasy software development

    Science.gov (United States)

    Baskinger, Patricia J.; Ozarow, Larry; Chruscicki, Mary C.

    1993-08-01

    The Speakeasy Software Development Project had three primary objectives. The first objective was to perform Independent Verification and Validation (IV & V) of the software and documentation associated with the signal processor being developed by Hazeltine and TRW under the Speakeasy program. The IV & V task also included an analysis and assessment of the ability of the signal processor software to provide LPI communications functions. The second objective was to assist in the enhancement and modification of an existing Rome Lab signal processor workstation. Finally, TASC developed project management support tools and provided program management support to the Speakeasy Program Office.

  8. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  9. Performability evaluation of the SIFT computer. [Software-Implemented Fault Tolerance computer onboard commercial aircraft during transoceanic flight

    Science.gov (United States)

    Meyer, J. F.; Furchtgott, D. G.; Wu, L. T.

    1980-01-01

    The paper deals with the models, techniques, and evaluation methods that were successfully used to test the performance of the SIFT degradable computing system. The performance of the computer plus its air transport mission environment is modeled as a random variable taking values in a set of 'accomplishment levels'. The levels are defined in terms of four attributes of total system (computer plus environment) behavior, namely safety, no change in mission profile, no operational penalties, and no economic penalties. The base model of the total system is a stochastic process whose states describe the internal structure of SIFT and the relevant conditions of its computational environment. Base model state trajectories are related to accomplishment levels via a special function, and solution methods are then used to determine the performability of the total system for various parameters of the computer and environment.
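
    A toy version of the base-model idea can be written down directly: a small Markov chain over system states, with the end-of-mission state distribution mapped to accomplishment levels. All numbers below are invented, and the actual SIFT evaluation used a far richer stochastic model and solution method.

```python
import numpy as np

# states: 0 = fully operational, 1 = degraded, 2 = failed (invented numbers);
# one step corresponds to one hour of a ten-hour mission
P = np.array([[0.995, 0.004, 0.001],
              [0.000, 0.990, 0.010],
              [0.000, 0.000, 1.000]])

levels = {0: "no penalties", 1: "operational penalties", 2: "mission abort"}

start = np.array([1.0, 0.0, 0.0])
end = start @ np.linalg.matrix_power(P, 10)   # state distribution after 10 steps

for state, prob in enumerate(end):
    print(f"P({levels[state]}) = {prob:.4f}")
```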

  10. The Impact of Internal and External Resources, and Strategic Actions in Business Networks on Firm Performance in the Software Industry

    NARCIS (Netherlands)

    Anggraeni, E.

    2014-01-01

    Understanding the variance in firm performance has been an important topic in the strategic management literature. In the last two decades it has become particularly interesting as business networks increasingly have become an integrated part of a firm's environment. Besides the internal resources,

  11. The Impact of Project Role on Perceptions of Risk and Performance in Information Technology Software Development: A Comparative Analysis

    Science.gov (United States)

    Okongo, James

    2014-01-01

    The failure rate of information technology (IT) development projects is a significant concern for today's organizations. Perceptions of IT project risk and project performance have been identified as important factors by scholars studying the topic, and Wallace, Keil, and Rai (2004a) developed a survey instrument to measure how dimensions of…

  14. Using Writing Strategies and Visual Thinking Software To Enhance the Written Performance of Students with Mild Disabilities.

    Science.gov (United States)

    Blair, Regina B.; Ormsbee, Christine; Brandes, Joyce

    Students with mild disabilities often have difficulties with organization and written performance. These students can be helped by a combination of effective instructional strategies, compensatory strategies, and technological tools. Planning and organizing tools can encourage activities such as concept mapping, story webbing, brainstorming,…

  15. A software engineering process for safety-critical software application.

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-02-01

    The application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature, describing a viable process for high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for the shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notation and a highly reviewable tabular format to specify software requirements, and verifying software design against software requirements and code against software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author).

  16. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    This article presents MIAWARE, software for Medical Image Analysis With Automated Reporting Engine, which was designed and developed for doctor/radiologist assistance. It allows analysis of an image stack from a computed axial tomography scan of the lungs (thorax) and, at the same time, marking of all pathologies on the images and reporting of their characteristics. The reporting process is normalized - radiologists cannot describe pathological changes with their own words, but can only use terms from a specific vocabulary set provided by the software. Consequently, a normalized radiological report is automatically generated. Furthermore, the MIAWARE software is accompanied by an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through a specially developed ontology. As a result...
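
    The ontology-driven search can be pictured as expanding a query term into its anatomical sub-parts before matching reports. The hierarchy and reports below are invented placeholders used only to show the idea; they are not MIAWARE's actual ontology or report format.

```python
# invented, highly simplified lung-structure hierarchy (parent -> sub-parts)
lung_ontology = {
    "lungs": ["left lung", "right lung"],
    "left lung": ["left upper lobe", "left lower lobe"],
    "right lung": ["right upper lobe", "right middle lobe", "right lower lobe"],
}

def expand(term):
    """Return the term plus all sub-parts reachable through the ontology."""
    found = [term]
    for child in lung_ontology.get(term, []):
        found.extend(expand(child))
    return found

reports = {
    "r001": "nodule in the right middle lobe",
    "r002": "no pathological changes",
    "r003": "fibrosis of the left lower lobe",
}

query = "right lung"
terms = expand(query)
hits = [rid for rid, text in reports.items() if any(t in text for t in terms)]
print(f"reports matching '{query}': {hits}")   # the query expands to the right-lung lobes
```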

  17. Comparison of OH concentration measurements by DOAS and LIF during SAPHIR chamber experiments at high OH reactivity and low NO concentration

    Directory of Open Access Journals (Sweden)

    H. Fuchs

    2012-07-01

    Full Text Available During recent field campaigns, hydroxyl radical (OH) concentrations that were measured by laser-induced fluorescence (LIF) were up to a factor of ten larger than predicted by current chemical models for conditions of high OH reactivity and low NO concentration. These discrepancies, which were observed in forests and urban-influenced rural environments, are so far not entirely understood. In summer 2011, a series of experiments was carried out in the atmosphere simulation chamber SAPHIR in Jülich, Germany, in order to investigate the photochemical degradation of isoprene, methyl-vinyl ketone (MVK), methacrolein (MACR) and aromatic compounds by OH. Conditions were similar to those experienced during the PRIDE-PRD2006 campaign in the Pearl River Delta (PRD), China, in 2006, where a large difference between OH measurements and model predictions was found. During experiments in SAPHIR, OH was simultaneously detected by two independent instruments: LIF and differential optical absorption spectroscopy (DOAS). Because DOAS is an inherently calibration-free technique, DOAS measurements are regarded as a reference standard. The comparison of the two techniques was used to investigate potential artifacts in the LIF measurements for PRD-like conditions of OH reactivities of 10 to 30 s⁻¹ and NO mixing ratios of 0.1 to 0.3 ppbv. The analysis of twenty experiment days shows good agreement. The linear regression of the combined data set (averaged to the DOAS time resolution, 2495 data points) yields a slope of 1.02 ± 0.01 with an intercept of (0.10 ± 0.03) × 10⁶ cm⁻³ and a linear correlation coefficient of R² = 0.86. This indicates that the sensitivity of the LIF instrument is well-defined by its calibration procedure. No hints for artifacts are observed for isoprene, MACR, and different aromatic compounds. LIF measurements were approximately 30–40% (median) larger than those by DOAS after MVK (20 ppbv) and

  18. Comparison of OH concentration measurements by DOAS and LIF during SAPHIR chamber experiments at high OH reactivity and low NO concentration

    Science.gov (United States)

    Fuchs, H.; Dorn, H.-P.; Bachner, M.; Bohn, B.; Brauers, T.; Gomm, S.; Hofzumahaus, A.; Holland, F.; Nehr, S.; Rohrer, F.; Tillmann, R.; Wahner, A.

    2012-07-01

    During recent field campaigns, hydroxyl radical (OH) concentrations that were measured by laser-induced fluorescence (LIF) were up to a factor of ten larger than predicted by current chemical models for conditions of high OH reactivity and low NO concentration. These discrepancies, which were observed in forests and urban-influenced rural environments, are so far not entirely understood. In summer 2011, a series of experiments was carried out in the atmosphere simulation chamber SAPHIR in Jülich, Germany, in order to investigate the photochemical degradation of isoprene, methyl-vinyl ketone (MVK), methacrolein (MACR) and aromatic compounds by OH. Conditions were similar to those experienced during the PRIDE-PRD2006 campaign in the Pearl River Delta (PRD), China, in 2006, where a large difference between OH measurements and model predictions was found. During experiments in SAPHIR, OH was simultaneously detected by two independent instruments: LIF and differential optical absorption spectroscopy (DOAS). Because DOAS is an inherently calibration-free technique, DOAS measurements are regarded as a reference standard. The comparison of the two techniques was used to investigate potential artifacts in the LIF measurements for PRD-like conditions of OH reactivities of 10 to 30 s⁻¹ and NO mixing ratios of 0.1 to 0.3 ppbv. The analysis of twenty experiment days shows good agreement. The linear regression of the combined data set (averaged to the DOAS time resolution, 2495 data points) yields a slope of 1.02 ± 0.01 with an intercept of (0.10 ± 0.03) × 10⁶ cm⁻³ and a linear correlation coefficient of R² = 0.86. This indicates that the sensitivity of the LIF instrument is well-defined by its calibration procedure. No hints for artifacts are observed for isoprene, MACR, and different aromatic compounds. LIF measurements were approximately 30-40% (median) larger than those by DOAS after MVK (20 ppbv) and toluene (90 ppbv) had been added. However, this discrepancy has a
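
    The instrument comparison above reduces to an ordinary least-squares regression of coincident measurements. The sketch below reproduces that step with synthetic numbers, not the campaign data.

```python
import numpy as np
from scipy.stats import linregress

# synthetic coincident OH concentrations (cm^-3) from two instruments
rng = np.random.default_rng(5)
oh_doas = rng.uniform(1e6, 8e6, 500)
oh_lif = 1.02 * oh_doas + 0.1e6 + rng.normal(0.0, 0.5e6, 500)   # simulated scatter

fit = linregress(oh_doas, oh_lif)
print(f"slope     = {fit.slope:.3f} +/- {fit.stderr:.3f}")
print(f"intercept = {fit.intercept / 1e6:.2f} x 10^6 cm^-3")
print(f"R^2       = {fit.rvalue ** 2:.2f}")
```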

  19. Utilizing Commercial Hardware and Open Source Computer Vision Software to Perform Motion Capture for Reduced Gravity Flight

    Science.gov (United States)

    Humphreys, Brad; Bellisario, Brian; Gallo, Christopher; Thompson, William K.; Lewandowski, Beth

    2016-01-01

    Long duration space travel to Mars or to an asteroid will expose astronauts to extended periods of reduced gravity. Since gravity is not present to aid loading, astronauts will use resistive and aerobic exercise regimes for the duration of the space flight to minimize the loss of bone density, muscle mass and aerobic capacity that occurs during exposure to a reduced gravity environment. Unlike the International Space Station (ISS), the area available for an exercise device in the next generation of spacecraft is limited. Therefore, compact resistance exercise device prototypes are being developed. The NASA Digital Astronaut Project (DAP) is supporting the Advanced Exercise Concepts (AEC) Project, the Exercise Physiology and Countermeasures (ExPC) project and researchers funded by the National Space Biomedical Research Institute (NSBRI) by developing computational models of exercising with these new advanced exercise device concepts. To perform validation of these models and to support the Advanced Exercise Concepts Project, several candidate devices have been flown onboard NASA's Reduced Gravity Aircraft. In terrestrial laboratories, researchers typically have motion capture systems available for the measurement of subject kinematics. Onboard the parabolic flight aircraft it is not practical to utilize traditional motion capture systems due to the large working volume they require and their relatively high replacement cost if damaged. To support measuring kinematics on board parabolic aircraft, a motion capture system is being developed utilizing open source computer vision code with commercial off the shelf (COTS) video camera hardware. While the system's accuracy is lower than that of laboratory setups, it provides a means to produce quantitative motion capture kinematic data for comparison. Additionally, data such as the required exercise volume for small spaces such as the Orion capsule can be determined. METHODS: OpenCV is an open source computer vision library that provides the
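
    As a flavour of the approach (not the project's actual pipeline; the colour thresholds and file name are placeholders), a minimal OpenCV sketch that tracks the centroid of a brightly coloured marker frame by frame looks like this:

```python
import cv2

cap = cv2.VideoCapture("exercise_trial.mp4")      # placeholder file name
trajectory = []

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # threshold for a saturated reddish marker; tune for the actual marker colour
    mask = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255))
    m = cv2.moments(mask)
    if m["m00"] > 0:                              # marker visible in this frame
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        trajectory.append((cx, cy))

cap.release()
print(f"tracked marker in {len(trajectory)} frames")
```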

  20. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  1. Predicting the performances of a CAMPRO engine retrofitted with liquefied petroleum gas (LPG system using 1-dimensional software

    Directory of Open Access Journals (Sweden)

    Kamaruddin M. Hazeem

    2017-01-01

    Full Text Available Recently, the depletion of petroleum resources and the environmental impact of combustion exhaust emissions have forced researchers to come up with alternatives to prevent this situation from becoming worse. Liquefied petroleum gas (LPG) is highly compatible and has the potential to become a source of energy for internal combustion engines. Unfortunately, research on LPG in internal combustion engines still shows gaps. Thus, in this study a 1-dimensional CAMPRO 1.6L engine simulation model is developed in GT-Power to predict the performance of engines that use LPG as a fuel. The constructed model is validated against experimental data to ensure its accuracy, and the validation process shows good agreement between the simulation model and the experimental data. As a result, the LPG simulation model shows that Brake Torque (BT), Brake Power (BP) and Brake Mean Effective Pressure (BMEP) are improved by an average of 7% in comparison with the gasoline model. In addition, Brake Specific Fuel Consumption (BSFC) also shows an improvement of 5%, making the engine more economical. Therefore, the developed GT-Power model offers a successful fuel conversion to LPG systems via retrofit technology and provides comprehensive support for the implementation of energy-efficient and environmentally friendly vehicles.
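
    The quoted performance figures (BT, BP, BMEP, BSFC) are related by standard engine formulas, so results of this kind can be cross-checked by hand. The snippet below is a generic calculation with illustrative numbers, not output from the GT-Power model described above.

```python
import math

def brake_power_kw(torque_nm, speed_rpm):
    """Brake power (kW) from brake torque and engine speed."""
    return 2.0 * math.pi * (speed_rpm / 60.0) * torque_nm / 1000.0

def bmep_bar(torque_nm, displacement_l, rev_per_cycle=2):
    """Brake mean effective pressure (bar) for a four-stroke engine."""
    vd = displacement_l / 1000.0                      # displacement in m^3
    return 2.0 * math.pi * rev_per_cycle * torque_nm / vd / 1e5

def bsfc_g_per_kwh(fuel_flow_kg_h, brake_power):
    """Brake specific fuel consumption (g/kWh)."""
    return fuel_flow_kg_h * 1000.0 / brake_power

# illustrative operating point for a 1.6 L four-stroke engine
torque, rpm, fuel_kg_h = 120.0, 4000.0, 13.0
bp = brake_power_kw(torque, rpm)
print(f"BP   = {bp:.1f} kW")
print(f"BMEP = {bmep_bar(torque, 1.6):.2f} bar")
print(f"BSFC = {bsfc_g_per_kwh(fuel_kg_h, bp):.0f} g/kWh")
```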

  2. [Software version and medical device software supervision].

    Science.gov (United States)

    Peng, Liang; Liu, Xiaoyan

    2015-01-01

    The importance of the software version in medical device software supervision does not receive enough attention at present. First of all, the role of the software version in medical device software supervision is discussed; then the necessity of the software version in supervision is analyzed, based on a discussion of common misunderstandings of software versions. Finally, concrete suggestions are proposed on software version naming rules, software version supervision for software in medical devices, and a software version supervision scheme.

  3. Astronomical Software Directory Service

    Science.gov (United States)

    Hanisch, Robert J.; Payne, Harry; Hayes, Jeffrey

    1997-01-01

    With the support of NASA's Astrophysics Data Program (NRA 92-OSSA-15), we have developed the Astronomical Software Directory Service (ASDS): a distributed, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching. Users are performing about 400 searches per month. A new aspect of our service is the inclusion of telescope and instrumentation manuals, which prompted us to change the name to the Astronomical Software and Documentation Service. ASDS was originally conceived to serve two purposes: to provide a useful Internet service in an area of expertise of the investigators (astronomical software), and as a research project to investigate various architectures for searching through a set of documents distributed across the Internet. Two of the co-investigators were then installing and maintaining astronomical software as their primary job responsibility. We felt that a service which incorporated our experience in this area would be more useful than a straightforward listing of software packages. The original concept was for a service based on the client/server model, which would function as a directory/referral service rather than as an archive. For performing the searches, we began our investigation with a decision to evaluate the Isite software from the Center for Networked Information Discovery and Retrieval (CNIDR). This software was intended as a replacement for Wide-Area Information Service (WAIS), a client/server technology for performing full-text searches through a set of documents. Isite had some additional features that we considered attractive, and we enjoyed the cooperation of the Isite developers, who were happy to have ASDS as a demonstration project. We ended up staying with the software throughout the project, making modifications to take advantage of new features as they came along, as well as

  4. Hardware Architecture of Polyphase Filter Banks Performing Embedded Resampling for Software-Defined Radio Front-Ends

    DEFF Research Database (Denmark)

    Awan, Mehmood-Ur-Rehman; Le Moullec, Yannick; Koch, Peter;

    2012-01-01

    , and power optimization for field programmable gate array (FPGA) based architectures in an M -path polyphase filter bank with modified N -path polyphase filter. Such systems allow resampling by arbitrary ratios while simultaneously performing baseband aliasing from center frequencies at Nyquist zones...... that are not multiples of the output sample rate. A non-maximally decimated polyphase filter bank, where the number of data loads is not equal to the number of M subfilters, processes M subfilters in a time period that is either less than or greater than the M data-load’s time period. We present a load...... of the down-sampled data. In RA, M subfilters processes are efficiently scheduled within N data-load time while simultaneously loading N subfilters. This requires reduced clock rates compared with LPA, and potentially less power is consumed. A polyphase filter bank that uses different resampling factors...

  5. Implementation and testing of a fault detection software tool for improving control system performance in a large commercial building

    Energy Technology Data Exchange (ETDEWEB)

    Salsbury, T.I.; Diamond, R.C.

    2000-05-01

    This paper describes a model-based, feedforward control scheme that can detect faults in the controlled process and improve control performance over traditional PID control. The tool uses static simulation models of the system under control to generate feed-forward control action, which acts as a reference of correct operation. Faults that occur in the system cause discrepancies between the feedforward models and the controlled process. The scheme facilitates detection of faults by monitoring the level of these discrepancies. We present results from the first phase of tests on a dual-duct air-handling unit installed in a large office building in San Francisco. We demonstrate the ability of the tool to detect a number of preexisting faults in the system and discuss practical issues related to implementation.
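
    The core idea, monitoring the residual between a static feedforward model and the measured process, can be sketched as follows; the coil model, gain, tolerance and persistence count are illustrative assumptions rather than the authors' implementation.

        def model_supply_temp(mixed_air_c, valve_frac):
            """Hypothetical static model of a heating-coil discharge temperature."""
            return mixed_air_c + 25.0 * valve_frac          # assumed 25 K coil gain

        def detect_fault(samples, tolerance_c=2.0, persistence=5):
            """samples: iterable of (mixed_air_c, valve_frac, measured_supply_c)."""
            strikes = 0
            for mixed_air, valve, measured in samples:
                residual = measured - model_supply_temp(mixed_air, valve)
                strikes = strikes + 1 if abs(residual) > tolerance_c else 0
                if strikes >= persistence:                  # sustained discrepancy => fault
                    return True
            return False

        # Example: a stuck valve makes the measured temperature lag the model output.
        readings = [(15.0, 0.8, 20.0)] * 6
        print(detect_fault(readings))                       # True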

  6. Hardware Architecture of Polyphase Filter Banks Performing Embedded Resampling for Software-Defined Radio Front-Ends

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    In this paper, we describe resource-efficient hardware architectures for software-defined radio (SDR) front-ends. These architectures are made efficient by using a polyphase channelizer that performs arbitrary sample rate changes, frequency selection, and bandwidth control. We discuss area, time, and power optimization for field programmable gate array (FPGA) based architectures in an M-path polyphase filter bank with modified N-path polyphase filter. Such systems allow resampling by arbitrary ratios while simultaneously performing baseband aliasing from center frequencies at Nyquist zones that are not multiples of the output sample rate. A non-maximally decimated polyphase filter bank, where the number of data loads is not equal to the number of M subfilters, processes M subfilters in a time period that is either less than or greater than the M data-load's time period. We present a load-process architecture (LPA) and a runtime architecture (RA) (based on serial polyphase structure) which have different scheduling. In LPA, N subfilters are loaded, and then M subfilters are processed at a clock rate that is a multiple of the input data rate. This is necessary to meet the output time constraint of the down-sampled data. In RA, M subfilter processes are efficiently scheduled within N data-load time while simultaneously loading N subfilters. This requires reduced clock rates compared with LPA, and potentially less power is consumed. A polyphase filter bank that uses different resampling factors for maximally decimated, under-decimated, over-decimated, and combined up- and down-sampled scenarios is used as a case study, and an analysis of area, time, and power for their FPGA architectures is given. For resource-optimized SDR front-ends, RA is superior for reducing operating clock rates and dynamic power consumption. RA is also superior for reducing area resources, except when indices are pre-stored in LUTs.
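
    As a behavioural reference for the decimation case (not the FPGA LPA/RA architectures themselves), the sketch below splits a prototype low-pass filter into M polyphase subfilters and commutates the input across them; the filter length and cutoff are arbitrary choices.

        import numpy as np
        from scipy.signal import firwin

        def polyphase_decimate(x, M, taps=96):
            """Decimate x by M using an M-path polyphase split of a FIR prototype."""
            h = firwin(taps, 1.0 / M)                      # prototype low-pass filter
            h = np.concatenate([h, np.zeros((-len(h)) % M)])
            hp = h.reshape(-1, M).T                        # hp[k, p] = h[p*M + k]
            x = np.concatenate([x, np.zeros((-len(x)) % M)])
            xp = x.reshape(-1, M).T                        # xp[k, n] = x[n*M + k]
            y = np.convolve(xp[0], hp[0])
            for k in range(1, M):
                # Path k needs x[n*M - k], i.e. branch (M - k) delayed by one block.
                xk = np.concatenate([[0.0], xp[M - k][:-1]])
                y = y + np.convolve(xk, hp[k])
            return y                                       # output at 1/M of the input rate

        fs, M = 48_000, 4
        t = np.arange(4096) / fs
        x = np.sin(2 * np.pi * 1_000 * t) + 0.5 * np.sin(2 * np.pi * 15_000 * t)
        y = polyphase_decimate(x, M)                       # 15 kHz tone is strongly attenuated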

  7. Educational Software.

    Science.gov (United States)

    Northwest Regional Educational Lab., Portland, OR.

    The third session of IT@EDU98 consisted of five papers on educational software and was chaired by Tran Van Hao (University of Education, Ho Chi Minh City, Vietnam). "Courseware Engineering" (Nguyen Thanh Son, Ngo Ngoc Bao Tran, Quan Thanh Tho, Nguyen Hong Lam) briefly describes the use of courseware. "Machine Discovery Theorems in Geometry: A…

  8. Software Patents.

    Science.gov (United States)

    Burke, Edmund B.

    1994-01-01

    Outlines basic patent law information that pertains to computer software programs. Topics addressed include protection in other countries; how to obtain patents; kinds of patents; duration; classes of patentable subject matter, including machines and processes; patentability searches; experimental use prior to obtaining a patent; and patent…

  9. Software Systems

    Institute of Scientific and Technical Information of China (English)

    崔涛; 周淼

    1996-01-01

    The information used with computers is known as software and includes programs and data. Programs are sets of instructions telling the computer what operations have to be carried out and in what order they should be done. Specialised programs which enable the computer to be used for particular purposes are called applications programs. A collection of these programs kept

  10. Software Reviews.

    Science.gov (United States)

    Science and Children, 1990

    1990-01-01

    Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor"; and "Geology Search." Cost, quality, hardware, and…

  11. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game management. (CW)

  12. Characteristics for Software Optimization Projects

    Directory of Open Access Journals (Sweden)

    Iulian NITESCU

    2008-01-01

    Full Text Available The increasing complexity of software systems requires the identification and implementation of methods and techniques to manage it. A software optimization project is one way in which software complexity is controlled. The software optimization project must also face the organization's need to earn a profit. It is an integral part of the application life cycle because it shares the same resources, depends on other stages and influences subsequent phases. The optimization project has some particularities because it works on a finished product, centered on its quality. The process is quality- and performance-oriented and assumes that the product life cycle is almost finished.

  13. Novel, Highly-Parallel Software for the Online Storage System of the ATLAS Experiment at CERN: Design and Performances

    CERN Document Server

    Colombo, T; The ATLAS collaboration

    2012-01-01

    The ATLAS experiment observes proton-proton collisions delivered by the LHC accelerator at CERN. The ATLAS Trigger and Data Acquisition (TDAQ) system selects interesting events on-line in a three-level trigger system in order to store them at a budgeted rate of several hundred Hz, for an average event size of ~1.5 MB. This paper focuses on the TDAQ data-logging system and in particular on the implementation and performance of a novel SW design, reporting on the effort of exploiting the full power of recently installed multi-core hardware. In this respect, the main challenge presented by the data-logging workload is the conflict between the largely parallel nature of the event processing, including the recently introduced on-line event-compression, and the constraint of sequential file writing and checksum evaluation. This is further complicated by the necessity of operating in a fully data-driven mode, to cope with continuously evolving trigger and detector configurations. In this paper we report on the desig...

  14. Electronic cleansing for CT colonography: does it help CAD software performance in a high-risk population for colorectal cancer?

    Energy Technology Data Exchange (ETDEWEB)

    Wi, Jae Yeon [Seoul National University College of Medicine, Department of Radiology, Jongno-gu, Seoul (Korea); Kim, Se Hyung; Lee, Jae Young; Han, Joon Koo; Choi, Byung Ihn [Seoul National University College of Medicine, Department of Radiology, Jongno-gu, Seoul (Korea); Seoul National University Hospital, Institute of Radiation Medicine, Seoul (Korea); Kim, Sang Gyun [Seoul National University Hospital, Department of Internal Medicine, Seoul (Korea)

    2010-08-15

    To compare the performance of computer-aided detection (CAD) for CT colonography (CTC) with and without electronic cleansing (EC) in a high-risk population tagged with a faecal tagging (FT) protocol. Thirty-two patients underwent CTC followed by same-day colonoscopy. All patients underwent bowel preparation and FT with barium and gastrografin. Each CTC dataset was processed with colon CAD with and without EC. Per-polyp sensitivity was calculated. The average number of false-positive (FP) results and their causes were also analysed and compared. Eighty-six polyps were detected in 29 patients. Per-polyp sensitivities of CAD with EC (93.8% and 100%) were higher than those without EC (84.4% and 87.5%) for polyps ≥6 mm and ≥10 mm, respectively. However, the differences were not significant. The average number (6.3) of FPs of CAD with EC was significantly larger than that (3.1) without EC. The distribution of FPs in both CAD settings was also significantly different. The most common cause of FPs was the ileocaecal valve in both datasets. However, untagged faeces was a significantly less common cause of FPs with EC, EC-related artefacts being more common. Electronic cleansing has the potential to improve per-polyp sensitivity of CTC CAD, although the significantly larger number of FPs with EC remains to be improved. (orig.)

  15. Software Design Improvements. Part 1; Software Benefits and Limitations

    Science.gov (United States)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The question is: how can this hardware and software be made more reliable? Also, how can software quality be improved? What methodology needs to be provided on large and small software products to improve the design and how can software be verified?

  16. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, communication switchboards). Extensive knowledge and practical experience of digital long-term preservation technologies have also been acquired. This wide spectrum of activities puts us in a position to discuss an often overlooked aspect of digital preservation: the preservation of software programs. There are many resources dedicated to the digital preservation of digital data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. It is exactly these functionalities, the dynamic response to inputs, that make computer programs rich compared to documents or linear multimedia. The article opens questions at the beginning of the path towards permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects are covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program, e.g. media response to it ...), where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  17. Modernization of software quality assurance

    Science.gov (United States)

    Bhaumik, Gokul

    1988-01-01

    The customer's satisfaction depends not only on functional performance but also on the quality characteristics of the software products. An examination of this quality aspect of software products will provide a clear, well-defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following aspects which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition for a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree of an identifiable set of quality attributes present in all software systems and their products. This paper is an explanation of how this definition was developed and how it is used.

  18. Software Engineering to Professionalize Software Development

    Directory of Open Access Journals (Sweden)

    Juan Miguel Alonso

    2011-12-01

    Full Text Available The increasingly important role that software plays in systems with widespread effects presents new challenges for the education of software engineers. This is not only because society's dependence on software is increasing, but also because the character of software development is changing and, with it, the demand for certified software developers. In this paper some challenges and aspirations are proposed that guide the learning processes of Software Engineering and help to identify the need to train professionals in software development.

  19. The Ettention software package

    Energy Technology Data Exchange (ETDEWEB)

    Dahmen, Tim, E-mail: Tim.Dahmen@dfki.de [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany); Marsalek, Lukas [Eyen SE, Na Nivách 1043/16, 141 00 Praha 4 (Czech Republic); Saarland University, 66123 Saarbrücken (Germany); Marniok, Nico [Saarland University, 66123 Saarbrücken (Germany); Turoňová, Beata [Saarland University, 66123 Saarbrücken (Germany); IMPRS-CS, Max-Planck Institute for Informatics, Campus E 1.4, 66123 Saarbrücken (Germany); Bogachev, Sviatoslav [Saarland University, 66123 Saarbrücken (Germany); Trampert, Patrick; Nickels, Stefan [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Slusallek, Philipp [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany)

    2016-02-15

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.
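
    A plain NumPy sketch of a relaxed, block-iterative Kaczmarz-style update of the kind such reconstruction frameworks build on is given below; the synthetic linear system stands in for the projection operator and measured tilt series, and this is not Ettention's GPU implementation.

        import numpy as np

        def block_kaczmarz(A, b, n_blocks=4, n_iter=100, relax=0.5):
            """Solve A x ~= b with relaxed, block-averaged Kaczmarz sweeps."""
            x = np.zeros(A.shape[1])
            blocks = np.array_split(np.arange(A.shape[0]), n_blocks)
            for _ in range(n_iter):
                for rows in blocks:                        # one block of projections at a time
                    Ab = A[rows]
                    r = b[rows] - Ab @ x                   # residual for this block
                    row_norms = np.sum(Ab * Ab, axis=1) + 1e-12
                    x += relax * (Ab.T @ (r / row_norms)) / len(rows)
            return x

        rng = np.random.default_rng(1)
        A = rng.random((80, 40))                           # toy "projection" matrix
        x_true = rng.random(40)
        b = A @ x_true                                     # toy "measured projections"
        x_est = block_kaczmarz(A, b, n_iter=500)
        print(np.linalg.norm(A @ x_est - b))               # residual shrinks with sweeps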

  20. Boosting Government Performance with Open Source Software? – A Roadmap for Germany (¿Impulsar el desempeño del estado con software de código abierto? - Plan de trabajo para Alemania

    Directory of Open Access Journals (Sweden)

    Norbert Jesse

    2014-05-01

    Full Text Available Governments face considerable pressure from all directions: budget restrictions, citizens' expectations, demographic trends, local competition from surrounding areas – to name just a few. eGovernment is regarded as an imminent tool to tackle many of these challenges. Obviously, IT itself is an object of increasing complexity, constant change and financial implications. This paper outlines how the federal German government follows a strategic roadmap for eGovernment by shaping the objectives and goals for IT expansion. We discuss the role of open source software and how new approaches to software development can turn the ambitious aims into reality.

  1. Software, Software Engineering and Software Engineering Research:Some Unconventional Thoughts

    Institute of Scientific and Technical Information of China (English)

    David Notkin

    2009-01-01

    Software engineering is broadly discussed as falling far short of expectations. Data and examples are used to justify how software itself is often poor, how the engineering of software leaves much to be desired, and how research in software engineering has not made enough progress to help overcome these weaknesses. However, these data and examples are presented and interpreted in ways that are arguably imbalanced. This imbalance, usually taken at face value, may be distracting the field from making significant progress towards improving the effective engineering of software, a goal the entire community shares. Research dichotomies, which tend to pit one approach against another, often subtly hint that there is a best way to engineer software or a best way to perform research on software. This, too, may be distracting the field from important classes of progress.

  2. The FARE Software

    Science.gov (United States)

    Pitarello, Adriana

    2015-01-01

    This article highlights the importance of immediate corrective feedback in tutorial software for language teaching in an academic learning environment. We aim to demonstrate that, rather than simply reporting on the performance of the foreign language learner, this feedback can act as a mediator of students' cognitive and metacognitive activity.…

  3. Space Software

    Science.gov (United States)

    1990-01-01

    Xontech, Inc.'s software package, XonVu, simulates the missions of Voyager 1 at Jupiter and Saturn, Voyager 2 at Jupiter, Saturn, Uranus and Neptune, and Giotto in close encounter with Comet Halley. With the program, the user can generate scenes of the planets, moons, stars or Halley's nucleus and tail as seen by Giotto, all graphically reproduced with high accuracy in wireframe representation. Program can be used on a wide range of computers, including PCs. User friendly and interactive, with many options, XonVu can be used by a space novice or a professional astronomer. With a companion user's manual, it sells for $79.

  4. Software architecture

    CERN Document Server

    Vogel, Oliver; Chughtai, Arif

    2011-01-01

    As a software architect you work in a wide-ranging and dynamic environment. You have to understand the needs of your customer, design architectures that satisfy both functional and non-functional requirements, and lead development teams in implementing the architecture. And it is an environment that is constantly changing: trends such as cloud computing, service orientation, and model-driven procedures open up new architectural possibilities. This book will help you to develop a holistic architectural awareness and knowledge base that extends beyond concrete methods, techniques, and technologi

  5. First performance evaluation of software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine at CT

    Energy Technology Data Exchange (ETDEWEB)

    Scholtz, Jan-Erik, E-mail: janerikscholtz@gmail.com; Wichmann, Julian L.; Kaup, Moritz; Fischer, Sebastian; Kerl, J. Matthias; Lehnert, Thomas; Vogl, Thomas J.; Bauer, Ralf W.

    2015-03-15

    Highlights: •Automatic segmentation and labeling of the thoracolumbar spine. •Automatically generated double-angulated and aligned axial images of spine segments. •High grade of accurateness for the symmetric depiction of anatomical structures. •Time-saving and may improve workflow in daily practice. -- Abstract: Objectives: To evaluate software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine on CT in terms of accuracy, potential for time savings and workflow improvement. Material and methods: 77 patients (28 women, 49 men, mean age 65.3 ± 14.4 years) with known or suspected spinal disorders (degenerative spine disease n = 32; disc herniation n = 36; traumatic vertebral fractures n = 9) underwent 64-slice MDCT with thin-slab reconstruction. Time for automatic labeling of the thoracolumbar spine and reconstruction of double-angulated axial images of the pathological vertebrae was compared with manually performed reconstruction of anatomical aligned axial images. Reformatted images of both reconstruction methods were assessed by two observers regarding accuracy of symmetric depiction of anatomical structures. Results: In 33 cases double-angulated axial images were created in 1 vertebra, in 28 cases in 2 vertebrae and in 16 cases in 3 vertebrae. Correct automatic labeling was achieved in 72 of 77 patients (93.5%). Errors could be manually corrected in 4 cases. Automatic labeling required 1 min in average. In cases where anatomical aligned axial images of 1 vertebra were created, reconstructions made by hand were significantly faster (p < 0.05). Automatic reconstruction was time-saving in cases of 2 and more vertebrae (p < 0.05). Both reconstruction methods revealed good image quality with excellent inter-observer agreement. Conclusion: The evaluated software for automatic labeling and anatomically aligned, double-angulated axial image reconstruction of the thoracolumbar spine on CT is time

  6. TIA Software User's Manual

    Science.gov (United States)

    Cramer, K. Elliott; Syed, Hazari I.

    1995-01-01

    This user's manual describes the installation and operation of TIA, the Thermal-Imaging acquisition and processing Application, developed by the Nondestructive Evaluation Sciences Branch at NASA Langley Research Center, Hampton, Virginia. TIA is a user-friendly graphical interface application for the Macintosh II and higher series computers. The software has been developed to interface with the Perceptics/Westinghouse Pixelpipe(TM) and PixelStore(TM) NuBus cards and the GW Instruments MacADIOS(TM) input-output (I/O) card for the Macintosh for imaging thermal data. The software is also capable of performing generic image-processing functions.

  7. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  8. On the Parallelization Optimization Strategy for High Performance Computing Software

    Institute of Scientific and Technical Information of China (English)

    贾伟乐; 史小冬; 吕海峰

    2013-01-01

    With the arrival of the multi-core era, traditional software can barely utilize the peak performance of the hardware. Parallelization and optimization of industrial code is a tough problem the HPC community is facing. In this paper, we present our parallel implementation using the MPI, OpenMP and CUDA programming models. Different methods, implementation skills and optimization strategies are introduced. At the end, the challenges and the vision for future work on our optimization strategy are discussed.
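
    The message-passing flavour of parallelisation discussed above can be illustrated with a small mpi4py reduction; this toy numerical integration is not the industrial code from the paper, and OpenMP or CUDA kernels would sit below such a layer in a hybrid setup.

        # Run with e.g.:  mpiexec -n 4 python pi_mpi.py   (script name is hypothetical)
        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        n = 10_000_000
        # Each rank integrates its own stride of 4/(1+x^2) over [0, 1].
        x = (np.arange(rank, n, size) + 0.5) / n
        local = np.sum(4.0 / (1.0 + x * x)) / n
        pi = comm.reduce(local, op=MPI.SUM, root=0)        # combine partial sums on rank 0
        if rank == 0:
            print(f"pi ~= {pi:.8f}")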

  9. Rapid identification of mycolic acid patterns of mycobacteria by high-performance liquid chromatography using pattern recognition software and a Mycobacterium library.

    Science.gov (United States)

    Glickman, S E; Kilburn, J O; Butler, W R; Ramos, L S

    1994-01-01

    Current methods for identifying mycobacteria by high-performance liquid chromatography (HPLC) require a visual assessment of the generated chromatographic data, which often involves time-consuming hand calculations and the use of flow charts. Our laboratory has developed a personal computer-based file containing patterns of mycolic acids detected in 45 species of Mycobacterium, including both slowly and rapidly growing species, as well as Tsukamurella paurometabolum and members of the genera Corynebacterium, Nocardia, Rhodococcus, and Gordona. The library was designed to be used in conjunction with a commercially available pattern recognition software package, Pirouette (Infometrix, Seattle, Wash.). Pirouette uses the K-nearest neighbor algorithm, a similarity-based classification method, to categorize unknown samples on the basis of their multivariate proximities to samples of a preassigned category. Multivariate proximity is calculated from peak height data, while peak heights are named by retention time matching. The system was tested for accuracy by using 24 species of Mycobacterium. Of the 1,333 strains evaluated, ≥97% were correctly identified. Identification of M. tuberculosis (n = 649) was 99.85% accurate, and identification of the M. avium complex (n = 211) was ≥98% accurate; ≥95% of strains of both double-cluster and single-cluster M. gordonae (n = 47) were correctly identified. This system provides a rapid, highly reliable assessment of HPLC-generated chromatographic data for the identification of mycobacteria. PMID:8195387
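
    The similarity-based matching step can be sketched with a generic nearest-neighbour classifier; the peak-height patterns and species labels below are invented for illustration and are not the library described in the study.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        # Rows: reference strains; columns: named mycolic-acid peak heights (made up).
        library_patterns = np.array([
            [0.9, 0.1, 0.0, 0.3],        # hypothetical M. tuberculosis pattern
            [0.2, 0.7, 0.4, 0.1],        # hypothetical M. avium complex pattern
            [0.1, 0.2, 0.8, 0.5],        # hypothetical M. gordonae pattern
        ])
        labels = ["M. tuberculosis", "M. avium complex", "M. gordonae"]

        clf = KNeighborsClassifier(n_neighbors=1).fit(library_patterns, labels)
        unknown = np.array([[0.85, 0.15, 0.05, 0.25]])       # isolate's peak heights
        print(clf.predict(unknown)[0])                       # closest library pattern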

  10. Software Engineering Improvement Activities/Plan

    Science.gov (United States)

    2003-01-01

    bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement engineering activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.

  12. Architecture of a high-performance surgical guidance system based on C-arm cone-beam CT: software platform for technical integration and clinical translation

    Science.gov (United States)

    Uneri, Ali; Schafer, Sebastian; Mirota, Daniel; Nithiananthan, Sajendra; Otake, Yoshito; Reaungamornrat, Sureerat; Yoo, Jongheun; Stayman, J. Webster; Reh, Douglas; Gallia, Gary L.; Khanna, A. Jay; Hager, Gregory; Taylor, Russell H.; Kleinszig, Gerhard; Siewerdsen, Jeffrey H.

    2011-03-01

    Intraoperative imaging modalities are becoming more prevalent in recent years, and the need for integration of these modalities with surgical guidance is rising, creating new possibilities as well as challenges. In the context of such emerging technologies and new clinical applications, a software architecture for cone-beam CT (CBCT) guided surgery has been developed with emphasis on binding open-source surgical navigation libraries and integrating intraoperative CBCT with novel, application-specific registration and guidance technologies. The architecture design is focused on accelerating translation of task-specific technical development in a wide range of applications, including orthopaedic, head-and-neck, and thoracic surgeries. The surgical guidance system is interfaced with a prototype mobile C-arm for high-quality CBCT and through a modular software architecture, integration of different tools and devices consistent with surgical workflow in each of these applications is realized. Specific modules are developed according to the surgical task, such as: 3D-3D rigid or deformable registration of preoperative images, surgical planning data, and up-to-date CBCT images; 3D-2D registration of planning and image data in real-time fluoroscopy and/or digitally reconstructed radiographs (DRRs); compatibility with infrared, electromagnetic, and video-based trackers used individually or in hybrid arrangements; augmented overlay of image and planning data in endoscopic or in-room video; real-time "virtual fluoroscopy" computed from GPU-accelerated DRRs; and multi-modality image display. The platform aims to minimize offline data processing by exposing quantitative tools that analyze and communicate factors of geometric precision. The system was translated to preclinical phantom and cadaver studies for assessment of fiducial (FRE) and target registration error (TRE) showing sub-mm accuracy in targeting and video overlay within intraoperative CBCT. The work culminates in

  13. Software Security Rules: SDLC Perspective

    Directory of Open Access Journals (Sweden)

    S. K. Pandey

    2009-10-01

    Full Text Available Software has become an integral part of everyday life. Every day, millions of people perform transactions through the internet, ATMs and mobile phones; they send email and e-greetings, and use word processors and spreadsheets for various purposes. People use software trusting that it is reliable and that the operations they perform are secure. Now, if this software has exploitable security holes, how can it be safe to use? Security brings value to software in terms of people's trust. The value provided by secure software is of vital importance because many critical functions are entirely dependent on the software. That is why security is a serious topic which should be given proper attention during the entire SDLC, 'right from the beginning'. For the proper implementation of security in software, twenty-one security rules are proposed in this paper along with validation results. It is found that by applying these rules according to the given implementation mechanism, most vulnerabilities are eliminated and more secure software can be built.

  14. ACTS: from ATLAS software towards a common track reconstruction software

    CERN Document Server

    Gumpert, Christian; The ATLAS collaboration

    2017-01-01

    Reconstruction of charged particles' trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is de...

  15. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  16. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  17. Software Metrics to Estimate Software Quality using Software Component Reusability

    Directory of Open Access Journals (Sweden)

    Prakriti Trivedi

    2012-03-01

    Full Text Available Today most applications are developed using existing libraries, code, open sources, etc. Code that is accessed by a program in this way is referred to as a software component; for example, Java beans and .NET ActiveX controls are software components. These components are ready-to-use programming code or controls that accelerate code development. A component-based software system embodies the concept of software reusability. When using these components, the main question that arises is whether using such components is beneficial or not. In this work we try to answer that question by presenting a set of software metrics that checks the interconnection between a software component and the application. How strong this relation is defines the software quality after the component is used. The overall metrics return the final result in terms of the binding of the component with the application.
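
    One simple way to express such an interconnection measure is sketched below: the fraction of a component's public interface that the application actually calls. The metric definition and the example API are illustrative assumptions, not the metrics proposed in the paper.

        def interface_usage(component_api, calls_made):
            """component_api: set of public operations exposed by the component;
            calls_made: operations the application actually invokes."""
            used = component_api & calls_made
            return len(used) / len(component_api), component_api - used

        api = {"open", "read", "write", "close", "seek"}     # hypothetical component API
        calls = {"open", "read", "close"}                    # calls found in the application
        ratio, unused = interface_usage(api, calls)
        print(f"interface usage = {ratio:.0%}, unused = {sorted(unused)}")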

  18. Unified Engineering Software System

    Science.gov (United States)

    Purves, L. R.; Gordon, S.; Peltzman, A.; Dube, M.

    1989-01-01

    Collection of computer programs performs diverse functions in prototype engineering. NEXUS, NASA Engineering Extendible Unified Software system, is research set of computer programs designed to support full sequence of activities encountered in NASA engineering projects. Sequence spans preliminary design, design analysis, detailed design, manufacturing, assembly, and testing. Primarily addresses process of prototype engineering, task of getting single or small number of copies of product to work. Written in FORTRAN 77 and PROLOG.

  19. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  20. Software Defect Detection with Rocus

    Institute of Scientific and Technical Information of China (English)

    Yuan Jiang; Ming Li; Zhi-Hua Zhou

    2011-01-01

    Software defect detection aims to automatically identify defective software modules for efficient software test in order to improve the quality of a software system. Although many machine learning methods have been successfully applied to the task, most of them fail to consider two practical yet important issues in software defect detection. First, it is rather difficult to collect a large amount of labeled training data for learning a well-performing model; second, in a software system there are usually much fewer defective modules than defect-free modules, so learning would have to be conducted over an imbalanced data set. In this paper, we address these two practical issues simultaneously by proposing a novel semi-supervised learning approach named Rocus. This method exploits the abundant unlabeled examples to improve the detection accuracy, as well as employs under-sampling to tackle the class-imbalance problem in the learning process. Experimental results of real-world software defect detection tasks show that Rocus is effective for software defect detection. Its performance is better than a semi-supervised learning method that ignores the class-imbalance nature of the task and a class-imbalance learning method that does not make effective use of unlabeled data.
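
    A generic scikit-learn sketch of the two ingredients combined here, under-sampling the majority (defect-free) class and self-training on unlabeled modules, is shown below; it uses synthetic data and a stock self-training wrapper, not the authors' Rocus algorithm.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.semi_supervised import SelfTrainingClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 5))                        # synthetic module metrics
        y = (X[:, 0] + 0.5 * X[:, 1] > 1.2).astype(int)      # few "defective" modules
        y_semi = y.copy()
        y_semi[rng.random(300) < 0.7] = -1                   # hide 70% of the labels

        # Under-sample labeled defect-free modules so the labeled set is less skewed.
        labeled = np.where(y_semi != -1)[0]
        pos = labeled[y_semi[labeled] == 1]
        neg = labeled[y_semi[labeled] == 0]
        keep = rng.choice(neg, size=min(len(neg), 2 * max(len(pos), 1)), replace=False)
        y_semi[np.setdiff1d(neg, keep)] = -1                 # treat the rest as unlabeled

        model = SelfTrainingClassifier(DecisionTreeClassifier(max_depth=3)).fit(X, y_semi)
        print("predicted defective modules:", int(model.predict(X).sum()))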

  1. PhasePlot: An Interactive Software Tool for Visualizing Phase Relations, Performing Virtual Experiments, and for Teaching Thermodynamic Concepts in Petrology

    Science.gov (United States)

    Ghiorso, M. S.

    2012-12-01

    The computer program PhasePlot was developed for Macintosh computers and released via the Mac App Store in December 2011. It permits the visualization of phase relations calculated from internally consistent thermodynamic data-model collections, including those from MELTS (Ghiorso and Sack, 1995, CMP 119, 197-212), pMELTS (Ghiorso et al., 2002, G-cubed 3, 10.1029/2001GC000217) and the deep mantle database of Stixrude and Lithgow-Bertelloni (2011, GJI 184, 1180-1213). The software allows users to enter a system bulk composition and a range of reference conditions, and then calculate a grid of phase relations. These relations may be visualized in a variety of ways including pseudosections, phase diagrams, phase proportion plots, and contour diagrams of phase compositions and abundances. The program interface is user friendly and the computations are fast on laptop-scale machines, which makes PhasePlot amenable to in-class demonstrations, as a tool in instructional laboratories, and as an aid in support of out-of-class exercises and research. Users focus on problem specification and interpretation of results rather than on manipulation and mechanics of computation. The software has been developed with NSF support and is free. The PhasePlot web site is at phaseplot.org where extensive user documentation, video tutorials and examples of use may be found. The original release of PhasePlot permitted calculations to be performed on pressure-, temperature-grids (P-T), by direct minimization of the Gibbs free energy of the system at each grid point. A revision of PhasePlot (scheduled for release to the Mac App Store in December 2012) extends capabilities to include pressure-, entropy-grids (P-S) by system enthalpy minimization, volume-, temperature-grids (V-T) by system Helmholtz energy minimization, and volume-, entropy-grids (V-S) by minimization of the Internal Energy of the system. P-S gridded results may be utilized to visualize phase relations as a function of heat
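
    The grid computation can be pictured as follows: at every grid point the stable state minimises the appropriate thermodynamic potential (Gibbs energy on a P-T grid, enthalpy on P-S, Helmholtz energy on V-T, internal energy on V-S). The two-phase Gibbs function in the sketch below is a made-up stand-in for a real model such as MELTS, not code from PhasePlot.

        import numpy as np
        from scipy.optimize import minimize

        def gibbs(phi, T, P):
            """Hypothetical Gibbs energy (kJ) of a mix; phi = fraction of phase B."""
            g_a = -50.0 - 0.10 * T                           # toy end-member terms
            g_b = -30.0 - 0.02 * T + 2.0 * P
            mix = 8.314e-3 * T * (phi * np.log(max(phi, 1e-12))
                                  + (1 - phi) * np.log(max(1 - phi, 1e-12)))
            return (1 - phi) * g_a + phi * g_b + mix

        grid = {}
        for T in np.linspace(900.0, 1300.0, 5):              # temperature grid (K)
            for P in np.linspace(0.1, 2.0, 4):               # pressure grid (GPa)
                res = minimize(lambda v: gibbs(v[0], T, P), x0=[0.5],
                               bounds=[(0.0, 1.0)], method="L-BFGS-B")
                grid[(round(T), round(P, 2))] = res.x[0]     # equilibrium phase-B fraction
        print(grid[(900, 0.1)])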

  2. OH Oxidation of α-Pinene in the Atmosphere Simulation Chamber SAPHIR: Investigation of the Role of Pinonaldehyde Photolysis as an HO2 Source

    Science.gov (United States)

    Kaminski, M.; Acir, I. H.; Bohn, B.; Dorn, H. P.; Fuchs, H.; Häseler, R.; Hofzumahaus, A.; Li, X.; Rohrer, F.; Tillmann, R.; Wegener, R.; Kiendler-Scharr, A.; Wahner, A.

    2015-12-01

    About one third of the land surface is covered by forests, emitting approximately 75% of the total biogenic volatile organic compounds (BVOCs). The main atmospheric sink of these BVOCs during daytime is the oxidation by the hydroxyl radical (OH). Over the last decades field campaigns investigating the radical chemistry in forested regions showed that atmospheric chemistry models are often not able to describe the measured OH concentration well. At low NO concentrations and an OH reactivity dominated by BVOCs the OH was underestimated. This discrepancy could only partly be explained by the discovery of new OH regeneration pathways in the isoprene oxidation mechanism. Field campaigns in the U.S.A and Finland (Kim 2013 ACP, Hens 2014 ACP) demonstrated that in monoterpene (e.g. α-pinene) dominated environments model calculations also underpredict the observed HO2 and OH concentrations significantly even if the OH budget was closed by the measured OH production and destruction terms. These observations suggest the existence of an unaccounted source of HO2. One potential HO2 source in forests is the photolysis of monoterpene degradation products such as aldehydes. In the present study the photochemical degradation mechanism of α-pinene was investigated in the Jülich atmosphere simulation chamber SAPHIR. The focus of this study was in particular on the investigation of the role of pinonaldehyde, a main first generation product of α-pinene, as a possible HO2 source. For that purpose the pinonaldehyde yields of the reaction α-pinene + OH were determined at ambient monoterpene concentrations (<5 ppb) under low NOx as well as high NOx conditions. The pinonaldehyde yield under high NOx conditions (30.5 %) is in agreement with literature values of Wisthaler (2001 AE) and Aschmann (2002 JGR), under low NOx conditions the yield (10.8 %) is approximately a factor of three lower than the value published by Eddingsaas (2012 ACP). In a second set of experiments the photolysis

  3. Rapid Application Development Using Software Factories

    CERN Document Server

    Stojanovski, Toni

    2012-01-01

    Software development is still based on manufactory production, and most of the programming code is still hand-crafted. Software development is very far away from the ultimate goal of industrialization in software production, something which was achieved a long time ago in other industries. The lack of software industrialization creates an inability to cope with fast and frequent changes in user requirements, and causes cost and time inefficiencies during their implementation. Analogous to what other industries did a long time ago, the industrialization of software development has been proposed using the concept of software factories. We have accepted this vision of software factories, and developed our own software factory which produces three-layered ASP.NET web applications. In this paper we report about our experience with using this approach in the process of software development, and present comparative results on performances and deliverables in both traditional development and development usin...

  4. Amalgamation of Personal Software Process in Software ...

    African Journals Online (AJOL)

    evolutionary series of personal software engineering techniques that an engineer learns and ... began to realize that software process, plans and methodologies for ...

  5. The Ettention software package.

    Science.gov (United States)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

    We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Internet and Intranet Use with a PC: Effects of Adapter Cards, Windows Versions and TCP/IP Software on Networking Performance.

    Science.gov (United States)

    Nieuwenhuysen, Paul

    1997-01-01

    Explores data transfer speeds obtained with various combinations of hardware and software components through a study of access to the Internet from a notebook computer connected to a local area network based on Ethernet and TCP/IP (transmission control protocol/Internet protocol) network protocols. Upgrading is recommended for higher transfer…

  7. Software Development Standard Processes (SDSP)

    Science.gov (United States)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; Crean, Kathleen A.; Rinker, George C.; Smith, Thomas P.; Lum, Karen T.; Hanna, Robert A.; Erickson, Daniel E.; Gamble, Edward B., Jr.; Morgan, Scott C.; Kelsay, Michael G.; Newport, Brian J.; Lewicki, Scott A.; Stipanuk, Jeane G.; Cooper, Tonja M.; Meshkat, Leila

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  8. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  9. The use of Software Quality Metrics in Software Maintenance

    OpenAIRE

    Kafura, Dennis G.; Reddy, Geereddy R.

    1985-01-01

    This paper reports on a modest study which relates seven different software complexity metrics to the experience of maintenance activities performed on a medium size software system. Three different versions of the system that evolved over a period of three years were analyzed in this study. A major revision of the system, while still in its design phase, was also analyzed. The results of this study indicate: (1) that the growth in system complexity as determined by the software

  10. Ontologies for software engineering and software technology

    CERN Document Server

    Calero, Coral; Piattini, Mario

    2006-01-01

    Covers two applications of ontologies in software engineering and software technology: sharing knowledge of the problem domain and using a common terminology among all stakeholders; and filtering the knowledge when defining models and metamodels. This book is of benefit to software engineering researchers in both academia and industry.

  11. Teamwork in Distributed Agile Software Development

    OpenAIRE

    Gurram, Chaitanya; Bandi, Srinivas Goud

    2013-01-01

    Context: Distributed software development has become a highly desired way of developing software. The application of agile development methodologies in distributed environments has taken a new trend in developing software due to its benefits of improved communication and collaboration. Teamwork is an important concept that agile methodologies facilitate and is one of the potential determinants of team performance, which has not been the focus in distributed agile software development. Objectives: This res...

  12. Teamwork in Distributed Agile Software Development

    OpenAIRE

    Gurram, Chaitanya; Bandi, Srinivas Goud

    2013-01-01

    Context: Distributed software development has become a highly desired way of developing software. The application of agile development methodologies in distributed environments has become a new trend in software development due to its benefits of improved communication and collaboration. Teamwork is an important concept that agile methodologies facilitate and is one of the potential determinants of team performance, which has not been a focus in distributed agile software development. Objectives: This res...

  13. Application research of performance appraisal system oriented to software delivery process

    Institute of Scientific and Technical Information of China (English)

    祁长兴; 刘杰; 李航; 杜庆东

    2011-01-01

    To address irregular performance appraisal during the software delivery process, a performance appraisal system is designed from a given software delivery process model and a key performance indicator (KPI) evaluation framework. The selection of the key indicators and the setting of their corresponding weights are described. Using the indicator sets and weights defined for each stage, the system applies the fuzzy comprehensive evaluation method to score every indicator in every stage as well as the overall process, enabling quantitative management of software projects. Finally, the impact of applying the system to software projects is analyzed.
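
    The abstract describes weighting key performance indicators per delivery phase and aggregating them with the fuzzy comprehensive evaluation method. The sketch below illustrates that aggregation in Python under assumed indicator names, weights, and grade values; none of these figures come from the paper.

```python
import numpy as np

# Hypothetical KPI membership matrix: rows = indicators, columns = rating grades
# (excellent, good, fair, poor); each row would normally come from appraiser
# scoring in a given delivery phase and sums to 1.
R = np.array([
    [0.5, 0.3, 0.2, 0.0],   # on-time delivery
    [0.3, 0.4, 0.2, 0.1],   # defect density
    [0.2, 0.5, 0.2, 0.1],   # customer satisfaction
])
w = np.array([0.4, 0.35, 0.25])            # assumed indicator weights for this phase
grade_values = np.array([95, 80, 65, 40])  # assumed numeric value of each grade

b = w @ R            # fuzzy comprehensive evaluation vector
b = b / b.sum()      # normalize
score = float(b @ grade_values)
print(b, round(score, 1))
```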

  14. Software Switching for Data Acquisition

    CERN Document Server

    CERN. Geneva; Malone, David

    2016-01-01

    In this talk we discuss the feasibility of replacing telecom-class routers with a topology of commodity servers acting as software switches in data acquisition. We extend the popular software switch, Open vSwitch, with a dedicated, throughput-oriented buffering mechanism. We compare the performance under heavy many-to-one congestion to typical Ethernet switches and evaluate the scalability when building larger topologies, exploiting the integration with software-defined networking technologies. Please note that David Malone will speak on behalf of Grzegorz Jereczek.

  15. Controlling Software Piracy.

    Science.gov (United States)

    King, Albert S.

    1992-01-01

    Explains what software manufacturers are doing to combat software piracy, recommends how managers should deal with this problem, and provides a role-playing exercise to help students understand the issues in software piracy. (SR)

  16. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  17. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  18. Software defined radio architectures evaluation

    OpenAIRE

    Palomo, Alvaro; Villing, Rudi; Farrell, Ronan

    2008-01-01

    This paper presents a performance evaluation of GNU Radio and OSSIE, two open source Software Defined Radio (SDR) architectures. The two architectures were compared by running implementations of a BPSK waveform utilising a software loopback channel on each. The upper bound full duplex throughput was found to be around 700 kbps in both cases, though OSSIE was slightly faster than GNU Radio. CPU and memory loads did not differ significantly.

  19. Software and systems traceability

    CERN Document Server

    Cleland-Huang, Jane; Zisman, Andrea

    2012-01-01

    ""Software and Systems Traceability"" provides a comprehensive description of the practices and theories of software traceability across all phases of the software development lifecycle. The term software traceability is derived from the concept of requirements traceability. Requirements traceability is the ability to track a requirement all the way from its origins to the downstream work products that implement that requirement in a software system. Software traceability is defined as the ability to relate the various types of software artefacts created during the development of software syst

  20. Developing the Web-based Software for Network Performance Monitoring by Java Technology

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The methods of developing Web-based software for network performance monitoring with Java technology are discussed; the approach uses a three-tier network model and technologies including JMX, JDBC, RMI, and Swing.

  1. Man versus Machine: Software Training for Surgeons-An Objective Evaluation of Human and Computer-Based Training Tools for Cataract Surgical Performance.

    Science.gov (United States)

    Din, Nizar; Smith, Phillip; Emeriewen, Krisztina; Sharma, Anant; Jones, Simon; Wawrzynski, James; Tang, Hongying; Sullivan, Paul; Caputo, Silvestro; Saleh, George M

    2016-01-01

    This study aimed to address two queries: firstly, the relationship between two cataract surgical feedback tools for training, one human and one software based, and, secondly, evaluating microscope control during phacoemulsification using the software. Videos of surgeons with varying experience were enrolled and independently scored with the validated PhacoTrack motion capture software and the Objective Structured Assessment of Cataract Surgical Skill (OSACCS) human scoring tool. Microscope centration and path length travelled were also evaluated with the PhacoTrack software. Twenty-two videos correlated PhacoTrack motion capture with OSACCS. The PhacoTrack path length, number of movements, and total procedure time were found to have high levels of Spearman's rank correlation of -0.6792619 (p = 0.001), -0.6652021 (p = 0.002), and -0.771529 (p = 0.001), respectively, with OSACCS. Sixty-two videos evaluated microscope camera control. Novice surgeons had their camera off the pupil centre at a far greater mean distance (SD) of 6.9 (3.3) mm, compared with experts of 3.6 (1.6) mm (p ≪ 0.05). The expert surgeons maintained good microscope camera control and limited total pupil path length travelled 2512 (1031) mm compared with novices of 4049 (2709) mm (p ≪ 0.05). Good agreement between human and machine quantified measurements of surgical skill exists. Our results demonstrate that surrogate markers for camera control are predictors of surgical skills.
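
    Path length and microscope centration are simple geometric quantities once motion-capture coordinates are available. The following Python sketch shows one way to compute them; the coordinates, units, and reference centre are fabricated for illustration and do not reflect PhacoTrack's actual data format.

```python
import numpy as np

def path_length(points: np.ndarray) -> float:
    """Total distance travelled along a sequence of 2-D positions (mm)."""
    return float(np.sum(np.linalg.norm(np.diff(points, axis=0), axis=1)))

def mean_offset(points: np.ndarray, centre: np.ndarray) -> float:
    """Mean distance of the tracked point from a reference centre (mm)."""
    return float(np.mean(np.linalg.norm(points - centre, axis=1)))

# Fabricated coordinates purely for illustration
track = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 2.5], [4.0, 1.0]])
print(path_length(track), mean_offset(track, np.array([2.0, 1.5])))
```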

  2. Maximizing ROI on software development

    CERN Document Server

    Sikka, Vijay

    2004-01-01

    A brief review of software development history. Software complexity crisis. Software development ROI. The case for global software development and testing. Software quality and test ROI. How do you implement global software development and testing. Case studies.

  3. FPGAs for software programmers

    CERN Document Server

    Hannig, Frank; Ziener, Daniel

    2016-01-01

    This book makes powerful Field Programmable Gate Array (FPGA) and reconfigurable technology accessible to software engineers by covering different state-of-the-art high-level synthesis approaches (e.g., OpenCL and several C-to-gates compilers). It introduces FPGA technology, its programming model, and how various applications can be implemented on FPGAs without going through low-level hardware design phases. Readers will get a realistic sense for problems that are suited for FPGAs and how to implement them from a software designer’s point of view. The authors demonstrate that FPGAs and their programming model reflect the needs of stream processing problems much better than traditional CPU or GPU architectures, making them well-suited for a wide variety of systems, from embedded systems performing sensor processing to large setups for Big Data number crunching. This book serves as an invaluable tool for software designers and FPGA design engineers who are interested in high design productivity through behavi...

  4. The ALMA software architecture

    Science.gov (United States)

    Schwarz, Joseph; Farris, Allen; Sommer, Heiko

    2004-09-01

    The software for the Atacama Large Millimeter Array (ALMA) is being developed by many institutes on two continents. The software itself will function in a distributed environment, from the 0.5-14 km baselines that separate antennas to the larger distances that separate the array site at the Llano de Chajnantor in Chile from the operations and user support facilities in Chile, North America and Europe. Distributed development demands 1) interfaces that allow separated groups to work with minimal dependence on their counterparts at other locations; and 2) a common architecture to minimize duplication and ensure that developers can always perform similar tasks in a similar way. The Container/Component model provides a blueprint for the separation of functional from technical concerns: application developers concentrate on implementing functionality in Components, which depend on Containers to provide them with services such as access to remote resources, transparent serialization of entity objects to XML, logging, error handling and security. Early system integrations have verified that this architecture is sound and that developers can successfully exploit its features. The Containers and their services are provided by a system-oriented development team as part of the ALMA Common Software (ACS), middleware that is based on CORBA.

  5. Software Metrics Evaluation Based on Entropy

    CERN Document Server

    Selvarani, R; Ramachandran, Muthu; Prasad, Kamakshi

    2010-01-01

    Software engineering activities in the industry have come a long way, with various improvements brought in at different stages of the software development life cycle. The complexity of modern software, the commercial constraints and the expectation for high quality products demand accurate fault prediction based on OO design metrics at the class level in the early stages of software development. The object oriented class metrics are used as quality predictors in the entire OO software development life cycle even when a highly iterative, incremental model or agile software process is employed. Recent research has shown some of the OO design metrics are useful for predicting fault-proneness of classes. In this paper the empirical validation of a set of metrics proposed by Chidamber and Kemerer is performed to assess their ability in predicting software quality in terms of fault proneness and degradation. We have also proposed the design complexity of object-oriented software with Weighted Methods per Class m...
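
    As a rough illustration of the kind of class-level measurement discussed here, the sketch below computes Weighted Methods per Class (WMC) and a Shannon-entropy weighting of the per-method complexity distribution in Python. The entropy weighting is only one plausible interpretation, not necessarily the scheme used by the authors, and the complexity values are invented.

```python
import math

def weighted_methods_per_class(method_complexities):
    """WMC as defined by Chidamber and Kemerer: sum of the per-method complexities."""
    return sum(method_complexities)

def complexity_entropy(method_complexities):
    """Shannon entropy of the complexity distribution across a class's methods;
    one possible way to weight WMC, not necessarily the authors' scheme."""
    total = sum(method_complexities)
    probs = [c / total for c in method_complexities]
    return -sum(p * math.log2(p) for p in probs if p > 0)

methods = [1, 1, 2, 3, 5]   # e.g. cyclomatic complexity of each method in one class
print(weighted_methods_per_class(methods), round(complexity_entropy(methods), 3))
```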

  6. Fault tree analysis of KNICS RPS software

    Energy Technology Data Exchange (ETDEWEB)

    Park, Gee Yong; Kwon, Kee Choon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Koh, Kwang Yong; Jee, Eun Kyoung; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Lee, Dae Hyung [Doosan Heavy Industries and Construction, Yongin (Korea, Republic of)

    2008-08-15

    This paper describes the application of a software Fault Tree Analysis (FTA) as one of the analysis techniques for a Software Safety Analysis (SSA) at the design phase and its analysis results for the safety-critical software of a digital reactor protection system, which is called the KNICS RPS, being developed in the KNICS (Korea Nuclear Instrumentation and Control Systems) project. The software modules in the design description were represented by Function Blocks (FBs), and the software FTA was performed based on the well-defined fault tree templates for the FBs. The SSA, which is part of the verification and validation (V and V) activities, was activated at each phase of the software lifecycle for the KNICS RPS. At the design phase, the software HAZOP (Hazard and Operability) and the software FTA were employed in the SSA in such a way that the software HAZOP was performed first and then the software FTA was applied. The software FTA was applied to some critical modules selected from the software HAZOP analysis.
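
    The quantitative core of a fault tree evaluation is combining basic-event probabilities through AND/OR gates. The Python sketch below shows that calculation for a made-up tree of function-block failures; the structure and probabilities are illustrative and unrelated to the actual KNICS RPS templates.

```python
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class BasicEvent:
    name: str
    prob: float   # failure probability of a function block / module

@dataclass
class Gate:
    kind: str                                                 # "AND" or "OR"
    children: List[Union["Gate", BasicEvent]] = field(default_factory=list)

def probability(node) -> float:
    """Top-event probability assuming independent basic events."""
    if isinstance(node, BasicEvent):
        return node.prob
    child_probs = [probability(c) for c in node.children]
    p = 1.0
    if node.kind == "AND":
        for q in child_probs:
            p *= q
        return p
    # OR gate: 1 minus the product of the survival probabilities
    for q in child_probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical tree: trip fails if both comparison FBs fail, or the voter FB fails
top = Gate("OR", [Gate("AND", [BasicEvent("FB_compare_A", 1e-3),
                               BasicEvent("FB_compare_B", 1e-3)]),
                  BasicEvent("FB_voter", 1e-4)])
print(probability(top))
```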
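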

  7. Modern Tools for Modern Software

    Energy Technology Data Exchange (ETDEWEB)

    Kumfert, G; Epperly, T

    2001-10-31

    This is a proposal for a new software configure/build tool for building, maintaining, deploying, and installing software. At its completion, this new tool will replace current standard tool suites such as "autoconf", "automake", "libtool", and the de facto standard build tool, "make". This ambitious project is born out of the realization that as scientific software has grown in size and complexity over the years, the difficulty of configuring and building software has increased as well. For high performance scientific software, additional complexity often arises from the need for portability to multiple platforms (including many one-of-a-kind platforms), multilanguage implementations, use of third party libraries, and a need to adapt algorithms to the specific features of the hardware. Development of scientific software is being hampered by the quality of configuration and build tools commonly available. Inordinate amounts of time and expertise are required to develop and maintain the configure and build system for a moderately complex project. Better build and configure tools will increase developer productivity. This proposal is a first step in a process of shoring up the foundation upon which DOE software is created and used.

  8. Application of Formal Methods in Software Engineering

    Directory of Open Access Journals (Sweden)

    Adriana Morales

    2011-12-01

    The purpose of this research work is to examine: (1) why formal methods are necessary for software systems today, (2) high-integrity systems built through the Correctness-by-Construction (C-by-C) methodology, and (3) an affordable methodology for applying formal methods in software engineering. The research process included reviews of the literature on the Internet, in publications, and in conference presentations. Among the research results it was found that: (1) the dependence of nations, companies, and people on software systems is increasing, (2) there is growing demand for software engineering to increase social trust in software systems, (3) methodologies exist, such as C-by-C, that can provide that level of trust, (4) formal methods constitute a principle of computer science that can be applied in software engineering to make software development processes reliable, (5) software users have the responsibility to demand reliable software products, and (6) software engineers have the responsibility to develop reliable software products. Furthermore, it is concluded that: (1) more research is needed to identify and analyze other methodologies and tools that provide processes for applying formal software engineering methods, (2) formal methods provide an unprecedented ability to increase trust in the correctness of software products, and (3) with the development of new methodologies and tools, cost is no longer a disadvantage for the application of formal methods.

  9. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  10. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  11. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  12. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  13. Software distribution using xnetlib

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science]|[Oak Ridge National Lab., TN (US); Rowan, T.H. [Oak Ridge National Lab., TN (US); Wade, R.C. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.

  14. Image Processing Software

    Science.gov (United States)

    Bosio, M. A.

    1990-11-01

    A brief description of astronomical image-processing software is presented. This software was developed on a Digital MicroVAX II computer system. Keywords: DATA ANALYSIS - IMAGE PROCESSING

  15. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  16. Software productivity improvement through software engineering technology

    Science.gov (United States)

    Mcgarry, F. E.

    1985-01-01

    It has been estimated that NASA expends anywhere from 6 to 10 percent of its annual budget on the acquisition, implementation and maintenance of computer software. Although researchers have produced numerous software engineering approaches over the past 5-10 years, each claiming to be more effective than the other, there is very limited quantitative information verifying the measurable impact that any of these technologies may have in a production environment. At NASA/GSFC, an extended research effort aimed at identifying and measuring software techniques that favorably impact the productivity of software development has been active over the past 8 years. Specific, measurable software development technologies have been applied and measured in a production environment. The resulting software development approaches have been shown to be effective in both improving quality as well as productivity in this one environment.

  17. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry's most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  18. A user's guide to the GoldSim/BLT-MS integrated software package: a low-level radioactive waste disposal performance assessment model.

    Energy Technology Data Exchange (ETDEWEB)

    Knowlton, Robert G.; Arnold, Bill Walter; Mattie, Patrick D.

    2007-03-01

    Sandia National Laboratories (Sandia), a U.S. Department of Energy National Laboratory, has over 30 years of experience in the assessment of radioactive waste disposal and at the time of this publication is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. In countries with small radioactive waste programs, international technology transfer program efforts are often hampered by small budgets, schedule constraints, and a lack of experienced personnel. In an effort to surmount these difficulties, Sandia has developed a system that utilizes a combination of commercially available software codes and existing legacy codes for probabilistic safety assessment modeling that facilitates the technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission (NRC) and codes developed and maintained by the United States Department of Energy are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new codes. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software, which meets the rigors of both domestic regulatory requirements and international peer review. Therefore, revitalization of deterministic legacy codes, as well as an adaptation of contemporary deterministic codes, provides a credible and solid computational platform for constructing probabilistic safety assessment models. This document is a reference user's guide for the GoldSim/BLT-MS integrated modeling software package developed as part of a cooperative technology transfer project between Sandia National Laboratories and the Institute of Nuclear Energy Research (INER) in Taiwan for the preliminary assessment of several candidate low

  19. Software for the LHCb experiment

    CERN Document Server

    Corti, Gloria; Belyaev, Ivan; Cattaneo, Marco; Charpentier, Philippe; Frank, Markus; Koppenburg, Patrick; Mato-Vila, P; Ranjard, Florence; Roiser, Stefan

    2006-01-01

    LHCb is an experiment for precision measurements of CP-violation and rare decays in B mesons at the LHC collider at CERN. The LHCb software development strategy follows an architecture-centric approach as a way of creating a resilient software framework that can withstand changes in requirements and technology over the expected long lifetime of the experiment. The software architecture, called GAUDI, supports event data processing applications that run in different processing environments ranging from the real-time high-level triggers in the online system to the final physics analysis performed by more than one hundred physicists. The major architectural design choices and the arguments that lead to these choices will be outlined. Object oriented technologies have been used throughout. Initially developed for the LHCb experiment, GAUDI has been adopted and extended by other experiments. Several iterations of the GAUDI software framework have been released and are now being used routinely by the physicists of...

  20. Software Engineering for Practiced Software Enhancement

    Directory of Open Access Journals (Sweden)

    Rashmi Yadav

    2011-03-01

    The software development scenario, particularly in IT industries, is very competitive and demands development with minimum resources. Software development started and prevailed to an extent in industry without the use of software engineering practices, which were perceived as an overhead. This approach causes overuse of resources such as money, man-hours, and hardware components. This paper attempts to present the causes of these inefficiencies in an almost exhaustive way. Further, an attempt has been made to elaborate software engineering methods as remedies against the listed causes of development inefficiencies.

  1. Software Metrics for Identifying Software Size in Software Development Projects

    Directory of Open Access Journals (Sweden)

    V.S.P Vidanapathirana

    2015-11-01

    Measurements are fundamental to any engineering discipline. They indicate the amount, extent, dimension or capacity of an attribute or a product in a quantitative manner. The analyzed results of the measured data form the basic idea of metrics. A metric is a quantitative representation of the measurements of the degree to which a system, component, or process possesses a given attribute. When it comes to software, metrics cover a wide scope of measurements of computer programming. Size-oriented metrics take a main role among them, since they can be used as the key to better estimation, to improve trust and confidence, and to gain better control over software products. Software professionals have traditionally measured the size of software applications by using several methods. In this paper the researchers discuss software size metrics for identifying software size, focusing mainly on software development projects in today's Information Technology (IT) industry.

  2. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform post-mortem assessments.
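
    SEPS is built on system-dynamics feedback loops. The toy Python loop below illustrates the general idea (schedule pressure raises throughput but also rework) with invented coefficients; it is not the SEPS model itself.

```python
# Toy system-dynamics loop with invented coefficients, not the actual SEPS equations:
# schedule pressure increases weekly throughput but also the rework flowing back in.
tasks_remaining = 1000.0     # person-days of work in the backlog
nominal_rate = 25.0          # person-days completed per week at normal effort
deadline_weeks = 40.0
weeks = 0

while tasks_remaining > 0 and weeks < 200:
    weeks_needed = tasks_remaining / nominal_rate
    pressure = max(0.0, weeks_needed - (deadline_weeks - weeks)) / deadline_weeks
    rate = nominal_rate * (1.0 + 0.3 * pressure)           # overtime boosts throughput...
    error_fraction = min(0.5, 0.05 + 0.15 * pressure)      # ...but generates more rework
    completed = min(rate, tasks_remaining)
    tasks_remaining += completed * error_fraction - completed  # rework re-enters the backlog
    weeks += 1

print(f"project finishes after roughly {weeks} weeks")
```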

  3. Software Cost Estimation Review

    OpenAIRE

    Ongere, Alphonce

    2013-01-01

    Software cost estimation is the process of predicting the effort, the time and the cost required to complete a software project successfully. It involves size measurement of the software project to be produced, estimating and allocating the effort, drawing the project schedules, and finally, estimating overall cost of the project. Accurate estimation of software project cost is an important factor for business and the welfare of software organization in general. If cost and effort estimat...

  4. Software Partitioning Technologies

    Science.gov (United States)

    2001-05-29

    [Report documentation page residue. Recoverable content: briefing "Software Partitioning Technologies" by Tim Skutt, Smiths Aerospace, Grand Rapids, MI; 12 pages. Agenda: software partitioning overview; Smiths software partitioning technology, including partition-level and module-level OS services (timers, MMU, I/O, API-layer partitioning services).]

  5. Software For Diagnosis Of Parallel Processing

    Science.gov (United States)

    Hontalas, Philip; Yan, Jerry; Fineman, Charles

    1995-01-01

    The Ames Instrumentation System (AIMS) computer program is a package of software tools for measuring and analyzing the performance of parallel-processing application programs. It helps the programmer to debug and refine, and to monitor and visualize the execution of, parallel-processing application software for the Intel iPSC/860 (or equivalent) multicomputer. Collected performance data are displayed graphically on computer workstations supporting X-Windows.

  6. ACTS: from ATLAS software towards a common track reconstruction software

    CERN Document Server

    Gumpert, Christian; The ATLAS collaboration

    2016-01-01

    The reconstruction of charged particle trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic package, which can be built against the Gaudi(Hive) framework. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The softw...

  7. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process use to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment

  8. TMT approach to observatory software development process

    Science.gov (United States)

    Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder

    2016-07-01

    The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate

  9. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  10. Payload software technology: Software technology development plan

    Science.gov (United States)

    1977-01-01

    Programmatic requirements for the advancement of software technology are identified for meeting the space flight requirements in the 1980 to 1990 time period. The development items are described, and software technology item derivation worksheets are presented along with the cost/time/priority assessments.

  11. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation...

  12. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.
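
    Surveys of this kind typically cover classical reliability growth models. As a hedged example of what such a model computes, the sketch below evaluates the Goel-Okumoto NHPP mean value function with assumed parameters; it is not a model proposed in the report.

```python
import math

def goel_okumoto_mean_failures(t: float, a: float, b: float) -> float:
    """Expected cumulative failures by time t in the Goel-Okumoto NHPP model:
    mu(t) = a * (1 - exp(-b * t)), with a = total latent faults, b = detection rate."""
    return a * (1.0 - math.exp(-b * t))

a, b = 120.0, 0.05   # hypothetical parameters: 120 latent faults, detection rate 0.05/day
for t in (10, 50, 100, 200):
    found = goel_okumoto_mean_failures(t, a, b)
    print(f"day {t:3d}: expected faults found {found:6.1f}, remaining {a - found:5.1f}")
```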

  13. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  14. Exploring Enterprise, System of Systems, and System and Software Architectures

    Science.gov (United States)

    2016-06-13

    [Report documentation page residue. Recoverable content: "Exploring Enterprise, System of Systems, and System and Software Architectures," Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA, 2009.]

  15. Designing Evaluation Software for Long-Term Stability Performance of Satellite Clock Using Autonomous Monitoring Information

    Institute of Scientific and Technical Information of China (English)

    崔小准; 王璐; 毕少筠; 董海青

    2015-01-01

    The method of estimating the long-term stability of on-board satellite clocks from autonomous monitoring of clock phase differences is simple to implement for in-orbit navigation satellites and meets the needs of actual assessment, so corresponding software has been developed. The software reads the self-monitoring information demodulated by ground receivers, computes the phase difference between the master and slave clocks for each observation period, and then evaluates the long-term stability of the satellite clock from these analyses. The software corrects for full-scale rollover of the phase meters monitoring the master and slave clocks and rejects outliers, which compensates for the phase-difference calculation errors caused by the deviation of each clock's actual frequency from its nominal frequency, removes low-probability frequency jumps, and thereby reduces the error of the long-term stability assessment. The software has a friendly interface, is easy to operate, and has been applied in the in-orbit testing and management of navigation satellites.
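
    Long-term clock stability is conventionally characterized with the Allan deviation computed from phase data, after correcting phase-meter rollover and removing outliers. The Python sketch below illustrates a rollover correction and the overlapping Allan deviation on synthetic phase data; it is an assumption that the software uses this particular statistic, since the abstract does not name one.

```python
import numpy as np

def unwrap_phase_meter(readings, full_scale):
    """Correct phase-meter rollover: when successive readings jump by more than half
    the full-scale range, assume the meter wrapped and add or subtract one span."""
    readings = np.asarray(readings, dtype=float)
    corrected = readings.copy()
    offset = 0.0
    for i in range(1, len(readings)):
        step = readings[i] - readings[i - 1]
        if step > full_scale / 2:
            offset -= full_scale
        elif step < -full_scale / 2:
            offset += full_scale
        corrected[i] = readings[i] + offset
    return corrected

def overlapping_allan_deviation(x, tau0, m):
    """Overlapping Allan deviation of phase data x (seconds) at tau = m * tau0."""
    x = np.asarray(x, dtype=float)
    d = x[2 * m:] - 2 * x[m:-m] + x[:-2 * m]
    avar = np.sum(d ** 2) / (2.0 * (m * tau0) ** 2 * d.size)
    return np.sqrt(avar)

# Rollover example with a 1-unit full scale, then ADEV on fabricated phase noise
print(unwrap_phase_meter([0.8, 0.9, 0.05, 0.15], full_scale=1.0))
rng = np.random.default_rng(0)
phase = np.cumsum(rng.normal(0.0, 1e-9, 5000))   # synthetic phase samples every 100 s
print(overlapping_allan_deviation(phase, tau0=100.0, m=10))
```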

  16. Strengthening Software Authentication with the ROSE Software Suite

    Energy Technology Data Exchange (ETDEWEB)

    White, G

    2006-06-15

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.

  17. Software Design for Smile Analysis

    Directory of Open Access Journals (Sweden)

    A. Sarkhosh

    2010-12-01

    Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record the "posed smile" as an intentional, non-pressure, static, natural and reproducible smile. The record should then be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software "Smile Analysis", which can receive patients' photographs and videographs. After giving records to the software, the operator should mark the points and lines which are displayed on the system's guide and also define the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (=0.7-1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the smile analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of the treatment progress.

  18. Software-Defined Cluster

    Institute of Scientific and Technical Information of China (English)

    聂华; 杨晓君; 刘淘英

    2015-01-01

    The cluster architecture has played an important role in high-end computing for the past 20 years. With the advent of Internet services, big data, and cloud computing, traditional clusters face three challenges: 1) providing flexible system balance among computing, memory, and I/O capabilities; 2) reducing resource pooling overheads; and 3) addressing low performance-power efficiency. This position paper proposes a software-defined cluster (SDC) architecture to deal with these challenges. The SDC architecture inherits two features of the traditional cluster: its architecture is multicomputer and it has loosely-coupled interconnect. SDC provides two new mechanisms: global I/O space (GIO) and hardware-supported native access (HNA) to remote devices. Application software can define a virtual cluster best suited to its needs from resource pools provided by a physical cluster, and traditional cluster ecosystems need no modification. We also discuss a prototype design and implementation of a 32-processor cloud server utilizing the SDC architecture.

  19. CSAM Metrology Software Tool

    Science.gov (United States)

    Vu, Duc; Sandor, Michael; Agarwal, Shri

    2005-01-01

    CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners. Hence, interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing. The results can be exported to any data-base-processing software.
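
    The abstract does not specify how CMeST maps colours to delamination, so the following Python sketch only illustrates the general idea of thresholding a false-colour image and reporting the flagged area fraction; the channel test and margin are invented.

```python
import numpy as np

def delamination_fraction(rgb: np.ndarray, margin: int = 30) -> float:
    """Fraction of pixels flagged as delaminated in an H x W x 3 uint8 image.
    Here a pixel is flagged when its red channel clearly dominates green and blue."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    mask = (r > g + margin) & (r > b + margin)
    return float(mask.mean())

# Synthetic false-colour image with one "delaminated" rectangle
image = np.zeros((100, 100, 3), dtype=np.uint8)
image[20:40, 30:70] = (220, 40, 40)
print(f"delaminated area: {delamination_fraction(image):.1%}")
```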

  20. Software Defined Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Ian; Blaiszik, Ben; Chard, Kyle; Chard, Ryan

    2017-07-17

    Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policies by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches. We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.
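
    The if-trigger-then-action (IFTA) idea can be illustrated with a very small rule engine. The Python sketch below is only a schematic reading of the abstract; the rule syntax, event fields, and actions are hypothetical, not the authors' notation.

```python
from typing import Callable, Dict, List, Tuple

class RuleEngine:
    """Holds (trigger, action) pairs and fires every action whose trigger matches."""

    def __init__(self) -> None:
        self.rules: List[Tuple[Callable[[Dict], bool], Callable[[Dict], None]]] = []

    def when(self, trigger: Callable[[Dict], bool], then: Callable[[Dict], None]) -> None:
        self.rules.append((trigger, then))

    def handle(self, event: Dict) -> None:
        for trigger, action in self.rules:
            if trigger(event):
                action(event)

engine = RuleEngine()
# Hypothetical rule: when a new .tif file lands in an instrument directory,
# queue it for metadata extraction and indexing.
engine.when(
    trigger=lambda e: e["type"] == "file_created" and e["path"].endswith(".tif"),
    then=lambda e: print(f"queueing {e['path']} for extraction and indexing"),
)
engine.handle({"type": "file_created", "path": "/beamline/scan_0042.tif"})
```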

  1. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  2. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square-feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  3. ATLAS software packaging

    CERN Document Server

    Rybkin, G

    2012-01-01

    Software packaging is an indispensable part of the build process and a prerequisite for deployment. The full ATLAS software stack consists of TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present the tools, in the package PackDist, developed and used to package all this software except for the TDAQ project. PackDist is based on and driven by CMT, the ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is the CMT project. Each CMT project is packaged as several packages - platform dependent (one per platform available), source code excluding header files, other platform independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, the distribution kit, also includes configuration packages and contains some 120 packages for one platform. Also packaged are physics analysis pro...

  4. Commercial Data Mining Software

    Science.gov (United States)

    Zhang, Qingyu; Segall, Richard S.

    This chapter discusses selected commercial software for data mining, supercomputing data mining, text mining, and web mining. The selected software are compared with their features and also applied to available data sets. The software for data mining are SAS Enterprise Miner, Megaputer PolyAnalyst 5.0, PASW (formerly SPSS Clementine), IBM Intelligent Miner, and BioDiscovery GeneSight. The software for supercomputing are Avizo by Visualization Science Group and JMP Genomics from SAS Institute. The software for text mining are SAS Text Miner and Megaputer PolyAnalyst 5.0. The software for web mining are Megaputer PolyAnalyst and SPSS Clementine. Background on related literature and software are presented. Screen shots of each of the selected software are presented, as are conclusions and future directions.

  5. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic impact in software production. As is also apparent from the call for papers these perspectives focus on qualities such as reuse, adaptability, and maintainability.

  6. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concerns in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and qualit

  7. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic impact in software production. As is also apparent from the call for papers these perspectives focus on qualities such as reuse, adaptability, and maintainability.

  8. Software engineering measurement

    CERN Document Server

    Munson, PhD, John C

    2003-01-01

    By demonstrating how to develop simple experiments for the empirical validation of theoretical research and showing how to convert measurement data into meaningful and valuable information, this text fosters more precise use of software measurement in the computer science and software engineering literature. Software Engineering Measurement shows you how to convert your measurement data to valuable information that can be used immediately for software process improvement.

  9. Strategies for successful software development risk management

    Directory of Open Access Journals (Sweden)

    Marija Boban

    2003-01-01

    Nowadays, software is becoming a major part of enterprise business. Software development is an activity connected with advanced technology and a high level of knowledge. Risks on software development projects must be successfully mitigated to produce successful software systems. Lack of a defined approach to risk management is one of the common causes of project failures. To improve a project's chances of success, this work investigates common risk impact areas to provide a foundation that can be used to define a common approach to software risk management. Based on typical risk impact areas on software development projects, we propose three risk management strategies suitable for a broad range of enterprises and software development projects with different amounts of connected risks. The proposed strategies define activities that should be performed for successful risk management, ones that will enable software development projects to perceive risks as soon as possible and to solve problems connected with risk materialization. We also propose a risk-based approach to software development planning and risk management as attempts to address and retire the highest impact risks as early as possible in the development process. The proposed strategies should improve risk management on software development projects and help create a successful software solution.

  10. The Impact of Computer and Mathematics Software Usage on Performance of School Leavers in the Western Cape Province of South Africa: A Comparative Analysis

    Science.gov (United States)

    Smith, Garth Spencer; Hardman, Joanne

    2014-01-01

    In this study the impact of computer immersion on school leavers' Senior Certificate mathematics scores was investigated across 31 schools in the EMDC East education district of Cape Town, South Africa, by comparing performance between two groups: a control and an experimental group. The experimental group (14 high schools) had access…

  11. Software variability management

    NARCIS (Netherlands)

    Bosch, J; Nord, RL

    2004-01-01

    During recent years, the amount of variability that has to be supported by a software artefact is growing considerably and its management is evolving into a major challenge during development, usage, and evolution of software artefacts. Successful management of variability in software leads to

  12. Software Language Evolution

    NARCIS (Netherlands)

    Vermolen, S.D.

    2012-01-01

    Software plays a critical role in our daily life. Vast amounts of money are spent on more and more complex systems. All software, regardless of whether it controls a plane or the game on your phone, is never finished. Software changes when it contains bugs or when new functionality is added. This process of

  13. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  14. Java for flight software

    Science.gov (United States)

    Benowitz, E.; Niessner, A.

    2003-01-01

    This work involves developing representative mission-critical spacecraft software using the Real-Time Specification for Java (RTSJ). This work currently leverages actual flight software used in NASA's Deep Space 1 (DS1) mission, which flew in 1998.

  17. Avionics Simulation, Development and Software Engineering

    Science.gov (United States)

    2002-01-01

    During this reporting period, all technical responsibilities were accomplished as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14), the MSFC EXPRESS Project Office (FD31), and the Huntsville Boeing Company. Accomplishments included: performing special tasks; supporting Software Review Board (SRB), Avionics Test Bed (ATB), and EXPRESS Software Control Panel (ESCP) activities; participating in technical meetings; and coordinating issues between the Boeing Company and the MSFC Project Office.

  18. Emerging Technologies for Software-Reliant Systems

    Science.gov (United States)

    2016-06-07

    Emerging Technologies for Software-Reliant Systems, a webinar by Grace A. Lewis (glewis@sei.cmu.edu), SEI Webinar, February 24, 2011. Carnegie Mellon University, Software Engineering Institute, Pittsburgh, PA 15213.

  19. Software Maintenance Success Recipes

    CERN Document Server

    Reifer, Donald J

    2011-01-01

    Dispelling much of the folklore surrounding software maintenance, Software Maintenance Success Recipes identifies actionable formulas for success based on in-depth analysis of more than 200 real-world maintenance projects. It details the set of factors that are usually present when effective software maintenance teams do their work and instructs on the methods required to achieve success. Donald J. Reifer--an award winner for his contributions to the field of software engineering and whose experience includes managing the DoD Software Initiatives Office--provides step-by-step guidance on how t

  20. Funding Research Software Development

    Science.gov (United States)

    Momcheva, Ivelina G.

    2017-01-01

    Astronomical software is used by each and every member of our scientific community. Purpose-built software is becoming ever more critical as we enter the regime of large datasets and simulations of increasing complexity. However, financial investments in building, maintaining and renovating the software infrastructure have been uneven. In this talk I will summarize past and current funding sources for astronomical software development, discuss other models of funding and introduce a new initiative for supporting community software at STScI. The purpose of this talk is to prompt discussion about how we allocate resources to this vital infrastructure.

  1. Management of Software Development Projects

    Directory of Open Access Journals (Sweden)

    Felician ALECU

    2011-04-01

    Full Text Available Any major software development starts with the Initiating process group. Once the charter document is approved, the Planning and then the Executing stages will follow. Monitoring and Controlling measures the potential performance deviation of the project in terms of schedule and cost and performs the related Integrated Change Control activities. At the end, during Closing, the program/project manager will check that the entire work is completed and the objectives are met.

  2. Management of Software Development Projects

    OpenAIRE

    Felician ALECU

    2011-01-01

    Any major software development starts with the Initiating process group. Once the charter document is approved, the Planning and then the Executing stages will follow. Monitoring and Controlling measures the potential performance deviation of the project in terms of schedule and cost and performs the related Integrated Change Control activities. At the end, during Closing, the program/project manager will check that the entire work is completed and the objectives are met.

  3. Performance Evaluation on Open Source Software Project Based on DEA Method Research

    Institute of Scientific and Technical Information of China (English)

    曾进群; 杨建梅; 陈泉

    2013-01-01

    With the popularization and development of network technology, open source software projects based on new modes of production and innovation have emerged, and how to evaluate the performance of this particular kind of project deserves in-depth study. Taking the C# sub-community of the codeplex open source community as an example, this paper summarizes the basic characteristics of the codeplex community, discusses the decision-making units and index system of the DEA method, carries out a performance evaluation of 51 selected open source software projects, and finally puts forward suggestions for open source software project development based on the evaluation results.
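
    As a rough illustration of the kind of calculation the abstract refers to, the sketch below solves the input-oriented CCR model of Data Envelopment Analysis as a linear program with scipy.optimize.linprog. The three projects and their input/output figures are hypothetical and are not taken from the paper's 51-project dataset.

```python
# Minimal sketch of the input-oriented CCR DEA model (multiplier form),
# solved with scipy.optimize.linprog. The three projects and their
# input/output figures below are hypothetical, for illustration only.
import numpy as np
from scipy.optimize import linprog

# inputs X (e.g. developers, project age in months), outputs Y (e.g. commits, downloads)
X = np.array([[5.0, 24.0],
              [8.0, 36.0],
              [3.0, 12.0]])
Y = np.array([[1200.0, 9000.0],
              [1500.0, 8000.0],
              [ 600.0, 7000.0]])

def ccr_efficiency(o, X, Y):
    """Efficiency score of decision-making unit (DMU) o under constant returns to scale."""
    n, m = X.shape          # n DMUs, m inputs
    _, s = Y.shape          # s outputs
    # decision vector z = [v_1..v_m, u_1..u_s]; maximise u.y_o  ->  minimise -u.y_o
    c = np.concatenate([np.zeros(m), -Y[o]])
    # normalisation constraint: v.x_o = 1
    A_eq = [np.concatenate([X[o], np.zeros(s)])]
    b_eq = [1.0]
    # for every DMU j: u.y_j - v.x_j <= 0
    A_ub = np.hstack([-X, Y])
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (m + s), method="highs")
    return -res.fun

for o in range(len(X)):
    print(f"project {o}: efficiency = {ccr_efficiency(o, X, Y):.3f}")
```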

  4. Patterns of Software Development Process

    Directory of Open Access Journals (Sweden)

    Sandro Javier Bolaños Castro

    2011-12-01

    Full Text Available This article presents a set of patterns that capture best practices in software processes, directly related to the problem of implementing the activities of the process, the roles involved, the knowledge generated, and the inputs and outputs belonging to the process. In this work, a definition of the architecture is encouraged by using different recurrent configurations that strengthen the process and yield efficient results for the development of a software project. The patterns presented constitute a catalog, which serves as a vocabulary for communication among project participants [1], [2], and can also be implemented through software tools, thus facilitating patterns implementation [3]. Additionally, a tool that can be obtained under the GPL (General Public License) is provided for this purpose.

  5. An Empirical Study of a Free Software Company

    OpenAIRE

    Pakusch, Cato

    2010-01-01

    Free software has matured well into the commercial software market, yet little qualitative research exists which accurately describes the state of commercial free software today. For this thesis, an instrumental case study was performed on a prominent free software company in Norway. The study found that the commercial free software market is largely driven by social networks, which carry a social capital of their own that attracts more people, who in turn become members of the ...

  6. Quality-driven multi-objective optimization of software architecture design : method, tool, and application

    NARCIS (Netherlands)

    Etemadi Idgahi (Etemaadi), Ramin

    2014-01-01

    Software architecting is a non-trivial and demanding task for software engineers to perform. The architecture is a key enabler for software systems. Besides being crucial for user functionality, the software architecture has deep impact on software qualities such as performance, safety, and cost.

  8. Good practices for educational software engineering projects

    NARCIS (Netherlands)

    van der Duim, Louwarnoud; Andersson, Jesper; Sinnema, Marco

    2007-01-01

    Recent publications indicate the importance of software engineering in the computer science curriculum. In this paper, we present the final part of software engineering education at University of Groningen in the Netherlands and Vaxjo University in Sweden, where student teams perform an industrial

  10. Effective Packet-level FEC Software Coding

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper introduces an effective software-based FEC redundant packet generating algorithm. The algorithm is based on Reed-Solomon coding over a Galois Field. By operating on words of packets and performing polynomial multiplication via lookup tables, software coding efficiency is achieved that satisfies the needs of most computer network applications. The approach to generating the lookup tables is detailed.
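
    The table-driven Galois Field arithmetic the abstract alludes to can be sketched as follows. This is a minimal illustration assuming GF(2^8) with the common primitive polynomial 0x11d; the parity scheme shown (one fixed coefficient row per redundant packet) is a simplified stand-in for a full Reed-Solomon erasure code, not the paper's algorithm.

```python
# Minimal sketch of table-driven multiplication in GF(2^8), the building block
# for generating Reed-Solomon-style redundant packets. The parity scheme below
# (one packet per fixed coefficient row) is a simplified illustration only.

PRIM = 0x11d                      # primitive polynomial x^8 + x^4 + x^3 + x^2 + 1
EXP = [0] * 512                   # antilog table, doubled to avoid a mod-255 in multiply
LOG = [0] * 256

x = 1
for i in range(255):
    EXP[i] = x
    LOG[x] = i
    x <<= 1
    if x & 0x100:                 # reduce modulo the primitive polynomial
        x ^= PRIM
for i in range(255, 512):
    EXP[i] = EXP[i - 255]

def gf_mul(a, b):
    """Multiply two GF(2^8) elements via the log/antilog lookup tables."""
    if a == 0 or b == 0:
        return 0
    return EXP[LOG[a] + LOG[b]]

def parity_packets(data_packets, coeffs):
    """Each parity packet is a word-by-word GF(2^8) linear combination of the
    data packets, using one row of coefficients per parity packet."""
    length = len(data_packets[0])
    out = []
    for row in coeffs:
        p = bytearray(length)
        for c, pkt in zip(row, data_packets):
            for i, byte in enumerate(pkt):
                p[i] ^= gf_mul(c, byte)   # addition in GF(2^8) is XOR
        out.append(bytes(p))
    return out

# two parity packets for three equal-length data packets (coefficients are arbitrary examples)
data = [b"\x10\x20\x30", b"\x01\x02\x03", b"\xAA\xBB\xCC"]
print(parity_packets(data, coeffs=[[1, 1, 1], [1, 2, 3]]))
```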

  11. Improvements for Optics Measurement and Corrections software

    CERN Document Server

    Bach, T

    2013-01-01

    This note presents the improvements made to the OMC (Optics Measurement and Corrections) software during a 14-month technical student internship at CERN. The goal of the work was to improve existing software in terms of maintainability, features and performance. Significant improvements in stability, speed and the overall development process were reached. The main software, a Java GUI at the LHC CCC, ran for months without noteworthy problems. The overall running time of the software chain used for optics corrections was reduced from nearly half an hour to around two minutes. This was the result of analysing and improving several of the programs and algorithms involved.

  12. Expert System Software Assistant for Payload Operations

    Science.gov (United States)

    Rogers, Mark N.

    1997-01-01

    The broad objective of this expert-system-based software application was to demonstrate the enhancements and cost savings that can be achieved through the use of expert system software in a spacecraft ground control center. Spacelab provided a valuable proving ground for this advanced software technology, a technology that will be exploited and expanded for future ISS operations. Our specific focus was on demonstrating payload cadre command and control efficiency improvements through the use of "smart" software which monitors flight telemetry, provides enhanced schematic-based data visualization, and performs advanced engineering data analysis.

  13. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-06-01

    Jun 1, 2013 … the test of time. Keywords: software, software maintenance, software evolution, reverse engineering … area of human endeavour, be it automobile, software, etc. … greater efficiency and productivity in the maintenance …

  14. Building quality into medical product software design.

    Science.gov (United States)

    Mallory, S R

    1993-01-01

    The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.

  15. A company perspective on software engineering standards

    Energy Technology Data Exchange (ETDEWEB)

    Steer, R.W.

    1988-01-01

    Software engineering standards, as implemented via formal policies and procedures, have historically been used in the nuclear industry, especially for codes used in the design, analysis, or operation of the plant. Over the past two decades, a significant amount of software has been put in place to perform these functions, while the overall software life cycle has become better understood, more and different computer systems have become available, and industry has become increasingly aware of the advantages gained when these procedures are used in the development and maintenance of this large amount of software. The use of standards and attendant procedures is thus becoming increasingly important as more computerization is taking place, both in the design and the operation of the plant. It is difficult to categorize software used in activities related to nuclear plants in a simple manner. That difficulty is due to the diversity of those uses, with attendant diversity in the methods and procedures used in the production of the software, compounded by a changing business climate in which significant software engineering expertise is being applied to a broader range of applications on a variety of computing systems. The use of standards in the various phases of the production of software thus becomes more difficult as well. This paper discusses the various types of software and the importance of software standards in the development of each of them.

  16. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  17. Heat Pump Performance Analysis Software Based on the Equivalent Thermodynamic Transformation Analysis Method

    Institute of Scientific and Technical Information of China (English)

    陈则韶; 谢文海; 胡芃; 贾磊

    2013-01-01

    This paper introduces the principle of an equivalent thermodynamic transformation analysis method (ETTAM) and a heat pump performance analysis software package developed on its basis. ETTAM transforms the irreversible cycle inside an actual heat pump into an equivalent reverse Carnot cycle for analysis by defining effective thermodynamic temperatures for the system's energy-exchange processes and cycle. Its three key parameters (the effective heat-source temperature of the equivalent reverse Carnot cycle, the effective condensing temperature, and the output heat flow of the theoretical heat pump cycle) are obtained by fitting against a thermophysical property database. Heat pump performance analysis and optimization software was then compiled according to ETTAM. The software includes four main functional modules: single-point heat pump performance analysis, heat pump performance interval analysis, single-point heat pump exergy analysis, and heat pump exergy interval analysis. By selecting the working fluid and entering the corresponding design parameters, users can conveniently obtain the heat pump's performance parameters and the exergy consumption of each stage under the given operating conditions. The software is suited to simulating and predicting off-design performance and is of reference value to heat pump designers and users.
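
    A minimal sketch of the equivalent reverse-Carnot bookkeeping described above is given below, assuming the three fitted key parameters are already known. The function names and sample temperatures are illustrative, and the property-database correlations used in the actual software are not reproduced.

```python
# Minimal sketch: given the effective heat-source temperature, the effective
# condensing temperature and the theoretical output heat flow (the three fitted
# key parameters), estimate the equivalent reverse-Carnot heating COP and the
# corresponding ideal compressor power. Sample values are illustrative.

def equivalent_carnot_cop(t_source_eff_c, t_cond_eff_c):
    """Heating COP of the equivalent reverse Carnot cycle (temperatures in deg C)."""
    t_source = t_source_eff_c + 273.15
    t_cond = t_cond_eff_c + 273.15
    return t_cond / (t_cond - t_source)

def compressor_power_kw(q_out_kw, t_source_eff_c, t_cond_eff_c):
    """Ideal compressor power for a given output heat flow."""
    return q_out_kw / equivalent_carnot_cop(t_source_eff_c, t_cond_eff_c)

cop = equivalent_carnot_cop(t_source_eff_c=2.0, t_cond_eff_c=45.0)
print(f"equivalent Carnot heating COP = {cop:.2f}")
print(f"ideal power for 10 kW heat output = {compressor_power_kw(10.0, 2.0, 45.0):.2f} kW")
```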

  18. A Few Opinions on Slug Interpretation Method in Saphir Well Testing Software

    Institute of Scientific and Technical Information of China (English)

    余碧君; 耿青; 陈燕; 毛伟

    2003-01-01

    This paper analyzes the principle of the slug-test pressure interpretation method in the Saphir well testing software: from the slug-flow pressure data, an equivalent pressure derivative for a constant-rate drawdown test that accounts for wellbore storage and skin effect is obtained, and this derivative is then matched against the Bourdet pressure derivative curves to determine formation parameters; the specific analysis steps are given. Slug-test pressure data from six exploration wells in an oilfield were analyzed, and the interpretation results were checked quantitatively. The quantitative checks show that the interpretation results are reasonably accurate, indicating that the slug interpretation method of the Saphir well testing software can be used to interpret slug-test pressure data and obtain formation characteristic parameters under dynamic conditions.
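
    The matching step described above relies on the Bourdet pressure derivative, dP/d(ln t). The sketch below shows only the basic three-point weighted formula for that derivative on synthetic data; the slug-to-equivalent-drawdown transform performed by Saphir is not reproduced, and the variable names and units are illustrative.

```python
# Minimal sketch of the Bourdet pressure derivative, dP/d(ln t), that type-curve
# matching relies on. Only the basic three-point weighted formula is shown
# (no smoothing window).
import numpy as np

def bourdet_derivative(t, dp):
    """Return dP/d(ln t) at the interior points of (t, dp) using Bourdet's
    weighted central-difference formula."""
    ln_t = np.log(t)
    deriv = np.empty(len(t))
    deriv[0] = deriv[-1] = np.nan            # undefined at the end points
    for i in range(1, len(t) - 1):
        dx1 = ln_t[i] - ln_t[i - 1]
        dx2 = ln_t[i + 1] - ln_t[i]
        s1 = (dp[i] - dp[i - 1]) / dx1
        s2 = (dp[i + 1] - dp[i]) / dx2
        deriv[i] = (s1 * dx2 + s2 * dx1) / (dx1 + dx2)
    return deriv

# synthetic example: during pure wellbore storage dp ~ t, so dP/d(ln t) ~ dp
t = np.logspace(-3, 1, 20)                   # hours
dp = 50.0 * t                                # psi, illustrative unit-slope response
print(bourdet_derivative(t, dp))
```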

  19. Software Based Supernova Recognition

    Science.gov (United States)

    Walters, Stephen M.

    2014-05-01

    This paper describes software for detecting Supernova (SN) in images. The software can operate in real-time to discover SN while data is being collected so the instrumentation can immediately be re-tasked to perform spectroscopy or photometry of a discovery. Because the instrumentation captures two images per minute, the real-time budget is constrained to 30 seconds per target, a challenging goal. Using a set of two to four images, the program creates a "Reference" (REF) image and a "New" (NEW) image where all images are used in both NEW and REF but any SN survives the combination process only in the NEW image. This process produces good quality images having similar noise characteristics but without artifacts that might be interpreted as SN. The images are then adjusted for seeing and brightness differences using a variant of the Tomaney and Crotts method of Point Spread Function (PSF) matching, after which REF is subtracted from NEW to produce a Difference (DIF) image. A Classifier is then trained on a grid of artificial SN to estimate the statistical properties of four attributes and used in a process to mask false positives that can be clearly identified as such. Further training to avoid any remaining false positives sets the range, in standard deviations for each attribute, that the Classifier will accept as a valid SN. This training enables the Classifier to discriminate between SN and most subtraction residue. Lastly, the DIF image is scanned and measured by the Classifier to find locations where all four properties fall within their acceptance ranges. If multiple locations are found, the one best conforming to the training estimates is chosen. This location is then declared as a Candidate SN, the instrumentation re-tasked and the operator notified.
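
    A highly simplified sketch of the difference-imaging flow described above is shown below: build NEW and REF, crudely match brightness, subtract, and flag pixels that stand out. The real pipeline's PSF matching (Tomaney and Crotts) and four-attribute classifier are reduced here to a simple flux scaling and a single significance cut, and the array sizes and thresholds are illustrative.

```python
# Highly simplified sketch of difference-image candidate detection:
# combine frames into NEW and REF, scale REF to NEW's background level,
# subtract, and flag pixels exceeding an n-sigma significance cut.
import numpy as np

def difference_image(new_frames, ref_frames):
    """Median-combine frames into NEW and REF, scale REF to NEW's background,
    and return NEW - REF."""
    new = np.median(np.stack(new_frames), axis=0)
    ref = np.median(np.stack(ref_frames), axis=0)
    scale = np.median(new) / np.median(ref)      # crude brightness matching
    return new - scale * ref

def candidate_positions(dif, n_sigma=5.0):
    """Return (row, col) positions exceeding n_sigma times the background noise."""
    sigma = np.std(dif)
    rows, cols = np.where(dif > n_sigma * sigma)
    return list(zip(rows.tolist(), cols.tolist()))

rng = np.random.default_rng(0)
frames = [rng.normal(100.0, 3.0, (64, 64)) for _ in range(4)]
new_frames, ref_frames = frames[:2], frames[2:]
new_frames[0][32, 32] += 400.0                   # inject an artificial "supernova"
dif = difference_image(new_frames, ref_frames)
print(candidate_positions(dif))
```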

  20. Trace Software Pipelining

    Institute of Scientific and Technical Information of China (English)

    王剑; Andreas Krall; et al.

    1995-01-01

    Global software pipelining is a complex but efficient compilation technique to exploit instruction-level parallelism for loops with branches. This paper presents a novel global software pipelining technique, called Trace Software Pipelining, targeted to instruction-level parallel processors such as Very Long Instruction Word (VLIW) and superscalar machines. Trace software pipelining applies a global code scheduling technique to compact the original loop body. The resulting loop is called a trace software pipelined (TSP) code. The trace software pipelined code can be directly executed with special architectural support or can be transformed into a globally software pipelined loop for current VLIW and superscalar processors. Thus, exploiting parallelism across all iterations of a loop can be completed through compacting the original loop body with any global code scheduling technique. This makes our new technique very promising in practical compilers. Finally, we also present preliminary experimental results to support our new approach.
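
    To make the idea concrete, the sketch below shows what software pipelining does to a simple three-stage loop: stages of iteration i are overlapped with stages of iterations i+1 and i+2, producing a prologue, a compact steady-state kernel, and an epilogue. It illustrates generic software pipelining only, not the trace-based handling of branches that is this paper's contribution; the stage functions are illustrative placeholders.

```python
# Minimal sketch of software pipelining: the three stages of iteration i are
# overlapped with stages of iterations i+1 and i+2, giving a prologue, a
# steady-state kernel and an epilogue. A VLIW/superscalar machine could issue
# the three kernel statements in parallel each cycle.

def load(x, i):        return x[i]                 # stage 1
def compute(a):        return 2 * a + 1            # stage 2
def store(y, i, b):    y[i] = b                    # stage 3

def pipelined_loop(x):
    n = len(x)
    y = [None] * n
    if n == 0:
        return y
    # prologue: fill the pipeline
    a = load(x, 0)
    b = compute(a)
    a = load(x, 1) if n > 1 else None
    # kernel: store(i), compute(i+1) and load(i+2) overlap in each "cycle"
    for i in range(n - 2):
        store(y, i, b)
        b = compute(a)
        a = load(x, i + 2)
    # epilogue: drain the pipeline
    if n > 1:
        store(y, n - 2, b)
        b = compute(a)
    store(y, n - 1, b)
    return y

print(pipelined_loop([1, 2, 3, 4, 5]))   # -> [3, 5, 7, 9, 11]
```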