WorldWideScience

Sample records for saphire software performs

  1. SAPHIRE 8 Software Quality Assurance Plan

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith

    2010-02-01

    This Quality Assurance (QA) Plan documents the QA activities that will be managed by the INL related to JCN N6423. The NRC developed the SAPHIRE computer code for performing probabilistic risk assessments (PRAs) using a personal computer (PC) at the Idaho National Laboratory (INL) under Job Code Number (JCN) L1429. SAPHIRE started out as a feasibility study for a PRA code to be run on a desktop PC and evolved through several phases into a state-of-the-art PRA code. The development of SAPHIRE was driven by two concurrent events: the tremendous expansion of PC software and hardware capability in the 1990s and the onset of the risk-informed regulation era.

  2. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.
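
    As a rough illustration of what a fault-tree solver of this class does (a textbook sketch, not SAPHIRE's actual algorithm or data model), the following Python fragment expands a toy fault tree into its minimal cut sets by top-down expansion with absorption; all gate and event names are invented:

        # Toy minimal-cut-set generation for a small fault tree (illustrative only).
        def cut_sets(gate, tree):
            """Expand a gate into a list of cut sets (sets of basic-event names)."""
            kind, inputs = tree[gate]
            child_sets = [cut_sets(i, tree) if i in tree else [{i}] for i in inputs]
            if kind == "OR":                      # OR: union of children's cut sets
                expanded = [cs for child in child_sets for cs in child]
            else:                                 # AND: cross-product of children
                expanded = [set()]
                for child in child_sets:
                    expanded = [acc | cs for acc in expanded for cs in child]
            return minimize(expanded)

        def minimize(sets):
            """Drop any cut set that contains another one (absorption rule)."""
            minimal = []
            for s in sorted(sets, key=len):
                if not any(m <= s for m in minimal):
                    minimal.append(s)
            return minimal

        # TOP = PUMP-A fails AND (PUMP-B fails OR POWER lost) -- hypothetical model
        tree = {"TOP": ("AND", ["PUMP-A", "G1"]), "G1": ("OR", ["PUMP-B", "POWER"])}
        print(cut_sets("TOP", tree))  # two minimal cut sets: {PUMP-A, PUMP-B}, {PUMP-A, POWER}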

  3. Saphire models and software for ASP evaluations

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The Idaho National Engineering Laboratory (INEL) has, over the past three years, created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cut sets, (4) full flexibility in modifying logic, regenerating cut sets, and requantifying results, and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.

  4. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-09-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes, and external interfaces. The IV&V team began this endeavor after software engineering and development of SAPHIRE were already under way. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).

  5. Verification and validation of the SAPHIRE Version 4.0 PRA software package

    International Nuclear Information System (INIS)

    Bolander, T.W.; Calley, M.B.; Capps, E.L.

    1994-02-01

    A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE). SAPHIRE is a set of four computer programs that the Nuclear Regulatory Commission (NRC) developed to perform probabilistic risk assessments (PRAs). These programs allow an analyst to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs included in this set are the Integrated Reliability and Risk Analysis System (IRRAS), System Analysis and Risk Assessment (SARA), Models and Results Database (MAR-D), and the Fault Tree/Event Tree/Piping and Instrumentation Diagram (FEP) graphical editor. The V&V steps included a V&V plan to describe the process and criteria by which the V&V would be performed; a software requirements documentation review to determine the correctness, completeness, and traceability of the requirements; a user survey to determine the usefulness of the user documentation; identification and testing of vital and non-vital features; and documentation of the test results.

  6. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    International Nuclear Information System (INIS)

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. To better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis and solution methods build upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities in the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the applications and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  7. The capabilities and applications of the saphire 5.0 safety assessment software

    International Nuclear Information System (INIS)

    Russell, K.D.; Wood, S.T.; Kvarfordt, K.J.

    1994-01-01

    The System Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a suite of computer programs that were developed to create and analyze a probabilistic risk assessment (PRA) of a nuclear power plant. The programs in this suite include: the Models and Results Database (MAR-D) software, the Integrated Reliability and Risk Analysis System (IRRAS) software, the System Analysis and Risk Assessment (SARA) software, and the Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. Each of these programs performs a specific function in taking a PRA from the conceptual stage all the way to publication. This paper provides an overview of the features and capabilities provided in version 5.0 of this software system. Major new features include the ability to store unlimited cut sets; to perform location transformations, seismic analysis, automated rule-based recovery analysis, and end-state cut set partitioning; to perform end-state analysis; a new alphanumeric fault tree editor; and a new alphanumeric event tree editor. Version 5.0 also includes many enhancements and improvements to the user interface, as well as a significant reduction in the time required to perform an analysis. These new features and capabilities provide a powerful set of PC-based PRA analysis tools.

  8. Development of the software of the data taking system SOS for the SAPHIR experiment. Entwicklung der Software des Datennahmesystems SOS fuer das SAPHIR-Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Manns, J.

    1989-02-01

    The data acquisition system SOS has been developed for the SAPHIR experiment at the Bonn stretcher ring ELSA. It can handle up to 280 kilobytes of data per second or a maximum trigger rate of 200 Hz. The multiprocessor-based online system consists of twenty VIP microprocessors and two VAX computers. Each component of the SAPHIR experiment has at least one program in the online system to maintain special functions for that specific component. All of these programs can receive event data without interfering with the transfer of events to mass storage for offline analysis. A special program, SOL, has been developed to serve as a user interface to the data acquisition system and as a status display for most of the programs of the online system. Using modern features like windowing and mouse control on a VAXstation, SOL provides an easy way of controlling the data acquisition system. (orig.)

  9. SAPHIRE 8 Volume 6 - Quality Assurance

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; R. Nims; K. J. Kvarfordt

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Version 8 is a software application developed for performing a complete probabilistic risk assessment using a personal computer running the Microsoft Windows™ operating system. SAPHIRE 8 is funded by the U.S. Nuclear Regulatory Commission (NRC). The role of the INL in this project is that of software developer and tester. This development takes place using formal software development procedures and is subject to quality assurance (QA) processes. The purpose of this document is to describe how the SAPHIRE software QA is performed for Version 8, what constitutes its parts, and the limitations of those processes. In addition, this document describes the Independent Verification and Validation that was conducted for Version 8 as part of the overall QA process.

  10. Level-1 probability safety assessment of the Iranian heavy water reactor using SAPHIRE software

    Energy Technology Data Exchange (ETDEWEB)

    Faghihi, F. [Department of Nuclear Engineering, School of Engineering, Shiraz University, 71348-51153 Shiraz (Iran, Islamic Republic of); Research Center for Radiation Protection, Shiraz University, Shiraz (Iran, Islamic Republic of); Nuclear Safety Research Center, Shiraz University, Shiraz (Iran, Islamic Republic of)], E-mail: faghihif@shirazu.ac.ir; Ramezani, E. [Department of Nuclear Engineering, School of Engineering, Shiraz University, 71348-51153 Shiraz (Iran, Islamic Republic of); Yousefpour, F. [Atomic Energy Organization of Iran (AEOI), Tehran (Iran, Islamic Republic of); Mirvakili, S.M. [Department of Nuclear Engineering, School of Engineering, Shiraz University, 71348-51153 Shiraz (Iran, Islamic Republic of)

    2008-10-15

    The main goal of this review paper is to analyze the total core damage frequency of the Iranian Heavy Water Research Reactor (IHWRR) against standard criteria and to determine the strengths and weaknesses of the reactor safety systems, with a view towards improving its design and operation. The PSA considers the full-power state of the reactor, and this article presents a level-1 PSA analysis using the System Analysis Programs for Hands-On Integrated Reliability Evaluations (SAPHIRE) software. SAPHIRE is specifically designed to list the potential accident sequences, compute their frequencies of occurrence, and assign each sequence to a consequence. The method used for modeling the systems and accident sequences is the large fault tree/small event tree method. This level-1 PSA for IHWRR indicates that, based on conservative assumptions, the total frequency of accidents that would lead to core damage from internal initiating events is 4.44E-05 per year of reactor operation.

  11. Level-1 probability safety assessment of the Iranian heavy water reactor using SAPHIRE software

    International Nuclear Information System (INIS)

    Faghihi, F.; Ramezani, E.; Yousefpour, F.; Mirvakili, S.M.

    2008-01-01

    The main goal of this review paper is to analyze the total core damage frequency of the Iranian Heavy Water Research Reactor (IHWRR) against standard criteria and to determine the strengths and weaknesses of the reactor safety systems, with a view towards improving its design and operation. The PSA considers the full-power state of the reactor, and this article presents a level-1 PSA analysis using the System Analysis Programs for Hands-On Integrated Reliability Evaluations (SAPHIRE) software. SAPHIRE is specifically designed to list the potential accident sequences, compute their frequencies of occurrence, and assign each sequence to a consequence. The method used for modeling the systems and accident sequences is the large fault tree/small event tree method. This level-1 PSA for IHWRR indicates that, based on conservative assumptions, the total frequency of accidents that would lead to core damage from internal initiating events is 4.44E-05 per year of reactor operation.
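
    To make the quoted core damage frequency concrete: in a level-1 PSA of this kind, each accident-sequence frequency is the initiating-event frequency multiplied by the failure probabilities of the mitigating systems that fail in that sequence, and the total core damage frequency is the sum over sequences. A minimal sketch with invented numbers (not taken from the IHWRR model):

        # Hypothetical level-1 sequence quantification (all numbers invented).
        initiating_event_per_year = 1.0e-2   # e.g. an assumed transient frequency
        mitigating_failures = {"shutdown_system": 1.0e-3, "emergency_cooling": 5.0e-2}

        sequence_frequency = initiating_event_per_year
        for system, p_fail in mitigating_failures.items():
            sequence_frequency *= p_fail   # this sequence requires each system to fail

        print(f"{sequence_frequency:.2e} per reactor-year")  # 5.00e-07
        # The reported 4.44E-05/year total is the sum of many such sequence frequencies.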

  12. SAPHIRE 8 Volume 1 - Overview and Summary

    International Nuclear Information System (INIS)

    Smith, C.L.; Wood, S.T.

    2011-01-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system's response to initiating events and quantify associated consequential outcome frequencies. Specifically, for nuclear power plant applications, SAPHIRE 8 can identify important contributors to core damage (Level 1 PRA) and containment failure during a severe accident which leads to releases (Level 2 PRA). It can be used for a PRA where the reactor is at full power, low power, or at shutdown conditions. Furthermore, it can be used to analyze both internal and external initiating events and has special features for managing models such as flooding and fire. It can also be used in a limited manner to quantify risk in terms of release consequences to the public and environment (Level 3 PRA). In SAPHIRE 8, the act of creating a model has been separated from the analysis of that model in order to improve the quality of both the model (e.g., by avoiding inadvertent changes) and the analysis. Consequently, in SAPHIRE 8, the analysis of models is performed by using what are called Workspaces. Currently, there are Workspaces for three types of analyses: (1) the NRC's Accident Sequence Precursor program, where the workspace is called 'Events and Condition Assessment (ECA);' (2) the NRC's Significance Determination Process (SDP); and

  13. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Quality Assurance Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; R. Nims; K. J. Kvarfordt; C. Wharton

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment using a personal computer running the Microsoft Windows operating system. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC). The role of the INL in this project is that of software developer and tester. This development takes place using formal software development procedures and is subject to quality assurance (QA) processes. The purpose of this document is to describe how the SAPHIRE software QA is performed for Versions 6 and 7, what constitutes its parts, and the limitations of those processes.

  14. Development of the software of the data taking system SOS for the SAPHIR experiment

    International Nuclear Information System (INIS)

    Manns, J.

    1989-02-01

    The data acquisition system SOS has been developed for the SAPHIR experiment at the Bonn stretcher ring ELSA. It can handle up to 280 kilobytes of data per second or a maximum trigger rate of 200 Hz. The multiprocessor-based online system consists of twenty VIP microprocessors and two VAX computers. Each component of the SAPHIR experiment has at least one program in the online system to maintain special functions for that specific component. All of these programs can receive event data without interfering with the transfer of events to mass storage for offline analysis. A special program, SOL, has been developed to serve as a user interface to the data acquisition system and as a status display for most of the programs of the online system. Using modern features like windowing and mouse control on a VAXstation, SOL provides an easy way of controlling the data acquisition system. (orig.)

  15. SAPHIRE 8 Volume 3 - Users' Guide

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. Vedros; K. J. Kvarfordt

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing with and supporting SAPHIRE users, who comprise a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA), and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). This reference guide will introduce the SAPHIRE Version 8.0 software. A brief discussion of the purpose and history of the software is included along with general information such as installation instructions, starting and stopping the program, and some pointers on how to get around inside the program. Next, database concepts and structure are discussed. Following that discussion are nine sections, one for each of the menu options on the SAPHIRE main menu, wherein the purpose and general capabilities for each option are

  16. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing with and supporting SAPHIRE users, who comprise a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA), and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with

  17. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing with and supporting SAPHIRE users, who comprise a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA), and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with

  18. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE)

    International Nuclear Information System (INIS)

    C. L. Smith

    2006-01-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system's response to initiating events and quantify associated consequential outcome frequencies. Specifically, for nuclear power plant applications, SAPHIRE can identify important contributors to core damage (Level 1 PRA) and containment failure during a severe accident which leads to releases (Level 2 PRA). It can be used for a PRA where the reactor is at full power, low power, or at shutdown conditions. Furthermore, it can be used to analyze both internal and external initiating events and has special features for transforming an internal events model to a model for external events, such as flooding and fire analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to the public and environment (Level 3 PRA). SAPHIRE also includes a separate module called the Graphical Evaluation Module (GEM). GEM is a special user interface linked to SAPHIRE that automates the SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events (for example, to calculate a conditional core damage probability) very efficiently and expeditiously. This report provides an overview of the functions

  19. SAPHIR, how it ended

    International Nuclear Information System (INIS)

    Brogli, R.; Hammer, J.; Wiezel, L.; Christen, R.; Heyck, H.; Lehmann, E.

    1995-01-01

    On May 16th, 1994, PSI decided to discontinue its efforts to retrofit the SAPHIR reactor for operation at 10 MW. This decision was made because the effort and time for the retrofit work in progress had proven to be more complex than was anticipated. In view of the start-up of the new spallation-neutron source SINQ in 1996, the useful operating time between the eventual restart of SAPHIR and the start-up of SINQ became less than two years, which was regarded by PSI as too short a period to warrant the large retrofit effort. Following the decision of PSI not to re-use SAPHIR as a neutron source, several options for the further utilization of the facility were open. However, none of them appeared promising in comparison with other possibilities; it was therefore decided that SAPHIR should be decommissioned. A concerted effort was initiated to consolidate the nuclear and conventional safety for the post-operational period. (author) 3 figs., 3 tab

  20. Design and construction of the SAPHIR detector

    International Nuclear Information System (INIS)

    Schwille, W.J.; Bockhorst, M.; Burbach, G.; Burgwinkel, R.; Empt, J.; Guse, B.; Haas, K.M.; Hannappel, J.; Heinloth, K.; Hey, T.; Honscheid, K.; Jahnen, T.; Jakob, H.P.; Joepen, N.; Juengst, H.; Kirch, U.; Klein, F.J.; Kostrewa, D.; Lindemann, L.; Link, J.; Manns, J.; Menze, D.; Merkel, H.; Merkel, R.; Neuerburg, W.; Paul, E.; Ploetzke, R.; Schenk, U.; Schmidt, S.; Scholmann, J.; Schuetz, P.; Schultz-Coulon, H.C.; Schweitzer, M.; Tran, M.Q.; Vogl, W.; Wedemeyer, R.; Wehnes, F.; Wisskirchen, J.; Wolf, A.

    1994-01-01

    The design, construction, and performance of the large solid angle magnetic spectrometer SAPHIR is described. It was built for the investigation of photon-induced reactions on nucleons and light nuclei with multi-particle final states up to photon energies of 3.1 GeV. The detector is equipped with a tagged photon beam facility and is operated at the stretcher ring ELSA in Bonn. (orig.)

  1. Design and construction of the SAPHIR detector

    Energy Technology Data Exchange (ETDEWEB)

    Schwille, W.J.; Bockhorst, M.; Burbach, G.; Burgwinkel, R.; Empt, J.; Guse, B.; Haas, K.M.; Hannappel, J.; Heinloth, K.; Hey, T.; Honscheid, K.; Jahnen, T.; Jakob, H.P.; Joepen, N.; Juengst, H.; Kirch, U.; Klein, F.J. (all: Bonn Univ. (Germany). Physikalisches Inst.)

    1994-05-15

    The design, construction, and performance of the large solid angle magnetic spectrometer SAPHIR is described. It was built for the investigation of photon-induced reactions on nucleons and light nuclei with multi-particle final states up to photon energies of 3.1 GeV. The detector is equipped with a tagged photon beam facility and is operated at the stretcher ring ELSA in Bonn. (orig.)

  2. SAPHIRE6.64, System Analysis Programs for Hands-on Integrated Reliability

    International Nuclear Information System (INIS)

    2001-01-01

    1 - Description of program or function: SAPHIRE is a collection of programs developed for the purpose of performing those functions necessary to create and analyze a complete Probabilistic Risk Assessment (PRA), primarily for nuclear power plants. The programs included in this suite are the Integrated Reliability and Risk Analysis System (IRRAS), the System Analysis and Risk Assessment (SARA) system, the Models And Results Database (MAR-D) system, and the Fault tree, Event tree and P&ID (FEP) editors. Previously these programs were released as separate packages. These programs include functions to allow the user to create event trees and fault trees, to define accident sequences and basic event failure data, to solve system and accident sequence fault trees, to quantify cut sets, and to perform uncertainty analysis on the results. Also included in this program are features that allow the analyst to generate reports and displays that can be used to document the results of an analysis. Since this software is a very detailed technical tool, the user of this program should be familiar with PRA concepts and the methods used to perform these analyses. 2 - Methods: SAPHIRE is written in MODULA-2 and uses an integrated commercial graphics package to interactively construct and edit fault trees. The fault tree solving methods used are industry-recognized top-down algorithms. For quantification, the program uses standard methods to propagate the failure information through the generated cut sets. SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE which automates the process for evaluating operational events at commercial nuclear power plants. Using GEM an analyst can estimate the risk associated with operational events (that is, perform a Level 1, Level 2, and Level 3 analysis for operational events) in a very efficient and expeditious manner. This on-line reference guide will
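
    The uncertainty analysis mentioned above can be pictured as follows. This is a minimal sketch with an invented two-cut-set model and assumed lognormal parameters (median and error factor, with the common EF = 95th percentile / median convention); it is not SAPHIRE's implementation:

        import math
        import random

        # Cut-set quantification with simple Monte Carlo uncertainty sampling.
        # Cut sets and lognormal parameters are assumed for illustration only.
        cut_sets = [("PUMP-A", "PUMP-B"), ("PUMP-A", "POWER")]
        params = {"PUMP-A": (1e-3, 3.0), "PUMP-B": (1e-3, 3.0), "POWER": (1e-4, 10.0)}

        def sample_prob(median, error_factor):
            # Lognormal sample: sigma = ln(EF) / 1.645 (95th-percentile factor).
            sigma = math.log(error_factor) / 1.645
            return median * math.exp(random.gauss(0.0, sigma))

        def top_probability(probs):
            # Minimal cut set upper bound: P(TOP) <= 1 - prod_i (1 - P(cut set i)).
            result = 1.0
            for cs in cut_sets:
                result *= 1.0 - math.prod(probs[e] for e in cs)
            return 1.0 - result

        samples = sorted(
            top_probability({e: sample_prob(*params[e]) for e in params})
            for _ in range(10_000)
        )
        print("mean     :", sum(samples) / len(samples))
        print("95th pct :", samples[int(0.95 * len(samples))])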

  3. Modular Software Performance Monitoring

    CERN Document Server

    Kruse, D F

    2011-01-01

    CPU clock frequency is not likely to be increased significantly in the coming years, and data analysis speed can be improved by using more processors or buying new machines only if one is willing to change the paradigm to a parallel one. Therefore, performance monitoring procedures and tools are needed to help programmers optimize existing software running on current and future hardware. Low-level information from hardware performance counters is vital to spot specific performance problems slowing program execution. HEP software is often huge and complex, and existing tools are unable to give results with the required granularity. We will report on the approach we have chosen to solve this problem, which involves decomposing the application into parts and monitoring each of them separately. Both counting and sampling methods are used to allow an analysis with the required custom granularity: from the global level up to the function level. A set of tools (based on perfmon2 – a software interface to hardware co...
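
    The decomposition idea (monitor each part of the application separately rather than the program as a whole) can be sketched without hardware counters. The fragment below uses wall-clock timers in place of perfmon2 counters, and the stage names are invented, but the per-part accounting structure is the same:

        import time
        from collections import defaultdict
        from functools import wraps

        # Wall-clock timers stand in for hardware performance counters here;
        # the point is the per-part accounting. Stage names are invented.
        totals, calls = defaultdict(float), defaultdict(int)

        def monitored(func):
            @wraps(func)
            def wrapper(*args, **kwargs):
                start = time.perf_counter()
                try:
                    return func(*args, **kwargs)
                finally:
                    totals[func.__name__] += time.perf_counter() - start
                    calls[func.__name__] += 1
            return wrapper

        @monitored
        def reconstruct(event):           # hypothetical analysis stage
            return [x * 2 for x in event]

        @monitored
        def fit(event):                   # hypothetical analysis stage
            return sum(event) / len(event)

        for event in ([1, 2, 3], [4, 5, 6]):
            fit(reconstruct(event))

        for name in totals:
            print(f"{name}: {calls[name]} calls, {totals[name] * 1e6:.1f} us")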

  4. Independent Verification and Validation SAPHIRE Version 8 Final Report Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-04-01

    This report provides an evaluation of the SAPHIRE version 8 software product. SAPHIRE version 8 is being developed with a phased, or cyclic iterative, rapid application development methodology. Accordingly, a similar iterative approach has been taken for the IV&V activities on each vital software object. IV&V and Software Quality Assurance (SQA) activities occur throughout the entire development life cycle and, therefore, will be required through the full development of SAPHIRE version 8. The later phases of the software life cycle, operation and maintenance, are not applicable in this effort since the IV&V is being done prior to releasing Version 8.

  5. The trigger and data acquisition system of the SAPHIR detector

    International Nuclear Information System (INIS)

    Honscheid, K.

    1988-10-01

    At present SAPHIR, a new experimental facility for medium-energy physics, is under construction at the Bonn electron accelerator ELSA (energy ≤ 3.5 GeV, duty cycle ≅ 100%). SAPHIR combines a large solid angle coverage with a tagging system and is therefore suited to investigate reactions with multi-particle final states. The structure and function of the multi-stage trigger system, which is used to select such processes, are described in this paper. With this system the trigger decision can be based on the number of charged particles as well as on the number of neutral particles detected. Several VMEbus modules have been developed, using memory look-up tables to make fast trigger decisions possible. In order to determine the number of neutral particles from the cluster distribution in the electromagnetic calorimeter, some ideas from cellular automata had to be added. The system has a modular structure, so it can easily be extended. In the second part of this thesis the SAPHIR data acquisition system is discussed. It consists of a multiprocessor system with the VIP microcomputer as the central element. The VIP is a VMEbus module optimized for a multiprocessor environment. Its description, as well as that of the other VMEbus boards developed for the SAPHIR online system, can be found in this paper. As a basis for software development the operating system SOS is supplied. With SOS it is possible to write programs independently of the actual hardware configuration, so the complicated multiprocessor environment is hidden; to the user the system looks like a simple multi-tasking system. SOS is not restricted to the VIPs but can also be installed on computers of the VAX family, so that efficient mixed configurations are possible. The SAPHIR online system, based on the VIP microcomputer and the SOS operating system, is presented in the last part of this paper. This includes the read-out system, the monitoring of the different components, etc. (orig./HSI)
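
    The memory look-up-table technique mentioned above can be pictured as follows: the trigger decision for every possible hit pattern is precomputed into a table, so the run-time decision is a single memory read. A minimal sketch with an invented pattern width and acceptance rule:

        # Precompute the trigger decision for every possible hit pattern, so the
        # run-time decision is one memory read. Pattern width (8 bits) and the
        # acceptance rule (>= 2 hit segments) are invented for the example.
        N_BITS = 8

        def accept(pattern: int) -> bool:
            return bin(pattern).count("1") >= 2   # at least two charged hits

        # Filled once at setup time; in hardware this is a RAM look-up table.
        lut = bytes(accept(p) for p in range(2 ** N_BITS))

        hit_pattern = 0b00010010                  # pattern arriving at trigger time
        print("accept" if lut[hit_pattern] else "reject")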

  6. Systems analysis programs for Hands-on integrated reliability evaluations (SAPHIRE) Version 5.0: Verification and validation (V ampersand V) manual. Volume 9

    International Nuclear Information System (INIS)

    Jones, J.L.; Calley, M.B.; Capps, E.L.; Zeigler, S.L.; Galyean, W.J.; Novack, S.D.; Smith, C.L.; Wolfram, L.M.

    1995-03-01

    A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) Version 5.0. SAPHIRE is a set of four computer programs that the NRC developed for performing probabilistic risk assessments. They allow an analyst to perform many of the functions necessary to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs are the Integrated Reliability and Risk Analysis System (IRRAS), System Analysis and Risk Assessment (SARA), Models And Results Database (MAR-D), and Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. The intent of this program is to perform a V&V of successive versions of SAPHIRE. Previous efforts included the V&V of SAPHIRE Version 4.0. The SAPHIRE 5.0 V&V plan is based on the SAPHIRE 4.0 V&V plan, with revisions to incorporate lessons learned from the previous effort. Also, the SAPHIRE 5.0 vital and nonvital test procedures are based on the test procedures from SAPHIRE 4.0, with revisions to include the new SAPHIRE 5.0 features as well as to incorporate lessons learned from the previous effort. Most results from the testing were acceptable; however, some discrepancies between expected code operation and actual code operation were identified. Modifications made to SAPHIRE are identified.

  7. High performance in software development

    CERN Multimedia

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever attempted. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or with other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from distributed storage and the large-scale organization of computation and data down to the lowest level of processor and data bus behavior. Integrating performance behavior over these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures due to the deep memory hierarchies of modern general-purpose computers. As a r...

  8. Advanced Modular Software Performance Monitoring

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The LHCb software is based on the Gaudi framework, on top of which are built several large and complex software applications. The LHCb experiment is now in the active phase of collecting and analyzing data, and significant performance problems arise in the Gaudi-based software, beginning with the High Level Trigger (HLT) programs and ending with data analysis frameworks (DaVinci). It is not easy to find hot spots in the code; only special tools can help to understand where CPU or memory usage is not reasonable. There exist many performance-analysis tools, but the main problem is that they show reports in terms of class and function names, and such information usually is not very useful: the majority of algorithm developers use the Gaudi framework abstractions and usually do not know about the functions which lie at the lower level. We will show a new approach which adds to performance reports a higher abstraction level based on knowledge of the framework architecture and run-time object properties. A set of profiling to...

  9. Advanced modular software performance monitoring

    CERN Document Server

    Mazurov, A

    2012-01-01

    The LHCb software is based on the Gaudi framework, on top of which are built several large and complex software applications. As the LHCb experiment is now in the active phase of collecting and analyzing data, performance problems arise in various parts of the software, from the High Level Trigger (HLT) programs to data analysis frameworks. It is not easy to find hotspots in the code; only specialized tools can help to understand where CPU or memory usage are not reasonable. There exist many performance-analysis tools, but the main problem is that they show reports in terms of class and function names, and such information usually is not very useful: the majority of algorithm developers use the Gaudi framework abstractions and usually do not know about the functions which lie at the lower level. We will show a new approach which adds to performance reports a higher abstraction level based on knowledge of the framework architecture and run-time object properties. A set of profiling tools (based on Intel VTune Amplif...
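
    The "higher abstraction level" in both records above amounts to rolling raw profiler samples, keyed by low-level function names, up into the framework algorithms that developers actually recognize. A minimal sketch with invented sample counts and an assumed function-to-algorithm map:

        from collections import Counter

        # Raw profiler samples keyed by low-level function names (invented),
        # rolled up into owning framework algorithms via an assumed mapping.
        samples = Counter({
            "std::vector::push_back": 1200,
            "TrackFitKernel::chi2": 900,
            "VertexSolver::solve": 400,
        })
        owner = {
            "std::vector::push_back": "TrackFitAlg",
            "TrackFitKernel::chi2": "TrackFitAlg",
            "VertexSolver::solve": "VertexAlg",
        }

        by_algorithm = Counter()
        for func, n in samples.items():
            by_algorithm[owner.get(func, "unattributed")] += n

        for alg, n in by_algorithm.most_common():
            print(f"{alg}: {n} samples")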

  10. Strangeness photoproduction with the SAPHIR-detector

    International Nuclear Information System (INIS)

    Merkel, H.

    1993-12-01

    At the ELSA facility in Bonn a photon beam with a high duty cycle up to energies of 3.3 GeV is available. In this energy range the large solid angle detector SAPHIR enables us to investigate strangeness photoproduction starting from threshold. SAPHIR has already achieved results for the reactions γ+p→K⁺+Λ and γ+p→K⁺+Σ⁰. This work investigates the possibilities of measuring the related reactions γ+n→K⁰+Λ and γ+n→K⁰+Σ⁰ on a deuteron target and of measuring the reaction γ+p→K⁰+Σ⁺ on a proton target. For the first time the Σ⁺ polarisation has been measured. With a cross section 10 times smaller than that of the kaon-hyperon reactions, the photoproduction of the Φ(1020) meson can also be investigated with the SAPHIR detector. First reconstructed events are shown. (orig.)

  11. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses, and strains, and to predict failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: (1) inadequate shape parameterization algorithms, and (2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its arbitrary shape deformation (ASD) capability, a major advancement in solving these two problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing CFD designers to freely create their own shape parameters, thereby eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every desired shape change. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.
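
    The claim that smooth volumetric deformation avoids remeshing can be illustrated simply: if every node is moved by a smooth displacement field, element connectivity never changes. The sketch below uses a Gaussian falloff around a single control point; it is not the commercial ASD algorithm, and all values are made up:

        import math

        # Each mesh node is shifted by a smooth displacement field (Gaussian
        # falloff around one control point), so element connectivity, and hence
        # the mesh itself, never needs to be rebuilt. All values are made up.
        def deform(nodes, control, displacement, radius=1.0):
            out = []
            for x, y in nodes:
                d2 = (x - control[0]) ** 2 + (y - control[1]) ** 2
                w = math.exp(-d2 / radius ** 2)   # smooth weight in [0, 1]
                out.append((x + w * displacement[0], y + w * displacement[1]))
            return out

        mesh = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
        print(deform(mesh, control=(1.0, 1.0), displacement=(0.1, 0.0)))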

  12. Systems analysis programs for hands-on integrated reliability evaluations (SAPHIRE), Version 5.0

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Hoffman, C.L.

    1995-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Graphical Evaluation Module (GEM) is a special application tool designed for the evaluation of operational occurrences using the Accident Sequence Precursor (ASP) program methods. GEM provides the capability for an analyst to quickly and easily perform conditional core damage probability (CCDP) calculations. The analyst can then use the CCDP calculations to determine whether the occurrence of an initiating event or a condition adversely impacts safety. It uses models and data developed in SAPHIRE specifically for the ASP program. GEM requires more data than are normally provided in SAPHIRE and will not perform properly with other models or databases. This is the first release of GEM, and the developers of GEM welcome user comments and feedback that will generate ideas for improvements to future versions. GEM is designated as version 5.0 to track the GEM code along with the other SAPHIRE codes, as GEM relies on the same shared database structure.
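
    A conditional core damage probability calculation of the kind GEM automates can be pictured as requantifying the model with the observed condition imposed. The following sketch uses an invented two-cut-set model and the rare-event approximation; the real codes are far more elaborate:

        import math

        # Requantify the model with the observed failure imposed, then compare
        # with the nominal result. Cut sets and probabilities are invented, and
        # the rare-event approximation is a simplification.
        cut_sets = [("DIESEL-A", "DIESEL-B"), ("OFFSITE-POWER", "DIESEL-B")]
        nominal = {"DIESEL-A": 1e-3, "DIESEL-B": 1e-3, "OFFSITE-POWER": 1e-4}

        def p_core_damage(probs):
            # Rare-event approximation: sum of the cut-set probabilities.
            return sum(math.prod(probs[e] for e in cs) for cs in cut_sets)

        conditional = dict(nominal, **{"DIESEL-A": 1.0})   # observed failure
        print(f"nominal     : {p_core_damage(nominal):.2e}")
        print(f"conditional : {p_core_damage(conditional):.2e}")   # the CCDP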

  13. SAPHIRE 8 Volume 2 - Technical Reference

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; S. T. Wood; W. J. Galyean; J. A. Schroeder; M. B. Sattison

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of computer programs that were developed to create and analyze probabilistic risk assessments (PRAs). This volume provides information on the principles used in the construction and operation of Version 8.0 of the SAPHIRE system. It summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. It then describes the algorithms used to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that apply under various assumptions concerning repairability and mission time. It defines the measures of basic event importance that SAPHIRE can calculate. This volume also gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin Hypercube sampling, and states the algorithms used by the program to generate random basic event probabilities from various distributions. Also covered are enhanced capabilities such as seismic analysis, Workspace algorithms, cut set "recovery," end state manipulation, and the use of "compound events."
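
    For orientation, two of the standard quantities a reference of this kind documents are the minimal cut set upper bound for the top event and the Fussell-Vesely importance of a basic event. The notation below is the common textbook one; SAPHIRE's exact conventions may differ:

        % Minimal cut set upper bound for the top event probability:
        \[
          P(\mathrm{TOP}) \;\le\; 1 - \prod_{i=1}^{n} \bigl(1 - P(C_i)\bigr)
        \]
        % Fussell-Vesely importance of basic event e (fraction of the top-event
        % probability coming from cut sets that contain e):
        \[
          I^{\mathrm{FV}}_{e} \;=\; \frac{\sum_{i\,:\,e \in C_i} P(C_i)}{P(\mathrm{TOP})}
        \]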

  14. Evaluation of high-performance computing software

    Energy Technology Data Exchange (ETDEWEB)

    Browne, S.; Dongarra, J. [Univ. of Tennessee, Knoxville, TN (United States); Rowan, T. [Oak Ridge National Lab., TN (United States)

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high-performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations that are difficult to find elsewhere.

  15. Strategies employed for LHC software performance studies

    CERN Document Server

    Nowak, A

    2010-01-01

    The objective of this work is to collect and assess the software performance-related strategies employed by the major players in the LHC software arena: the four main experiments (ALICE, ATLAS, CMS and LHCb) and the two main software frameworks (Geant4 and ROOT). As the software used differs between the parties, so do the directions, methods, and intensity of optimization. The common feeling shared by nearly all interviewed parties is that performance is not one of their top priorities and that maintaining it at a constant level is a satisfactory solution, given the resources at hand. In principle, despite some organized efforts, a less structured approach seems to be the dominant one, and opportunistic optimization prevails. Four out of six surveyed groups are investigating memory management related effects, deemed to be the primary cause of their performance issues. The most commonly used tools include Valgrind and homegrown software. All questioned groups expressed the desire for advanced tools, s...

  16. Effect of Functional diversity on Software Performance

    OpenAIRE

    Viswanatha Rao, Balajee

    2011-01-01

    For the past few decades, a large body of literature has been produced on functional diversity and performance. However, the relationship between functional diversity and performance in the software industry is not clearly explained, and results have been inconsistent. The main focus of this research is to explore the effects of functional diversity on software project performance by conducting a qualitative study. Four metrics were chosen from the literature, namely decision making, creativity an...

  17. The evolution of CMS software performance studies

    CERN Document Server

    Kortelainen, Matti J

    2010-01-01

    CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on the cleanup of many issues coming from basic C++ errors, namely reducing dynamic memory churn and unnecessary copies/temporaries, and on tools to routinely monitor these things. Over the past 1.5 years, however, the transition to 64-bit, newer versions of the gcc compiler, newer tools, and the enabling of techniques like vectorization have made possible more sophisticated improvements to software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.

  18. The evolution of CMS software performance studies

    Science.gov (United States)

    Kortelainen, M. J.; Elmer, P.; Eulisse, G.; Innocente, V.; Jones, C. D.; Tuura, L.

    2011-12-01

    CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on the cleanup of many issues coming from basic C++ errors, namely reducing dynamic memory churn and unnecessary copies/temporaries, and on tools to routinely monitor these things. Over the past 1.5 years, however, the transition to 64-bit, newer versions of the gcc compiler, newer tools, and the enabling of techniques like vectorization have made possible more sophisticated improvements to software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.
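
    As a small illustration of the vectorization technique the two records above refer to (here NumPy array operations stand in for compiler auto-vectorization of C++ loops; the calibration constants are invented):

        import numpy as np

        # One array expression replaces an element-by-element Python loop; the
        # NumPy C loops play the role that SIMD vector units play in compiled
        # code. The calibration constants are invented.
        energies = np.random.rand(1_000_000)

        # loop form:  corrected = [e * 1.02 + 0.01 for e in energies]
        corrected = energies * 1.02 + 0.01   # vectorized form

        print(corrected[:3])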

  19. The alarm system of the SAPHIR detector

    International Nuclear Information System (INIS)

    Schultz-Coulon, H.C.

    1993-06-01

    In order to obtain effective control of the different detector components, an alarm system was built and implemented into the data acquisition system of the SAPHIR experiment. It provides an easy way of indicating errors through either appropriate library calls or a hardware signal, both leading to an active alarm. This makes it possible to react directly to any error detected by one of the specific control systems. In addition, for selected kinds of errors the data run can be stopped automatically. The concept and construction of this system are described and some examples of its application are given. (orig.)

  20. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  1. TOPAS 2 - a high-resolution tagging system at the Bonn SAPHIR detector

    International Nuclear Information System (INIS)

    Rappenecker, G.

    1989-02-01

    For the SAPHIR arrangement in Bonn a high-resolution tagging system has been developed, achieving an energy resolution of 2 MeV and covering the photon-energy range of (0.94-0.34) E₀ at beam energies from 1.0 GeV. The chamber gases CO₂, ArCH₄ and ArC₂H₆ were compared with regard to performance, cluster size and coincidence width. (orig.)

  2. Development of a model-independent evaluation of photon-deuteron reactions for the SAPHIR detector

    International Nuclear Information System (INIS)

    Wolf, A.

    1993-01-01

    The SAPHIR detector measures photon-induced reactions with many particles in the final state. Thus a detailed investigation of those processes at photon energies between 0.4 and 3.3 GeV is possible. The interpretation of the distribution of the sample of events which SAPHIR is able to reconstruct has to be done after correcting for influences induced by the detector acceptance. In this work a model-independent method of correcting and analysing the data is discussed. The implementation of the basic tools of this analysis is described, and first tests with simulated and real events are performed. SAPHIR uses a time-of-flight system for the identification of particles. This work describes the structure of a program library which supports an easy way of decoding the digitizations of this system (including calibration of the hardware) and obtaining the flight time for a particle in an event. The necessary steps for calibrating the system are outlined as well. (orig.)

  3. ATLAS Offline Software Performance Monitoring and Optimization

    CERN Document Server

    Chauhan, N; Kittelmann, T; Langenberg, R; Mandrysch , R; Salzburger, A; Seuster, R; Ritsch, E; Stewart, G; van Eldik, N; Vitillo, R

    2014-01-01

    In a complex multi-developer, multi-package software environment, such as the ATLAS offline Athena framework, tracking the performance of the code can be a non-trivial task in itself. In this paper we describe improvements in the instrumentation of ATLAS offline software that have given considerable insight into the performance of the code and helped to guide optimisation. Code can be instrumented firstly using the PAPI tool, which is a programming interface for accessing hardware performance counters. PAPI events can count floating point operations, cycles and instructions, and cache accesses. Triggering PAPI to start/stop counting for each algorithm and processed event gives a good understanding of the whole algorithm-level performance of ATLAS code. Further data can be obtained using pin, a dynamic binary instrumentation tool. Pintools can be used to obtain similar statistics as PAPI, but advantageously without requiring recompilation of the code. Fine grained routine and instruction level instrumentation is...

  4. ATLAS Offline Software Performance Monitoring and Optimization

    CERN Document Server

    Chauhan, N; The ATLAS collaboration; Kittelmann, T; Langenberg, R; Mandrysch , R; Salzburger, A; Seuster, R; Ritsch, E; Stewart, G; van Eldik, N; Vitillo, R

    2013-01-01

    In a complex multi-developer, multi-package software environment, such as the ATLAS offline Athena framework, tracking the performance of the code can be a non-trivial task in itself. In this paper we describe improvements in the instrumentation of ATLAS offline software that have given considerable insight into the performance of the code and helped to guide optimisation. Code can be instrumented firstly using the PAPI tool, which is a programming interface for accessing hardware performance counters. PAPI events can count floating-point operations, cycles, instructions and cache accesses. Triggering PAPI to start/stop counting for each algorithm and processed event gives a good understanding of the whole algorithm-level performance of ATLAS code. Further data can be obtained using Pin, a dynamic binary instrumentation tool. Pintools can be used to obtain similar statistics as PAPI, but advantageously without requiring recompilation of the code. Fine grained routine and instruction level instrumentation is...

  5. All-sky radiance simulation of Megha-Tropiques SAPHIR microwave ...

    Indian Academy of Sciences (India)

    used as input to the RTTOV model to simulate cloud-affected SAPHIR radiances. ... All-sky radiance simulation; Megha-Tropiques; microwave SAPHIR sensor; radiative transfer; data ... versions of these non-linear processes (Ohring and ...

  6. The central drift chamber of the SAPHIR detector - implementation into the experiment and study of its properties

    International Nuclear Information System (INIS)

    Haas, K.M.

    1992-01-01

    At the Bonn accelerator facility ELSA the large solid angle detector SAPHIR was built for the investigation of photon-induced reactions. A main component of SAPHIR is the central drift chamber (CDC), matching the magnet gap of 1 m³. The diameter of each of the 1828 hexagonal drift cells is about 18 mm. The subject of this paper is the implementation of the CDC in the experiment. The description of the hardware is followed by a presentation of the software tools for filtering and monitoring the data, which have been developed and tested. An algorithm for extracting the space-time relationship is presented. The properties of the chamber with an improved gas mixture (helium/neon/isobutane = 21.25:63.75:15) have been investigated. A spatial resolution of about 200 μm was achieved. The efficiency of the chamber is 97% at a tagged photon rate of 5×10⁴ per second crossing the chamber. (orig.) [de]

  7. Asset management: integrated software optimizes production performance

    Energy Technology Data Exchange (ETDEWEB)

    Polczer, S.

    1998-06-01

    Two new multi-dimensional databases, which expand the 'row and column' concept of spreadsheets into multiple categories of data called dimensions, are described. These integrated software packages provide the foundation for industry players such as Poco Petroleum Ltd and Numac Energy Inc to gain a competitive advantage by overhauling their respective data collection and retrieval systems to allow for timely cost analysis and financial reporting. Energy Warehouse, an on-line analytical processing product marketed by SysGold Ltd, is one of the software products described. It gathers information from various sources, allows advanced searches and generates reports previously unavailable in other conventional financial accounting systems. The second product discussed, the Canadian Upstream Energy System (CUES), is an on-line analytical processing system developed by Oracle Corporation and Calgary-based Applied Terravision Systems (ATS) Inc. CUES combines Oracle's universal data server and software development tools with ATS's upstream financial, land, geotechnical and production applications. The software also allows for optimization of facilities, analysis of production efficiencies and comparison of performance against industry standards.

  8. Antecedents and Moderators of Software Professionals’ Performance

    Directory of Open Access Journals (Sweden)

    Shiva Prasad H. C.

    2014-02-01

    Software professionals' (SPs') performance is often understood narrowly in terms of input-output productivity. This study approaches performance from a broader perspective and examines whether the emotional intelligence competencies (EICs) of SPs, the leadership style of team leaders, social capital among team members, and the human resource management (HRM) practices of software firms affect the performance of SPs. It also tests whether the value of and opportunities for knowledge sharing moderate such relationships. Data were collected from 441 Indian SPs in a questionnaire survey. Fifty-five team leaders assessed the performance of SPs, and SPs assessed the other constructs. Results revealed that EICs, transformational leadership style, social capital, and HRM practices positively affect performance. EICs are the most important predictors of performance. Under high (low) value of and high (low) opportunities for knowledge sharing, the antecedents' influence on performance is strengthened (attenuated or nullified). The value of and opportunities for knowledge sharing are quasi-moderators. These findings have significant implications for organizing effective work teams.

  9. Software performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2009-01-01

    Praise from the reviewers: "The practicality of the subject in a real-world situation distinguishes this book from others available on the market." (Professor Behrouz Far, University of Calgary) "This book could replace the computer organization texts now in use that every CS and CpE student must take. . . . It is much needed, well written, and thoughtful." (Professor Larry Bernstein, Stevens Institute of Technology) A distinctive, educational text on software performance and scalability, this is the first book to take a quantitative approach to the subject of software performance and scalability.

  10. Performance Engineering Technology for Scientific Component Software

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D.

    2007-05-08

    Large-scale, complex scientific applications are beginning to benefit from the use of component software design methodology and technology for software development. Integral to the success of component-based applications is the ability to achieve high-performing code solutions through the use of performance engineering tools for both intra-component and inter-component analysis and optimization. Our work on this project aimed to develop performance engineering technology for scientific component software in association with the DOE CCTTSS SciDAC project (active during the contract period) and the broader Common Component Architecture (CCA) community. Our specific implementation objectives were to extend the TAU performance system and Program Database Toolkit (PDT) to support performance instrumentation, measurement, and analysis of CCA components and frameworks, and to develop performance measurement and monitoring infrastructure that could be integrated in CCA applications. These objectives have been met in the completion of all project milestones and in the transfer of the technology into the continuing CCA activities as part of the DOE TASCS SciDAC2 effort. In addition to these achievements, over the past three years we have been an active member of the CCA Forum, attending all meetings and serving in several working groups, such as the CCA Toolkit working group, the CQoS working group, and the Tutorial working group. We have contributed significantly to CCA tutorials since SC'04, hosted two CCA meetings, participated in the annual ACTS workshops, and were co-authors on the recent CCA journal paper [24]. There are four main areas where our project has delivered results: component performance instrumentation and measurement, component performance modeling and optimization, performance database and data mining, and online performance monitoring. This final report outlines the achievements in these areas for the entire project period. The submitted progress ...

  11. The COMPASS Tokamak Plasma Control Software Performance

    Science.gov (United States)

    Valcarcel, Daniel F.; Neto, André; Carvalho, Ivo S.; Carvalho, Bernardo B.; Fernandes, Horácio; Sousa, Jorge; Janky, Filip; Havlicek, Josef; Beno, Radek; Horacek, Jan; Hron, Martin; Panek, Radomir

    2011-08-01

    The COMPASS tokamak began operation at the IPP Prague in December 2008. A new control system has been built using an ATCA-based real-time system developed at IST Lisbon. The control software is implemented on top of the MARTe real-time framework, attaining control cycles as short as 50 μs with a jitter of less than 1 μs. The controlled parameters, important for the plasma performance, are the plasma current, the position of the plasma current center, the boundary shape, and the horizontal and vertical velocities. These are divided into two control cycles: a slow cycle at 500 μs and a fast cycle at 50 μs. The project has two phases. First, the software implements a digital controller similar to the analog one used during the COMPASS-D operation in Culham. In the slow cycle, the plasma current and position are measured and controlled with PID and feedforward controllers, respectively; the shaping magnetic field is preprogrammed. The vertical instability and horizontal equilibrium are controlled with PID controllers in the faster 50-μs cycle. The second phase will implement a plasma-shape reconstruction algorithm and controller, aiming at optimized plasma performance. The system was designed to be as modular as possible by breaking the functional requirements of the control system into several independent and specialized modules. This splitting enabled tuning the execution of each part of the system and using the modules in a variety of applications with different time constraints. This paper presents the design and overall performance of the COMPASS control software.

  12. Nucleonic calculations for possible irradiation experiments in SAPHIR

    International Nuclear Information System (INIS)

    Caro, M.; Pelloni, S.

    1990-01-01

    Accurate two-dimensional calculations show that a 'neutronic environment' exists in the SAPHIR reactor at the Paul Scherrer Institute (PSI) suitable for simulating the inner surface of a given trepan of the Gundremmingen reactor. Neutron fluences and DPA rates were calculated at two positions in SAPHIR using modern codes and nuclear data (from JEF-1). A particular region of the reactor was found in which fluences and DPA rates agree to within a few percent with the Gundremmingen reference case. (author) 13 figs., 4 tabs., 18 refs.

  13. SAPhIR: a fission-fragment detector

    International Nuclear Information System (INIS)

    Theisen, Ch.; Gautherin, C.; Houry, M.; Korten, W.; Le Coz, Y.; Lucas, R.; Barreau, G.; Doan, T. P.; Belier, G.; Meot, V.; Ethvignot, Th.; Cahan, B.; Le Coguie, A.; Coppolani, X.; Delaitre, B.; Le Bourlout, P.; Legou, Ph.; Maillard, O.; Durand, G.; Bouillac, A.

    1998-01-01

    SAPhIR is the acronym for Saclay Aquitaine Photovoltaic cells for Isomer Research. It consists of solar cells used for fission-fragment detection. It is a collaboration between three laboratories: CEA Saclay, CENBG Bordeaux and CEA Bruyeres le Chatel. The coupling of a highly efficient fission-fragment detector like SAPhIR with EUROBALL will provide new insights in the study of very deformed nuclear matter and in the spectroscopy of neutron-rich nuclei.

  14. Towards a Theory of Affect and Software Developers' Performance

    OpenAIRE

    Graziotin, Daniel

    2016-01-01

    For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people. The underlying assumption seems to be that "happy and satisfied software developers perform better". More specifically, affects (emotions and moods) have an impact on cognitive activities and the working performance of individuals. Development tasks are undertaken heavily through cognitive processes, yet software engineering (SE) research lacks theo...

  15. The scintillation counter system at the SAPHIR detector

    International Nuclear Information System (INIS)

    Bour, D.

    1989-10-01

    The scintillation-counter system of the SAPHIR detector at the stretcher accelerator ELSA in Bonn consists of 64 counters. It supplies a fast hadronic trigger and is used for particle identification by time-of-flight measurements. Prototypes of the counters (340 × 21.25 × 6.0 cm³) were tested. Their contribution to the resolution of the time-of-flight measurement was measured to be σ = 125 ps, the effective light velocity 17.5 cm/ns, and the attenuation length 7.8 m. A pion-kaon separation is possible up to a momentum of 1 GeV/c with time-of-flight measurement. With the first photon beam at SAPHIR the counters were tested, and first triggers were obtained and evaluated. (orig.) [de]

  16. Construction and calibration studies of the SAPHIR scintillation counters

    International Nuclear Information System (INIS)

    Kostrewa, D.

    1988-03-01

    For the scintillation counter system of the SAPHIR detector at the stretcher ring ELSA in Bonn, 50 time-of-flight counters and 12 trigger counters have been built. Each of them has two photomultipliers, one at each side. A laser calibration system with a pulsed nitrogen laser as central light source has been optimized to monitor these photomultipliers. It was used to adjust the photomultipliers and to test their long- and short-term instabilities. (orig.)

  17. Study on the BES Ⅲ offline software performance

    International Nuclear Information System (INIS)

    Zhang Xiaomei; Sun Gongxing

    2011-01-01

    Performance monitoring and analysis of the BESⅢ offline software system are very useful for software optimization and for improving CPU and memory usage. This paper presents a feasible performance monitoring service based on GAUDI, and reports performance tests and analysis of the BESⅢ simulation and reconstruction carried out with the service. (authors)

  18. Performance Evaluation of Software Routers with VPN Features

    Directory of Open Access Journals (Sweden)

    H. Redžović

    2017-11-01

    This paper presents the implementation and analysis of a VPN software router based on the Quagga and strongSwan open-source software tools. We validated the functionality of strongSwan and Quagga in a realistic environment, including scenarios with link failures. We also measured and analyzed the performance of the encryption and hash algorithms supported by strongSwan, in order to recommend an optimal VPN configuration that provides the best performance.

  19. The COMPASS Tokamak Plasma Control Software Performance

    Czech Academy of Sciences Publication Activity Database

    Valcárcel, D.F.; Neto, A.; Carvalho, I.S.; Carvalho, B.B.; Fernandes, H.; Sousa, J.; Janky, F.; Havlíček, Josef; Beňo, R.; Horáček, Jan; Hron, Martin; Pánek, Radomír

    2011-01-01

    Roč. 58, č. 4 (2011), s. 1490-1496 ISSN 0018-9499. [17th Real Time Conference (RT10), Lisboa, 24.05.2010-28.05.2010] R&D Projects: GA MŠk 7G09042; GA ČR GD202/08/H057 Institutional research plan: CEZ:AV0Z20430508 Keywords: Real-Time * ATCA * Data Acquisition * Plasma Control Software Subject RIV: BL - Plasma and Gas Discharge Physics Impact factor: 1.447, year: 2011 http://dx.doi.org/10.1109/TNS.2011.2143726

  20. The CMS software performance at the start of data taking

    CERN Document Server

    Benelli, Gabriele

    2009-01-01

    The CMS software framework (CMSSW) is a complex project evolving very rapidly as the first LHC colliding beams approach. The computing requirements constrain performance in terms of CPU time, memory footprint and event size on disk, to allow for planning and managing the computing infrastructure necessary to handle the needs of the experiment. A performance suite of tools has been developed to track all aspects of code performance through the software release cycles, allowing for regression testing and guiding code development for optimization. In this talk, we describe the CMSSW performance suite tools and present some sample performance results from the release integration process for the CMS software.

  1. Asset management -- Integrated software optimizes production performance

    International Nuclear Information System (INIS)

    Polczer, S.

    1998-01-01

    Developments in data collection and retrieval systems to allow timely cost analysis, financial reporting and production management are discussed. One of the most important new OLAP (on-line analytical processing) products is Energy Warehouse which gathers field information from various sources, allows advanced searches, and generates reports previously unavailable in other conventional financial accounting systems. Another OLAP-based system, the Canadian Upstream Energy System (CUES), was developed by the Oracle Corporation and the Calgary-based Applied Terravision Systems (ATS) Inc. CUES combines Oracle's universal data server software development tools with ATS's upstream financial, land, geotechnical and production applications. ATS also developed a product called IDPMARS (Integrated Daily Production Management Accounting Reporting System). It interfaces with CUES to link working interests, government royalties, administration, facility charges, lifting costs, transportation tooling, and customers by integrating field data collection systems with financial accounting

  2. Asset management -- Integrated software optimizes production performance

    Energy Technology Data Exchange (ETDEWEB)

    Polczer, S.

    1998-10-01

    Developments in data collection and retrieval systems to allow timely cost analysis, financial reporting and production management are discussed. One of the most important new OLAP (on-line analytical processing) products is Energy Warehouse, which gathers field information from various sources, allows advanced searches, and generates reports previously unavailable in other conventional financial accounting systems. Another OLAP-based system, the Canadian Upstream Energy System (CUES), was developed by the Oracle Corporation and the Calgary-based Applied Terravision Systems (ATS) Inc. CUES combines Oracle's universal data server software development tools with ATS's upstream financial, land, geotechnical and production applications. ATS also developed a product called IDPMARS (Integrated Daily Production Management Accounting Reporting System). It interfaces with CUES to link working interests, government royalties, administration, facility charges, lifting costs, transportation tooling, and customers by integrating field data collection systems with financial accounting.

  3. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0: Integrated Reliability and Risk Analysis System (IRRAS) reference manual. Volume 2

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification to report generation. Version 1.0 of the IRRAS program was released in February of 1987. Since then, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version has been designated IRRAS 5.0 and is the subject of this Reference Manual. Version 5.0 of IRRAS provides the same capabilities as earlier versions, adds the ability to perform location transformations and seismic analysis, and provides enhancements to the user interface as well as improved algorithm performance. Additionally, version 5.0 contains new alphanumeric fault tree and event tree editors used for event tree rules, recovery rules, and end-state partitioning.

  4. Contribution to the microwave characterisation of superconductive materials by means of sapphire resonators; Contribution a la caracterisation hyperfrequence de materiaux supraconducteurs par des resonateurs-saphirs

    Energy Technology Data Exchange (ETDEWEB)

    Hanus, Xavier

    1993-12-06

    The objective of this research thesis is to find a compact resonant structure which would allow the residual surface impedance of superconductive samples to be characterised simply, quickly and economically. The author first explains the choice of a sapphire single-crystal as inner dielectric, given the performance reached by resonant structures equipped with such low-loss-angle dielectrics and given the constraints adopted from the start. He explains the origin of the microwave losses which appear in this type of resonant structure: the surface impedance for the metallic losses, and the sapphire dielectric loss angle for the dielectric losses. The experimental installation and the principle of the microwave measurements are described. The performance of the different candidate resonant structures is assessed against the starting criteria: closed resonators with thin sapphires are rejected because of poor metallic contacts, and open resonators with thin and thick sapphires are rejected because of radiation losses, even for resonance modes that are in principle confined. The only remaining solution is therefore a sapphire-cavity operated in the TE011 resonant mode, which offers a naturally confined field configuration. Measurements on a first bulk-niobium cavity made it possible to select a sapphire obtained by ...

  5. A proposal for performing software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Gallagher, J.M.

    1997-01-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper. The method concentrates on finding hazards during the early stages of the software life cycle, using an extension of HAZOP

  6. Particle identification by time-of-flight measurement in the SAPHIR

    International Nuclear Information System (INIS)

    Hoffmann-Rothe, P.

    1993-02-01

    Using photoproduction data measured with the SAPHIR detector on different target materials (CH2 solid, H2 liquid, D2 liquid), a detailed investigation and discussion of the detector's ability to measure the time of flight of charged particles and to separate particles of different mass has been carried out. A FORTRAN program has been written which provides a calibration of the scintillator panels of the TOF hodoscopes, calculates correction factors for the time-walk effect and finally, by combining the time of flight with the track momentum measurement, determines particle masses. The current configuration of the detector makes it possible to separate proton and pion up to a particle momentum of 1.6 GeV/c. Proton and kaon can be separated up to a momentum of 1.3 GeV/c, kaon and pion up to a momentum of 0.85 GeV/c. (orig.) [de]

  7. Systems analysis programs for hands-on integrated reliability evaluations (SAPHIRE) version 5.0

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Skinner, N.L.; Wood, S.T.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. This volume is the reference manual for the Systems Analysis and Risk Assessment (SARA) System Version 5.0, a microcomputer-based system used to analyze the safety issues of a 'family' [i.e., a power plant, a manufacturing facility, any facility on which a probabilistic risk assessment (PRA) might be performed]. The SARA database contains PRA data primarily for the dominant accident sequences of a family and descriptive information about the family including event trees, fault trees, and system model diagrams. The number of facility databases that can be accessed is limited only by the amount of disk storage available. To simulate changes to family systems, SARA users change the failure rates of initiating and basic events and/or modify the structure of the cut sets that make up the event trees, fault trees, and systems. The user then evaluates the effects of these changes through the recalculation of the resultant accident sequence probabilities and importance measures. The results are displayed in tables and graphs that may be printed for reports. A preliminary version of the SARA program was completed in August 1985 and has undergone several updates in response to user suggestions and to maintain compatibility with the other SAPHIRE programs. Version 5.0 of SARA provides the same capability as earlier versions and adds the ability to process unlimited cut sets; display fire, flood, and seismic data; and perform more powerful cut set editing.

  8. Teaching Software Developers to Perform UX Tasks

    DEFF Research Database (Denmark)

    Øvad, Tina; Bornoe, Nis; Larsen, Lars Bo

    2015-01-01

    ... This is done via an action research study where the developers were provided with material concerning a modified AB usability test, by training them in performing this type of work, and by using their feedback to improve the method and the material. The overall result of the study is positive, and it is found that by using the developers' feedback in the modification process, the method has truly become applicable within an agile, industrial setting. In combination with a guideline and template, this has induced the developers to feel confident in independently performing this type of work.

  9. Building quality into performance and safety assessment software

    International Nuclear Information System (INIS)

    Wojciechowski, L.C.

    2011-01-01

    Quality assurance is integrated throughout the development lifecycle for performance and safety assessment software. The software used in the performance and safety assessment of a Canadian deep geological repository (DGR) follows the CSA quality assurance standard CSA-N286.7 [1], Quality Assurance of Analytical, Scientific and Design Computer Programs for Nuclear Power Plants. Quality assurance activities in this standard include tasks such as verification and inspection; however, much more is involved in producing a quality software computer program. The types of errors found with different verification methods are described. The integrated quality process ensures that defects are found and corrected as early as possible. (author)

  10. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated order-of-magnitude speed-ups in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  11. Radio-science performance analysis software

    Science.gov (United States)

    Morabito, D. D.; Asmar, S. W.

    1995-02-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.

  12. Performance testing of 3D point cloud software

    Science.gov (United States)

    Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2013-10-01

    LiDAR systems have been used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirement involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available in the market, including open-source suites; however, users often lack methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as the commercial suites in the working-set and commit-size tests.

  13. Performance comparison between ISCSI and other hardware and software solutions

    CERN Document Server

    Gug, M

    2003-01-01

    We report on our investigations of some technologies that can be used to build disk servers and networks of disk servers using commodity hardware and software solutions. The report focuses on the performance that can be achieved by these systems and gives measured figures for different configurations. It is divided into two parts: iSCSI and other technologies, and hardware and software RAID solutions. The first part studies different technologies that can be used by clients to access disk servers over a Gigabit Ethernet network. It covers block-access technologies (iSCSI, hyperSCSI, ENBD). Experimental figures are given for different numbers of clients and servers. The second part compares a system based on 3ware hardware RAID controllers, a system using Linux software RAID and IDE cards, and a system mixing both hardware RAID and software RAID. Performance measurements for reading and writing are given for different RAID levels.

  14. The impact of new accelerator control software on LEP performance

    International Nuclear Information System (INIS)

    Bailey, R.; Belk, A.; Collier, P.; Lamont, M.; Rigk, G. de; Tarrant, M.

    1993-01-01

    After the first year of running LEP, it became apparent that a new generation of application software would be required for efficient long term exploitation of the accelerator. In response to this need, a suite of accelerator control software has been developed, which is new both in style and functionality. During 1992 this software has been extensively used for driving LEP in many different operational modes, which include several different optics, polarisation runs at different energies and 8 bunch operation with Pretzels. The software has performed well and has undoubtedly enhanced the efficiency of accelerator operations. In particular the turnaround time has been significantly reduced, giving an increase of around 20% in the integrated luminosity for the year. Furthermore the software has made the accelerator accessible to less experienced operators. After outlining the development strategy, the overall functionality and performance of the software is discussed, with particular emphasis on improvements in operating efficiency. Some evaluation of the performance and reliability of ORACLE as an on-line database is also given

  15. New software for improving performance in wind farm operations

    Energy Technology Data Exchange (ETDEWEB)

    Collins, Mark [Ekho for Wind (Canada)

    2011-07-01

    The performance of wind farms depends on multiple field and business systems. This makes operational planning difficult because so much data resides in separate systems, data are duplicated, and it is impossible to gather all relevant data together in one place. The aim of this paper is to present a new software package, Ekho for Wind, which helps improve performance in wind farm operations by providing features such as high-level views, performance analysis, downtime tracking, quality data management and forecast generation. This new software provides operational intelligence which offers incentives for continuous improvement. Ekho for Wind can bring such benefits as maximization of generation, increased lifetime of assets, minimization of costs and increased profitability. This presentation introduces new software for improving the performance of wind farms and the lifetime of assets, resulting in significant payback.

  16. The photon detection system of the SAPHIR spectrometer

    International Nuclear Information System (INIS)

    Joepen, N.

    1990-09-01

    Worldwide a new generation of electron accelerators with energies below 5 GeV and a high duty cycle of up to 100% is being built or planned. The first machine of this kind is ELSA, the Electron Stretcher and Accelerator, at the Physics Institute of Bonn University. Due to the high duty cycle of ELSA, experiments with tagged photon beams and a large angular acceptance become possible. At present SAPHIR, a new magnetic detector especially laid out to detect multi-particle final states with good accuracy, is going into operation. Besides a large arrangement of drift chambers for a good momentum resolution, and a trigger and time-of-flight counter system for particle identification, one of the main features of SAPHIR is a good photon detection capability. This is accomplished by a large electromagnetic calorimeter consisting of 98 modules covering a detection area of about 16 m² in the forward direction. For the calorimeter a brass-gas-sandwich detector was developed. Its signal wires are strung perpendicular to the converter planes. The chambers are filled with a standard gas mixture of Ar/CH4 (90:10) at atmospheric pressure and operated at a considerably high voltage in the semi-proportional mode. A sample of nine shower counter modules was tested at the electron test beam of the Bonn 2.5 GeV electron synchrotron. An energy resolution of σ(E)/E = (13.55 ± 0.6)%/√E(GeV) for a single module was achieved. The incident angle of the electrons was varied between 0 and 45 degrees. No significant change of energy resolution and linearity was observed. Combining the information from wire and cathode signals, a position resolution of σ = 15 mm at Φ = 0° and σ = 19 mm at Φ = 45° (for E = 1 GeV) was reached. The second part of this paper gives a description of the shower counter arrangement in the SAPHIR detector. It requires a sophisticated control and calibration system, whose details are presented. Further on, some aspects of the calorimeter calibration procedure are discussed.

  17. Validation of geotechnical software for repository performance assessment

    International Nuclear Information System (INIS)

    LeGore, T.; Hoover, J.D.; Khaleel, R.; Thornton, E.C.; Anantatmula, R.P.; Lanigan, D.C.

    1989-01-01

    An important step in the characterization of a high-level nuclear waste repository is to demonstrate that the geotechnical software used in performance assessment correctly models the physical processes in question; this is often termed scientific validation. There is another type of validation, called software validation. It is based on meeting the requirements of specification documents (e.g., IEEE specifications) and does not directly address the correctness of the specifications. The process of comparing physical experimental results with predicted results should incorporate an objective measure of the level of confidence regarding correctness. This paper reports on a methodology that allows the experimental uncertainties to be explicitly included in the comparison process. The methodology also allows objective confidence levels to be associated with the software. In the event of a poor comparison, the method also lays the foundation for improving the software.

  18. The muon trigger of the SAPHIR shower detector

    International Nuclear Information System (INIS)

    Rufeger-Hurek, H.

    1989-12-01

    The muon trigger system of the SAPHIR shower counter consists of 4 scintillation counters. The total trigger rate of cosmic muons is about 55 Hz, which is reduced to about 45 Hz by the selection algorithms. This rate of clean muon events allows simultaneous monitoring of the whole electronics system and calibration of the gas-sandwich detector by measuring the gas gain. The dependence of the signals on the geometry has been simulated with the help of a Monte Carlo program. The comparison of simulated and measured pulse heights shows that faults in the electronics, as well as defects in the detector hardware (e.g., the HV system) or temperature effects, can be recognized at the level of a few percent. In addition, the muon signals are used to determine the calibration factor for each cathode channel individually. (orig.) [de]

  19. Methods improvements incorporated into the SAPHIRE ASP models

    International Nuclear Information System (INIS)

    Sattison, M.B.; Blackman, H.S.; Novack, S.D.; Smith, C.L.; Rasmuson, D.M.

    1994-01-01

    The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methodology, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements

  20. Methods improvements incorporated into the SAPHIRE ASP models

    International Nuclear Information System (INIS)

    Sattison, M.B.; Blackman, H.S.; Novack, S.D.

    1995-01-01

    The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methods, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements

  1. Performance evaluation software moving object detection and tracking in videos

    CERN Document Server

    Karasulu, Bahadir

    2013-01-01

    Performance Evaluation Software: Moving Object Detection and Tracking in Videos introduces a software approach for the real-time evaluation and performance comparison of methods specializing in moving object detection and/or tracking (D&T) in video processing. Digital video content analysis is an important item for multimedia content-based indexing (MCBI), content-based video retrieval (CBVR) and visual surveillance systems. There are some frequently used generic algorithms for video object D&T in the literature, such as Background Subtraction (BS), Continuously Adaptive Mean-shift (CMS), ...

  2. Performance testing of LiDAR exploitation software

    Science.gov (United States)

    Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2013-04-01

    Mobile LiDAR systems have been used widely in recent years for many applications in the field of geoscience. One of the most important limitations of this technology is the large computational requirement involved in data processing. Several software solutions for data processing are available in the market, but users often do not know how to verify their performance accurately. In this work a methodology for LiDAR software performance testing is presented and six different suites are studied: QT Modeler, AutoCAD Civil 3D, Mars 7, Fledermaus, Carlson and TopoDOT (all of them in x64). Results show that QT Modeler, TopoDOT and AutoCAD Civil 3D allow the loading of large datasets, while Fledermaus, Mars 7 and Carlson do not achieve this level of performance. AutoCAD Civil 3D requires long loading times in comparison with the most capable suites, such as QT Modeler and TopoDOT. The Carlson suite shows the poorest results among all the suites under study: point clouds larger than 5 million points cannot be loaded, and loading times are very long in comparison with the other suites, even for the smaller datasets. AutoCAD Civil 3D, Carlson and TopoDOT use more threads than the other suites, such as QT Modeler, Mars 7 and Fledermaus.

  3. Simulation of the beam guiding of the SAPHIR experiment by means of a differential-equation model; Simulation der Strahlfuehrung des SAPHIR-Experiments mittels eines Differentialgleichungsmodells

    Energy Technology Data Exchange (ETDEWEB)

    Greve, T.

    1991-08-01

    This paper shows the numerical simulation of a beam line by means of a differential-equation model, applied to the beam line from the Bonn Electron Stretcher Accelerator ELSA to the SAPHIR spectrometer. Furthermore, a method for calculating the initial values from measurements of beam profiles is discussed. (orig.)

  4. Conference on High Performance Software for Nonlinear Optimization

    CERN Document Server

    Murli, Almerico; Pardalos, Panos; Toraldo, Gerardo

    1998-01-01

    This book contains a selection of papers presented at the conference on High Performance Software for Nonlinear Optimization (HPSNO97), which was held in Ischia, Italy, in June 1997. The rapid progress of computer technologies, including new parallel architectures, has stimulated a large amount of research devoted to building software environments and defining algorithms able to fully exploit this new computational power. In some sense, numerical analysis has to conform itself to the new tools. The impact of parallel computing in nonlinear optimization, which had a slow start at the beginning, seems now to increase at a fast rate, and it is reasonable to expect an even greater acceleration in the future. As with the first HPSNO conference, the goal of the HPSNO97 conference was to supply a broad overview of the more recent developments and trends in nonlinear optimization, emphasizing the algorithmic and high performance software aspects. Bringing together new computational methodologies with theoretical...

  5. Mitigating the controller performance bottlenecks in Software Defined Networks

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius; Soler, José

    2016-01-01

    The centralization of the control plane decision logic in Software Defined Networking (SDN) has raised concerns regarding the performance of the SDN Controller (SDNC) when the network scales up. A number of solutions have been proposed in the literature to address these concerns. This paper ...

  6. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  7. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  8. Software life cycle dynamic simulation model: The organizational performance submodel

    Science.gov (United States)

    Tausworthe, Robert C.

    1985-01-01

    The submodel structure of a software life cycle dynamic simulation model is described. The software process is divided into seven phases, each with product, staff, and funding flows. The model is subdivided into an organizational response submodel, a management submodel, a management influence interface, and a model analyst interface. The concentration here is on the organizational response model, which simulates the performance characteristics of a software development subject to external and internal influences. These influences emanate from two sources: the model analyst interface, which configures the model to simulate the response of an implementing organization subject to its own internal influences, and the management submodel that exerts external dynamic control over the production process. A complete characterization is given of the organizational response submodel in the form of parameterized differential equations governing product, staffing, and funding levels. The parameter values and functions are allocated to the two interfaces.

  9. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge needed to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted informal literature reviews and a systematic mapping study, which provided basic requirements for the proposed environment. We implemented the SPEAKER environment by integrating supporting tools for the execution of activities and tasks of performance analysis with the knowledge necessary to execute them, in order to meet the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify if it attains its goal of assisting in the execution of process performance analysis by non-specialist people.

  10. Hardware support for software controlled fast reconfiguration of performance counters

    Science.gov (United States)

    Salapura, Valentina; Wisniewski, Robert W.

    2013-06-18

    Hardware support for software-controlled reconfiguration of performance counters may include a plurality of performance counters collecting one or more counts of one or more selected activities. A storage element stores a data value representing a time interval, and a timer element reads the data value, detects expiration of the time interval based on the data value, and generates a signal. A plurality of configuration registers stores a set of performance counter configurations. A state machine receives the signal and selects a configuration register from the plurality of configuration registers for reconfiguring the one or more performance counters.

  11. Communication Software Performance for Linux Clusters with Mesh Connections

    Energy Technology Data Exchange (ETDEWEB)

    Jie Chen; William Watson

    2003-09-01

    Recent progress in copper-based commodity Gigabit Ethernet interconnects enables constructing clusters that achieve extremely high I/O bandwidth at low cost with mesh connections. However, the TCP/IP protocol stack cannot match the improved performance of Gigabit Ethernet networks, especially in the case of multiple interconnects on a single host. In this paper, we evaluate and compare the performance characteristics of TCP/IP and of M-VIA, a software implementation of VIA. In particular, we focus on the performance of the software systems for a mesh communication architecture and demonstrate the feasibility of using multiple Gigabit Ethernet cards on one host to achieve aggregated bandwidth and latency that are not only better than what TCP provides but also compare favorably to some special-purpose high-speed networks. In addition, the implementation of a new M-VIA driver for one type of Gigabit Ethernet card is discussed.

  12. Performance testing of 3D point cloud software

    Directory of Open Access Journals (Sweden)

    M. Varela-González

    2013-10-01

    LiDAR systems have been used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirement involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available in the market, including open-source suites; however, users often lack methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as the commercial suites in the working-set and commit-size tests.

  13. Large scale and performance tests of the ATLAS online software

    International Nuclear Information System (INIS)

    Alexandrov; Kotov, V.; Mineev, M.; Roumiantsev, V.; Wolters, H.; Amorim, A.; Pedro, L.; Ribeiro, A.; Badescu, E.; Caprini, M.; Burckhart-Chromek, D.; Dobson, M.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Nassiakou, M.; Schweiger, D.; Soloviev, I.; Hart, R.; Ryabov, Y.; Moneta, L.

    2001-01-01

    One of the sub-systems of the Trigger/DAQ system of the future ATLAS experiment is the Online Software system. It encompasses the functionality needed to configure, control and monitor the DAQ. Its architecture is based on a component structure described in the ATLAS Trigger/DAQ technical proposal. Regular integration tests ensure its smooth operation in test beam setups during its evolutionary development towards the final ATLAS online system. Feedback is received and returned into the development process. Studies of the system behavior have been performed on a set of up to 111 PCs in a configuration that approaches the final size. Large-scale and performance tests of the integrated system were performed on this setup, with emphasis on investigating the inter-dependence of the components and the performance of the communication software. Of particular interest were the run control state transitions in various configurations of the run control hierarchy. For the purpose of the tests, the software from other Trigger/DAQ sub-systems was emulated. The authors present a brief overview of the online system structure, its components, and the large-scale integration tests and their results.

  14. High-Level Synthesis: Productivity, Performance, and Software Constraints

    Directory of Open Access Journals (Sweden)

    Yun Liang

    2012-01-01

    Full Text Available FPGAs are an attractive platform for applications with high computation demand and low energy consumption requirements. However, design effort for FPGA implementations remains high—often an order of magnitude larger than design effort using high-level languages. Instead of this time-consuming process, high-level synthesis (HLS) tools generate hardware implementations from algorithm descriptions in languages such as C/C++ and SystemC. Such tools reduce design effort: high-level descriptions are more compact and less error prone. HLS tools promise hardware development without requiring detailed designer knowledge of the implementation platform. In this paper, we present an unbiased study of the performance, usability and productivity of HLS using AutoPilot (a state-of-the-art HLS tool). In particular, we first evaluate AutoPilot using popular embedded benchmark kernels. Then, to evaluate the suitability of HLS for real-world applications, we perform a case study of stereo matching, an active area of computer vision research that uses techniques also common to image denoising, image retrieval, feature matching, and face recognition. Based on our study, we provide insights on current limitations of mapping general-purpose software to hardware using HLS and some future directions for HLS tool development. We also offer several guidelines for hardware-friendly software design. For the popular embedded benchmark kernels, the designs produced by HLS achieve 4X to 126X speedup over the software version. The stereo matching algorithms achieve between 3.5X and 67.9X speedup over software (but still less than manual RTL design), with a fivefold reduction in design effort versus manual RTL design.

  15. Software for evaluation of EPR-dosimetry performance

    International Nuclear Information System (INIS)

    Shishkina, E.A.; Timofeev, Yu.S.; Ivanov, D.V.

    2014-01-01

    Electron paramagnetic resonance (EPR) with tooth enamel is a method extensively used for retrospective external dosimetry. Different research groups apply different equipment, sample preparation procedures and spectrum processing algorithms for EPR dosimetry. A uniform algorithm for the description and comparison of performances was designed and implemented in a new computer code. The aim of the paper is to introduce the new software 'EPR-dosimetry performance'. The computer code is a user-friendly tool providing a full description of the method-specific capabilities of EPR tooth dosimetry, from metrological characteristics to practical limitations in applications. The software, designed for scientists and engineers, has several applications, including support of method calibration through evaluation of calibration parameters; evaluation of the critical value and detection limit for registration of the radiation-induced signal amplitude; estimation of the critical value and detection limit for dose evaluation; estimation of the minimal detectable value for anthropogenic dose assessment; and description of method uncertainty. (authors)
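
    The abstract does not give the statistical definitions used; the sketch below follows the common Currie-style convention for a Gaussian background, where the critical value controls the false-positive rate and the detection limit additionally controls the false-negative rate. The background standard deviation is a hypothetical input.

    ```python
    from scipy.stats import norm

    def critical_value(sigma_bg, alpha=0.05):
        """Smallest signal amplitude distinguishable from background noise:
        P(false positive) <= alpha for a Gaussian background of std sigma_bg."""
        return norm.ppf(1.0 - alpha) * sigma_bg

    def detection_limit(sigma_bg, alpha=0.05, beta=0.05):
        """True amplitude that is detected with probability >= 1 - beta."""
        return (norm.ppf(1.0 - alpha) + norm.ppf(1.0 - beta)) * sigma_bg

    sigma = 0.8  # hypothetical std of the signal-amplitude estimate, arb. units
    print(f"critical value:  {critical_value(sigma):.2f}")
    print(f"detection limit: {detection_limit(sigma):.2f}")
    ```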

  16. Runtime Performance Monitoring Tool for RTEMS System Software

    Science.gov (United States)

    Cho, B.; Kim, S.; Park, H.; Kim, H.; Choi, J.; Chae, D.; Lee, J.

    2007-08-01

    RTEMS is a commercial-grade real-time operating system that supports multi-processor computers. However, there are not many development tools for RTEMS. In this paper, we report a new RTEMS-based runtime performance monitoring tool. We have implemented a lightweight runtime monitoring task with an extension to the RTEMS APIs. Using our tool, software developers can verify various performance-related parameters during runtime. Our tool can be used during the software development phase and during in-orbit operation as well. Our implemented target agent is lightweight and has small overhead over its SpaceWire interface. Efforts to reduce overhead and to add other monitoring parameters are currently under way.
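
    The RTEMS task itself is not reproduced here; as a host-side analogue of the concept, the sketch below runs a lightweight periodic monitoring thread that samples process statistics while the application works (using the third-party psutil package; the real tool would instead use RTEMS tasks and ship samples over SpaceWire).

    ```python
    import threading
    import time

    import psutil  # third-party; pip install psutil

    def monitor(period_s, stop):
        """Periodically sample and report process statistics until stopped."""
        proc = psutil.Process()
        while not stop.wait(period_s):        # wake up once per period
            cpu = proc.cpu_percent()
            rss = proc.memory_info().rss / 2**20
            print(f"cpu={cpu:5.1f}%  rss={rss:7.1f} MiB  threads={proc.num_threads()}")

    stop = threading.Event()
    threading.Thread(target=monitor, args=(1.0, stop), daemon=True).start()
    time.sleep(5)   # ... application workload would run here ...
    stop.set()
    ```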

  17. The atmosphere simulation chamber SAPHIR: a tool for the investigation of photochemistry.

    Science.gov (United States)

    Brauers, T.; Bohn, B.; Johnen, F.-J.; Rohrer, R.; Rodriguez Bares, S.; Tillmann, R.; Wahner, A.

    2003-04-01

    On the campus of the Forschungszentrum Jülich we constructed SAPHIR (Simulation of Atmospheric PHotochemistry In a large Reaction Chamber), which was completed in fall 2001. The chamber consists of a 280 m^3 double-wall Teflon bag of cylindrical shape held by a steel frame. Typically 75% of the outside actinic flux (290-420 nm) is available inside the chamber. A louvre system allows switching between full sunlight and dark within 40 s, giving the opportunity to study relaxation processes of the photochemical system. The SAPHIR chamber is equipped with a comprehensive set of sensitive instruments, including measurements of OH, HO2, CO, hydrocarbons, aldehydes, nitrogen oxides and solar radiation. Moreover, the modular concept of SAPHIR allows fast and flexible integration of new instruments and techniques. In this paper we show the unique and new features of the SAPHIR chamber, namely the clean air supply and high-purity water vapor supply, which make a wide range of trace gas concentrations accessible in experiments. We also present examples from the first year of SAPHIR experiments, showing the scope of application from high-quality instrument inter-comparison and kinetic studies to the simulation of complex mixtures of trace gases at ambient concentrations.

  18. Improving Software Performance in the Compute Unified Device Architecture

    Directory of Open Access Journals (Sweden)

    Alexandru PIRJAN

    2010-01-01

    Full Text Available This paper analyzes several aspects of improving software performance for applications written in the Compute Unified Device Architecture (CUDA). We address an issue of great importance when programming a CUDA application: the Graphics Processing Unit's (GPU's) memory management through transpose kernels. We also benchmark and evaluate the performance of progressively optimizing a matrix-transpose application in CUDA. One particular interest was to research how well the optimization techniques applied to software applications written in CUDA scale to the latest generation of general-purpose graphics processing units (GPGPU), like the Fermi architecture implemented in the GTX480, compared with the previous architecture implemented in the GTX280. Lately, there has been considerable interest in the literature in this type of optimization analysis, but none of the works so far (to the best of our knowledge) has tried to validate whether the optimizations apply to a GPU from the latest Fermi architecture and how well the Fermi architecture scales with these software performance improving techniques.
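
    The paper's CUDA kernels are not reproduced here; as an illustration of the shared-memory tiling technique this line of work benchmarks, here is a minimal coalesced matrix transpose written with Numba's CUDA bindings (assuming a CUDA-capable GPU; tile size and matrix dimensions are arbitrary choices).

    ```python
    import numpy as np
    from numba import cuda, float32

    TILE = 32  # tile edge length; one thread block handles one tile

    @cuda.jit
    def transpose_tiled(src, dst):
        # Stage the tile in shared memory; the +1 column of padding avoids
        # shared-memory bank conflicts when reading the tile back column-wise.
        tile = cuda.shared.array(shape=(TILE, TILE + 1), dtype=float32)
        x = cuda.blockIdx.x * TILE + cuda.threadIdx.x
        y = cuda.blockIdx.y * TILE + cuda.threadIdx.y
        if y < src.shape[0] and x < src.shape[1]:
            tile[cuda.threadIdx.y, cuda.threadIdx.x] = src[y, x]
        cuda.syncthreads()
        # Write the transposed tile: swap the block indices, keep thread indices,
        # so both the global read above and this global write stay coalesced.
        x = cuda.blockIdx.y * TILE + cuda.threadIdx.x
        y = cuda.blockIdx.x * TILE + cuda.threadIdx.y
        if y < dst.shape[0] and x < dst.shape[1]:
            dst[y, x] = tile[cuda.threadIdx.x, cuda.threadIdx.y]

    a = np.random.rand(2048, 2048).astype(np.float32)
    d_a, d_out = cuda.to_device(a), cuda.device_array((2048, 2048), np.float32)
    grid = (2048 // TILE, 2048 // TILE)
    transpose_tiled[grid, (TILE, TILE)](d_a, d_out)
    assert np.allclose(d_out.copy_to_host(), a.T)
    ```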

  19. Structural Analysis of PTM Hotspots (SAPH-ire) – A Quantitative Informatics Method Enabling the Discovery of Novel Regulatory Elements in Protein Families*

    Science.gov (United States)

    Dewhurst, Henry M.; Choudhury, Shilpa; Torres, Matthew P.

    2015-01-01

    Predicting the biological function potential of post-translational modifications (PTMs) is becoming increasingly important in light of the exponential increase in available PTM data from high-throughput proteomics. We developed structural analysis of PTM hotspots (SAPH-ire)—a quantitative PTM ranking method that integrates experimental PTM observations, sequence conservation, protein structure, and interaction data to allow rank order comparisons within or between protein families. Here, we applied SAPH-ire to the study of PTMs in diverse G protein families, a conserved and ubiquitous class of proteins essential for maintenance of intracellular structure (tubulins) and signal transduction (large and small Ras-like G proteins). A total of 1728 experimentally verified PTMs from eight unique G protein families were clustered into 451 unique hotspots, 51 of which have a known and cited biological function or response. Using customized software, the hotspots were analyzed in the context of 598 unique protein structures. By comparing distributions of hotspots with known versus unknown function, we show that SAPH-ire analysis is predictive for PTM biological function. Notably, SAPH-ire revealed high-ranking hotspots for which a functional impact has not yet been determined, including phosphorylation hotspots in the N-terminal tails of G protein gamma subunits—conserved protein structures never before reported as regulators of G protein coupled receptor signaling. To validate this prediction we used the yeast model system for G protein coupled receptor signaling, revealing that gamma subunit–N-terminal tail phosphorylation is activated in response to G protein coupled receptor stimulation and regulates protein stability in vivo. These results demonstrate the utility of integrating protein structural and sequence features into PTM prioritization schemes that can improve the analysis and functional power of modification-specific proteomics data. PMID:26070665

  20. Software Systems for High-performance Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; Britt, Keith A [ORNL

    2016-01-01

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  1. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... Software AGENCY: Nuclear Regulatory Commission. ACTION: Regulatory issue summary; request for comment... computer software package, WESTEMS TM , to demonstrate compliance with Section III, ``Rules for... Software Addressees All holders of, and applicants for, a power reactor operating license or construction...

  2. Implementation of the FASTBUS data-acquisition system in the readout of the SAPHIR detector

    International Nuclear Information System (INIS)

    Empt, J.

    1993-12-01

    The magnetic detector SAPHIR is laid out to detect multiparticle final states with good accuracy and is designed in particular for good photon detection. Therefore a large electromagnetic calorimeter was built, consisting of 98 modules covering a detection area of about 16 m^2 in the forward direction. For this calorimeter a brass-gas-sandwich detector was developed, with signal wires perpendicular to the converter planes. For data acquisition of a major part of this calorimeter a modular FASTBUS system is used. In this report the FASTBUS system and its installation in the SAPHIR Online Program are described. (orig.)

  3. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam-assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board; the product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  4. Quality Assurance in Software Development: An Exploratory Investigation in Software Project Failures and Business Performance

    Science.gov (United States)

    Ichu, Emmanuel A.

    2010-01-01

    Software quality is perhaps one of the most sought-after attributes in product development; however, this goal is often unattained. Problem factors in software development, and how these have affected the maintainability of delivered software systems, require a thorough investigation. It was, therefore, very important to understand software…

  5. Characterisation of the photolytic HONO-source in the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    F. Rohrer

    2005-01-01

    Full Text Available HONO formation has been proposed as an important OH radical source in simulation chambers for more than two decades. Besides the heterogeneous HONO formation by the dark reaction of NO2 and adsorbed water, a photolytic source has been proposed to explain the elevated reactivity in simulation chamber experiments. However, the mechanism of the photolytic process is not well understood so far. As expected, production of HONO and NOx was also observed inside the new atmospheric simulation chamber SAPHIR under solar irradiation. This photolytic HONO and NOx formation was studied with a sensitive HONO instrument under reproducible, controlled conditions at atmospheric concentrations of other trace gases. It is shown that the photolytic HONO source in the SAPHIR chamber is not caused by NO2 reactions and that it is the only direct NOy source under illuminated conditions. In addition, the photolysis of nitrate, which was recently postulated to explain the observed photolytic HONO formation on snow, ground, and glass surfaces, can be excluded in the chamber. A photolytic HONO source at the surface of the chamber is proposed which is strongly dependent on humidity, on light intensity, and on temperature. An empirical function describes these dependencies and reproduces the observed HONO formation rates to within 10%. It is shown that the photolysis of HONO represents the dominant radical source in the SAPHIR chamber for typical tropospheric O3/H2O concentrations. For these conditions, the HONO concentrations inside SAPHIR are similar to recent observations in ambient air.

  6. Assimilation of SAPHIR radiance: impact on hyperspectral radiances in 4D-VAR

    Science.gov (United States)

    Indira Rani, S.; Doherty, Amy; Atkinson, Nigel; Bell, William; Newman, Stuart; Renshaw, Richard; George, John P.; Rajagopal, E. N.

    2016-04-01

    Assimilation of a new observation dataset in an NWP system may affect the quality of an existing observation dataset against the model background (short forecast), which in turn influences the use of the existing observations in the NWP system. The effect of the use of one dataset on the use of another can be quantified as positive, negative or neutral. The impact of the addition of a new dataset is defined as positive if the number of assimilated observations of an existing observation type increases, and the bias and standard deviation decrease, compared to the control experiment (without the new dataset). Recently a new dataset, Megha-Tropiques SAPHIR radiances, which provides atmospheric humidity information, was added to the Unified Model 4D-VAR assimilation system. In this paper we discuss the impact of SAPHIR on the assimilation of hyperspectral radiances like AIRS, IASI and CrIS. Though SAPHIR is a microwave instrument, its impact can be clearly seen in the use of hyperspectral radiances in the 4D-VAR data assimilation system, in addition to other microwave and infrared observations. SAPHIR assimilation decreased the standard deviation of the spectral channels of wavenumbers 650-1600 cm-1 in all three hyperspectral radiance datasets. A similar impact on the hyperspectral radiances can be seen from the assimilation of other microwave radiances, such as those from the AMSR2 and SSMIS imagers.

  7. The Future of Software Engineering for High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Pope, G [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-16

    DOE ASCR requested that from May through mid-July 2015 a study group identify issues and recommend solutions, from a software engineering perspective, for transitioning into the next generation of High Performance Computing. The approach used was to ask some of the DOE complex experts who will be responsible for doing this work to contribute to the study group. The technique used was to solicit elevator speeches: a short and concise write-up done as if the author were a speaker with only a few minutes to convince a decision maker of their top issues. Pages 2-18 contain the original texts of the contributed elevator speeches and end notes identifying the 20 contributors. The study group also ranked the importance of each topic, and those scores are displayed with each topic heading. A perfect score (and highest priority) is three, two is medium priority, and one is lowest priority. The highest-scoring topic areas were software engineering and testing resources; the lowest-scoring area was compliance with DOE standards. The following two paragraphs are an elevator speech summarizing the contributed elevator speeches. Each sentence or phrase in the summary is hyperlinked to its source via a numeral embedded in the text. A risk one-liner has also been added to each topic to allow future risk tracking and mitigation.

  8. Software on the Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    NREL maintains a variety of applications and environment modules for use on Peregrine. Applications: a list of software applications by name and research area/discipline. Libraries: a list of software libraries available for linking and loading.

  9. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  10. Track recognition in the central drift chamber of the SAPHIR detector at ELSA and first reconstruction of real tracks

    International Nuclear Information System (INIS)

    Korn, P.

    1991-02-01

    The FORTRAN program for pattern recognition in the central drift chamber of SAPHIR has been modified in order to find tracks with more than one missing wire signal and has been optimized in resolving the left/right ambiguities. The second part of this report deals with the reconstruction of some real tracks (γ → e+e−) measured with SAPHIR. The efficiency of the central drift chamber and the space-to-drift-time relation are discussed. (orig.)

  11. Automated Improvement of Software Architecture Models for Performance and Other Quality Attributes

    OpenAIRE

    Koziolek, Anne

    2013-01-01

    Quality attributes, such as performance or reliability, are crucial for the success of a software system and are largely influenced by the software architecture. Their quantitative prediction supports systematic, goal-oriented software design and forms the basis of an engineering approach to software design. This thesis proposes a method and tool to automatically improve component-based software architecture (CBA) models based on such quantitative quality prediction techniques.

  12. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0: Models and Results Database (MAR-D) reference manual. Volume 8

    International Nuclear Information System (INIS)

    Russell, K.D.; Skinner, N.L.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The primary function of MAR-D is to create a data repository for completed PRAs and Individual Plant Examinations (IPEs) by providing input, conversion, and output capabilities for data used by IRRAS, SARA, SETS, and FRANTIC software. As probabilistic risk assessments and individual plant examinations are submitted to the NRC for review, MAR-D can be used to convert the models and results from the study for use with IRRAS and SARA. Then, these data can be easily accessed by future studies and will be in a form that will enhance the analysis process. This reference manual provides an overview of the functions available within MAR-D and step-by-step operating instructions

  13. Engineering bioinformatics: building reliability, performance and productivity into bioinformatics software.

    Science.gov (United States)

    Lawlor, Brendan; Walsh, Paul

    2015-01-01

    There is a lack of software engineering skills in bioinformatic contexts. We discuss the consequences of this lack, examine existing explanations and remedies to the problem, point out their shortcomings, and propose alternatives. Previous analyses of the problem have tended to treat the use of software in scientific contexts as categorically different from the general application of software engineering in commercial settings. In contrast, we describe bioinformatic software engineering as a specialization of general software engineering, and examine how it should be practiced. Specifically, we highlight the difference between programming and software engineering, list elements of the latter and present the results of a survey of bioinformatic practitioners which quantifies the extent to which those elements are employed in bioinformatics. We propose that the ideal way to bring engineering values into research projects is to bring engineers themselves. We identify the role of Bioinformatic Engineer and describe how such a role would work within bioinformatic research teams. We conclude by recommending an educational emphasis on cross-training software engineers into life sciences, and propose research on Domain Specific Languages to facilitate collaboration between engineers and bioinformaticians.

  14. Engineering bioinformatics: building reliability, performance and productivity into bioinformatics software

    Science.gov (United States)

    Lawlor, Brendan; Walsh, Paul

    2015-01-01

    There is a lack of software engineering skills in bioinformatic contexts. We discuss the consequences of this lack, examine existing explanations and remedies to the problem, point out their shortcomings, and propose alternatives. Previous analyses of the problem have tended to treat the use of software in scientific contexts as categorically different from the general application of software engineering in commercial settings. In contrast, we describe bioinformatic software engineering as a specialization of general software engineering, and examine how it should be practiced. Specifically, we highlight the difference between programming and software engineering, list elements of the latter and present the results of a survey of bioinformatic practitioners which quantifies the extent to which those elements are employed in bioinformatics. We propose that the ideal way to bring engineering values into research projects is to bring engineers themselves. We identify the role of Bioinformatic Engineer and describe how such a role would work within bioinformatic research teams. We conclude by recommending an educational emphasis on cross-training software engineers into life sciences, and propose research on Domain Specific Languages to facilitate collaboration between engineers and bioinformaticians. PMID:25996054

  15. SAPHIR, a simulator for engineering and training on N4-type nuclear power plants

    International Nuclear Information System (INIS)

    Vovan, C.

    1999-01-01

    SAPHIR, the new simulator developed by FRAMATOME, has been designed to be a convenient tool for engineering and training for different types of nuclear power plants. Its first application is for the French 'N4' four-loop 1500 MWe PWR. The basic features of SAPHIR are: (1) use of advanced codes for modelling the primary and secondary systems, including an axial steam generator model; (2) use of a simulation workshop containing different tools for modelling fluid, electrical, and instrumentation and control networks; (3) a man-machine interface designed for easy and convenient use, which can simulate the different computerized control consoles of the 'N4' control room. This paper outlines the features and capabilities of this tool, both for engineering and training purposes. (author)

  16. [Software for performing a global phenotypic and genotypic nutritional assessment].

    Science.gov (United States)

    García de Diego, L; Cuervo, M; Martínez, J A

    2013-01-01

    The nutritional assessment of a patient requires the simultaneous management of extensive information and a great number of databases, as both aspects of the process of nutrition and the clinical situation of the patient are analyzed. The introduction of computers in the nutritional area constitutes an extraordinary advance in the administration of nutrition information, providing a complete assessment of nutritional aspects in a quick and easy way. The objective was to develop a computer program that can be used as a tool for assessing the nutritional status of the patient, for the education of clinical staff, for epidemiological studies and for educational purposes. The work is based on a computer program which assists the health specialist in performing a full nutritional evaluation of the patient, through the registration and assessment of phenotypic and genotypic features. The application provides nutritional prognosis based on anthropometric and biochemical parameters, images of states of malnutrition, questionnaires to characterize diseases, diagnostic criteria, identification of alleles associated with the development of specific metabolic illnesses, and questionnaires on quality of life, for a customized intervention. The program includes, as part of the nutritional assessment of the patient, food intake analysis, design of diets and promotion of physical activity, introducing food frequency questionnaires, dietary recalls, healthy eating indexes, model diets, fitness tests, and recommendations, recalls and questionnaires on physical activity. The software was implemented in Java Swing, using an SQLite database and external libraries such as JFreeChart for plotting graphs. This newly designed software is composed of five blocks categorized into ten modules named: Patients, Anthropometry, Clinical History, Biochemistry, Dietary History, Diagnostic (with genetic make-up), Quality of Life, Physical Activity, Energy Expenditure and Diets. Each module has a specific function which evaluates a
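
    The program's algorithms are not published in the abstract; as a flavour of what an anthropometry module computes, here is a minimal sketch of a body mass index assessment using the standard WHO adult cut-offs (the function name and inputs are illustrative).

    ```python
    def bmi_assessment(weight_kg, height_m):
        """Body mass index with the standard WHO adult classification."""
        bmi = weight_kg / height_m ** 2
        if bmi < 18.5:
            category = "underweight"
        elif bmi < 25.0:
            category = "normal weight"
        elif bmi < 30.0:
            category = "overweight"
        else:
            category = "obese"
        return bmi, category

    bmi, category = bmi_assessment(72.0, 1.78)
    print(f"BMI = {bmi:.1f} ({category})")
    ```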

  17. Technical Performance Assessment: Mission Success in Software Acquisition Management

    Science.gov (United States)

    2010-04-27

    Examples: Design constraints make software acquisition and development extremely critical. Application domain – Operational Flight Program, Air... environment – used to produce the software. Risk management – established and maintained risk management systems. Milestone reviews...

  18. Improving Performance of Software Implemented Floating Point Addition

    DEFF Research Database (Denmark)

    Hindborg, Andreas Erik; Karlsson, Sven

    2011-01-01

    We outline and evaluate hardware extensions to an integer processor pipeline which allow IEEE 754 floating point (FP) addition to be efficiently implemented in software. With a very moderate increase in hardware resources, our performance evaluation shows that, for a benchmark that executes 12.5% FP addition instructions, our approach exhibits a relative slowdown of 3.38 to 15.15 as compared to dedicated hardware. This is a significant improvement over pure software emulation, which leads to relative slowdowns up to 45.33.

  19. Comparison of OH reactivity measurements in the atmospheric simulation chamber SAPHIR

    Science.gov (United States)

    Fuchs, Hendrik; Novelli, Anna; Rolletter, Michael; Hofzumahaus, Andreas; Pfannerstill, Eva Y.; Kessel, Stephan; Edtbauer, Achim; Williams, Jonathan; Michoud, Vincent; Dusanter, Sebastien; Locoge, Nadine; Zannoni, Nora; Gros, Valerie; Truong, Francois; Sarda-Esteve, Roland; Cryer, Danny R.; Brumby, Charlotte A.; Whalley, Lisa K.; Stone, Daniel; Seakins, Paul W.; Heard, Dwayne E.; Schoemaecker, Coralie; Blocquet, Marion; Coudert, Sebastien; Batut, Sebastien; Fittschen, Christa; Thames, Alexander B.; Brune, William H.; Ernest, Cheryl; Harder, Hartwig; Muller, Jennifer B. A.; Elste, Thomas; Kubistin, Dagmar; Andres, Stefanie; Bohn, Birger; Hohaus, Thorsten; Holland, Frank; Li, Xin; Rohrer, Franz; Kiendler-Scharr, Astrid; Tillmann, Ralf; Wegener, Robert; Yu, Zhujun; Zou, Qi; Wahner, Andreas

    2017-10-01

    Hydroxyl (OH) radical reactivity (kOH) has been measured for 18 years with different measurement techniques. In order to compare the performances of instruments deployed in the field, two campaigns were conducted performing experiments in the atmospheric simulation chamber SAPHIR at Forschungszentrum Jülich in October 2015 and April 2016. Chemical conditions were chosen either to be representative of the atmosphere or to test potential limitations of instruments. All types of instruments that are currently used for atmospheric measurements were used in one of the two campaigns. The results of these campaigns demonstrate that OH reactivity can be accurately measured for a wide range of atmospherically relevant chemical conditions (e.g. water vapour, nitrogen oxides, various organic compounds) by all instruments. The precision of the measurements is higher for instruments directly detecting OH, whereas the indirect comparative reactivity method (CRM) has a higher limit of detection of 2 s-1 at a time resolution of 10 to 15 min. The performances of the instruments were systematically tested by stepwise increasing, for example, the concentrations of carbon monoxide (CO), water vapour or nitric oxide (NO). In further experiments, mixtures of organic reactants were injected into the chamber to simulate urban and forested environments. Overall, the results show that the instruments are capable of measuring OH reactivity in the presence of CO, alkanes, alkenes and aromatic compounds. The transmission efficiency in Teflon inlet lines could have introduced systematic errors in measurements for low-volatility organic compounds in some instruments. CRM instruments exhibited a larger scatter in the data compared to the other instruments. The largest differences to reference measurements or to calculated reactivity were observed for CRM instruments in the presence of terpenes and oxygenated organic compounds (mixing ratios of OH reactants up to 10 ppbv). In some of these experiments, only a small fraction of the reactivity was detected. The accuracy of CRM
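
    For the instruments that detect OH directly, kOH follows from the pseudo-first-order decay of an OH pulse. A minimal sketch of that analysis on synthetic data (the rate constant and noise level are made up for illustration):

    ```python
    import numpy as np

    # Synthetic OH decay: [OH](t) = [OH]0 * exp(-kOH * t), with measurement noise.
    k_true = 8.0                                  # s^-1, a typical ambient reactivity
    t = np.linspace(0.0, 0.5, 200)                # s
    oh = 1e7 * np.exp(-k_true * t)
    oh *= np.random.normal(1.0, 0.02, t.size)     # 2 % multiplicative noise

    # Pseudo-first-order fit: the slope of ln[OH] versus time gives -kOH.
    slope, intercept = np.polyfit(t, np.log(oh), 1)
    print(f"fitted kOH = {-slope:.2f} s^-1 (true {k_true} s^-1)")
    ```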

  20. Comparison of OH Reactivity Instruments in the Atmosphere Simulation Chamber SAPHIR.

    Science.gov (United States)

    Fuchs, H.; Novelli, A.; Rolletter, M.; Hofzumahaus, A.; Pfannerstill, E.; Edtbauer, A.; Kessel, S.; Williams, J.; Michoud, V.; Dusanter, S.; Locoge, N.; Zannoni, N.; Gros, V.; Truong, F.; Sarda Esteve, R.; Cryer, D. R.; Brumby, C.; Whalley, L.; Stone, D. J.; Seakins, P. W.; Heard, D. E.; Schoemaecker, C.; Blocquet, M.; Fittschen, C. M.; Thames, A. B.; Coudert, S.; Brune, W. H.; Batut, S.; Tatum Ernest, C.; Harder, H.; Elste, T.; Bohn, B.; Hohaus, T.; Holland, F.; Muller, J. B. A.; Li, X.; Rohrer, F.; Kubistin, D.; Kiendler-Scharr, A.; Tillmann, R.; Andres, S.; Wegener, R.; Yu, Z.; Zou, Q.; Wahner, A.

    2017-12-01

    Two campaigns were conducted performing experiments in the atmospheric simulation chamber SAPHIR at Forschungszentrum Jülich in October 2015 and April 2016 to compare hydroxyl (OH) radical reactivity (kOH) measurements. Chemical conditions were chosen either to be representative of the atmosphere or to test potential limitations of instruments. The results of these campaigns demonstrate that OH reactivity can be accurately measured for a wide range of atmospherically relevant chemical conditions (e.g. water vapor, nitrogen oxides, various organic compounds) by all instruments. The precision of the measurements is higher for instruments directly detecting hydroxyl radicals (OH), whereas the indirect Comparative Reactivity Method (CRM) has a higher limit of detection of 2 s-1 at a time resolution of 10 to 15 min. The performances of the instruments were systematically tested by stepwise increasing, for example, the concentrations of carbon monoxide (CO), water vapor or nitric oxide (NO). In further experiments, mixtures of organic reactants were injected into the chamber to simulate urban and forested environments. Overall, the results show that instruments are capable of measuring OH reactivity in the presence of CO, alkanes, alkenes and aromatic compounds. The transmission efficiency in Teflon inlet lines could have introduced systematic errors in measurements for low-volatility organic compounds in some instruments. CRM instruments exhibited a larger scatter in the data compared to the other instruments. The largest differences to the reference were observed for CRM instruments in the presence of terpenes and oxygenated organic compounds. In some of these experiments, only a small fraction of the reactivity was detected. The accuracy of CRM measurements is most likely limited by the corrections that need to be applied in order to account for known effects of, for example, deviations from pseudo-first order conditions, nitrogen oxides or water vapor on the measurement

  1. Design, Implementation, and Performance of CREAM Data Acquisition Software

    CERN Document Server

    Zinn, S Y; Bagliesi, M G; Beatty, J J; Childers, J T; Coutu, S; Duvernois, M A; Ganel, O; Kim, H J; Lee, M H; Lutz, L; Malinine, A; Maestro, P; Marrocchesi, P S; Park, I H; Seo, E S; Song, C; Swordy, S; Wu, J

    2005-01-01

    Cosmic Ray Energetics and Mass (CREAM) is a balloon-borne experiment scheduled for launching from Antarctica in late 2004. Its aim is to measure the energy spectrum and composition of cosmic rays from proton to iron nuclei at ultra high energies from 1 to 1,000 TeV. Ultra long duration balloons are expected to fly about 100 days. One special feature of the CREAM data acquisition software (CDAQ) is the telemetric operation of the instrument using satellites. During a flight the science event and housekeeping data are sent from the instrument to a ground facility. Likewise, commands for controlling both the hardware and the software are uploaded from the ground facility. This requires a robust, reliable, and fast software system. CDAQ has been developed and tested during three beam tests at CERN in July, September, and November 2003. Recently the interfaces to the transition radiation detector (TRD) and to the timing-based charge detector (TCD) have been added. These new additions to CDAQ will be checked at a t...

  2. A Data Specification for Software Project Performance Measures: Results of a Collaboration on Performance Measurement

    National Research Council Canada - National Science Library

    Kasunic, Mark

    2008-01-01

    ... between completed projects. These terms and definitions were developed using a collaborative, consensus-based approach involving the Software Engineering Institute's Software Engineering Process Management program and service...

  3. The new CERN tape software - getting ready for total performance

    CERN Document Server

    Cano, E; Kruse, D F; Kotlyar, V; Côme, D

    2015-01-01

    CASTOR (the CERN Advanced STORage system) is used to store the custodial copy of all of the physics data collected from the CERN experiments, both past and present. CASTOR is a hierarchical storage management system that has a disk-based front-end and a tape-based back-end. The software responsible for controlling the tape back-end has been redesigned and redeveloped over the last year and was put in production at the beginning of 2015. This paper summarises the motives behind the redesign, describes in detail the redevelopment work and concludes with the short and long-term benefits.

  4. Frequency Estimator Performance for a Software-Based Beacon Receiver

    Science.gov (United States)

    Zemba, Michael J.; Morse, Jacquelynne Rose; Nessel, James A.; Miranda, Felix

    2014-01-01

    As propagation terminals have evolved, their design has trended more toward a software-based approach that facilitates convenient adjustment and customization of the receiver algorithms. One potential improvement is the implementation of a frequency estimation algorithm, through which the primary frequency component of the received signal can be estimated with a much greater resolution than with a simple peak search of the FFT spectrum. To select an estimator for usage in a QV-band beacon receiver, analysis of six frequency estimators was conducted to characterize their effectiveness as they relate to beacon receiver design.
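
    One estimator of the kind such an analysis would cover is parabolic interpolation of the log-magnitude FFT peak, which refines the tone frequency to a fraction of the bin spacing. A sketch on a synthetic beacon tone; this is an illustrative technique, not necessarily the receiver's chosen algorithm.

    ```python
    import numpy as np

    fs, n = 10_000.0, 4096                       # sample rate (Hz), FFT length
    f_true = 1234.56789                          # beacon tone to recover (Hz)
    t = np.arange(n) / fs
    x = np.cos(2 * np.pi * f_true * t) + 0.1 * np.random.randn(n)

    spec = np.abs(np.fft.rfft(x * np.hanning(n)))
    k = int(np.argmax(spec))                     # coarse peak: resolution fs/n
    # Parabolic interpolation on the log magnitude of the peak and its neighbours.
    a, b, c = np.log(spec[k - 1 : k + 2])
    delta = 0.5 * (a - c) / (a - 2 * b + c)      # fractional-bin offset, |delta| <= 0.5
    f_est = (k + delta) * fs / n

    print(f"peak-search error:   {abs(k * fs / n - f_true):.3f} Hz")
    print(f"interpolated error:  {abs(f_est - f_true):.3f} Hz")
    ```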

  5. Assessing students' performance in software requirements engineering education using scoring rubrics

    Science.gov (United States)

    Mkpojiogu, Emmanuel O. C.; Hussain, Azham

    2017-10-01

    The study investigates how helpful the use of scoring rubrics is in the performance assessment of software requirements engineering students, and whether its use can lead to improvement in students' development of software requirements artifacts and models. Scoring rubrics were used by two instructors to assess the cognitive performance of a student in the design and development of software requirements artifacts. The study results indicate that the use of scoring rubrics is very helpful in objectively assessing the performance of software requirements or software engineering students. Furthermore, the results revealed that the use of scoring rubrics can also give a clear direction to achievement assessment, showing whether a student is improving or not across repeated or iterative assessments. In a nutshell, its use leads to performance improvement in students. The results provide some insights for further investigation and will be beneficial to researchers, requirements engineers, system designers, developers and project managers.

  6. Software development for simplified performance tests and weekly performance check in Younggwang NPP Unit 3 and 4

    International Nuclear Information System (INIS)

    Hur, K. Y.; Jang, S. H.; Lee, J. W.; Kim, J. T.; Park, J. C.

    2002-01-01

    This paper covers the current status of turbine cycle performance tests in nuclear power plants and the development of software that can address some shortcomings of those tests. The software was developed for simplified performance tests and weekly performance checks in Yonggwang nuclear power plant units 3 and 4. It incorporates the requirements of the efficiency division, for consistency with actual performance analysis work and for usability of the collected performance test data. From a working survey, we identify the differences between the embedded performance analysis modules and the actual performance analysis work. The software helps operation and maintenance personnel to reduce workload, supports trend analysis of essential parameters in the turbine cycle, and provides correction curves for decision-making in their work

  7. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Version 5.0: Data loading manual. Volume 10

    International Nuclear Information System (INIS)

    VanHorn, R.L.; Wolfram, L.M.; Fowler, R.D.; Beck, S.T.; Smith, C.L.

    1995-04-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) suite of programs can be used to organize and standardize, in an electronic format, information from probabilistic risk assessments or individual plant examinations. The Models and Results Database (MAR-D) program of the SAPHIRE suite serves as the repository for probabilistic risk assessment and individual plant examination data and information. This report demonstrates by examples the common electronic and manual methods used to load these types of data. It is not a stand-alone document but references documents that contribute information relevant to the data loading process. This document provides a more detailed discussion and instructions for using SAPHIRE 5.0 only where enough information on a specific topic is not provided by another available source

  8. Optimizing the Performance of Radionuclide Identification Software in the Hunt for Nuclear Security Threats

    International Nuclear Information System (INIS)

    Fotion, Katherine A.

    2016-01-01

    The Radionuclide Analysis Kit (RNAK), my team's most recent nuclide identification software, is entering the testing phase. A question arises: will removing rare nuclides from the software's library improve its overall performance? An affirmative response would indicate fundamental errors in the software's framework, while a negative response confirms the effectiveness of the software's key machine learning algorithms. After thorough testing, I found that the performance of RNAK cannot be improved through the library choice effect, thus verifying the effectiveness of RNAK's algorithms: multiple linear regression, a Bayesian network using the Viterbi algorithm, and branch-and-bound search.
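
    Of the three algorithms named, the Viterbi decoder is the most compact to illustrate. A generic log-domain sketch (not RNAK's implementation; the toy model parameters are made up):

    ```python
    import numpy as np

    def viterbi(obs, start_p, trans_p, emit_p):
        """Most probable hidden-state path for an observation sequence (log domain)."""
        score = np.log(start_p) + np.log(emit_p[:, obs[0]])
        back = []
        for o in obs[1:]:
            cand = score[:, None] + np.log(trans_p)   # cand[i, j]: state i -> state j
            back.append(np.argmax(cand, axis=0))
            score = np.max(cand, axis=0) + np.log(emit_p[:, o])
        path = [int(np.argmax(score))]
        for ptr in reversed(back):                    # trace best predecessors back
            path.append(int(ptr[path[-1]]))
        return path[::-1]

    # Toy two-state model with three observation symbols.
    start = np.array([0.6, 0.4])
    trans = np.array([[0.7, 0.3], [0.4, 0.6]])
    emit = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
    print(viterbi([0, 1, 2, 2], start, trans, emit))
    ```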

  9. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0. Volume 5, Systems Analysis and Risk Assessment (SARA) tutorial manual

    International Nuclear Information System (INIS)

    Sattison, M.B.; Russell, K.D.; Skinner, N.L.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs) primarily for nuclear power plants. This volume is the tutorial manual for the Systems Analysis and Risk Assessment (SARA) System Version 5.0, a microcomputer-based system used to analyze the safety issues of a "family" [i.e., a power plant, a manufacturing facility, any facility on which a probabilistic risk assessment (PRA) might be performed]. A series of lessons is provided that guides the user through some basic steps common to most analyses performed with SARA. The example problems presented in the lessons build on one another, and in combination, lead the user through all aspects of SARA sensitivity analysis capabilities

  10. Balancing technical and regulatory concerns related to testing and control of performance assessment software

    International Nuclear Information System (INIS)

    Seitz, R.R.; Matthews, S.D.; Kostelnik, K.M.

    1990-01-01

    What activities are required to assure that a performance assessment (PA) computer code operates as intended? Answers to this question will vary depending on the individual's area of expertise. Different perspectives on testing and control of PA software are discussed, based on interpretations of the testing and control process by the different involved parties. This discussion leads into the presentation of a general approach to software testing and control that addresses regulatory requirements. Finally, the need for balance between regulatory and scientific concerns is illustrated through lessons learned in previous implementations of software testing and control programs. Configuration control and software testing are required to provide assurance that a computer code performs as intended. Configuration control provides traceability and reproducibility of results produced with PA software and provides a system to assure that users have access to the current version of the software. Software testing is conducted to assure that the computer code has been written properly, solution techniques have been properly implemented, and the software is capable of representing the behavior of the specific system to be modeled. Comprehensive software testing includes software analysis, verification testing, benchmark testing, and site-specific calibration/validation testing

  11. A new plant chamber facility PLUS coupled to the atmospheric simulation chamber SAPHIR

    Science.gov (United States)

    Hohaus, T.; Kuhn, U.; Andres, S.; Kaminski, M.; Rohrer, F.; Tillmann, R.; Wahner, A.; Wegener, R.; Yu, Z.; Kiendler-Scharr, A.

    2015-11-01

    A new PLant chamber Unit for Simulation (PLUS) for use with the atmosphere simulation chamber SAPHIR (Simulation of Atmospheric PHotochemistry In a large Reaction Chamber) has been built and characterized at the Forschungszentrum Jülich GmbH, Germany. The PLUS chamber is an environmentally controlled flow-through plant chamber. Inside PLUS the natural blend of biogenic emissions of trees is mixed with synthetic air and transferred to the SAPHIR chamber, where the atmospheric chemistry and the impact of biogenic volatile organic compounds (BVOC) can be studied in detail. In PLUS all important environmental parameters (e.g. temperature, PAR, soil RH etc.) are well controlled. The gas exchange volume of 9.32 m3, which encloses the stem and the leaves of the plants, is constructed such that gases are exposed to FEP Teflon film and other Teflon surfaces only, to minimize any potential losses of BVOCs in the chamber. Solar radiation is simulated using 15 LED panels which have an emission strength up to 800 μmol m-2 s-1. Results of the initial characterization experiments are presented in detail. Background concentrations, mixing inside the gas exchange volume, and the transfer rate of volatile organic compounds (VOC) through PLUS under different humidity conditions are explored. Typical plant characteristics such as light- and temperature-dependent BVOC emissions are studied using six Quercus ilex trees and compared to previous studies. Results of an initial ozonolysis experiment of BVOC emissions from Quercus ilex at typical atmospheric concentrations inside SAPHIR are presented to demonstrate a typical experimental setup and the utility of the newly added plant chamber.

  12. A new plant chamber facility, PLUS, coupled to the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Hohaus, T.; Kuhn, U.; Andres, S.; Kaminski, M.; Rohrer, F.; Tillmann, R.; Wahner, A.; Wegener, R.; Yu, Z.; Kiendler-Scharr, A.

    2016-03-01

    A new PLant chamber Unit for Simulation (PLUS) for use with the atmosphere simulation chamber SAPHIR (Simulation of Atmospheric PHotochemistry In a large Reaction Chamber) has been built and characterized at the Forschungszentrum Jülich GmbH, Germany. The PLUS chamber is an environmentally controlled flow-through plant chamber. Inside PLUS the natural blend of biogenic emissions of trees is mixed with synthetic air and transferred to the SAPHIR chamber, where the atmospheric chemistry and the impact of biogenic volatile organic compounds (BVOCs) can be studied in detail. In PLUS all important environmental parameters (e.g., temperature, photosynthetically active radiation (PAR), soil relative humidity (RH)) are well controlled. The gas exchange volume of 9.32 m3 which encloses the stem and the leaves of the plants is constructed such that gases are exposed to only fluorinated ethylene propylene (FEP) Teflon film and other Teflon surfaces to minimize any potential losses of BVOCs in the chamber. Solar radiation is simulated using 15 light-emitting diode (LED) panels, which have an emission strength up to 800 µmol m-2 s-1. Results of the initial characterization experiments are presented in detail. Background concentrations, mixing inside the gas exchange volume, and transfer rate of volatile organic compounds (VOCs) through PLUS under different humidity conditions are explored. Typical plant characteristics such as light- and temperature-dependent BVOC emissions are studied using six Quercus ilex trees and compared to previous studies. Results of an initial ozonolysis experiment of BVOC emissions from Quercus ilex at typical atmospheric concentrations inside SAPHIR are presented to demonstrate a typical experimental setup and the utility of the newly added plant chamber.

  13. Measuring CMS Software Performance in the first years of LHC collisions

    CERN Document Server

    Benelli, Gabriele; Pfeiffer, Andreas; Piparo, Danilo; Zemleris, Vidmantas

    2011-01-01

    The CMSSW software framework is a complex project enabling the CMS collaboration to investigate the fast-growing LHC collision data sample. A software performance suite of tools has been developed and integrated in CMSSW to keep track of CPU time, memory footprint and event size on disk. These three metrics are key constraints in software development in order to meet the computing requirements used in the planning and management of the CMS computing infrastructure. The performance suite allows the measurement and tracking of the performance across the framework, publishing the results in a dedicated database. A web application makes the results easily accessible to software release managers, allowing for automatic integration in the CMSSW release cycle quality assurance. The performance suite is also available to individual developers for dedicated code optimization, and the web application allows historical regression tracking and comparisons across releases. The performance suite tools and the performance of the CMSSW frame...
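
    A toy analogue of such a suite in Python, recording wall-clock time, peak interpreter memory and output size per step into SQLite; the real suite instruments the C++ framework itself, and the step name and database schema here are hypothetical.

    ```python
    import sqlite3
    import time
    import tracemalloc

    def measure(tag, func, db="perf.db"):
        """Run one processing step and log its time, peak memory and output size."""
        tracemalloc.start()
        t0 = time.perf_counter()
        out = func()
        elapsed = time.perf_counter() - t0
        _, peak = tracemalloc.get_traced_memory()
        tracemalloc.stop()
        con = sqlite3.connect(db)
        con.execute("CREATE TABLE IF NOT EXISTS perf"
                    "(tag TEXT, seconds REAL, peak_bytes INTEGER, out_bytes INTEGER)")
        con.execute("INSERT INTO perf VALUES (?,?,?,?)",
                    (tag, elapsed, peak, len(out)))
        con.commit()
        con.close()
        return out

    events = measure("reco-step", lambda: bytes(10_000_000))  # hypothetical step
    ```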

  14. Using applicative software and software tools for the performance of leaching and bio-leaching

    OpenAIRE

    Krstev, Boris; Krstev, Aleksandar; Gocev, Zivko; Zdravev, Zoran; Krstev, Dejan; Zivanovic, Jordan

    2013-01-01

    The refractory or low-grade lead/zinc domestic ores in the Republic of Macedonia are investigated by conventional separation technology, i.e. flotation separation. In the meantime, investigations are directed to the new possibilities of leaching by microorganisms – bioleaching. The paper is the result of these technologies and investigations carried out for the recovery of metals from the mentioned ores. Using Simplex EVOP and the computer program MultiSimplex, performances are appropriate and most acceptable and exce...

  15. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    DEFF Research Database (Denmark)

    Jauregui-Becker, Juan M.; Tosello, Guido; van Houten, Fred J.A.M.

    2013-01-01

    This paper presents a software tool for automating the design of cooling systems for injection moulding and a validation of its performance. Cooling system designs were automatically generated by the proposed software tool and by applying a best practice tool engineering design approach. The two...

  16. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques are proven to be useful in terms of software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac ...

  17. The Effect of Firm Strategy and Corporate Performance on Software Market Growth in Emerging Regions

    Science.gov (United States)

    Mertz, Sharon A.

    2013-01-01

    The purpose of this research is to evaluate the impact of firm strategies and corporate performance on enterprise software market growth in emerging regions. The emerging regions of Asia Pacific, Eastern Europe, the Middle East and Africa, and Latin America, currently represent smaller overall markets for software vendors, but exhibit high growth…

  18. Hardware support for software controlled fast multiplexing of performance counters

    Science.gov (United States)

    Salapura, Valentina; Wisniewski, Robert W.

    2013-01-01

    Performance counters may be operable to collect one or more counts of one or more selected activities, and registers may be operable to store a set of performance counter configurations. A state machine may be operable to automatically select a register from the registers for reconfiguring the one or more performance counters in response to receiving a first signal. The state machine may be further operable to reconfigure the one or more performance counters based on a configuration specified in the selected register. The state machine yet further may be operable to copy data in selected one or more of the performance counters to a memory location, or to copy data from the memory location to the counters, in response to receiving a second signal. The state machine may be operable to store or restore the counter values and state machine configuration in response to a context switch event.
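
    The counter rotation can be sketched conceptually: cycle through counter configurations on a fixed time slice, accumulate raw counts per configuration, and extrapolate by the fraction of time each configuration was live. Everything below (event sets, rates, slice count) is a made-up simulation of that idea, not the patented hardware mechanism.

    ```python
    import itertools
    import random

    CONFIGS = ["loads", "stores", "branches", "flops"]   # hypothetical event sets
    raw = {c: 0 for c in CONFIGS}

    def read_counter_after_slice(cfg):
        """Stand-in for reading a hardware counter after one time slice."""
        return random.randint(900, 1100)

    SLICES = 400
    for cfg in itertools.islice(itertools.cycle(CONFIGS), SLICES):
        raw[cfg] += read_counter_after_slice(cfg)

    # Each configuration was live for 1/len(CONFIGS) of the run, so scale up.
    estimates = {c: v * len(CONFIGS) for c, v in raw.items()}
    print(estimates)
    ```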

  19. Development of a FASTBUS data acquisition system for the SAPHIR calorimeter

    International Nuclear Information System (INIS)

    Klein, F.J.

    1992-01-01

    Due to the high duty cycle of the new electron accelerator at the Physics Institute of Bonn University, ELSA, experiments with tagged photon beams and a large angular acceptance become possible. The new magnetic detector SAPHIR is laid out to detect multi-particle final states with good accuracy and is designed in particular for good photon detection. Therefore a large electromagnetic calorimeter is built, consisting of 98 modules covering a detection area of about 16 m^2 in the forward direction. For this calorimeter a brass-gas-sandwich detector was developed, with signal wires perpendicular to the converter planes. The chambers are filled with a standard gas mixture of Ar/CH4 (90:10) at atmospheric pressure and operated at a considerably high voltage in the semi-proportional mode. A modified shower counter module, containing 20 μm thick signal wires, was tested at the electron test beam of the Bonn 2.5 GeV electron synchrotron. An energy resolution of σ(E)/E·√E(GeV) = 12.2±0.5% was achieved. For data acquisition a modular FASTBUS system was used, which will be installed in the SAPHIR Online Program. (orig.) [de

  20. The graphics system and the data saving for the SAPHIR experiment

    International Nuclear Information System (INIS)

    Albold, D.

    1990-08-01

    Important extensions have been made to the data acquisition system SOS for the SAPHIR experiment at the Bonn ELSA facilities. As support for various online-programs, controlling components of the detector, a graphic system for presenting data was developed. This enables any program in the system to use all graphic devices. Main component is a program serving requests for presentation on a 19 inch color monitor. Window-technique allows a presentation of several graphics on one screen. Equipped with a trackball and using menus, this is an easy to use and powerful tool in controlling the experiment. Other important extensions concern data storage. A huge amount of event data can be stored on 8 mm cassettes by the program Eventsaver. This program can be controlled by a component of the SAPHIR-Online SOL running on a VAX-Computer and using windows and menus. The smaller amount of data, containing parameters and programs, which should be accessible within a small period of time, can be stored on a magnetic disk. A program supporting a file-structure for access to this disk is described. (orig./HSI) [de

  1. Analysis for Parallel Execution without Performing Hardware/Software Co-simulation

    OpenAIRE

    Muhammad Rashid

    2014-01-01

    Hardware/software co-simulation improves the performance of embedded applications by executing the applications on a virtual platform before the actual hardware is available in silicon. However, the virtual platform of the target architecture is often not available during early stages of the embedded design flow. Consequently, analysis for parallel execution without performing hardware/software co-simulation is required. This article presents an analysis methodology for parallel execution of ...

  2. A performance improvement plan to increase nurse adherence to use of medication safety software.

    Science.gov (United States)

    Gavriloff, Carrie

    2012-08-01

    Nurses can protect patients receiving intravenous (IV) medication by using medication safety software to program "smart" pumps to administer IV medications. After a patient safety event identified inconsistent use of medication safety software by nurses, a performance improvement team implemented the Deming Cycle performance improvement methodology. The combined use of improved direct care nurse communication, programming strategies, staff education, medication safety champions, adherence monitoring, and technology acquisition resulted in a statistically significant (p < .001) increase in nurse adherence to using medication safety software from 28% to above 85%, exceeding national benchmark adherence rates (Cohen, Cooke, Husch & Woodley, 2007; Carefusion, 2011). Copyright © 2012 Elsevier Inc. All rights reserved.
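
    As a rough illustration of the statistical comparison behind an adherence change like the one reported (28% to above 85%), the sketch below runs a two-proportion z-test; the sample sizes are invented for illustration, since the abstract does not give the underlying counts.

    ```python
    from math import sqrt
    from scipy.stats import norm

    def two_proportion_z(success_a, n_a, success_b, n_b):
        # pooled two-proportion z-test for a before/after comparison
        p_a, p_b = success_a / n_a, success_b / n_b
        pooled = (success_a + success_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        return z, 2 * norm.sf(abs(z))          # two-sided p-value

    # hypothetical counts: 28% vs 85% adherence in 300 observed programs each
    z, p = two_proportion_z(84, 300, 255, 300)
    print(f"z = {z:.1f}, p = {p:.3g}")         # p well below .001
    ```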

  3. Simulation of the beam guiding of the SAPHIR experiment by means of a differential-equation model

    International Nuclear Information System (INIS)

    Greve, T.

    1991-08-01

    This paper presents the numerical simulation of the beam line from the Bonn Electron Stretcher Accelerator ELSA to the SAPHIR spectrometer by means of a model of differential equations. Furthermore, a method for calculating the initial values based on measurements of beam profiles is discussed. (orig.) [de
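
    A minimal sketch of this kind of differential-equation beam-line model is shown below: a transverse coordinate x(s) integrated along the beam line with SciPy, with initial values of the sort that would be estimated from measured beam profiles. The focusing function and all numbers are illustrative assumptions, not the SAPHIR beam-line optics.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def k(s):
        # piecewise focusing strength along the beam line (assumed layout)
        return 2.0 if 1.0 <= s <= 2.0 else 0.0

    def trajectory(s, y):
        x, xp = y                      # position and slope x' = dx/ds
        return [xp, -k(s) * x]         # x'' = -k(s) x (linear focusing)

    # initial values, e.g. estimated from measured beam profiles
    sol = solve_ivp(trajectory, (0.0, 5.0), [1e-3, 0.0], dense_output=True)
    s = np.linspace(0.0, 5.0, 11)
    print(sol.sol(s)[0])               # x(s) along the beam line
    ```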

  4. Comparison of OH reactivity measurements in the atmospheric simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    H. Fuchs

    2017-10-01

    Full Text Available Hydroxyl (OH) radical reactivity (kOH) has been measured for 18 years with different measurement techniques. In order to compare the performances of instruments deployed in the field, two campaigns were conducted performing experiments in the atmospheric simulation chamber SAPHIR at Forschungszentrum Jülich in October 2015 and April 2016. Chemical conditions were chosen either to be representative of the atmosphere or to test potential limitations of instruments. All types of instruments that are currently used for atmospheric measurements were used in one of the two campaigns. The results of these campaigns demonstrate that OH reactivity can be accurately measured for a wide range of atmospherically relevant chemical conditions (e.g. water vapour, nitrogen oxides, various organic compounds) by all instruments. The precision of the measurements (limit of detection < 1 s−1 at a time resolution of 30 s to a few minutes) is higher for instruments directly detecting hydroxyl radicals, whereas the indirect comparative reactivity method (CRM) has a higher limit of detection of 2 s−1 at a time resolution of 10 to 15 min. The performances of the instruments were systematically tested by stepwise increasing, for example, the concentrations of carbon monoxide (CO), water vapour or nitric oxide (NO). In further experiments, mixtures of organic reactants were injected into the chamber to simulate urban and forested environments. Overall, the results show that the instruments are capable of measuring OH reactivity in the presence of CO, alkanes, alkenes and aromatic compounds. The transmission efficiency in Teflon inlet lines could have introduced systematic errors in measurements for low-volatile organic compounds in some instruments. CRM instruments exhibited a larger scatter in the data compared to the other instruments. The largest differences to reference measurements or to calculated reactivity were observed by CRM instruments in

  5. Planning and Analysis of the Company’s Financial Performances by Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Meri BOSHKOSKA

    2017-06-01

    Full Text Available Information technology includes a wide range of software solutions that help managers in decision-making processes in order to increase the company's business performance. Using a software solution in financial analysis is a valuable tool for managers in the financial decision-making process. The objective of the study was accomplished by developing software that easily determines the financial performances of the company through integration of the analysis of financial indicators and the DuPont profitability analysis model. Through this software, managers will be able to calculate the current financial state and visually analyse how their actions will affect the financial performance of the company. This will enable them to identify the best ways to improve the financial performance of the company. The software can perform a financial analysis and give a clear, useful overview of the current business performance and can also help in planning the growth of the company. The software can also be used for educational purposes, for students and managers in the field of financial management.
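
    For illustration, a minimal sketch of the DuPont decomposition such software integrates: return on equity expressed as net profit margin times asset turnover times the equity multiplier. All figures are invented.

    ```python
    def dupont_roe(net_income, revenue, total_assets, total_equity):
        npm = net_income / revenue               # net profit margin
        turnover = revenue / total_assets        # asset turnover
        leverage = total_assets / total_equity   # equity multiplier
        return npm * turnover * leverage, (npm, turnover, leverage)

    # illustrative annual figures
    roe, parts = dupont_roe(120_000, 1_500_000, 900_000, 450_000)
    print(f"ROE = {roe:.1%}; margin = {parts[0]:.1%}, "
          f"turnover = {parts[1]:.2f}, leverage = {parts[2]:.2f}")
    ```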

  6. Remote software upload techniques in future vehicles and their performance analysis

    Science.gov (United States)

    Hossain, Irina

    could benefit from it. However, like the unicast RSU, the security requirements of multicast communication, i.e., authenticity, confidentiality and integrity of the transmitted software and access control of the group members, are challenging. In this thesis, an infrastructure-based mobile multicasting for RSU in vehicle ECUs is proposed, where an ECU receives the software from a remote software distribution center using the road side BSs as gateways. The Vehicular Software Distribution Network (VSDN) is divided into small regions administered by a Regional Group Manager (RGM). Two multicast Group Key Management (GKM) techniques are proposed based on the degree of trust in the BSs, named the Fully-trusted (FT) and Semi-trusted (ST) systems. Analytical models are developed to find the multicast session establishment latency and handover latency for these two protocols. The average latency to perform mutual authentication of the software vendor and a vehicle, and to send the multicast session key by the software provider during multicast session initialization, and the handoff latency during a multicast session are calculated. Analytical and simulation results show that the link establishment latency per vehicle of our proposed schemes is in the range of a few seconds, with the ST system requiring a few milliseconds more than the FT system. The handoff latency is also in the range of a few seconds, and in some cases the ST system requires less handoff time than the FT system. Thus, it is possible to build an efficient GKM protocol without putting too much trust in the BSs.

  7. Intercomparison of NO3 radical detection instruments in the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    H.-P. Dorn

    2013-05-01

    Full Text Available The detection of atmospheric NO3 radicals is still challenging owing to its low mixing ratios (≈ 1 to 300 pptv) in the troposphere. While long-path differential optical absorption spectroscopy (DOAS) has been a well-established NO3 detection approach for over 25 yr, new sensitive techniques have been developed in the past decade. This publication outlines the results of the first comprehensive intercomparison of seven instruments developed for the spectroscopic detection of tropospheric NO3. Four instruments were based on cavity ring-down spectroscopy (CRDS), two utilised open-path cavity-enhanced absorption spectroscopy (CEAS), and one applied "classical" long-path DOAS. The intercomparison campaign "NO3Comp" was held at the atmosphere simulation chamber SAPHIR in Jülich (Germany) in June 2007. Twelve experiments were performed in the well-mixed chamber for variable concentrations of NO3, N2O5, NO2, hydrocarbons, and water vapour, in the absence and in the presence of inorganic or organic aerosol. The overall precision of the cavity instruments varied between 0.5 and 5 pptv for integration times of 1 s to 5 min; that of the DOAS instrument was 9 pptv for an acquisition time of 1 min. The NO3 data of all instruments correlated excellently with the NOAA-CRDS instrument, which was selected as the common reference because of its superb sensitivity, high time resolution, and most comprehensive data coverage. The median of the coefficient of determination (r2) over all experiments of the campaign (60 correlations) is r2 = 0.981 (quartile 1 (Q1): 0.949; quartile 3 (Q3): 0.994; min/max: 0.540/0.999). The linear regression analysis of the campaign data set yielded very small intercepts (median: 1.1 pptv; Q1/Q3: −1.1/2.6 pptv; min/max: −14.1/28.0 pptv), and the slopes of the regression lines were close to unity (median: 1.01; Q1/Q3: 0.92/1.10; min/max: 0.72/1.36). The deviation of individual regression slopes from unity was always within the combined
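
    The per-instrument analysis described above reduces to a linear regression of each instrument against the reference; a minimal sketch with placeholder data follows.

    ```python
    import numpy as np

    # placeholder NO3 time series in pptv (illustrative values only)
    reference = np.array([5.0, 20.0, 50.0, 120.0, 250.0])
    instrument = np.array([6.1, 21.5, 48.9, 118.0, 255.0])

    # slope/intercept of the instrument against the common reference,
    # plus the coefficient of determination r^2
    slope, intercept = np.polyfit(reference, instrument, 1)
    r = np.corrcoef(reference, instrument)[0, 1]
    print(f"slope = {slope:.2f}, intercept = {intercept:.1f} pptv, "
          f"r^2 = {r**2:.3f}")
    ```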

  8. Investigation of OH Radical Regeneration from Isoprene Oxidation Across Different NOx Regimes in the Atmosphere Simulation Chamber SAPHIR

    Science.gov (United States)

    Novelli, A.; Bohn, B.; Dorn, H. P.; Häseler, R.; Hofzumahaus, A.; Kaminski, M.; Yu, Z.; Li, X.; Tillmann, R.; Wegener, R.; Fuchs, H.; Kiendler-Scharr, A.; Wahner, A.

    2017-12-01

    The hydroxyl radical (OH) is the dominant daytime oxidant in the troposphere. It starts the degradation of volatile organic compounds (VOC) originating from both anthropogenic and biogenic emissions. Hence, it is a crucial trace species in model simulations, as it has a large impact on many reactive trace gases. Many field campaigns performed in isoprene-dominated environments under low-NOx conditions have shown large discrepancies between the measured and the modelled OH radical concentrations. These results have contributed to the discovery of new regeneration paths for OH radicals from isoprene-OH second-generation products, with maximum efficiency at low NO. The current chemical models (e.g. MCM 3.3.1) include this novel chemistry, allowing for an investigation of the validity of the OH regeneration under different chemical conditions. Overall, 11 experiments focusing on the OH oxidation of isoprene were performed at the SAPHIR chamber in the Forschungszentrum Jülich. Measurements of VOCs, NOx, O3, and HONO were performed together with the measurement of OH radicals (by both LIF-FAGE and DOAS) and OH reactivity. Within the simulation chamber, the NO mixing ratio was varied between 0.05 and 2 ppbv, allowing the investigation of both the "new" regeneration path for OH radicals and the well-known NO+HO2 mechanism. A comparison with the MCM 3.3.1, which includes the upgraded LIM1 mechanism, showed very good agreement (within 10%) for the OH data at all concentrations of NOx investigated. Comparisons with different models, without LIM1 and with updated rates for the OH regeneration, will be presented together with a detailed analysis of the impact of this study on results from previous field campaigns.

  9. Software Design Document for the AMP Nuclear Fuel Performance Code

    International Nuclear Information System (INIS)

    Philip, Bobby; Clarno, Kevin T.; Cochran, Bill

    2010-01-01

    The purpose of this document is to describe the design of the AMP nuclear fuel performance code. It provides an overview of the decomposition into separable components, an overview of what those components will do, and the strategic basis for the design. The primary components of a computational physics code include a user interface, physics packages, material properties, mathematics solvers, and computational infrastructure. Some capability from established off-the-shelf (OTS) packages will be leveraged in the development of AMP, but the primary physics components will be entirely new. The material properties required by these physics operators include many highly non-linear properties, which will be replicated from FRAPCON and LIFE where applicable, as well as some computationally-intensive operations, such as gap conductance, which depends upon the plenum pressure. Because there is extensive capability in off-the-shelf leadership class computational solvers, AMP will leverage the Trilinos, PETSc, and SUNDIALS packages. The computational infrastructure includes a build system, mesh database, and other building blocks of a computational physics package. The user interface will be developed through a collaborative effort with the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Capability Transfer program element as much as possible and will be discussed in detail in a future document.

  10. Performance of student software development teams: the influence of personality and identifying as team members

    Science.gov (United States)

    Monaghan, Conal; Bizumic, Boris; Reynolds, Katherine; Smithson, Michael; Johns-Boast, Lynette; van Rooy, Dirk

    2015-01-01

    One prominent approach in the exploration of the variations in project team performance has been to study two components of the aggregate personalities of the team members: conscientiousness and agreeableness. A second line of research, known as self-categorisation theory, argues that identifying as team members and the team's performance norms should substantially influence the team's performance. This paper explores the influence of both these perspectives in university software engineering project teams. Eighty students worked to complete a piece of software in small project teams during 2007 or 2008. To reduce limitations in statistical analysis, Monte Carlo simulation techniques were employed to extrapolate from the results of the original sample to a larger simulated sample (2043 cases, within 319 teams). The results emphasise the importance of taking into account personality (particularly conscientiousness), and both team identification and the team's norm of performance, in order to cultivate higher levels of performance in student software engineering project teams.
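
    The Monte Carlo extrapolation step can be illustrated with a simple bootstrap-style resampling sketch; the scores are synthetic placeholders and this shows only the general idea, not the authors' actual simulation design.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # stand-in scores for the 80 observed students
    observed = rng.normal(loc=70.0, scale=10.0, size=80)

    # draw a larger simulated sample by resampling the observed cases,
    # mirroring the extrapolation to 2043 simulated cases
    simulated = rng.choice(observed, size=2043, replace=True)

    print(f"observed mean = {observed.mean():.1f}, "
          f"simulated mean = {simulated.mean():.1f} (n = {simulated.size})")
    ```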

  11. JMorph: Software for performing rapid morphometric measurements on digital images of fossil assemblages

    Science.gov (United States)

    Lelièvre, Peter G.; Grey, Melissa

    2017-08-01

    Quantitative morphometric analyses of form are widely used in palaeontology, especially for taxonomic and evolutionary research. These analyses can involve several measurements performed on hundreds or even thousands of samples. Performing measurements of size and shape on large assemblages of macro- or microfossil samples is generally infeasible or impossible with traditional instruments such as vernier calipers. Instead, digital image processing software is required to perform measurements via suitable digital images of samples. Many software packages exist for morphometric analyses but there is not much available for the integral stage of data collection, particularly for the measurement of the outlines of samples. Some software exists to automatically detect the outline of a fossil sample from a digital image. However, automatic outline detection methods may perform inadequately when samples have incomplete outlines or images contain poor contrast between the sample and staging background. Hence, a manual digitization approach may be the only option. We are not aware of any software packages that are designed specifically for efficient digital measurement of fossil assemblages with numerous samples, especially for the purposes of manual outline analysis. Throughout several previous studies, we have developed a new software tool, JMorph, that is custom-built for that task. JMorph provides the means to perform many different types of measurements, which we describe in this manuscript. We focus on JMorph's ability to rapidly and accurately digitize the outlines of fossils. JMorph is freely available from the authors.
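
    As an example of the outline measurements such a tool performs once an outline has been digitized, the sketch below computes the area and perimeter of a closed polygon of (x, y) points via the shoelace formula; the points are illustrative, and JMorph's own internals may differ.

    ```python
    import numpy as np

    # manually digitized outline as (x, y) vertices (illustrative)
    outline = np.array([[0.0, 0.0], [4.0, 0.0], [5.0, 2.0],
                        [2.0, 4.0], [0.0, 2.5]])

    def polygon_area(pts):
        # shoelace formula for the area of a closed polygon
        x, y = pts[:, 0], pts[:, 1]
        return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

    def polygon_perimeter(pts):
        closed = np.vstack([pts, pts[:1]])     # close the outline
        return np.sum(np.linalg.norm(np.diff(closed, axis=0), axis=1))

    print(polygon_area(outline), polygon_perimeter(outline))
    ```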

  12. Supporting Performance Isolation in Software as a Service Systems with Rich Clients

    NARCIS (Netherlands)

    Oral, A.; Tekinerdogan, B.

    2015-01-01

    In a non-isolated Software as a Service (SaaS) system, different clients can freely use the resources of the SaaS. Hereby, disruptive tenants who exceed their limits can easily cause degradation of performance of the provided services for other tenants. To ensure performance demands of the multiple

  13. Frameworks for Performing on Cloud Automated Software Testing Using Swarm Intelligence Algorithm: Brief Survey

    Directory of Open Access Journals (Sweden)

    Mohammad Hossain

    2018-04-01

    Full Text Available This paper surveys cloud-based automated software testing tools that are able to perform black-box testing, white-box testing, as well as unit and integration testing as a whole. In this paper, we discuss a few of the available automated software testing frameworks on the cloud. These frameworks are found to be more efficient and cost effective because they execute test suites over a distributed cloud infrastructure. One framework's effectiveness was attributed to having a module that accepts manual test cases from users and prioritizes them accordingly. Software testing, in general, accounts for as much as 50% of the total effort of a software development project. To lessen this effort, one of the frameworks discussed in this paper uses swarm intelligence algorithms: the Ant Colony Algorithm for complete path coverage to minimize time, and Bee Colony Optimization (BCO) for regression testing to ensure backward compatibility.

  14. Track finding and track reconstruction in the internal forward drift chamber of SAPHIR

    International Nuclear Information System (INIS)

    Umlauf, G.

    1993-03-01

    A track finding algorithm has been developed for the inner forward drift chamber of the SAPHIR detector (at ELSA in Bonn) using Principal Components Analysis as a tool for interpolating track coordinates. The drift chamber consists of twelve planar layers with six different inclinations and is operated in an inhomogeneous magnetic field. The task of track finding is basically split into a primary stage that defines track candidates without the use of drift-time information and a second stage that serves to verify the track candidate and to resolve the intrinsic left-right ambiguities of the drift chamber signals. Tracks with at most three missing signals can be found. (orig.) [de
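
    A minimal sketch of the underlying idea, using PCA to interpolate track coordinates, is given below: tracks through a stack of layers lie near a low-dimensional linear subspace, so the reconstruction residual of a hit pattern separates track candidates from noise. The geometry, noise levels and two-component choice are illustrative assumptions, not the SAPHIR chamber parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    layers = np.linspace(0.0, 1.0, 12)        # 12 measurement planes

    # training tracks: x = a + b*z plus measurement noise
    a, b = rng.uniform(-1, 1, (2, 500))
    tracks = a[:, None] + b[:, None] * layers + rng.normal(0, 0.01, (500, 12))

    mean = tracks.mean(axis=0)
    # principal components of the training tracks via SVD
    _, _, vt = np.linalg.svd(tracks - mean, full_matrices=False)
    basis = vt[:2]            # two components capture a line's offset and slope

    def residual(candidate):
        # distance of a hit pattern from the learned track subspace
        centred = candidate - mean
        projected = centred @ basis.T @ basis
        return np.linalg.norm(centred - projected)

    good = 0.3 + 0.5 * layers + rng.normal(0, 0.01, 12)
    noise = rng.uniform(-1, 1, 12)
    print(residual(good), residual(noise))    # small vs. large residual
    ```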

  15. PIPER: Performance Insight for Programmers and Exascale Runtimes: Guiding the Development of the Exascale Software Stack

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2017-10-20

    The PIPER project set out to develop methodologies and software for measurement, analysis, attribution, and presentation of performance data for extreme-scale systems. Goals of the project were to support analysis of massive multi-scale parallelism, heterogeneous architectures, multi-faceted performance concerns, and to support both post-mortem performance analysis to identify program features that contribute to problematic performance and on-line performance analysis to drive adaptation. This final report summarizes the research and development activity at Rice University as part of the PIPER project. Producing a complete suite of performance tools for exascale platforms during the course of this project was impossible since both hardware and software for exascale systems are still a moving target. For that reason, the project focused broadly on the development of new techniques for measurement and analysis of performance on modern parallel architectures, enhancements to HPCToolkit’s software infrastructure to support our research goals or use on sophisticated applications, engaging developers of multithreaded runtimes to explore how support for tools should be integrated into their designs, engaging operating system developers with feature requests for enhanced monitoring support, engaging vendors with requests that they add hardware measurement capabilities and software interfaces needed by tools as they design new components of HPC platforms including processors, accelerators and networks, and finally collaborations with partners interested in using HPCToolkit to analyze and tune scalable parallel applications.

  16. Systems analysis programs for hands-on integrated reliability evaluations (SAPHIRE) Version 5.0. Fault tree, event tree, and piping ampersand instrumentation diagram (FEP) editors reference manual: Volume 7

    International Nuclear Information System (INIS)

    McKay, M.K.; Skinner, N.L.; Wood, S.T.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Fault Tree, Event Tree, and Piping and Instrumentation Diagram (FEP) editors allow the user to graphically build and edit fault trees, event trees, and piping and instrumentation diagrams (P&IDs). The software is designed to enable the independent use of the graphical-based editors found in the Integrated Reliability and Risk Assessment System (IRRAS). FEP is comprised of three separate editors (Fault Tree, Event Tree, and Piping and Instrumentation Diagram) and a utility module. This reference manual provides a screen-by-screen guide of the entire FEP system.

  17. Analysing sensory panel performance in a proficiency test using the PanelCheck software

    DEFF Research Database (Denmark)

    Tomic, O.; Luciano, G.; Nilsen, A.

    2010-01-01

    Using the PanelCheck software, a workflow is proposed that guides the user through the data analysis process. This allows practitioners and non-statisticians to get an overview of panel performances in a rapid manner, without the need to be familiar with details of the statistical methods. Visualisation of data analysis results plays an important role, as this provides a time-saving and efficient way of screening and investigating sensory panel performances. Most of the statistical methods used in this paper are available in the open source software PanelCheck, which may be downloaded and used for free.

  18. Maximizing Use of Extension Beef Cattle Benchmarks Data Derived from Cow Herd Appraisal Performance Software

    Science.gov (United States)

    Ramsay, Jennifer M.; Hanna, Lauren L. Hulsman; Ringwall, Kris A.

    2016-01-01

    One goal of Extension is to provide practical information that makes a difference to producers. Cow Herd Appraisal Performance Software (CHAPS) has provided beef producers with production benchmarks for 30 years, creating a large historical data set. Many such large data sets contain useful information but are underutilized. Our goal was to create…

  19. A virtualized software based on the NVIDIA cuFFT library for image denoising: performance analysis

    DEFF Research Database (Denmark)

    Galletti, Ardelio; Marcellino, Livia; Montella, Raffaele

    2017-01-01

    Generic Virtualization Service (GVirtuS) is a new solution for enabling GPGPU on virtual machines or low-powered devices. This paper focuses on the performance analysis that can be obtained using GPGPU virtualized software. Recently, GVirtuS has been extended in order to support CUDA ancillary libraries, with good results. Here, our aim is to analyze the applicability of this powerful tool to a real problem, which uses the NVIDIA cuFFT library. As a case study we consider a simple denoising algorithm, implementing a virtualized GPU-parallel software based on the convolution theorem...
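
    A minimal CPU sketch of the case-study algorithm, smoothing/denoising via the convolution theorem, is shown below using NumPy's FFT in place of the virtualized cuFFT; the image and kernel are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    image = rng.random((256, 256))             # stand-in noisy image

    # centred Gaussian kernel of the same size as the image
    y, x = np.mgrid[-128:128, -128:128]
    kernel = np.exp(-(x**2 + y**2) / (2 * 3.0**2))
    kernel /= kernel.sum()

    # convolution theorem: conv(image, kernel) = IFFT(FFT(image) * FFT(kernel));
    # ifftshift moves the kernel centre to the origin before transforming
    smoothed = np.real(np.fft.ifft2(np.fft.fft2(image) *
                                    np.fft.fft2(np.fft.ifftshift(kernel))))
    print(smoothed.shape, smoothed.mean())
    ```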

  20. Distributed control software of high-performance control-loop algorithm

    CERN Document Server

    Blanc, D

    1999-01-01

    The majority of industrial cooling and ventilation plants require the control of complex processes. All these processes are highly important for the operation of the machines. The stability and reliability of these processes are leading factors identifying the quality of the service provided. The control system architecture, as well as the software structure, is required to have high dynamic performance and robust behaviour. Intelligent systems based on PID or RST controllers are used for their high level of stability and accuracy. The design and tuning of these complex controllers require the dynamic model of the plant to be known (generally obtained by identification) and the desired performance of the various control loops to be specified in order to achieve good performance. The concept of having a distributed control algorithm software provides full automation facilities with well-adapted functionality and good performance, giving methodology, means and tools to master the dynamic process optimization an...
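
    For illustration, a minimal discrete PID loop regulating a first-order plant model is sketched below; the plant constants and gains are invented, not tuned values from the paper.

    ```python
    def simulate_pid(setpoint=20.0, kp=2.0, ki=0.5, kd=0.1, dt=0.1, steps=200):
        # first-order cooling-plant model regulated by a discrete PID loop
        temp, integral, prev_error = 30.0, 0.0, setpoint - 30.0
        for _ in range(steps):
            error = setpoint - temp
            integral += error * dt
            derivative = (error - prev_error) / dt
            u = kp * error + ki * integral + kd * derivative   # control output
            # plant: temperature relaxes towards ambient plus actuation
            temp += dt * (-0.2 * (temp - 25.0) + 0.5 * u)
            prev_error = error
        return temp

    print(simulate_pid())    # settles near the 20.0 setpoint
    ```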

  1. Representation of the Physiological Factors Contributing to Postflight Changes in Functional Performance Using Motion Analysis Software

    Science.gov (United States)

    Parks, Kelsey

    2010-01-01

    Astronauts experience changes in multiple physiological systems due to exposure to the microgravity conditions of space flight. To understand how changes in physiological function influence functional performance, a testing procedure has been developed that evaluates both astronaut postflight functional performance and related physiological changes. Astronauts complete seven functional and physiological tests. The objective of this project is to use motion tracking and digitizing software to visually display the postflight decrement in the functional performance of the astronauts. The motion analysis software will be used to digitize astronaut data videos into stick figure videos to represent the astronauts as they perform the Functional Tasks Tests. This project will benefit NASA by allowing NASA scientists to present data of their neurological studies without revealing the identities of the astronauts.

  2. A Data Specification for Software Project Performance Measures: Results of a Collaboration on Performance Measurement

    Science.gov (United States)

    2008-07-01

    Life cycle: evolution of a system, product, service, project or other human-made entity from conception through retirement [ISO/IEC 12207:1995, Information technology: Software life cycle processes]. For the measure definitions, authors were asked to use or align with already existing standards, such as those available through ISO and IEEE, when possible.

  3. AN ENHANCED MODEL TO ESTIMATE EFFORT, PERFORMANCE AND COST OF THE SOFTWARE PROJECTS

    Directory of Open Access Journals (Sweden)

    M. Pauline

    2013-04-01

    Full Text Available The authors have proposed a model that first captures the fundamentals of software metrics in phase 1, consisting of three primitive primary software engineering metrics: person-months (PM), function points (FP), and lines of code (LOC). Phase 2 consists of the proposed function point, which is obtained by grouping the adjustment factors to simplify the process of adjustment and to ensure more consistency in the adjustments. In the proposed method, fuzzy logic is used for quantifying the quality of requirements, which is added as one of the adjustment factors; thus a fuzzy-based approach for the Enhanced General System Characteristics to estimate the effort of software projects using productivity has been obtained. Phase 3 takes the calculated function point from this work and gives it as input to the static single-variable models (i.e. to Intermediate COCOMO and COCOMO II) for cost estimation. The authors have tailored the cost factors in Intermediate COCOMO, and both cost and scale factors are tailored in COCOMO II, to suit the individual development environment, which is very important for the accuracy of the cost estimates. Software performance indicators (project duration, schedule predictability, requirements completion ratio and post-release defect density) are also measured for the software projects in this work. A comparative study for effort, performance measurement and cost estimation of the software project is done between the existing model and the authors' proposed work. Thus, this work analyzes the interactional process through which the estimation tasks were collectively accomplished.
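
    A minimal sketch of the estimation chain described above, an adjusted function-point count converted to KLOC and fed to the intermediate COCOMO effort equation, follows; the LOC-per-FP ratio and the single effort multiplier shown are illustrative placeholders rather than the authors' calibrated values.

    ```python
    def adjusted_fp(unadjusted_fp, adjustment_factors):
        # value adjustment factor from 14 general system characteristics (0-5)
        vaf = 0.65 + 0.01 * sum(adjustment_factors)
        return unadjusted_fp * vaf

    def cocomo_effort(kloc, a=3.0, b=1.12, eaf=1.0):
        # intermediate COCOMO (semi-detached mode constants):
        # effort in person-months = a * KLOC^b * EAF
        return a * kloc**b * eaf

    fp = adjusted_fp(320, [3] * 14)      # 14 characteristics all rated "3"
    kloc = fp * 53 / 1000                # assumed LOC-per-FP ratio (e.g. Java)
    print(f"{cocomo_effort(kloc, eaf=1.1):.1f} person-months")
    ```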

  4. Evaluation of Software Quality to Improve Application Performance Using Mc Call Model

    Directory of Open Access Journals (Sweden)

    Inda D Lestantri

    2018-04-01

    Full Text Available The existence of software should have added value to improve the performance of the organization, in addition to its primary function of automation. Before being implemented in an operational environment, software must pass testing in stages to ensure that it functions properly, meets user needs and is convenient for users to use. This test was performed on a web-based application, taking as a test case the e-SAP application. e-SAP is an application used to monitor teaching and learning activities at a university in Jakarta. To measure software quality, testing can be done on randomly selected users. The user sample selected in this test consists of users aged 18 to 25 years with an information technology background. The test was conducted on 30 respondents using the McCall model, which consists of 11 dimensions grouped into 3 categories. This paper describes the testing with reference to the product operation category, which includes 5 dimensions: correctness, usability, efficiency, reliability, and integrity. The paper discusses testing on each dimension to measure software quality as an effort to improve performance. The result is that the e-SAP application has good quality, with a product operation value of 85.09%. This indicates that the application is of high quality and deserves to be examined in the next stage in the operational environment.

  5. Revisioning Theoretical Framework of Electronic Performance Support Systems (EPSS) within the Software Application Examples

    Directory of Open Access Journals (Sweden)

    Dr. Servet BAYRAM,

    2004-04-01

    Full Text Available EPSS provides electronic support to learners in achieving a performance objective; a feature which makes it universally and consistently available on demand, any time, any place, regardless of situation, without unnecessary intermediaries involved in the process. The aim of this review is to develop a set of theoretical constructs that provide descriptive power for the explanation of EPSS and its roots and features within software application examples (i.e., Microsoft SharePoint Server v2.0 Beta 2, IBM Lotus Notes 6 & Domino 6, Oracle 9i Collaboration Suite, and Mac OS X v10.2). From the educational and training point of view, the paper visualizes a pentagon model for the interrelated domains of the theoretical framework of EPSS. These domains are: learning theories, information processing theories, developmental theories, instructional theories, and acceptance theories. This descriptive framework explains a set of descriptions as to which outcomes occur under given theoretical conditions for a given EPSS model within the software examples. It summarizes some of the theoretical concepts supporting the EPSS-related features and explains how such concepts share features with the example software programs in education and job training.

  6. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  7. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.

  8. Dynamic CT myocardial perfusion imaging: performance of 3D semi-automated evaluation software

    Energy Technology Data Exchange (ETDEWEB)

    Ebersberger, Ullrich [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Heart Center Munich-Bogenhausen, Department of Cardiology and Intensive Care Medicine, Munich (Germany); Marcus, Roy P.; Nikolaou, Konstantin; Bamberg, Fabian [University of Munich, Institute of Clinical Radiology, Munich (Germany); Schoepf, U.J.; Gray, J.C.; McQuiston, Andrew D. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Lo, Gladys G. [Hong Kong Sanatorium and Hospital, Department of Diagnostic and Interventional Radiology, Hong Kong (China); Wang, Yining [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Department of Radiology, Beijing (China); Blanke, Philipp [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); University Hospital Freiburg, Department of Diagnostic Radiology, Freiburg (Germany); Geyer, Lucas L. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); University of Munich, Institute of Clinical Radiology, Munich (Germany); Cho, Young Jun [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Konyang University College of Medicine, Department of Radiology, Daejeon (Korea, Republic of); Scheuering, Michael; Canstein, Christian [Siemens Healthcare, CT Division, Forchheim (Germany); Hoffmann, Ellen [Heart Center Munich-Bogenhausen, Department of Cardiology and Intensive Care Medicine, Munich (Germany)

    2014-01-15

    To evaluate the performance of three-dimensional semi-automated evaluation software for the assessment of myocardial blood flow (MBF) and blood volume (MBV) at dynamic myocardial perfusion computed tomography (CT). Volume-based software relying on marginal space learning and probabilistic boosting-tree-based contour fitting was applied to CT myocardial perfusion imaging data of 37 subjects. In addition, all image data were analysed manually and both approaches were compared with SPECT findings. Study endpoints included time of analysis and conventional measures of diagnostic accuracy. Of 592 analysable segments, 42 showed perfusion defects on SPECT. Average analysis times for the manual and software-based approaches were 49.1 ± 11.2 and 16.5 ± 3.7 min, respectively (P < 0.01). There was strong agreement between the two measures of interest (MBF, ICC = 0.91, and MBV, ICC = 0.88, both P < 0.01) and no significant difference in diagnostic accuracy between the manual and software-based approaches for either MBF or MBV (all comparisons P > 0.05). Three-dimensional semi-automated evaluation of dynamic myocardial perfusion CT data provides similar measures and diagnostic accuracy to manual evaluation, albeit with substantially reduced analysis times. This capability may aid the integration of this test into clinical workflows. (orig.)

  9. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    Energy Technology Data Exchange (ETDEWEB)

    Maile, Tobias; Bazjanac, Vladimir; O' Donnell, James; Garr, Matthew

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In context of this method, we developed a software tool that provides graphing and data processing capabilities of the two performance data sets. The software tool called SEE IT (Stanford Energy Efficiency Information Tool) eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.

  10. Total OH reactivity study from VOC photochemical oxidation in the SAPHIR chamber

    Science.gov (United States)

    Yu, Z.; Tillmann, R.; Hohaus, T.; Fuchs, H.; Novelli, A.; Wegener, R.; Kaminski, M.; Schmitt, S. H.; Wahner, A.; Kiendler-Scharr, A.

    2015-12-01

    It is well known that hydroxyl radicals (OH) act as the dominant reactive species in the degradation of VOCs in the atmosphere. In recent field studies, directly measured total OH reactivity often showed poor agreement with OH reactivity calculated from VOC measurements (e.g. Nölscher et al., 2013; Lu et al., 2012a). This "missing OH reactivity" is attributed to unaccounted biogenic VOC emissions and/or oxidation products. The comparison of total OH reactivity measured directly and calculated from single-component measurements of VOCs and their oxidation products gives a further understanding of the source of unmeasured reactive species in the atmosphere. It also allows the determination of the magnitude of the contribution of primary VOC emissions and their oxidation products to the missing OH reactivity. A series of experiments was carried out in the atmosphere simulation chamber SAPHIR in Jülich, Germany, to explore in detail the photochemical degradation of VOCs (isoprene, ß-pinene, limonene, and D6-benzene) by OH. The total OH reactivity was determined from the measurement of VOCs and their oxidation products by a proton transfer reaction time-of-flight mass spectrometer (PTR-TOF-MS) with a GC/MS/FID system, and directly measured by laser-induced fluorescence (LIF) at the same time. The comparison between these two total OH reactivity measurements showed an increase of missing OH reactivity in the presence of oxidation products of VOCs, indicating a strong contribution to missing OH reactivity from uncharacterized oxidation products.
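
    The "calculated" side of such a comparison is the summed pseudo-first-order loss rate, kOH = Σi ki[Xi]; a minimal sketch with approximate room-temperature rate constants and invented mixing ratios follows.

    ```python
    AIR = 2.46e19     # molecules cm^-3 at ~298 K and 1 atm

    # species: (rate constant with OH in cm^3 molecule^-1 s^-1 (approximate
    # literature values), mixing ratio in ppbv (illustrative))
    species = {
        "CO":       (2.4e-13, 150.0),
        "isoprene": (1.0e-10,   2.0),
        "NO2":      (1.1e-11,   5.0),
    }

    # total OH reactivity: sum of k_i times number density of each species
    k_oh = sum(k * (ppb * 1e-9 * AIR) for k, ppb in species.values())
    print(f"calculated OH reactivity: {k_oh:.1f} s^-1")
    ```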

  11. Carotid artery stenosis: Performance of advanced vessel analysis software in evaluating CTA

    International Nuclear Information System (INIS)

    Tsiflikas, Ilias; Biermann, Christina; Thomas, Christoph; Ketelsen, Dominik; Claussen, Claus D.; Heuschmid, Martin

    2012-01-01

    Objectives: The aim of this study was to evaluate the time efficiency and diagnostic reproducibility of an advanced vessel analysis software for the diagnosis of carotid artery stenosis. Material and methods: 40 patients with suspected carotid artery stenosis received head and neck DE-CTA as part of their pre-interventional workup. Acquired data were evaluated by 2 independent radiologists. Stenosis grading was performed by MPR eyeballing with freely adjustable MPRs and with a preliminary prototype of the meanwhile available client-server and advanced visualization software syngo.via CT Vascular (Siemens Healthcare, Erlangen, Germany). Stenoses were graded according to the following 5 categories: I: 0%, II: 1–50%, III: 51–69%, IV: 70–99% and V: total occlusion. Furthermore, time to diagnosis for each carotid artery was recorded. Results: Both readers achieved very good specificity values and good to very good sensitivity values, without significant differences between the two reading methods. Furthermore, there was a very good correlation between both readers for both reading methods, without significant differences (kappa value: standard image interpretation k = 0.809; advanced vessel analysis software k = 0.863). Using the advanced vessel analysis software resulted in a significant time saving (p < 0.0001) for both readers. Time to diagnosis could be decreased by approximately 55%. Conclusions: The advanced vessel analysis application CT Vascular of the new imaging software syngo.via (Siemens Healthcare, Forchheim, Germany) provides a high rate of reproducibility in the assessment of carotid artery stenosis. Furthermore, a significant time saving in comparison to standard image interpretation is achievable

  12. Optimizing the Performance of Radionuclide Identification Software in the Hunt for Nuclear Security Threats

    Energy Technology Data Exchange (ETDEWEB)

    Fotion, Katherine A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-08-18

    The Radionuclide Analysis Kit (RNAK), my team’s most recent nuclide identification software, is entering the testing phase. A question arises: will removing rare nuclides from the software’s library improve its overall performance? An affirmative response indicates fundamental errors in the software’s framework, while a negative response confirms the effectiveness of the software’s key machine learning algorithms. After thorough testing, I found that the performance of RNAK cannot be improved with the library choice effect, thus verifying the effectiveness of RNAK’s algorithms—multiple linear regression, Bayesian network using the Viterbi algorithm, and branch and bound search.

  13. Optimisation of Software-Defined Networks Performance Using a Hybrid Intelligent System

    Directory of Open Access Journals (Sweden)

    Ann Sabih

    2017-06-01

    Full Text Available This paper proposes a novel intelligent technique designed to optimise the performance of Software Defined Networks (SDN). The proposed hybrid intelligent system integrates intelligence-based optimisation approaches with an artificial neural network. These heuristic optimisation methods include Genetic Algorithms (GA) and Particle Swarm Optimisation (PSO). These methods were utilised separately in order to select the best inputs to maximise SDN performance. In order to identify SDN behaviour, the neural network model is trained and applied. The best optimisation approach was identified analytically, considering SDN performance and computational time as objective functions. Initially, the general model of the neural network was tested with unseen data before implementing the model using GA and PSO to determine the optimal performance of the SDN. The results showed that the SDN model represented by the artificial neural network (ANN) and optimised by PSO generated a better configuration with regard to computational efficiency and performance index.
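
    A minimal sketch of the PSO stage is given below: particles search the input space of a performance model for the best-scoring configuration. In the paper the objective is the trained neural network; here a simple analytic placeholder stands in for it, and all PSO parameters are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def performance(x):                   # placeholder for the ANN model
        return -np.sum((x - 0.6) ** 2)    # peak performance at x = 0.6

    dim, n, iters = 4, 30, 100
    pos = rng.random((n, dim))
    vel = np.zeros((n, dim))
    pbest = pos.copy()
    pbest_val = np.array([performance(p) for p in pos])
    gbest = pbest[pbest_val.argmax()].copy()

    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        # standard velocity update: inertia + cognitive + social terms
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        vals = np.array([performance(p) for p in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmax()].copy()

    print(gbest)        # converges towards the optimum near 0.6
    ```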

  14. New Software Performance with Balanced Score Card Assessment: Case Study at LPGI Jakarta

    Directory of Open Access Journals (Sweden)

    Brata Wibawa Djojo

    2011-09-01

    Full Text Available Implementation of information technology (IT), especially new software applications, needs to be evaluated for its impact on the organization's business performance related to its strategic goals. The measurement and evaluation of a new software implementation's impact in LPGI Jakarta uses Balanced Scorecard (BSC) analysis, making a comparison of three years of data. The analysis involves the four perspectives of the BSC: (1) the financial aspect, with the growth of gross premium written (GPW), net premium written (NPW) and underwriting profit; (2) the internal business aspect, with the frequency of policies issued and the average production per policy; (3) people, or learning and growth, which consists of human error and system error; (4) the customer aspect, with external endorsement and renewal ratio. This research measures and evaluates the impact of the implementation of a new software application on new business performance as a Marginal and Fair contribution. At the end of this paper the writer suggests LPGI Jakarta increase its sales activities to reach the target, which is related directly to the financial aspect and the internal business process aspect.

  15. Performance evaluation of spectral deconvolution analysis tool (SDAT) software used for nuclear explosion radionuclide measurements

    International Nuclear Information System (INIS)

    Foltz Biegalski, K.M.; Biegalski, S.R.; Haas, D.A.

    2008-01-01

    The Spectral Deconvolution Analysis Tool (SDAT) software was developed to improve counting statistics and detection limits for nuclear explosion radionuclide measurements. SDAT utilizes spectral deconvolution spectroscopy techniques and can analyze both β-γ coincidence spectra for radioxenon isotopes and high-resolution HPGe spectra from aerosol monitors. Spectral deconvolution spectroscopy is an analysis method that utilizes the entire signal deposited in a gamma-ray detector rather than the small portion of the signal that is present in one gamma-ray peak. This method shows promise to improve detection limits over classical gamma-ray spectroscopy analytical techniques; however, this hypothesis has not been tested. To address this issue, we performed three tests to compare the detection ability and variance of SDAT results to those of commercial off-the-shelf (COTS) software which utilizes a standard peak search algorithm. (author)
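
    A minimal sketch of full-spectrum deconvolution in the spirit described above: a measured spectrum is fitted as a non-negative combination of library response shapes, using the whole spectrum rather than single peaks. The synthetic response shapes stand in for real detector responses, and SDAT's actual method may differ in detail.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    channels = np.arange(256)

    def shape(peak):
        # crude stand-in response: photopeak plus a flat Compton continuum
        return (np.exp(-0.5 * ((channels - peak) / 3.0) ** 2)
                + 0.05 * (channels < peak))

    library = np.column_stack([shape(60), shape(120), shape(200)])  # 3 nuclides
    true_activities = np.array([50.0, 0.0, 20.0])
    measured = library @ true_activities + np.random.default_rng(1).poisson(
        5, channels.size)

    # non-negative least squares fit over the entire spectrum
    activities, _ = nnls(library, measured.astype(float))
    print(activities)     # estimated contribution of each library nuclide
    ```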

  16. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  17. Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?

    Science.gov (United States)

    Lum, Karen; Hihn, Jairus; Menzies, Tim

    2006-01-01

    While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models because of the large variance problem inherent in cost data and because they include far more effort multipliers than the data supports. Building optimal models requires that a wider range of models be considered, while correctly calibrating these models requires rejection rules that prune variables and records and use multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem, a leading cause of cost model brittleness or instability.

  18. On the Tradeoff between Performance and Programmability for Software Defined WiFi Networks

    Directory of Open Access Journals (Sweden)

    Tausif Zahid

    2018-01-01

    Full Text Available WiFi has become one of the major network access technologies due to its simple technical implementation and high-bandwidth provisioning. In this paper, we studied software defined WiFi networks (SDWN) against traditional WiFi networks to understand the potential benefits of adopting the SDWN architecture on WiFi networks, such as the ability of SDWN to effectively hide the handover delay between access points (APs), and to identify representative application scenarios where such an SDWN approach could bring additional benefits. This study delineated the performance bottlenecks, such as a throughput degradation of around 50% compared with conventional WiFi networks. In addition, our study also shed some insights into performance optimization issues. All of the performance measurements were conducted on a network testbed consisting of a single basic service set (BSS) and an extended service set (ESS) managed by a single SDN controller deployed with various laboratory settings. Our evaluation included the throughput performance under different traffic loads with different numbers of nodes and packet sizes for both TCP and UDP traffic flows. Handover delays were measured during the roaming phase between different APs against the traditional WiFi networks. Our results have demonstrated the tradeoff between performance and programmability of software defined APs.

  19. Measurement of the reaction γd →pnπ+π- at SAPHIR and investigation of the decay angular distribution of the Δ++(1232) resonance

    International Nuclear Information System (INIS)

    Schuetz, P.

    1993-03-01

    SAPHIR, a new experiment at the Bonn electron stretcher ring ELSA, started taking data in spring 1992. It was set up for the investigation of photon-induced reactions with multiparticle final states. In the first part of this paper the special design of the target is described. It can be operated with liquefied hydrogen or deuterium and is placed in the middle of the central drift chamber. To protect the surrounding chamber in case of a fracture of the target cell, a safety system is installed. In addition, two independent methods of monitoring the cell are provided. The first measurement was performed with a deuterium target at a photon energy range of E γ = 500-700 MeV. In the second part of this paper first results of an analysis of the decay angular distribution of the Δ ++ (1232) in the reaction γd → nΔ ++ π - are presented. They are compared to old data from a hydrogen bubble chamber experiment and are discussed on the basis of a spectator model. (orig.) [de

  1. Impact Analysis of Generalized Audit Software (GAS) Utilization to Auditor Performances

    Directory of Open Access Journals (Sweden)

    Aries Wicaksono

    2016-09-01

    This study aimed to understand whether the use of Generalized Audit Software (GAS) in the audit process has an impact on auditor performance, and to draw conclusions, in the form of an evaluation of the GAS-based audit process, about whether it provides a positive impact on that performance. The performance dimensions used to evaluate the impact of GAS were Quantity of Work, Quality of Work, Job Knowledge, Creativeness, Cooperation, Dependability, Initiative, and Personal Qualities. The method used in this research was a qualitative method, both analytical-descriptive and evaluative, analyzing the impact of the GAS implementation on the components of the user's performance. The results indicate that the use of GAS has a positive impact on the components of user performance.

  2. A Framework for Performing V&V within Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1996-01-01

    Verification and validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In order to provide early detection of errors, V&V is conducted in parallel with system development, often beginning with the concept phase. In reuse-based software engineering, however, decisions on the requirements, design and even implementation of domain assets can be made prior to beginning development of a specific system. In this case, V&V must be performed during domain engineering in order to have an impact on system development. This paper describes a framework for performing V&V within architecture-centric, reuse-based software engineering. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  3. Improvement of the drift chamber system in the SAPHIR detector and first measurements of the Φ meson production at threshold

    International Nuclear Information System (INIS)

    Scholmann, J.N.

    1996-09-01

    The SAPHIR detector at ELSA enables the measurement of photon-induced Φ meson production from threshold up to 3 GeV in the full kinematical range. A considerable improvement of the drift chamber system was a precondition for obtaining the necessary data rate in an acceptable time. The research focuses on the choice of the chamber gas and on a different mechanical construction, so as to minimize the negative influence of the photon beam crossing the sensitive volume of the drift chamber system. In addition, first preliminary results for the total and differential cross sections of Φ meson production close to threshold were evaluated. (orig.)

  4. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    Science.gov (United States)

    Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J

    2014-02-01

    A Visual Basic simulation software (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly, menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  5. Performance Estimation for Hardware/Software codesign using Hierarchical Colored Petri Nets

    DEFF Research Database (Denmark)

    Grode, Jesper Nicolai Riis; Madsen, Jan; Jerraya, Ahmed-Amine

    1998-01-01

    This paper presents an approach for abstract modeling of the functional behavior of hardware architectures using Hierarchical Colored Petri Nets (HCPNs). Using HCPNs as architectural models has several advantages, such as higher estimation accuracy, higher flexibility, and the need for only one estimation tool. This makes the approach very useful for designing component models used for performance estimation in Hardware/Software Codesign frameworks such as the LYCOS system. The paper presents the methodology and rules for designing component models using HCPNs. Two examples of architectural models...

  6. An open-source software program for performing Bonferroni and related corrections for multiple comparisons

    Directory of Open Access Journals (Sweden)

    Kyle Lesack

    2011-01-01

    Increased type I error resulting from multiple statistical comparisons remains a common problem in the scientific literature. This may result in the reporting and promulgation of spurious findings. One approach to this problem is to correct groups of P-values for "family-wise significance" using a Bonferroni correction or the less conservative Bonferroni-Holm correction, or to correct for the "false discovery rate" with a Benjamini-Hochberg correction. Although several solutions are available for performing these corrections through commercially available software, there are no widely available, easy-to-use open source programs to perform these calculations. In this paper we present an open source program, written in Python 3.2, that performs calculations for the standard Bonferroni, Bonferroni-Holm and Benjamini-Hochberg corrections.
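
    The three corrections are compact enough to sketch directly. The standalone Python functions below return adjusted p-values for each procedure; they are a rough analogue of what such a program computes, not the authors' published code.

        def bonferroni(pvals):
            # Multiply each p-value by the number of tests, capped at 1.
            m = len(pvals)
            return [min(p * m, 1.0) for p in pvals]

        def holm(pvals):
            # Step-down Holm adjustment: (m - rank) multiplier, kept monotone.
            m = len(pvals)
            order = sorted(range(m), key=lambda i: pvals[i])
            adjusted, running_max = [0.0] * m, 0.0
            for rank, i in enumerate(order):          # rank 0 = smallest p
                running_max = max(running_max, min((m - rank) * pvals[i], 1.0))
                adjusted[i] = running_max
            return adjusted

        def benjamini_hochberg(pvals):
            # Step-up FDR adjustment, processed from the largest p downwards.
            m = len(pvals)
            order = sorted(range(m), key=lambda i: pvals[i], reverse=True)
            adjusted, running_min = [0.0] * m, 1.0
            for k, i in enumerate(order):
                rank = m - k                          # ascending rank of pvals[i]
                running_min = min(running_min, pvals[i] * m / rank)
                adjusted[i] = running_min
            return adjusted

        print(holm([0.01, 0.04, 0.03]))   # [0.03, 0.06, 0.06]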

  7. Software Tools for Development on the Peregrine System

    Science.gov (United States)

    Software tools are available to help users develop and manage software at the source code level on the Peregrine system. The "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python.

  8. Measurement of the reaction γd → pnπ+π- at SAPHIR and investigation of the decay angular distribution of the Δ++(1232) resonance

    Energy Technology Data Exchange (ETDEWEB)

    Schuetz, P.

    1993-03-01

    SAPHIR, a new experiment at the Bonn electron stretcher ring ELSA, started taking data in spring 1992. It was set up for the investigation of photon induced reactions with multiparticle final states. In the first part of this paper the special design of the target is described. It can be operated with liquefied hydrogen or deuterium and is placed in the middle of the central drift chamber. To protect the surrounding chamber in case of a fracture of the target cell, a safety system is installed. In addition, two independent methods of monitoring the cell are provided. The first measurement was performed with a deuterium target in the photon energy range Eγ = 500-700 MeV. In the second part of this paper first results of an analysis of the decay angular distribution of the Δ++(1232) in the reaction γd → nΔ++π- are presented. They are compared to old data from a hydrogen bubble chamber experiment and are discussed on the basis of a spectator model. (orig.) [German original abstract, translated:] Within the scope of this work, the construction of a liquid-gas target is described which was developed specifically for use in the SAPHIR detector. Functions for monitoring the target cell are presented, as well as a safety system to protect the central drift chamber that immediately surrounds the target. Furthermore, simulation calculations investigated the influence that the construction of the target scattering vessel can have on the measurement of different reactions. In 50% to 70% of the events, hits occurred in the aluminium supports of the target scattering vessel. This strong impairment can be avoided by a redesign of the scattering vessel and the use of, e.g., Rohacell as the scattering-vessel window; Rohacell is characterized by high rigidity and a large radiation length. The redesign of the scattering vessel is currently in progress. The second part of the work describes one of the first

  9. Enabling Diverse Software Stacks on Supercomputers using High Performance Virtual Clusters.

    Energy Technology Data Exchange (ETDEWEB)

    Younge, Andrew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brightwell, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    While large-scale simulations have been the hallmark of the High Performance Computing (HPC) community for decades, Large Scale Data Analytics (LSDA) workloads are gaining attention within the scientific community not only as a processing component of large HPC simulations, but also as standalone scientific tools for knowledge discovery. With the path towards Exascale, new HPC runtime systems are also emerging in a way that differs from classical distributed computing models. However, system software for such capabilities on the latest extreme-scale DOE supercomputers needs to be enhanced to more appropriately support these types of emerging software ecosystems. In this paper, we propose the use of Virtual Clusters on advanced supercomputing resources to enable systems to support not only HPC workloads, but also emerging big data stacks. Specifically, we have deployed the KVM hypervisor within Cray's Compute Node Linux on an XC-series supercomputer testbed. We also use libvirt and QEMU to manage and provision VMs directly on compute nodes, leveraging Ethernet-over-Aries network emulation. To our knowledge, this is the first known use of KVM on a true MPP supercomputer. We investigate the overhead of our solution using HPC benchmarks, evaluating both single-node performance and weak scaling of a 32-node virtual cluster. Overall, we find that single-node performance of our solution using KVM on a Cray is very efficient, with near-native performance. However, overhead increases by up to 20% as virtual cluster size increases, due to limitations of the Ethernet-over-Aries bridged network. Furthermore, we deploy Apache Spark with large data analysis workloads in a Virtual Cluster, effectively demonstrating how diverse software ecosystems can be supported by High Performance Virtual Clusters.

  10. Perprof-py: A Python Package for Performance Profile of Mathematical Optimization Software

    Directory of Open Access Journals (Sweden)

    Abel Soares Siqueira

    2016-04-01

    A very important area of research in the field of Mathematical Optimization is the benchmarking of optimization packages to compare solvers. During benchmarking, one usually collects a large amount of information such as CPU time, number of function evaluations, number of iterations, and much more. This information, if presented as tables, can be difficult to analyze and compare due to the large amount of data. Therefore tools to better process and understand optimization benchmark data have been developed. One of the most widespread tools is the Performance Profile graphic proposed by Dolan and Moré [2]. In this context, this paper describes perprof-py, a free/open source software that creates Performance Profile graphics. This software produces graphics in PDF using LaTeX with the PGF/TikZ [22] and PGFPLOTS [4] packages, in PNG using matplotlib [9], and in HTML using Bokeh [1]. Perprof-py can also be easily extended to be used with other plot libraries. It is implemented in Python 3 with support for internationalization, and is released under the General Public License Version 3 (GPLv3).
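
    For readers unfamiliar with the metric: a performance profile plots, for each solver, the fraction of problems it solves within a factor tau of the best solver's cost. A minimal matplotlib sketch of the idea (illustrative only, not perprof-py's implementation) is:

        import numpy as np
        import matplotlib.pyplot as plt

        def performance_profile(costs, labels):
            # costs: (problems x solvers) array of e.g. CPU times;
            # np.inf marks a solver failure on a problem.
            ratios = costs / np.min(costs, axis=1, keepdims=True)
            taus = np.sort(np.unique(ratios[np.isfinite(ratios)]))
            for s, name in enumerate(labels):
                rho = [np.mean(ratios[:, s] <= t) for t in taus]
                plt.step(taus, rho, where="post", label=name)
            plt.xlabel("performance ratio tau")
            plt.ylabel("fraction of problems solved")
            plt.legend()
            plt.show()

        # Two solvers timed on three problems (seconds, made-up numbers).
        performance_profile(np.array([[1.0, 2.0],
                                      [5.0, 2.5],
                                      [np.inf, 4.0]]),
                            ["solver A", "solver B"])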

  11. Software quality assurance in the 1996 performance assessment for the Waste Isolation Pilot Plant

    International Nuclear Information System (INIS)

    Froehlich, Gary K.; Ogden, Harvey C.; Byle, Kathleen A.

    2000-01-01

    The US Department of Energy (DOE) Waste Isolation Pilot Plant (WIPP), located in southeast New Mexico, is a deep geologic repository for the permanent disposal of transuranic waste generated by DOE defense-related activities. Sandia National Laboratories (SNL), in its role as scientific advisor to the DOE, is responsible for evaluating the long-term performance of the WIPP. This risk-based Performance Assessment (PA) is accomplished in part through the use of numerous scientific modeling codes, which rely for some of their inputs on data gathered during characterization of the site. The PA is subject to formal requirements set forth in federal regulations. In particular, the components of the calculation fall under the configuration management and software quality assurance aegis of the American Society of Mechanical Engineers (ASME) Nuclear Quality Assurance (NQA) requirements. This paper describes SNL's implementation of the NQA requirements regarding software quality assurance (SQA). The description of the implementation of SQA for a PA calculation addresses not only the interpretation of the NQA requirements but also roles, deliverables, and the resources necessary for effective implementation. Finally, examples are given which illustrate the effectiveness of SNL's SQA program, followed by a detailed discussion of lessons learned.

  12. Software quality assurance in the 1996 performance assessment for the Waste Isolation Pilot Plant

    International Nuclear Information System (INIS)

    Froehlich, G.K.; Ogden, H.C.; Byle, K.A.

    2000-01-01

    The US Department of Energy (DOE) Waste Isolation Pilot Plant (WIPP), located in southeast New Mexico, is a deep geologic repository for the permanent disposal of transuranic waste generated by DOE defense-related activities. Sandia National Laboratories (SNL), in its role as scientific advisor to the DOE, is responsible for evaluating the long-term performance of the WIPP. This risk-based Performance Assessment (PA) is accomplished in part through the use of numerous scientific modeling codes, which rely for some of their inputs on data gathered during characterization of the site. The PA is subject to formal requirements set forth in federal regulations. In particular, the components of the calculation fall under the configuration management and software quality assurance aegis of the American Society of Mechanical Engineers (ASME) Nuclear Quality Assurance (NQA) requirements. This paper describes SNL's implementation of the NQA requirements regarding software quality assurance (SQA). The description of the implementation of SQA for a PA calculation addresses not only the interpretation of the NQA requirements but also roles, deliverables, and the resources necessary for effective implementation. Finally, examples are given which illustrate the effectiveness of SNL's SQA program, followed by a detailed discussion of lessons learned.

  13. Performance evaluation of the zero-multipole summation method in modern molecular dynamics software.

    Science.gov (United States)

    Sakuraba, Shun; Fukuda, Ikuo

    2018-05-04

    The zero-multipole summation method (ZMM) is a cutoff-based method for calculating electrostatic interactions in molecular dynamics simulations, utilizing an electrostatic neutralization principle as a physical basis. Since the accuracy of the ZMM has been shown to be sufficient in previous studies, it is highly desirable to clarify its practical performance. In this paper, the performance of the ZMM is compared with that of the smooth particle mesh Ewald method (SPME), where both methods are implemented in the molecular dynamics software package GROMACS. Extensive performance comparisons against a highly optimized, parameter-tuned SPME implementation are performed for water systems of various sizes and two protein-water systems. We analyze in detail the dependence of the performance on the potential parameters and the number of CPU cores. Even though the ZMM uses a larger cutoff distance than the SPME does, the performance of the ZMM is comparable to or better than that of the SPME. This is because the ZMM does not require a time-consuming electrostatic convolution and because the ZMM gains short neighbor-list distances due to the smooth damping feature of the pairwise potential function near the cutoff length. We found, in particular, that the ZMM with quadrupole or octupole cancellation and no damping factor is an excellent candidate for the fast calculation of electrostatic interactions. © 2018 Wiley Periodicals, Inc.

  14. TOPAS 1 - construction and test of a scintillation counter hodoscope for the tagging of bremsstrahlung photons for the SAPHIR detector

    International Nuclear Information System (INIS)

    Merkel, R.

    1989-09-01

    The development of a tagging hodoscope for the SAPHIR detector at the stretcher ring ELSA in Bonn is described. The hodoscope covers the photon energy range up to Eγ = 2.175 GeV at an electron energy of E0 = 3.500 GeV. 24 scintillation counters are used for the determination of the photon energy, giving a resolution of ΔEγ = 25 MeV. The tagging method requires a good coincidence timing resolution τ between the tagging hodoscope and the detector for the photon-induced reactions in order to keep the accidental coincidences low. The timing information is given by 8 fast timing counters (40 mm thick), each covering 5 to 7 energy channels. Fluctuations of the timing signal which result from different impact locations on the timing counter, due to different light travelling distances, are corrected by the energy-defining counters. The timing component (8 timing counters) is completed and tested. The results of first measurements show an upper limit of σ = 250 psec for the resolution of 7 coincidences out of 45 possible channels in the tagging hodoscope. These results were obtained with a preliminary adjustment of the SAPHIR beam-line and with a not yet optimized signal-to-noise ratio in the extracted beam. We hope to obtain σ < 200 psec under optimized conditions. (orig.)

  15. A software package for evaluating the performance of a star sensor operation

    Science.gov (United States)

    Sarpotdar, Mayuresh; Mathew, Joice; Sreejith, A. G.; Nirmal, K.; Ambily, S.; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant

    2017-02-01

    We have developed a low-cost off-the-shelf component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from star patterns it records in the images. The star sensor implements a centroiding algorithm to find centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe the software package to evaluate the performance of these algorithms operating together as a single star sensor system. We simulate the ideal case where sky background and instrument errors are omitted, and a more realistic case where noise and camera parameters are added to the simulated images. We evaluate such performance parameters of the algorithms as attitude accuracy, calculation time, required memory, star catalog size, sky coverage, etc., and estimate the errors introduced by each algorithm. This software package is written for use in MATLAB. The testing is parametrized for different hardware parameters, such as the focal length of the imaging setup, the field of view (FOV) of the camera, angle measurement accuracy, distortion effects, etc., and can therefore be applied to evaluate the performance of such algorithms in any star sensor. For its hardware implementation on our StarSense, we are currently porting the code in the form of functions written in C. This is done keeping in view its easy implementation on any star sensor electronics hardware.
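
    As an illustration of the first of those three steps, a minimal intensity-weighted centroiding routine can be written in a few lines. This is a generic sketch in Python (the package itself is MATLAB, and the flight code is C), with a crude flood fill standing in for real star segmentation:

        import numpy as np

        def star_centroids(image, threshold):
            # Return (row, col) intensity-weighted centroids of bright blobs.
            mask = image > threshold
            seen = np.zeros_like(mask, dtype=bool)
            centroids = []
            for r0, c0 in zip(*np.nonzero(mask)):
                if seen[r0, c0]:
                    continue
                stack, pix = [(r0, c0)], []
                seen[r0, c0] = True
                while stack:                       # 4-connected flood fill
                    r, c = stack.pop()
                    pix.append((r, c))
                    for rr, cc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                        if (0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1]
                                and mask[rr, cc] and not seen[rr, cc]):
                            seen[rr, cc] = True
                            stack.append((rr, cc))
                w = np.array([image[p] for p in pix], dtype=float)
                rc = np.array(pix, dtype=float)
                centroids.append(tuple((rc * w[:, None]).sum(axis=0) / w.sum()))
            return centroids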

  16. Impact of Recent Hardware and Software Trends on High Performance Transaction Processing and Analytics

    Science.gov (United States)

    Mohan, C.

    In this paper, I survey briefly some of the recent and emerging trends in hardware and software features which impact high performance transaction processing and data analytics applications. These features include multicore processor chips, ultra large main memories, flash storage, storage class memories, database appliances, field programmable gate arrays, transactional memory, key-value stores, and cloud computing. While some applications, e.g., Web 2.0 ones, were initially built without traditional transaction processing functionality in mind, slowly system architects and designers are beginning to address such previously ignored issues. The availability, analytics and response time requirements of these applications were initially given more importance than ACID transaction semantics and resource consumption characteristics. A project at IBM Almaden is studying the implications of phase change memory on transaction processing, in the context of a key-value store. Bitemporal data management has also become an important requirement, especially for financial applications. Power consumption and heat dissipation properties are also major considerations in the emergence of modern software and hardware architectural features. Considerations relating to ease of configuration, installation, maintenance and monitoring, and improvement of total cost of ownership have resulted in database appliances becoming very popular. The MapReduce paradigm is now quite popular for large scale data analysis, in spite of the major inefficiencies associated with it.

  17. Simulation of pellet-cladding interaction with the Pleiades fuel performance software environment

    International Nuclear Information System (INIS)

    Michel, B.; Nonon, C.; Sercombe, J.; Michel, F.; Marelle, V.

    2013-01-01

    This paper focuses on the PLEIADES fuel performance software environment and its application to the modeling of pellet-cladding interaction (PCI). The PLEIADES platform has been under development for 10 yr; a unified software environment, including the multidimensional finite element solver CAST3M, has been used to develop eight computation schemes now in operation. Among the latter, the ALCYONE application is devoted to pressurized water reactor fuel rod behavior. This application provides a three-dimensional (3-D) model for a detailed analysis of fuel element behavior and enables validation by comparing simulation results with post-irradiation examination results (cladding residual diameter and ridges, dishing filling, pellet cracking, etc.). In recent years the 3-D computation scheme of the ALCYONE application has been enriched with a complete set of physical models to take into account the thermomechanical and chemical-physical behavior of the fuel element under irradiation. These models have been validated through the ALCYONE application on a large experimental database composed of approximately 400 study cases. The strong point of the ALCYONE application concerns the local approach to stress-corrosion-cracking rupture under PCI, which can be computed with the 3-D finite element solver. Further developments for PCI modeling in the PLEIADES platform are devoted to a new mesh refinement method for assessing stress-and-strain concentration (multigrid technique) and a new component for assessing fission product chemical recombination. (authors)

  18. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  19. Installing and Setting Up Git Software Tool on Windows

    Science.gov (United States)

    Learn how to set up the Git software tool on Windows for use with the Peregrine system. In this doc, we'll show you how to get Git installed on Windows 7, and how to get things set up on NREL's Peregrine system.

  20. Parameter definition using vibration prediction software leads to significant drilling performance improvements

    Energy Technology Data Exchange (ETDEWEB)

    Amorim, Dalmo; Hanley, Chris Hanley; Fonseca, Isaac; Santos, Juliana [National Oilwell Varco, Houston TX (United States); Leite, Daltro J.; Borella, Augusto; Gozzi, Danilo [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    The understanding and mitigation of downhole vibration has been a heavily researched subject in the oil industry, as it results in more expensive drilling operations: vibrations significantly diminish the amount of effective drilling energy available to the bit and generate forces that can push the bit or the Bottom Hole Assembly (BHA) off its concentric axis of rotation, producing high magnitude impacts with the borehole wall. In order to drill ahead, a sufficient amount of energy must be supplied by the rig to overcome the resistance of the drilling system, including the reactive torque of the system, drag forces, fluid pressure losses and energy dissipated by downhole vibrations, thereby providing the bit with the energy required to fail the rock. If the drill string enters resonant modes of vibration, not only does this decrease the amount of energy available to drill, it also increases the potential for catastrophic downhole equipment and drilling bit failures. In this sense, the mitigation of downhole vibrations results in faster, smoother, and cheaper drilling operations. A software tool using Finite Element Analysis (FEA) has been developed to provide a better understanding of downhole vibration phenomena in drilling environments. The software tool calculates the response of the drilling system at various input conditions, based on the design of the wellbore along with the geometry of the Bottom Hole Assembly (BHA) and the drill string. It identifies where undesired levels of resonant vibration will be driven by certain combinations of specific drilling parameters, and also which combinations of drilling parameters will result in lower levels of vibration, so that the fewest shocks, the highest penetration rate and the lowest cost per foot can be achieved. With the growing performance of personal computers, complex software systems modeling drilling vibrations using FEA have become accessible to a wider audience of field users, further complementing with real time

  1. Numerical verification of equilibrium chemistry software within nuclear fuel performance codes

    International Nuclear Information System (INIS)

    Piro, M.H.; Lewis, B.J.; Thompson, W.T.; Simunovic, S.; Besmann, T.M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing transport source terms, material properties, and boundary conditions in heat and mass transport modules. Consequently, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method called the Gibbs Criteria is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes. (author)

  2. Study of the performance of the data acquisition chain for BCM1F software upgrade

    Energy Technology Data Exchange (ETDEWEB)

    Hempel, Maria

    2011-05-15

    BCM1F, the Fast Beam Conditions Monitor, is a sub-detector of the CMS experiment at LHC. It monitors the beam halo and the collision product rates inside the CMS experiment. The data acquisition of BCM1F is independent from CMS. Major components of the BCM1F back-end are discriminators, ADCs, TDCs, look-up tables and a Veto module. In the thesis the performance of several components is investigated. For the TDC two different readout modes are compared, and the impact of a Ring Buffer in the readout software was investigated. For one discriminator the thresholds of all channels are investigated and offsets of about 10 mV are found. Data taken in the LHC runs with the TDC are presented and discussed. Also the application of BCM1F as a luminosity monitor is studied. (orig.)

  3. Study of the performance of the data acquisition chain for BCM1F software upgrade

    International Nuclear Information System (INIS)

    Hempel, Maria

    2011-05-01

    BCM1F, the Fast Beam Conditions Monitor, is a sub-detector of the CMS experiment at LHC. It monitors the beam halo and the collision product rates inside the CMS experiment. The data acquisition of BCM1F is independent from CMS. Major components of the BCM1F back-end are discriminators, ADCs, TDCs, look-up tables and a Veto module. In the thesis the performance of several components is investigated. For the TDC two different readout modes are compared, and the impact of a Ring Buffer in the readout software was investigated. For one discriminator the thresholds of all channels are investigated and offsets of about 10 mV are found. Data taken in the LHC runs with the TDC are presented and discussed. Also the application of BCM1F as a luminosity monitor is studied. (orig.)

  4. Automated load balancing in the ATLAS high-performance storage software

    CERN Document Server

    Le Goff, Fabrice; The ATLAS collaboration

    2017-01-01

    The ATLAS experiment collects proton-proton collision events delivered by the LHC accelerator at CERN. The ATLAS Trigger and Data Acquisition (TDAQ) system selects, transports and eventually records event data from the detector at several gigabytes per second. The data are recorded on transient storage before being delivered to permanent storage. The transient storage consists of high-performance direct-attached storage servers accounting for about 500 hard drives. The transient storage operates dedicated software in the form of a distributed multi-threaded application. The workload includes both CPU-demanding and IO-oriented tasks. This paper presents the original application threading model for this particular workload, discussing the load-sharing strategy among the available CPU cores. The limitations of this strategy were reached in 2016 due to changes in the trigger configuration involving a new data distribution pattern. We then describe a novel data-driven load-sharing strategy, designed to automatical...

  5. ATLAS High Level Calorimeter Trigger Software Performance for Cosmic Ray Events

    CERN Document Server

    Oliveira Damazio, Denis; The ATLAS collaboration

    2009-01-01

    The ATLAS detector is undergoing an intense commissioning effort with cosmic rays in preparation for the first LHC collisions next spring. Combined runs with all of the ATLAS subsystems are being taken in order to evaluate the detector performance. This is a unique opportunity also for the trigger system to be studied with different detector operation modes, such as different event rates and detector configurations. The ATLAS trigger starts with a hardware based system which tries to identify detector regions where interesting physics objects may be found (e.g. large energy depositions in the calorimeter system). An approved event will be further processed by more complex software algorithms at the second level, where detailed features are extracted (full detector granularity data for small portions of the detector is available). Events accepted at this level will be further processed at the so-called event filter level. Full detector data at full granularity is available for offline-like processing with complete calib...

  6. Comparison of N2O5 mixing ratios during NO3Comp 2007 in SAPHIR

    Directory of Open Access Journals (Sweden)

    A. W. Rollins

    2012-11-01

    N2O5 detection in the atmosphere has been accomplished using techniques developed during the last decade. Most techniques use a heated inlet to thermally decompose N2O5 to NO3, which can be detected either by cavity based absorption at 662 nm or by laser-induced fluorescence. In summer 2007, a large set of instruments capable of measuring NO3 mixing ratios were simultaneously deployed in the atmosphere simulation chamber SAPHIR in Jülich, Germany. Some of these instruments measured N2O5 mixing ratios either simultaneously or alternatively. Experiments focused on the investigation of potential interferences from, e.g., water vapour or aerosol, and on the investigation of the oxidation of biogenic volatile organic compounds by NO3. The comparison of N2O5 mixing ratios shows an excellent agreement between measurements of instruments applying different techniques (three cavity ring-down (CRDS) instruments, two laser-induced fluorescence (LIF) instruments). Datasets are highly correlated, as indicated by the square of the linear correlation coefficient, R2, whose values were larger than 0.96 for the entire datasets. N2O5 mixing ratios agree well within the combined accuracy of the measurements. Slopes of the linear regression range between 0.87 and 1.26 and intercepts are negligible. The most critical aspect of N2O5 measurements by cavity ring-down instruments is the determination of the inlet and filter transmission efficiency. Measurements here show that the N2O5 inlet transmission efficiency can decrease in the presence of high aerosol loads, and that frequent filter/inlet changing is necessary to quantitatively sample N2O5 in some environments. The analysis of data also demonstrates that a general correction for degrading filter transmission is not applicable for all conditions encountered during this campaign. Besides the effect of a gradual degradation of the inlet transmission efficiency with aerosol exposure, no other interference

  7. FMEA Performed on the SPINLINE3 Operational System Software as part of the TIHANGE 1 NIS Refurbishment Safety Case

    International Nuclear Information System (INIS)

    Ristord, L.; Esmenjaud, C.

    2002-01-01

    This paper introduces the SPINLINE3 technology and the TIHANGE 1 NIS project. It then focuses on the specificity of FMEA performed on software. It points out the benefits of this analysis and also some of its limitations and possible developments. It also gives characteristics that, if present in the software, help the analysis and the defenses. It takes as an example the analysis performed on the Operational System Software of SPINLINE3, the Schneider Electric safety digital generic platform. The new TIHANGE 1 Nuclear Instrumentation System successfully started operation at the beginning of March 2001 after the plant outage, as planned at the beginning of the project. The choice of a software-based technology raised the issue of the risk of common-cause failure (CCF) due to the same software being used in redundant independent units. Implementing functional diversity or equipment diversity was considered but found either not practicable or of little value within this context. The safety characteristics of the SPINLINE3 solution and the stringent and proven safety software development process applied by the Nuclear department of Schneider Electric made the principle of a design based on redundant identical processing units acceptable for this project. In addition, because of the possible consequences should the NIS not perform its protection function on demand, the licensing authority required an FMEA oriented toward the software common-cause failure (SCCF) risk as part of the safety case. This FMEA was performed on: - the NIS architecture, - the SPINLINE3 Operational System Software, - the three TIHANGE 1 application software packages (i.e. source, intermediate and power range). The process used and the results were elaborated by Schneider Electric and reviewed by the customer and the licensing authority all along the project development until final acceptance. Issues were raised and answers and/or complementary analyses provided, some of them making direct references to the

  8. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    Science.gov (United States)

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for the removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values were observed to corroborate well with the extensive experimental investigations, and were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
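
    The two agreement statistics quoted above are easy to reproduce from their usual definitions. The Python sketch below assumes the standard formulas for relative error and Willmott's index of agreement; the paper's exact variants may differ in detail:

        import numpy as np

        def relative_error(observed, predicted):
            # Mean absolute error relative to the observations.
            observed, predicted = np.asarray(observed), np.asarray(predicted)
            return float(np.mean(np.abs(observed - predicted) / np.abs(observed)))

        def willmott_d(observed, predicted):
            # Willmott's index of agreement, d in [0, 1]; 1 means perfect match.
            observed, predicted = np.asarray(observed), np.asarray(predicted)
            mean_o = observed.mean()
            num = np.sum((predicted - observed) ** 2)
            den = np.sum((np.abs(predicted - mean_o)
                          + np.abs(observed - mean_o)) ** 2)
            return float(1.0 - num / den)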

  9. Performance Analysis of Congestion Control Mechanism in Software Defined Network (SDN)

    Directory of Open Access Journals (Sweden)

    Rahman M. Z. A.

    2017-01-01

    In the near future, traditional network architectures will be difficult to manage. Hence, Software Defined Networking (SDN) will be an alternative in the future of programmable networks, replacing the conventional network architecture. The main idea of the SDN architecture is to separate the forwarding plane and the control plane of the network system, so that network operators can program packet forwarding behaviour to improve network performance. Congestion control is an important mechanism for network traffic to improve network capability and achieve high-end Quality of Service (QoS). In this paper, extensive simulation is conducted to analyse the performance of SDN by implementing the Link Layer Discovery Protocol (LLDP) under a congested network. The simulation was conducted on Mininet by creating four different fanouts, and the results were analysed based on differences in the performance metrics. As a result, packet loss and throughput reduction were observed when the fanout of the topology was increased. By using the LLDP protocol, a huge reduction in the packet loss rate was achieved while maximizing the packet delivery ratio.
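
    An experiment of this shape can be reproduced with Mininet's built-in tree topology. The sketch below is a hypothetical setup, assuming an external SDN controller listening on 127.0.0.1:6633 and root privileges (which Mininet requires); the depth and fanout values are illustrative:

        from mininet.net import Mininet
        from mininet.node import RemoteController
        from mininet.topolib import TreeTopo

        def run_fanout_test(fanout):
            # Depth-2 tree: a root switch fanning out to leaf switches/hosts.
            topo = TreeTopo(depth=2, fanout=fanout)
            net = Mininet(topo=topo,
                          controller=lambda name: RemoteController(
                              name, ip="127.0.0.1", port=6633))
            net.start()
            net.pingAll()                      # packet-loss check
            h1, h2 = net.hosts[0], net.hosts[-1]
            net.iperf((h1, h2))                # TCP throughput, edge to edge
            net.stop()

        if __name__ == "__main__":
            for fanout in (2, 3, 4, 5):        # four different fanouts
                run_fanout_test(fanout)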

  10. On integrating modeling software for application to total-system performance assessment

    International Nuclear Information System (INIS)

    Lewis, L.C.; Wilson, M.L.

    1994-05-01

    We examine the processes and methods used to facilitate collaboration in software development between two organizations at separate locations -- Lawrence Livermore National Laboratory (LLNL) in California and Sandia National Laboratories (SNL) in New Mexico. Our software development process integrated the efforts of these two laboratories. Software developed at LLNL to model corrosion and failure of waste packages and subsequent releases of radionuclides was incorporated as a source term into SNL's computer models for fluid flow and radionuclide transport through the geosphere.

  11. PaRSEC: A Software Framework for Performance and Productivity on Hybrid, Manycore Platforms

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, Jack [Univ. of Tennessee, Knoxville, TN (United States)

    2016-06-30

    As the era of computer architectures dominated by serial processors ends, the convergence of several unprecedented challenges suggests that closing the longstanding "application–architecture performance gap" will become more challenging than ever. To address this problem, the Parallel Runtime Scheduling and Execution Control (PaRSEC) project created a modular software framework that achieved two major objectives: first, it built a task-based runtime capable of delivering portable performance to a wide range of science and engineering applications at all levels of the platform pyramid, including the upcoming 100 Pflop/s systems and then exascale; and second, it supported and facilitated the work of developers in migrating their legacy codes and writing entirely new ones for the emerging hybrid and massively parallel manycore processor system designs. PaRSEC will support multiple domain-specific languages capable of increasing the developers' productivity while also providing the runtime with the constructs and flexibility necessary to exploit the maximal parallelism from parallel applications. Extensive preliminary research in dense linear algebra showed convincingly that a parameterized task graph representation that symbolically describes the algorithm content can achieve the project's twofold objective within that domain. The research also strongly suggested that this powerful method could be generalized to a far-wider variety of applications.

  12. Software Application Profile: PHESANT: a tool for performing automated phenome scans in UK Biobank.

    Science.gov (United States)

    Millard, Louise A C; Davies, Neil M; Gaunt, Tom R; Davey Smith, George; Tilling, Kate

    2017-10-05

    Epidemiological cohorts typically contain a diverse set of phenotypes such that automation of phenome scans is non-trivial, because they require highly heterogeneous models. For this reason, phenome scans have to date tended to use a smaller homogeneous set of phenotypes that can be analysed in a consistent fashion. We present PHESANT (PHEnome Scan ANalysis Tool), a software package for performing comprehensive phenome scans in UK Biobank. PHESANT tests the association of a specified trait with all continuous, integer and categorical variables in UK Biobank, or a specified subset. PHESANT uses a novel rule-based algorithm to determine how to appropriately test each trait, then performs the analyses and produces plots and summary tables. The PHESANT phenome scan is implemented in R. PHESANT includes a novel Javascript D3.js visualization and accompanying Java code that converts the phenome scan results to the required JavaScript Object Notation (JSON) format. PHESANT is available on GitHub at [https://github.com/MRCIEU/PHESANT]. Git tag v0.5 corresponds to the version presented here. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
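
    The flavour of such a rule-based dispatch is easy to caricature. The sketch below is hypothetical and in Python rather than PHESANT's R, with invented rules and labels; it only shows the idea of picking a model from the shape of each outcome variable:

        def choose_test(values):
            # Pick a regression model from the shape of one outcome column.
            # Invented rules, loosely in the spirit of a phenome-scan dispatch;
            # assumes a homogeneous column (all strings or all numbers).
            distinct = {v for v in values if v is not None}
            if all(isinstance(v, str) for v in distinct):
                return "multinomial logistic regression"
            if distinct <= {0, 1}:
                return "logistic regression"
            if all(float(v).is_integer() for v in distinct):
                if len(distinct) < 20:
                    return "ordered logistic regression"
                return "linear regression (treated as continuous)"
            return "linear regression"

        print(choose_test([0, 1, 1, 0]))        # logistic regression
        print(choose_test([1.3, 2.7, 0.4]))     # linear regression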

  13. Performance assessment of the commercial CFD software for the prediction of the PWR internal flow - Corrected version

    International Nuclear Information System (INIS)

    Lee, Gong Hee; Bang, Young Seok; Woo, Sweng Woong; Cheong, Ae Ju; Kim, Do Hyeong; Kang, Min Ku

    2013-01-01

    As computer hardware technology develops, license applicants for nuclear power plants use commercial CFD software with the aim of reducing the excessive conservatism associated with using simplified and conservative analysis tools. Even if some CFD software developers and users think that state-of-the-art CFD software can reasonably solve at least single-phase nuclear reactor safety problems, there are still limitations and uncertainties in the calculation results. From a regulatory perspective, the Korea Institute of Nuclear Safety (KINS) has been conducting a performance assessment of commercial CFD software for nuclear reactor safety problems. In this study, in order to examine the prediction performance of commercial CFD software with the porous model in the analysis of the scale-down APR+ (Advanced Power Reactor Plus) internal flow, simulation was conducted with the on-board numerical models in ANSYS CFX R.14 and FLUENT R.14. It was concluded that, depending on the CFD software, the internal flow distribution of the scale-down APR+ was locally somewhat different. Although there was a limitation in estimating the prediction performance of the commercial CFD software due to the limited number of measured data, CFX R.14 showed more reasonable predicted results in comparison with FLUENT R.14. Meanwhile, due to the difference in discretization methodology, FLUENT R.14 required more computational memory than CFX R.14 for the same grid system. Therefore the CFD software suitable to the available computational resources should be selected for massive parallel computations. (authors)

  14. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    Energy Technology Data Exchange (ETDEWEB)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing ‘blinds’, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those
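
    A toy version of such an empirical baseline model helps fix ideas: regress energy use on weather and schedule over the baseline period, then count avoided energy in the post-period. The sketch below assumes a plain linear model of daily energy versus outdoor temperature and an occupancy flag; real EMIS tools use richer change-point and time-of-week models, and all numbers are made up:

        import numpy as np

        def fit_baseline(temp, occupied, energy):
            # Least-squares fit: energy ~ 1 + temperature + occupancy flag.
            X = np.column_stack([np.ones(len(temp)), temp, occupied])
            coef, *_ = np.linalg.lstsq(X, energy, rcond=None)
            return coef

        def avoided_energy(coef, temp, occupied, metered):
            # Savings = baseline-predicted use minus metered use, summed.
            X = np.column_stack([np.ones(len(temp)), temp, occupied])
            return float(np.sum(X @ coef - metered))

        coef = fit_baseline(np.array([5.0, 15.0, 25.0, 30.0]),   # baseline period
                            np.array([1, 1, 0, 1]),
                            np.array([120.0, 90.0, 60.0, 110.0]))
        print(avoided_energy(coef, np.array([10.0, 20.0]),       # post-retrofit
                             np.array([1, 0]),
                             np.array([80.0, 50.0])))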

  15. Performance Test of Openflow Agent on Openflow Software-Based Mikrotik RB750 Switch

    Directory of Open Access Journals (Sweden)

    Rikie Kartadie

    2016-11-01

    A network is usually built from several devices such as routers, switches, etc. Every device forwards data packets, manipulating them according to complicated protocols implemented in its hardware. An operator is responsible for running the configuration, whether managing the rules or the applications applied in the network. Human error may occur when device configuration is run manually by the operator. Some famous vendors, one of them being MikroTik, have been implementing OpenFlow in their products; MikroTik provides an implementation of the SDN/OpenFlow architecture at affordable cost. The second-phase research results showed that the software-based OpenFlow (OF) switch on MikroTik produced higher latency values than both Mininet and the software-based OF switch on OpenWRT. The average gap value of the software-based OF MikroTik switch is 2012 kbps lower than that of the software-based OF OpenWRT switch. The average UDP throughput (bandwidth) gap of the software-based OF MikroTik switch is 3.6176 kBps lower than the software-based OF OpenWRT switch and 8.68 kBps lower than Mininet. The average UDP throughput (jitter) gap of the software-based OF MikroTik switch is 0.0103 ms lower than the software-based OF OpenWRT switch and 0.0093 ms lower than Mininet.

  16. Performance verification of network function virtualization in software defined optical transport networks

    Science.gov (United States)

    Zhao, Yongli; Hu, Liyazhou; Wang, Wei; Li, Yajie; Zhang, Jie

    2017-01-01

    With the continuous opening of resource acquisition and application, a large variety of network hardware appliances are deployed as communication infrastructure. Launching a new network application always implies replacing obsolete devices and requires the related space and power to accommodate them, which increases the energy and capital investment. Network function virtualization (NFV) aims to address these problems by consolidating many network equipment functions onto industry standard elements such as servers, switches and storage. Many types of IT resources have been deployed to run Virtual Network Functions (vNFs), such as virtual switches and routers. How to deploy NFV in optical transport networks is therefore a problem of great importance. This paper focuses on this problem and gives an implementation architecture of NFV-enabled optical transport networks based on Software Defined Optical Networking (SDON), with procedures for vNF call and return. In particular, an implementation solution for an NFV-enabled optical transport node is designed, and a parallel processing method for NFV-enabled OTN nodes is proposed. To verify the performance of NFV-enabled SDON, the protocol interaction procedures of control function virtualization and node function virtualization are demonstrated on an SDON testbed. Finally, the benefits and challenges of the parallel processing method for NFV-enabled OTN nodes are simulated and analyzed.

  17. Design of the Jet Performance Software for the ATLAS Experiment at LHC

    CERN Document Server

    Doglioni, C; The ATLAS collaboration; Loch, P; Perez, K; Vitillo, RA

    2011-01-01

    This paper describes the design and implementation of the JetFramework, a software tool developed for the data analysis of the ATLAS experiment at CERN. JetFramework is based on Athena, an object oriented framework for data processing. The JetFramework Athena package implements a configurable data-flow graph (DFG) to represent an analysis. Each node of the graph can perform some computation on one or more particle collections in input. A standard set of nodes to retrieve, filter, sort and plot collections is provided. Users can also implement their own computation units inheriting from a generic interface. The analysis graph can be declared and configured in an Athena options file. To provide the requested flexibility to configure nodes from a configuration file, a simple expression language permits specifying selection and plotting criteria. Viewing an analysis as an explicit DFG permits end-users to avoid writing code for repetitive tasks and to reuse user-defined computation units in other analysis...
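
    The data-flow-graph idea itself can be sketched generically. The snippet below is plain Python, not the Athena/JetFramework API: each node consumes named input collections from a shared store and writes one named output, and running an analysis is just an ordered evaluation of nodes:

        class Node:
            # One computation unit: pulls named inputs, stores a named output.
            def __init__(self, name, inputs, func):
                self.name, self.inputs, self.func = name, inputs, func

            def run(self, store):
                store[self.name] = self.func(*[store[i] for i in self.inputs])

        def run_graph(nodes, store):
            for node in nodes:            # assumes topological order
                node.run(store)
            return store

        # Tiny analysis: retrieve -> filter (pt cut) -> sort -> report.
        store = {"jets": [{"pt": 40.0}, {"pt": 12.0}, {"pt": 75.0}]}
        graph = [
            Node("hard_jets", ["jets"],
                 lambda js: [j for j in js if j["pt"] > 20.0]),
            Node("sorted_jets", ["hard_jets"],
                 lambda js: sorted(js, key=lambda j: j["pt"], reverse=True)),
            Node("leading_pt", ["sorted_jets"],
                 lambda js: js[0]["pt"] if js else None),
        ]
        print(run_graph(graph, store)["leading_pt"])   # 75.0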

  18. Minerva: using a software program to improve resident performance during independent call

    Science.gov (United States)

    Itri, Jason N.; Redfern, Regina O.; Cook, Tessa; Scanlon, Mary H.

    2010-03-01

    We have developed an application called Minerva that allows tracking of resident discrepancy rates and missed cases. Minerva mines the radiology information system (RIS) for preliminary interpretations provided by residents during independent call and copies both the preliminary and final interpretations to a database. Both versions are displayed for direct comparison by Minerva and classified as 'in agreement', 'minor discrepancy' or 'major discrepancy' by the resident program director. Minerva compiles statistics comparing minor, major and total discrepancy rates for individual residents relative to the overall group. Discrepant cases are categorized according to date, modality and body part and reviewed for trends in missed cases. The rate of minor, major and total discrepancies for residents on-call at our institution was similar to rates previously published, including a 2.4% major discrepancy rate for second year radiology residents in the DePICTORS study and a 2.6% major discrepancy rate for residents at a community hospital. Trend analysis of missed cases was used to generate a topic-specific resident missed case conference on acromioclavicular (AC) joint separation injuries, which resulted in a 75% decrease in the number of missed cases related to AC separation subsequent to the conference. Using a software program to track minor and major discrepancy rates for residents taking independent call, using modified RadPeer scoring guidelines, provides a competency-based metric to determine resident performance. Topic-specific conferences using the cases identified by Minerva can result in a decrease in missed cases.
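
    The statistics such a program compiles reduce to simple rate calculations over classified reports. A hypothetical sketch (field names and classification labels invented, not Minerva's schema) is:

        from collections import Counter

        def discrepancy_rates(cases):
            # cases: list of dicts like {"resident": "A", "class": "major"}.
            rates = {}
            for resident in {c["resident"] for c in cases}:
                mine = [c for c in cases if c["resident"] == resident]
                tally = Counter(c["class"] for c in mine)
                n = len(mine)
                rates[resident] = {
                    "minor": tally["minor"] / n,
                    "major": tally["major"] / n,
                    "total": (tally["minor"] + tally["major"]) / n,
                }
            return rates

        cases = [{"resident": "A", "class": "agreement"},
                 {"resident": "A", "class": "major"},
                 {"resident": "B", "class": "minor"},
                 {"resident": "B", "class": "agreement"}]
        print(discrepancy_rates(cases))   # A: 0.5 major; B: 0.5 minor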

  19. Contributions to large scale and performance tests of the ATLAS online software

    International Nuclear Information System (INIS)

    Badescu, E.; Caprini, M.

    2003-01-01

    One of the sub-systems of the Trigger/DAQ system of the future ATLAS experiment is the Online Software system. It encompasses the functionality needed to configure, control and monitor the DAQ. Its architecture is based on a component structure described in the ATLAS Trigger/DAQ technical proposal. Online Software is responsible for control, supervision and internal communication, excluding the event data flow. For the final ATLAS experiment in 2006 it is expected to control up to 1000 processors. The core components are the run control, process manager, configuration database, inter-process communication, message reporting system and information exchange system. The auxiliary components, namely the resource manager, online bookkeeper and the integrated graphical user interface, were in use for tests. All components are unit tested for functionality, fault tolerance, performance and scalability. Extended functionality tests are performed at CERN and remote institutes before each official release. The test objective was the verification of the scalability of the system to a configuration containing a large number of nodes. The aim was to study the interaction between the components, to identify critical areas and to investigate the variation and optimization of online system parameters. The timings of the data acquisition transition phases were recorded and analysed. The information on all processes and their relationships, the run control hierarchy in the online system, as well as startup and shutdown dependencies, are defined in the configuration database data file. Timing measurements were performed for the transitions shown in the paper and defined as follows: Setup: start online server infrastructure; Close: remove online infrastructure; Boot: start all supervised processes; Shutdown: stop all supervised processes; Cold start: start the supervised processes and go to the Running state; Cold stop: reverse of the cold start phase; Lukewarm start

  20. First measurement of the reactions γp→K+Λ and γp→K+Σ0 with SAPHIR at ELSA

    International Nuclear Information System (INIS)

    Lindemann, L.

    1993-04-01

    This report can be subdivided into two main parts. The first part concerns the reconstruction program which has been developed to analyse the data taken with the large solid angle detector SAPHIR, which is in operation at the Bonn electron accelerator facility ELSA. A survey of this program is given, and some improvements as well as its efficiency on real data are discussed. The second part concerns the measurements of the reactions γp→K+Λ and γp→K+Σ0. The analysis of a sample of data taken with SAPHIR in June 1992 is discussed in detail. As a result of this analysis, total and differential cross sections as well as the recoil polarization for the two processes are presented. In particular, the first measurement of the Σ0 polarization in photoproduction can be reported. (orig.)

  1. Application of optimization methods for nuclear energy system performance assessment by the MESSAGE software

    International Nuclear Information System (INIS)

    Andrianov, A.A.; Kuptsov, I.S.; Utyanskaya, T.V.

    2016-01-01

    This paper describes the multi-objective optimization and uncertainty treatment modules for the IAEA energy planning software MESSAGE, intended for multi-objective optimization and sustainability assessment of innovative nuclear energy systems with account taken of uncertainty.

  2. AGSuite: Software to conduct feature analysis of artificial grammar learning performance.

    Science.gov (United States)

    Cook, Matthew T; Chubala, Chrissy M; Jamieson, Randall K

    2017-10-01

    To simplify the problem of studying how people learn natural language, researchers use the artificial grammar learning (AGL) task. In this task, participants study letter strings constructed according to the rules of an artificial grammar and subsequently attempt to discriminate grammatical from ungrammatical test strings. Although the data from these experiments are usually analyzed by comparing the mean discrimination performance between experimental conditions, this practice discards information about the individual items and participants that could otherwise help uncover the particular features of strings associated with grammaticality judgments. However, feature analysis is tedious to compute, often complicated, and ill-defined in the literature. Moreover, the data violate the assumption of independence underlying standard linear regression models, leading to Type I error inflation. To solve these problems, we present AGSuite, a free Shiny application for researchers studying AGL. The suite's intuitive Web-based user interface allows researchers to generate strings from a database of published grammars, compute feature measures (e.g., Levenshtein distance) for each letter string, and conduct a feature analysis on the strings using linear mixed effects (LME) analyses. The LME analysis solves the inflation of Type I errors that afflicts more common methods of repeated measures regression analysis. Finally, the software can generate a number of graphical representations of the data to support an accurate interpretation of results. We hope the ease and availability of these tools will encourage researchers to take full advantage of item-level variance in their datasets in the study of AGL. We moreover discuss the broader applicability of the tools for researchers looking to conduct feature analysis in any field.
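
    As an illustration of one feature measure mentioned above, the following sketch computes the Levenshtein (edit) distance between two letter strings. AGSuite computes such measures internally; this is a stand-alone re-implementation of the standard algorithm, not AGSuite code.

        def levenshtein(a, b):
            """Minimum number of insertions, deletions and substitutions
            needed to turn string a into string b."""
            prev = list(range(len(b) + 1))
            for i, ca in enumerate(a, 1):
                curr = [i]
                for j, cb in enumerate(b, 1):
                    curr.append(min(prev[j] + 1,                 # deletion
                                    curr[j - 1] + 1,             # insertion
                                    prev[j - 1] + (ca != cb)))   # substitution
                prev = curr
            return prev[-1]

        print(levenshtein("MXRVXT", "MXRVVT"))  # 1: one substitution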

  3. Development and verification of a Leningrad NPP Unit 1 Living PSA model in the INL SAPHIRE code format for prompt operational safety level monitoring

    International Nuclear Information System (INIS)

    Bronislav, Vinnikov

    2007-01-01

    The first part of the paper presents the results of work that was carried out in complete conformity with the Technical Assignment developed by the Leningrad Nuclear Power Plant. The initial scientific and technical information, contained in the In-Depth Safety Assessment Reports, was given to the author. This information included graphical Fault Trees of Safety Systems and Auxiliary Technical Systems, Event Trees for the necessary number of Initiating Events, and information about the failure probabilities of the basic components of the nuclear unit. On the basis of this information, fed into the US Idaho National Laboratory (INL) SAPHIRE code, we developed an electronic version of the database of failure probabilities of the components of technical systems. We then developed electronic versions of the necessary Fault Trees and Event Trees, and finally carried out the linkage of the Event Trees. This work resulted in the Living PSA (LPSA - Living Probabilistic Safety Assessment) model of Leningrad NPP Unit 1. The LPSA model is fully adapted to be consistent with the US INL SAPHIRE Risk Monitor. The second part of the paper presents an analysis of the consequences of fires in various places of Leningrad NPP Unit 1. The computations were carried out with the help of the LPSA model developed in the SAPHIRE code format. On the basis of the computations, the order of priority for implementing fire prevention measures was established. (author)

  4. Performance Assessment of a Gnss-Based Troposphere Path Delay Estimation Software

    Science.gov (United States)

    Mariotti, Gilles; Avanzi, Alessandro; Graziani, Alberto; Tortora, Paolo

    2013-04-01

    perform the differentiation. The code relies on several IGS products, like SP3 precise orbits and SINEX positions available for the master stations in order to remove several error components, while the phase ambiguities (both wide and narrow lane) are resolved using the modified LAMBDA (MLAMBDA) method. The double-differenced data are then processed by a Kalman Filter that estimates the contingent positioning error of the rover station, its Zenith Wet Delay (ZWD) and the residual phase ambiguities. On the other hand, the Zenith Hydrostatic Delay (ZHD) is preliminarily computed using a mathematical model, based on surface meteorological measurements. The final product of the developed code is an output file containing the estimated ZWD and ZHD time-series in a format compatible with the major orbit determination software, e.g. the CSP card format (TRK-2-23) used by NASA JPL's Orbit Determination Program.
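
    For illustration, a commonly used surface-meteorology model for the ZHD is the Saastamoinen/Davis formula sketched below; the abstract does not name the exact model used, so treating it as this one is an assumption.

        import math

        def zhd_saastamoinen(pressure_hpa, lat_rad, height_km):
            """Zenith Hydrostatic Delay in meters from surface pressure (hPa),
            station latitude (rad) and station height (km)."""
            # Gravity correction term of the Saastamoinen/Davis model.
            f = 1.0 - 0.00266 * math.cos(2.0 * lat_rad) - 0.00028 * height_km
            return 0.0022768 * pressure_hpa / f

        print(zhd_saastamoinen(1013.25, math.radians(44.5), 0.05))  # ~2.31 m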

  5. Infrastructure for Multiphysics Software Integration in High Performance Computing-Aided Science and Engineering

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Michael T. [Illinois Rocstar LLC, Champaign, IL (United States); Safdari, Masoud [Illinois Rocstar LLC, Champaign, IL (United States); Kress, Jessica E. [Illinois Rocstar LLC, Champaign, IL (United States); Anderson, Michael J. [Illinois Rocstar LLC, Champaign, IL (United States); Horvath, Samantha [Illinois Rocstar LLC, Champaign, IL (United States); Brandyberry, Mark D. [Illinois Rocstar LLC, Champaign, IL (United States); Kim, Woohyun [Illinois Rocstar LLC, Champaign, IL (United States); Sarwal, Neil [Illinois Rocstar LLC, Champaign, IL (United States); Weisberg, Brian [Illinois Rocstar LLC, Champaign, IL (United States)

    2016-10-15

    The project described in this report constructed and exercised an innovative multiphysics coupling toolkit called the Illinois Rocstar MultiPhysics Application Coupling Toolkit (IMPACT). IMPACT is an open source, flexible, natively parallel infrastructure for coupling multiple uniphysics simulation codes into multiphysics computational systems. IMPACT works with codes written in several high-performance-computing (HPC) programming languages, and is designed from the beginning for HPC multiphysics code development. It is designed to be minimally invasive to the individual physics codes being integrated, and places few requirements on those physics codes for integration. The goal of IMPACT is to provide the support needed to enable coupling existing tools together in unique and innovative ways to produce powerful new multiphysics technologies without extensive modification and rewriting of the physics packages being integrated. There are three major outcomes from this project: 1) construction, testing, application, and open-source release of the IMPACT infrastructure, 2) production of example open-source multiphysics tools using IMPACT, and 3) identification and engagement of interested organizations in the tools and applications resulting from the project. This last outcome represents the incipient development of a user community and application ecosystem being built using IMPACT. Multiphysics coupling standardization can only come from organizations working together to define needs and processes that span the space of necessary multiphysics outcomes, which Illinois Rocstar plans to continue driving toward. The IMPACT system, including source code, documentation, and test problems, is now available through the public GitHub system to anyone interested in multiphysics code coupling. Many of the basic documents explaining the use and architecture of IMPACT are also attached as appendices to this document. Online HTML documentation is available through the GitHub site.

  6. Supporting motivation, task performance and retention in video tutorials for software training

    NARCIS (Netherlands)

    van der Meij, Hans; van der Meij, Jan; Voerman, Tessa; Duipmans, Evert

    2017-01-01

    Video tutorials for software training are becoming more and more popular, but their construction and effectiveness are understudied. This paper presents a theoretical model that combines demonstration-based training (DBT) and multimedia learning theory as a framework for design. The study

  7. Cross Sectional Study of Agile Software Development Methods and Project Performance

    Science.gov (United States)

    Lambert, Tracy

    2011-01-01

    Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…

  8. Development of a software application to evaluate the performance and energy losses of grid-connected photovoltaic systems

    International Nuclear Information System (INIS)

    Trillo-Montero, D.; Santiago, I.; Luna-Rodriguez, J.J.; Real-Calvo, R.

    2014-01-01

    Highlights: • Software application to perform an automated analysis of grid-connected PV systems. • It integrates data from all devices registering data on typical PV installations. • Flexible enough to analyze installations with different configurations and components. • An analysis of two grid-connected PV systems located in Andalusia was performed. • Temperature losses in summer months vary between 15% and 25% of energy production. - Abstract: The aim of this paper was to design and develop a software application that enables users to perform an automated analysis of data from the monitoring of grid-connected photovoltaic (PV) systems. This application integrates data from all devices already in operation, such as environmental sensors, inverters and meters, which record information on typical PV installations. This required the development of a Relational Database Management System (RDBMS), consisting of a series of linked databases, enabling all PV system information to be stored, and a software application, called S·lar, which automatically migrates all monitoring information to the database and determines standard magnitudes related to the performance and losses of PV installation components at different time scales. A visualization tool, which is both graphical and numerical, makes accessing all of the information a simple task. Moreover, the application enables relationships between parameters and/or magnitudes to be easily established. Furthermore, it can perform a preliminary analysis of the influence of PV installations on the distribution grids into which the produced electricity is injected. The operation of the software application was demonstrated by analyzing monitoring data from two grid-connected PV installations located in Andalusia, Spain. The monitoring took place from January 2011 to May 2012
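
    As an example of a standard magnitude such a tool derives, the sketch below computes the IEC 61724 performance ratio of a PV system from monitored AC energy, in-plane irradiation and nominal power. The formula is the textbook definition, not code from S·lar.

        def performance_ratio(e_ac_kwh, irradiation_kwh_m2, p_nominal_kwp):
            """PR = final yield Yf / reference yield Yr (IEC 61724)."""
            y_f = e_ac_kwh / p_nominal_kwp      # final yield, kWh/kWp
            y_r = irradiation_kwh_m2 / 1.0      # reference yield at 1 kW/m2 (STC)
            return y_f / y_r

        # A 3.3 kWp plant producing 420 kWh under 160 kWh/m2 of irradiation:
        print(performance_ratio(e_ac_kwh=420.0, irradiation_kwh_m2=160.0,
                                p_nominal_kwp=3.3))  # ~0.80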

  9. Free software for performing physical analysis of systems for digital radiography and mammography

    Energy Technology Data Exchange (ETDEWEB)

    Donini, Bruno; Lanconelli, Nico, E-mail: nico.lanconelli@unibo.it [Alma Mater Studiorum, Department of Physics and Astronomy, University of Bologna, Bologna 40127 (Italy); Rivetti, Stefano [Fisica Medica, Ospedale di Sassuolo S.p.A., Sassuolo 41049 (Italy); Bertolini, Marco [Medical Physics Unit, Azienda Ospedaliera ASMN, Istituto di Ricovero e Cura a Carattere Scientifico, Reggio Emilia 42123 (Italy)

    2014-05-15

    Purpose: In this paper, the authors present free software for assisting users in achieving the physical characterization of x-ray digital systems and performing image quality checks. Methods: The program was developed as a plugin of the well-known public-domain suite ImageJ. The software can assist users in calculating various physical parameters such as the response curve (also termed signal transfer property), modulation transfer function (MTF), noise power spectra (NPS), and detective quantum efficiency (DQE). It also includes the computation of some image quality checks: defective pixel analysis, uniformity, dark analysis, and lag. Results: The software was made available in 2009 and has been used during the last couple of years by many users who gave us valuable feedback for improving its usability. It was tested for achieving the physical characterization of several clinical systems for digital radiography and mammography. Various published papers made use of the outcomes of the plugin. Conclusions: This software is potentially beneficial to a variety of users: physicists working in hospitals and staff working in radiological departments, such as medical physicists, physicians, and engineers. The plugin, together with a brief user manual, is freely available online (http://www.medphys.it/downloads.htm). With our plugin users can estimate all three of the most important parameters used for physical characterization (MTF, NPS, and also DQE). The plugin can run on any operating system equipped with the ImageJ suite. The authors validated the software by comparing MTF and NPS curves on a common set of images with those obtained with other dedicated programs, achieving a very good agreement.

  12. AR2, a novel automatic muscle artifact reduction software method for ictal EEG interpretation: Validation and comparison of performance with commercially available software [version 2; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Shennan Aibel Weiss

    2017-04-01

    Objective: To develop a novel software method (AR2) for reducing muscle contamination of ictal scalp electroencephalogram (EEG), and to validate this method on the basis of its performance, in comparison to a commercially available software method (AR1), in accurately depicting seizure-onset location. Methods: A blinded investigation used 23 EEG recordings of seizures from 8 patients. Each recording was uninterpretable with digital filtering because of muscle artifact; recordings were processed using AR1 and AR2 and reviewed by 26 EEG specialists. EEG readers assessed seizure-onset time, lateralization, and region, and specified confidence for each determination. The two methods were validated on the basis of the number of readers able to render assignments, their confidence, the intra-class correlation (ICC), and agreement with other clinical findings. Results: Among the 23 seizures, two-thirds of the readers were able to delineate seizure-onset time in 10 of 23 using AR1, and in 15 of 23 using AR2 (p<0.01). Fewer readers could lateralize seizure-onset (p<0.05). The confidence measures of the assignments were low (probable-unlikely), but increased using AR2 (p<0.05). The ICC for identifying the time of seizure-onset was 0.15 (95% confidence interval (CI) 0.11-0.18) using AR1 and 0.26 (95% CI 0.21-0.30) using AR2. The EEG interpretations were often consistent with behavioral, neurophysiological, and neuro-radiological findings, with left-sided assignments correct in 95.9% (CI 85.7-98.9%, n=4) of cases using AR2, and 91.9% (CI 77.0-97.5%, n=4) of cases using AR1. Conclusions: EEG artifact reduction methods for localizing seizure-onset do not result in high rates of interpretability, reader confidence, and inter-reader agreement. However, the assignments by groups of readers are often congruent with other clinical data. Utilization of the AR2 software method may improve the validity of ictal EEG artifact reduction.

  13. The Performance Evaluation of Multi-Image 3d Reconstruction Software with Different Sensors

    Science.gov (United States)

    Mousavi, V.; Khosravi, M.; Ahmadi, M.; Noori, N.; Naveh, A. Hosseini; Varshosaz, M.

    2015-12-01

    Today, multi-image 3D reconstruction is an active research field, and generating three-dimensional models of objects is one of the most discussed issues in Photogrammetry and Computer Vision; it can be accomplished using range-based or image-based methods. The very accurate and dense point clouds generated by range-based methods such as structured light systems and laser scanners have established them as reliable tools in industry. Image-based 3D digitization methodologies offer the option of reconstructing an object from a set of unordered images that depict it from different viewpoints. As their hardware requirements are narrowed down to a digital camera and a computer system, they compose an attractive 3D digitization approach; consequently, although range-based methods are generally more accurate, image-based methods are low-cost and can be easily used by non-professional users. One of the factors affecting the accuracy of the obtained model in image-based methods is the software and algorithm used to generate the three-dimensional model. These algorithms are provided in the form of commercial software, open source software and web-based services. Another important factor in the accuracy of the obtained model is the type of sensor used. Given the availability of mobile sensors to the public, the popularity of professional sensors and the advent of stereo sensors, a comparison of these three sensor types plays an effective role in evaluating and finding the optimal method for generating three-dimensional models. Much research has been carried out to identify suitable software and algorithms to achieve an accurate and complete model; however, little attention has been paid to the type of sensor used and its effect on the quality of the final model. The purpose of this paper is the deliberation and introduction of an appropriate combination of a sensor and software to provide a complete model with the highest accuracy. To do this, different software, used in previous studies, were compared and

  14. Spectral analysis software improves confidence in plant and soil water stable isotope analyses performed by isotope ratio infrared spectroscopy (IRIS).

    Science.gov (United States)

    West, A G; Goldsmith, G R; Matimati, I; Dawson, T E

    2011-08-30

    Previous studies have demonstrated the potential for large errors to occur when analyzing waters containing organic contaminants using isotope ratio infrared spectroscopy (IRIS). In an attempt to address this problem, IRIS manufacturers now provide post-processing spectral analysis software capable of identifying samples with the types of spectral interference that compromise their stable isotope analysis. Here we report two independent tests of this post-processing spectral analysis software on two IRIS systems, OA-ICOS (Los Gatos Research Inc.) and WS-CRDS (Picarro Inc.). Following a similar methodology to a previous study, we cryogenically extracted plant leaf water and soil water and measured the δ²H and δ¹⁸O values of identical samples by isotope ratio mass spectrometry (IRMS) and IRIS. As an additional test, we analyzed plant stem waters and tap waters by IRMS and IRIS in an independent laboratory. For all tests we assumed that the IRMS value represented the "true" value against which we could compare the stable isotope results from the IRIS methods. Samples showing significant deviations from the IRMS value (>2σ) were considered to be contaminated and representative of spectral interference in the IRIS measurement. Over the two studies, 83% of plant species were considered contaminated on OA-ICOS and 58% on WS-CRDS. Post-analysis, spectra were analyzed using the manufacturers' spectral analysis software, in order to see if the software correctly identified contaminated samples. In our tests the software performed well, identifying all the samples with major errors. However, some false negatives indicate that user evaluation and testing of the software are necessary. Repeat sampling of plants showed considerable variation in the discrepancies between IRIS and IRMS. As such, we recommend that spectral analysis of IRIS data be incorporated into standard post-processing routines. Furthermore, we suggest that the results from spectral analysis be
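
    The contamination screen described above reduces to a simple rule; the sketch below illustrates it with hypothetical names, flagging a sample when the IRIS value deviates from the IRMS reference by more than 2σ of the analytical precision.

        def flag_contaminated(delta_iris, delta_irms, sigma):
            """True if the IRIS-IRMS deviation exceeds 2 sigma, i.e. likely
            spectral interference from organic contaminants."""
            return abs(delta_iris - delta_irms) > 2.0 * sigma

        # e.g. d18O: IRIS reads -6.1 permil, IRMS -7.0 permil, precision 0.2 permil
        print(flag_contaminated(-6.1, -7.0, 0.2))  # True -> flag as contaminated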

  15. 33rd International School of Mathematics "G. Stampacchia": High Performance Algorithms and Software for Nonlinear Optimization, at the "Ettore Majorana" Centre

    CERN Document Server

    Murli, Almerico

    2003-01-01

    This volume contains the edited texts of the lectures presented at the Workshop on High Performance Algorithms and Software for Nonlinear Optimization held in Erice, Sicily, at the "G. Stampacchia" School of Mathematics of the "E. Majorana" Centre for Scientific Culture, June 30 - July 8, 2001. In the first year of the new century, the aim of the Workshop was to assess the past and to discuss the future of Nonlinear Optimization, and to highlight recent achievements and promising research trends in this field. An emphasis was requested on algorithmic and high performance software developments and on new computational experiences, as well as on theoretical advances. We believe that such goal was basically achieved. The Workshop was attended by 71 people from 22 countries. Although not all topics were covered, the presentations gave indeed a wide overview of the field, from different and complementary standpoints. Besides the lectures, several formal and informal discussions took place. We wish ...

  16. Modeling and performance analysis for composite network–compute service provisioning in software-defined cloud environments

    Directory of Open Access Journals (Sweden)

    Qiang Duan

    2015-08-01

    The crucial role of networking in Cloud computing calls for a holistic vision of both networking and computing systems that leads to composite network–compute service provisioning. Software-Defined Network (SDN) is a fundamental advancement in networking that enables network programmability. SDN and software-defined compute/storage systems form a Software-Defined Cloud Environment (SDCE) that may greatly facilitate composite network–compute service provisioning to Cloud users. Therefore, networking and computing systems need to be modeled and analyzed as composite service provisioning systems in order to obtain a thorough understanding of service performance in SDCEs. In this paper, a novel approach for modeling composite network–compute service capabilities and a technique for evaluating composite network–compute service performance are developed. The analytic method proposed in this paper is general and agnostic to service implementation technologies; thus it is applicable to a wide variety of network–compute services in SDCEs. The results obtained in this paper provide useful guidelines for federated control and management of networking and computing resources to achieve Cloud service performance guarantees.
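
    As a toy illustration of composite network–compute service modeling (not the paper's analytic method), the sketch below approximates the end-to-end latency of a network stage followed by a compute stage as two M/M/1 queues in tandem; all rates are assumed parameters.

        def composite_latency(lam, mu_net, mu_comp):
            """lam: request arrival rate; mu_net, mu_comp: service rates (req/s).
            Returns mean end-to-end response time in seconds."""
            if lam >= min(mu_net, mu_comp):
                raise ValueError("unstable: arrival rate exceeds a stage capacity")
            return 1.0 / (mu_net - lam) + 1.0 / (mu_comp - lam)

        print(f"{composite_latency(lam=80.0, mu_net=120.0, mu_comp=100.0)*1e3:.1f} ms")
        # 75.0 ms = 25 ms in the network stage + 50 ms in the compute stage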

  17. Systems analysis programs for hands-on integrated reliability evaluations (SAPHIRE) version 5.0, technical reference manual

    International Nuclear Information System (INIS)

    Russell, K.D.; Atwood, C.L.; Galyean, W.J.; Sattison, M.B.; Rasmuson, D.M.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. This volume provides information on the principles used in the construction and operation of Version 5.0 of the Integrated Reliability and Risk Analysis System (IRRAS) and the System Analysis and Risk Assessment (SARA) system. It summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. This volume then describes the algorithms that these programs use to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that are appropriate under various assumptions concerning repairability and mission time. It defines the measures of basic event importance that these programs can calculate. This volume gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin Hypercube sampling, and states the algorithms used by these programs to generate random basic event probabilities from various distributions. Further references are given, and a detailed example of the reduction and quantification of a simple fault tree is provided in an appendix
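
    The quantification step described above can be illustrated with a short sketch: given minimal cut sets and independent basic-event probabilities, the top-event probability is approximated by the rare-event sum and the min-cut upper bound. This is a generic illustration of the standard formulas, not SAPHIRE code.

        from math import prod

        def top_event_probability(cut_sets, p):
            """cut_sets: iterable of sets of basic-event names;
            p: dict mapping basic-event name -> failure probability."""
            cut_probs = [prod(p[e] for e in cs) for cs in cut_sets]
            rare_event = sum(cut_probs)                     # rare-event approximation
            mcub = 1.0 - prod(1.0 - q for q in cut_probs)   # min-cut upper bound
            return rare_event, mcub

        p = {"PUMP-A": 3e-3, "PUMP-B": 3e-3, "VALVE-C": 1e-4}
        print(top_event_probability([{"PUMP-A", "PUMP-B"}, {"VALVE-C"}], p))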

  18. Intercomparison of measurements of NO2 concentrations in the atmosphere simulation chamber SAPHIR during the NO3Comp campaign

    Directory of Open Access Journals (Sweden)

    H. Fuchs

    2010-01-01

    NO2 concentrations were measured by various instruments during the NO3Comp campaign at the atmosphere simulation chamber SAPHIR at Forschungszentrum Jülich, Germany, in June 2007. Analytical methods included photolytic conversion with chemiluminescence (PC-CLD), broadband cavity ring-down spectroscopy (BBCRDS), pulsed cavity ring-down spectroscopy (CRDS), incoherent broadband cavity-enhanced absorption spectroscopy (IBBCEAS), and laser-induced fluorescence (LIF). All broadband absorption spectrometers were optimized for the detection of the main target species of the campaign, NO3, but were also capable of detecting NO2 simultaneously with reduced sensitivity. NO2 mixing ratios in the chamber were within a range characteristic of polluted, urban conditions, with a maximum mixing ratio of approximately 75 ppbv. The overall agreement between measurements of all instruments was excellent. Linear fits of the combined data sets resulted in slopes that differ from unity only within the stated uncertainty of each instrument. Possible interferences from species such as water vapor and ozone were negligible under the experimental conditions.

  19. Modeling of electronic power steering system for IKCO SAMAND vehicle and investigating its performance via CARSIM software

    Science.gov (United States)

    Haghgoo, Esmail; Zamani, Mohammad; Sharbati, Ali

    2017-02-01

    The point of this article is to introduce the use of an electronic power steering (EPS) system in the IKCO SAMAND vehicle and to investigate its benefits. The operation of the electronic steering system and its performance in the IKCO SAMAND vehicle are described. The optimization of IC engine efficiency and fuel consumption was simulated via the ADVISOR toolbox running in MATLAB. Usually, mechanical and hydraulic steering systems are produced in Iran; the mechanical type has not been widely accepted because of its many disadvantages. The hydraulic steering systems that have replaced the mechanical type have essentially the same features, with the difference that they include a hydraulic booster to facilitate the rotation of the steering wheel. Besides the advantages of hydraulic systems, there are some disadvantages, one of the most important being a reduction of engine output power. To recover this dissipated power, EPS systems are used. The output diagrams produced by the software show that an IKCO SAMAND vehicle equipped with an EPS system requires less torque and power at the steering wheel. This improves driver safety as well as vehicle performance at high speeds, and reduces fuel consumption while increasing the efficiency of the IC engine.

  20. The definitive analysis of the Bendandi's methodology performed with a specific software

    Science.gov (United States)

    Ballabene, Adriano; Pescerelli Lagorio, Paola; Georgiadis, Teodoro

    2015-04-01

    The presentation aims to clarify the "Bendandi method", supposed in the past to be capable of forecasting earthquakes, which the geophysicist from Faenza never explicitly disclosed to posterity. The geoethics implications of Bendandi's forecasts, and of the speculation about possible earthquakes inferred from supposed "Bendandian" methodologies, arose in previous years from the social alarm caused by predicted earthquakes that never happened but were widely spread by the media following some 'well informed' non-conventional scientists. The analysis was conducted through an extensive search of the 'Raffaele Bendandi' archive at the Geophysical Observatory of Faenza, and the forecasts were analyzed using specially developed software, called "Bendandiano Dashboard", that can reproduce the planetary configurations reported in the graphs made by the Italian geophysicist. This analysis should serve to clarify definitively what the basis of Bendandi's calculations was, as well as to prevent future unwarranted warnings issued on the basis of supposed prophecies and illusory legacy documents.

  1. The effects of exercise reminder software program on office workers' perceived pain level, work performance and quality of life.

    Science.gov (United States)

    Irmak, A; Bumin, G; Irmak, R

    2012-01-01

    In direct proportion to current technological developments, computer usage in the workplace has increased, and the need for an office worker to leave the desk in order to photocopy a document or send or receive an e-mail has decreased. Therefore, office workers stay in the same postures accompanied by long periods of keyboard usage. In recent years, with the intent to reduce the incidence of work-related musculoskeletal disorders, several exercise reminder software programs have been developed. The purpose of this study is to evaluate the effectiveness of an exercise reminder software program on office workers' perceived pain level, work performance and quality of life. 39 healthy office workers agreed to take part in the study. Participants were randomly split into two groups, a control group (n = 19) and an intervention group (n = 20). A Visual Analogue Scale (VAS) to evaluate perceived pain was administered to all participants at the beginning and at the end of the study. The intervention group used the program for 10 weeks. Findings showed that the control group VAS scores remained the same, but the intervention group VAS scores decreased in a statistically significant way. Exercise reminder software programs may help to reduce perceived pain among office workers. Further long-term studies with more subjects are needed to describe the effects of these programs and the mechanism underlying these effects.

  2. Assessing the performance of commercial Agisoft PhotoScan software to deliver reliable data for accurate 3D modelling

    Directory of Open Access Journals (Sweden)

    Jebur Ahmed

    2018-01-01

    3D models delivered by digital photogrammetric techniques have massively increased and developed to meet the requirements of many applications. The reliability of these models basically depends on the data processing cycle and the adopted tool solution, in addition to data quality. Agisoft PhotoScan is a professional image-based 3D modelling software package, which seeks to create orderly, precise 3D content from still images. It works with arbitrary images acquired under both controlled and uncontrolled conditions. Following the recommendations of many users around the globe, Agisoft PhotoScan has become an important tool for generating precise 3D data for different applications. How reliable these data are for accurate 3D modelling applications is the question that needs an answer. Therefore, in this paper, the performance of the Agisoft PhotoScan software was assessed and analyzed to show the potential of the software for accurate 3D modelling applications. To investigate this, a study was carried out in the University of Baghdad / Al-Jaderia campus using data collected from an airborne metric camera with a 457 m flying height. The Agisoft results show potential with respect to the research objective and the dataset quality, following statistical and validation shape analysis.

  3. Development of a software system for spatial resolved trace analysis of high performance materials with SIMS

    International Nuclear Information System (INIS)

    Brunner, Ch. H.

    1997-09-01

    The following work is separated into two distinctly different parts. The first part deals with the SIMSScan software project, an application system for secondary ion mass spectrometry. This application system primarily lays down the foundation for the research activity introduced in the second part of this work. SIMSScan is an application system designed to provide data acquisition routines for different requirements in the field of secondary ion mass spectrometry. The whole application package is divided into three major sections, each one dealing with specific measurement tasks. Various supporting clients and wizards, providing extended functionality to the main application, build the core of the software. The MassScan as well as the DepthScan module operate the SIMS in the direct imaging or stigmatic mode and feature the capabilities for mass spectrum recording or depth profile analysis. In combination with an image recording facility, the DepthScan module features the capability of spatially resolved material analysis - 3D SIMS. The RasterScan module operates the SIMS in scanning mode and supports a fiber-optic link for optimized data transfer. The primary goal of this work is to introduce the basic ideas behind the implementation of the main application modules and the supporting clients. Furthermore, the intention is to lay down the foundation for further developments. At the beginning, a short introduction is given into the paradigm of object-oriented programming as well as Windows programming. Besides explaining the basic ideas behind the Doc/View application architecture, the focus is mainly shifted to the routines controlling the SIMS hardware and the basic concepts of multithreaded programming. The elementary structure of the view and document objects is discussed in detail only for the MassScan module, because the ideas behind data abstraction and encapsulation are quite similar. The second part introduces the research activities

  4. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool (TUNE). The concept is applied to automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis

  5. Chemical analysis of particulate and gaseous products from the monoterpene oxidation in the SAPHIR chamber during the EUCAARI campaign 2008

    Science.gov (United States)

    Kahnt, A.; Iinuma, Y.; Herrmann, H.; Mentel, T. F.; Fisseha, R.; Kiendler-Scharr, A.

    2009-04-01

    The atmospheric oxidation of monoterpenes leads to multifunctional products with lower vapour pressure. These products condense onto and coagulate with existing particles, leading to particle formation and growth. In order to obtain better insights into the mechanisms and the importance of sources of organic aerosol, a mixture of monoterpenes was oxidised in the SAPHIR outdoor chamber during the EUCAARI campaign in 2008. The mixture was made of α-pinene, β-pinene, limonene, 3-carene and ocimene, representing typical monoterpene emissions from a boreal forest. In addition, two sesquiterpenes (α-farnesene and caryophyllene) were reacted together with the monoterpene mixture in some experiments. The VOC (volatile organic compound) mixture was reacted under tropospheric oxidation and light conditions over a prolonged time scale of two days. In the present study, special emphasis is put on the detection of carbonyl compounds in the off-line analysis of filter and denuder samples collected during the 2008 campaign. The oxidation products which contain carbonyl groups are important first stable intermediates of the monoterpene and sesquiterpene oxidation. They react further with atmospheric oxidants to form less volatile acidic compounds, contributing to secondary organic aerosol (SOA). Commonly used methods for the analysis of carbonyl compounds involve derivatisation steps prior to separation and subsequent UV or MS detection. In the present study, 2,4-dinitrophenylhydrazine (DNPH) was used to derivatise the extracted filter and denuder samples. DNPH converts aldehyde and keto groups to stable hydrazones, which can afterwards be purified using a solid phase extraction (SPE) cartridge. The derivatised samples were analysed with HPLC/ESI-TOFMS, which allowed us to determine the exact chemical formulae of unknown products. In addition to known carbonyl compounds from monoterpene oxidation such as pinonaldehyde and nopinone, previously unreported molecular masses

  6. Development of expert system software to improve performance of high-voltage arresters in substations

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Andre Nunes de; Oltremari, Anderson; Zago, Maria Goretti; Silva, Paulo Sergio da; Costa Junior, Pedro da; Ferraz, Kleber [Sao Paulo State Univ. (UNESP), Bauru, SP (Brazil). Lab. of Power Systems and Intelligent Techniques], E-mail: andrejau@feb.unesp.br; Gusmao, Euripedes Silva; Prado, Jose Martins [ELETRONORTE, MT (Brazil)], E-mail: euripedes.gusmao@eln.gov.br

    2007-07-01

    One of the main causes of interruptions and power outages on the energy distribution system in Brazil is lightning, which is also mainly responsible for the reduction of service life and the destruction of consumers' and utilities' equipment. As a way of improving the protection of the energy distribution system, utilities have focused on establishing maintenance techniques, both preventive and predictive, for the high-voltage arresters in substations. Currently, one of the main ways to obtain the characteristics of installed arresters involves the use of high-cost equipment, such as leakage current meters. This paper therefore aims to fulfill the need for reliable results obtained with lower-cost equipment, proposing an expert system software for diagnosis and decision support based on intelligent techniques, which makes possible the monitoring of service life and the identification of aged arresters, allowing the establishment of a reliable schedule for the removal of equipment, whether for maintenance or for substitution. (author)

  7. DnaSAM: Software to perform neutrality testing for large datasets with complex null models.

    Science.gov (United States)

    Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B

    2010-05-01

    Patterns of DNA sequence polymorphism can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference. Advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing a gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows both high-throughput processing of multiple sequence alignments and the flexibility to simulate data under complex demographic scenarios has been lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data, along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics with associated probability values under a user-specified null model, stored in an easy-to-manipulate text file. © 2009 Blackwell Publishing Ltd.
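
    As an illustration of one neutrality statistic of the kind DnaSAM reports, the sketch below computes Tajima's D from the sample size, the number of segregating sites and the average pairwise diversity. It is a generic implementation of the published formula, not DnaSAM code.

        import math

        def tajimas_d(n, S, pi):
            """n: number of sequences; S: segregating sites;
            pi: average pairwise nucleotide differences (counts, not per site)."""
            a1 = sum(1.0 / i for i in range(1, n))
            a2 = sum(1.0 / i**2 for i in range(1, n))
            b1 = (n + 1) / (3.0 * (n - 1))
            b2 = 2.0 * (n**2 + n + 3) / (9.0 * n * (n - 1))
            c1 = b1 - 1.0 / a1
            c2 = b2 - (n + 2) / (a1 * n) + a2 / a1**2
            e1 = c1 / a1
            e2 = c2 / (a1**2 + a2)
            return (pi - S / a1) / math.sqrt(e1 * S + e2 * S * (S - 1))

        print(tajimas_d(n=10, S=16, pi=3.88))  # negative -> excess of rare variants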

  8. Performance of Student Software Development Teams: The Influence of Personality and Identifying as Team Members

    Science.gov (United States)

    Monaghan, Conal; Bizumic, Boris; Reynolds, Katherine; Smithson, Michael; Johns-Boast, Lynette; van Rooy, Dirk

    2015-01-01

    One prominent approach in the exploration of the variations in project team performance has been to study two components of the aggregate personalities of the team members: conscientiousness and agreeableness. A second line of research, known as self-categorisation theory, argues that identifying as team members and the team's performance norms…

  9. High Performance Electrical Modeling and Simulation Software Normal Environment Verification and Validation Plan, Version 1.0; TOPICAL

    International Nuclear Information System (INIS)

    WIX, STEVEN D.; BOGDAN, CAROLYN W.; MARCHIONDO JR., JULIO P.; DEVENEY, MICHAEL F.; NUNEZ, ALBERT V.

    2002-01-01

    The requirements in modeling and simulation are driven by two fundamental changes in the nuclear weapons landscape: (1) The Comprehensive Test Ban Treaty and (2) The Stockpile Life Extension Program which extends weapon lifetimes well beyond their originally anticipated field lifetimes. The move from confidence based on nuclear testing to confidence based on predictive simulation forces a profound change in the performance asked of codes. The scope of this document is to improve the confidence in the computational results by demonstration and documentation of the predictive capability of electrical circuit codes and the underlying conceptual, mathematical and numerical models as applied to a specific stockpile driver. This document describes the High Performance Electrical Modeling and Simulation software normal environment Verification and Validation Plan

  10. Hardware Architecture of Polyphase Filter Banks Performing Embedded Resampling for Software-Defined Radio Front-Ends

    DEFF Research Database (Denmark)

    Awan, Mehmood-Ur-Rehman; Le Moullec, Yannick; Koch, Peter

    2012-01-01

    In this paper, we describe resource-efficient hardware architectures for software-defined radio (SDR) front-ends. These architectures are made efficient by using a polyphase channelizer that performs arbitrary sample rate changes, frequency selection, and bandwidth control. We discuss area, time, and power optimization for field programmable gate array (FPGA) based architectures in an M-path polyphase filter bank with modified N-path polyphase filter. Such systems allow resampling by arbitrary ratios while simultaneously performing baseband aliasing from center frequencies at Nyquist zones that are not multiples of the output sample rate. A non-maximally decimated polyphase filter bank, where the number of data loads is not equal to the number of M subfilters, processes M subfilters in a time period that is either less than or greater than the M data-load's time period. We present a load
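
    The M-path idea underlying such channelizers can be illustrated with a plain, maximally decimated polyphase decimator: the prototype filter is split into M subfilters, each running at the low output rate. This numpy sketch is a simplified illustration, not the paper's non-maximally decimated FPGA design.

        import numpy as np

        def decimate_direct(x, h, M):
            return np.convolve(x, h)[::M]            # filter, then keep every Mth

        def decimate_polyphase(x, h, M):
            L = -(-len(h) // M)                      # taps per branch (ceil)
            h = np.pad(h, (0, L * M - len(h)))       # pad filter to a multiple of M
            N = -(-(len(x) + len(h) - 1) // M)       # number of output samples
            y = np.zeros(N)
            for k in range(M):
                xk = np.concatenate((np.zeros(k), x))[::M]  # input phase x[nM - k]
                yk = np.convolve(xk, h[k::M])               # branch at the low rate
                y[: len(yk)] += yk[:N]
            return y

        rng = np.random.default_rng(0)
        x = rng.standard_normal(64)
        h = np.ones(8) / 8.0                         # toy lowpass prototype
        d = decimate_direct(x, h, 4)
        p = decimate_polyphase(x, h, 4)
        print(np.allclose(d, p[: len(d)]))           # True: identical outputs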

  11. Measuring the Impact of Language-Learning Software on Test Performance of Chinese Learners of English

    Science.gov (United States)

    Nicholes, Justin

    2016-01-01

    This classroom quasi-experiment aimed to learn if and to what degree supplementing classroom instruction with Rosetta Stone (RS), Tell Me More (TMM), Memrise (MEM), or ESL WOW (WOW) impacted high-stakes English test performance in areas of university-level writing, reading, speaking, listening, and grammar. Seventy-eight (N = 78) Chinese learners…

  12. Software and DVFS Tuning for Performance and Energy-Efficiency on Intel KNL Processors

    Directory of Open Access Journals (Sweden)

    Enrico Calore

    2018-06-01

    Energy consumption of processors and memories is quickly becoming a limiting factor in the deployment of large computing systems. For this reason, it is important to understand the energy performance of these processors and to study strategies allowing their use in the most efficient way. In this work, we focus on the computing and energy performance of the Knights Landing Xeon Phi, the latest Intel many-core architecture processor for HPC applications. We consider the 64-core Xeon Phi 7230 and profile its performance and energy efficiency using both its on-chip MCDRAM and the off-chip DDR4 memory as the main storage for application data. As a benchmark application, we use a lattice Boltzmann code heavily optimized for this architecture and implemented using several different arrangements of the application data in memory (data-layouts, in short). We also assess the dependence of energy consumption on data-layouts, memory configurations (DDR4 or MCDRAM) and the number of threads per core. We finally consider possible trade-offs between computing performance and energy efficiency, tuning the clock frequency of the processor using the Dynamic Voltage and Frequency Scaling (DVFS) technique.
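
    As an illustration of how such energy profiles can be gathered on Linux, the sketch below reads the standard powercap/RAPL energy counter around a workload. The paper's actual instrumentation is not specified here; the counter file requires read permission, exists only on supporting Intel platforms, and can wrap around on long runs.

        import time

        RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package 0 counter

        def measure_energy_j(workload, *args):
            """Run workload(*args) and return (joules, seconds) for package 0."""
            with open(RAPL) as f:
                e0 = int(f.read())
            t0 = time.time()
            workload(*args)
            t1 = time.time()
            with open(RAPL) as f:
                e1 = int(f.read())
            return (e1 - e0) / 1e6, t1 - t0   # microjoules -> joules; ignores wraparound

        joules, seconds = measure_energy_j(sum, range(10_000_000))
        print(f"{joules:.2f} J in {seconds:.2f} s -> {joules/seconds:.1f} W")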

  13. Integration of control and building performance simulation software by run-time coupling

    NARCIS (Netherlands)

    Yahiaoui, A.; Hensen, J.L.M.; Soethout, L.L.

    2003-01-01

    This paper presents the background, approach and initial results of a project, which aims to achieve better integrated building and systems control modeling in building performance simulation by runtime coupling of distributed computer programs. This paper focuses on one of the essential steps

  14. Reuse without Compromising Performance: Industrial Experience from RPG Software Product Line for Mobile Devices

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan

    2005-01-01

    allowed us to achieve improved performance, in both speed and memory utilization, as compared to each game developed individually. At the same time, our solution facilitated rapid development of new games for new mobile devices, as well as ease of evolving the RPG-PLA and custom games with new features

  15. Software for automated evaluation of technical and economic performance factors of nuclear power plant units

    International Nuclear Information System (INIS)

    Cvan, M.; Zadrazil, J.; Barnak, M.

    1989-01-01

    Computer codes TEP V2, TEP EDU and TEP V1 are used especially in the real-time evaluation of technical and economic performance factors of a power unit. Their basic functions include credibility filtering of measured input data, simultaneous calculation of flows of various types of energy, calculation of technical and economic factors, and listing and filing of the results. Code ZMEK is designed for making changes in the calculation-constants file for codes TEP V2 and TEP EDU. Code TEP DEN is used to produce the complete daily report on the technical and economic performance factors of the unit. The basic algorithms of credibility filtering for the measured quantities, the methodology of fundamental balances, and the method of guaranteeing the continuity of measurement are briefly described. Experience with the use of the codes is given, and trends in their future development are outlined. (J.B.). 5 refs.

  16. Cost Control and Performance Review of Software Projects by Using the Earned Value Management

    Directory of Open Access Journals (Sweden)

    Felician ALECU

    2014-08-01

    EVM (Earned Value Management) is a method that can be successfully used to measure the performance of a project from the cost and schedule points of view. Initially developed for US government programs in the 1960s, it later became an important feature of modern project management practice thanks to its simplicity and efficiency in signaling project anomalies in time. EVM has become extremely popular because it can be applied equally well to any project in any industry.
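
    For concreteness, the sketch below computes the four textbook EVM indicators from planned value, earned value and actual cost; the formulas are the standard definitions rather than anything specific to the article.

        def evm_indicators(pv, ev, ac):
            """pv: planned value, ev: earned value, ac: actual cost."""
            return {
                "SV": ev - pv,     # schedule variance
                "CV": ev - ac,     # cost variance
                "SPI": ev / pv,    # schedule performance index
                "CPI": ev / ac,    # cost performance index
            }

        # A project planned at $100k that earned $80k of value for $95k spent:
        print(evm_indicators(pv=100_000, ev=80_000, ac=95_000))
        # SPI 0.80 (behind schedule), CPI ~0.84 (over budget)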

  17. Investigation Into Informational Compatibility Of Building Information Modelling And Building Performance Analysis Software Solutions

    OpenAIRE

    Hyun, S.; Marjanovic-Halburd, L.; Raslan, R.

    2015-01-01

    There are significant opportunities for Building Information Modelling (BIM) to address issues related to sustainable and energy efficient building design. While the potential benefits associated with the integration of BIM and BPA (Building Performance Analysis) have been recognised, its specifications and formats remain in their early infancy and often fail to live up to the promise of seamless interoperability at various stages of design process. This paper conducts a case study to investi...

  18. Process is king: Evaluating the performance of technology-mediated learning in vocational software training

    OpenAIRE

    Söllner, Matthias; Bitzer, Philipp; Janson, Andreas; Leimeister, Jan Marco

    2017-01-01

    Technology-mediated learning (TML) is a major trend in education, since it allows to integrate the strengths of traditional- and IT-based learning activities. However, TML providers still struggle in identifying areas for improvement in their TML offerings. One reason for their struggles is inconsistencies in the literature regarding drivers of TML performance. Prior research suggests that these inconsistencies in TML literature might stem from neglecting the importance of considering the pro...

  19. A STUDY ON THE LINKAGE BETWEEN EMOTIONAL INTELLIGENCE AND JOB PERFORMANCE OF SOFTWARE PROFESSIONALS IN TAMILNADU

    OpenAIRE

    B. Rajkumar

    2018-01-01

    Emotional intelligence (EI), a recent construct which predicts various performance and leadership traits, helps companies to deploy a quality workforce. Emotional Intelligence (EI) has emerged as a theme of widespread interest in psychological research in recent years. It affects the day-to-day life of everyone. EI is the ability to recognize our own potential as well as to manage everything as the situation requires. At the workplace, emotions are mainly based on two perspectives, namely, sociological and psych...

  20. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  1. Optimized Architectural Approaches in Hardware and Software Enabling Very High Performance Shared Storage Systems

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    There are issues encountered in high performance storage systems that normally lead to compromises in architecture. Compute clusters tend to have compute phases followed by an I/O phase that must move data from the entire cluster in one operation. That data may then be shared by a large number of clients creating unpredictable read and write patterns. In some cases the aggregate performance of a server cluster must exceed 100 GB/s to minimize the time required for the I/O cycle thus maximizing compute availability. Accessing the same content from multiple points in a shared file system leads to the classical problems of data "hot spots" on the disk drive side and access collisions on the data connectivity side. The traditional method for increasing apparent bandwidth usually includes data replication which is costly in both storage and management. Scaling a model that includes replicated data presents additional management challenges as capacity and bandwidth expand asymmetrically while the system is scaled. ...

  2. Adoption of High Performance Computational (HPC) Modeling Software for Widespread Use in the Manufacture of Welded Structures

    Energy Technology Data Exchange (ETDEWEB)

    Brust, Frederick W. [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States)]; Punch, Edward F. [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States)]; Twombly, Elizabeth Kurth [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States)]; Kalyanam, Suresh [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States)]; Kennedy, James [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States)]; Hattery, Garty R. [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States)]; Dodds, Robert H. [Professional Consulting Services, Inc., Lisle, IL (United States)]; Mach, Justin C [Caterpillar, Peoria, IL (United States)]; Chalker, Alan [Ohio Supercomputer Center (OSC), Columbus, OH (United States)]; Nicklas, Jeremy [Ohio Supercomputer Center (OSC), Columbus, OH (United States)]; Gohar, Basil M [Ohio Supercomputer Center (OSC), Columbus, OH (United States)]; Hudak, David [Ohio Supercomputer Center (OSC), Columbus, OH (United States)]

    2016-12-30

    This report summarizes the final product developed for the US DOE Small Business Innovation Research (SBIR) Phase II grant made to Engineering Mechanics Corporation of Columbus (Emc2) between April 16, 2014 and August 31, 2016 titled ‘Adoption of High Performance Computational (HPC) Modeling Software for Widespread Use in the Manufacture of Welded Structures’. Many US companies have moved fabrication and production facilities off shore because of cheaper labor costs. A key aspect in bringing these jobs back to the US is the use of technology to render US-made fabrications more cost-efficient overall with higher quality. One significant advantage that has emerged in the US over the last two decades is the use of virtual design for fabrication of small and large structures in weld fabrication industries. Industries that use virtual design and analysis tools have reduced material part size, developed environmentally-friendly fabrication processes, improved product quality and performance, and reduced manufacturing costs. Indeed, Caterpillar Inc. (CAT), one of the partners in this effort, continues to have a large fabrication presence in the US because of the use of weld fabrication modeling to optimize fabrications by controlling weld residual stresses and distortions and improving fatigue, corrosion, and fracture performance. This report describes Emc2’s DOE SBIR Phase II final results to extend an existing, state-of-the-art software code, Virtual Fabrication Technology (VFT®), currently used to design and model large welded structures prior to fabrication - to a broader range of products with widespread applications for small and medium-sized enterprises (SMEs). VFT® helps control distortion, can minimize and/or control residual stresses, control welding microstructure, and pre-determine welding parameters such as weld-sequencing, pre-bending, thermal-tensioning, etc. VFT® uses material properties, consumable properties, etc. as inputs

  3. The Software Architecture for Performing Scientific Computation with the JLAPACK Libraries in ScalaLab

    Directory of Open Access Journals (Sweden)

    Stergios Papadimitriou

    2012-01-01

    Full Text Available Although LAPACK is a powerful library its utilization is difficult. JLAPACK, a Java translation obtained automatically from the Fortran LAPACK sources, retains exactly the same difficult to use interface of LAPACK routines. The MTJ library implements an object oriented Java interface to JLAPACK that hides many complicated details. ScalaLab exploits the flexibility of the Scala language to present an even more friendly and convenient interface to the powerful but complicated JLAPACK library. The article describes the interfacing of the low-level JLAPACK routines within the ScalaLab environment. This is performed rather easily by exploiting well suited features of the Scala language. Also, the paper demonstrates the convenience of using JLAPACK routines for linear algebra operations from within ScalaLab.
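
    The wrapping pattern described here is easy to illustrate outside Scala as well. A minimal sketch, by analogy, using SciPy's raw LAPACK bindings (an assumption chosen for brevity; the article itself concerns JLAPACK, MTJ and ScalaLab): the raw dgesv call exposes the LAPACK calling convention, and a small wrapper hides it, just as ScalaLab hides JLAPACK.

    ```python
    # Sketch of the "friendly wrapper over raw LAPACK" pattern, transposed
    # from ScalaLab/JLAPACK to SciPy's raw LAPACK bindings for illustration.
    import numpy as np
    from scipy.linalg.lapack import dgesv  # raw LAPACK-style routine

    def solve(a, b):
        """Hide the LAPACK calling convention behind a plain solve(a, b)."""
        lu, piv, x, info = dgesv(a, b)  # LU factors, pivots, solution, status
        if info != 0:
            raise np.linalg.LinAlgError(f"dgesv failed with info={info}")
        return x

    a = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([9.0, 8.0])
    print(solve(a, b))  # [2. 3.]
    ```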

  4. SafetyBarrierManager, a software tool to perform risk analysis using ARAMIS's principles

    DEFF Research Database (Denmark)

    Duijm, Nijs Jan

    2017-01-01

    The ARAMIS project resulted in a number of methodologies, dealing with among others: the development of standard fault trees and “bowties”; the identification and classification of safety barriers; and including the quality of safety management into the quantified risk assessment. After conclusion of the ARAMIS project, Risø National Laboratory started developing a tool that could implement these methodologies, leading to SafetyBarrierManager. The tool is based on the principles of “safety‐barrier diagrams”, which are very similar to “bowties”, with the possibility of performing quantitative analysis. The tool allows constructing comprehensive fault trees, event trees and safety‐barrier diagrams. The tool implements the ARAMIS idea of a set of safety barrier types, to which a number of safety management issues can be linked. By rating the quality of these management issues, the operational probability...
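
    As a rough illustration of the quantification idea described above (management-quality ratings modifying barrier failure probabilities along a bowtie path), consider the following sketch. The scaling model, factor values and function names are invented for illustration and are not SafetyBarrierManager's actual algorithm.

    ```python
    # Hypothetical sketch of the ARAMIS-style idea: safety-management quality
    # ratings scale each barrier's failure probability, and the barriers on
    # one bowtie path combine multiplicatively with the initiating frequency.

    def barrier_failure_prob(base_pfd, management_ratings, max_penalty=10.0):
        """Scale a barrier's baseline probability of failure on demand (PFD)
        by the average quality (0 = worst, 1 = best) of linked management issues."""
        avg_quality = sum(management_ratings) / len(management_ratings)
        penalty = 1.0 + (max_penalty - 1.0) * (1.0 - avg_quality)
        return min(1.0, base_pfd * penalty)

    def scenario_frequency(initiating_freq, barrier_pfds):
        """Frequency of the outcome in which all barriers on the path fail."""
        freq = initiating_freq
        for pfd in barrier_pfds:
            freq *= pfd
        return freq

    pfds = [barrier_failure_prob(1e-2, [0.8, 0.6]), barrier_failure_prob(1e-1, [0.9])]
    print(scenario_frequency(0.1, pfds))  # events per year, illustrative only
    ```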

  5. Characterization and photometric performance of the Hyper Suprime-Cam Software Pipeline

    Science.gov (United States)

    Huang, Song; Leauthaud, Alexie; Murata, Ryoma; Bosch, James; Price, Paul; Lupton, Robert; Mandelbaum, Rachel; Lackner, Claire; Bickerton, Steven; Miyazaki, Satoshi; Coupon, Jean; Tanaka, Masayuki

    2018-01-01

    The Subaru Strategic Program (SSP) is an ambitious multi-band survey using the Hyper Suprime-Cam (HSC) on the Subaru telescope. The Wide layer of the SSP is both wide and deep, reaching a detection limit of i ˜ 26.0 mag. At these depths, it is challenging to achieve accurate, unbiased, and consistent photometry across all five bands. The HSC data are reduced using a pipeline that builds on the prototype pipeline for the Large Synoptic Survey Telescope. We have developed a Python-based, flexible framework to inject synthetic galaxies into real HSC images, called SynPipe. Here we explain the design and implementation of SynPipe and generate a sample of synthetic galaxies to examine the photometric performance of the HSC pipeline. For stars, we achieve 1% photometric precision at i ˜ 19.0 mag and 6% precision at i ˜ 25.0 in the i band (corresponding to statistical scatters of ˜0.01 and ˜0.06 mag respectively). For synthetic galaxies with single-Sérsic profiles, forced CModel photometry achieves 13% photometric precision at i ˜ 20.0 mag and 18% precision at i ˜ 25.0 in the i band (corresponding to statistical scatters of ˜0.15 and ˜0.22 mag respectively). We show that both forced point spread function and CModel photometry yield unbiased color estimates that are robust to seeing conditions. We identify several caveats that apply to the version of the HSC pipeline used for the first public HSC data release (DR1) that need to be taken into consideration. First, the degree to which an object is blended with other objects impacts the overall photometric performance. This is especially true for point sources. Highly blended objects tend to have larger photometric uncertainties, systematically underestimated fluxes, and slightly biased colors. Secondly, >20% of stars at 22.5 21.5 mag.
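
    The paired flux precisions and magnitude scatters quoted above are linked by first-order error propagation through the magnitude definition; this is a small-error approximation, which is why the largest quoted pair (18%, ˜0.22 mag) deviates slightly:

    ```latex
    m = -2.5\,\log_{10} f + \mathrm{const}
    \quad\Rightarrow\quad
    \sigma_m \approx \frac{2.5}{\ln 10}\,\frac{\sigma_f}{f} \approx 1.086\,\frac{\sigma_f}{f}
    ```

    For example, a 6% flux precision corresponds to ≈0.065 mag and a 13% precision to ≈0.14 mag, matching the quoted scatters.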

  6. Subpixelic measurement of large 1D displacements: principle, processing algorithms, performances and software.

    Science.gov (United States)

    Guelpa, Valérian; Laurent, Guillaume J; Sandoz, Patrick; Zea, July Galeano; Clévy, Cédric

    2014-03-12

    This paper presents a visual measurement method able to sense 1D rigid body displacements with very high resolutions, large ranges and high processing rates. Sub-pixelic resolution is obtained thanks to a structured pattern placed on the target. The pattern is made of twin periodic grids with slightly different periods. The periodic frames are suited for Fourier-like phase calculations-leading to high resolution-while the period difference allows the removal of phase ambiguity and thus a high range-to-resolution ratio. The paper presents the measurement principle as well as the processing algorithms (source files are provided as supplementary materials). The theoretical and experimental performances are also discussed. The processing time is around 3 µs for a line of 780 pixels, which means that the measurement rate is mostly limited by the image acquisition frame rate. A 3-σ repeatability of 5 nm is experimentally demonstrated which has to be compared with the 168 µm measurement range.
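
    A stripped-down 1-D sketch of the measurement principle, assuming one-bin Fourier phase estimation per grid and beat-period disambiguation of the fringe order; the pixel-unit periods, sign convention and variable names are illustrative, and the published algorithms include refinements omitted here.

    ```python
    import numpy as np

    def grid_phase(profile, period):
        """Phase of a periodic grid in a 1-D intensity profile, via a
        single-frequency (one-bin) discrete Fourier projection."""
        x = np.arange(profile.size)
        return np.angle(np.dot(profile, np.exp(-2j * np.pi * x / period)))

    def twin_grid_displacement(ref, cur, p1, p2):
        """Displacement (pixels) between two profiles carrying twin grids
        of slightly different periods p1 < p2 (pixels)."""
        wrap = lambda a: (a + np.pi) % (2 * np.pi) - np.pi   # map to (-pi, pi]
        dphi1 = wrap(grid_phase(cur, p1) - grid_phase(ref, p1))
        dphi2 = wrap(grid_phase(cur, p2) - grid_phase(ref, p2))
        fine = -dphi1 * p1 / (2 * np.pi)           # high resolution, ambiguous mod p1
        beat = p1 * p2 / (p2 - p1)                 # large synthetic "beat" period
        coarse = -wrap(dphi1 - dphi2) * beat / (2 * np.pi)  # unambiguous in +-beat/2
        k = np.round((coarse - fine) / p1)         # integer fringe order
        return fine + k * p1

    # Toy check: a twin-grid profile shifted by 7.3 pixels.
    x = np.arange(780)
    pattern = lambda d: (np.cos(2 * np.pi * (x - d) / 10.0)
                         + np.cos(2 * np.pi * (x - d) / 11.0))
    print(twin_grid_displacement(pattern(0.0), pattern(7.3), 10.0, 11.0))  # ~7.3
    ```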

  7. Applying integrated software to optimize corporate production performance: a case study at Suncor

    International Nuclear Information System (INIS)

    Masse, L.P.; Rhynes, P.

    1997-01-01

    The feasibility and need to introduce a central database of basic well data for use in the petroleum industry in order to enhance production performance was discussed. Suncor developed a central database of well data as the foundation for a future systems architecture for its own use. The perceived, current and future benefits of such a system were described. Suncor identified the need for a corporate repository which is accessible to multiple applications, and provides the opportunity to upgrade the system to new technology that will benefit from integration. The objective was to document existing data sets, identify what additional data would be useful and document existing processes around this well data. The integrated set of data is supplied by multiple vendors and includes public land data, production budget, public well data, forecasting, economics, drilling, procurement system, fixed assets, maintenance, land administration, field data capture, production accounting and financial accounting. In addition to being able to access the current well data, significant added value is expected from the pro-active communication within the departments, and the additional time available for analysis and decisions as opposed to searching for data and comparing sources. 4 figs

  8. MEGADOCK 4.0: an ultra-high-performance protein-protein docking software for heterogeneous supercomputers.

    Science.gov (United States)

    Ohue, Masahito; Shimoda, Takehiro; Suzuki, Shuji; Matsuzaki, Yuri; Ishida, Takashi; Akiyama, Yutaka

    2014-11-15

    The application of protein-protein docking in large-scale interactome analysis is a major challenge in structural bioinformatics and requires huge computing resources. In this work, we present MEGADOCK 4.0, an FFT-based docking software that makes extensive use of recent heterogeneous supercomputers and shows powerful, scalable performance of >97% strong scaling. MEGADOCK 4.0 is written in C++ with OpenMPI and NVIDIA CUDA 5.0 (or later) and is freely available to all academic and non-profit users at: http://www.bi.cs.titech.ac.jp/megadock. akiyama@cs.titech.ac.jp Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
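
    The core trick behind FFT-based rigid docking of the kind MEGADOCK accelerates is the correlation theorem: the score of every relative translation of the two voxelized molecules comes from three FFTs instead of an explicit six-nested loop. Below is a toy, shape-only sketch under that assumption; MEGADOCK's actual score also folds electrostatics and desolvation into the grid values and repeats this over many ligand rotations across MPI ranks and GPUs.

    ```python
    import numpy as np

    def translation_scores(receptor, ligand):
        """Shape-complementarity score for every relative translation of two
        3-D voxel grids, via the FFT correlation theorem (Katchalski-Katzir)."""
        R = np.fft.fftn(receptor)
        L = np.fft.fftn(ligand)
        return np.fft.ifftn(R * np.conj(L)).real  # circular cross-correlation

    # Toy grids: molecular shapes encoded as weights on a common N^3 lattice.
    N = 32
    receptor = np.zeros((N, N, N)); receptor[10:20, 10:20, 10:20] = 1.0
    ligand = np.zeros((N, N, N)); ligand[2:6, 2:6, 2:6] = 1.0

    scores = translation_scores(receptor, ligand)
    best_shift = np.unravel_index(np.argmax(scores), scores.shape)
    print(best_shift)  # ligand translation that maximizes shape overlap
    ```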

  9. Experimental performance evaluation of software defined networking (SDN) based data communication networks for large scale flexi-grid optical networks.

    Science.gov (United States)

    Zhao, Yongli; He, Ruiying; Chen, Haoran; Zhang, Jie; Ji, Yuefeng; Zheng, Haomian; Lin, Yi; Wang, Xinbo

    2014-04-21

    Software defined networking (SDN) has become the focus in the current information and communication technology area because of its flexibility and programmability. It has been introduced into various network scenarios, such as datacenter networks, carrier networks, and wireless networks. Optical transport network is also regarded as an important application scenario for SDN, which is adopted as the enabling technology of data communication networks (DCN) instead of general multi-protocol label switching (GMPLS). However, the practical performance of SDN based DCN for large scale optical networks, which is very important for the technology selection in the future optical network deployment, has not been evaluated up to now. In this paper we have built a large scale flexi-grid optical network testbed with 1000 virtual optical transport nodes to evaluate the performance of SDN based DCN, including network scalability, DCN bandwidth limitation, and restoration time. A series of network performance parameters including blocking probability, bandwidth utilization, average lightpath provisioning time, and failure restoration time have been demonstrated under various network environments, such as with different traffic loads and different DCN bandwidths. The demonstration in this work can be taken as a proof for the future network deployment.

  10. Measurement of the reaction {gamma}p{yields}K{sup 0}{sigma}{sup +} for photon energies up to 2.65 GeV with the SAPHIR detector at ELSA; Messung der Reaktion {gamma}p {yields} K{sup 0}{sigma}{sup +} fuer Photonenergien bis 2.65 GeV mit dem SAPHIR-Detektor an ELSA

    Energy Technology Data Exchange (ETDEWEB)

    Lawall, R.

    2004-01-01

    The reaction {gamma}p {yields} K{sup 0}{sigma}{sup +} was measured with the SAPHIR-detector at ELSA during the run periods 1997 and 1998. Results were obtained for cross sections in the photon energy range from threshold up to 2.65 GeV for all production angles and for the {sigma}{sup +}-polarization. Emphasis has been put on the determination and reduction of the contributions of background reactions and the comparison with other measurements and theoretical predictions. (orig.)

  11. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland)]; Droutsa, K. [National Observatory of Athens, Athens (Greece)]; Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)]

    1999-11-01

    The support of the EPIQR method is a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenario and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  12. Computational Fluid Dynamics (CFD) Computations With Zonal Navier-Stokes Flow Solver (ZNSFLOW) Common High Performance Computing Scalable Software Initiative (CHSSI) Software

    National Research Council Canada - National Science Library

    Edge, Harris

    1999-01-01

    ...), computational fluid dynamics (CFD) 6 project. Under the project, a proven zonal Navier-Stokes solver was rewritten for scalable parallel performance on both shared memory and distributed memory high performance computers...

  13. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  14. Problem solving performance and learning strategies of undergraduate students who solved microbiology problems using IMMEX educational software

    Science.gov (United States)

    Ebomoyi, Josephine Itota

    The objectives of this study were as follows: (1) Determine the relationship between learning strategies and performance in problem solving, (2) Explore the role of a student's declared major on performance in problem solving, (3) Understand the decision making process of high and low achievers during problem solving. Participants (N = 65) solved problems using the Interactive Multimedia Exercise (IMMEX) software. All participants not only solved "Microquest," which focuses on cellular processes and mode of action of antibiotics, but also "Creeping Crud," which focuses on the cause, origin and transmission of diseases. Participants also responded to the "Motivated Strategy Learning Questionnaire" (MSLQ). Hierarchical multiple regression was used for analysis with GPA (grade point average) as a control. There were 49 (78.6%) that successfully solved "Microquest" while 52 (82.5%) successfully solved "Creeping Crud". Metacognitive self regulation strategy was significantly (p low achievers. Common strategies and attributes included metacognitive skills, writing to keep track, using prior knowledge. Others included elements of frustration/confusion and self-esteem problems. The implications for education and the relevance to real-life situations are discussed.

  15. Computational environment and software configuration management of the 1996 performance assessment for the Waste Isolation Pilot Plant

    International Nuclear Information System (INIS)

    Froehlich, Gary K.; Williamson, Charles Michael; Ogden, Harvey C.

    2000-01-01

    The US Department of Energy (DOE) Waste Isolation Pilot Plant (WIPP), located in southeast New Mexico, is a deep geologic repository for the permanent disposal of transuranic waste generated by DOE defense-related activities. Sandia National Laboratories (SNL), in its role as scientific advisor to the DOE, is responsible for evaluating the long-term performance of the WIPP. This risk-based Performance Assessment (PA) is accomplished in part through the use of numerous scientific modeling codes, which rely for some of their inputs on data gathered during characterization of the site. The PA is subject to formal requirements set forth in federal regulations. In particular, the components of the calculation fall under the configuration management and software quality assurance aegis of the American Society of Mechanical Engineers (ASME) Nuclear Quality Assurance (NQA) requirements. This paper describes SNL's implementation of the NQA requirements regarding configuration management. The complexity of the PA calculation is described, and the rationale for developing a flexible, robust run-control process is discussed. The run-control implementation is described, and its integration with the configuration-management system is then explained, to show how a calculation requiring 37,000 CPU-hours, and involving 225,000 output files totaling 95 Gigabytes, was accomplished in 5 months by 2 individuals, with full traceability and reproducibility

  17. Simulation calculations on the construction of the energy-tagged photon beam as well as development and test of the side drift chambers of the Bonn SAPHIR detector

    International Nuclear Information System (INIS)

    Jahnen, T.

    1990-01-01

    The SAPHIR-detector is built up at the continuous photon beam of the Electron Stretcher and Accelerator ELSA in Bonn. The equipment is designed for investigations of reactions with more than two particles in the final state and for photon energies up to 3.5 GeV. A tagging-system determines the energy of the bremsstrahlung photons and a set-up of five large drift chambers measures the tracks of the charged particles. This work describes a program which was used to develop the best design of the tagging-hodoscope. In a second part, the tests of the planar side-chambers and their evaluation are described. These measurements were carried out to fix the gas filling and the parameters of the best working point. It is shown that the chambers can reach a resolution of σ≤200 μm. (orig.) [de]

  18. Software Engineering Reviews and Audits

    CERN Document Server

    Summers, Boyd L

    2011-01-01

    Accurate software engineering reviews and audits have become essential to the success of software companies and military and aerospace programs. These reviews and audits define the framework and specific requirements for verifying software development efforts. Authored by an industry professional with three decades of experience, Software Engineering Reviews and Audits offers authoritative guidance for conducting and performing software first article inspections, and functional and physical configuration software audits. It prepares readers to answer common questions for conducting and perform

  19. A Study of Performance and Effort Expectancy Factors among Generational and Gender Groups to Predict Enterprise Social Software Technology Adoption

    Science.gov (United States)

    Patel, Sunil S.

    2013-01-01

    Social software technology has gained considerable popularity over the last decade and has had a great impact on hundreds of millions of people across the globe. Businesses have also expressed their interest in leveraging its use in business contexts. As a result, software vendors and business consumers have invested billions of dollars to use…

  20. Software Engineering for Portability.

    Science.gov (United States)

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programing; and suggestions for further research and…

  1. Investigation of the oxidation of methyl vinyl ketone (MVK) by OH radicals in the atmospheric simulation chamber SAPHIR

    Science.gov (United States)

    Fuchs, Hendrik; Albrecht, Sascha; Acir, Ismail-Hakki; Bohn, Birger; Breitenlechner, Martin; Dorn, Hans-Peter; Gkatzelis, Georgios I.; Hofzumahaus, Andreas; Holland, Frank; Kaminski, Martin; Keutsch, Frank N.; Novelli, Anna; Reimer, David; Rohrer, Franz; Tillmann, Ralf; Vereecken, Luc; Wegener, Robert; Zaytsev, Alexander; Kiendler-Scharr, Astrid; Wahner, Andreas

    2018-06-01

    The photooxidation of methyl vinyl ketone (MVK) was investigated in the atmospheric simulation chamber SAPHIR for conditions at which organic peroxy radicals (RO2) mainly reacted with NO (high NO case) and for conditions at which other reaction channels could compete (low NO case). Measurements of trace gas concentrations were compared to calculated concentration time series applying the Master Chemical Mechanism (MCM version 3.3.1). Product yields of methylglyoxal and glycolaldehyde were determined from measurements. For the high NO case, the methylglyoxal yield was (19 ± 3) % and the glycolaldehyde yield was (65 ± 14) %, consistent with recent literature studies. For the low NO case, the methylglyoxal yield reduced to (5 ± 2) % because other RO2 reaction channels that do not form methylglyoxal became important. Consistent with literature data, the glycolaldehyde yield of (37 ± 9) % determined in the experiment was not reduced as much as implemented in the MCM, suggesting additional reaction channels producing glycolaldehyde. At the same time, direct quantification of OH radicals in the experiments shows the need for an enhanced OH radical production at low NO conditions similar to previous studies investigating the oxidation of the parent VOC isoprene and methacrolein, the second major oxidation product of isoprene. For MVK the model-measurement discrepancy was up to a factor of 2. Product yields and OH observations were consistent with assumptions of additional RO2 plus HO2 reaction channels as proposed in literature for the major RO2 species formed from the reaction of MVK with OH. However, this study shows that also HO2 radical concentrations are underestimated by the model, suggesting that additional OH is not directly produced from RO2 radical reactions, but indirectly via increased HO2. Quantum chemical calculations show that HO2 could be produced from a fast 1,4-H shift of the second most important MVK derived RO2 species (reaction rate constant 0

  2. SIMPATIQCO: a server-based software suite which facilitates monitoring the time course of LC-MS performance metrics on Orbitrap instruments.

    Science.gov (United States)

    Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl

    2012-11-02

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
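
    The abstract states that acceptance ranges are learned from uploaded data "using robust statistics". One common way to do this, shown here as an assumption rather than SIMPATIQCO's documented algorithm, is a median ± k·MAD band, which a handful of bad historical runs cannot distort.

    ```python
    import numpy as np

    def robust_qc_band(history, k=3.0):
        """Acceptance band for one QC metric learned from past runs, using
        the median and the scaled median absolute deviation (MAD) so that a
        few outlier runs in the history do not distort the band."""
        history = np.asarray(history, dtype=float)
        center = np.median(history)
        mad = 1.4826 * np.median(np.abs(history - center))  # ~sigma if Gaussian
        return center - k * mad, center + k * mad

    peak_widths = [14.8, 15.1, 15.0, 14.9, 15.3, 22.0, 15.2]  # one bad run
    lo, hi = robust_qc_band(peak_widths)
    print(lo <= 15.4 <= hi)  # True: within normal performance
    print(lo <= 19.0 <= hi)  # False: flag this run for inspection
    ```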

  3. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  4. Novel, Highly-Parallel Software for the Online Storage System of the ATLAS Experiment at CERN: Design and Performances

    CERN Document Server

    Colombo, T; The ATLAS collaboration

    2012-01-01

    The ATLAS experiment observes proton-proton collisions delivered by the LHC accelerator at CERN. The ATLAS Trigger and Data Acquisition (TDAQ) system selects interesting events on-line in a three-level trigger system in order to store them at a budgeted rate of several hundred Hz, for an average event size of ~1.5 MB. This paper focuses on the TDAQ data-logging system and in particular on the implementation and performance of a novel software design, reporting on the effort of exploiting the full power of multi-core hardware. In this respect, the main challenge presented by the data-logging workload is the conflict between the largely parallel nature of the event processing, including the recently introduced on-line event-compression, and the constraint of sequential file writing and checksum evaluation. This is further complicated by the necessity of operating in a fully data-driven mode, to cope with continuously evolving trigger and detector configurations. In this paper we will briefly discuss...

  5. Performance evaluation of multi-stratum resources integrated resilience for software defined inter-data center interconnect.

    Science.gov (United States)

    Yang, Hui; Zhang, Jie; Zhao, Yongli; Ji, Yuefeng; Wu, Jialin; Lin, Yi; Han, Jianrui; Lee, Young

    2015-05-18

    Inter-data center interconnect with IP over elastic optical network (EON) is a promising scenario to meet the high burstiness and high-bandwidth requirements of data center services. In our previous work, we implemented multi-stratum resources integration among IP networks, optical networks and application stratums resources that allows to accommodate data center services. In view of this, this study extends to consider the service resilience in case of edge optical node failure. We propose a novel multi-stratum resources integrated resilience (MSRIR) architecture for the services in software defined inter-data center interconnect based on IP over EON. A global resources integrated resilience (GRIR) algorithm is introduced based on the proposed architecture. The MSRIR can enable cross stratum optimization and provide resilience using the multiple stratums resources, and enhance the data center service resilience responsiveness to the dynamic end-to-end service demands. The overall feasibility and efficiency of the proposed architecture is experimentally verified on the control plane of our OpenFlow-based enhanced SDN (eSDN) testbed. The performance of GRIR algorithm under heavy traffic load scenario is also quantitatively evaluated based on MSRIR architecture in terms of path blocking probability, resilience latency and resource utilization, compared with other resilience algorithms.

  6. Performance evaluation of data center service localization based on virtual resource migration in software defined elastic optical network.

    Science.gov (United States)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; Tan, Yuanlong; Lin, Yi; Han, Jianrui; Lee, Young

    2015-09-07

    Data center interconnection with elastic optical network is a promising scenario to meet the high burstiness and high-bandwidth requirements of data center services. In our previous work, we implemented cross stratum optimization of optical network and application stratums resources that allows to accommodate data center services. In view of this, this study extends the data center resources to user side to enhance the end-to-end quality of service. We propose a novel data center service localization (DCSL) architecture based on virtual resource migration in software defined elastic data center optical network. A migration evaluation scheme (MES) is introduced for DCSL based on the proposed architecture. The DCSL can enhance the responsiveness to the dynamic end-to-end data center demands, and effectively reduce the blocking probability to globally optimize optical network and application resources. The overall feasibility and efficiency of the proposed architecture are experimentally verified on the control plane of our OpenFlow-based enhanced SDN testbed. The performance of MES scheme under heavy traffic load scenario is also quantitatively evaluated based on DCSL architecture in terms of path blocking probability, provisioning latency and resource utilization, compared with other provisioning scheme.

  7. Automated detection of lung nodules in multidetector CT: influence of different reconstruction protocols on performance of a software prototype

    International Nuclear Information System (INIS)

    Gurung, J.; Maataoui, A.; Khan, M.; Wetter, A.; Harth, M.; Jacobi, V.; Vogl, T.J.

    2006-01-01

    Purpose: To evaluate the accuracy of software for computer-aided detection (CAD) of lung nodules using different reconstruction slice thickness protocols in multidetector CT. Materials and Methods: Raw image data sets for 15 patients who had undergone 16-row multidetector CT (MDCT) for known pulmonary nodules were reconstructed at a reconstruction thickness of 5.0, 2.0 and 1.0 mm with a reconstruction increment of 1.5, 1.0 and 0.5 mm, respectively. The "Nodule Enhanced Viewing" (NEV) tool of LungCare for computer-aided detection of lung nodules was applied to the reconstructed images. The reconstructed images were also blinded and then evaluated by 2 radiologists (A and B). Data from the evaluating radiologists and CAD were then compared to an independent reference standard established using the consensus of 2 independent experienced chest radiologists. The eligible nodules were grouped according to their size (diameter >10, 5 - 10, <5 mm) for assessment. Statistical analysis was performed using receiver operating characteristic (ROC) curve analysis, the t-test and the two-rater Cohen's Kappa coefficient. Results: A total of 103 nodules were included in the reference standard by the consensus panel. The performance of CAD was marginally lower than that of readers at a 5.0-mm reconstruction thickness (AUC = 0.522, 0.517 and 0.497 for A, B and CAD, respectively). In the case of 2.0-mm reconstruction slices, the performance of CAD was better than that of the readers (AUC = 0.524, 0.524 and 0.614 for A, B and CAD, respectively). CAD was found to be significantly superior to radiologists in the case of 1.0-mm reconstruction slices (AUC = 0.537, 0.531 and 0.675 for A, B and CAD, respectively). The sensitivity at a reconstruction thickness of 1.0 mm was determined to be 66.99%, 68.93% and 80.58% for A, B and CAD, respectively. The time required for detection was shortest for CAD at reconstruction slices of 1.0 mm (mean t = 4 min). The performance of radiologists was greatly

  8. Maintenance planning and performance software for valve packing programs at nuclear power stations (ValvePro Version 2.5)

    International Nuclear Information System (INIS)

    Hutcheson, N.D.

    1994-01-01

    ValvePro Version 2.5 for Windows was developed to help power plant maintenance personnel improve maintenance productivity and quality through a simple, attractive software program, which can be installed on personal computer systems in use at many utilities today. This paper explains the functions of this software and how it can be used by a maintenance organization as a foundation for a consistent, effective valve packing program utilizing sound packing principles

  9. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T. [Universities Space Research Association, Washington, DC (United States)]; Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States)]; Chen, M. [Yale Univ., New Haven, CT (United States)] [and others]

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  10. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  11. DNA Commission of the International Society for Forensic Genetics: Recommendations on the validation of software programs performing biostatistical calculations for forensic genetics applications.

    Science.gov (United States)

    Coble, M D; Buckleton, J; Butler, J M; Egeland, T; Fimmers, R; Gill, P; Gusmão, L; Guttman, B; Krawczak, M; Morling, N; Parson, W; Pinto, N; Schneider, P M; Sherry, S T; Willuweit, S; Prinz, M

    2016-11-01

    The use of biostatistical software programs to assist in data interpretation and calculate likelihood ratios is essential to forensic geneticists and part of the daily casework flow for both kinship and DNA identification laboratories. Previous recommendations issued by the DNA Commission of the International Society for Forensic Genetics (ISFG) covered the application of bio-statistical evaluations for STR typing results in identification and kinship cases, and this is now being expanded to provide best practices regarding validation and verification of the software required for these calculations. With larger multiplexes, more complex mixtures, and increasing requests for extended family testing, laboratories are relying more than ever on specific software solutions, and sufficient validation, training and extensive documentation are of utmost importance. Here, we present recommendations for the minimum requirements to validate bio-statistical software to be used in forensic genetics. We distinguish between developmental validation and the responsibilities of the software developer or provider, and the internal validation studies to be performed by the end user. Recommendations for the software provider address, for example, the documentation of the underlying models used by the software, validation data expectations, version control, implementation and training support, as well as continuity and user notifications. For the internal validations the recommendations include: creating a validation plan, requirements for the range of samples to be tested, Standard Operating Procedure development, and internal laboratory training and education. To ensure that all laboratories have access to a wide range of samples for validation and training purposes the ISFG DNA commission encourages collaborative studies and public repositories of STR typing results. Published by Elsevier Ireland Ltd.
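
    One concrete flavor of the internal (end-user) validation these recommendations call for is recomputing likelihood ratios by hand for trivial cases and comparing them with the software output. The sketch below uses the textbook single-source, single-locus identity LR; the allele labels and frequencies are invented for illustration.

    ```python
    # A simple end-user verification case: recompute a likelihood ratio by
    # hand for a trivial scenario and compare it with the software's output.
    # Allele frequencies below are illustrative, not from a real database.

    def single_locus_identity_lr(genotype, freqs):
        """LR for 'suspect is the donor' vs 'an unrelated person is the donor'
        for a matching single-source genotype: 1 / random match probability."""
        a, b = genotype
        rmp = freqs[a] ** 2 if a == b else 2 * freqs[a] * freqs[b]
        return 1.0 / rmp

    freqs = {"12": 0.15, "14": 0.08}
    print(single_locus_identity_lr(("12", "14"), freqs))  # ~41.7

    # Agreement with the software across a documented battery of such cases
    # would form part of the internal validation plan described above.
    ```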

  12. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  13. Software Prototyping

    Science.gov (United States)

    Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R.

    2016-01-01

    Summary Background Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. Objective To describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Methods Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Results Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included having 1) subject-matter experts heavily involved with the design; 2) flexibility to make rapid changes, 3) the ability to minimize software development efforts early in the design stage; 4) rapid finalization of requirements; 5) early visualization of designs; 6) and a powerful vehicle for communication of the design to the programmers. Challenges included 1) time and effort to develop the prototypes and case scenarios; 2) no simulation of system performance; 3) not having all proposed functionality available in the final product; and 4) missing needed data elements in the PCC information system. PMID:27081404

  14. Comparison of OH concentration measurements by DOAS and LIF during SAPHIR chamber experiments at high OH reactivity and low NO concentration

    Directory of Open Access Journals (Sweden)

    H. Fuchs

    2012-07-01

    Full Text Available During recent field campaigns, hydroxyl radical (OH) concentrations that were measured by laser-induced fluorescence (LIF) were up to a factor of ten larger than predicted by current chemical models for conditions of high OH reactivity and low NO concentration. These discrepancies, which were observed in forests and urban-influenced rural environments, are so far not entirely understood. In summer 2011, a series of experiments was carried out in the atmosphere simulation chamber SAPHIR in Jülich, Germany, in order to investigate the photochemical degradation of isoprene, methyl-vinyl ketone (MVK), methacrolein (MACR) and aromatic compounds by OH. Conditions were similar to those experienced during the PRIDE-PRD2006 campaign in the Pearl River Delta (PRD), China, in 2006, where a large difference between OH measurements and model predictions was found. During experiments in SAPHIR, OH was simultaneously detected by two independent instruments: LIF and differential optical absorption spectroscopy (DOAS). Because DOAS is an inherently calibration-free technique, DOAS measurements are regarded as a reference standard. The comparison of the two techniques was used to investigate potential artifacts in the LIF measurements for PRD-like conditions of OH reactivities of 10 to 30 s⁻¹ and NO mixing ratios of 0.1 to 0.3 ppbv. The analysis of twenty experiment days shows good agreement. The linear regression of the combined data set (averaged to the DOAS time resolution, 2495 data points) yields a slope of 1.02 ± 0.01 with an intercept of (0.10 ± 0.03) × 10⁶ cm⁻³ and a linear correlation coefficient of R² = 0.86. This indicates that the sensitivity of the LIF instrument is well-defined by its calibration procedure. No hints for artifacts are observed for isoprene, MACR, and different aromatic compounds. LIF measurements were approximately 30–40% (median) larger than those by DOAS after MVK (20 ppbv and

  15. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS? (OPEN SOURCE SOFTWARE, FREE SOFTWARE?)

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: the use of open source software. The use of open source software is spreading in step with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many conceptions of what open source software is, ranging from software that is free of charge to software that is unlicensed. Not all of the claims surrounding open source software are true; it is therefore necessary to introduce the concept of open source software, starting from its history, its licenses and how to choose among them, and the considerations involved in selecting from the open source software that is available. Keywords: License, Open Source, HAKI

  16. Software Epistemology

    Science.gov (United States)

    2016-03-01

    ...in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment... Acronyms from the report: LTS, Label Transition System; MUSE, Mining and Understanding Software Enclaves; RTEMS, Real-Time Executive for Multi-processor Systems; SaaS, Software as a Service; SSA, Static Single Assignment; SWE, Software Epistemology; UD/DU, Def-Use/Use-Def Chains (Dataflow Graph).

  17. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process

  18. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-10-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. The basics of microprogramming and new microcircuits have already been discussed. In this course, the methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogrammed circuit itself. 15 figures, 2 tables

  19. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  20. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited for applying to software architecture. • A template for failure modes on a specific software language is established. • A detailed-level software FMEA analysis on nuclear safety software is presented. - Abstract: A method of a software safety analysis is described in this paper for safety-related application software. The target software system is a software code installed at an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, at first, an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effect Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA analysis is carried out based on the so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) for the ATIP software. The software safety analysis by the software FMEA analysis, being applied to the ATIP software code, which has been integrated and passed through a very rigorous system test procedure, is proven to be able to provide very valuable results (i.e., software defects) that could not be identified during various system tests
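
    A rough sketch of the failure-mode-template idea described above: each function-block type appearing in the FBD program maps to a reusable list of generic failure modes, and the FMEA worksheet is generated by walking the program. The block types, failure modes and field names below are invented for illustration, not taken from the paper.

    ```python
    # Sketch of a failure-mode template keyed by FBD function-block type;
    # the worksheet rows are produced by enumerating (block, failure mode)
    # pairs, and the analyst then fills in causes and effects per row.

    FAILURE_MODE_TEMPLATE = {
        "AND": ["output stuck TRUE", "output stuck FALSE"],
        "TIMER": ["never expires", "expires early", "expires late"],
        "COMPARE": ["wrong setpoint used", "inverted result"],
    }

    program_blocks = [
        {"id": "B01", "type": "COMPARE", "function": "trip level check"},
        {"id": "B02", "type": "TIMER", "function": "trip delay"},
        {"id": "B03", "type": "AND", "function": "trip vote"},
    ]

    def fmea_worksheet(blocks, template):
        """Enumerate (block, failure mode) pairs for analyst review."""
        for blk in blocks:
            for mode in template.get(blk["type"], ["unspecified failure"]):
                yield {"block": blk["id"], "function": blk["function"], "mode": mode}

    for row in fmea_worksheet(program_blocks, FAILURE_MODE_TEMPLATE):
        print(row)
    ```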

  1. Science and Software

    Science.gov (United States)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  2. CONTRIBUTION TO THE DEVELOPMENT OF A SIMULATION SOFTWARE PERFORMANCE AND SHARING RATIO IN LIQUID-LIQUID EXTRACTION

    Directory of Open Access Journals (Sweden)

    A. Hadj Seyd

    2015-07-01

    Full Text Available The present work develops software to predict the yield and the distribution coefficient in the liquid-liquid extraction of the components of a mixture, from mathematical models expressing these quantities based on the equilibrium equations between the liquid phases, and to predict the conditions under which the extraction operation is favorable, unfavorable or impossible to realize, by studying how these quantities vary with the parameters influencing the extraction, namely the initial concentrations, the solvent ratio and the pH, both for simple extraction (of neutral products) and for reactive extraction (of complexed acids or bases) of one or more components. The programming language used is "Delphi", a very powerful object-oriented programming language for Windows.
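
    For the simple case mentioned above (single-stage extraction of a neutral solute), the two quantities the software predicts are related by the standard equilibrium expressions; this is a textbook identity, not the article's full reactive-extraction model:

    ```latex
    K_D = \frac{C_{\mathrm{org}}}{C_{\mathrm{aq}}},
    \qquad
    E = \frac{K_D\,(V_{\mathrm{org}}/V_{\mathrm{aq}})}{1 + K_D\,(V_{\mathrm{org}}/V_{\mathrm{aq}})}
    ```

    Here K_D is the distribution coefficient, E the single-stage extraction yield, and V_org/V_aq the solvent-to-feed volume ratio; in the reactive case, K_D is replaced by a pH-dependent effective distribution ratio, which is how pH enters as an input parameter.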

  3. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

      Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application.  Traditional forms of technical education pay little attention to creativity - often encouraging overly...

  4. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  5. Thyroid uptake software

    International Nuclear Information System (INIS)

    Alonso, Dolores; Arista, Eduardo

    2003-01-01

    The DETEC-PC software was developed as a complement to a measurement system (hardware) able to perform iodine thyroid uptake studies. The software was designed according to the principles of object-oriented programming, using the C++ language. The software automatically sets spectrometric measurement parameters and, besides patient measurement, also performs statistical analysis of batches of samples. It includes a PARADOX database with all information on measured patients and a help system covering the system options and medical concepts related to the thyroid uptake study.

  6. Technical Note: Development and performance of a software tool for quality assurance of online replanning with a conventional Linac or MR-Linac

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Guang-Pei, E-mail: gpchen@mcw.edu; Ahunbay, Ergun; Li, X. Allen [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin 53226 (United States)

    2016-04-15

    Purpose: To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from the treatment planning system (TPS) to the record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without the presence of a magnetic field from an MR-Linac, and validating the consistency of the delivery record with the plan. Methods: The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose–volume histograms. Beams in the TPS and R&V system are matched based on geometry configurations. To consider the effect of a 1.5 T transverse magnetic field from the MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA has been used in the authors' clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency of plan checks, including plan quality, data transfer, and delivery checks, can be improved by at least 60%. The newly developed independent MU calculation tool for the MR-Linac reduces the difference between the planned and calculated MUs by 10%. Conclusions: The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with a conventional Linac or MR-Linac and is an essential tool for online replanning where the QA check needs to be performed rapidly.

  7. Technical Note: Development and performance of a software tool for quality assurance of online replanning with a conventional Linac or MR-Linac.

    Science.gov (United States)

    Chen, Guang-Pei; Ahunbay, Ergun; Li, X Allen

    2016-04-01

    To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from the treatment planning system (TPS) to the record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without the presence of a magnetic field from an MR-Linac, and validating the consistency of the delivery record with the plan. The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose-volume histograms. Beams in the TPS and R&V system are matched based on geometry configurations. To consider the effect of a 1.5 T transverse magnetic field from the MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. ArtQA has been used in the authors' clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency of plan checks, including plan quality, data transfer, and delivery checks, can be improved by at least 60%. The newly developed independent MU calculation tool for the MR-Linac reduces the difference between the planned and calculated MUs by 10%. The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with a conventional Linac or MR-Linac and is an essential tool for online replanning where the QA check needs to be performed rapidly.
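
    The core consistency check such a tool performs can be pictured as a tolerance-based comparison of per-beam parameters from the two databases. The sketch below is not ArtQA's code; the field names and tolerances are invented for illustration:

        # Hypothetical beam records as exported from the TPS and the R&V
        # system; the field names are illustrative, not ArtQA's schema.
        tps_beam = {"gantry_deg": 180.0, "collimator_deg": 0.0, "mu": 124.3,
                    "energy_mv": 6.0, "ssd_cm": 92.1}
        rv_beam  = {"gantry_deg": 180.0, "collimator_deg": 0.0, "mu": 124.5,
                    "energy_mv": 6.0, "ssd_cm": 92.1}

        TOLERANCE = {"mu": 0.5, "gantry_deg": 0.1, "collimator_deg": 0.1,
                     "ssd_cm": 0.2, "energy_mv": 0.0}

        def compare(tps, rv, tol):
            """Flag fields whose TPS and R&V values differ beyond tolerance."""
            return {k: (tps[k], rv[k]) for k in tps
                    if abs(tps[k] - rv[k]) > tol.get(k, 0.0)}

        mismatches = compare(tps_beam, rv_beam, TOLERANCE)
        print("transfer OK" if not mismatches else f"deviations: {mismatches}")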

  8. Performance evaluation of a real time OFDM radio over fiber system at 2.5 GHz using software defined radio (SDR)

    DEFF Research Database (Denmark)

    David Cepeda, Juan; Rodriguez, Santiago Isaac; Rico-Martinez, Monica

    2017-01-01

    This paper presents the implementation of an OFDM radio over fiber (RoF) system at 2.5 GHz using software defined radio (SDR). In this work, we first present an introduction to the main concepts of radio over fiber and an orthogonal frequency-division multiplexing (OFDM) system at 2.5 GHz, then we present a comparison of an OFDM RoF system in three scenarios, modifying the wireless distances and the optical fiber distance in order to evaluate the performance of the system, taking into account the symbol error rate (SER) vs. signal to noise ratio (SNR) curves.
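
    The SER-vs-SNR curves mentioned above can be reproduced numerically for an idealized AWGN channel. The following sketch simulates a baseband QPSK-OFDM link and is only a stand-in for the paper's SDR and fiber measurements:

        import numpy as np

        rng = np.random.default_rng(0)
        n_sc, n_sym = 64, 2000                       # subcarriers, OFDM symbols

        bits = rng.integers(0, 2, 2 * n_sc * n_sym)
        # Gray-mapped QPSK with unit symbol energy
        syms = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)
        syms = syms.reshape(n_sym, n_sc)
        tx = np.fft.ifft(syms, axis=1) * np.sqrt(n_sc)   # unit average power

        for snr_db in range(0, 13, 2):
            snr = 10.0 ** (snr_db / 10.0)
            noise = rng.standard_normal(tx.shape) + 1j * rng.standard_normal(tx.shape)
            rx = tx + noise * np.sqrt(1.0 / (2.0 * snr))  # AWGN channel
            est = np.fft.fft(rx, axis=1) / np.sqrt(n_sc)
            ser = np.mean((np.sign(est.real) != np.sign(syms.real)) |
                          (np.sign(est.imag) != np.sign(syms.imag)))
            print(f"SNR {snr_db:2d} dB -> SER {ser:.4f}")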

  9. Software Quality Assurance Plan for GoldSim Models Supporting the Area 3 and Area 5 Radioactive Waste Management Site Performance Assessment Program

    International Nuclear Information System (INIS)

    Gregory J. Shott, Vefa Yucel

    2007-01-01

    This Software Quality Assurance Plan (SQAP) applies to the development and maintenance of GoldSim models supporting the Area 3 and Area 5 Radioactive Waste Management Sites (RWMSs) performance assessments (PAs) and composite analyses (CAs). Two PA models have been approved by the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO) as of November 2006 for the PA maintenance work undertaken by National Security Technologies, LLC (NSTec). NNSA/NSO asked NSTec to assume the custodianship of the models for future development and maintenance. The models were initially developed by Neptune and Company (N&C).

  10. Software Quality Assurance Plan for GoldSim Models Supporting the Area 3 and Area 5 Radioactive Waste Management Sites Performance Assessment Program

    Energy Technology Data Exchange (ETDEWEB)

    Gregory J. Shott, Vefa Yucel

    2007-01-03

    This Software Quality Assurance Plan (SQAP) applies to the development and maintenance of GoldSim models supporting the Area 3 and Area 5 Radioactive Waste Management Sites (RWMSs) performance assessments (PAs) and composite analyses (CAs). Two PA models have been approved by the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO) as of November 2006 for the PA maintenance work undertaken by National Security Technologies, LLC (NSTec). NNSA/NSO asked NSTec to assume the custodianship of the models for future development and maintenance. The models were initially developed by Neptune and Company (N&C).

  11. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements - and an effective system for managing them - the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cycle.

  12. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performance. This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. We discuss these issues along with use case scenarios. Here in this paper we aim to identify challenges...

  13. Incremental validity of proactive personality over the Big Five for predicting job performance of software engineers in an innovative context

    Directory of Open Access Journals (Sweden)

    Nuno Rodrigues

    2013-01-01

    Full Text Available This study examines the incremental validity of proactive personality over the Big Five for predicting job performance in the context of a software engineering job. Proactive personality and the Big Five were measured in a sample of 243 engineers, and overall performance was assessed through supervisor ratings in a subsample of 95 of these engineers. The results showed that although proactive personality represents an important and valid predictor of performance, it does not provide a relevant increment over the prediction produced by extraversion, openness, conscientiousness, emotional stability, and job tenure. The implications, relevance, and practical value of proactive personality for personnel selection are discussed.

  14. Computational Software to Fit Seismic Data Using Epidemic-Type Aftershock Sequence Models and Modeling Performance Comparisons

    Science.gov (United States)

    Chu, A.

    2016-12-01

    Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work implements three of the homogeneous ETAS models described in Ogata (1998). With a model's log-likelihood function, my software finds the Maximum-Likelihood Estimates (MLEs) of the model's parameters to estimate the homogeneous background rate and the temporal and spatial parameters that govern triggering effects. The EM algorithm is employed for its advantages of stability and robustness (Veen and Schoenberg, 2008). My work also presents comparisons among the three models in robustness, convergence speed, and implementation from theory to computing practice. Up-to-date regional seismic data of seismically active areas such as Southern California and Japan are used to demonstrate the comparisons. Data analysis has been done using the computer languages Java and R. Java has the advantages of being strongly typed and of easy control over memory resources, while R has the advantage of numerous available functions in statistical computing. Comparisons are also made between the two programming languages in convergence and stability, computational speed, and ease of implementation. Issues that may affect convergence, such as spatial shapes, are discussed.
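
    For the purely temporal case, the log-likelihood that such software maximizes has a closed-form integral term. The sketch below fits a temporal ETAS-type intensity by direct numerical maximization; note the substitutions: the paper itself uses an EM approach, and toy event times stand in for a real catalog.

        import numpy as np
        from scipy.optimize import minimize

        def neg_loglik(theta, t, T):
            """-log L for lam(t) = mu + K * sum_{t_i < t} (t - t_i + c)^(-p)."""
            mu, K, c, p = theta
            if mu <= 0 or K < 0 or c <= 0 or p <= 1:
                return np.inf                     # keep the search in-bounds
            dt = t[:, None] - t[None, :]          # dt[j, i] = t_j - t_i
            trig = np.zeros_like(dt)
            mask = dt > 0
            trig[mask] = (dt[mask] + c) ** (-p)
            lam = mu + K * trig.sum(axis=1)       # intensity at each event
            integral = mu * T + K * np.sum(
                ((T - t + c) ** (1 - p) - c ** (1 - p)) / (1 - p))
            return integral - np.log(lam).sum()

        # Toy event times (days) with two aftershock-like clusters.
        t = np.sort(np.array([1.2, 3.5, 3.6, 3.8, 4.4, 9.0, 9.1, 9.2, 9.5,
                              12.3, 15.8, 15.9, 16.4, 21.0, 25.5, 25.6, 27.9]))
        res = minimize(neg_loglik, x0=[0.3, 0.5, 0.1, 1.3], args=(t, 30.0),
                       method="Nelder-Mead")
        print("mu, K, c, p =", np.round(res.x, 3))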

  15. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  16. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  17. Software Reviews.

    Science.gov (United States)

    Davis, Shelly J., Ed.; Knaupp, Jon, Ed.

    1984-01-01

    Reviewed is computer software on: (1) classification of living things, a tutorial program for grades 5-10; and (2) polynomial practice using tiles, a drill-and-practice program for algebra students. (MNS)

  18. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Three pieces of computer software are described and reviewed: HyperCard, to build and use varied applications; Iggy's Gnees, for problem solving with shapes in grades kindergarten-two; and Algebra Shop, for practicing skills and problem solving. (MNS)

  19. Analysis of the Company Officer Management Information System (COMIS) Performance Measurement Software at the United States Naval Academy

    National Research Council Canada - National Science Library

    Larges, Chad

    2000-01-01

    .... In 1999, the Company Officer Management Information System (COMIS) prototype was created to work in conjunction with MIDS to enhance a Company Officer's ability to develop midshipmen and measure their performance...

  20. The Impact of Internal and External Resources, and Strategic Actions in Business Networks on Firm Performance in the Software Industry

    OpenAIRE

    Anggraeni, E.

    2014-01-01

    Understanding the variance in firm performance has been an important topic in the strategic management literature. In the last two decades it has become particularly interesting as business networks increasingly have become an integrated part of a firm's environment. Besides the internal resources, the less-controlled external resources in the firm's business networks affect its performance too. The uncertainty associated with the lower levels of control over external resources implies tha...

  1. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

    requirements. This allows projects leeway to meet these requirements in many forms that best suit a particular project's needs and safety risk. In other words, it tells the project what to do, not how to do it. This update also incorporated advances in the state of the practice of software safety from academia and private industry. It addresses some of the more common issues now facing software developers in the NASA environment, such as the use of Commercial-Off-the-Shelf Software (COTS), Modified OTS (MOTS), Government OTS (GOTS), and reused software. A team from across NASA developed the update, and it has had NASA-wide internal reviews by software engineering, quality, safety, and project management, as well as expert external review. This presentation and paper will discuss the new NASA Software Safety Standard, its organization, and key features. It will start with a brief discussion of some NASA mission failures and incidents that had software as one of their root causes. It will then give a brief overview of the NASA Software Safety Process. This will include an overview of the key personnel responsibilities and functions that must be performed for safety-critical software.

  2. Gammasphere software development. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information.

  3. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-01-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. In the previous paper the author has already discussed the basics of microprogramming and has studied in some detail two types of new microcircuits. In this paper, methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and the microprogrammed circuit itself. (Auth.)

  4. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  5. The Impact of Project Role on Perceptions of Risk and Performance in Information Technology Software Development: A Comparative Analysis

    Science.gov (United States)

    Okongo, James

    2014-01-01

    The failure rate of information technology (IT) development projects is a significant concern for today's organizations. Perceptions of IT project risk and project performance have been identified as important factors by scholars studying the topic, and Wallace, Keil, and Rai (2004a) developed a survey instrument to measure how dimensions of…

  6. The Impact of Internal and External Resources, and Strategic Actions in Business Networks on Firm Performance in the Software Industry

    NARCIS (Netherlands)

    Anggraeni, E.

    2014-01-01

    Understanding the variance in firm performance has been an important topic in the strategic management literature. In the last two decades it has become particularly interesting as business networks increasingly have become an integrated part of a firm's environment. Besides the internal resources,

  7. MAGIC user's group software

    International Nuclear Information System (INIS)

    Warren, G.; Ludeking, L.; McDonald, J.; Nguyen, K.; Goplen, B.

    1990-01-01

    The MAGIC User's Group has been established to facilitate the use of electromagnetic particle-in-cell software by universities, government agencies, and industrial firms. The software consists of a series of independent executables that are capable of inter-communication. MAGIC, SOS, and μSOS are used to perform electromagnetic simulations, while POSTER is used to provide post-processing capabilities. Each is described in the paper. Use of the codes for Klystrode simulation is discussed.

  8. The Job Assessment Software System (JASS) and a Strategy for Integrating Output into the Improved Performance Research Integration Tool (IMPRINT)

    Science.gov (United States)

    2018-01-01

    [Recoverable fragments from the report's front matter: a survey item for the Oral Comprehension skill ("Is it necessary to listen to and understand spoken words and sentences?") with example anchors, figure captions for skill ratings, and JASS results exported in CSV format.] To assess "oral comprehension," the first question would be, "In order to perform the task, is it necessary that the person know the English

  9. Software Authentication

    International Nuclear Information System (INIS)

    Wolford, J.K.; Geelhood, B.D.; Hamilton, V.A.; Ingraham, J.; MacArthur, D.W.; Mitchell, D.J.; Mullens, J.A.; Vanier, P. E.; White, G.K.; Whiteson, R.

    2001-01-01

    The effort to define guidance for authentication of software for arms control and nuclear material transparency measurements draws on a variety of disciplines and has involved synthesizing established criteria and practices with newer methods. Challenges include the need to protect classified information that the software manipulates as well as deal with the rapid pace of innovation in the technology of nuclear material monitoring. The resulting guidance will shape the design of future systems and inform the process of authentication of instruments now being developed. This paper explores the technical issues underlying the guidance and presents its major tenets

  10. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  11. Utilizing Commercial Hardware and Open Source Computer Vision Software to Perform Motion Capture for Reduced Gravity Flight

    Science.gov (United States)

    Humphreys, Brad; Bellisario, Brian; Gallo, Christopher; Thompson, William K.; Lewandowski, Beth

    2016-01-01

    Long duration space travel to Mars or to an asteroid will expose astronauts to extended periods of reduced gravity. Since gravity is not present to aid loading, astronauts will use resistive and aerobic exercise regimes for the duration of the space flight to minimize the loss of bone density, muscle mass and aerobic capacity that occurs during exposure to a reduced gravity environment. Unlike the International Space Station (ISS), the area available for an exercise device in the next generation of spacecraft is limited. Therefore, compact resistance exercise device prototypes are being developed. The NASA Digital Astronaut Project (DAP) is supporting the Advanced Exercise Concepts (AEC) Project, the Exercise Physiology and Countermeasures (ExPC) project and the National Space Biomedical Research Institute (NSBRI) funded researchers by developing computational models of exercising with these new advanced exercise device concepts. To perform validation of these models and to support the Advanced Exercise Concepts Project, several candidate devices have been flown onboard NASA's Reduced Gravity Aircraft. In terrestrial laboratories, researchers typically have available to them motion capture systems for the measurement of subject kinematics. Onboard the parabolic flight aircraft it is not practical to utilize the traditional motion capture systems due to the large working volume they require and their relatively high replacement cost if damaged. To support measuring kinematics on board parabolic aircraft, a motion capture system is being developed utilizing open source computer vision code with commercial off the shelf (COTS) video camera hardware. While the system's accuracy is lower than that of lab setups, it provides a means to produce quantitative comparison motion capture kinematic data. Additionally, data such as required exercise volume for small spaces such as the Orion capsule can be determined. METHODS: OpenCV is an open source computer vision library that provides the
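
    In spirit, the COTS approach can be as simple as thresholding a colored marker and logging its centroid frame by frame. The sketch below assumes a hypothetical video file and a red marker; a flight system would add lens calibration and multi-camera 3D reconstruction on top of this:

        import cv2

        # Track a brightly colored marker (assumed here: a red patch on the
        # exercising subject) and log its pixel trajectory over the trial.
        cap = cv2.VideoCapture("parabolic_flight_trial.mp4")  # hypothetical file
        trajectory = []

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
            # Red wraps around hue 0 in HSV, so combine two ranges.
            mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
                   cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
            m = cv2.moments(mask)
            if m["m00"] > 0:                      # marker visible this frame
                cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
                trajectory.append((cx, cy))

        cap.release()
        print(f"tracked {len(trajectory)} frames")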

  12. The MSRC Ab Initio Methods Benchmark Suite: A measurement of hardware and software performance in the area of electronic structure methods

    Energy Technology Data Exchange (ETDEWEB)

    Feller, D.F.

    1993-07-01

    This collection of benchmark timings represents a snapshot of the hardware and software capabilities available for ab initio quantum chemical calculations at Pacific Northwest Laboratory's Molecular Science Research Center in late 1992 and early 1993. The "snapshot" nature of these results should not be underestimated, because of the speed with which both hardware and software are changing. Even during the brief period of this study, we were presented with newer, faster versions of several of the codes. However, the deadline for completing this edition of the benchmarks precluded updating all the relevant entries in the tables. As will be discussed below, a similar situation occurred with the hardware. The timing data included in this report are subject to all the normal failures, omissions, and errors that accompany any human activity. In an attempt to mimic the manner in which calculations are typically performed, we have run the calculations with the maximum number of defaults provided by each program and a near minimum amount of memory. This approach may not produce the fastest performance that a particular code can deliver. It is not known to what extent improved timings could be obtained for each code by varying the run parameters. If sufficient interest exists, it might be possible to compile a second list of timing data corresponding to the fastest observed performance from each application, using an unrestricted set of input parameters. Improvements in I/O might have been possible by fine tuning the Unix kernel, but we resisted the temptation to make changes to the operating system. Due to the large number of possible variations in levels of operating system, compilers, speed of disks and memory, versions of applications, etc., readers of this report may not be able to exactly reproduce the times indicated. Copies of the output files from individual runs are available if questions arise about a particular set of timings.

  13. Comparison of performance of object-based image analysis techniques available in open source software (Spring and Orfeo Toolbox/Monteverdi) considering very high spatial resolution data

    Science.gov (United States)

    Teodoro, Ana C.; Araujo, Ricardo

    2016-01-01

    The use of unmanned aerial vehicles (UAVs) for remote sensing applications is becoming more frequent. However, this type of information can result in several software problems related to the huge amount of data available. Object-based image analysis (OBIA) has proven to be superior to pixel-based analysis for very high-resolution images. The main objective of this work was to explore the potentialities of the OBIA methods available in two different open source software applications, Spring and OTB/Monteverdi, in order to generate an urban land cover map. An orthomosaic derived from UAVs was considered, 10 different regions of interest were selected, and two different approaches were followed. The first one (Spring) uses the region growing segmentation algorithm followed by the Bhattacharya classifier. The second approach (OTB/Monteverdi) uses the mean shift segmentation algorithm followed by the support vector machine (SVM) classifier. Two strategies were followed: four classes were considered using Spring and thereafter seven classes were considered for OTB/Monteverdi. The SVM classifier produces slightly better results and presents a shorter processing time. However, the poor spectral resolution of the data (only RGB bands) is an important factor that limits the performance of the classifiers applied.
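
    The segment-then-classify pipeline described above can be sketched with open source Python tools. Note the substitutions: scikit-image's quickshift (a mean-shift relative) stands in for OTB's mean shift, the image and training labels are placeholders, and per-segment mean RGB is the object feature, echoing the paper's point about RGB-only data:

        import numpy as np
        from skimage.segmentation import quickshift
        from sklearn.svm import SVC

        # rgb: H x W x 3 float image standing in for the UAV orthomosaic.
        rgb = np.random.rand(128, 128, 3)

        # Quickshift is a mean-shift-like segmenter; OTB's exact mean shift
        # implementation differs in detail.
        segments = quickshift(rgb, kernel_size=3, max_dist=6, ratio=0.5)

        # Object features: mean R, G, B per segment.
        labels = np.unique(segments)
        feats = np.array([rgb[segments == s].mean(axis=0) for s in labels])

        # Hypothetical training data: indices of segments digitized as ROIs.
        train_idx = np.array([0, 1, 2, 3])
        train_cls = np.array([0, 0, 1, 1])      # e.g. 0 = pavement, 1 = grass

        clf = SVC(kernel="rbf").fit(feats[train_idx], train_cls)
        pred = clf.predict(feats)               # class per segment
        classified = pred[np.searchsorted(labels, segments)]  # per-pixel map
        print("classes present:", np.unique(classified))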

  14. Performance evaluation of multi-stratum resources integration based on network function virtualization in software defined elastic data center optical interconnect.

    Science.gov (United States)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; Tian, Rui; Han, Jianrui; Lee, Young

    2015-11-30

    Data center interconnect with elastic optical network is a promising scenario to meet the high burstiness and high-bandwidth requirements of data center services. In our previous work, we implemented multi-stratum resilience between IP and elastic optical networks that allows data center services to be accommodated. In view of this, this study extends that work to consider resource integration by breaking the limits of the network device, which can enhance resource utilization. We propose a novel multi-stratum resources integration (MSRI) architecture based on network function virtualization in software defined elastic data center optical interconnect. A resource integrated mapping (RIM) scheme for MSRI is introduced in the proposed architecture. The MSRI can accommodate data center services with resource integration when a single function or resource is relatively scarce to provision the services, and enhance globally integrated optimization of optical network and application resources. The overall feasibility and efficiency of the proposed architecture are experimentally verified on the control plane of an OpenFlow-based enhanced software defined networking (eSDN) testbed. The performance of the RIM scheme under a heavy traffic load scenario is also quantitatively evaluated based on the MSRI architecture in terms of path blocking probability, provisioning latency and resource utilization, compared with other provisioning schemes.

  15. Predicting the performances of a CAMPRO engine retrofitted with a liquefied petroleum gas (LPG) system using 1-dimensional software

    Directory of Open Access Journals (Sweden)

    Kamaruddin M. Hazeem

    2017-01-01

    Full Text Available Recently, the depletion of petroleum resources and the environmental impact of combustion exhaust emissions have forced researchers to come up with alternative ways to prevent this situation from becoming worse. Liquefied petroleum gas (LPG) is highly compatible and has the potential to become a source of energy for internal combustion engines. Unfortunately, research on LPG in internal combustion engines still shows gaps. Thus, in this study a 1-dimensional simulation model of the CAMPRO 1.6L engine is developed using GT-Power to predict the performance of engines that use LPG as a fuel. The constructed simulation model is validated against experimental data to ensure its precision. The validation process shows good agreement between the simulation model and the experimental data. As a result, the LPG simulation model shows that Brake Torque (BT), Brake Power (BP) and Brake Mean Effective Pressure (BMEP) are improved by an average of 7% in comparison with the gasoline model. In addition, Brake Specific Fuel Consumption (BSFC) also shows an improvement of 5%, making the engine more economical. Therefore, the developed GT-Power model offers a successful fuel conversion to LPG systems via retrofit technology, providing comprehensive support for the implementation of energy efficient and environmentally friendly vehicles.
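
    The performance measures compared above are linked by standard engine relations: brake power from torque and speed, BMEP from torque and displacement, and BSFC from fuel mass flow and brake power. A small calculator sketch, with illustrative numbers rather than the paper's data:

        import math

        def brake_power_kw(torque_nm, rpm):
            """BP = 2*pi*N*T, with N in rev/s; result in kW."""
            return 2 * math.pi * (rpm / 60) * torque_nm / 1000

        def bmep_bar(torque_nm, displacement_l, rev_per_cycle=2):
            """BMEP = 2*pi*nR*T / Vd (nR = 2 for a four-stroke engine)."""
            vd_m3 = displacement_l / 1000
            return 2 * math.pi * rev_per_cycle * torque_nm / vd_m3 / 1e5

        def bsfc_g_per_kwh(fuel_flow_kg_h, bp_kw):
            """BSFC = fuel mass flow / brake power."""
            return fuel_flow_kg_h * 1000 / bp_kw

        # Illustrative operating point for a 1.6 L engine (not the paper's data):
        bp = brake_power_kw(torque_nm=140, rpm=4000)
        print(f"BP   = {bp:.1f} kW")
        print(f"BMEP = {bmep_bar(140, 1.6):.1f} bar")
        print(f"BSFC = {bsfc_g_per_kwh(16.0, bp):.0f} g/kWh")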

  16. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature, describing a viable approach to high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verifying code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

  17. LDUA software custodian's notebook

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements

  18. Implementation and testing of a fault detection software tool for improving control system performance in a large commercial building

    Energy Technology Data Exchange (ETDEWEB)

    Salsbury, T.I.; Diamond, R.C.

    2000-05-01

    This paper describes a model-based, feedforward control scheme that can detect faults in the controlled process and improve control performance over traditional PID control. The tool uses static simulation models of the system under control to generate feed-forward control action, which acts as a reference of correct operation. Faults that occur in the system cause discrepancies between the feedforward models and the controlled process. The scheme facilitates detection of faults by monitoring the level of these discrepancies. We present results from the first phase of tests on a dual-duct air-handling unit installed in a large office building in San Francisco. We demonstrate the ability of the tool to detect a number of preexisting faults in the system and discuss practical issues related to implementation.
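
    At its core, the scheme compares a feedforward model prediction with the measured process output and flags residuals that exceed a threshold. The sketch below is a toy illustration with an invented static model and an injected stuck-damper fault, not the authors' tool:

        import numpy as np

        def expected_supply_flow(damper_pos, fan_speed):
            """Static feedforward model of the air-handling process
            (illustrative placeholder; the real tool uses calibrated
            component models of the system under control)."""
            return 0.9 * fan_speed * (0.2 + 0.8 * damper_pos)

        rng = np.random.default_rng(1)
        THRESHOLD = 0.15                          # residual alarm level

        for step in range(100):
            damper = rng.uniform(0.3, 1.0)        # commanded damper position
            fan = rng.uniform(0.5, 1.0)           # commanded fan speed
            measured = expected_supply_flow(damper, fan) + rng.normal(0, 0.02)
            if step > 60:                         # inject a stuck-damper fault
                measured = expected_supply_flow(0.3, fan) + rng.normal(0, 0.02)
            residual = measured - expected_supply_flow(damper, fan)
            if abs(residual) > THRESHOLD:
                print(f"step {step}: possible fault, residual = {residual:+.2f}")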

  19. The Multi-Attribute Task Battery II (MATB-II) Software for Human Performance and Workload Research: A User's Guide

    Science.gov (United States)

    Santiago-Espada, Yamira; Myer, Robert R.; Latorella, Kara A.; Comstock, James R., Jr.

    2011-01-01

    The Multi-Attribute Task Battery (MAT Battery) is a computer-based task designed to evaluate operator performance and workload, and has been redeveloped to operate in the Windows XP Service Pack 3, Windows Vista and Windows 7 operating systems. MATB-II includes essentially the same tasks as the original MAT Battery, plus new configuration options including a graphical user interface for controlling modes of operation. MATB-II can be executed either in training or testing mode, as defined by the MATB-II configuration file. The configuration file also allows set-up of the default timeouts for the tasks, and the flow rates of the pumps and tank levels of the Resource Management (RESMAN) task. MATB-II comes with a default event file that an experimenter can modify and adapt.

  20. Reviews, Software.

    Science.gov (United States)

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  1. Software Reviews.

    Science.gov (United States)

    Kinnaman, Daniel E.; And Others

    1988-01-01

    Reviews four educational software packages for Apple, IBM, and Tandy computers. Includes "How the West was One + Three x Four,""Mavis Beacon Teaches Typing,""Math and Me," and "Write On." Reviews list hardware requirements, emphasis, levels, publisher, purchase agreements, and price. Discusses the strengths…

  2. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game…

  3. Software Reviews.

    Science.gov (United States)

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  4. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    is automatically generated. Furthermore, MIAWARE software is accompanied with an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through the specially developed ontology. As a result...

  5. Novel, Highly-Parallel Software for the Online Storage System of the ATLAS Experiment at CERN: Design and Performances

    CERN Document Server

    Colombo, T; The ATLAS collaboration

    2012-01-01

    The ATLAS experiment observes proton-proton collisions delivered by the LHC accelerator at CERN. The ATLAS Trigger and Data Acquisition (TDAQ) system selects interesting events on-line in a three-level trigger system in order to store them at a budgeted rate of several hundred Hz, for an average event size of ~1.5 MB. This paper focuses on the TDAQ data-logging system and in particular on the implementation and performance of a novel SW design, reporting on the effort of exploiting the full power of recently installed multi-core hardware. In this respect, the main challenge presented by the data-logging workload is the conflict between the largely parallel nature of the event processing, including the recently introduced on-line event-compression, and the constraint of sequential file writing and checksum evaluation. This is further complicated by the necessity of operating in a fully data-driven mode, to cope with continuously evolving trigger and detector configurations. In this paper we report on the desig...

  6. First performance evaluation of software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine at CT.

    Science.gov (United States)

    Scholtz, Jan-Erik; Wichmann, Julian L; Kaup, Moritz; Fischer, Sebastian; Kerl, J Matthias; Lehnert, Thomas; Vogl, Thomas J; Bauer, Ralf W

    2015-03-01

    To evaluate software for automatic segmentation, labeling and reformation of anatomically aligned axial images of the thoracolumbar spine on CT in terms of accuracy, potential for time savings and workflow improvement. 77 patients (28 women, 49 men, mean age 65.3±14.4 years) with known or suspected spinal disorders (degenerative spine disease n=32; disc herniation n=36; traumatic vertebral fractures n=9) underwent 64-slice MDCT with thin-slab reconstruction. Time for automatic labeling of the thoracolumbar spine and reconstruction of double-angulated axial images of the pathological vertebrae was compared with manually performed reconstruction of anatomically aligned axial images. Reformatted images from both reconstruction methods were assessed by two observers regarding the accuracy of symmetric depiction of anatomical structures. In 33 cases double-angulated axial images were created for 1 vertebra, in 28 cases for 2 vertebrae and in 16 cases for 3 vertebrae. Correct automatic labeling was achieved in 72 of 77 patients (93.5%). Errors could be manually corrected in 4 cases. Automatic labeling required 1 min on average. In cases where anatomically aligned axial images of 1 vertebra were created, reconstructions made by hand were significantly faster, with both reconstruction methods providing comparable quality and excellent inter-observer agreement. The evaluated software for automatic labeling and anatomically aligned, double-angulated axial image reconstruction of the thoracolumbar spine on CT is time-saving when reconstructions of 2 or more vertebrae are performed. Checking the results of automatic labeling is necessary to prevent labeling errors. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. Characteristics for Software Optimization Projects

    Directory of Open Access Journals (Sweden)

    Iulian NITESCU

    2008-01-01

    Full Text Available The increasing complexity of software systems imposes the identification and implementation of methods and techniques in order to manage it. The software optimization project is a way in which software complexity is controlled. The software optimization project must face the organization's need to earn profit. The software optimization project is an integrated part of the application cycle because it shares the same resources, depends on other stages and influences the next phases. The optimization project has some particularities because it works on a finished product, around its quality. The process is quality and performance oriented, and it assumes that the product life cycle is almost finished.

  8. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01

    Army Institute for Research in Management Information, Communications, and Computer Sciences (AIRMICS), Software Tools for Software Maintenance (ASQBG-1-89-001), October 1988. The remainder of the scanned cover page and its listing of maintenance tools (program analyzers, a COBOL structuring facility for VS COBOL II, and static code analyzers for COBOL and Fortran) is OCR-garbled in the source.

  9. Assessing the performance of the 'Simple Model of the Atmospheric Radiative Transfer of Sunshine' (SMARTS2) in a first tier of software using empirical weather data

    International Nuclear Information System (INIS)

    Askar, H.K.; Batty, W.J.

    2005-01-01

    Software is being developed to assess the performance of a new form of triple glazing system that can be used in hot arid countries. The method includes the insertion of an angled glazing element within the window cavity to maximize the reflection of incident direct insolation while maintaining an acceptable level of daylighting. SMARTS2 (Simple Model of the Atmospheric Radiative Transfer of Sunshine) is used as a first-tier platform to provide solar input (i.e. direct, diffuse and albedo) for tilted surfaces for simulations of optical performance, using the visible band of the electromagnetic spectrum. Results thus obtained can be used in a ray-tracing algorithm to calculate an optimal angle of insertion of the suggested element that corresponds to the solar geometry of particular latitudes. General weather files of eight countries were used for the analysis, which included an examination of detailed annual solar data and turbidity (i.e. dust) levels for Kuwait. SMARTS2 performance as a solar model was assessed within the narrow visible band.
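
    Converting SMARTS2-style horizontal and direct components into tilted-surface insolation is commonly done with an isotropic-sky (Liu-Jordan) transposition. The sketch below assumes that textbook model and illustrative clear-sky values, not the authors' algorithm:

        import math

        def tilted_irradiance(dni, dhi, ghi, incidence_deg, tilt_deg, albedo=0.2):
            """Isotropic-sky transposition:
            G_t = DNI*cos(theta) + DHI*(1+cos(beta))/2 + GHI*rho*(1-cos(beta))/2
            """
            beam = dni * max(math.cos(math.radians(incidence_deg)), 0.0)
            sky = dhi * (1.0 + math.cos(math.radians(tilt_deg))) / 2.0
            ground = ghi * albedo * (1.0 - math.cos(math.radians(tilt_deg))) / 2.0
            return beam + sky + ground

        # Illustrative clear-sky inputs (W/m2): DNI 850, DHI 110, GHI 900,
        # 25 deg incidence angle on a 35 deg tilted glazing element.
        print(f"{tilted_irradiance(850, 110, 900, 25, 35):.0f} W/m2")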

  10. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, and communication switchboards). Extensive knowledge and practical experience about digital long-term preservation technologies have also been acquired. This wide spectrum of activities puts us in the position to discuss an often overlooked aspect of digital preservation - the preservation of software programs. There are many resources dedicated to the digital preservation of data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - dynamic response to inputs - render computer programs rich compared to documents or linear multimedia. The article opens questions on the beginning of the way to permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...), and where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  11. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Liddle, P.

    2006-01-01

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM XS (TXS) system is classified 1E, as defined in the Institute of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to its safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V&V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO&MP) that

  12. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

    This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  13. Modernization of software quality assurance

    Science.gov (United States)

    Bhaumik, Gokul

    1988-01-01

    The customer's satisfaction depends not only on functional performance, but also on the quality characteristics of the software products. An examination of this quality aspect of software products will provide a clear, well-defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following aspects, which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition for a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree of an identifiable set of quality attributes present in all software systems and their products. This paper is an explanation of how this definition was developed and how it is used.

  14. Boosting Government Performance with Open Source Software? – A Roadmap for Germany (¿Impulsar el desempeño del estado con software de código abierto? - Plan de trabajo para Alemania)

    Directory of Open Access Journals (Sweden)

    Norbert Jesse

    2014-05-01

    Full Text Available Governments face considerable pressure from all directions: budget restrictions, citizens' expectations, demographic trends, local competition from surrounding areas - to name just a few. EGovernment is regarded as an imminent tool to tackle many of these challenges. Obviously, IT itself is an object of increasing complexity, constant change and financial implications. This paper outlines how the federal German government follows a strategic roadmap for eGovernment by shaping the objectives and goals for IT expansion. We discuss the role of open source software and how new approaches to software development can turn these ambitious aims into reality.

  15. Harmonized Constraints in Software Engineering and Acquisition Process Management Requirements are the Clue to Meet Future Performance Goals Successfully in an Environment of Scarce Resources

    National Research Council Canada - National Science Library

    Reich, Holger

    2008-01-01

    This MBA project investigates the importance of correctly deriving requirements from the capability gap and operational environment, and translating them into the processes of contracting, software...

  16. Study of the photoproduction of the vector meson Φ(1020) and the hyperon Λ(1520) from the production threshold up to a photon energy of 2.65 GeV with SAPHIR

    International Nuclear Information System (INIS)

    Wiegers, B.

    2001-05-01

    The photoproduction of the vector meson φ(1020) and the hyperon Λ(1520) has been measured in the final state pK⁺K⁻ from their thresholds up to 2.65 GeV using the high duty-factor electron accelerator ELSA and the 4π detector system SAPHIR. The t-dependence of φ(1020) production shows an exponential behavior, as expected from diffractive production. s-channel helicity conservation can be seen in the decay angular distribution in the helicity frame. The decay angular distribution in the Gottfried-Jackson frame is not compatible with the exchange of a Pomeron in the t-channel. For the first time, differential cross sections of Λ(1520) photoproduction from threshold are measured. The production angular distribution and the decay angular distribution in the Gottfried-Jackson frame indicate K* exchange in the t-channel. (orig.)

  17. Software-In-the-Loop based Modeling and Simulation of Unmanned Semi-submersible Vehicle for Performance Verification of Autonomous Navigation

    Science.gov (United States)

    Lee, Kwangkook; Jeong, Mijin; Kim, Dong Hun

    2017-12-01

    Since an unmanned semi-submersible is mainly used to carry out dangerous missions at sea, it can work in regions that are difficult to access for safety reasons. In this study, a USV hull design was determined using the Myring hull profile, and reinforcement work was performed by designing and implementing inner stiffener members for 3D printing. In order to simulate sea state 5.0 or more, which is difficult to realize in practice, regular and irregular wave equations were implemented in Matlab/Simulink. We performed modeling and simulation of the semi-submersible based on DMWorks, considering the rolling motion in waves. To verify the design and correct unpredicted errors, we implemented numeric and physical simulation models of the USV based on the software-in-the-loop (SIL) method. This simulation allows shipbuilders to participate in new value-added markets such as engineering, procurement, construction, installation, commissioning, operation, and maintenance for the USV.
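
    The Myring profile mentioned above is defined by closed-form nose and tail curves. The sketch below uses the commonly cited Myring (1976) equations with made-up dimensions; the paper's actual parameter values are not given here:

        import numpy as np

        def myring_radius(x, a=0.2, b=0.6, c=0.3, d=0.18, n=2.0, theta=0.43):
            """Hull radius r(x) of a Myring-type profile.
            a, b, c: nose, mid-body, tail lengths (m); d: max diameter (m);
            n: nose fullness exponent; theta: tail half-angle (rad).
            Standard Myring (1976) form; all numbers here are illustrative."""
            x = np.asarray(x, dtype=float)
            r = np.full_like(x, d / 2)                      # parallel mid-body
            nose = x < a
            r[nose] = 0.5 * d * (1 - ((x[nose] - a) / a) ** 2) ** (1 / n)
            tail = x > a + b
            xt = x[tail] - a - b
            r[tail] = (0.5 * d
                       - (3 * d / (2 * c**2) - np.tan(theta) / c) * xt**2
                       + (d / c**3 - np.tan(theta) / c**2) * xt**3)
            return r

        xs = np.linspace(0.0, 1.1, 221)        # total length a + b + c = 1.1 m
        print(f"max radius {myring_radius(xs).max():.3f} m")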

  18. The Ettention software package

    International Nuclear Information System (INIS)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-01-01

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.
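
    To make the algorithmic idea concrete, here is a minimal block-iterative Kaczmarz (SART-like) update in Python: the volume x is repeatedly corrected by back-projecting the residual of one block of projection rows at a time. This is an illustrative sketch of the method family only, not code from the Ettention package, and the matrix-based formulation and parameter choices are assumptions.

        import numpy as np

        def block_kaczmarz(A, b, n_blocks=4, n_iter=20, relaxation=0.5):
            """Reconstruct x from projections b = A @ x, one block of rows at a time."""
            m, n = A.shape
            x = np.zeros(n)
            blocks = np.array_split(np.arange(m), n_blocks)
            for _ in range(n_iter):
                for rows in blocks:
                    Ab = A[rows]                      # block of projection rows
                    residual = b[rows] - Ab @ x       # forward-project and compare
                    row_norms = np.sum(Ab**2, axis=1) + 1e-12
                    # simultaneous (SART-like) correction for the whole block
                    x += relaxation * Ab.T @ (residual / row_norms)
            return x

        # Toy example: recover a 16-pixel "volume" from 32 random projections.
        rng = np.random.default_rng(1)
        A = rng.random((32, 16))
        x_true = rng.random(16)
        x_rec = block_kaczmarz(A, A @ x_true)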

  19. The Ettention software package

    Energy Technology Data Exchange (ETDEWEB)

    Dahmen, Tim, E-mail: Tim.Dahmen@dfki.de [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany); Marsalek, Lukas [Eyen SE, Na Nivách 1043/16, 141 00 Praha 4 (Czech Republic); Saarland University, 66123 Saarbrücken (Germany); Marniok, Nico [Saarland University, 66123 Saarbrücken (Germany); Turoňová, Beata [Saarland University, 66123 Saarbrücken (Germany); IMPRS-CS, Max-Planck Institute for Informatics, Campus E 1.4, 66123 Saarbrücken (Germany); Bogachev, Sviatoslav [Saarland University, 66123 Saarbrücken (Germany); Trampert, Patrick; Nickels, Stefan [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Slusallek, Philipp [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany)

    2016-02-15

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.

  20. First performance evaluation of software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine at CT

    Energy Technology Data Exchange (ETDEWEB)

    Scholtz, Jan-Erik, E-mail: janerikscholtz@gmail.com; Wichmann, Julian L.; Kaup, Moritz; Fischer, Sebastian; Kerl, J. Matthias; Lehnert, Thomas; Vogl, Thomas J.; Bauer, Ralf W.

    2015-03-15

    Highlights: •Automatic segmentation and labeling of the thoracolumbar spine. •Automatically generated double-angulated and aligned axial images of spine segments. •High accuracy in the symmetric depiction of anatomical structures. •Time-saving and may improve workflow in daily practice. -- Abstract: Objectives: To evaluate software for automatic segmentation, labeling and reformation of anatomically aligned axial images of the thoracolumbar spine on CT in terms of accuracy, potential for time savings and workflow improvement. Material and methods: 77 patients (28 women, 49 men, mean age 65.3 ± 14.4 years) with known or suspected spinal disorders (degenerative spine disease n = 32; disc herniation n = 36; traumatic vertebral fractures n = 9) underwent 64-slice MDCT with thin-slab reconstruction. The time for automatic labeling of the thoracolumbar spine and reconstruction of double-angulated axial images of the pathological vertebrae was compared with manually performed reconstruction of anatomically aligned axial images. Reformatted images from both reconstruction methods were assessed by two observers regarding the accuracy of the symmetric depiction of anatomical structures. Results: Double-angulated axial images were created for 1 vertebra in 33 cases, for 2 vertebrae in 28 cases, and for 3 vertebrae in 16 cases. Correct automatic labeling was achieved in 72 of 77 patients (93.5%). Errors could be manually corrected in 4 cases. Automatic labeling required 1 min on average. In cases where anatomically aligned axial images of 1 vertebra were created, reconstructions made by hand were significantly faster (p < 0.05). Automatic reconstruction was time-saving in cases of 2 or more vertebrae (p < 0.05). Both reconstruction methods revealed good image quality with excellent inter-observer agreement. Conclusion: The evaluated software for automatic labeling and anatomically aligned, double-angulated axial image reconstruction of the thoracolumbar spine on CT is time-saving and may improve workflow in daily practice.

  1. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  2. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  3. Belle II Software

    International Nuclear Information System (INIS)

    Kuhr, T; Ritter, M

    2016-01-01

    Belle II is a next generation B factory experiment that will collect 50 times more data than its predecessor, Belle. The higher luminosity at the SuperKEKB accelerator leads to higher background levels and requires a major upgrade of the detector. As a consequence, the simulation, reconstruction, and analysis software must also be upgraded substantially. Most of the software has been redesigned from scratch, taking into account the experience from Belle and other experiments and utilizing new technologies. The large amount of experimental and simulated data requires a high level of reliability and reproducibility, even in parallel environments. Several technologies, tools, and organizational measures are employed to evaluate and monitor the performance of the software during development. (paper)

  4. Assigning Robust Default Values in Building Performance Simulation Software for Improved Decision-Making in the Initial Stages of Building Design

    Directory of Open Access Journals (Sweden)

    Kyosuke Hiyama

    2015-01-01

    Applying data mining techniques to a database of BIM models could provide valuable insights into key design patterns implicitly present in these BIM models. The architectural designer would then be able to use data from existing building projects as default values in building performance simulation software for the early phases of building design. The author has proposed a method to minimize the magnitude of the variation in these default values in subsequent design stages. This approach maintains the accuracy of the simulation results in the initial stages of building design. In this study, a more convincing argument is presented to demonstrate the significance of the new method. The variation in the ideal default values for different building design conditions is assessed first. Next, the influence of each condition on these variations is investigated. The space depth is found to have a large impact on the ideal default value of the window-to-wall ratio. In addition, the presence or absence of lighting control and natural ventilation has a significant influence on the ideal default value. These effects can be used to identify the types of building conditions that should be considered when determining the ideal default values.

  5. Assigning Robust Default Values in Building Performance Simulation Software for Improved Decision-Making in the Initial Stages of Building Design.

    Science.gov (United States)

    Hiyama, Kyosuke

    2015-01-01

    Applying data mining techniques to a database of BIM models could provide valuable insights into key design patterns implicitly present in these BIM models. The architectural designer would then be able to use data from existing building projects as default values in building performance simulation software for the early phases of building design. The author has proposed a method to minimize the magnitude of the variation in these default values in subsequent design stages. This approach maintains the accuracy of the simulation results in the initial stages of building design. In this study, a more convincing argument is presented to demonstrate the significance of the new method. The variation in the ideal default values for different building design conditions is assessed first. Next, the influence of each condition on these variations is investigated. The space depth is found to have a large impact on the ideal default value of the window-to-wall ratio. In addition, the presence or absence of lighting control and natural ventilation has a significant influence on the ideal default value. These effects can be used to identify the types of building conditions that should be considered when determining the ideal default values.
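
    A minimal sketch of how such condition-dependent defaults might be mined from past projects: group earlier designs by the influential conditions the study identifies (space depth, lighting control, natural ventilation) and take a robust statistic, such as the median, as the default. Field names and data below are hypothetical.

        from statistics import median

        # Hypothetical database of earlier projects (one dict per design).
        projects = [
            {"space_depth_m": 6,  "lighting_control": True,  "nat_vent": False, "wwr": 0.42},
            {"space_depth_m": 6,  "lighting_control": True,  "nat_vent": False, "wwr": 0.38},
            {"space_depth_m": 12, "lighting_control": False, "nat_vent": True,  "wwr": 0.55},
            {"space_depth_m": 12, "lighting_control": False, "nat_vent": True,  "wwr": 0.61},
        ]

        def default_wwr(space_depth_m, lighting_control, nat_vent):
            """Median window-to-wall ratio over projects with matching conditions."""
            matches = [p["wwr"] for p in projects
                       if p["space_depth_m"] == space_depth_m
                       and p["lighting_control"] == lighting_control
                       and p["nat_vent"] == nat_vent]
            return median(matches) if matches else None

        print(default_wwr(12, False, True))   # 0.58 for the deep, ventilated case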

  6. Software Quality Assurance activities of ITER CODAC

    Energy Technology Data Exchange (ETDEWEB)

    Pande, Sopan, E-mail: sopan.pande@iter.org [ITER Organization, Route de Vinon sur Verdon, 13115 St Paul Lez Durance (France); DiMaio, Franck; Kim, Changseung; Kim, Joohan; Klotz, Wolf-Dieter; Makijarvi, Petri; Stepanov, Denis; Wallander, Anders [ITER Organization, Route de Vinon sur Verdon, 13115 St Paul Lez Durance (France)

    2013-10-15

    Highlights: ► Comprehensive and consistent software engineering and quality assurance of CODAC. ► Applicable to all CODAC software projects executed by ITER DAs and contractors. ► Configurable plans for cost-effective application of SQA processes. ► CODAC software plans SQAP, SVVP, SDP, and SCMP. ► CODAC software processes based on IEEE 12207-2008. -- Abstract: Software, as an integral part of the plant system I and C, is crucial to the manufacturing and integrated operation of ITER plant systems. Software Quality Assurance is necessary to ensure the development and maintenance of consistently high quality I and C software throughout the lifetime of ITER. CODAC decided to follow the IEEE 12207-2008 software lifecycle processes for Software Engineering and Software Quality Assurance. The Software Development Plan, Software Configuration Management Plan and Software Verification and Validation Plan are the mainstay of Software Quality Assurance, which is documented in the Software Quality Assurance Plan. This paper describes the Software Quality Assurance (SQA) activities performed by CODAC. The SQA includes development and maintenance of the above plans, processes and resources. With the help of the Verification and Validation Teams, they gather evidence of process conformance and product conformance, record process data for quality audits, and perform process improvements.

  7. Software Quality Assurance activities of ITER CODAC

    International Nuclear Information System (INIS)

    Pande, Sopan; DiMaio, Franck; Kim, Changseung; Kim, Joohan; Klotz, Wolf-Dieter; Makijarvi, Petri; Stepanov, Denis; Wallander, Anders

    2013-01-01

    Highlights: ► Comprehensive and consistent software engineering and quality assurance of CODAC. ► Applicable to all CODAC software projects executed by ITER DAs and contractors. ► Configurable plans for cost-effective application of SQA processes. ► CODAC software plans SQAP, SVVP, SDP, and SCMP. ► CODAC software processes based on IEEE 12207-2008. -- Abstract: Software, as an integral part of the plant system I and C, is crucial to the manufacturing and integrated operation of ITER plant systems. Software Quality Assurance is necessary to ensure the development and maintenance of consistently high quality I and C software throughout the lifetime of ITER. CODAC decided to follow the IEEE 12207-2008 software lifecycle processes for Software Engineering and Software Quality Assurance. The Software Development Plan, Software Configuration Management Plan and Software Verification and Validation Plan are the mainstay of Software Quality Assurance, which is documented in the Software Quality Assurance Plan. This paper describes the Software Quality Assurance (SQA) activities performed by CODAC. The SQA includes development and maintenance of the above plans, processes and resources. With the help of the Verification and Validation Teams, they gather evidence of process conformance and product conformance, record process data for quality audits, and perform process improvements.

  8. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety-critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase[1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designers. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  9. Software Engineering Improvement Activities/Plan

    Science.gov (United States)

    2003-01-01

    bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model; a literature search and evaluation of software tools available for code analysis and requirements analysis; and participation in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.

  10. ACTS: from ATLAS software towards a common track reconstruction software

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00349786; The ATLAS collaboration; Salzburger, Andreas; Kiehn, Moritz; Hrdinka, Julia; Calace, Noemi

    2017-01-01

    Reconstruction of charged particles' trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is de...

  11. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  12. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    Keywords: software license, software usage, ELA, Software as a Service (SaaS), Software Asset ... Acronyms: PaaS, Platform as a Service; SaaS, Software as a Service; SAM, Software Asset Management; SMS, System Management Server; SEWP, Solutions for Enterprise Wide ... With the delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service. ...

  13. The Status of User Software on QCDOC

    International Nuclear Information System (INIS)

    Boyle, P.A.; Chen, D.; Christ, N.H.; Clark, M.; Cohen, S.D.; Cristian, C.; Dong, Z.; Gara, A.; Joo, B.; Jung, C.; Kim, C.; Levkova, L.; Liao, X.; Li, S.; Lin, H.; Liu, G.; Mawhinney, R.D.; Ohta, S.; Petrov, K.; Wettig, T.; Yamaguchi, A.

    2005-01-01

    The current status of QCDOC application software and the user environment are summarized. The performance of optimized routines for the Asqtad Hybrid Monte Carlo is discussed. Also, an update on other SciDAC software is presented

  14. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skills ...

  15. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software development ...

  16. System support software for TSTA

    International Nuclear Information System (INIS)

    Claborn, G.W.; Mann, L.W.; Nielson, C.W.

    1987-01-01

    The software at the Tritium Systems Test Assembly (TSTA) is logically broken into two parts, the system support software and the subsystem software. The purpose of the system support software is to isolate the subsystem software from the physical hardware. In this sense the system support software forms the kernel of the software at TSTA. The kernel software performs several functions. It gathers data from CAMAC modules and makes that data available to subsystem processes. It services requests to send commands to CAMAC modules. It provides a system of logging functions and a system-wide global program state that allows highly structured interaction between subsystem processes. The kernel's most visible function is to provide the Man-Machine Interface (MMI). The MMI gives the operators a window into the physical hardware and the subsystem process state. Finally, the kernel provides a data archiving and compression function that allows archival data to be accessed and plotted. The kernel software as developed and implemented at TSTA is described.

  17. Antenna Controller Replacement Software

    Science.gov (United States)

    Chao, Roger Y.; Morgan, Scott C.; Strain, Martha M.; Rockwell, Stephen T.; Shimizu, Kenneth J.; Tehrani, Barzia J.; Kwok, Jaclyn H.; Tuazon-Wong, Michelle; Valtier, Henry; Nalbandi, Reza; hide

    2010-01-01

    The Antenna Controller Replacement (ACR) software accurately points and monitors the Deep Space Network (DSN) 70-m and 34-m high-efficiency (HEF) ground-based antennas that are used to track primarily spacecraft and, periodically, celestial targets. To track a spacecraft, or other targets, the antenna must be accurately pointed at the spacecraft, which can be very far away with very weak signals. ACR's conical scanning capability collects the signal in a circular pattern around the target, calculates the location of the strongest signal, and adjusts the antenna pointing to point directly at the spacecraft. A real-time, closed-loop servo control algorithm performed every 0.02 second allows accurate positioning of the antenna in order to track these distant spacecraft. Additionally, this advanced servo control algorithm provides better antenna pointing performance in windy conditions. The ACR software provides high-level commands that give the DSN operator a very simple user interface. The operator only needs to enter two commands to start the antenna and subreflector, and Master Equatorial tracking. The most accurate antenna pointing is accomplished by aligning the antenna to the Master Equatorial, which, because of its small size and sheltered location, has the most stable pointing. The antenna has hundreds of digital and analog monitor points. The ACR software provides compact displays to summarize the status of the antenna, subreflector, and the Master Equatorial. The ACR software has two major functions. First, it performs all of the steps required to accurately point the antenna (and subreflector and Master Equatorial) at the spacecraft (or celestial target). This involves controlling the antenna/subreflector/Master-Equatorial hardware, initiating and monitoring the correct sequence of operations, calculating the position of the spacecraft relative to the antenna, executing the real-time servo control algorithm to maintain the correct position, and
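
    The conical-scan step described above can be sketched compactly: sample the received power while the beam traces a small circle about the nominal direction, then recover the pointing offset from the first harmonic of the power modulation. This toy model (linearized signal, hypothetical calibration factor) illustrates the principle only and is not the ACR flight code.

        import numpy as np

        def conscan_offset(theta, power):
            # First-harmonic (Fourier) components of the power modulation
            # seen while the beam traces its scan circle.
            cx = 2.0 * np.mean(power * np.cos(theta))
            cy = 2.0 * np.mean(power * np.sin(theta))
            beam_slope = 1.0          # hypothetical calibration factor
            return cx / beam_slope, cy / beam_slope

        # One scan revolution with the true target offset at (0.3, -0.1) beamwidths.
        theta = np.linspace(0.0, 2.0 * np.pi, 50, endpoint=False)
        true_offset = np.array([0.3, -0.1])
        scan = np.stack([np.cos(theta), np.sin(theta)], axis=1)  # unit scan circle
        power = 10.0 + scan @ true_offset     # linearized signal model + DC level
        print(conscan_offset(theta, power))   # approx (0.3, -0.1)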

  18. Unified Engineering Software System

    Science.gov (United States)

    Purves, L. R.; Gordon, S.; Peltzman, A.; Dube, M.

    1989-01-01

    Collection of computer programs performs diverse functions in prototype engineering. NEXUS, NASA Engineering Extendible Unified Software system, is research set of computer programs designed to support full sequence of activities encountered in NASA engineering projects. Sequence spans preliminary design, design analysis, detailed design, manufacturing, assembly, and testing. Primarily addresses process of prototype engineering, task of getting single or small number of copies of product to work. Written in FORTRAN 77 and PROLOG.

  19. Software trace cache

    OpenAIRE

    Ramírez Bellido, Alejandro; Larriba Pey, Josep; Valero Cortés, Mateo

    2005-01-01

    We explore the use of compiler optimizations that optimize the layout of instructions in memory. The target is to enable the code to make better use of the underlying hardware resources, regardless of the specific details of the processor/architecture, in order to increase fetch performance. The Software Trace Cache (STC) is a code layout algorithm with a broader target than previous layout optimizations. We target not only an improvement in the instruction cache hit rate, but also an increase ...
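
    A minimal sketch of the underlying idea, profile-guided code layout: chain basic blocks along their hottest control-flow edges so that frequent paths become sequential in memory. This toy greedy pass is only a simplified relative of the published STC algorithm, and the profile data are invented.

        def greedy_layout(edges):
            """edges: {(src, dst): execution count} -> ordered list of basic blocks."""
            placed, layout = set(), []
            # Walk edges from hottest to coldest, appending unseen blocks.
            for (src, dst), _count in sorted(edges.items(), key=lambda e: -e[1]):
                for block in (src, dst):
                    if block not in placed:
                        placed.add(block)
                        layout.append(block)
            return layout

        profile = {("A", "B"): 900, ("B", "D"): 850, ("A", "C"): 100, ("C", "D"): 90}
        print(greedy_layout(profile))   # ['A', 'B', 'D', 'C']: hot path laid out first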

  20. Software reliability studies

    Science.gov (United States)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.
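
    As a concrete example of the kind of model such experiments exercise, the sketch below fits the classic Jelinski-Moranda reliability growth model to a sequence of inter-failure times by maximum likelihood and predicts the next mean time to failure. The data are hypothetical and a grid search stands in for a proper solver; this is one well-known model of the family the study evaluates, not the authors' experimental code.

        import math

        def jm_loglik(N, phi, times):
            # Log-likelihood of inter-failure times under Jelinski-Moranda.
            ll = 0.0
            for i, t in enumerate(times, start=1):
                lam = phi * (N - i + 1)      # hazard rate before failure i
                ll += math.log(lam) - lam * t
            return ll

        def jm_fit(times, max_faults=200):
            # Grid search over total fault count N; phi has a closed-form MLE.
            n = len(times)
            best = None
            for N in range(n + 1, max_faults + 1):   # assume >= 1 fault remains
                phi = n / sum((N - i + 1) * t for i, t in enumerate(times, start=1))
                score = jm_loglik(N, phi, times)
                if best is None or score > best[0]:
                    best = (score, N, phi)
            return best[1], best[2]

        times = [10, 12, 18, 25, 40, 65]     # hypothetical inter-failure times
        N, phi = jm_fit(times)
        mttf_next = 1.0 / (phi * (N - len(times)))
        print(f"estimated total faults: {N}, predicted next MTTF: {mttf_next:.1f}")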

  1. ROLE OF DATA MINING CLASSIFICATION TECHNIQUE IN SOFTWARE DEFECT PREDICTION

    OpenAIRE

    Dr.A.R.Pon Periyasamy; Mrs A.Misbahulhuda

    2017-01-01

    Software defect prediction is the process of locating defective modules in software. Software quality is a field of study and practice that describes the desirable attributes of software products. The performance should be excellent, with no defects. Software quality metrics are a subset of software metrics that focus on the quality aspects of the product, process, and project. A software defect prediction model helps in the early detection of defects and contributes to t...
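
    As a minimal illustration of the classification approach, the sketch below trains a random forest on a few hypothetical module metrics (size, complexity, churn) to flag defect-prone modules. The metric set, data, and choice of classifier are assumptions for demonstration only.

        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        # rows: [lines_of_code, cyclomatic_complexity, recent_changes]
        X = [[120, 4, 1], [900, 31, 9], [60, 2, 0], [450, 18, 5],
             [75, 3, 1], [1300, 40, 12], [200, 7, 2], [640, 22, 7]]
        y = [0, 1, 0, 1, 0, 1, 0, 1]          # 1 = defect found in module

        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.25, random_state=0)
        model = RandomForestClassifier(n_estimators=50, random_state=0).fit(
            X_train, y_train)
        print("held-out accuracy:", model.score(X_test, y_test))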

  2. SBIR PHASE I FINAL REPORT: Adoption of High Performance Computational (HPC) Modeling Software for Widespread Use in the Manufacture of Welded Structures

    Energy Technology Data Exchange (ETDEWEB)

    Brust, Frederick W. [Engineering Mechanics Corporation of Columbus (Emc2), Columbus, OH (United States); Punch, Edward F. [Engineering Mechanics Corporation of Columbus (Emc2), Columbus, OH (United States); Kurth, Elizabeth A. [Engineering Mechanics Corporation of Columbus (Emc2), Columbus, OH (United States); Kennedy, James C. [Engineering Mechanics Corporation of Columbus (Emc2), Columbus, OH (United States)

    2013-12-02

    Many US manufacturing companies have moved fabrication and production facilities off shore because of cheaper labor costs. A key aspect of bringing these jobs back to the US is the use of technology to render US-made fabrications more efficient overall, with higher quality. A new initiative of the current administration has the goal of enhancing competitiveness to retain manufacturing jobs in the US. One significant competitive advantage that has emerged in the US over the last two decades is the use of virtual design for the fabrication of large structures in the light and heavy materials industries. Industries that have used virtual design and analysis tools have reduced material part sizes, developed environmentally-friendly fabrication processes, improved product quality and performance, and reduced manufacturing costs. Indeed, Caterpillar Inc. (CAT), one of the partners in this effort, continues to have a large fabrication presence in the US because of the use of weld fabrication modeling to optimize fabrications by controlling weld residual stresses and distortions and improving fatigue, corrosion, and fracture performance. This report describes Engineering Mechanics Corporation of Columbus (Emc2) DOE SBIR Phase I results, which extended an existing, state-of-the-art software code, VFT, currently used to design and model large welded structures prior to fabrication, to a broader range of products with widespread applications for small and medium-sized enterprises (SMEs). VFT helps control distortion, can minimize and/or control residual stresses, control welding microstructure, and pre-determine welding parameters such as weld-sequencing, pre-bending, thermal-tensioning, etc. VFT uses material properties, consumable properties, etc. as inputs. Through VFT, manufacturing companies can avoid costly design changes after fabrication. This leads to the concept of joint design/fabrication where these important disciplines are intimately linked to minimize

  3. Producing and supporting sharable software

    International Nuclear Information System (INIS)

    Johnstad, H.; Nicholls, J.

    1987-02-01

    A survey is reported that addressed the question of shareable software for the High Energy Physics community. Statistics are compiled for the responses of 54 people attending a conference on the subject of shareable software to a questionnaire which addressed the usefulness of shareable software, preference of programming language, and source management tools. The results reflect a continued need for shareable software in the High Energy Physics community and for this effort to be performed in coordination. A strong mandate is also claimed for large facilities to support the community with software and for these facilities to act as distribution points. Considerable interest is expressed in languages other than FORTRAN, and a desire for standards or rules in programming is voiced. A need is identified for source management tools

  4. ESTSC - Software Best Practices

    Science.gov (United States)

    DOE Scientific and Technical Software Best Practices, December 2010. Table of Contents: 1.0 Introduction; 2.0 Responsibilities (2.1 OSTI/ESTSC; 2.2 SIACs; 2.3 Software Submitting Sites/Creators; 2.4 Software Sensitivity Review); 3.0 Software Announcement and Submission (3.1 STI Software Appropriate for Announcement; 3.2 ...)

  5. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    ... (COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS. ... [2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the...

  6. The Ettention software package.

    Science.gov (United States)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

    We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  8. Survey on Projects at DLR Simulation and Software Technology with Focus on Software Engineering and HPC

    OpenAIRE

    Schreiber, Andreas; Basermann, Achim

    2013-01-01

    We introduce the DLR institute “Simulation and Software Technology” (SC) and present current activities regarding software engineering and high performance computing (HPC) in German or international projects. Software engineering at SC focusses on data and knowledge management as well as tools for studies and experiments. We discuss how we apply software configuration management, validation and verification in our projects. Concrete research topics are traceability of (software devel...

  9. Software ecosystems – a systematic literature review

    DEFF Research Database (Denmark)

    Manikas, Konstantinos; Hansen, Klaus Marius

    2013-01-01

    A software ecosystem is the interaction of a set of actors on top of a common technological platform that results in a number of software solutions or services. Arguably, software ecosystems are gaining importance with the advent of, e.g., the Google Android, Apache, and Salesforce.com ecosystems. However, there exists no systematic overview of the research done on software ecosystems from a software engineering perspective. We performed a systematic literature review of software ecosystem research, analyzing 90 papers on the subject taken from a gross collection of 420. Our main conclusions are that while research on software ecosystems is increasing (a) there is little consensus on what constitutes a software ecosystem, (b) few analytical models of software ecosystems exist, and (c) little research is done in the context of real-world ecosystems. This work provides an overview of the field, while ...

  10. Software Configurable Multichannel Transceiver

    Science.gov (United States)

    Freudinger, Lawrence C.; Cornelius, Harold; Hickling, Ron; Brooks, Walter

    2009-01-01

    Emerging test instrumentation and test scenarios increasingly require network communication to manage complexity. Adapting wireless communication infrastructure to accommodate challenging testing needs can benefit from reconfigurable radio technology. A fundamental requirement for a software-definable radio system is independence from carrier frequencies, one of the radio components that to date has seen only limited progress toward programmability. This paper overviews an ongoing project to validate the viability of a promising chipset that performs conversion of radio frequency (RF) signals directly into digital data for the wireless receiver and, for the transmitter, converts digital data into RF signals. The Software Configurable Multichannel Transceiver (SCMT) enables four transmitters and four receivers in a single unit the size of a commodity disk drive, programmable for any frequency band between 1 MHz and 6 GHz.
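
    Once RF has been converted directly to digital samples, recovering a channel is a software task. The sketch below shows the standard digital downconversion step such a receiver would perform: mix against a numerically controlled oscillator and low-pass filter to baseband. Sample rates, frequencies, and the crude moving-average filter are illustrative assumptions, not SCMT specifics.

        import numpy as np

        fs = 1_000_000            # sample rate (Hz), assumed
        f_carrier = 250_000       # RF carrier within the digitized band
        f_message = 5_000         # baseband tone we want to recover

        t = np.arange(4096) / fs
        rf = np.cos(2 * np.pi * f_message * t) * np.cos(2 * np.pi * f_carrier * t)

        # Mix to baseband with a complex NCO, then a crude moving-average low-pass.
        nco = np.exp(-2j * np.pi * f_carrier * t)
        mixed = rf * nco
        kernel = np.ones(64) / 64
        baseband = np.convolve(mixed, kernel, mode="same")
        # baseband now carries the 5 kHz message (scaled by 1/2 from mixing).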

  11. Software Development Standard Processes (SDSP)

    Science.gov (United States)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; hide

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  12. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; increasing reliability by means of software redundancy methods; maintenance of software for long-term operating behavior. (HP) [de]

  13. Teamwork in Distributed Agile Software Development

    OpenAIRE

    Gurram, Chaitanya; Bandi, Srinivas Goud

    2013-01-01

    Context: Distributed software development has become a highly desirable way of developing software. The application of agile development methodologies in distributed environments has become a new trend, owing to their benefits of improved communication and collaboration. Teamwork is an important concept that agile methodologies facilitate, and it is one of the potential determinants of team performance that had not been studied in distributed agile software development. Objectives: This res...

  14. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  15. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  16. Software Switching for Data Acquisition

    CERN Multimedia

    CERN. Geneva; Malone, David

    2016-01-01

    In this talk we discuss the feasibility of replacing telecom-class routers with a topology of commodity servers acting as software switches in data acquisition. We extend the popular software switch, Open vSwitch, with a dedicated, throughput-oriented buffering mechanism. We compare the performance under heavy many-to-one congestion to typical Ethernet switches and evaluate the scalability when building larger topologies, exploiting the integration with software-defined networking technologies. Please note that David Malone will speak on behalf of Grzegorz Jereczek.

  17. Software for mass spectrometer control

    International Nuclear Information System (INIS)

    Curuia, Marian; Culcer, Mihai; Anghel, Mihai; Iliescu, Mariana; Trancota, Dan; Kaucsar, Martin; Oprea, Cristiana

    2004-01-01

    The paper describes a software application for controlling the refurbished MAT 250 mass spectrometer. The spectrometer was brought up to date with a new hardware structure, on top of which the software application for mass spectrometer control was developed. The application is composed of dedicated modules that perform given operations. The instructions that these modules have to perform are generated by a principal module, which also enables the exchange of information between the modules that compose the application. The use of a modular structure makes it easy to add new functions in the future. The application developed in our institute transformed the MAT 250 mass spectrometer into a device endowed with new generation tools. (authors)

  18. Measurement of the reactions γp→K+Λ and γp→K+Σ0 for photon energies up to 2.6 GeV with the SAPHIR detector at ELSA

    International Nuclear Information System (INIS)

    Glander, K.H.

    2003-02-01

    The reactions γp→K⁺Λ and γp→K⁺Σ⁰ were measured in the energy range from threshold up to a photon energy of 2.6 GeV. The data were taken with the SAPHIR detector at the electron stretcher facility ELSA. Results on cross sections and hyperon polarizations are presented as a function of kaon production angle and photon energy. The total cross section for Λ production shows a strong threshold enhancement, whereas the Σ⁰ data have a maximum at about Eγ = 1.45 GeV. Cross sections, together with their angular decompositions into Legendre polynomials, suggest contributions from resonance production for both reactions. The K⁺Λ differential cross section is enhanced for backward-produced kaons at Eγ ≈ 1.45 GeV. This might be interpreted as a contribution of the so-called missing resonance D₁₃(1895). In general, the induced polarization of Λ has negative values in the kaon forward direction and positive values in the backward direction. The magnitude varies with energy. The polarization of Σ⁰ follows a similar angular and energy dependence as that of Λ, but with opposite sign. (orig.)
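
    The angular decomposition mentioned above can be illustrated with a short fit: sample a differential cross section on a cos θ grid and extract Legendre coefficients by least squares. The data and coefficients below are synthetic, purely to show the technique; they are not SAPHIR results.

        import numpy as np

        cos_theta = np.linspace(-0.95, 0.95, 20)
        true_coeffs = [1.0, 0.4, -0.2]                 # a0, a1, a2 (hypothetical)
        dsigma = np.polynomial.legendre.legval(cos_theta, true_coeffs)
        dsigma += np.random.default_rng(2).normal(0.0, 0.02, cos_theta.size)

        # Least-squares fit of Legendre coefficients up to order 2.
        fit = np.polynomial.legendre.legfit(cos_theta, dsigma, deg=2)
        print("fitted a0, a1, a2:", np.round(fit, 3))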

  19. FPGAs for software programmers

    CERN Document Server

    Hannig, Frank; Ziener, Daniel

    2016-01-01

    This book makes powerful Field Programmable Gate Array (FPGA) and reconfigurable technology accessible to software engineers by covering different state-of-the-art high-level synthesis approaches (e.g., OpenCL and several C-to-gates compilers). It introduces FPGA technology, its programming model, and how various applications can be implemented on FPGAs without going through low-level hardware design phases. Readers will get a realistic sense for problems that are suited for FPGAs and how to implement them from a software designer’s point of view. The authors demonstrate that FPGAs and their programming model reflect the needs of stream processing problems much better than traditional CPU or GPU architectures, making them well-suited for a wide variety of systems, from embedded systems performing sensor processing to large setups for Big Data number crunching. This book serves as an invaluable tool for software designers and FPGA design engineers who are interested in high design productivity through behavi...

  20. Software for Optimizing Quality Assurance of Other Software

    Science.gov (United States)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
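
    The abstract frames assurance planning as constrained optimization. Below is a minimal sketch of that view, picking activities under a fixed budget with a greedy value-per-cost heuristic for the underlying knapsack problem; the activity names, costs, and risk-reduction values are hypothetical, and the actual tool may use a very different optimizer.

        activities = [
            ("code inspection",      8.0, 0.30),   # (name, cost, risk reduction)
            ("unit tests",           5.0, 0.20),
            ("design review",        4.0, 0.15),
            ("performance analysis", 6.0, 0.10),
            ("traceability matrix",  3.0, 0.08),
        ]

        def plan(budget):
            chosen, total_reduction = [], 0.0
            # Greedy: best risk reduction per unit cost first.
            for name, cost, red in sorted(activities, key=lambda a: a[2] / a[1],
                                          reverse=True):
                if cost <= budget:
                    budget -= cost
                    chosen.append(name)
                    total_reduction += red
            return chosen, total_reduction

        print(plan(budget=15.0))   # which activities fit, and the risk they retire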

  1. NASA Software Engineering Benchmarking Study

    Science.gov (United States)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    ... Consolidate, collect and, if needed, develop common processes, principles and other assets across the Agency in order to provide more consistency in software development and acquisition practices and to reduce the overall cost of maintaining or increasing current NASA CMMI maturity levels. 6. Provide additional support for small projects that includes: (a) guidance for appropriate tailoring of requirements for small projects, (b) availability of suitable tools, including support tool set-up and training, and (c) training for small project personnel, assurance personnel and technical authorities on the acceptable options for tailoring requirements and performing assurance on small projects. 7. Develop software training classes for the more experienced software engineers using on-line training, videos, or small separate modules of training that can be accommodated as needed throughout a project. 8. Create guidelines to structure non-classroom training opportunities such as mentoring, peer reviews, lessons learned sessions, and on-the-job training. 9. Develop a set of predictive software defect data and a process for assessing software testing metric data against it. 10. Assess Agency-wide licenses for commonly used software tools. 11. Fill the knowledge gap in common software engineering practices for new hires and co-ops. 12. Work through the Science, Technology, Engineering and Mathematics (STEM) program with universities in strengthening education in the use of common software engineering practices and standards. 13. Follow up this benchmark study with a deeper look into what both internal and external organizations perceive as the scope of software assurance, the value they expect to obtain from it, and the shortcomings they experience in the current practice. 14. Continue interactions with the external software engineering environment through collaborations, knowledge sharing, and benchmarking.

  2. EDS operator and control software

    International Nuclear Information System (INIS)

    Ott, L.L.

    1985-04-01

    The Enrichment Diagnostic System (EDS) was developed at Lawrence Livermore National Laboratory (LLNL) to acquire, display and analyze large quantities of transient data for a real-time Advanced Vapor Laser Isotope Separation (AVLIS) experiment. Major topics discussed in this paper are the EDS operator interface (SHELL) program, the data acquisition and analysis scheduling software, and the graphics software. The workstation concept used in EDS, the software used to configure a user's workstation, and the ownership and management of a diagnostic are described. An EDS diagnostic is a combination of hardware and software designed to study specific aspects of the process. Overall system performance is discussed from the standpoint of scheduling techniques, evaluation tools, optimization techniques, and program-to-program communication methods. EDS is based on a data-driven design which keeps the need to modify software to a minimum. This design requires a fast and reliable database management system. A third-party database management product, Berkeley Software System Database, written explicitly for HP1000s, is used for all EDS databases. All graphics are done with an in-house graphics product, the Device Independent Graphics Library (DIGLIB). Examples of devices supported by DIGLIB are Versatec printer/plotters, Raster Technologies graphic display controllers, and HP terminals (HP264x and HP262x). The benefits derived from using HP hardware and software, as well as obstacles imposed by the HP environment, are presented in relation to EDS development and implementation

  3. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  4. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes that there is an enormous amount of software available for use by teachers of reading and literacy; drill-and-practice software is the largest category available, and large numbers of…

  5. Ensuring Software IP Cleanliness

    OpenAIRE

    Mahshad Koohgoli; Richard Mayer

    2007-01-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  6. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

    ... multiversion software subject to coincident errors. IEEE Trans. Software Eng. SE-11:1511-1517. Eckhardt, D.E., A.K. Caglayan, J.C. Knight, L.D. Lee, D.F. ... Knight, J.C. and N.G. Leveson. 1986. Experimental evaluation of the assumption of independence in multiversion software. IEEE Trans. Software Eng.

  7. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  8. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  9. Software - Naval Oceanography Portal

    Science.gov (United States)

    USNO Earth Orientation software page: GPS-based products, VLBI-based products, auxiliary and supporting software, and an Earth Orientation Matrix Calculator.

  10. Software Engineering Education Directory

    Science.gov (United States)

    1990-04-01

    ... and Engineering (CMSC 735). Codes: GPEV2. Textbooks: IEEE Tutorial on Models and Metrics for Software Management and Engineering by Basili, Victor R. ... Software Engineering (Comp 227). Codes: GPRY5. Textbooks: IEEE Tutorial on Software Design Techniques by Freeman, Peter and Wasserman, Anthony I. ...

  11. Fault tree analysis of KNICS RPS software

    International Nuclear Information System (INIS)

    Park, Gee Yong; Kwon, Kee Choon; Koh, Kwang Yong; Jee, Eun Kyoung; Seong, Poong Hyun; Lee, Dae Hyung

    2008-01-01

    This paper describes the application of a software Fault Tree Analysis (FTA) as one of the analysis techniques for a Software Safety Analysis (SSA) at the design phase and its analysis results for the safety-critical software of a digital reactor protection system, which is called the KNICS RPS, being developed in the KNICS (Korea Nuclear Instrumentation and Control Systems) project. The software modules in the design description were represented by Function Blocks (FBs), and the software FTA was performed based on the well-defined fault tree templates for the FBs. The SSA, which is part of the verification and validation (V and V) activities, was activated at each phase of the software lifecycle for the KNICS RPS. At the design phase, the software HAZOP (Hazard and Operability) and the software FTA were employed in the SSA in such a way that the software HAZOP was performed first and then the software FTA was applied. The software FTA was applied to some critical modules selected from the software HAZOP analysis
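
    To illustrate the quantitative side of a software FTA, the sketch below evaluates a toy fault tree by inclusion-exclusion over its minimal cut sets. The tree, its basic events, and their probabilities are hypothetical, not taken from the KNICS RPS analysis.

        from itertools import combinations

        p = {"A": 1e-3, "B": 2e-3, "C": 5e-5}     # basic-event probabilities
        cut_sets = [{"A", "B"}, {"C"}]            # top event fires if (A AND B) OR C

        def cut_set_prob(events):
            prob = 1.0
            for event in events:
                prob *= p[event]
            return prob

        # Inclusion-exclusion over the (few) minimal cut sets.
        def top_event_prob(cut_sets):
            total = 0.0
            for k in range(1, len(cut_sets) + 1):
                for combo in combinations(cut_sets, k):
                    union = set().union(*combo)
                    total += (-1) ** (k + 1) * cut_set_prob(union)
            return total

        print(f"P(top event) = {top_event_prob(cut_sets):.3e}")   # ~5.2e-05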

  12. Application of Formal Methods in Software Engineering

    Directory of Open Access Journals (Sweden)

    Adriana Morales

    2011-12-01

    The purpose of this research work is to examine: (1) why formal methods are necessary for software systems today, (2) high-integrity systems achieved through the Correctness-by-Construction (C-by-C) methodology, and (3) an affordable methodology for applying formal methods in software engineering. The research process included reviews of the literature on the Internet and in publications and presentations at events. Among the research results it was found that: (1) nations, companies, and people depend increasingly on software systems, (2) there is growing demand for software engineering to increase social trust in software systems, (3) methodologies exist, such as C-by-C, that can provide that level of trust, (4) formal methods constitute a principle of computer science that can be applied in software engineering to achieve reliable processes in software development, (5) software users have the responsibility to demand reliable software products, and (6) software engineers have the responsibility to develop reliable software products. Furthermore, it is concluded that: (1) more research is needed to identify and analyze other methodologies and tools that provide processes for applying formal methods in software engineering, (2) formal methods provide an unprecedented ability to increase trust in the correctness of software products, and (3) through the development of new methodologies and tools, cost is ceasing to be a disadvantage for the application of formal methods.

  13. A user's guide to the GoldSim/BLT-MS integrated software package:a low-level radioactive waste disposal performance assessment model

    International Nuclear Information System (INIS)

    Knowlton, Robert G.; Arnold, Bill Walter; Mattie, Patrick D.

    2007-01-01

    Sandia National Laboratories (Sandia), a U.S. Department of Energy National Laboratory, has over 30 years of experience in the assessment of radioactive waste disposal and at the time of this publication is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. In countries with small radioactive waste programs, international technology transfer program efforts are often hampered by small budgets, schedule constraints, and a lack of experienced personnel. In an effort to surmount these difficulties, Sandia has developed a system that utilizes a combination of commercially available software codes and existing legacy codes for probabilistic safety assessment modeling that facilitates the technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission (NRC) and codes developed and maintained by the United States Department of Energy are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new codes. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software, which meets the rigors of both domestic regulatory requirements and international peer review. Therefore, revitalization of deterministic legacy codes, as well as an adaptation of contemporary deterministic codes, provides a credible and solid computational platform for constructing probabilistic safety assessment models. This document is a reference user's guide for the GoldSim/BLT-MS integrated modeling software package developed as part of a cooperative technology transfer project between Sandia National Laboratories and the Institute of Nuclear Energy Research (INER) in Taiwan for the preliminary assessment of several candidate low

  14. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  15. Experimental research control software system

    International Nuclear Information System (INIS)

    Cohn, I A; Kovalenko, A G; Vystavkin, A N

    2014-01-01

    A software system intended for the automation of small-scale research has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, thus significantly reducing the effort of automating an experimental setup. In particular, minimal programming skills are required and the scripts are easy for supervisors to review. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed by an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library performs interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework has been developed for fast implementation of new software and hardware interfaces. While the software is in continuous development with new features being implemented, it is already used in our laboratory for automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU Public License.
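    To make the script-driven idea concrete (this is a hedged sketch, not the system described in the record, whose API is not given), the following Python fragment wraps a hypothetical instrument behind a common driver interface and runs an imperative "experiment script" against it; MockThermometer and its cooling behavior are invented stand-ins for a cryostat sensor.

    ```python
    # Sketch: instruments behind a common driver interface, driven by an
    # imperative script. All names here are hypothetical illustrations.
    import time

    class Instrument:
        """Base class of a modular interface library."""
        def read(self):
            raise NotImplementedError

    class MockThermometer(Instrument):
        def __init__(self):
            self.t = 4.2  # kelvin, stand-in for a He-3 cryostat sensor
        def read(self):
            self.t *= 0.99  # pretend the stage cools a little each poll
            return self.t

    def experiment(sensor, setpoint_k, poll_s=0.01):
        """An imperative 'script': poll until the stage reaches setpoint."""
        log = []
        while (temp := sensor.read()) > setpoint_k:
            log.append(temp)
            time.sleep(poll_s)
        return log

    readings = experiment(MockThermometer(), setpoint_k=3.9)
    print(f"{len(readings)} readings before reaching setpoint")
    ```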

  16. Experimental research control software system

    Science.gov (United States)

    Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.

    2014-05-01

    A software system intended for the automation of small-scale research has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, thus significantly reducing the effort of automating an experimental setup. In particular, minimal programming skills are required and the scripts are easy for supervisors to review. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed by an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library performs interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework has been developed for fast implementation of new software and hardware interfaces. While the software is in continuous development with new features being implemented, it is already used in our laboratory for automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU Public License.

  17. Software for the LHCb experiment

    CERN Document Server

    Corti, Gloria; Belyaev, Ivan; Cattaneo, Marco; Charpentier, Philippe; Frank, Markus; Koppenburg, Patrick; Mato-Vila, P; Ranjard, Florence; Roiser, Stefan

    2006-01-01

    LHCb is an experiment for precision measurements of CP-violation and rare decays in B mesons at the LHC collider at CERN. The LHCb software development strategy follows an architecture-centric approach as a way of creating a resilient software framework that can withstand changes in requirements and technology over the expected long lifetime of the experiment. The software architecture, called GAUDI, supports event data processing applications that run in different processing environments ranging from the real-time high-level triggers in the online system to the final physics analysis performed by more than one hundred physicists. The major architectural design choices and the arguments that lead to these choices will be outlined. Object oriented technologies have been used throughout. Initially developed for the LHCb experiment, GAUDI has been adopted and extended by other experiments. Several iterations of the GAUDI software framework have been released and are now being used routinely by the physicists of...

  18. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

    The field of testability is an active, well-established part of the engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or diminish the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the softwa...

  19. Agile software assessment

    OpenAIRE

    Nierstrasz Oscar; Lungu Mircea

    2012-01-01

    Informed decision making is a critical activity in software development but it is poorly supported by common development environments which focus mainly on low level programming tasks. We posit the need for agile software assessment which aims to support decision making by enabling rapid and effective construction of software models and custom analyses. Agile software assessment entails gathering and exploiting the broader context of software information related to the system at hand as well ...

  20. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  1. Test software for BESIII MDC electronics system

    International Nuclear Information System (INIS)

    Zhang Hongyu; Sheng Huayi; Zhu Haitao; Ji Xiaolu; Zhao Dongxu

    2006-01-01

    This paper presents the design of the test system software for the BESIII MDC electronics. Two kinds of test systems, SBS VP7-based and PowerPC-based, and their corresponding test software are introduced. The software is developed in LabVIEW 7.1 and Microsoft Visual C++ 6.0; some test functions of the software, as well as their user interfaces, are described in detail. The software has been applied in hardware debugging, performance testing, and long-term stability testing. (authors)

  2. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model is described which was developed at JPL. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
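    The feedback idea can be illustrated with a toy system-dynamics loop in the spirit of SEPS (not the SEPS model itself; its equations are not given in the record). Stocks (tasks done, latent errors) evolve under feedback between progress and rework; all rates and structure below are invented.

    ```python
    # Toy stock-and-flow simulation: QA detection of latent errors feeds
    # back as rework that slows forward progress. Illustrative only.
    def simulate(total_tasks=1000, staff=5, weeks=80, dt=1.0):
        done, latent_errors = 0.0, 0.0
        productivity = 2.0   # tasks / person-week (hypothetical)
        error_rate = 0.15    # errors injected per task completed
        rework_cost = 0.5    # tasks of effort per error fixed
        history = []
        for week in range(weeks):
            detection = 0.1 * latent_errors      # feedback: QA finds errors
            effort = staff * productivity * dt
            rework = min(effort, detection * rework_cost)
            progress = min(effort - rework, total_tasks - done)
            done += progress
            latent_errors += error_rate * progress - detection
            history.append((week, done, latent_errors))
            if done >= total_tasks:
                break
        return history

    for week, done, errs in simulate()[::10]:
        print(f"week {week:3d}: {done:7.1f} tasks done, {errs:6.1f} latent errors")
    ```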

  3. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of the software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that had been used in other projects but were not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics could be implemented in their software assurance life cycle process.
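    Two of the simplest metrics of the kind surveyed above can be computed as follows; this is a generic illustration with invented figures, not a metric set from the record.

    ```python
    # Defect density and phase containment, two common SQA metrics.
    # All counts and sizes below are invented for illustration.
    defects_found = {"requirements": 12, "design": 9, "code": 31, "test": 18}
    ksloc = 42.0  # thousand source lines of code in the release

    defect_density = sum(defects_found.values()) / ksloc
    print(f"defect density: {defect_density:.2f} defects/KSLOC")

    # Phase containment: share of defects caught before the test phase.
    pre_test = sum(v for k, v in defects_found.items() if k != "test")
    total = sum(defects_found.values())
    print(f"phase containment effectiveness: {pre_test / total:.0%}")
    ```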

  4. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  5. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  6. [Definition and specification requirements for PAC-systems (picture archiving and communication system). A performance index with reference to the standard "IEEE Recommended Practice for Software Requirement Specifications"].

    Science.gov (United States)

    König, H; Klose, K J

    1999-04-01

    The formulation of requirements is necessary to control the goals of a PACS project. Furthermore, in this way, the scope of functionality necessary to support radiological working processes becomes clear. Definitions of requirements and specification are formulated independently of systems according to the IEEE standard "Recommended Practice for Software Requirements Specifications". Definitions are given in the Request for Information, specifications in the Request for Proposal. Functional and non-functional requirements are distinguished. The solutions are rated with respect to scope, appropriateness and quality of implementation. A PACS checklist was created according to the methods described above. It is published on the homepage of the "Arbeitsgemeinschaft Informationstechnologie" (AGIT) within the "Deutsche Röntgengesellschaft" (DRG) (http://www.uni-marburg.de/mzr/agit). The checklist provides a discussion forum which should contribute to an agreement on accepted basic PACS functionalities.

  7. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment
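    The core enactment idea, releasing a task only once its predecessors complete and then notifying the assignee, can be sketched in a few lines; this is a toy loop in the spirit of SDA, not the TieFlow engine, and the task names are hypothetical.

    ```python
    # Toy process enactment: tasks with prerequisites are released in
    # dependency order. Illustrative only; task names are invented.
    tasks = {
        "write_requirements": [],
        "design":             ["write_requirements"],
        "code":               ["design"],
        "review":             ["code"],
    }
    done = set()

    def ready(task):
        return task not in done and all(p in done for p in tasks[task])

    while len(done) < len(tasks):
        for task in tasks:
            if ready(task):
                print(f"notify assignee: begin {task}")
                done.add(task)  # stand-in for the assignee finishing work
    ```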

  8. TMT approach to observatory software development process

    Science.gov (United States)

    Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder

    2016-07-01

    The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate

  9. ACTS: from ATLAS software towards a common track reconstruction software

    Science.gov (United States)

    Gumpert, C.; Salzburger, A.; Kiehn, M.; Hrdinka, J.; Calace, N.; ATLAS Collaboration

    2017-10-01

    Reconstruction of charged particles’ trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is developed with special emphasis on thread-safety to support parallel execution of the code and data structures are optimised for vectorisation to speed up linear algebra operations. The implementation is agnostic to the details of the detection technologies and magnetic field configuration which makes it applicable to many different experiments.

  10. Knowledge-Based Software Management

    International Nuclear Information System (INIS)

    Sally Schaffner; Matthew Bickley; Brian Bevins; Leon Clancy; Karen White

    2003-01-01

    Management of software in a dynamic environment such as is found at Jefferson Lab can be a daunting task. Software development tasks are distributed over a wide range of people with varying skill levels. The machine configuration is constantly changing requiring upgrades to software at both the hardware control level and the operator control level. In order to obtain high quality support from vendor service agreements, which is vital to maintaining 24/7 operations, hardware and software must be kept at industry's current levels. This means that periodic upgrades independent of machine configuration changes must take place. It is often difficult to identify and organize the information needed to guide the process of development, upgrades and enhancements. Dependencies between support software and applications need to be consistently identified to prevent introducing errors during upgrades and to allow adequate testing to be planned and performed. Developers also need access to information regarding compilers, make files and organized distribution directories. This paper describes a system under development at Jefferson Lab which will provide software developers and managers this type of information in a timely user-friendly fashion. The current status and future plans for the system will be detailed

  11. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.
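    One classic model from the literature such surveys cover is the Goel-Okumoto NHPP, with mean cumulative failures m(t) = a(1 - e^(-bt)); the sketch below evaluates it for made-up parameters. This is offered as a representative example of the model class, not as the model the report proposes.

    ```python
    # Goel-Okumoto NHPP reliability growth model. Parameters are invented:
    # a = total latent faults, b = per-hour fault exposure rate.
    import math

    def m(t, a, b):
        """Expected cumulative failures by test time t."""
        return a * (1.0 - math.exp(-b * t))

    def intensity(t, a, b):
        """Failure intensity (failures per unit time) at time t."""
        return a * b * math.exp(-b * t)

    a, b = 120.0, 0.05
    for t in (10, 50, 100, 200):
        print(f"t={t:4d}h  expected failures={m(t, a, b):6.1f}  "
              f"intensity={intensity(t, a, b):.3f}/h")
    ```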

  12. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  13. Spotting software errors sooner

    International Nuclear Information System (INIS)

    Munro, D.

    1989-01-01

    Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)

  14. Avionics and Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the AES Avionics and Software (A&S) project is to develop a reference avionics and software architecture that is based on standards and that can be...

  15. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square-feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  16. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation

  17. Process mining software repositories

    NARCIS (Netherlands)

    Poncin, W.; Serebrenik, A.; Brand, van den M.G.J.

    2011-01-01

    Software developers' activities are in general recorded in software repositories such as version control systems, bug trackers and mail archives. While abundant information is usually present in such repositories, successful information extraction is often challenged by the necessity to

  18. Software Defined Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Ian; Blaiszik, Ben; Chard, Kyle; Chard, Ryan

    2017-07-17

    Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policies by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches. We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.
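    The if-trigger-then-action idea can be sketched as a tiny rule dispatcher: rules bind an event predicate to an action, and matching rules fire when a storage event arrives. This is a minimal illustration of the pattern, not the notation or system from the record; the event kinds, paths, and actions are hypothetical.

    ```python
    # Minimal IFTA-style rule dispatch over storage events.
    from dataclasses import dataclass

    @dataclass
    class Event:
        kind: str   # e.g. "created", "modified" (hypothetical event kinds)
        path: str

    # Each rule is (trigger predicate, action). Actions here just print.
    rules = [
        (lambda e: e.kind == "created" and e.path.endswith(".h5"),
         lambda e: print(f"index {e.path} into the catalog")),
        (lambda e: e.kind == "modified",
         lambda e: print(f"re-run characterization on {e.path}")),
    ]

    def dispatch(event):
        for trigger, action in rules:
            if trigger(event):
                action(event)

    dispatch(Event("created", "/data/run42/detector.h5"))
    ```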

  19. Computer software review procedures

    International Nuclear Information System (INIS)

    Mauck, J.L.

    1993-01-01

    This article reviews the procedures which are used to review software written for computer-based instrumentation and control functions in nuclear facilities. The utilization of computer-based control systems is becoming much more prevalent in such installations, in addition to being retrofit into existing systems. Currently, the Nuclear Regulatory Commission uses Regulatory Guide 1.152, "Criteria for Programmable Digital Computer System Software in Safety-Related Systems of Nuclear Power Plants", and ANSI/IEEE-ANS-7-4.3.2-1982, "Application Criteria for Programmable Digital Computer Systems in Safety Systems of Nuclear Power Generating Stations", for guidance when performing reviews of digital systems. There is great concern about the process of verification and validation of these codes, so when inspections are done of such systems, inspectors examine very closely the processes which were followed in developing the codes, the errors which were detected, how they were found, and the analysis which went into tracing down the causes behind the errors to ensure such errors are not propagated again in the future.

  20. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    White, G

    2006-01-01

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.
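    ROSE itself analyzes C and C++ source; purely as an analogy for the rule-based scanning it enables, the sketch below walks Python's own AST and flags constructs an authentication review might treat as suspicious. The rule set is invented and illustrative, not a ROSE rule set.

    ```python
    # Analogy only: rule-based AST scanning using Python's ast module.
    import ast

    SUSPICIOUS_CALLS = {"eval", "exec", "compile"}
    SUSPICIOUS_IMPORTS = {"socket", "subprocess"}

    def scan(source):
        """Return findings for dynamic-code and networking constructs."""
        findings = []
        for node in ast.walk(ast.parse(source)):
            if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
                if node.func.id in SUSPICIOUS_CALLS:
                    findings.append(f"line {node.lineno}: call to {node.func.id}")
            elif isinstance(node, ast.Import):
                for alias in node.names:
                    if alias.name in SUSPICIOUS_IMPORTS:
                        findings.append(f"line {node.lineno}: import {alias.name}")
        return findings

    sample = "import socket\nresult = eval(user_input)\n"
    for finding in scan(sample):
        print(finding)
    ```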

  1. MATLAB Software Versions and Licenses for the Peregrine System |

    Science.gov (United States)

    High-Performance Computing | NREL. Learn about the MATLAB software versions and licenses for the Peregrine system. The most recent MATLAB version available on Peregrine is R2017b. Licenses: MATLAB is proprietary software. As such, users have access to a limited number

  2. Software Design for Smile Analysis

    Directory of Open Access Journals (Sweden)

    A. Sarkhosh

    2010-12-01

    Full Text Available Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record the "posed smile" as an intentional, non-pressured, static, natural and reproducible smile. The record should then be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software "Smile Analysis", which can receive patients' photographs and videographs. After giving records to the software, the operator should mark the points and lines which are displayed on the system's guide and also define the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (0.7-1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the smile analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of the treatment progress.
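    The measurement core of such a tool reduces to calibrating a pixel scale from a reference of known length and converting landmark distances to millimetres. The sketch below shows that step only; the landmark coordinates and reference length are hypothetical, and this is not the "Smile Analysis" code itself.

    ```python
    # Scale calibration and a single landmark measurement, illustrative only.
    import math

    def pixel_distance(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    # Operator marks a reference of known size (e.g. a 10 mm ruler tick).
    ref_px = pixel_distance((100, 200), (180, 200))
    mm_per_px = 10.0 / ref_px

    # Two smile landmarks, e.g. left and right commissures, in pixels.
    left_commissure, right_commissure = (310, 415), (530, 422)
    smile_width_mm = pixel_distance(left_commissure, right_commissure) * mm_per_px
    print(f"smile width: {smile_width_mm:.1f} mm")
    ```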

  3. Optimization of Antivirus Software

    OpenAIRE

    Catalin BOJA; Adrian VISOIU

    2007-01-01

    The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined that help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyzes...

  4. Open Source Software Development

    Science.gov (United States)

    2011-01-01

    appropriate to refer to FOSS or FLOSS (L for Libre, where the alternative term "libre software" has popularity in some parts of the world) in order... [Reference: Applying Social Network Analysis to Community-Driven Libre Software Projects, Intern. J. Info. Tech. and Web Engineering, 2006, 1(3), 27-28.] Open Source Software Development, Walt Scacchi, Institute for Software Research, University of California, Irvine, Irvine, CA 92697-3455 USA

  5. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concerns in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and qualit

  6. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic impact in software production. As is also apparent from the call for papers, these perspectives focus on qualities such as reuse, adaptability, and maintainability....

  7. The Impact of Computer and Mathematics Software Usage on Performance of School Leavers in the Western Cape Province of South Africa: A Comparative Analysis

    Science.gov (United States)

    Smith, Garth Spencer; Hardman, Joanne

    2014-01-01

    In this study, the impact of computer immersion on the performance of school leavers' Senior Certificate mathematics scores was investigated across 31 schools in the EMDC East education district of Cape Town, South Africa, by comparing performance between two groups: a control and an experimental group. The experimental group (14 high schools) had access…

  8. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  9. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  10. XES Software Communication Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  11. Neutron Scattering Software

    Science.gov (United States)

    A new portal for neutron scattering software has just been established. Available software includes KUPLOT (data plotting and fitting software) and ILL/TAS (Matlab programs for analyzing triple axis data).

  12. XES Software Event Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  13. ARC Software and Models

    Science.gov (United States)

    Most research conducted at the ARC produces software code and methodologies that are transferred to TARDEC and industry partners. These

  14. XES Software Telemetry Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  15. Specifications in software prototyping

    OpenAIRE

    Luqi; Chang, Carl K.; Zhu, Hong

    1998-01-01

    We explore the use of software specifications for software prototyping. This paper describes a process model for software prototyping, and shows how specifications can be used to support such a process via a cellular mobile phone switch example.

  16. Software Acquisition and Software Engineering Best Practices

    National Research Council Canada - National Science Library

    Eslinger, S

    1999-01-01

    The purpose of this white paper is to address the issues raised in the recently published Senate Armed Services Committee Report 106-50 concerning Software Management Improvements for the Department of Defense (DoD...

  17. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what kind of standard the software should meet. In a software project, quality is the key factor in the success or decline of a software organization. Much research has been done regarding software quality. Software organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  18. Strategies for successful software development risk management

    Directory of Open Access Journals (Sweden)

    Marija Boban

    2003-01-01

    Full Text Available Nowadays, software is becoming a major part of enterprise business. Software development is an activity connected with advanced technology and a high level of knowledge. Risks on software development projects must be successfully mitigated to produce successful software systems. Lack of a defined approach to risk management is one of the common causes of project failures. To improve a project's chances for success, this work investigates common risk impact areas to provide a foundation that can be used to define a common approach to software risk management. Based on typical risk impact areas on software development projects, we propose three risk management strategies suitable for a broad range of enterprises and software development projects with different amounts of connected risks. The proposed strategies define activities that should be performed for successful risk management, ones that will enable software development projects to perceive risks as soon as possible and to solve problems connected with risk materialization. We also propose a risk-based approach to software development planning and risk management as attempts to address and retire the highest impact risks as early as possible in the development process. The proposed strategies should improve risk management on software development projects and help create a successful software solution.
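    The prioritization step common to such strategies is often made concrete as risk exposure = probability x impact, with the highest-exposure risks addressed earliest. The sketch below shows that calculation; the risks and figures are invented, and this is an illustration of the general technique rather than the strategies from the record.

    ```python
    # Risk exposure ranking: probability times impact, sorted descending.
    risks = [
        {"name": "unstable requirements",  "prob": 0.6, "impact_weeks": 8},
        {"name": "key developer leaves",   "prob": 0.2, "impact_weeks": 12},
        {"name": "third-party API change", "prob": 0.4, "impact_weeks": 3},
    ]

    for r in risks:
        r["exposure"] = r["prob"] * r["impact_weeks"]

    # Address the highest-exposure risks earliest in the development process.
    for r in sorted(risks, key=lambda r: r["exposure"], reverse=True):
        print(f"{r['name']:24s} exposure = {r['exposure']:.1f} weeks")
    ```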

  19. Avionics Simulation, Development and Software Engineering

    Science.gov (United States)

    2002-01-01

    During this reporting period, all technical responsibilities were accomplished as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14), the MSFC EXPRESS Project Office (FD31), and the Huntsville Boeing Company. Accomplishments included: performing special tasks; supporting Software Review Board (SRB), Avionics Test Bed (ATB), and EXPRESS Software Control Panel (ESCP) activities; participating in technical meetings; and coordinating issues between the Boeing Company and the MSFC Project Office.

  20. Security Risk Assessment in Software Development Projects

    OpenAIRE

    Svendsen, Heidi

    2017-01-01

    Software security is increasing in importance, linearly with vulnerabilities caused by software flaws. It is not possible to spend all of a project's resources on software security. To spend the resources given to security in an effective way, one should know what is most important to protect. By performing a risk analysis the project knows which vulnerabilities it faces. A risk analysis will prioritise the vulnerabilities, and when the vulnerabilities are prioritised the project knows where th...

  1. Software Reuse Within the Earth Science Community

    Science.gov (United States)

    Marshall, James J.; Olding, Steve; Wolfe, Robert E.; Delnore, Victor E.

    2006-01-01

    Scientific missions in the Earth sciences frequently require cost-effective, highly reliable, and easy-to-use software, which can be a challenge for software developers to provide. The NASA Earth Science Enterprise (ESE) spends a significant amount of resources developing software components and other software development artifacts that may also be of value if reused in other projects requiring similar functionality. In general, software reuse is often defined as utilizing existing software artifacts. Software reuse can improve productivity and quality while decreasing the cost of software development, as documented by case studies in the literature. Since large software systems are often the results of the integration of many smaller and sometimes reusable components, ensuring reusability of such software components becomes a necessity. Indeed, designing software components with reusability as a requirement can increase the software reuse potential within a community such as the NASA ESE community. The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group is chartered to oversee the development of a process that will maximize the reuse potential of existing software components while recommending strategies for maximizing the reusability potential of yet-to-be-designed components. As part of this work, two surveys of the Earth science community were conducted. The first was performed in 2004 and distributed among government employees and contractors. A follow-up survey was performed in 2005 and distributed among a wider community, to include members of industry and academia. The surveys were designed to collect information on subjects such as the current software reuse practices of Earth science software developers, why they choose to reuse software, and what perceived barriers prevent them from reusing software. In this paper, we compare the results of these surveys, summarize the observed trends, and discuss the findings. The results are very

  2. Licensing safety critical software

    International Nuclear Information System (INIS)

    Archinoff, G.H.; Brown, R.A.

    1990-01-01

    Licensing difficulties with the shutdown system software at the Darlington Nuclear Generating Station contributed to delays in starting up the station. Even though the station has now been given approval by the Atomic Energy Control Board (AECB) to operate, the software issue has not disappeared - Ontario Hydro has been instructed by the AECB to redesign the software. This article attempts to explain why software based shutdown systems were chosen for Darlington, why there was so much difficulty licensing them, and what the implications are for other safety related software based applications

  3. Software Validation in ATLAS

    International Nuclear Information System (INIS)

    Hodgkinson, Mark; Seuster, Rolf; Simmons, Brinick; Sherwood, Peter; Rousseau, David

    2012-01-01

    The ATLAS collaboration operates an extensive set of protocols to validate the quality of the offline software in a timely manner. This is essential in order to process the large amounts of data being collected by the ATLAS detector in 2011 without complications on the offline software side. We will discuss a number of different strategies used to validate the ATLAS offline software; running the ATLAS framework software, Athena, in a variety of configurations daily on each nightly build via the ATLAS Nightly System (ATN) and Run Time Tester (RTT) systems; the monitoring of these tests and checking the compilation of the software via distributed teams of rotating shifters; monitoring of and follow up on bug reports by the shifter teams and periodic software cleaning weeks to improve the quality of the offline software further.

  4. Management of Software Development Projects

    Directory of Open Access Journals (Sweden)

    Felician ALECU

    2011-04-01

    Full Text Available Any major software development starts with the Initiating process group. Once the charter document is approved, the Planning and then the Executing stages will follow. Monitoring and Controlling measures the potential performance deviation of the project in terms of schedule and costs and performs the related Integrated Change Control activities. At the end, during Closing, the program/project manager will check that the entire work is completed and the objectives are met.
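    The schedule-and-cost deviation measurement described above is commonly quantified with earned value management; the sketch below computes the standard indices for invented figures. This is a generic illustration, not a method given in the record.

    ```python
    # Earned value management: schedule and cost performance indices.
    # All monetary figures are invented for illustration.
    budget_at_completion = 500_000   # planned total cost
    planned_value = 200_000          # value of work scheduled by now
    earned_value = 170_000           # value of work actually completed
    actual_cost = 190_000            # money actually spent

    spi = earned_value / planned_value   # schedule performance index
    cpi = earned_value / actual_cost     # cost performance index

    print(f"SPI = {spi:.2f}  (<1 means behind schedule)")
    print(f"CPI = {cpi:.2f}  (<1 means over budget)")
    print(f"estimate at completion = {budget_at_completion / cpi:,.0f}")
    ```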

  5. Patterns of Software Development Process

    Directory of Open Access Journals (Sweden)

    Sandro Javier Bolaños Castro

    2011-12-01

    Full Text Available "Times New Roman","serif";mso-fareast-font-family:"Times New Roman";mso-ansi-language:EN-US;mso-fareast-language:EN-US;mso-bidi-language:AR-SA">This article presents a set of patterns that can be found to perform best practices in software processes that are directly related to the problem of implementing the activities of the process, the roles involved, the knowledge generated and the inputs and outputs belonging to the process. In this work, a definition of the architecture is encouraged by using different recurrent configurations that strengthen the process and yield efficient results for the development of a software project. The patterns presented constitute a catalog, which serves as a vocabulary for communication among project participants [1], [2], and also can be implemented through software tools, thus facilitating patterns implementation [3]. Additionally, a tool that can be obtained under GPL (General Public license is provided for this purpose

  6. A Cloverleaf of Software Engineering

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2005-01-01

    We shall touch upon four issues of software engineering (SE): domain engineering, formal techniques, SE sociology, and academic software architects. First, before software can be designed one must understand its requirements; but before requirements can be formulated one must understand the domain, however "lite". Third, despite 35 years of formal methods, the SE industry, maturity-wise, still lags far behind that of other engineering disciplines. So we examine why. Finally, in several areas, in health care, in architecture, and others, we see that major undertakings are primarily spearheaded by senior academic staff. Professors of medicine daily perform specialized surgery and treatments at hospitals. Professors of architecture design new, daring buildings for industry, and professors of civil engineering head the engineering structural design of new, daring bridges. So we speculate what...

  7. Virtual Exercise Training Software System

    Science.gov (United States)

    Vu, L.; Kim, H.; Benson, E.; Amonette, W. E.; Barrera, J.; Perera, J.; Rajulu, S.; Hanson, A.

    2018-01-01

    The purpose of this study was to develop and evaluate a virtual exercise training software system (VETSS) capable of providing real-time instruction and exercise feedback during exploration missions. A resistive exercise instructional system was developed using a Microsoft Kinect depth-camera device, which provides markerless 3-D whole-body motion capture at a small form factor and minimal setup effort. It was hypothesized that subjects using the newly developed instructional software tool would perform the deadlift exercise with more optimal kinematics and consistent technique than those without the instructional software. Following a comprehensive evaluation in the laboratory, the system was deployed for testing and refinement in the NASA Extreme Environment Mission Operations (NEEMO) analog.
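    Kinect-style markerless capture yields 3-D joint positions, and form feedback such as a knee angle during a deadlift then reduces to a vector-angle computation. The sketch below shows that step only; the joint coordinates are hypothetical and this is not the VETSS code.

    ```python
    # Joint angle from three 3-D points (hip, knee, ankle). Illustrative only.
    import math

    def angle_deg(a, b, c):
        """Angle at joint b formed by segments b->a and b->c, in degrees."""
        v1 = [a[i] - b[i] for i in range(3)]
        v2 = [c[i] - b[i] for i in range(3)]
        dot = sum(x * y for x, y in zip(v1, v2))
        n1 = math.sqrt(sum(x * x for x in v1))
        n2 = math.sqrt(sum(x * x for x in v2))
        return math.degrees(math.acos(dot / (n1 * n2)))

    # Hypothetical skeleton coordinates in metres.
    hip, knee, ankle = (0.0, 0.9, 0.1), (0.0, 0.5, 0.2), (0.0, 0.1, 0.1)
    print(f"knee angle: {angle_deg(hip, knee, ankle):.1f} deg")
    ```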

  8. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  9. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    Software Maintenance and Evolution: The Implication for Software Development. ... Software maintenance is the process of modifying existing operational software by correcting errors, ...

  10. An Empirical Study of a Free Software Company

    OpenAIRE

    Pakusch, Cato

    2010-01-01

    Free software has matured well into the commercial software market, yet little qualitative research exists which accurately describes the state of commercial free software today. For this thesis, an instrumental case study was performed on a prominent free software company in Norway. The study found that the commercial free software market is largely driven by social networks, which have a social capital of their own that attracts more people, who in turn become members of the ...

  11. The software quality control for gamma spectrometry

    International Nuclear Information System (INIS)

    Monte, L.

    1986-01-01

    One of the major problems with which the quality control program of an environmental measurements laboratory is confronted is the evaluation of the performance of software packages for the analysis of gamma-ray spectra. A program of tests for evaluating the performance of the software package (SPECTRAN-F, Canberra Inc.) used by our laboratory is being carried out. In this first paper, the results of a preliminary study concerning the evaluation of the performance of the doublet analysis routine are presented
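    A doublet-analysis check of this kind can be exercised on a synthetic spectrum: two overlapping Gaussian peaks on a flat background are generated with known areas, fitted, and compared against the truth. The sketch below (using numpy and scipy, with all peak parameters invented) illustrates the test idea, not the SPECTRAN-F routine.

    ```python
    # Synthetic doublet fit: two Gaussians plus flat background, Poisson noise.
    import numpy as np
    from scipy.optimize import curve_fit

    def doublet(x, a1, mu1, a2, mu2, sigma, bkg):
        g1 = a1 * np.exp(-0.5 * ((x - mu1) / sigma) ** 2)
        g2 = a2 * np.exp(-0.5 * ((x - mu2) / sigma) ** 2)
        return g1 + g2 + bkg

    rng = np.random.default_rng(0)
    x = np.arange(480.0, 560.0)
    true = (900.0, 510.0, 600.0, 518.0, 3.0, 50.0)  # invented parameters
    y = rng.poisson(doublet(x, *true)).astype(float)

    popt, _ = curve_fit(doublet, x, y, p0=(800, 509, 500, 519, 2.5, 40))
    for name, t, f in zip(("a1", "mu1", "a2", "mu2", "sigma", "bkg"), true, popt):
        print(f"{name:5s} true={t:7.1f} fitted={f:7.1f}")
    ```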

  12. Good practices for educational software engineering projects

    NARCIS (Netherlands)

    van der Duim, Louwarnoud; Andersson, Jesper; Sinnema, Marco

    2007-01-01

    Recent publications indicate the importance of software engineering in the computer science curriculum. In this paper, we present the final part of software engineering education at University of Groningen in the Netherlands and Vaxjo University in Sweden, where student teams perform an industrial

  13. Expert System Software Assistant for Payload Operations

    Science.gov (United States)

    Rogers, Mark N.

    1997-01-01

    The broad objective of this expert system software based application was to demonstrate the enhancements and cost savings that can be achieved through expert system software utilization in a spacecraft ground control center. Spacelab provided a valuable proving ground for this advanced software technology; a technology that will be exploited and expanded for future ISS operations. Our specific focus was on demonstrating payload cadre command and control efficiency improvements through the use of "smart" software which monitors flight telemetry, provides enhanced schematic-based data visualization, and performs advanced engineering data analysis.

  14. Example of software configuration management model

    International Nuclear Information System (INIS)

    Roth, P.

    2006-01-01

    Software configuration management is the mechanism used to track and control software changes and may include the following actions: A tracking system should be established for any changes made to the existing software configuration. Requirements of the configuration management system are the following: - Back up the different software configurations; - Record the details (the date, the subject, the filenames, the supporting documents, the tests, ...) of the changes introduced in the new configuration; - Document all the differences between the different versions. Configuration management allows simultaneous exploitation of one specific version and development of the next version. Minor corrections can be performed in the current exploitation version

  15. Improvements for Optics Measurement and Corrections software

    CERN Document Server

    Bach, T

    2013-01-01

    This note presents the improvements made to the OMC software during a 14-month technical student internship at CERN. The goal of the work was to improve existing software in terms of maintainability, features and performance. Significant improvements in stability, speed and the overall development process were achieved. The main software, a Java GUI at the LHC CCC, ran for months without noteworthy problems. The overall running time of the software chain used for optics corrections was reduced from nearly half an hour to around two minutes. This was the result of analysing and improving several of the programs and algorithms involved.

  16. Software qualification for digital safety system in KNICS project

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Lee, Dong-Young; Choi, Jong-Gyun

    2012-01-01

    In order to achieve technical self-reliance in the area of nuclear instrumentation and control, the Korea Nuclear Instrumentation and Control System (KNICS) project ran for seven years from 2001. The safety-grade Programmable Logic Controller (PLC) and the digital safety system were developed by the KNICS project. All the software of the PLC and the digital safety system was developed and verified following the software development life cycle Verification and Validation (V and V) procedure. The main activities of the V and V process are the preparation of software planning documentation; verification of the Software Requirement Specification (SRS), Software Design Specification (SDS) and codes; and testing of the software components, the integrated software, and the integrated system. In addition, a software safety analysis and software configuration management are included in the activities. For the software safety analysis at the SRS and SDS phases, a software Hazard and Operability (HAZOP) study was performed and then software fault tree analysis was applied. The software fault tree analysis was applied to a part of a software module with some critical defects identified by the software HAZOP in the SDS phase. The software configuration management was performed using the in-house tool developed in the KNICS project. (author)
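
    To illustrate the fault-tree step in isolation (a toy sketch, not the KNICS tool; the gate structure, event names and probabilities are all invented), independent basic-event probabilities can be combined through AND/OR gates to estimate a top-event probability:

```python
# Minimal fault-tree evaluation: combine independent basic-event
# probabilities through AND/OR gates. All names and numbers invented.
def p_and(*probs):           # all inputs must fail together
    out = 1.0
    for p in probs:
        out *= p
    return out

def p_or(*probs):            # any single input failing is enough
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical top event: "trip signal not generated".
p_sensor_read_fault = 1e-4
p_setpoint_corrupt = 5e-5
p_voter_logic_defect = 2e-5

p_top = p_or(p_and(p_sensor_read_fault, p_setpoint_corrupt),
             p_voter_logic_defect)
print(f"top event probability: {p_top:.2e}")
```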

  17. A company perspective on software engineering standards

    International Nuclear Information System (INIS)

    Steer, R.W.

    1988-01-01

    Software engineering standards, as implemented via formal policies and procedures, have historically been used in the nuclear industry, especially for codes used in the design, analysis, or operation of the plant. Over the past two decades, a significant amount of software has been put in place to perform these functions, while the overall software life cycle has become better understood, more and different computer systems have become available, and industry has become increasingly aware of the advantages gained when these procedures are used in the development and maintenance of this large amount of software. The use of standards and attendant procedures is thus becoming increasingly important as more computerization is taking place, both in the design and the operation of the plant. It is difficult to categorize software used in activities related to nuclear plants in a simple manner. That difficulty is due to the diversity of those uses, with attendant diversity in the methods and procedures used in the production of the software, compounded by a changing business climate in which significant software engineering expertise is being applied to a broader range of applications on a variety of computing systems. The use of standards in the various phases of the production of software thus becomes more difficult as well. This paper discusses the various types of software and the importance of software standards in the development of each of them

  18. Development of a software for the curimeter model cdn102

    International Nuclear Information System (INIS)

    Dotres Llera, Armando

    2001-01-01

    The characteristics of the software for the Curimeter Model CD-N102 developed at CEADEN are presented. The software consists of two main parts: a basic software for the electrometer block and an application software for a PC. The basic software is totally independent of the PC and performs all the basic functions of the measurement process. The application software is optional and offers a friendlier interface and additional options to the user. Among these are the possibility of keeping a statistical record of the measurements in a database, creating labels, and introducing new isotopes and calibrating them. A more detailed explanation of both programs is given.
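
    A minimal sketch of the "statistical record of measurements in a database" feature, assuming an SQLite store; the table layout, isotope names and values are invented for illustration:

```python
import sqlite3
import statistics

# Store each activity measurement, then summarize per isotope.
conn = sqlite3.connect("curimeter.db")
conn.execute("""CREATE TABLE IF NOT EXISTS measurements (
    isotope TEXT, activity_mbq REAL, measured_at TEXT)""")

def record(isotope: str, activity_mbq: float, measured_at: str) -> None:
    conn.execute("INSERT INTO measurements VALUES (?, ?, ?)",
                 (isotope, activity_mbq, measured_at))
    conn.commit()

def summary(isotope: str):
    """Return count, mean, and standard deviation for one isotope."""
    rows = conn.execute(
        "SELECT activity_mbq FROM measurements WHERE isotope = ?",
        (isotope,)).fetchall()
    values = [r[0] for r in rows]
    return len(values), statistics.mean(values), statistics.stdev(values)

record("Tc-99m", 512.3, "2001-06-01T09:30")
record("Tc-99m", 498.7, "2001-06-01T10:15")
print(summary("Tc-99m"))
```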

  19. Software engineering in industry

    Science.gov (United States)

    Story, C. M.

    1989-12-01

    Can software be "engineered"? Can a few people with limited resources and a negligible budget produce high quality software solutions to complex software problems? Is it possible to resolve the conflict between research activities and the necessity to view software development as a means to an end rather than as an end in itself? The aim of this paper is to encourage further thought and discussion on various topics which, in the author's experience, are becoming increasingly critical in large current software production and development projects, inside and outside high energy physics (HEP). This is done by briefly exploring some of the software engineering ideas and technologies now used in the information industry, using, as a case-study, a project with many similarities to those currently under way in HEP.

  20. A software product certification model

    NARCIS (Netherlands)

    Heck, P.M.; Klabbers, M.D.; van Eekelen, Marko

    2010-01-01

    Certification of software artifacts offers organizations more certainty and confidence about software. Certification of software helps software sales, acquisition, and can be used to certify legislative compliance or to achieve acceptable deliverables in outsourcing. In this article, we present a

  1. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The report considers why verification of software products is necessary throughout the software life cycle. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed.

  2. Overview of NWIS Software

    International Nuclear Information System (INIS)

    Mullens, J.A.

    1999-01-01

    The Nuclear Weapons Identification System (NWIS) is a system that performs radiation signature measurements on objects such as nuclear weapons components. NWIS consists of a 252Cf fission source, radiation detectors and associated analog electronics, data acquisition boards, and a computer running Windows NT and the application software. NWIS uses signal processing techniques to produce a radiation signature from the radiation emitted by the object. This signature can be stored and later compared to another signature to determine whether two objects are similar. A library of such signatures can be used to identify objects in closed containers as well as to determine attributes such as fissile mass and, in some cases, enrichment. NWIS uses a 252Cf source on one side of the object to produce radiation that its detectors measure on the other side of the target (active mode). If the object naturally emits enough radiation, the 252Cf source is not required (passive mode). The NWIS data acquisition hardware has five detector channels. Each channel receives shaped detector pulses and times those pulses with 1 nanosecond resolution. In active mode measurements one of these channels receives pulses from a detector measuring the 252Cf source fissions. Thus, for active mode measurements, NWIS has the time of each 252Cf fission and the subsequent injection of neutrons and gamma rays into the object. The remaining channels receive pulses from the detectors measuring radiation from the object. These detectors record the amount and time of radiation exiting the object. By correlating the radiation events among the source and the other detectors, and among the detectors themselves, a characteristic response of the object to 252Cf radiation or its own internal radiation is measured. The data acquisition hardware consists of two custom-made boards. The Data Capture and Compression (DCC) board is built around a Gallium Arsenide (GaAs) chip designed at ORNL. This chip assigns a time
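
    The core signal-processing idea here, correlating source fission times with detector event times, can be sketched as a delay histogram. This is a toy illustration under the assumption of sorted nanosecond timestamps; it is not the NWIS code, and all data below are invented:

```python
import numpy as np

def correlogram(source_times, detector_times, window_ns=200.0, bin_ns=2.0):
    """Histogram detector-event delays relative to each source fission.

    A toy version of source-detector time correlation: for each source
    event, collect the arrival times of detector events that follow it
    within `window_ns`. Structure in the histogram reflects how the
    object transports neutrons and gamma rays from source to detector.
    Both input arrays must be sorted in ascending time order.
    """
    delays = []
    j = 0
    for t0 in source_times:
        while j < len(detector_times) and detector_times[j] < t0:
            j += 1                      # skip events before this fission
        k = j
        while k < len(detector_times) and detector_times[k] - t0 < window_ns:
            delays.append(detector_times[k] - t0)
            k += 1
    return np.histogram(delays, bins=np.arange(0.0, window_ns + bin_ns, bin_ns))

# Invented 1-ns-resolution timestamps, just to exercise the function.
src = np.sort(np.random.default_rng(1).uniform(0, 1e6, 500))
det = np.sort(np.random.default_rng(2).uniform(0, 1e6, 2000))
hist, edges = correlogram(src, det)
```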

  3. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. Explains two maintenance standards: IEEE/EIA 1219 and ISO/IEC 14764. Discusses several commercial reverse and domain engineering toolkits. Slides for instructors are available online. Information is based on the IEEE SWEBOK (Software Engineering Body of Knowledge).

  4. Hospital Management Software Development

    OpenAIRE

    sobogunGod, olawale

    2012-01-01

    The purpose of this thesis was to implement hospital management software suitable for small private hospitals in Nigeria, especially for the ones that use a file-based system for storing information rather than having it stored in a more efficient and safer environment like databases or Excel spreadsheets. The software developed within this thesis project was specifically designed for the Rainbow specialist hospital, which is based in Lagos, the commercial neurological cente...

  5. Computer software configuration management

    International Nuclear Information System (INIS)

    Pelletier, G.

    1987-08-01

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  6. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1993-01-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an object-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere)

  7. Software quality management

    International Nuclear Information System (INIS)

    Bishop, D.C.; Pymm, P.

    1991-01-01

    As programmable electronic (software-based) systems are increasingly being proposed as design solutions for high integrity applications in nuclear power stations, the need to adopt suitable quality management arrangements is paramount. The authors describe Scottish Nuclear's strategy for software quality management and, using the main on-line monitoring system at Torness Power Station as an example, explain how this strategy is put into practice. Particular attention is given to the topics of software quality planning and change control. (author)

  8. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners. ... It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice, with a focus on the software process policymaking and process control aspects of improvement efforts...

  9. Assuring Software Reliability

    Science.gov (United States)

    2014-08-01

    technologies and processes to achieve a required level of confidence that software systems and services function in the intended manner. 1.3 Security Example... that took three high-voltage lines out of service and a software failure (a race condition) that disabled the computing service that notified the... service had failed. Instead of analyzing the details of the alarm server failure, the reviewers asked why the following software assurance claim had

  10. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu

    2004-01-01

    This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), which is an XML-based metaprogramming technique. As the software evolves, a large number of variants may arise, especially when such kinds of evolution are related to multiple platforms, as shown in our... case study. Handling variants and tracing the impact of variants across the development lifecycle is a challenge. This chapter shows how we can maintain different versions of software in a reuse-based way...

  11. FASTBUS software status

    International Nuclear Information System (INIS)

    Gustavson, D.B.

    1980-10-01

    Computer software will be needed in addition to the mechanical, electrical, protocol and timing specifications of the FASTBUS, in order to facilitate the use of this flexible new multiprocessor and multisegment data acquisition and processing system. Software considerations have been important in the FASTBUS design, but standard subroutines and recommended algorithms will be needed as the FASTBUS comes into use. This paper summarizes current FASTBUS software projects, goals and status

  12. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  13. Solar Asset Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Iverson, Aaron [Ra Power Management, Inc., Oakland, CA (United States); Zviagin, George [Ra Power Management, Inc., Oakland, CA (United States)

    2016-09-30

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  14. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  15. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering. Software Properties. Origins of Software. Birth of Software Engineering. Third Paradigm: Iterative Approach. Software Life Span Models. Staged Model. Variants of Staged Model. Software Technologies: Programming Languages and Compilers. Object-Oriented Technology. Version Control System. Software Models. Class Diagrams. UML Activity Diagrams. Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change. Characteristics of Software Change. Phases of Software Change. Requirements and Their Elicitation. Requirements Analysis and Change Initiation. Concepts and Concept

  16. Agile software development

    CERN Document Server

    Dingsoyr, Torgeir; Moe, Nils Brede

    2010-01-01

    Agile software development has become an umbrella term for a number of changes in how software developers plan and coordinate their work, how they communicate with customers and external stakeholders, and how software development is organized in small, medium, and large companies, from the telecom and healthcare sectors to games and interactive media. Still, after a decade of research, agile software development is the source of continued debate due to its multifaceted nature and insufficient synthesis of research results. Dingsoyr, Dyba, and Moe now present a comprehensive snapshot of the kno

  17. Optimization of Antivirus Software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined that help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyzes some of the optimization concepts applied to this category of applications.

  18. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various programming languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as to run individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories), finds all directories that match a certain pattern, and then executes any tests in that directory as described in simple configuration files.
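
    A sketch of the pattern described, walking a directory tree for test configuration files and distributing the tests over CPU cores; this is not dtest's actual code, and the configuration file name and format are invented for illustration:

```python
import os
import subprocess
from concurrent.futures import ProcessPoolExecutor

def find_tests(root, config_name="TESTS.cfg"):
    """Yield (directory, command) pairs from simple config files found
    anywhere under `root`. One shell command per non-comment line."""
    for dirpath, _dirnames, filenames in os.walk(root):
        if config_name in filenames:
            with open(os.path.join(dirpath, config_name)) as cfg:
                for line in cfg:
                    cmd = line.strip()
                    if cmd and not cmd.startswith("#"):
                        yield dirpath, cmd

def run_test(job):
    """Run one test command in its directory; return its exit status."""
    dirpath, cmd = job
    result = subprocess.run(cmd, shell=True, cwd=dirpath,
                            capture_output=True, text=True)
    return dirpath, cmd, result.returncode

if __name__ == "__main__":
    jobs = list(find_tests("."))
    with ProcessPoolExecutor() as pool:      # one worker per CPU core
        for dirpath, cmd, rc in pool.map(run_test, jobs):
            status = "PASS" if rc == 0 else "FAIL"
            print(f"{status}  {dirpath}: {cmd}")
```

    ProcessPoolExecutor is used rather than threads so that CPU-bound tests genuinely run in parallel; a thread pool would suffice if the tests were mostly I/O-bound subprocess waits.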

  19. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. * Includes supplementary website with an instructor's guide and solutions * Applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI) * Illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors

  20. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa