WorldWideScience

Sample records for high consequence software

  1. Achieving strategic surety for high consequence software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1996-09-01

    A strategic surety roadmap for high consequence software systems under the High Integrity Software (HIS) Program at Sandia National Laboratories guides research in identifying methodologies to improve software surety. Selected research tracks within this roadmap are identified and described, detailing current technology and outlining advancements to be pursued over the coming decade to reach HIS goals. The tracks discussed herein focus on Correctness by Design and System Immunology™. Specific projects are discussed with greater detail given on projects involving Correct Specification via Visualization, Synthesis, & Analysis; Visualization of Abstract Objects; and Correct Implementation of Components.

  2. Dynamic visualization techniques for high consequence software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-02-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.
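
    The core idea of monitoring an executing program against prespecified requirements constraints can be sketched as follows. This is an illustrative sketch only; it does not use the SAGE language or the SAVAnT tool described in the report, and the constraint names and event-stream interface are hypothetical.

```python
# Illustrative sketch of runtime requirements-constraint monitoring
# (not the SAGE language or the SAVAnT tool; names are hypothetical).

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Constraint:
    name: str
    check: Callable[[Dict[str, float]], bool]  # predicate over observed state

class RuntimeMonitor:
    def __init__(self, constraints: List[Constraint]):
        self.constraints = constraints
        self.violations: List[str] = []

    def observe(self, state: Dict[str, float]) -> None:
        """Call on every instrumented event of the program under test."""
        for c in self.constraints:
            if not c.check(state):
                self.violations.append(f"{c.name} violated at state {state}")

# Example: two hypothetical requirements on monitored variables.
monitor = RuntimeMonitor([
    Constraint("temperature_below_limit", lambda s: s["temp"] < 100.0),
    Constraint("pressure_non_negative", lambda s: s["pressure"] >= 0.0),
])

for event in [{"temp": 42.0, "pressure": 1.0}, {"temp": 120.0, "pressure": 0.5}]:
    monitor.observe(event)

print(monitor.violations)  # reports the temperature violation
```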

  3. A strategic surety roadmap for high consequence software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.; Dalton, L.J.

    1995-12-31

    A strategic surety roadmap for high consequence software systems developed under the High Integrity Software (HIS) Program at Sandia National Laboratories is presented. Selected research tracks are identified and described detailing current technology and outlining advancements to be pursued over the coming decade to reach HIS goals.

  4. An Embedded System for Safe, Secure and Reliable Execution of High Consequence Software

    Energy Technology Data Exchange (ETDEWEB)

    MCCOY,JAMES A.

    2000-08-29

    As more complex and functionally diverse requirements are placed on high consequence embedded applications, ensuring safe and secure operation requires an execution environment that is ultra reliable from a system viewpoint. In many cases the safety and security of the system depends upon the reliable cooperation between the hardware and the software to meet real-time system throughput requirements. The selection of a microprocessor and its associated development environment for an embedded application has more far-reaching effects on the development and production of the system than any other element in the design. The effects of this choice ripple through the remainder of the hardware design and profoundly affect the entire software development process. While state-of-the-art software engineering principles indicate that an object oriented (OO) methodology provides a superior development environment, traditional programming languages available for microprocessors targeted for deeply embedded applications do not directly support OO techniques. Furthermore, the microprocessors themselves typically neither support nor enforce an OO environment. This paper describes a system level approach for the design of a microprocessor intended for use in deeply embedded high consequence applications that both supports and enforces an OO execution environment.

  5. Software reliability: failures, consequences and improvement

    African Journals Online (AJOL)

    BARTH EKWUEME

    2009-07-16

    Jul 16, 2009 ... function of time, but it is believed that some modeling technique for software reliability is reaching propensity, by ..... February 25, 1991 during the Gulf war, the chopping ... Let us consider a few key concepts that apply to both.

  6. Engineering high quality medical software

    CERN Document Server

    Coronato, Antonio

    2018-01-01

    This book focuses on high-confidence medical software in the growing field of e-health, telecare services and health technology. It covers the development of methodologies and engineering tasks together with standards and regulations for medical software.

  7. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  8. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
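
    The essence of the reduced-form approach, fitting a single regression equation to synthetic data generated by a complex model, can be illustrated with a minimal sketch. The complex_model() function below is a stand-in for an expensive simulation, not the CGE model used in E-CAT, and the variable names are hypothetical.

```python
# Minimal sketch of the "reduced form" idea: run a complex model many times
# on sampled inputs, then fit a simple regression that approximates it.
# complex_model() is a stand-in, not a CGE model.

import numpy as np

rng = np.random.default_rng(0)

def complex_model(threat_size, duration, resilience):
    # Stand-in for an expensive simulation run.
    return 5.0 * threat_size + 2.0 * duration - 3.0 * resilience \
        + 0.5 * threat_size * duration + rng.normal(0, 0.1)

# Generate synthetic data by sampling threat characteristics.
X_raw = rng.uniform(0, 1, size=(500, 3))
y = np.array([complex_model(*x) for x in X_raw])

# Reduced-form design matrix: intercept, main effects, one interaction term.
X = np.column_stack([np.ones(len(y)), X_raw, X_raw[:, 0] * X_raw[:, 1]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# The fitted equation can now produce rapid approximate estimates.
print("reduced-form coefficients:", np.round(coef, 2))
```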

  9. An examination of the consequences in high consequence operations

    Energy Technology Data Exchange (ETDEWEB)

    Spray, S.D.; Cooper, J.A.

    1996-06-01

    Traditional definitions of risk partition concern into the probability of occurrence and the consequence of the event. Most safety analyses focus on probabilistic assessment of an occurrence and the amount of some measurable result of the event, but the real meaning of the "consequence" partition is usually afforded less attention. In particular, acceptable social consequence (consequence accepted by the public) frequently differs significantly from the metrics commonly proposed by risk analysts. This paper addresses some of the important system development issues associated with consequences, focusing on "high consequence operations safety."

  10. SOFTWARE QUALITY ASSURANCE FOR EMERGENCY RESPONSE CONSEQUENCE ASSESSMENT MODELS AT DOE'S SAVANNAH RIVER SITE

    International Nuclear Information System (INIS)

    Hunter, C

    2007-01-01

    The Savannah River National Laboratory's (SRNL) Atmospheric Technologies Group develops, maintains, and operates computer-based software applications for use in emergency response consequence assessment at DOE's Savannah River Site. These applications range from straightforward, stand-alone Gaussian dispersion models run with simple meteorological input to complex computational software systems with supporting scripts that simulate highly dynamic atmospheric processes. A software quality assurance program has been developed to ensure appropriate lifecycle management of these software applications. This program was designed to meet fully the overall structure and intent of SRNL's institutional software QA programs, yet remain sufficiently practical to achieve the necessary level of control in a cost-effective manner. A general overview of this program is described.

  11. High performance in software development

    CERN Multimedia

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever tried. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or with other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from distributed storage and the large-scale organization of computation and data down to the lowest level of processor and data bus behavior. Integrating performance behavior over these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures due to the deep memory hierarchies of modern general-purpose computers. As a r...
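
    As a small illustration of the point about vector processing (a generic example, not code from the talk), an operation expressed over whole arrays lets the underlying library use contiguous memory and SIMD hardware, while an element-by-element loop does not:

```python
# Contrast between an interpreted element-wise loop and a vectorized
# operation that exploits contiguous memory and SIMD instructions.
import time
import numpy as np

a = np.random.rand(10_000_000)
b = np.random.rand(10_000_000)

t0 = time.perf_counter()
s_loop = 0.0
for i in range(len(a)):          # poor locality per element, no SIMD
    s_loop += a[i] * b[i]
t1 = time.perf_counter()

s_vec = np.dot(a, b)             # contiguous access, vectorized kernel
t2 = time.perf_counter()

print(f"loop: {t1 - t0:.2f} s, vectorized: {t2 - t1:.3f} s")
```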

  12. Consequence Prioritization Process for Potential High Consequence Events (HCE)

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Sarah G. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-10-31

    This document describes the process for Consequence Prioritization, the first phase of the Consequence-Driven Cyber-Informed Engineering (CCE) framework. The primary goal of Consequence Prioritization is to identify potential disruptive events that would significantly inhibit an organization’s ability to provide the critical services and functions deemed fundamental to their business mission. These disruptive events, defined as High Consequence Events (HCE), include both events that have already occurred and events that could be realized through an attack on critical infrastructure owner assets. While other efforts have been initiated to identify and mitigate disruptive events at the national security level, such as Presidential Policy Directive 41 (PPD-41), this process is intended to be used by individual organizations to evaluate events that fall below the threshold of a national security concern. Described another way, Consequence Prioritization considers threats greater than those addressable by standard cyber-hygiene and includes the consideration of events that go beyond a traditional continuity of operations (COOP) perspective. Finally, Consequence Prioritization is most successful when organizations adopt a multi-disciplinary approach, engaging both cyber security and engineering expertise, as in-depth engineering perspectives are required to recognize, characterize, and mitigate HCEs. Figure 1 provides a high-level overview of the prioritization process.

  13. High Confidence Software and Systems Research Needs

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This White Paper presents a survey of high confidence software and systems research needs. It has been prepared by the High Confidence Software and Systems...

  14. Evaluation of high-performance computing software

    Energy Technology Data Exchange (ETDEWEB)

    Browne, S.; Dongarra, J. [Univ. of Tennessee, Knoxville, TN (United States); Rowan, T. [Oak Ridge National Lab., TN (United States)

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations difficult to find elsewhere.

  15. Low-Incidence, High-Consequence Pathogens

    Centers for Disease Control (CDC) Podcasts

    2014-02-21

    Dr. Stephan Monroe, a deputy director at CDC, discusses the impact of low-incidence, high-consequence pathogens globally.  Created: 2/21/2014 by National Center for Emerging and Zoonotic Infectious Diseases (NCEZID).   Date Released: 2/26/2014.

  16. Assuring quality in high-consequence engineering

    Energy Technology Data Exchange (ETDEWEB)

    Hoover, Marcey L.; Kolb, Rachel R.

    2014-03-01

    In high-consequence engineering organizations, such as Sandia, quality assurance may be heavily dependent on staff competency. Competency-dependent quality assurance models are at risk when the environment changes, as it has with increasing attrition rates, budget and schedule cuts, and competing program priorities. Risks in Sandia's competency-dependent culture can be mitigated through changes to hiring, training, and customer engagement approaches to manage people, partners, and products. Sandia's technical quality engineering organization has been able to mitigate corporate-level risks by driving changes that benefit all departments, and in doing so has assured Sandia's commitment to excellence in high-consequence engineering and national service.

  17. Simulation of high consequence areas for gas pipelines

    Directory of Open Access Journals (Sweden)

    Orlando Díaz-Parra

    2018-01-01

    Gas pipelines are used to transport natural gas over great distances. The risks derived from handling a combustible material transported under high pressure, through pipelines that pass close to where people live, make it necessary to adopt prevention, mitigation and control measures to reduce the effects of ignition of a gas leak. This work presents the development of a new mathematical model to determine high consequence areas and its application, using widely available and easy-to-use software such as Google Earth and Excel, to determine and visualize the area up to which the level of radiation can affect the integrity of people and buildings. The model takes into account the pressure drop along the gas pipeline from the compression station, the gas leakage rate and the possible forms of gas ignition. This development is an alternative to the use of specialized software and highly trained personnel. The simulation is applied to the route of the Miraflores-Tunja gas pipeline, using a macro developed in Excel to determine the impact area and compare it with the coordinates of the vulnerable areas. The zones where these areas intersect constitute high consequence areas and are identified along with the sections of the pipeline that affect them, providing the operator with a risk analysis tool for the determination and visualization of the gas pipeline and its environment.
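
    The report's own model is not reproduced here, but the flavor of such a calculation can be shown with the widely used potential impact radius (PIR) relation from ASME B31.8S (also referenced in the U.S. gas pipeline integrity management rule), which estimates the distance within which people and buildings could be affected by the ignition of a gas release. The pipeline parameters in the example are hypothetical.

```python
# Potential impact radius (PIR) for a natural gas pipeline, per the widely
# used ASME B31.8S relation r = 0.69 * d * sqrt(p)
# (d in inches, p in psig, r in feet). This illustrates the kind of
# calculation involved; it is not the specific model of this paper.
import math

def potential_impact_radius_ft(diameter_in: float, pressure_psig: float) -> float:
    return 0.69 * diameter_in * math.sqrt(pressure_psig)

# Hypothetical 20-inch line operating at 1000 psig:
r_ft = potential_impact_radius_ft(20.0, 1000.0)
print(f"PIR ≈ {r_ft:.0f} ft ({r_ft * 0.3048:.0f} m)")
```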

  18. Storage system software solutions for high-end user needs

    Science.gov (United States)

    Hogan, Carole B.

    1992-01-01

    Today's high-end storage user is one that requires rapid access to a reliable terabyte-capacity storage system running in a distributed environment. This paper discusses conventional storage system software and concludes that this software, designed for other purposes, cannot meet high-end storage requirements. The paper also reviews the philosophy and design of evolving storage system software. It concludes that this new software, designed with high-end requirements in mind, provides the potential for solving not only the storage needs of today but those of the foreseeable future as well.

  19. Method for critical software event execution reliability in high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Kidd, M.E. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This report contains viewgraphs on a method called SEER, which provides a high level of confidence that critical software-driven event execution sequences faithfully execute in the face of transient computer architecture failures in both normal and abnormal operating environments.

  20. The Evolution of Software in High Energy Physics

    International Nuclear Information System (INIS)

    Brun, René

    2012-01-01

    The paper reviews the evolution of the software in High Energy Physics from the time of expensive mainframes to grids and clouds systems using thousands of multi-core processors. It focuses on the key parameters or events that have shaped the current software infrastructure.

  1. Accident Damage Analysis Module (ADAM) – Technical Guidance, Software tool for Consequence Analysis calculations

    OpenAIRE

    Fabbri, Luciano; Binda, Massimo; Bruinen de Bruin, Yuri

    2017-01-01

    This report provides a technical description of the modelling and assumptions of the Accident Damage Analysis Module (ADAM) software application, which has been recently developed by the Joint Research Centre (JRC) of the European Commission (EC) to assess the physical effects of an industrial accident resulting from an unintended release of a dangerous substance.

  2. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated an order-of-magnitude speed-up in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  3. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage the large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis core photos and images; waveforms and NMR; and external files documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting.

  4. Simulation of high consequence areas for gas pipelines

    OpenAIRE

    Orlando Díaz-Parra; Enrique Vera-López

    2018-01-01

    Gas pipelines are used to transport natural gas over great distances. The risks derived from handling a combustible material transported under high pressure, through pipelines that pass close to where people live, make it necessary to adopt prevention, mitigation and control measures to reduce the effects of ignition of a gas leak. This work presents the development of a new mathematical model to determine high consequence areas and its application, using widely available and...

  5. High-energy physics software parallelization using database techniques

    International Nuclear Information System (INIS)

    Argante, E.; Van der Stok, P.D.V.; Willers, I.

    1997-01-01

    A programming model for software parallelization, called CoCa, is introduced that copes with problems caused by typical features of high-energy physics software. By basing CoCa on the database transaction paradigm, the complexity induced by the parallelization is largely transparent to the programmer, resulting in a higher level of abstraction than the native message passing software. CoCa is implemented on a Meiko CS-2 and on a SUN SPARCcenter 2000 parallel computer. On the CS-2, the performance is comparable with that of native PVM and MPI. (orig.)

  6. High-Assurance Software: LDRD Report.

    Energy Technology Data Exchange (ETDEWEB)

    Hulette, Geoffrey Compton

    2014-06-01

    This report summarizes our work on methods for developing high-assurance digital systems. We present an approach for understanding and evaluating trust issues in digital systems, and for using computer-checked proofs as a means for realizing this approach. We describe the theoretical background for programming with proofs based on the Curry-Howard correspondence, connecting the field of logic and proof theory to programs. We then describe a series of case studies, intended to demonstrate how this approach might be adopted in practice. In particular, our studies elucidate some of the challenges that arise with this style of certified programming, including induction principles, generic programming, termination requirements, and reasoning over infinite state spaces.
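
    The Curry-Howard correspondence mentioned above identifies propositions with types and proofs with programs. A minimal example in Lean (illustrative only, not drawn from the report or its case studies) is shown below: the proof of A → B → A is literally the constant function.

```lean
-- Under Curry-Howard, the theorem statement is a type, and the term
-- `fun a _ => a` (a function that ignores its second argument) is both
-- a program of that type and a proof of the proposition.
theorem const_proof (A B : Prop) : A → B → A :=
  fun a _ => a
```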

  7. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    Science.gov (United States)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.

  8. Survey of industry methods for producing highly reliable software

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Persons, W.L.

    1994-11-01

    The Nuclear Reactor Regulation Office of the US Nuclear Regulatory Commission is charged with assessing the safety of new instrument and control designs for nuclear power plants which may use computer-based reactor protection systems. Lawrence Livermore National Laboratory has evaluated the latest techniques in software reliability for measurement, estimation, error detection, and prediction that can be used during the software life cycle as a means of risk assessment for reactor protection systems. One aspect of this task has been a survey of the software industry to collect information to help identify the design factors used to improve the reliability and safety of software. The intent was to discover what practices really work in industry and what design factors are used by industry to achieve highly reliable software. The results of the survey are documented in this report. Three companies participated in the survey: Computer Sciences Corporation, International Business Machines (Federal Systems Company), and TRW. Discussions were also held with NASA Software Engineering Lab/University of Maryland/CSC, and the AIAA Software Reliability Project

  9. Conference on High Performance Software for Nonlinear Optimization

    CERN Document Server

    Murli, Almerico; Pardalos, Panos; Toraldo, Gerardo

    1998-01-01

    This book contains a selection of papers presented at the conference on High Performance Software for Nonlinear Optimization (HPSNO97), which was held in Ischia, Italy, in June 1997. The rapid progress of computer technologies, including new parallel architectures, has stimulated a large amount of research devoted to building software environments and defining algorithms able to fully exploit this new computational power. In some sense, numerical analysis has to conform itself to the new tools. The impact of parallel computing in nonlinear optimization, which had a slow start at the beginning, seems now to increase at a fast rate, and it is reasonable to expect an even greater acceleration in the future. As with the first HPSNO conference, the goal of the HPSNO97 conference was to supply a broad overview of the more recent developments and trends in nonlinear optimization, emphasizing the algorithmic and high performance software aspects. Bringing together new computational methodologies with theoretical...

  10. Automated Theorem Proving in High-Quality Software Design

    Science.gov (United States)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e. a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the area of aviation, (nuclear) power plants or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the area of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in this area are not only subject to safety considerations, but also security issues are important. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in Software Engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like the illegal "cloning" of smart cards in D2 GSM mobile phones or the extraction of (secret) passwords from German T-Online users, show that serious flaws can also happen in this area. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high quality of the software and lead to truly safe and secure systems. In this paper, we examine to what extent automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  11. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  12. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  13. High level issues in reliability quantification of safety-critical software

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2012-01-01

    For the purpose of developing a consensus method for the reliability assessment of safety-critical digital instrumentation and control systems in nuclear power plants, several high-level issues in the reliability assessment of safety-critical software based on Bayesian belief network modeling and statistical testing are discussed. Related to Bayesian belief network modeling, the relation between the assessment approach and the sources of evidence, the relation between qualitative and quantitative evidence, how to consider qualitative evidence, and the cause-consequence relation are discussed. Related to statistical testing, the need to consider context-specific software failure probabilities and the inability to perform a huge number of tests in the real world are discussed. The discussions in this paper are expected to provide a common basis for future discussions on the reliability assessment of safety-critical software. (author)
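
    The point about the infeasibility of demonstrating very small failure probabilities by testing can be made concrete with a standard back-of-the-envelope bound (not taken from the paper): if N independent, operationally representative tests all succeed, the upper confidence bound on the per-demand failure probability is approximately -ln(alpha)/N at confidence 1-alpha, so very small target probabilities require very large N.

```python
# Upper confidence bound on per-demand failure probability after N
# failure-free tests: solve (1 - p)^N = alpha, i.e. p = 1 - alpha**(1/N),
# which is approximately -ln(alpha)/N for small p. Standard reliability
# arithmetic, not specific to the paper.
import math

def tests_needed(p_target: float, confidence: float = 0.95) -> int:
    alpha = 1.0 - confidence
    return math.ceil(math.log(alpha) / math.log(1.0 - p_target))

for p in (1e-3, 1e-4, 1e-5):
    print(f"p < {p:g} at 95% confidence requires ~{tests_needed(p):,} failure-free tests")
# p < 1e-4 already needs ~30,000 failure-free, statistically independent tests.
```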

  14. Proceedings of the High Consequence Operations Safety Symposium

    Energy Technology Data Exchange (ETDEWEB)

    1994-12-01

    Many organizations face high consequence safety situations where unwanted stimuli due to accidents, catastrophes, or inadvertent human actions can cause disasters. In order to improve interaction among such organizations and to build on each other's experience, preventive approaches, and assessment techniques, the High Consequence Operations Safety Symposium was held July 12--14, 1994 at Sandia National Laboratories, Albuquerque, New Mexico. The symposium was conceived by Dick Schwoebel, Director of the SNL Surety Assessment Center. Stan Spray, Manager of the SNL System Studies Department, planned strategy and made many of the decisions necessary to bring the concept to fruition on a short time scale. Angela Campos and about 60 people worked on the nearly limitless implementation and administrative details. The initial symposium (future symposia are planned) was structured around 21 plenary presentations in five methodology-oriented sessions, along with a welcome address, a keynote address, and a banquet address. Poster papers addressing the individual session themes were available before and after the plenary sessions and during breaks.

  15. High-Level Synthesis: Productivity, Performance, and Software Constraints

    Directory of Open Access Journals (Sweden)

    Yun Liang

    2012-01-01

    FPGAs are an attractive platform for applications with high computation demand and low energy consumption requirements. However, design effort for FPGA implementations remains high—often an order of magnitude larger than design effort using high-level languages. Instead of this time-consuming process, high-level synthesis (HLS) tools generate hardware implementations from algorithm descriptions in languages such as C/C++ and SystemC. Such tools reduce design effort: high-level descriptions are more compact and less error prone. HLS tools promise hardware development abstracted from software designer knowledge of the implementation platform. In this paper, we present an unbiased study of the performance, usability and productivity of HLS using AutoPilot (a state-of-the-art HLS tool). In particular, we first evaluate AutoPilot using popular embedded benchmark kernels. Then, to evaluate the suitability of HLS on real-world applications, we perform a case study of stereo matching, an active area of computer vision research that uses techniques also common to image denoising, image retrieval, feature matching, and face recognition. Based on our study, we provide insights on current limitations of mapping general-purpose software to hardware using HLS and some future directions for HLS tool development. We also offer several guidelines for hardware-friendly software design. For popular embedded benchmark kernels, the designs produced by HLS achieve 4X to 126X speedup over the software version. The stereo matching algorithms achieve between 3.5X and 67.9X speedup over software (but still less than manual RTL design), with a fivefold reduction in design effort versus manual RTL design.

  16. 49 CFR 195.452 - Pipeline integrity management in high consequence areas.

    Science.gov (United States)

    2010-10-01

    § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered... that could affect a high consequence area, including any pipeline located in a high consequence area...

  17. Assessments of high-consequence platforms: Issues and applications

    International Nuclear Information System (INIS)

    Digre, K.A.; Craig, M.J.K.

    1994-01-01

    An API task group has developed a process for the assessment of existing platforms to determine their fitness for purpose. This has been released as a draft supplement to API RP 2A-WSD, 20th edition. Details and the background of this work are described in a companion paper. The assessment of a platform's fitness for purpose involves firstly a definition of the platform's exposure; and secondly, an evaluation of the platform's predicted performance relative to the assessment criteria associated with that exposure. This paper deals with platforms in the high exposure category. That is, platforms whose potential failure consequences, in terms of potential life loss and environmental damage, are significant. The criteria for placement of a platform in a high exposure category are explained, as are the performance criteria demanded of these high exposure platforms. In the companion paper, the metocean assessment process and associated API-developed acceptance criteria are highlighted. This paper addresses primarily ice and seismic loading assessments and associated API-developed criteria, which are based on over thirty years of successful offshore operation and field experience, as well as extrapolation of land-based performance criteria. Three West Coast, USA production platforms are used for illustration

  18. Software Systems for High-performance Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; Britt, Keith A [ORNL

    2016-01-01

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present the quantum programming and execution models, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  19. High-consequence analysis, evaluation, and application of select criteria

    International Nuclear Information System (INIS)

    Gutmanis, I.; Jaksch, J.A.

    1984-01-01

    A number of characteristics distinguish environmental risk from pollution problems. These characteristics make environmental risk problems harder to manage through existing regulatory, legal, and economic institutions. Hence, technologies involving environmental risk impose on society extremely difficult collective decisions. This paper is concerned with the process of reaching social decisions that involve low-probability, high-consequence outcomes. It is divided into five major parts. Part I contains the introduction. Part II reviews the two main classes of criteria that have been proposed for social decisions: approaches based on market mechanisms and their extensions, and approaches associated with Rawls and Buchanan, which not only focus on outcomes but also impose a set of minimal constraints on the process for reaching decisions and social consensus. Part III proposes a set of eight criteria for evaluating social decision processes. In Parts IV and V we investigate applying the criteria to two case studies -- one on nuclear waste disposal and the other on transportation of liquefied natural gas.

  20. Mechanisms of Evolution in High-Consequence Drug Resistance Plasmids.

    Science.gov (United States)

    He, Susu; Chandler, Michael; Varani, Alessandro M; Hickman, Alison B; Dekker, John P; Dyda, Fred

    2016-12-06

    The dissemination of resistance among bacteria has been facilitated by the fact that resistance genes are usually located on a diverse and evolving set of transmissible plasmids. However, the mechanisms generating diversity and enabling adaptation within highly successful resistance plasmids have remained obscure, despite their profound clinical significance. To understand these mechanisms, we have performed a detailed analysis of the mobilome (the entire mobile genetic element content) of a set of previously sequenced carbapenemase-producing Enterobacteriaceae (CPE) from the National Institutes of Health Clinical Center. This analysis revealed that plasmid reorganizations occurring in the natural context of colonization of human hosts were overwhelmingly driven by genetic rearrangements carried out by replicative transposons working in concert with the process of homologous recombination. A more complete understanding of the molecular mechanisms and evolutionary forces driving rearrangements in resistance plasmids may lead to fundamentally new strategies to address the problem of antibiotic resistance. The spread of antibiotic resistance among Gram-negative bacteria is a serious public health threat, as it can critically limit the types of drugs that can be used to treat infected patients. In particular, carbapenem-resistant members of the Enterobacteriaceae family are responsible for a significant and growing burden of morbidity and mortality. Here, we report on the mechanisms underlying the evolution of several plasmids carried by previously sequenced clinical Enterobacteriaceae isolates from the National Institutes of Health Clinical Center (NIH CC). Our ability to track genetic rearrangements that occurred within resistance plasmids was dependent on accurate annotation of the mobile genetic elements within the plasmids, which was greatly aided by access to long-read DNA sequencing data and knowledge of their mechanisms. Mobile genetic elements such as

  1. Design of High Temperature Reactor Vessel Using ANSYS Software

    International Nuclear Information System (INIS)

    Bandriyana; Kasmudin

    2003-01-01

    Design calculations and evaluation of material strength for a high temperature reactor vessel, based on the design of the HTR-10 high temperature reactor vessel, were carried out using the ANSYS 5.4 software. ANSYS was applied to calculate the combined load from thermal and pressure loads. Evaluation of material strength was performed by calculating the distribution of temperature, stress and strain through the thickness of the vessel wall and comparing it with the design strength of the material. The calculation was based on an inner wall temperature of 600 °C and outer wall temperatures of 500 and 600 °C. The calculation gave a maximum stress of 288 N/mm² and a strain of 0.000187 for an outer temperature of 600 °C. For an outer temperature of 500 °C the maximum stress was 576 N/mm² and the strain 0.003. Based on the analysis, steel SA 516-70, with a design stress limit of 308 N/mm², can be used as the vessel material for an outer wall temperature of 600 °C.
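
    For orientation only, the thermal component of such a combined load can be estimated with the standard closed-form expression for the stress produced by a linear temperature gradient across a thin cylindrical wall, σ ≈ EαΔT / (2(1−ν)). The sketch below uses generic carbon-steel property values as assumptions; it is not the ANSYS finite element model of the report, which also includes the pressure load.

```python
# Rough closed-form estimate of thermal stress across a vessel wall with a
# linear through-thickness temperature gradient: sigma = E*alpha*dT/(2*(1-nu)).
# Material properties are generic carbon-steel assumptions, not values taken
# from the report, and the real analysis combines thermal and pressure loads
# in a finite element model.

E = 200e9        # Young's modulus, Pa (assumed)
alpha = 1.2e-5   # thermal expansion coefficient, 1/K (assumed)
nu = 0.3         # Poisson's ratio (assumed)
dT = 100.0       # inner-to-outer wall temperature difference, K (600 vs 500 C case)

sigma_thermal = E * alpha * dT / (2.0 * (1.0 - nu))
print(f"thermal stress ≈ {sigma_thermal / 1e6:.0f} MPa (N/mm^2)")
```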

  2. Software on the Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    NREL maintains a variety of applications and environment modules for use on the Peregrine system. Applications: view the list of software applications by name and research area/discipline. Libraries: view the list of software libraries available for linking and loading.

  3. The Future of Software Engineering for High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Pope, G [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-16

    DOE ASCR requested that from May through mid-July 2015 a study group identify issues and recommend solutions from a software engineering perspective for the transition into the next generation of High Performance Computing. The approach used was to ask some of the DOE complex experts who will be responsible for doing this work to contribute to the study group. The technique used was to solicit elevator speeches: a short and concise write-up done as if the author were a speaker with only a few minutes to convince a decision maker of their top issues. Pages 2-18 contain the original texts of the contributed elevator speeches and end notes identifying the 20 contributors. The study group also ranked the importance of each topic, and those scores are displayed with each topic heading. A perfect score (and highest priority) is three, two is medium priority, and one is lowest priority. The highest scoring topic areas were software engineering and testing resources; the lowest scoring area was compliance with DOE standards. The following two paragraphs are an elevator speech summarizing the contributed elevator speeches. Each sentence or phrase in the summary is hyperlinked to its source via a numeral embedded in the text. A risk one-liner has also been added to each topic to allow future risk tracking and mitigation.

  4. Architecture of high reliable control systems using complex software

    International Nuclear Information System (INIS)

    Tallec, M.

    1990-01-01

    The problems involved in the use of complex software in control systems that must ensure a very high level of safety are examined. The first part gives a brief description of the PROSPER prototype. PROSPER stands for protection system for nuclear reactors with high performance. It was installed at a French nuclear power plant at the beginning of 1987 and has been operating continuously since that time. The prototype is realized on a multi-processor system. The processors communicate with each other using interrupts and protected shared memories. On each processor, one or more protection algorithms are implemented. Those algorithms use data coming directly from the plant and, possibly, data computed by the other protection algorithms. Each processor makes its own acquisitions from the process and sends warning messages if an operating anomaly is detected. All algorithms are activated concurrently in an asynchronous way. The results are presented and the safety-related problems are detailed. The second part concerns the validation of measurements. First, we describe how the sensors' measurements are used in a protection system. Then, a method based on artificial intelligence techniques (expert systems and neural networks) is proposed. The last part concerns the architecture of systems including hardware and software: the different types of redundancy used so far are detailed, and a multi-processor architecture is proposed which uses an operating system able to manage several tasks implemented on different processors, to verify the correct operation of each of those tasks and of the related processors, and to allow the system to continue operating, even in a degraded manner, when a failure has been detected. [fr]

  5. High energy physics experiment triggers and the trustworthiness of software

    International Nuclear Information System (INIS)

    Nash, T.

    1991-10-01

    For all the time and frustration that high energy physicists expend interacting with computers, it is surprising that more attention is not paid to the critical role computers play in the science. With large, expensive colliding beam experiments now dependent on complex programs working at startup, questions of reliability -- the trustworthiness of software -- need to be addressed. This issue is most acute in triggers, used to select data to record -- and data to discard -- in the real-time environment of an experiment. High level triggers are built on codes that now exceed 2 million source lines -- and for the first time experiments are truly dependent on them. This dependency will increase at the accelerators planned for the new millennium (SSC and LHC), where cost and other pressures will reduce tolerance for first-run problems, and the high luminosities will make this on-line data selection essential. A sense of this incipient crisis motivated the unusual juxtaposition of topics in these lectures. 37 refs., 1 fig.

  6. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); LeCompte, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Marshall, Zach [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Borgland, Anders [SLAC National Accelerator Lab., Menlo Park, CA (United States); Viren, Brett [Brookhaven National Lab. (BNL), Upton, NY (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Asai, Makato [SLAC National Accelerator Lab., Menlo Park, CA (United States); Bauerdick, Lothar [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Gottlieb, Steve [Indiana Univ., Bloomington, IN (United States); Hoeche, Stefan [SLAC National Accelerator Lab., Menlo Park, CA (United States); Sheldon, Paul [Vanderbilt Univ., Nashville, TN (United States); Vay, Jean-Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Elmer, Peter [Princeton Univ., NJ (United States); Kirby, Michael [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Patton, Simon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Potekhin, Maxim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yanny, Brian [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Calafiura, Paolo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gutsche, Oliver [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Izubuchi, Taku [Brookhaven National Lab. (BNL), Upton, NY (United States); Lyon, Adam [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Petravick, Don [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA)

    2015-10-29

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  7. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  8. Consequences of corrosion of the high flux reactor channels

    International Nuclear Information System (INIS)

    1987-01-01

    The effects of corrosion can increase the probability of the channel losing its seal. In the case of a slow leak, the phenomena involved can be considered quasi-static. The safety valve closes even before the leak water reaches the level of the exit window. In the case of a fast leak in helium-filled channels, the dynamic effects are limited to the front part of the plug. As for the back part of the plug and the housing/safety-valve unit, the consequences of a fast leak are comparable to those of a slow leak. This paper evaluates the consequences of such an incident for the reactor and the surrounding experimental zones.

  9. Development of object-oriented software technique in field of high energy and nuclear physics

    International Nuclear Information System (INIS)

    Ye Yanlin; Ying Jun; Chen Tao

    1997-01-01

    The background for developing object-oriented software techniques in high energy and nuclear physics is introduced. The progress made at CERN and in the US is outlined. The merits and future of various software techniques are discussed.

  10. High-integrity software, computation and the scientific method

    International Nuclear Information System (INIS)

    Hatton, L.

    2012-01-01

    Computation rightly occupies a central role in modern science. Datasets are enormous and the processing implications of some algorithms are equally staggering. With the continuing difficulties in quantifying the results of complex computations, it is of increasing importance to understand the role of computation in the essentially Popperian scientific method. In this paper, some of the problems with computation, for example the long-term, unquantifiable presence of undiscovered defects, problems with programming languages, and process issues, will be explored with numerous examples. One of the aims of the paper is to understand the implications of trying to produce high-integrity software and the limitations which still exist. Unfortunately, Computer Science itself suffers from an inability to be suitably critical of its practices and has operated in a largely measurement-free vacuum since its earliest days. Within computer science itself, this has not been so damaging in that it simply leads to unconstrained creativity and a rapid turnover of new technologies. In the applied sciences, however, which have to depend on computational results, such unquantifiability significantly undermines trust. It is time this particular demon was put to rest. (author)

  11. Software for Displaying High-Frequency Test Data

    Science.gov (United States)

    Elmore, Jason L.

    2003-01-01

    An easy-to-use, intuitive computer program was written to satisfy a need of test operators and data requestors to quickly view and manipulate high-frequency test data recorded at the East and West Test Areas at Marshall Space Flight Center. By enabling rapid analysis, this program makes it possible to reduce times between test runs, thereby potentially reducing the overall cost of test operations. The program can be used to perform quick frequency analysis, using multiple fast-Fourier-transform windowing and amplitude options. The program can generate amplitude-versus-time plots with full zoom capabilities, frequency-component plots at specified time intervals, and waterfall plots (plots of spectral intensity versus frequency at successive small time intervals, showing the changing frequency components over time). There are options for printing of the plots and saving plot data as text files that can be imported into other application programs. The program can perform all of the aforementioned plotting and plot-data-handling functions on a relatively inexpensive computer; other software that performs the same functions requires computers with large amounts of power and memory.
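
    The kinds of plots described, amplitude versus time, spectra at specified intervals, and waterfall plots, can be generated with standard signal processing libraries. A minimal sketch is shown below; it is illustrative only, not the MSFC program, and the test signal is synthetic.

```python
# Minimal sketch of the plot types described: amplitude vs. time, a windowed
# FFT of one interval, and a waterfall/spectrogram view. Illustrative only;
# the signal here is synthetic test data.
import numpy as np
import matplotlib.pyplot as plt

fs = 10_000                                  # sample rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 500 * t) + 0.5 * np.sin(2 * np.pi * (1000 + 400 * t) * t)

fig, (ax1, ax2, ax3) = plt.subplots(3, 1, figsize=(8, 9))

ax1.plot(t, x)                               # amplitude vs. time
ax1.set(title="Amplitude vs. time", xlabel="s")

seg = x[:4096] * np.hanning(4096)            # windowed FFT of one interval
freqs = np.fft.rfftfreq(4096, 1.0 / fs)
ax2.plot(freqs, np.abs(np.fft.rfft(seg)))
ax2.set(title="Spectrum of first interval", xlabel="Hz")

ax3.specgram(x, NFFT=1024, Fs=fs, noverlap=512)   # waterfall-style view
ax3.set(title="Spectrogram (waterfall)", xlabel="s", ylabel="Hz")

plt.tight_layout()
plt.show()
```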

  12. Software Development of High-Precision Ephemerides of Solar System

    Directory of Open Access Journals (Sweden)

    Jong-Seob Shin

    1995-06-01

    Full Text Available We solved the n-body problem for the 9 planets, the Moon, and 4 minor planets, including relativistic effects in the basic equations of motion of the solar system. Perturbations including the figure potentials of the Earth and the Moon and the solid Earth tidal effect were considered in this relativistic equation of motion. The orientations employed precession and nutation for the Earth, and a lunar libration model based on Eckert's lunar libration model referred to J2000.0 was used for the Moon. Finally, we computed the heliocentric ecliptic position and velocity of each planet using this software package, named SSEG (Solar System Ephemerides Generator), through long-term (more than 100 years) simulation on a CRAY-2S supercomputer, after testing each subroutine on a personal computer and short-term (within 800 days) runs on a SUN3/280 workstation. An input epoch of JD 2440400.5 was adopted in order to compare our results with the data derived from JPL's DE200 by Standish and Newhall. The above equations of motion were integrated numerically with a 1-day step size over 40,000 days (about 110 years) as the total computing interval. We obtained high-precision ephemerides of the planets with a maximum error of less than ~2 × 10⁻⁸ AU (≈ ±3 km) compared with the DE200 data (except for Mars and the Moon).
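
    The record above describes fixed-step numerical integration of the planetary equations of motion with a 1-day step. The sketch below illustrates that general idea with a purely Newtonian leapfrog integrator; the two-body setup, units, and integrator choice are illustrative assumptions and do not reproduce the relativistic SSEG formulation.

```python
# Sketch: fixed-step integration of a Newtonian n-body system, illustrating the
# general approach of stepping the equations of motion with a 1-day step. This is
# not the relativistic SSEG formulation; bodies, units, and integrator are assumed.
import numpy as np

G = 2.959122082855911e-4   # gravitational parameter, AU^3 / (solar mass * day^2)

def accelerations(pos, masses):
    """Pairwise Newtonian gravitational accelerations."""
    acc = np.zeros_like(pos)
    for i in range(len(masses)):
        diff = pos - pos[i]                      # vectors from body i to every body
        dist3 = np.sum(diff ** 2, axis=1) ** 1.5
        dist3[i] = np.inf                        # exclude self-interaction
        acc[i] = G * np.sum(masses[:, None] * diff / dist3[:, None], axis=0)
    return acc

def integrate(pos, vel, masses, dt=1.0, n_steps=365):
    """Leapfrog (kick-drift-kick) integration with step dt in days."""
    acc = accelerations(pos, masses)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, masses)
        vel += 0.5 * dt * acc
    return pos, vel

# Illustrative Sun + Earth-like planet setup in heliocentric-style units.
masses = np.array([1.0, 3.0e-6])                             # solar masses
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])           # AU
vel = np.array([[0.0, 0.0, 0.0], [0.0, 0.0172, 0.0]])        # ~2*pi/365 AU/day
pos, vel = integrate(pos, vel, masses, dt=1.0, n_steps=365)
print(pos[1])   # roughly back near (1, 0, 0) after one orbital period
```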

  13. The Software Architecture of the LHCb High Level Trigger

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The LHCb experiment is a spectrometer dedicated to the study of heavy flavor at the LHC. The rate of proton-proton collisions at the LHC is 15 MHz, but disk space limitations mean that only 3 kHz can be written to tape for offline processing. For this reason the LHCb data acquisition system -- trigger -- plays a key role in selecting signal events and rejecting background. In contrast to previous experiments at hadron colliders, such as CDF or D0, the bulk of the LHCb trigger is implemented in software and deployed on a farm of 20k parallel processing nodes. This system, called the High Level Trigger (HLT), is responsible for reducing the rate from the maximum at which the detector can be read out, 1.1 MHz, to the 3 kHz which can be processed offline, and has 20 ms in which to process and accept/reject each event. In order to minimize systematic uncertainties, the HLT was designed from the outset to reuse the offline reconstruction and selection code, and is based around multiple independent and redunda...

  14. Parallelization of an existing high energy physics event reconstruction software package

    International Nuclear Information System (INIS)

    Schiefer, R.; Francis, D.

    1996-01-01

    Software parallelization allows an efficient use of available computing power to increase the performance of applications. In a case study the authors have investigated the parallelization of high energy physics event reconstruction software in terms of costs (effort, computing resource requirements), benefits (performance increase) and the feasibility of a systematic parallelization approach. Guidelines facilitating a parallel implementation are proposed for future software development
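
    The study above concerns event-level parallelization of reconstruction code. The sketch below shows the simplest form of that idea, farming independent events out to a pool of worker processes; the reconstruct() stand-in, the synthetic event list, and the use of Python's multiprocessing module are assumptions for illustration only.

```python
# Sketch: event-level parallelism with a pool of worker processes. The reconstruct()
# stand-in and the synthetic event list are placeholders; the original case study
# parallelized an existing reconstruction package, not this toy kernel.
from multiprocessing import Pool

def reconstruct(event):
    # Stand-in for the per-event reconstruction work (track fitting, hit matching, ...).
    return {"event_id": event["id"], "n_tracks": len(event["hits"]) // 3}

events = [{"id": i, "hits": list(range((i % 7) + 3))} for i in range(1000)]

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        results = pool.map(reconstruct, events, chunksize=50)
    print(results[0])
```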

  15. Prevalence and consequences of substance use among high school ...

    African Journals Online (AJOL)

    Alcohol, khat and cigarettes were commonly used by both high school and college students in urban as well as rural areas. While the use patterns of the substances were related to the gender, education/age and religion of the users, no clear-cut patterns were observed in relation to several other factors including ...

  16. High plasma uric acid concentration: causes and consequences

    Directory of Open Access Journals (Sweden)

    de Oliveira Erick

    2012-04-01

    Full Text Available High plasma uric acid (UA) is a precipitating factor for gout and renal calculi as well as a strong risk factor for Metabolic Syndrome and cardiovascular disease. The main causes of higher plasma UA are lower excretion, higher synthesis, or both. Higher waist circumference and BMI are associated with higher insulin resistance and leptin production, and both reduce uric acid excretion. The synthesis of fatty acids (triglycerides) in the liver is associated with the de novo synthesis of purine, accelerating UA production. The role played by diet in hyperuricemia has not yet been fully clarified, but high intake of fructose-rich industrialized food and high alcohol intake (particularly beer) seem to influence uricemia. It is not known whether UA is a causal factor or an antioxidant protective response. Most authors do not consider UA a risk factor, but rather attribute to it an antioxidant function; UA contributes > 50% of the antioxidant capacity of the blood. There is still no consensus on whether UA is a protective or a risk factor; it seems, however, that acute elevation is protective, whereas chronic elevation is a risk for disease.

  17. The consequences of a new software package for the quantification of gated-SPECT myocardial perfusion studies

    International Nuclear Information System (INIS)

    Veen, Berlinda J. van der; Dibbets-Schneider, Petra; Stokkel, Marcel P.M.; Scholte, Arthur J.

    2010-01-01

    Semiquantitative analysis of myocardial perfusion scintigraphy (MPS) has reduced inter- and intraobserver variability, and enables researchers to compare parameters in the same patient over time, or between groups of patients. There are several software packages available that are designed to process MPS data and quantify parameters. In this study the performances of two systems, quantitative gated SPECT (QGS) and 4D-MSPECT, in the processing of clinical patient data and phantom data were compared. The clinical MPS data of 148 consecutive patients were analysed using QGS and 4D-MSPECT to determine the end-diastolic volume, end-systolic volume and left ventricular ejection fraction. Patients were divided into groups based on gender, body mass index, heart size, stressor type and defect type. The AGATE dynamic heart phantom was used to provide reference values for the left ventricular ejection fraction. Although the correlations were excellent (correlation coefficients 0.886 to 0.980) for all parameters, significant differences (p < 0.001) were found between the systems. Bland-Altman plots indicated that 4D-MSPECT provided overall higher values of all parameters than QGS. These differences between the systems were not significant in patients with a small heart (end-diastolic volume <70 ml). Other clinical factors had no direct influence on the relationship. Additionally, the phantom data indicated good linear responses of both systems. The discrepancies between these software packages were clinically relevant, and influenced by heart size. The possibility of such discrepancies should be taken into account when a new quantitative software system is introduced, or when multiple software systems are used in the same institution. (orig.)

  18. A probabilistic consequence assessment for a very high temperature reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joeun; Kim, Jintae; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of). Dept. of Nuclear Engineering

    2017-02-15

    Currently, fossil fuels are running out globally. If current trends continue, crude oil will be depleted in 20 years and natural gas in 40 years. In addition, the use of fossil resources has increased emissions of greenhouse gases such as carbon dioxide. Therefore, there has been strong demand in recent years for producing large amounts of hydrogen as an alternative energy source [1]. Generating hydrogen energy requires temperatures above 900 °C, which are not easy to reach. Because a Very High Temperature Reactor (VHTR), one of the next-generation reactors, can provide such temperatures, it is regarded as a solution to this problem. The VHTR also offers excellent safety in comparison with existing and other next-generation reactors. In particular, a passive system, the Reactor Cavity Cooling System (RCCS), is adopted to remove radiant heat in case of accidents. To satisfy the varied requirements of newly designed reactors, however, new methodologies and definitions, different from existing methods, need to be developed. At the same time, the application of probabilistic safety assessment (PSA) has been proposed to ensure the safety of next-generation NPPs. For this, risk-informed designs of structures have to be developed and verified. In particular, the reliability of the passive system needs to be evaluated. The objective of this study is to improve the safety of the VHTR by constructing a risk profile.

  19. Mechanisms of Evolution in High-Consequence Drug Resistance Plasmids

    Directory of Open Access Journals (Sweden)

    Susu He

    2016-12-01

    Full Text Available The dissemination of resistance among bacteria has been facilitated by the fact that resistance genes are usually located on a diverse and evolving set of transmissible plasmids. However, the mechanisms generating diversity and enabling adaptation within highly successful resistance plasmids have remained obscure, despite their profound clinical significance. To understand these mechanisms, we have performed a detailed analysis of the mobilome (the entire mobile genetic element content of a set of previously sequenced carbapenemase-producing Enterobacteriaceae (CPE from the National Institutes of Health Clinical Center. This analysis revealed that plasmid reorganizations occurring in the natural context of colonization of human hosts were overwhelmingly driven by genetic rearrangements carried out by replicative transposons working in concert with the process of homologous recombination. A more complete understanding of the molecular mechanisms and evolutionary forces driving rearrangements in resistance plasmids may lead to fundamentally new strategies to address the problem of antibiotic resistance.

  20. Financial system loss as an example of high consequence, high frequency events

    Energy Technology Data Exchange (ETDEWEB)

    McGovern, D.E.

    1996-07-01

    Much work has been devoted to high consequence events with low frequency of occurrence. Characteristic of these events are bridge failure (such as that of the Tacoma Narrows), building failure (such as the collapse of a walkway at a Kansas City hotel), or compromise of a major chemical containment system (such as at Bhopal, India). Such events, although rare, have an extreme personal, societal, and financial impact. An interesting variation is demonstrated by financial losses due to fraud and abuse in the money management system. The impact can be huge, entailing very high aggregate costs, but these are a result of the contribution of many small attacks and not the result of a single (or few) massive events. Public awareness is raised through publicized events such as the junk bond fraud perpetrated by Milken or gross mismanagement in the failure of the Barings Bank through unsupervised trading activities by Leeson in Singapore. These events, although seemingly large (financial losses may be on the order of several billion dollars), are but small contributors to the estimated $114 billion loss to all types of financial fraud in 1993. This paper explores the magnitude of financial system losses and identifies new areas for analysis of high consequence events, including the potential effect of malevolent intent.

  1. A software for a quick assessment of the consequences of a radioactive fallout on the food chain: application to a tropical environment

    International Nuclear Information System (INIS)

    Laylavoix, F.; Grouzelle, C.; Baudot, M.F.

    1995-01-01

    A software package has been developed for a fast assessment of the consequences of a radioactive fallout on the food chain. From the surface activity measured 24 h after deposition, it computes the dose ingested by the population on the 60th day of exposure; this duration corresponds to 2 or 4 mechanical clearance half-lives of the radionuclides at the plant surface. Particular attention was paid to easy access to specific local parameters, user-friendliness, and the ability to run on any MS-DOS PC. (authors). 17 refs., 1 fig
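
    The record above describes computing an ingested dose from a measured surface activity, accounting for the mechanical clearance of radionuclides from plant surfaces. The sketch below illustrates one simple way such a calculation can be structured; all parameter values, the exponential clearance model, and the transfer factor are illustrative assumptions and are not taken from the original software.

```python
# Sketch: cumulative ingestion dose over 60 days from a deposited radionuclide,
# combining radioactive decay with mechanical (weathering) clearance at the plant
# surface. All parameter values and the simple transfer model are illustrative
# assumptions, not those of the original software.
import math

def ingestion_dose(surface_activity_bq_m2,   # measured 24 h after deposition [Bq/m^2]
                   rad_half_life_d,          # radioactive half-life [days]
                   clearance_half_life_d,    # mechanical clearance half-life [days]
                   transfer_m2_per_kg,       # surface activity to foodstuff [m^2/kg]
                   intake_kg_per_d,          # daily consumption of the foodstuff [kg/d]
                   dose_coeff_sv_per_bq,     # ingestion dose coefficient [Sv/Bq]
                   days=60):
    lam = math.log(2) / rad_half_life_d + math.log(2) / clearance_half_life_d
    dose = 0.0
    for day in range(1, days + 1):
        activity = surface_activity_bq_m2 * math.exp(-lam * day)   # Bq/m^2 remaining
        concentration = activity * transfer_m2_per_kg              # Bq/kg in foodstuff
        dose += concentration * intake_kg_per_d * dose_coeff_sv_per_bq
    return dose                                                    # Sv over the period

# Example with hypothetical Cs-137-like numbers.
print(ingestion_dose(1.0e4, 11_000, 14, 0.05, 0.3, 1.3e-8))
```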

  2. Distinctive Innovation Capabilities of Argentine Software Companies with High Innovation Results and Impacts

    OpenAIRE

    María Isabel Camio; María del Carmen Romero; María Belén Álvarez; Alfredo José Rébori

    2018-01-01

    The software sector is of growing importance and, due to its degree of dynamism, the identification of capabilities for innovation is vital. This study identifies capabilities variables that distinguish Argentine software companies with high innovation results and high innovation impacts from those with lesser results and impacts. It is applied to a sample of 103 companies, a measurement model and the component variables of an innovation degree index for software companies (INIs) formulated i...

  3. Main real time software for high-energy physics experiments

    International Nuclear Information System (INIS)

    Tikhonov, A.N.

    1985-01-01

    The general problems of organization of software complexes, as well as the development of typical algorithms and packages of applied programs for real time systems used in experiments with charged particle accelerators, are discussed. It is noted that numerous qualitatively different real time tasks are solved by parallel programming of the processes of data acquisition, equipment control, data exchange with remote terminals, express data processing and accumulation, interpretation of operator instructions, and generation and buffering of resulting files for data output and information processing, realized on a multicomputer system. Further development of software for experiments is associated with improving the algorithms for automatic recognition and analysis of events with complex topology and with the standardization of applied program packages

  4. High-speed single-crystal television diffractometer (software)

    International Nuclear Information System (INIS)

    Thomas, D.J.

    1982-01-01

    Area-detector diffractometers make possible almost ideal diffraction experiments. Until recently the performance of such instruments has been limited in practice by the available software. This general account discusses an unconventional way of indexing a lattice which is more appropriate for the calculations needed with normal-beam rotation geometry, and asserts the need to perform a continuous 'real-time' adaptive refinement to monitor the condition of the crystal and the detector. (orig.)

  5. An Agile Constructionist Mentoring Methodology for Software Projects in the High School

    Science.gov (United States)

    Meerbaum-Salant, Orni; Hazzan, Orit

    2010-01-01

    This article describes the construction process and evaluation of the Agile Constructionist Mentoring Methodology (ACMM), a mentoring method for guiding software development projects in the high school. The need for such a methodology has arisen due to the complexity of mentoring software project development in the high school. We introduce the…

  6. Analytical Design of Evolvable Software for High-Assurance Computing

    Science.gov (United States)

    2001-02-14

    Recoverable fragments of this record define the external system size as S_ext = Σ_{i=1}^{N} ( Σ_{j=1}^{A_i} w_ij + Σ_{k=1}^{M_i} w_ik ), ask whether the research approach yields evolvable components in less mathematically-oriented applications such as multimedia and e-commerce, and reference an appendix describing a benchmark design for microwave oven software.

  7. Methods for qualification of highly reliable software - international procedure

    International Nuclear Information System (INIS)

    Kersken, M.

    1997-01-01

    Despite the advantages of computer-assisted safety technology, there still is some uneasiness to be observed with respect to the novel processes, resulting from the absence of a body of generally accepted and uncontentious qualification guides (regulatory provisions, standards) for the safety evaluation of the computer codes applied. Guaranteeing adequate protection of the population, operators, and plant components is an essential aspect in this context, too - as it is in general with reliability and risk assessment of novel technology - so that, with appropriate legislation still missing, there currently is a licensing risk involved in the introduction of digital safety systems. Nevertheless, there is some extent of agreement within the international community and among utility operators about what standards and measures should be applied for the qualification of software of relevance to plant safety. The standard IEC 880 /IEC 86/ in particular, in its original version, or national documents based on this standard, are applied in all countries using or planning to install those systems. A novel supplement to this standard, document /IEC 96/, is in the process of finalization and defines the requirements to be met by modern methods of software engineering. (orig./DG) [de

  8. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
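
    The record above mentions the radial distribution function as a standard analysis method. The sketch below computes a basic RDF with plain NumPy to illustrate the kind of analysis involved; it does not use Freud's own API, and the periodic cubic box and random coordinates are assumptions for the example.

```python
# Sketch: a radial distribution function computed with plain NumPy, illustrating the
# kind of analysis the suite provides. This is not Freud's API; the periodic cubic
# box and random coordinates are assumptions for the example.
import numpy as np

def radial_distribution(points, box_length, r_max, n_bins=100):
    n = len(points)
    edges = np.linspace(0.0, r_max, n_bins + 1)
    counts = np.zeros(n_bins)
    for i in range(n - 1):
        d = points[i + 1:] - points[i]
        d -= box_length * np.round(d / box_length)      # minimum-image convention
        r = np.linalg.norm(d, axis=1)
        counts += np.histogram(r[r < r_max], bins=edges)[0]
    # Normalize by the ideal-gas expectation in each spherical shell.
    density = n / box_length ** 3
    shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    ideal_pairs = density * shell_vol * n / 2.0
    r_mid = 0.5 * (edges[1:] + edges[:-1])
    return r_mid, counts / ideal_pairs

# On random (ideal-gas-like) coordinates, g(r) should hover around 1.
pts = np.random.default_rng(0).uniform(0.0, 10.0, size=(500, 3))
r, g = radial_distribution(pts, box_length=10.0, r_max=4.0)
```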

  9. High-school software development project helps increasing students' awareness of geo-hydrological hazards and their risks

    Science.gov (United States)

    Marchesini, Ivan; Rossi, Mauro; Balducci, Vinicio; Salvati, Paola; Guzzetti, Fausto; Bianchini, Andrea; Grzeleswki, Emanuell; Canonico, Andrea; Coccia, Rita; Fiorucci, Gianni Mario; Gobbi, Francesca; Ciuchetti, Monica

    2015-04-01

    In Italy, inundation and landslides are widespread phenomena that impact the population and cause significant economic damage to private and public properties. The perception of the risk posed by these natural geo-hydrological hazards varies geographically and in time. The variation in the perception of the risks has negative consequences on risk management, and limits the adoption of effective risk reduction strategies. We maintain that targeted education can foster the understanding of geo-hydrological hazards, improving their perception and the awareness of the associated risk. Collaboration between a research center experienced in geo-hydrological hazards and risks (CNR IRPI, Perugia) and a high school (ITIS Alessandro Volta, Perugia) has resulted in the design and execution of a project aimed at improving the perception of geo-hydrological risks in high school students and teachers through software development. In the two-year project, students, high school teachers and research scientists have jointly developed software broadly related to landslide and flood hazards. User requirements and system specifications were decided to facilitate the distribution and use of the software among students and their peers. This allowed a wider distribution of the project results. We discuss two prototype software applications developed by the high school students, including an application of augmented reality for improved dissemination of information on landslides and floods with human consequences in Italy, and a crowd science application that allows students (and others, including their families and friends) to collect information on landslide and flood occurrence using modern mobile devices. This information can prove important, e.g., for the validation of landslide forecasting models.

  10. Supporting Early Math--Rationales and Requirements for High Quality Software

    Science.gov (United States)

    Haake, Magnus; Husain, Layla; Gulz, Agneta

    2015-01-01

    There is substantial evidence that preschoolers' performance in early math is highly correlated to math performance throughout school as well as academic skills in general. One way to help children attain early math skills is by using targeted educational software and the paper discusses potential gains of using such software to support early math…

  11. A model-based software development methodology for high-end automotive components

    NARCIS (Netherlands)

    Ravanan, Mahmoud

    2014-01-01

    This report provides a model-based software development methodology for high-end automotive components. The V-model is used as a process model throughout the development of the software platform. It offers a framework that simplifies the relation between requirements, design, implementation,

  12. Software Switching for High Throughput Data Acquisition Networks

    CERN Document Server

    AUTHOR|(CDS)2089787; Lehmann Miotto, Giovanna

    The bursty many-to-one communication pattern, typical for data acquisition systems, is particularly demanding for commodity TCP/IP and Ethernet technologies. The problem arising from this pattern is widely known in the literature as incast and can be observed as TCP throughput collapse. It is a result of overloading the switch buffers, when a specific node in a network requests data from multiple sources. This will become even more demanding for future upgrades of the experiments at the Large Hadron Collider at CERN. It is questionable whether commodity TCP/IP and Ethernet technologies in their current form will be still able to effectively adapt to bursty traffic without losing packets due to the scarcity of buffers in the networking hardware. This thesis provides an analysis of TCP/IP performance in data acquisition networks and presents a novel approach to incast congestion in these networks based on software-based packet forwarding. Our first contribution lies in confirming the strong analogies bet...

  13. The ATLAS online High Level Trigger framework experience reusing offline software components in the ATLAS trigger

    CERN Document Server

    Wiedenmann, W

    2009-01-01

    Event selection in the Atlas High Level Trigger is accomplished to a large extent by reusing software components and event selection algorithms developed and tested in an offline environment. Many of these offline software modules are not specifically designed to run in a heavily multi-threaded online data flow environment. The Atlas High Level Trigger (HLT) framework based on the Gaudi and Atlas Athena frameworks, forms the interface layer, which allows the execution of the HLT selection and monitoring code within the online run control and data flow software. While such an approach provides a unified environment for trigger event selection across all of Atlas, it also poses strict requirements on the reused software components in terms of performance, memory usage and stability. Experience of running the HLT selection software in the different environments and especially on large multi-node trigger farms has been gained in several commissioning periods using preloaded Monte Carlo events, in data taking peri...

  14. THE IMPORTANCE OF AFFECT TO BUILD CONSUMER TRUST IN HIGH-CONSEQUENCES EXCHANGES

    Directory of Open Access Journals (Sweden)

    Mellina da Silva Terres

    2012-12-01

    Full Text Available The present article investigates the importance of affect displayed by the service provider in building consumer trust in high-consequence exchanges. High-consequence exchanges are difficult situations in which the choices present a dilemma that can cause stress and severe emotional reactions (KAHN; LUCE, 2003). In this specific case, trust based on affect seems to become important, mainly because consumers may not have the ability to evaluate the cognitive aspects of the situation; moreover, a medical services failure can be highly problematic or even fatal (LEISEN; HYMAN, 2004). On the other hand, in low-consequence choices, we predict that cognition will be more important than affect in building trust. In this kind of situation, patients are more self-confident, less sensitive, and do not perceive a high probability of loss (KUNREUTHER et al., 2002), and therefore focus more on rational outcomes.

  15. RaCon: a software tool serving to predict radiological consequences of various types of accident in support of emergency management and radiation monitoring management

    International Nuclear Information System (INIS)

    Svanda, J.; Hustakova, H.; Fiser, V.

    2008-01-01

    The RaCon software system, developed by the Nuclear Research Institute Rez, is described and its applications in addressing various tasks in the domain of radiation accidents and nuclear safety (accidents at nuclear facilities, transport of radioactive material, terrorist attacks) are outlined. RaCon is intended for the prediction and evaluation of radiological consequences to the population and rescue teams and for the optimization of monitoring actions. The system provides support to emergency management when evaluating and devising actions to mitigate the consequences of radiation accidents. The deployment of RaCon within the system of radiation monitoring by mobile emergency teams or remote-controlled UAVs is an important application. Based on a prediction of the radiological situation, RaCon facilitates decision-making and control of the radiation monitoring system, and in turn, refines the prediction based on observed values. Furthermore, the system can perform simulations of evacuation patterns at the Dukovany NPP and at schools in the vicinity of the power plant and can provide support to emergency management should any such situation arise. (orig.)

  16. Awareness of Consequence of High School Students on Loss of Bio-Diversity

    Science.gov (United States)

    Kasot, Nazim; Özbas, Serap

    2015-01-01

    The aim of this study is to assess high school students' egoistic, altruistic, and biospheric awareness of the consequences of the loss of biodiversity, and then to compare the results on the basis of some independent variables (gender, class, and family income). The research data were collected from 884 ninth and tenth grade high school…

  17. Software Applications on the Peregrine System | High-Performance Computing

    Science.gov (United States)

    A catalog of software applications available on the Peregrine high-performance computing system, including GAMS (a high-level modeling system for mathematical programming; statistics and analysis), the Gurobi Optimizer (a solver for mathematical programming; statistics and analysis), LAMMPS (chemistry), and the R statistical computing environment (statistics and analysis).

  18. The ATLAS online High Level Trigger framework: Experience reusing offline software components in the ATLAS trigger

    International Nuclear Information System (INIS)

    Wiedenmann, Werner

    2010-01-01

    Event selection in the ATLAS High Level Trigger is accomplished to a large extent by reusing software components and event selection algorithms developed and tested in an offline environment. Many of these offline software modules are not specifically designed to run in a heavily multi-threaded online data flow environment. The ATLAS High Level Trigger (HLT) framework based on the GAUDI and ATLAS ATHENA frameworks, forms the interface layer, which allows the execution of the HLT selection and monitoring code within the online run control and data flow software. While such an approach provides a unified environment for trigger event selection across all of ATLAS, it also poses strict requirements on the reused software components in terms of performance, memory usage and stability. Experience of running the HLT selection software in the different environments and especially on large multi-node trigger farms has been gained in several commissioning periods using preloaded Monte Carlo events, in data taking periods with cosmic events and in a short period with proton beams from LHC. The contribution discusses the architectural aspects of the HLT framework, its performance and its software environment within the ATLAS computing, trigger and data flow projects. Emphasis is also put on the architectural implications for the software by the use of multi-core processors in the computing farms and the experiences gained with multi-threading and multi-process technologies.

  19. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    Science.gov (United States)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is
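
    The record above describes generating PLC code from a tabular, domain-specific specification. The sketch below shows a toy translator from a hypothetical tabular sequence specification to IEC 61131-3 Structured Text; the column names, spec rows, and emitted code style are invented for illustration, and the actual LCS tool targets ladder logic with its own tabular format.

```python
# Sketch: a toy generator from a (hypothetical) tabular sequence specification to
# IEC 61131-3 Structured Text. The column names, spec rows, and emitted style are
# invented for illustration; the actual LCS tool targets ladder logic and uses its
# own tabular format.
steps = [
    {"step": 10, "when": "GN2_Press > 150.0", "do": "CLOSE_VALVE(PV101)"},
    {"step": 20, "when": "TankTemp < 80.0",   "do": "START_PUMP(P201)"},
    {"step": 30, "when": "TRUE",              "do": "SET_FLAG(SeqDone)"},
]

def generate_structured_text(rows):
    lines = ["CASE CurrentStep OF"]
    for row in rows:
        lines.append(f"  {row['step']}:")
        lines.append(f"    IF {row['when']} THEN")
        lines.append(f"      {row['do']};")
        lines.append("      CurrentStep := CurrentStep + 10;")
        lines.append("    END_IF;")
    lines.append("END_CASE;")
    return "\n".join(lines)

print(generate_structured_text(steps))
```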

  20. The risk of high-risk jobs: psychological health consequences in forensic physicians and ambulance workers

    NARCIS (Netherlands)

    Ploeg, E. van der

    2003-01-01

    The risk of high-risk jobs: Psychological health consequences in forensic doctors and ambulance workers This thesis has shown that forensic physicians and ambulance personnel frequently suffer from psychological complaints as a result of dramatic events and sources of chronic work stress. A

  1. 49 CFR 192.905 - How does an operator identify a high consequence area?

    Science.gov (United States)

    2010-10-01

    ...) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.905 How does an operator identify a high consequence area? (a...

  2. Women's Ways of Drinking: College Women, High-Risk Alcohol Use, and Negative Consequences

    Science.gov (United States)

    Smith, Margaret A.; Berger, Joseph B.

    2010-01-01

    The purpose of this study was to explore college women's high-risk alcohol use and related consequences. This study employed a qualitative approach to understand and provide visibility for a gender-related perspective on college women's alcohol experiences and related outcomes. Data were collected from interviews with 10 undergraduate females at a…

  3. Distinctive Innovation Capabilities of Argentine Software Companies with High Innovation Results and Impacts

    Directory of Open Access Journals (Sweden)

    María Isabel Camio

    2018-04-01

    Full Text Available The software sector is of growing importance and, due to its degree of dynamism, the identification of capabilities for innovation is vital. This study identifies capabilities variables that distinguish Argentine software companies with high innovation results and high innovation impacts from those with lesser results and impacts. A measurement model and the component variables of an innovation degree index for software companies (INIs), formulated in previous studies, are applied to a sample of 103 companies. A Principal Component Analysis and a biplot are conducted. In the analysis of results and impacts, the first two components explain 100% of the variability, which shows the high correlation between variables. From the biplots, it appears that companies with high results have higher degrees in the variables of motivation, strategy, leadership and internal determinants; and those with high impacts present higher degrees of structure, strategy, leadership, free software and innovation activities. The findings add elements to the theory of capabilities for innovation in the software sector and allow us to consider the relative importance of different capabilities variables in the generation of innovation results and impacts.

  4. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized super-computers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90%, or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk in this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES [1]. Herrera Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An axiomatic approach", John Wiley, 243p., 2012. [2]. Herrera, I., de la Cruz L.M. and Rosas-Medina A. "Non-Overlapping Discretization Methods for Partial Differential Equations". NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num 21852. (Open source) [3]. Herrera, I., & Contreras Iván "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity". Geofísica Internacional, 2015 (In press)

  5. Does Alcohol Use Mediate the Association between Consequences Experienced in High School and Consequences Experienced during the First Semester of College?

    Science.gov (United States)

    Romosz, Ann Marie; Quigley, Brian M.

    2013-01-01

    Approximately 80% of college students drink alcohol; almost half of these students reporting that they drink to get drunk and over 22% engage in heavy episodic drinking. Heavy alcohol consumption during the transition from high school to college is associated with negative personal and academic consequences. Sixty-seven freshmen volunteered to…

  6. The Use of Software in Academic Stream High School Mathematics Teaching

    Science.gov (United States)

    Clay, Simon; Fotou, Nikolaos; Monaghan, John

    2017-01-01

    This paper reports on classroom observations of senior high school mathematics lessons with a focus on the use of digital technology. The observations were of teachers enrolled in an in-service course, Teaching Advanced Mathematics. The paper reports selected results and comments on: software that was observed to have been used; the use (or not)…

  7. Software systems for processing and analysis at the NOVA high-energy laser facility

    International Nuclear Information System (INIS)

    Auerbach, J.M.; Montgomery, D.S.; McCauley, E.W.; Stone, G.F.

    1986-01-01

    A typical laser interaction experiment at the NOVA high-energy laser facility produces in excess of 20 Mbytes of digitized data. Extensive processing and analysis of this raw data from a wide variety of instruments is necessary to produce results that can be readily used to interpret the experiment. Using VAX-based computer hardware, software systems have been set up to convert the digitized instrument output to physics quantities describing the experiment. A relational data-base management system is used to coordinate all levels of processing and analysis. Software development emphasizes structured design, flexibility, automation, and ease of use

  8. RSYST: From nuclear reactor calculations towards a highly sophisticated scientific software integration environment

    International Nuclear Information System (INIS)

    Noack, M.; Seybold, J.; Ruehle, R.

    1996-01-01

    The software environment RSYST was originally used to solve problems of reactor physics. The consideration of advanced scientific simulation requirements and the strict application of modern software design principles led to a system which is perfectly suitable for solving problems in various complex scientific problem domains. Starting with a review of the early days of RSYST, we describe the straight evolution driven by the need for a software environment which combines the advantages of a high-performance database system with the capability to integrate sophisticated scientific technical applications. The RSYST architecture is presented and the data modelling capabilities are described. To demonstrate the powerful possibilities and flexibility of the RSYST environment, we describe a wide range of RSYST applications, e.g., mechanical simulations of multibody systems, which are used in biomechanical research, civil engineering and robotics. In addition, a hypermedia system which is used for scientific technical training and documentation is presented. (orig.) [de

  9. An improved COCOMO software cost estimation model | Duke ...

    African Journals Online (AJOL)

    In this paper, we discuss the methodologies adopted previously in software cost estimation using the COnstructive COst MOdels (COCOMOs). From our analysis, COCOMOs produce very high software development effort estimates, which eventually lead to high software development costs. Consequently, we propose its extension, ...
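
    For reference, the calculation the COCOMO family is built on is the effort equation Effort = a × KLOC^b (person-months), optionally scaled by an effort adjustment factor (EAF). The sketch below implements it with the standard Basic COCOMO coefficient table; the example size, mode, and EAF value are illustrative assumptions.

```python
# Sketch: the COCOMO effort equation, Effort = a * (KLOC ** b), using the standard
# Basic COCOMO coefficient table. The optional EAF multiplier follows the pattern of
# Intermediate COCOMO (which, strictly, uses its own coefficient table); the example
# numbers are illustrative assumptions.
BASIC_COEFFICIENTS = {          # mode: (a, b); effort in person-months, size in KLOC
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def cocomo_effort(kloc, mode="organic", eaf=1.0):
    a, b = BASIC_COEFFICIENTS[mode]
    return a * (kloc ** b) * eaf

# Example: a 32 KLOC semi-detached project with a mildly unfavorable cost-driver product.
print(f"{cocomo_effort(32, mode='semi-detached', eaf=1.1):.1f} person-months")
```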

  10. Re-assessment of road accident data-analysis policy : applying theory from involuntary, high-consequence, low-probability events like nuclear power plant meltdowns to voluntary, low-consequence, high-probability events like traffic accidents

    Science.gov (United States)

    2002-02-01

    This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence high-probability (VLH) events like tra...

  11. ROSE: A realtime object oriented software environment for high fidelity replica simulation

    International Nuclear Information System (INIS)

    Abramovitch, A.

    1994-01-01

    An object oriented software environment used for the production testing and documentation of real time models for high fidelity training simulators encompasses a wide variety of software constructs including code generators for various classes of physical systems, model executive control programs, a high resolution graphics editor, as well as databases and associated access routines used to store and control information transfer among the various software entities. CAE Electronics' newly developed ROSE allows for the generation and integrated test of thermalhydraulic, analog control, digital control and electrical system models. Based on an iconical/standard subroutine representation of standard plant components along with an admittance matrix solution governed by the topology of the system under consideration, the ROSE blends together network solution algorithms and standard component models, both previously time tested via manual implementation into a single integrated automated software environment. The methodology employed to construct the ROSE, along with a synopsis of the various CASE tools integrated together to form a complete graphics based system for high fidelity real time code generation and validation is described in the presentation. (1 fig.)
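
    The ROSE record above mentions an admittance matrix solution governed by the topology of the system. The sketch below shows the core of that idea for a small linear network: assemble a nodal admittance matrix from branch conductances and solve for the node potentials. The node layout, conductance values, and injections are invented for illustration.

```python
# Sketch: assembling a nodal admittance (conductance) matrix from branch data and
# solving for node potentials, the core of a topology-driven network solver. The
# node layout, conductances, and injections are invented for illustration.
import numpy as np

# Branches as (node_a, node_b, conductance); node 0 is the reference node.
branches = [(0, 1, 2.0), (1, 2, 1.0), (2, 0, 0.5), (1, 3, 1.5), (3, 0, 1.0)]
injections = {1: 1.0, 3: 0.5}     # external current/flow injected at nodes
n_nodes = 4

Y = np.zeros((n_nodes, n_nodes))
for a, b, g in branches:          # standard admittance-matrix stamping
    Y[a, a] += g
    Y[b, b] += g
    Y[a, b] -= g
    Y[b, a] -= g

I = np.zeros(n_nodes)
for node, value in injections.items():
    I[node] = value

# Ground the reference node (drop its row/column), then solve Y * v = I.
v = np.linalg.solve(Y[1:, 1:], I[1:])
potentials = np.concatenate(([0.0], v))
print(potentials)
```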

  12. Open high-level data formats and software for gamma-ray astronomy

    Science.gov (United States)

    Deil, Christoph; Boisson, Catherine; Kosack, Karl; Perkins, Jeremy; King, Johannes; Eger, Peter; Mayer, Michael; Wood, Matthew; Zabalza, Victor; Knödlseder, Jürgen; Hassan, Tarek; Mohrmann, Lars; Ziegler, Alexander; Khelifi, Bruno; Dorner, Daniela; Maier, Gernot; Pedaletti, Giovanna; Rosado, Jaime; Contreras, José Luis; Lefaucheur, Julien; Brügge, Kai; Servillat, Mathieu; Terrier, Régis; Walter, Roland; Lombardi, Saverio

    2017-01-01

    In gamma-ray astronomy, a variety of data formats and proprietary software have been traditionally used, often developed for one specific mission or experiment. Especially for ground-based imaging atmospheric Cherenkov telescopes (IACTs), data and software are mostly private to the collaborations operating the telescopes. However, there is a general movement in science towards the use of open data and software. In addition, the next-generation IACT instrument, the Cherenkov Telescope Array (CTA), will be operated as an open observatory. We have created a Github organisation at https://github.com/open-gamma-ray-astro where we are developing high-level data format specifications. A public mailing list was set up at https://lists.nasa.gov/mailman/listinfo/open-gamma-ray-astro and a first face-to-face meeting on the IACT high-level data model and formats took place in April 2016 in Meudon (France). This open multi-mission effort will help to accelerate the development of open data formats and open-source software for gamma-ray astronomy, leading to synergies in the development of analysis codes and eventually better scientific results (reproducible, multi-mission). This write-up presents this effort for the first time, explaining the motivation and context, the available resources and process we use, as well as the status and planned next steps for the data format specifications. We hope that it will stimulate feedback and future contributions from the gamma-ray astronomy community.

  13. Flexible event reconstruction software chains with the ALICE High-Level Trigger

    International Nuclear Information System (INIS)

    Ram, D; Breitner, T; Szostak, A

    2012-01-01

    The ALICE High-Level Trigger (HLT) has a large high-performance computing cluster at CERN whose main objective is to perform real-time analysis on the data generated by the ALICE experiment and scale it down to at most 4 GB/s - which is the current maximum mass-storage bandwidth available. Data flow in this cluster is controlled by a custom designed software framework. It consists of a set of components which can communicate with each other via a common control interface. The software framework also supports the creation of different configurations based on the detectors participating in the HLT. These configurations define a logical data processing “chain” of detector data-analysis components. Data flows through this software chain in a pipelined fashion so that several events can be processed at the same time. An instance of such a chain can run and manage a few thousand physics analysis and data-flow components. The HLT software and the configuration scheme used in the 2011 heavy-ion runs of ALICE are discussed in this contribution.

  14. Software development based on high speed PC oscilloscope for automated pulsed magnetic field measurement system

    International Nuclear Information System (INIS)

    Sun Yuxiang; Shang Lei; Li Ji; Ge Lei

    2011-01-01

    This paper introduces the development of software, based on a high-speed PC oscilloscope, for an automated pulsed magnetic field measurement system. The design improves on the previous one; a high-speed virtual oscilloscope has been used in this field for the first time. The design realizes automatic data acquisition, data processing, data analysis, and storage. Automated point checking reduces the workload, and the use of a precise motion bench increases the positioning accuracy. The software obtains data from the PC oscilloscope by calling DLLs and provides oscilloscope functions such as trigger, range, and sample rate settings. Spline interpolation and a band-stop filter are used to denoise the signals. The core of the software is a state machine which controls the motion of the stepper motors and acquires and stores the data automatically. NI Vision Acquisition Software and the Database Connectivity Toolkit make video surveillance of the laboratory and MySQL database connectivity available. The raw and processed signals are compared in this paper; the waveform is greatly improved by the signal processing. (authors)
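
    The record above names spline interpolation and a band-stop filter as the denoising steps. The sketch below applies a Butterworth band-stop filter and a smoothing spline to a synthetic pulsed signal; the sample rate, stop band, and SciPy-based implementation are assumptions for the example, not details taken from the original software.

```python
# Sketch: the two denoising steps named in the record, a Butterworth band-stop filter
# and a smoothing spline, applied to a synthetic pulsed signal. Sample rate, stop
# band, and the SciPy implementation are assumptions for the example.
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.signal import butter, sosfiltfilt

fs = 100_000.0                                   # samples per second (assumed)
t = np.arange(0.0, 0.1, 1.0 / fs)
pulse = np.exp(-((t - 0.05) / 0.01) ** 2)        # idealized field pulse
rng = np.random.default_rng(1)
noisy = pulse + 0.05 * np.sin(2 * np.pi * 50 * t) + 0.02 * rng.standard_normal(len(t))

# Band-stop Butterworth filter (second-order sections) to suppress 45-55 Hz pickup.
sos = butter(4, [45.0, 55.0], btype="bandstop", fs=fs, output="sos")
filtered = sosfiltfilt(sos, noisy)

# Smoothing spline through the filtered samples removes residual broadband noise.
spline = UnivariateSpline(t, filtered, s=len(t) * 0.02 ** 2)
denoised = spline(t)
```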

  15. Systems engineering applied to integrated safety management for high consequence facilities

    International Nuclear Information System (INIS)

    Barter, R; Morais, B.

    1998-01-01

    Integrated Safety Management is a concept that is being actively promoted by the U.S. Department of Energy as a means of assuring safe operation of its facilities. The concept involves the integration of safety precepts into work planning rather than adjusting for safe operations after defining the work activity. The system engineering techniques used to design an integrated safety management system for a high consequence research facility are described. An example is given to show how the concepts evolved with the system design

  16. Integration testing through reusing representative unit test cases for high-confidence medical software.

    Science.gov (United States)

    Shin, Youngsul; Choi, Yunja; Lee, Woo Jin

    2013-06-01

    As medical software becomes larger, more complex, and more connected with other devices, finding faults in integrated software modules becomes more difficult and time-consuming. Existing integration testing typically takes a black-box approach, which treats the target software as a black box and selects test cases without considering the internal behavior of each software module. Though it could be cost-effective, this black-box approach cannot thoroughly test interaction behavior among integrated modules and might leave critical faults undetected, which should not happen in safety-critical systems such as medical software. This work anticipates that information on internal behavior is necessary even for integration testing to define thorough test cases for critical software, and proposes a new integration testing method that reuses test cases used for unit testing. The goal is to provide a cost-effective method to detect subtle interaction faults at the integration testing phase by reusing the knowledge obtained from the unit testing phase. The suggested approach notes that the test cases for unit testing include knowledge of the internal behavior of each unit, and extracts test cases for integration testing from the test cases for unit testing for given test criteria. The extracted representative test cases are connected with functions under test using the state domain, and a single test sequence to cover the test cases is produced. By means of reusing unit test cases, the tester has effective test cases to examine diverse execution paths and find interaction faults without analyzing complex modules. The produced test sequence can have test coverage as high as the unit testing coverage and its length is close to the length of optimal test sequences. Copyright © 2013 Elsevier Ltd. All rights reserved.
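
    The record above describes chaining representative unit test cases into a single integration test sequence via the state domain. The sketch below is a heavily simplified illustration of that idea: unit cases are reused by matching each module's precondition state to the state produced so far. The module names, state representation, and greedy matching are invented for illustration and are not the paper's algorithm.

```python
# Sketch: chaining representative unit test cases into one integration test sequence
# by matching the output state of one module's case to the input state of the next.
# Module names, states, and the case format are invented for illustration.
unit_cases = {
    "sensor_driver": [({"mode": "idle"}, {"mode": "sampling"})],
    "alarm_logic":   [({"mode": "sampling"}, {"mode": "alarm"})],
    "infusion_ctrl": [({"mode": "alarm"}, {"mode": "paused"})],
}

def build_sequence(modules, cases, start_state):
    """Greedily pick, for each module, a reused unit case whose precondition
    matches the current state, yielding a single integration test sequence."""
    state, sequence = dict(start_state), []
    for module in modules:
        for pre, post in cases[module]:
            if all(state.get(k) == v for k, v in pre.items()):
                sequence.append((module, pre, post))
                state.update(post)
                break
        else:
            raise ValueError(f"no reusable unit case covers {module} from {state}")
    return sequence

for step in build_sequence(["sensor_driver", "alarm_logic", "infusion_ctrl"],
                           unit_cases, {"mode": "idle"}):
    print(step)
```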

  17. Statistical surrogate models for prediction of high-consequence climate change.

    Energy Technology Data Exchange (ETDEWEB)

    Constantine, Paul; Field, Richard V., Jr.; Boslough, Mark Bruce Elrick

    2011-09-01

    In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on central tendencies. We frame the climate change problem and its associated risks in a similar manner. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. A SSM is different than a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field. The SSM can be calibrated to available spatial and temporal data from existing climate databases, e.g., the Program for Climate Model Diagnosis and Intercomparison (PCMDI), or to a collection of outputs from a General Circulation Model (GCM), e.g., the Community Earth System Model (CESM) and its predecessors. Because of its reduced size and complexity, the realization of a large number of independent model outputs from a SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework is developed to provide quantitative measures of confidence, via Bayesian credible intervals, in the use of the proposed approach to assess these risks.
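
    The record above proposes sampling a cheap statistical surrogate to quantify low-probability, high-consequence tails. The sketch below illustrates the general workflow with a toy AR(1) surrogate calibrated to a few synthetic "expensive" runs, then heavily sampled to put a Bayesian credible interval on an exceedance probability; the calibration data, threshold, AR(1) form, and Beta-posterior interval are illustrative assumptions, not the report's SSM.

```python
# Sketch: calibrating a toy AR(1) surrogate to a few "expensive" model runs, sampling
# it heavily, and putting a Bayesian credible interval on a tail exceedance
# probability. Calibration data, threshold, and the AR(1)/Beta-posterior choices are
# illustrative assumptions, not the report's statistical surrogate model.
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(42)

# Pretend these few series came from an expensive climate model (anomalies, a.u.).
training = rng.standard_normal((5, 100)).cumsum(axis=1) * 0.05

# Calibrate x_t = phi * x_{t-1} + sigma * eps_t by least squares.
x, y = training[:, :-1].ravel(), training[:, 1:].ravel()
phi = np.dot(x, y) / np.dot(x, x)
sigma = np.std(y - phi * x)

def surrogate_realization(n_steps=100):
    out = np.zeros(n_steps)
    for step in range(1, n_steps):
        out[step] = phi * out[step - 1] + sigma * rng.standard_normal()
    return out

# Sample the cheap surrogate many times to probe the tail.
n_samples, threshold = 20_000, 1.5
exceed = sum(surrogate_realization().max() > threshold for _ in range(n_samples))

# 95% credible interval on the exceedance probability with a flat Beta(1, 1) prior.
lo, hi = beta.ppf([0.025, 0.975], 1 + exceed, 1 + n_samples - exceed)
print(f"P(max anomaly > {threshold}) ~ {exceed / n_samples:.4f}, 95% CI [{lo:.4f}, {hi:.4f}]")
```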

  18. Global situational awareness and early warning of high-consequence climate change.

    Energy Technology Data Exchange (ETDEWEB)

    Backus, George A.; Carr, Martin J.; Boslough, Mark Bruce Elrick

    2009-08-01

    Global monitoring systems that have high spatial and temporal resolution, with long observational baselines, are needed to provide situational awareness of the Earth's climate system. Continuous monitoring is required for early warning of high-consequence climate change and to help anticipate and minimize the threat. Global climate has changed abruptly in the past and will almost certainly do so again, even in the absence of anthropogenic interference. It is possible that the Earth's climate could change dramatically and suddenly within a few years. An unexpected loss of climate stability would be equivalent to the failure of an engineered system on a grand scale, and would affect billions of people by causing agricultural, economic, and environmental collapses that would cascade throughout the world. The probability of such an abrupt change happening in the near future may be small, but it is nonzero. Because the consequences would be catastrophic, we argue that the problem should be treated with science-informed engineering conservatism, which focuses on various ways a system can fail and emphasizes inspection and early detection. Such an approach will require high-fidelity continuous global monitoring, informed by scientific modeling.

  19. Distributed control software of high-performance control-loop algorithm

    CERN Document Server

    Blanc, D

    1999-01-01

    The majority of industrial cooling and ventilation plants require the control of complex processes. All these processes are highly important for the operation of the machines. The stability and reliability of these processes are leading factors in the quality of the service provided. The control system architecture and software structure are likewise required to have high dynamic performance and robust behaviour. Intelligent systems based on PID or RST controllers are used for their high level of stability and accuracy. The design and tuning of these complex controllers require the dynamic model of the plant to be known (generally obtained by identification) and the desired performance of the various control loops to be specified in order to achieve good performance. The concept of a distributed control algorithm software provides full automation facilities with well-adapted functionality and good performance, giving methodology, means and tools to master the dynamic process optimization an...
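
    The record above refers to PID control loops. The sketch below shows a minimal discrete PID loop closed around a first-order plant; the gains, sample time, and plant model are illustrative assumptions used only to make the loop runnable, not values from the described systems.

```python
# Sketch: a discrete PID loop closed around a simple first-order plant (for example a
# cooling-water temperature). Gains, sample time, and the plant model are illustrative
# assumptions used only to make the loop runnable.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

dt = 1.0                                   # controller sample time [s]
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=dt)
temperature, ambient, setpoint = 30.0, 30.0, 20.0
for _ in range(60):
    control = pid.update(setpoint, temperature)           # negative output = cooling
    temperature += dt * (-0.1 * (temperature - ambient) + 0.05 * control)
print(round(temperature, 2))               # settles close to the 20.0 setpoint
```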

  20. Assessment of Computer Simulation Software and Process Data for High Pressure Die Casting of Magnesium

    Energy Technology Data Exchange (ETDEWEB)

    Sabau, Adrian S [ORNL; Hatfield, Edward C [ORNL; Dinwiddie, Ralph Barton [ORNL; Kuwana, Kazunori [University of Kentucky; Viti, Valerio [University of Kentucky, Lexington; Hassan, Mohamed I [University of Kentucky, Lexington; Saito, Kozo [University of Kentucky

    2007-09-01

    Computer software for the numerical simulation of solidification and mold filling is an effective design tool for cast structural automotive magnesium components. A review of commercial software capabilities and their validation procedures was conducted. Aside from the software assessment, the program addressed five main areas: lubricant degradation, lubricant application, gate atomization, and heat transfer at metal mold interfaces. A test stand for lubricant application was designed. A sensor was used for the direct measurement of heat fluxes during lubricant application and casting solidification in graphite molds. Spray experiments were conducted using pure deionized water and commercial die lubricants. The results show that the sensor can be used with confidence for measuring heat fluxes under conditions specific to the die lube application. The data on heat flux was presented in forms suitable for use in HPDC simulation software. Severe jet breakup and atomization phenomena are likely to occur due to high gate velocities in HPDC. As a result of gate atomization, droplet flow affects the mold filling pattern, air entrapment, skin formation, and ensuing defects. Warm water analogue dies were designed for obtaining experimental data on mold filling phenomena. Data on break-up jet length, break-up pattern, velocities, and droplet size distribution were obtained experimentally and were used to develop correlations for jet break-up phenomena specific to die casting gate configurations.

  1. Hadronic energy resolution of a highly granular scintillator-steel hadron calorimeter using software compensation techniques

    Czech Academy of Sciences Publication Activity Database

    Adloff, C.; Blaha, J.; Blaising, J.J.; Cvach, Jaroslav; Gallus, Petr; Havránek, Miroslav; Janata, Milan; Kvasnička, Jiří; Lednický, Denis; Marčišovský, Michal; Polák, Ivo; Popule, Jiří; Tomášek, Lukáš; Tomášek, Michal; Růžička, Pavel; Šícho, Petr; Smolík, Jan; Vrba, Václav; Zálešák, Jaroslav

    2012-01-01

    Vol. 7, SEP (2012), 1-23, ISSN 1748-0221. R&D Projects: GA MŠk LA09042; GA MŠk LC527; GA ČR GA202/05/0653. Institutional research plan: CEZ:AV0Z10100502. Keywords: hadronic calorimetry * imaging calorimetry * software compensation. Subject RIV: BF - Elementary Particles and High Energy Physics. Impact factor: 1.869, year: 2011

  2. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    International Nuclear Information System (INIS)

    Krupnick, A.J.; Markandya, A.; Nickell, E.

    1994-01-01

    This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability - high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and they do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report.
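    The "expert expected damage" arithmetic the abstract contrasts with the ex ante, lay-risk perspective can be made concrete with a toy calculation. All probabilities, damage figures and the risk-aversion multiplier below are invented for illustration and are not values from the ORNL/RFF fuel cycle reports.

```python
# Toy version of the EED arithmetic:
# expected external cost = sum over accident scenarios of (probability x monetized consequence).
scenarios = [
    # (name, annual probability, monetized damages in dollars) -- invented placeholders
    ("oil barge accident",      1e-3, 5.0e7),
    ("severe reactor accident", 1e-6, 1.0e11),
]

eed = sum(p * cost for _, p, cost in scenarios)
print(f"expert expected damages: ${eed:,.0f} per year")

# The alternative, ex ante perspective weights rare severe outcomes more heavily than their
# expected value; one crude way to express that is a risk premium on low-probability terms.
RISK_AVERSION = 2.0   # assumed multiplier for low-probability, high-consequence scenarios
ex_ante = sum(p * cost * (RISK_AVERSION if p < 1e-4 else 1.0) for _, p, cost in scenarios)
print(f"risk-adjusted ex ante damages: ${ex_ante:,.0f} per year")
```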

  3. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    Energy Technology Data Exchange (ETDEWEB)

    Krupnick, A J; Markandya, A; Nickell, E

    1994-07-01

    This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability - high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and they do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report.

  4. EVALUATION OF PATCHY ATROPHY SECONDARY TO HIGH MYOPIA BY SEMIAUTOMATED SOFTWARE FOR FUNDUS AUTOFLUORESCENCE ANALYSIS.

    Science.gov (United States)

    Miere, Alexandra; Capuano, Vittorio; Serra, Rita; Jung, Camille; Souied, Eric; Querques, Giuseppe

    2017-05-31

    To evaluate the progression of patchy atrophy in high myopia using semiautomated software for fundus autofluorescence (FAF) analysis. The medical records and multimodal imaging of 21 consecutive highly myopic patients with macular chorioretinal patchy atrophy (PA) were retrospectively analyzed. All patients underwent repeated fundus autofluorescence and spectral domain optical coherence tomography over at least 12 months. Color fundus photography was also performed in a subset of patients. Total atrophy area was measured on FAF images using Region Finder semiautomated software embedded in Spectralis (Heidelberg Engineering, Heidelberg, Germany) at baseline and during follow-up visits. Region Finder was compared with manually measured PA on FAF images. Twenty-two eyes of 21 patients (14 women, 7 men; mean age 62.8 ± 13.0 years, range 32-84 years) were included. Mean PA area using Region Finder was 2.77 ± 2.91 mm (SD) at baseline, 3.12 ± 2.68 mm at Month 6, 3.43 ± 2.68 mm at Month 12, and 3.73 ± 2.74 mm at Month 18. Fundus autofluorescence analysis by Region Finder semiautomated software provides accurate measurements of lesion area and allows us to quantify the progression of PA in high myopia. In our series, PA enlarged significantly over at least 12 months, and its progression seemed to be related to the lesion size at baseline.

  5. Consequences of long-term power outages and high electricity prices lasting for months

    International Nuclear Information System (INIS)

    2005-01-01

    Several areas of the world have experienced electricity outages lasting for longer periods, but the consequences of these are sparsely documented, and further analysis of the socioeconomic consequences of outages is needed. In addition to KILE (quality-adjusted revenue framework for unsupplied energy) costs, one has to take into account that costs often increase proportionally with the duration of the outage, and that KILE tariffs do not reflect the lost consumer surplus for products that are not produced during an outage. A good example is public underground transport, where the company's economic loss can be significantly smaller than the loss of utility value for the travellers. If the authorities act reasonably, it is difficult to see that periods of very high prices represent a major problem. The most important problems are related to diffuse effects, especially for households with a weak economy. These problems can be solved with improved contractual forms (price guarantees) or by transfers to the households, without weakening the incentives for electricity economising. (ml)

  6. A Phenomenological Inquiry into the Perceptions of Software Professionals on the Asperger's Syndrome/High Functioning Autism Spectrum and the Success of Software Development Projects

    Science.gov (United States)

    Kendall, Leslie R.

    2013-01-01

    Individuals who have Asperger's Syndrome/High-Functioning Autism, as a group, are chronically underemployed and underutilized. Many in this group have abilities that are well suited for various roles within the practice of software development. Multiple studies have shown that certain organizational and management changes in the software…

  7. The positive bystander effect: passive bystanders increase helping in situations with high expected negative consequences for the helper.

    Science.gov (United States)

    Fischer, Peter; Greitemeyer, Tobias

    2013-01-01

    The present field study investigated the interplay between the presence of a passive bystander (not present versus present) in a simulated bike theft and expected negative consequences (low versus high) in predicting intervention behavior when no physical victim is present. It was found that an additional bystander increases individual intervention in situations where the expected negative consequences for the helper in case of intervention were high (i.e., when the bike thief looks fierce) compared to situations where the expected negative consequences for the helper were low (i.e., when the bike thief does not look fierce). In contrast, no such effect for high vs. low expected negative consequences was observed when no additional bystander observed the critical situation. The results are discussed in light of previous laboratory findings on expected negative consequences and bystander intervention.

  8. Recent development in software and automation tools for high-throughput discovery bioanalysis.

    Science.gov (United States)

    Shou, Wilson Z; Zhang, Jun

    2012-05-01

    Bioanalysis with LC-MS/MS has been established as the method of choice for quantitative determination of drug candidates in biological matrices in drug discovery and development. The LC-MS/MS bioanalytical support for drug discovery, especially for early discovery, often requires high-throughput (HT) analysis of large numbers of samples (hundreds to thousands per day) generated from many structurally diverse compounds (tens to hundreds per day) with a very quick turnaround time, in order to provide important activity and liability data to move discovery projects forward. Another important consideration for discovery bioanalysis is its fit-for-purpose quality requirement depending on the particular experiments being conducted at this stage, and it is usually not as stringent as those required in bioanalysis supporting drug development. These aforementioned attributes of HT discovery bioanalysis made it an ideal candidate for using software and automation tools to eliminate manual steps, remove bottlenecks, improve efficiency and reduce turnaround time while maintaining adequate quality. In this article we will review various recent developments that facilitate automation of individual bioanalytical procedures, such as sample preparation, MS/MS method development, sample analysis and data review, as well as fully integrated software tools that manage the entire bioanalytical workflow in HT discovery bioanalysis. In addition, software tools supporting the emerging high-resolution accurate MS bioanalytical approach are also discussed.

  9. A role for relational databases in high energy physics software systems

    International Nuclear Information System (INIS)

    Lauer, R.; Slaughter, A.J.; Wolin, E.

    1987-01-01

    This paper presents the design and initial implementation of software which uses a relational database management system for storage and retrieval of real and Monte Carlo generated events from a charm and beauty spectrometer with a vertex detector. The purpose of the software is to graphically display and interactively manipulate the events, fit tracks and vertices and calculate physics quantities. The INGRES database forms the core of the system, while the DI3000 graphics package is used to plot the events. The paper introduces relational database concepts and their applicability to high energy physics data. It also evaluates the environment provided by INGRES, particularly its usefulness in code development and its Fortran interface. Specifics of the database design we have chosen are detailed as well. (orig.)
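    The event/track storage pattern described here maps naturally onto a small relational schema. The sketch below uses SQLite in Python rather than the INGRES/Fortran environment of the paper, and the schema and query are illustrative assumptions only.

```python
# Minimal relational storage of spectrometer events and fitted tracks (illustrative schema).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE event (event_id INTEGER PRIMARY KEY, run INTEGER, is_monte_carlo INTEGER);
    CREATE TABLE track (
        track_id INTEGER PRIMARY KEY,
        event_id INTEGER REFERENCES event(event_id),
        momentum_gev REAL, charge INTEGER
    );
""")
con.execute("INSERT INTO event VALUES (1, 4711, 0)")
con.executemany("INSERT INTO track VALUES (?, ?, ?, ?)",
                [(1, 1, 12.3, +1), (2, 1, 8.7, -1)])

# Physics-style query: total track momentum per real-data event.
for row in con.execute("""
        SELECT e.event_id, SUM(t.momentum_gev)
        FROM event e JOIN track t USING (event_id)
        WHERE e.is_monte_carlo = 0 GROUP BY e.event_id"""):
    print(row)
```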

  10. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.
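    The contingency analysis part of such a tool is embarrassingly parallel: each outage case can be solved independently on its own core. The sketch below shows that structure with Python's multiprocessing; the power-flow solve is a placeholder, not the implementation described in the paper.

```python
# Parallel N-1 contingency screening sketch: one placeholder "solve" per outage case.
from multiprocessing import Pool

def solve_contingency(case_id):
    # Placeholder: a real implementation would remove branch `case_id` from the network
    # model, re-run a power flow, and return any limit violations.
    worst_loading = (case_id * 37 % 100) / 100.0   # fake result for illustration
    return case_id, worst_loading

if __name__ == "__main__":
    contingencies = range(1, 10001)          # e.g. all N-1 branch outages
    with Pool() as pool:                     # one worker per available core
        results = pool.map(solve_contingency, contingencies, chunksize=100)
    violations = [(c, w) for c, w in results if w > 0.95]
    print(f"{len(violations)} contingencies exceed 95% loading")
```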

  11. Key drivers and economic consequences of high-end climate scenarios: uncertainties and risks

    DEFF Research Database (Denmark)

    Halsnæs, Kirsten; Kaspersen, Per Skougaard; Drews, Martin

    2015-01-01

    The consequences of high-end climate scenarios and the risks of extreme events involve a number of critical assumptions and methodological challenges related to key uncertainties in climate scenarios and modelling, impact analysis, and economics. A methodological framework for integrated analysis...... of extreme events increase beyond scaling, and in combination with economic assumptions we find a very wide range of risk estimates for urban precipitation events. A sensitivity analysis addresses 32 combinations of climate scenarios, damage cost curve approaches, and economic assumptions, including risk...... aversion and equity represented by discount rates. Major impacts of alternative assumptions are investigated. As a result, this study demonstrates that in terms of decision making the actual expectations concerning future climate scenarios and the economic assumptions applied are very important...
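    The combinatorial sensitivity analysis described above can be pictured as a simple nested sweep over climate scenarios, damage cost curves and economic assumptions. The numbers and functional forms below are invented placeholders, not the study's data.

```python
# Sweep over scenario x damage-curve x discount-rate x risk-aversion combinations and report a
# present-value risk estimate for each (all values illustrative).
from itertools import product

scenarios   = {"RCP4.5": 1.0, "RCP8.5 high-end": 1.8}           # scaling of extreme rainfall
cost_curves = {"linear": lambda x: 100e6 * x,                    # damages in EUR per year
               "convex": lambda x: 60e6 * x ** 2}
discount    = {"1%": 0.01, "4%": 0.04}
aversion    = {"neutral": 1.0, "averse": 1.5}

for (s, scale), (c, curve), (d, r), (a, w) in product(
        scenarios.items(), cost_curves.items(), discount.items(), aversion.items()):
    annual_damage = w * curve(scale)                             # risk-weighted expected damage
    present_value = sum(annual_damage / (1 + r) ** t for t in range(1, 51))
    print(f"{s:16s} {c:7s} r={d} {a:8s} PV(50y) = {present_value / 1e9:6.2f} bn EUR")
```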

  12. Risk management & organizational uncertainty implications for the assessment of high consequence organizations

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-02-23

    Post hoc analyses have demonstrated clearly that macro-system, organizational processes have played important roles in such major catastrophes as Three Mile Island, Bhopal, Exxon Valdez, Chernobyl, and Piper Alpha. How can managers of such high-consequence organizations as nuclear power plants and nuclear explosives handling facilities be sure that similar macro-system processes are not operating in their plants? To date, macro-system effects have not been integrated into risk assessments. Part of the reason for not using macro-system analyses to assess risk may be the impression that standard organizational measurement tools do not provide hard data that can be managed effectively. In this paper, I argue that organizational dimensions, like those in ISO 9000, can be quantified and integrated into standard risk assessments.

  13. Falcon: a highly flexible open-source software for closed-loop neuroscience

    Science.gov (United States)

    Ciliberti, Davide; Kloosterman, Fabian

    2017-08-01

    Objective. Closed-loop experiments provide unique insights into brain dynamics and function. To facilitate a wide range of closed-loop experiments, we created an open-source software platform that enables high-performance real-time processing of streaming experimental data. Approach. We wrote Falcon, a C++ multi-threaded software in which the user can load and execute an arbitrary processing graph. Each node of a Falcon graph is mapped to a single thread and nodes communicate with each other through thread-safe buffers. The framework allows for easy implementation of new processing nodes and data types. Falcon was tested both on a 32-core and a 4-core workstation. Streaming data was read from either a commercial acquisition system (Neuralynx) or the open-source Open Ephys hardware, while closed-loop TTL pulses were generated with a USB module for digital output. We characterized the round-trip latency of our Falcon-based closed-loop system, as well as the specific latency contribution of the software architecture, by testing processing graphs with up to 32 parallel pipelines and eight serial stages. We finally deployed Falcon in a task of real-time detection of population bursts recorded live from the hippocampus of a freely moving rat. Main results. On Neuralynx hardware, round-trip latency was well below 1 ms and stable for at least 1 h, while on Open Ephys hardware latencies were below 15 ms. The latency contribution of the software was below 0.5 ms. Round-trip and software latencies were similar on both 32- and 4-core workstations. Falcon was used successfully to detect population bursts online with ~40 ms average latency. Significance. Falcon is a novel open-source software for closed-loop neuroscience. It has sub-millisecond intrinsic latency and gives the experimenter direct control of CPU resources. We envisage Falcon to be a useful tool to the neuroscientific community for implementing a wide variety of closed-loop experiments, including those
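    The threading model described here (one thread per graph node, thread-safe buffers between nodes) can be illustrated with a two-node graph in Python; this is a generic sketch, not Falcon's C++ implementation, and the threshold "detector" merely stands in for a real processing node.

```python
# Two-node processing graph: a source thread streams samples into a bounded thread-safe buffer
# and a detector thread consumes them, emulating a closed-loop trigger decision.
import queue, threading

def source(out_q, n=100):
    for i in range(n):
        out_q.put(i * 0.01)          # pretend this is a streamed sample
    out_q.put(None)                  # end-of-stream marker

def detector(in_q, threshold=0.8):
    while (sample := in_q.get()) is not None:
        if sample > threshold:
            print(f"event detected at sample value {sample:.2f}")  # e.g. fire a TTL pulse

buf = queue.Queue(maxsize=1024)      # bounded, thread-safe buffer between the two nodes
threads = [threading.Thread(target=source, args=(buf,)),
           threading.Thread(target=detector, args=(buf,))]
for t in threads: t.start()
for t in threads: t.join()
```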

  14. Falcon: a highly flexible open-source software for closed-loop neuroscience.

    Science.gov (United States)

    Ciliberti, Davide; Kloosterman, Fabian

    2017-08-01

    Closed-loop experiments provide unique insights into brain dynamics and function. To facilitate a wide range of closed-loop experiments, we created an open-source software platform that enables high-performance real-time processing of streaming experimental data. We wrote Falcon, a C++ multi-threaded software in which the user can load and execute an arbitrary processing graph. Each node of a Falcon graph is mapped to a single thread and nodes communicate with each other through thread-safe buffers. The framework allows for easy implementation of new processing nodes and data types. Falcon was tested both on a 32-core and a 4-core workstation. Streaming data was read from either a commercial acquisition system (Neuralynx) or the open-source Open Ephys hardware, while closed-loop TTL pulses were generated with a USB module for digital output. We characterized the round-trip latency of our Falcon-based closed-loop system, as well as the specific latency contribution of the software architecture, by testing processing graphs with up to 32 parallel pipelines and eight serial stages. We finally deployed Falcon in a task of real-time detection of population bursts recorded live from the hippocampus of a freely moving rat. On Neuralynx hardware, round-trip latency was well below 1 ms and stable for at least 1 h, while on Open Ephys hardware latencies were below 15 ms. The latency contribution of the software was below 0.5 ms. Round-trip and software latencies were similar on both 32- and 4-core workstations. Falcon was used successfully to detect population bursts online with ~40 ms average latency. Falcon is a novel open-source software for closed-loop neuroscience. It has sub-millisecond intrinsic latency and gives the experimenter direct control of CPU resources. We envisage Falcon to be a useful tool to the neuroscientific community for implementing a wide variety of closed-loop experiments, including those requiring use of complex data structures and real

  15. Software Tools for Development on the Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    Software tools for development on the Peregrine system help users build and manage software at the source code level. Cross-Platform Make and SCons: the "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python

  16. UDECON: deconvolution optimization software for restoring high-resolution records from pass-through paleomagnetic measurements

    Science.gov (United States)

    Xuan, Chuang; Oda, Hirokuni

    2015-11-01

    The rapid accumulation of continuous paleomagnetic and rock magnetic records acquired from pass-through measurements on superconducting rock magnetometers (SRM) has greatly contributed to our understanding of the paleomagnetic field and paleo-environment. Pass-through measurements are inevitably smoothed and altered by the convolution effect of SRM sensor response, and deconvolution is needed to restore high-resolution paleomagnetic and environmental signals. Although various deconvolution algorithms have been developed, the lack of easy-to-use software has hindered the practical application of deconvolution. Here, we present standalone graphical software UDECON as a convenient tool to perform optimized deconvolution for pass-through paleomagnetic measurements using the algorithm recently developed by Oda and Xuan (Geochem Geophys Geosyst 15:3907-3924, 2014). With the preparation of a format file, UDECON can directly read pass-through paleomagnetic measurement files collected at different laboratories. After the SRM sensor response is determined and loaded to the software, optimized deconvolution can be conducted using two different approaches (i.e., "Grid search" and "Simplex method") with adjustable initial values or ranges for smoothness, corrections of sample length, and shifts in measurement position. UDECON provides a suite of tools to view conveniently and check various types of original measurement and deconvolution data. Multiple steps of measurement and/or deconvolution data can be compared simultaneously to check the consistency and to guide further deconvolution optimization. Deconvolved data together with the loaded original measurement and SRM sensor response data can be saved and reloaded for further treatment in UDECON. Users can also export the optimized deconvolution data to a text file for analysis in other software.
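    The two optimisation strategies named in the abstract, a coarse grid search followed by simplex refinement, can be illustrated on a toy pass-through record. The sketch below fits a position shift and an amplitude scale so that a forward-convolved model matches a synthetic measurement; it is not the Oda and Xuan (2014) deconvolution algorithm that UDECON implements, and all signals and parameters are synthetic.

```python
# Grid search + Nelder-Mead (simplex) fit of a shift and scale for a convolved measurement.
import numpy as np
from scipy.optimize import minimize
from scipy.signal import fftconvolve

rng = np.random.default_rng(1)
depth = np.linspace(0, 1.5, 300)                            # metres along the core
true_signal = np.sin(2 * np.pi * depth / 0.4)               # synthetic magnetisation record
response = np.exp(-0.5 * np.linspace(-5, 5, 51) ** 2)       # stand-in SRM sensor response
response /= response.sum()
measured = 0.8 * fftconvolve(np.roll(true_signal, 7), response, mode="same") \
           + 0.02 * rng.standard_normal(depth.size)         # unknown shift (7) and scale (0.8)

def misfit(params):
    shift, scale = params
    model = scale * fftconvolve(np.roll(true_signal, int(round(shift))), response, mode="same")
    return float(np.sum((measured - model) ** 2))

# 1) coarse grid search over the shift, 2) simplex refinement of both parameters
best_shift = min(range(-20, 21), key=lambda s: misfit([s, 1.0]))
result = minimize(misfit, x0=[best_shift, 1.0], method="Nelder-Mead")
print("estimated shift and scale:", result.x)
```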

  17. Overdose Problem Associated with Treatment Planning Software for High Energy Photons in Response of Panama's Accident

    International Nuclear Information System (INIS)

    Attalla, E.M.; Lotayef, M.M.; Khalil, E.M.; El-Hosiny, H.A.

    2007-01-01

    The purpose of this study was to quantify dose distribution errors by comparing actual dose measurements with the values calculated by the software. To evaluate the outcome of the radiation overexposure related to Panama's accident, and to ensure that treatment planning systems (T.P.S.) are operated in accordance with an appropriate quality assurance programme, we studied central-axis and peripheral depth dose data using complex fields shaped with blocks. Material and Methods: Multidata T.P.S. software versions 2.35 and 2.40 and Helax T.P.S. software version 5.1 B were assessed. The calculated data from the treatment planning systems were verified against actual dose measurements for open and blocked high energy photon fields (Co-60, 6 MV and 18 MV photons). Results: Calculated and measured results agreed closely for both the 2-D (Multidata) and 3-D (TMS Helax) treatment planning systems, to within 1 to 2% for open fields and 0.5 to 2.5% for peripheral points in blocked fields. Discrepancies between calculated and measured data ranged from 13 to 36% along the central axis of complex blocked fields when the normalisation point was selected at Dmax; when the normalisation point was selected near or under the blocks, the difference between calculated and measured data was up to 500%. Conclusions: The present results emphasize the importance of proper selection of the normalization point in the radiation field, as this facilitates detection of aberrant dose distribution (overexposure or underexposure).
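    The role of the normalisation point can be seen with a toy calculation: renormalising the same calculated and measured plans at different points changes the apparent percentage deviation dramatically. All dose values below are invented for illustration, not data from the study.

```python
# Apparent error depends on where the plan is renormalised (all values hypothetical).
measured   = {"central axis": 95.0, "under block": 8.0}    # cGy, hypothetical measurements
calculated = {"central axis": 120.0, "under block": 40.0}  # cGy, hypothetical TPS output

for norm_point in ("central axis", "under block"):
    for point in measured:
        calc_rel = calculated[point] / calculated[norm_point]   # plan renormalised here
        meas_rel = measured[point] / measured[norm_point]
        deviation = 100.0 * (calc_rel - meas_rel) / meas_rel
        print(f"normalised at {norm_point:12s} -> {point:12s}: {deviation:+7.1f}% deviation")
```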

  18. Overdose problem associated with treatment planning software for high energy photons in response of Panama's accident.

    Science.gov (United States)

    Attalla, Ehab M; Lotayef, Mohamed M; Khalil, Ehab M; El-Hosiny, Hesham A; Nazmy, Mohamed S

    2007-06-01

    The purpose of this study was to quantify dose distribution errors by comparing actual dose measurements with the values calculated by the software. To evaluate the outcome of the radiation overexposure related to Panama's accident, and to ensure that treatment planning systems (T.P.S.) are operated in accordance with an appropriate quality assurance programme, we studied central-axis and peripheral depth dose data using complex fields shaped with blocks. Multidata T.P.S. software versions 2.35 and 2.40 and Helax T.P.S. software version 5.1 B were assessed. The calculated data from the treatment planning systems were verified against actual dose measurements for open and blocked high energy photon fields (Co-60, 6 MV & 18 MV photons). Calculated and measured results agreed closely for both the 2-D (Multidata) and 3-D (TMS Helax) treatment planning systems, to within 1 to 2% for open fields and 0.5 to 2.5% for peripheral points in blocked fields. Discrepancies between calculated and measured data ranged from 13 to 36% along the central axis of complex blocked fields when the normalisation point was selected at Dmax; when the normalisation point was selected near or under the blocks, the difference between calculated and measured data was up to 500%. The present results emphasize the importance of proper selection of the normalization point in the radiation field, as this facilitates detection of aberrant dose distribution (overexposure or underexposure).

  19. GiA Roots: software for the high throughput analysis of plant root system architecture

    Science.gov (United States)

    2012-01-01

    Background Characterizing root system architecture (RSA) is essential to understanding the development and function of vascular plants. Identifying RSA-associated genes also represents an underexplored opportunity for crop improvement. Software tools are needed to accelerate the pace at which quantitative traits of RSA are estimated from images of root networks. Results We have developed GiA Roots (General Image Analysis of Roots), a semi-automated software tool designed specifically for the high-throughput analysis of root system images. GiA Roots includes user-assisted algorithms to distinguish root from background and a fully automated pipeline that extracts dozens of root system phenotypes. Quantitative information on each phenotype, along with intermediate steps for full reproducibility, is returned to the end-user for downstream analysis. GiA Roots has a GUI front end and a command-line interface for interweaving the software into large-scale workflows. GiA Roots can also be extended to estimate novel phenotypes specified by the end-user. Conclusions We demonstrate the use of GiA Roots on a set of 2393 images of rice roots representing 12 genotypes from the species Oryza sativa. We validate trait measurements against prior analyses of this image set that demonstrated that RSA traits are likely heritable and associated with genotypic differences. Moreover, we demonstrate that GiA Roots is extensible and an end-user can add functionality so that GiA Roots can estimate novel RSA traits. In summary, we show that the software can function as an efficient tool as part of a workflow to move from large numbers of root images to downstream analysis. PMID:22834569
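    The core of such a pipeline, separating root pixels from background and reducing the mask to a few whole-network phenotypes, can be sketched in a few lines. This is a generic illustration on a synthetic image, not GiA Roots' algorithms.

```python
# Threshold a synthetic root image and extract simple whole-network phenotypes.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
image = rng.normal(0.2, 0.05, size=(256, 256))        # noisy dark background
image[40:220, 120:124] += 0.6                          # fake primary root
image[100:104, 60:200] += 0.5                          # fake lateral root

threshold = image.mean() + 2 * image.std()             # user-assisted in the real tool
root_mask = image > threshold

_, n_components = ndimage.label(root_mask)             # connected pieces of the network
network_area = int(root_mask.sum())                    # proxy for total network size (pixels)
rows, cols = np.nonzero(root_mask)
depth = np.ptp(rows) + 1                                # bounding-box extent of the network
width = np.ptp(cols) + 1

print(f"components: {n_components}, area: {network_area} px, "
      f"depth: {depth} px, width: {width} px, depth/width ratio: {depth / width:.2f}")
```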

  20. Software project estimation the fundamentals for providing high quality information to decision makers

    CERN Document Server

    Abran, Alain

    2015-01-01

    Software projects are often late and over budget, and this leads to major problems for software customers. Clearly, there is a serious issue in estimating a realistic software project budget. Furthermore, generic estimation models cannot be trusted to provide credible estimates for projects as complex as software projects. This book presents a number of examples using data collected over the years from various organizations building software. It also presents an overview of the not-for-profit organization that collects data on software projects, the International Software Benchmarking Stan

  1. Quality Market: Design and Field Study of Prediction Market for Software Quality Control

    Science.gov (United States)

    Krishnamurthy, Janaki

    2010-01-01

    Given the increasing competition in the software industry and the critical consequences of software errors, it has become important for companies to achieve high levels of software quality. While cost reduction and timeliness of projects continue to be important measures, software companies are paying increasing attention to identifying the user…

  2. The Optimal Pricing of Computer Software and Other Products with High Switching Costs

    OpenAIRE

    Pekka Ahtiala

    2004-01-01

    The paper studies the determinants of the optimum prices of computer programs and their upgrades. It is based on the notion that because of the human capital invested in the use of a computer program by its user, this product has high switching costs, and on the finding that pirates are responsible for generating over 80 per cent of new software sales. A model to maximize the present value of the program to the program house is constructed to determine the optimal prices of initial programs a...

  3. Windows-based sodium liquid high-speed measuring system software development

    International Nuclear Information System (INIS)

    Kolokol'tsev, M.V.

    2005-01-01

    This work describes the development of software that captures data from the liquid sodium parameter measuring system, processes and displays the information in real time, and supports retrieval, visualization and documentation of the information in the post-startup period. A non-standard solution is described: a high-speed data capture system based on Windows and relatively inexpensive hardware components. A technical description (classes, interface elements) of the developed and deployed software units that implement data capture and post-startup information visualization is given. (author)

  4. Transformation as a Design Process and Runtime Architecture for High Integrity Software

    Energy Technology Data Exchange (ETDEWEB)

    Bespalko, S.J.; Winter, V.L.

    1999-04-05

    We have discussed two aspects of creating high integrity software that greatly benefit from the availability of transformation technology, which in this case manifests as the requirement for a sophisticated backtracking parser. First, because of the potential for correctly manipulating programs via small changes, an automated non-procedural transformation system can be a valuable tool for constructing high assurance software. Second, modeling the process of translating data into information as a (possibly context-dependent) grammar leads to an efficient, compact implementation. From a practical perspective, the transformation process should begin in the domain language in which a problem is initially expressed; thus, in order for a transformation system to be practical, it must be flexible with respect to domain-specific languages. We have argued that transformation applied to specification results in a highly reliable system. We also attempted to demonstrate briefly that transformation technology applied to the runtime environment will result in a safe and secure system. We thus believe that sophisticated multi-lookahead backtracking parsing technology is central to demonstrating the existence of high integrity software (HIS).
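    The backtracking parsing machinery the report leans on can be illustrated with a minimal recursive-descent parser that retries an alternative from the saved position when a production fails. The toy grammar below (sums of numbers with parentheses) is invented for illustration.

```python
# Backtracking recursive-descent parser for:  expr := term '+' expr | term
#                                             term := NUMBER | '(' expr ')'
def parse_expr(tokens, pos):
    # alternative 1: term '+' expr   (try it; fall back to `pos` on failure)
    result = parse_term(tokens, pos)
    if result is not None:
        value, nxt = result
        if nxt < len(tokens) and tokens[nxt] == "+":
            rest = parse_expr(tokens, nxt + 1)
            if rest is not None:
                rvalue, rnext = rest
                return value + rvalue, rnext
    # alternative 2: plain term (backtracking: re-parse from the original position)
    return parse_term(tokens, pos)

def parse_term(tokens, pos):
    if pos < len(tokens) and tokens[pos].isdigit():
        return int(tokens[pos]), pos + 1
    if pos < len(tokens) and tokens[pos] == "(":
        inner = parse_expr(tokens, pos + 1)
        if inner is not None and inner[1] < len(tokens) and tokens[inner[1]] == ")":
            return inner[0], inner[1] + 1
    return None

tokens = list("(1+2)+3")
print(parse_expr(tokens, 0))   # -> (6, 7): parsed value 6, all 7 tokens consumed
```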

  5. Belle II Software

    International Nuclear Information System (INIS)

    Kuhr, T; Ritter, M

    2016-01-01

    Belle II is a next generation B factory experiment that will collect 50 times more data than its predecessor, Belle. The higher luminosity at the SuperKEKB accelerator leads to higher background levels and requires a major upgrade of the detector. As a consequence, the simulation, reconstruction, and analysis software must also be upgraded substantially. Most of the software has been redesigned from scratch, taking into account the experience from Belle and other experiments and utilizing new technologies. The large amount of experimental and simulated data requires a high level of reliability and reproducibility, even in parallel environments. Several technologies, tools, and organizational measures are employed to evaluate and monitor the performance of the software during development. (paper)

  6. Lessons from the domestic Ebola response: Improving health care system resilience to high consequence infectious diseases.

    Science.gov (United States)

    Meyer, Diane; Kirk Sell, Tara; Schoch-Spana, Monica; Shearer, Matthew P; Chandler, Hannah; Thomas, Erin; Rose, Dale A; Carbone, Eric G; Toner, Eric

    2018-05-01

    The domestic response to the West Africa Ebola virus disease (EVD) epidemic from 2014-2016 provides a unique opportunity to distill lessons learned about health sector planning and operations from those individuals directly involved. This research project aimed to identify and integrate these lessons into an actionable checklist that can improve health sector resilience to future high-consequence infectious disease (HCID) events. Interviews (N = 73) were completed with individuals involved in the domestic EVD response in 4 cities (Atlanta, Dallas, New York, and Omaha), and included individuals who worked in academia, emergency management, government, health care, law, media, and public health during the response. Interviews were transcribed and analyzed qualitatively. Two focus groups were then conducted to expand on themes identified in the interviews. Using these themes, an evidence-informed checklist was developed and vetted for completeness and feasibility by an expert advisory group. Salient themes identified included health care facility issues (specifically identifying assessment and treatment hospitals, isolation and treatment unit layout, waste management, community relations, patient identification, patient isolation, limitations on treatment, laboratories, and research considerations) and health care workforce issues (specifically psychosocial impact, unit staffing, staff training, and proper personal protective equipment). The experiences of those involved in the domestic Ebola response provide critical lessons that can help strengthen resilience of health care systems and improve future responses to HCID events. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. All rights reserved.

  7. The renal consequences of maternal obesity in offspring are overwhelmed by postnatal high fat diet

    Science.gov (United States)

    Glastras, Sarah J.; Chen, Hui; Tsang, Michael; Teh, Rachel; McGrath, Rachel T.; Zaky, Amgad; Chen, Jason; Wong, Muh Geot; Pollock, Carol A.; Saad, Sonia

    2017-01-01

    Aims/Hypothesis Developmental programming induced by maternal obesity influences the development of chronic disease in offspring. In the present study, we aimed to determine whether maternal obesity exaggerates obesity-related kidney disease. Methods Female C57BL/6 mice were fed high-fat diet (HFD) for six weeks prior to mating, during gestation and lactation. Male offspring were weaned to normal chow or HFD. At postnatal Week 8, HFD-fed offspring were administered one dose streptozotocin (STZ, 100 mg/kg i.p.) or vehicle control. Metabolic parameters and renal functional and structural changes were observed at postnatal Week 32. Results HFD-fed offspring had increased adiposity, glucose intolerance and hyperlipidaemia, associated with increased albuminuria and serum creatinine levels. Their kidneys displayed structural changes with increased levels of fibrotic, inflammatory and oxidative stress markers. STZ administration did not potentiate the renal effects of HFD. Though maternal obesity had a sustained effect on serum creatinine and oxidative stress markers in lean offspring, the renal consequences of maternal obesity were overwhelmed by the powerful effect of diet-induced obesity. Conclusion Maternal obesity portends significant risks for metabolic and renal health in adult offspring. However, diet-induced obesity is an overwhelming and potent stimulus for the development of CKD that is not potentiated by maternal obesity. PMID:28225809

  8. High-intensity interval exercise and cerebrovascular health: curiosity, cause, and consequence.

    Science.gov (United States)

    Lucas, Samuel J E; Cotter, James D; Brassard, Patrice; Bailey, Damian M

    2015-06-01

    Exercise is a uniquely effective and pluripotent medicine against several noncommunicable diseases of westernised lifestyles, including protection against neurodegenerative disorders. High-intensity interval exercise training (HIT) is emerging as an effective alternative to current health-related exercise guidelines. Compared with traditional moderate-intensity continuous exercise training, HIT confers equivalent if not indeed superior metabolic, cardiac, and systemic vascular adaptation. Consequently, HIT is being promoted as a more time-efficient and practical approach to optimize health thereby reducing the burden of disease associated with physical inactivity. However, no studies to date have examined the impact of HIT on the cerebrovasculature and corresponding implications for cognitive function. This review critiques the implications of HIT for cerebrovascular function, with a focus on the mechanisms and translational impact for patient health and well-being. It also introduces similarly novel interventions currently under investigation as alternative means of accelerating exercise-induced cerebrovascular adaptation. We highlight a need for studies of the mechanisms and thereby also the optimal dose-response strategies to guide exercise prescription, and for studies to explore alternative approaches to optimize exercise outcomes in brain-related health and disease prevention. From a clinical perspective, interventions that selectively target the aging brain have the potential to prevent stroke and associated neurovascular diseases.

  9. The renal consequences of maternal obesity in offspring are overwhelmed by postnatal high fat diet.

    Directory of Open Access Journals (Sweden)

    Sarah J Glastras

    Full Text Available Developmental programming induced by maternal obesity influences the development of chronic disease in offspring. In the present study, we aimed to determine whether maternal obesity exaggerates obesity-related kidney disease.Female C57BL/6 mice were fed high-fat diet (HFD for six weeks prior to mating, during gestation and lactation. Male offspring were weaned to normal chow or HFD. At postnatal Week 8, HFD-fed offspring were administered one dose streptozotocin (STZ, 100 mg/kg i.p. or vehicle control. Metabolic parameters and renal functional and structural changes were observed at postnatal Week 32.HFD-fed offspring had increased adiposity, glucose intolerance and hyperlipidaemia, associated with increased albuminuria and serum creatinine levels. Their kidneys displayed structural changes with increased levels of fibrotic, inflammatory and oxidative stress markers. STZ administration did not potentiate the renal effects of HFD. Though maternal obesity had a sustained effect on serum creatinine and oxidative stress markers in lean offspring, the renal consequences of maternal obesity were overwhelmed by the powerful effect of diet-induced obesity.Maternal obesity portends significant risks for metabolic and renal health in adult offspring. However, diet-induced obesity is an overwhelming and potent stimulus for the development of CKD that is not potentiated by maternal obesity.

  10. Back pain and its consequences among Polish Air Force pilots flying high performance aircraft

    Directory of Open Access Journals (Sweden)

    Aleksandra Truszczyńska

    2014-04-01

    Full Text Available Objectives: Back pain in Air Force fast jet pilots has been studied by several air forces and found to be relatively common. The objective of the study was to determine the prevalence and degree of the pain intensity in the cervical, thoracic and lumbar spine, subjective risk factors and their effect on the pilots' performance while flying high maneuver aircrafts and the consequences for cognitive deficiencies. Material and Methods: The study was designed as a retrospective, anonymous questionnaire survey, collecting data on the age, aircraft type, flying hours, pain characteristics, physical activity, etc. The study was participated by 94 pilots aged 28-45 years (mean age: 35.9±3.3 years, actively flying fast jet aircrafts Su-22, Mig-29 and F-16. The estimates regarding the level of the subjective back pain were established using visual analogue scales (VAS. Results: The values of the Cochran and Cox T-test for heterogeneous variances are as follows: for the total number of flying hours: F = 2.53, p = 0.0145, for the pilot's age: F = 3.15, p = 0.003, and for the BMI factor F = 2.73, p = 0.008. Conclusions: Our questionnaire survey showed a significant problem regarding spinal conditions in high performance aircraft pilots. The determination of the risk factors may lead to solving this problem and help eliminate the effect of the unfavorable environment on piloting jet aircrafts. Experiencing back pain during the flight might influence the mission performance and flight safety. The costs of pilots education are enormous and inability to fly, or even disability, leads to considerable economic loss. More research on specific prevention strategies is warranted in order to improve the in-flight working environment of fighter pilots.

  11. A multidimensional conspiracy around the software industry: arguments for intervention in high technology sectors

    Directory of Open Access Journals (Sweden)

    Emerson Wilian Araújo

    2010-12-01

    Full Text Available This paper analyses the current debate about governmental intervention in high tech industries. On one hand, there are arguments against such interventions that consider the high tech industries as a capital intensive sector; instead of technological intensity, the capital intensity determines the value added. On the other hand, we could prove that there is at least one high tech industry, which is job intensive and non capital intensive, with high value added, that generates several positive externalities (the software industry. Such externalities are important arguments to support governmental intervention in high tech industries and cast doubts on the capital’s value added argument.Este artigo analisa o debate atual relacionado à intervenção estatal nas indústrias de alta intensidade tecnológica. Por um lado, existem argumentos contra tais intervenções que consideram essas indústrias setores também intensivos em capital. Ao contrário do que se pensa, para esta corrente, a intensidade de capital – e não a tecnológica – seria o fator determinante da adição de valor. Por ouro lado, foi possível provar, neste trabalho, que existe ao menos uma indústria de alta intensidade tecnológica, intensiva em trabalho e não em capital, com alto valor adicionado, que gera diversas externalidades positivas em âmbitos social e econômico: a indústria de software. Tais externalidades servem como importantes argumentos para justificar a intervenção estatal em setores de alta intensidade tecnológica e lança dúvidas sobre o argumento da exclusividade do capital como determinante de adição de valor.

  12. Future consequences of decreasing marginal production efficiency in the high-yielding dairy cow.

    Science.gov (United States)

    Moallem, U

    2016-04-01

    The objectives were to examine the gross and marginal production efficiencies in high-yielding dairy cows and the future consequences on dairy industry profitability. Data from 2 experiments were used in across-treatments analysis (n=82 mid-lactation multiparous Israeli-Holstein dairy cows). Milk yields, body weights (BW), and dry matter intakes (DMI) were recorded daily. In both experiments, cows were fed a diet containing 16.5 to 16.6% crude protein and net energy for lactation (NEL) at 1.61 Mcal/kg of dry matter (DM). The means of milk yield, BW, DMI, NEL intake, and energy required for maintenance were calculated individually over the whole study, and used to calculate gross and marginal efficiencies. Data were analyzed in 2 ways: (1) simple correlation between variables; and (2) cows were divided into 3 subgroups, designated low, moderate, and high DMI (LDMI, MDMI, and HDMI), according to actual DMI per day: ≤ 26 kg (n=27); >26 through 28.2 kg (n=28); and >28.2 kg (n=27). The phenotypic Pearson correlations among variables were analyzed, and the GLM procedure was used to test differences between subgroups. The relationships between milk and fat-corrected milk yields and the corresponding gross efficiencies were positive, whereas BW and gross production efficiency were negatively correlated. The marginal production efficiency from DM and energy consumed decreased with increasing DMI. The difference between BW gain as predicted by the National Research Council model (2001) and the present measurements increased with increasing DMI (r=0.68). The average calculated energy balances were 1.38, 2.28, and 4.20 Mcal/d (standard error of the mean=0.64) in the LDMI, MDMI, and HDMI groups, respectively. The marginal efficiency for milk yields from DMI or energy consumed was highest in LDMI, intermediate in MDMI, and lowest in HDMI. The predicted BW gains for the whole study period were 22.9, 37.9, and 75.8 kg for the LDMI, MDMI, and HDMI groups, respectively. The

  13. An ultra short pulse reconstruction software applied to the GEMINI high power laser system

    Energy Technology Data Exchange (ETDEWEB)

    Galletti, Mario, E-mail: mario.gall22@gmail.com [INFN – LNF, Via Enrico Fermi 40, 00044 Frascati (Italy); Galimberti, Marco [Central Laser Facility, Rutherford Appleton Laboratory, Didcot (United Kingdom); Hooker, Chris [Central Laser Facility, Rutherford Appleton Laboratory, Didcot (United Kingdom); University of Oxford, Oxford (United Kingdom); Chekhlov, Oleg; Tang, Yunxin [Central Laser Facility, Rutherford Appleton Laboratory, Didcot (United Kingdom); Bisesto, Fabrizio Giuseppe [INFN – LNF, Via Enrico Fermi 40, 00044 Frascati (Italy); Curcio, Alessandro [INFN – LNF, Via Enrico Fermi 40, 00044 Frascati (Italy); Sapienza – University of Rome, P.le Aldo Moro, 2, 00185 Rome (Italy); Anania, Maria Pia [INFN – LNF, Via Enrico Fermi 40, 00044 Frascati (Italy); Giulietti, Danilo [Physics Department of the University and INFN, Pisa (Italy)

    2016-09-01

    The GRENOUILLE traces of Gemini pulses (15 J, 30 fs, PW, one shot per 20 s) were acquired in the Gemini Target Area PetaWatt at the Central Laser Facility (CLF), Rutherford Appleton Laboratory (RAL). The laser pulse parameters were characterized using two different types of algorithm, VideoFrog and GRenouille/FrOG (GROG), and the results were compared. The temporal and spectral parameters obtained with the two algorithms were in close agreement. This experimental campaign showed that GROG, the algorithm developed here, performs as well as the VideoFrog algorithm on PetaWatt-class pulses. - Highlights: • Integration of the diagnostic tool on a high power laser. • Validation of the GROG algorithm against well-known commercially available software. • Complete characterization of the GEMINI ultra-short high power laser pulse.

  14. Installing and Setting Up Git Software Tool on Windows | High-Performance Computing | NREL

    Science.gov (United States)

    Learn how to set up the Git software tool on Windows for use with the Peregrine system. In this doc, we'll show you how to get Git installed on Windows 7, and how to get things set up on NREL's

  15. The High-Level Interface Definitions in the ASTRI/CTA Mini Array Software System (MASS)

    Science.gov (United States)

    Conforti, V.; Tosti, G.; Schwarz, J.; Bruno, P.; Cefal‘A, M.; Paola, A. D.; Gianotti, F.; Grillo, A.; Russo, F.; Tanci, C.; Testa, V.; Antonelli, L. A.; Canestrari, R.; Catalano, O.; Fiorini, M.; Gallozzi, S.; Giro, E.; Palombara, N. L.; Leto, G.; Maccarone, M. C.; Pareschi, G.; Stringhetti, L.; Trifoglio, M.; Vercellone, S.; Astri Collaboration; Cta Consortium

    2015-09-01

    ASTRI (Astrofisica con Specchi a Tecnologia Replicante Italiana) is a Flagship Project funded by the Italian Ministry of Education, University and Research, and led by INAF, the Italian National Institute of Astrophysics. Within this framework, INAF is currently developing an end-to-end prototype, named ASTRI SST-2M, of a Small Size Dual-Mirror Telescope for the Cherenkov Telescope Array, CTA. A second goal of the project is the realization of the ASTRI/CTA mini-array, which will be composed of seven SST-2M telescopes placed at the CTA Southern Site. The ASTRI Mini Array Software System (MASS) is designed to support the ASTRI/CTA mini-array operations. MASS is being built on top of the ALMA Common Software (ACS) framework, which provides support for the implementation of distributed data acquisition and control systems, and functionality for log and alarm management, message driven communication and hardware devices management. The first version of the MASS system, which will comply with the CTA requirements and guidelines, will be tested on the ASTRI SST-2M prototype. In this contribution we present the interface definitions of the MASS high level components in charge of the ASTRI SST-2M observation scheduling, telescope control and monitoring, and data taking. Particular emphasis is given to their potential reuse for the ASTRI/CTA mini-array.

  16. Impact of Recent Hardware and Software Trends on High Performance Transaction Processing and Analytics

    Science.gov (United States)

    Mohan, C.

    In this paper, I survey briefly some of the recent and emerging trends in hardware and software features which impact high performance transaction processing and data analytics applications. These features include multicore processor chips, ultra large main memories, flash storage, storage class memories, database appliances, field programmable gate arrays, transactional memory, key-value stores, and cloud computing. While some applications, e.g., Web 2.0 ones, were initially built without traditional transaction processing functionality in mind, slowly system architects and designers are beginning to address such previously ignored issues. The availability, analytics and response time requirements of these applications were initially given more importance than ACID transaction semantics and resource consumption characteristics. A project at IBM Almaden is studying the implications of phase change memory on transaction processing, in the context of a key-value store. Bitemporal data management has also become an important requirement, especially for financial applications. Power consumption and heat dissipation properties are also major considerations in the emergence of modern software and hardware architectural features. Considerations relating to ease of configuration, installation, maintenance and monitoring, and improvement of total cost of ownership have resulted in database appliances becoming very popular. The MapReduce paradigm is now quite popular for large scale data analysis, in spite of the major inefficiencies associated with it.

  17. Risk-averse decision-making for civil infrastructure exposed to low-probability, high-consequence events

    International Nuclear Information System (INIS)

    Cha, Eun Jeong; Ellingwood, Bruce R.

    2012-01-01

    Quantitative analysis and assessment of risk to civil infrastructure has two components: probability of a potentially damaging event and consequence of damage, measured in terms of financial or human losses. Decision models that have been utilized during the past three decades take into account the probabilistic component rationally, but address decision-maker attitudes toward consequences and risk only to a limited degree. The application of models reflecting these attitudes to decisions involving low-probability, high-consequence events that may impact civil infrastructure requires a fundamental understanding of risk acceptance attitudes and how they affect individual and group choices. In particular, the phenomenon of risk aversion may be a significant factor in decisions for civil infrastructure exposed to low-probability events with severe consequences, such as earthquakes, hurricanes or floods. This paper utilizes cumulative prospect theory to investigate the role and characteristics of risk-aversion in assurance of structural safety.
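    Cumulative prospect theory's treatment of a low-probability, high-consequence loss can be sketched with the standard Tversky-Kahneman (1992) functional forms and parameter estimates; this is a textbook illustration, not the specific decision model developed in the paper, and the loss scenario is invented.

```python
# CPT evaluation of a rare, severe loss versus its risk-neutral expected value.
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.69):
    # probability weighting for losses: small probabilities are overweighted
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

p_event, loss = 1e-4, -1e8          # assumed annual chance of severe failure and its $ loss
expected_loss = p_event * loss                       # risk-neutral expected damages
cpt_value = weight(p_event) * value(loss)            # prospect-theoretic evaluation
print(f"expected loss:       {expected_loss:,.0f}")
print(f"CPT-weighted value:  {cpt_value:,.0f}")
print(f"decision weight w(p) = {weight(p_event):.6f} vs p = {p_event}")
```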

  18. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  19. NiftyPET: a High-throughput Software Platform for High Quantitative Accuracy and Precision PET Imaging and Analysis.

    Science.gov (United States)

    Markiewicz, Pawel J; Ehrhardt, Matthias J; Erlandsson, Kjell; Noonan, Philip J; Barnes, Anna; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Ourselin, Sebastien

    2018-01-01

    We present a standalone, scalable and high-throughput software platform for PET image reconstruction and analysis. We focus on high fidelity modelling of the acquisition processes to provide high accuracy and precision quantitative imaging, especially for large axial field of view scanners. All the core routines are implemented using parallel computing available from within the Python package NiftyPET, enabling easy access, manipulation and visualisation of data at any processing stage. The pipeline of the platform starts from MR and raw PET input data and is divided into the following processing stages: (1) list-mode data processing; (2) accurate attenuation coefficient map generation; (3) detector normalisation; (4) exact forward and back projection between sinogram and image space; (5) estimation of reduced-variance random events; (6) high accuracy fully 3D estimation of scatter events; (7) voxel-based partial volume correction; (8) region- and voxel-level image analysis. We demonstrate the advantages of this platform using an amyloid brain scan where all the processing is executed from a single and uniform computational environment in Python. The high accuracy acquisition modelling is achieved through span-1 (no axial compression) ray tracing for true, random and scatter events. Furthermore, the platform offers uncertainty estimation of any image derived statistic to facilitate robust tracking of subtle physiological changes in longitudinal studies. The platform also supports the development of new reconstruction and analysis algorithms through restricting the axial field of view to any set of rings covering a region of interest and thus performing fully 3D reconstruction and corrections using real data significantly faster. All the software is available as open source with the accompanying wiki-page and test data.

  20. 3D modeling of high-Tc superconductors by finite element software

    International Nuclear Information System (INIS)

    Zhang Min; Coombs, T A

    2012-01-01

    A three-dimensional (3D) numerical model is proposed to solve the electromagnetic problems involving transport current and background field of a high-Tc superconducting (HTS) system. The model is characterized by the E–J power law and H-formulation, and is successfully implemented using finite element software. We first discuss the model in detail, including the mesh methods, boundary conditions and computing time. To validate the 3D model, we calculate the ac loss and trapped field solution for a bulk material and compare the results with the previously verified 2D solutions and an analytical solution. We then apply our model to test some typical problems such as superconducting bulk arrays and twisted conductors, which cannot be tackled by the 2D models. The new 3D model could be a powerful tool for researchers and engineers to investigate problems with a greater level of complexity.
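
    For reference, the two ingredients named above are commonly written as follows in H-formulation models (a generic statement of the method; the authors' exact implementation may differ). The E-J power law supplies a nonlinear resistivity, and Faraday's law is solved for the magnetic field components:

      \mathbf{E} = E_c \left( \frac{|\mathbf{J}|}{J_c} \right)^{n} \frac{\mathbf{J}}{|\mathbf{J}|},
      \qquad
      \nabla \times \left( \rho(\mathbf{J}) \, \nabla \times \mathbf{H} \right) = -\mu_0 \mu_r \frac{\partial \mathbf{H}}{\partial t},
      \qquad
      \mathbf{J} = \nabla \times \mathbf{H}

    where \rho(\mathbf{J}) = E_c |\mathbf{J}|^{n-1} / J_c^{n} is the field-dependent resistivity and the exponent n controls how sharply the superconductor transitions at the critical current density J_c.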

  1. Development of software in LabVIEW for measurement of transport properties of high Tc superconductors

    International Nuclear Information System (INIS)

    Reilly, D.; Savvides, N.

    1996-01-01

    Full text: The gathering of data and their analysis are vital processes in experiments. We have used LabVIEW (National Instruments) to develop programs to measure transport properties of high-Tc superconductors, e.g. resistivity, ac susceptibility and I-V characteristics. Our systems make use of GPIB (IEEE-488.2) programmable instruments and a personal computer. LabVIEW is a graphical programming system for instrument control and data acquisition, data analysis and presentation. A key feature of LabVIEW is the ability to graphically assemble software modules or virtual instruments (VIs) and 'wire' them together. In this paper we describe the development of several programs and offer advice to colleagues wanting to explore LabVIEW

  2. Automated load balancing in the ATLAS high-performance storage software

    CERN Document Server

    Le Goff, Fabrice; The ATLAS collaboration

    2017-01-01

    The ATLAS experiment collects proton-proton collision events delivered by the LHC accelerator at CERN. The ATLAS Trigger and Data Acquisition (TDAQ) system selects, transports and eventually records event data from the detector at several gigabytes per second. The data are recorded on transient storage before being delivered to permanent storage. The transient storage consists of high-performance direct-attached storage servers accounting for about 500 hard drives. The transient storage operates dedicated software in the form of a distributed multi-threaded application. The workload includes both CPU-demanding and IO-oriented tasks. This paper presents the original application threading model for this particular workload, discussing the load-sharing strategy among the available CPU cores. The limitations of this strategy were reached in 2016 due to changes in the trigger configuration involving a new data distribution pattern. We then describe a novel data-driven load-sharing strategy, designed to automatical...
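
    Purely as an illustration of the kind of data-driven load sharing described above (this is not the ATLAS TDAQ code), the sketch below routes each incoming request to the least-loaded worker rather than using a fixed static assignment; the names and the cost model are assumptions.

      import heapq
      from collections import defaultdict

      class LeastLoadedDispatcher:
          """Route each request to the worker with the smallest accumulated load.
          Completed work is not credited back, which keeps the sketch minimal."""

          def __init__(self, n_workers):
              self.heap = [(0.0, w) for w in range(n_workers)]   # (load, worker_id)
              heapq.heapify(self.heap)
              self.assignments = defaultdict(list)

          def dispatch(self, request_id, cost):
              load, worker = heapq.heappop(self.heap)            # least-loaded worker
              self.assignments[worker].append(request_id)
              heapq.heappush(self.heap, (load + cost, worker))
              return worker

      dispatcher = LeastLoadedDispatcher(n_workers=4)
      for i, cost in enumerate([5, 1, 1, 1, 4, 2, 2, 3]):        # mixed CPU/IO costs
          dispatcher.dispatch(f"event-{i}", cost)
      print(dict(dispatcher.assignments))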

  3. High-Level software requirements specification for the TWRS controlled baseline database system

    International Nuclear Information System (INIS)

    Spencer, S.G.

    1998-01-01

    This Software Requirements Specification (SRS) is an as-built document that presents the Tank Waste Remediation System (TWRS) Controlled Baseline Database (TCBD) in its current state. It was originally known as the Performance Measurement Control System (PMCS). Conversion to the new system name has not occurred within the current production system. Therefore, for simplicity, all references to TCBD are equivalent to PMCS references. This SRS will reference the PMCS designator from this point forward to capture the as-built SRS. This SRS is written at a high level and is intended to provide the design basis for the PMCS. The PMCS was first released as the electronic data repository for cost, schedule, and technical administrative baseline information for the TWRS Program. During its initial development, the PMCS was accepted by the customer, TWRS Business Management, with no formal documentation to capture the initial requirements

  4. Enabling Diverse Software Stacks on Supercomputers using High Performance Virtual Clusters.

    Energy Technology Data Exchange (ETDEWEB)

    Younge, Andrew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brightwell, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    While large-scale simulations have been the hallmark of the High Performance Computing (HPC) community for decades, Large Scale Data Analytics (LSDA) workloads are gaining attention within the scientific community not only as a processing component to large HPC simulations, but also as standalone scientific tools for knowledge discovery. With the path towards Exascale, new HPC runtime systems are also emerging in a way that differs from classical distributed computing models. However, system software for such capabilities on the latest extreme-scale DOE supercomputers needs to be enhanced to more appropriately support these types of emerging software ecosystems. In this paper, we propose the use of Virtual Clusters on advanced supercomputing resources to enable systems to support not only HPC workloads, but also emerging big data stacks. Specifically, we have deployed the KVM hypervisor within Cray's Compute Node Linux on a XC-series supercomputer testbed. We also use libvirt and QEMU to manage and provision VMs directly on compute nodes, leveraging Ethernet-over-Aries network emulation. To our knowledge, this is the first known use of KVM on a true MPP supercomputer. We investigate the overhead of our solution using HPC benchmarks, both evaluating single-node performance as well as weak scaling of a 32-node virtual cluster. Overall, we find single node performance of our solution using KVM on a Cray is very efficient with near-native performance. However, overhead increases by up to 20% as virtual cluster size increases, due to limitations of the Ethernet-over-Aries bridged network. Furthermore, we deploy Apache Spark with large data analysis workloads in a Virtual Cluster, effectively demonstrating how diverse software ecosystems can be supported by High Performance Virtual Clusters.
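
    As a hedged sketch of how VMs can be defined and started on a compute node through the libvirt Python bindings (not the authors' actual tooling; the domain XML is a trimmed-down placeholder, and the paths, bridge name and sizes are assumptions):

      import libvirt

      # Intentionally minimal domain XML; a real deployment would add driver,
      # emulator and boot details appropriate to the target node.
      DOMAIN_XML = """
      <domain type='kvm'>
        <name>vcluster-node-0</name>
        <memory unit='GiB'>4</memory>
        <vcpu>4</vcpu>
        <os><type arch='x86_64'>hvm</type></os>
        <devices>
          <disk type='file' device='disk'>
            <source file='/var/lib/libvirt/images/vcluster-node-0.qcow2'/>
            <target dev='vda' bus='virtio'/>
          </disk>
          <interface type='bridge'>
            <source bridge='br0'/>  <!-- e.g. an Ethernet-over-fabric bridge -->
          </interface>
        </devices>
      </domain>
      """

      conn = libvirt.open('qemu:///system')   # connect to the local hypervisor
      dom = conn.defineXML(DOMAIN_XML)        # register the domain definition
      dom.create()                            # boot the VM
      print("running domains:", [d.name() for d in conn.listAllDomains()])
      conn.close()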

  5. High-Performing Families: Causes, Consequences, and Clinical Solutions. The Family Psychology and Counseling Series.

    Science.gov (United States)

    Robinson, Bryan E., Ed.; Chase, Nancy D., Ed.

    This book explores the dilemma of the increasing obsession with work and the resulting imbalances between career and family life. Through theoretical frameworks and case examples it discusses the negative consequences of the societal phenomena of over-work and over-dedication to careers, which have been misdiagnosed or ignored by mental health…

  6. Perceived Sexual Benefits of Alcohol Use among Recent High School Graduates: Longitudinal Associations with Drinking Behavior and Consequences

    Science.gov (United States)

    Brady, Sonya S.; Wilkerson, J. Michael; Jones-Webb, Rhonda

    2012-01-01

    In this research study of 153 college-bound students, perceived sexual benefits of alcohol use were associated with greater drinking and related consequences during the senior year of high school and freshman year of college. Perceived benefits predicted drinking outcomes during fall after adjustment for gender, sensation seeking, parental…

  7. Development of risk assessment simulation tool for optimal control of a low probability-high consequence disaster

    International Nuclear Information System (INIS)

    Yotsumoto, Hiroki; Yoshida, Kikuo; Genchi, Hiroshi

    2011-01-01

    In order to control a low-probability, high-consequence disaster that causes huge social and economic damage, it is necessary to develop a simultaneous risk assessment simulation tool based on a disaster risk scheme that includes the diverse effects of the primary disaster and secondary damages. We propose the scheme of this risk simulation tool. (author)

  8. Transformations, transport, and potential unintended consequences of high sulfur inputs to Napa Valley vineyards

    OpenAIRE

    Hinckley, Eve-Lyn S.; Matson, Pamela A.

    2011-01-01

    Unintended anthropogenic deposition of sulfur (S) to forest ecosystems has a range of negative consequences, identified through decades of research. There has been far less study of purposeful S use in agricultural systems around the world, including the application of elemental sulfur (S0) as a quick-reacting fungicide to prevent damage to crops. Here we report results from a three-year study of the transformations and flows of applied S0 in soils, vegetation, and hydrologic export pathways ...

  9. The impact of using computer decision-support software in primary care nurse-led telephone triage: interactional dilemmas and conversational consequences.

    Science.gov (United States)

    Murdoch, Jamie; Barnes, Rebecca; Pooler, Jillian; Lattimer, Valerie; Fletcher, Emily; Campbell, John L

    2015-02-01

    Telephone triage represents one strategy to manage demand for face-to-face GP appointments in primary care. Although computer decision-support software (CDSS) is increasingly used by nurses to triage patients, little is understood about how interaction is organized in this setting, specifically which interactional dilemmas this computer-mediated setting invokes and how these may be consequential for communication with patients. Using conversation analytic methods we undertook a multi-modal analysis of 22 audio-recorded telephone triage nurse-caller interactions from one GP practice in England, including 10 video-recordings of nurses' use of CDSS during triage. We draw on Goffman's theoretical notion of participation frameworks to make sense of these interactions, presenting 'telling cases' of interactional dilemmas nurses faced in meeting patients' needs and accurately documenting the patient's condition within the CDSS. Our findings highlight troubles in the 'interactional workability' of telephone triage, exposing difficulties faced in aligning the proximal and wider distal context that structures CDSS-mediated interactions. Patients present with diverse symptoms, understandings of triage consultations, and communication skills, which nurses need to negotiate turn-by-turn with CDSS requirements. Nurses therefore need sophisticated communication, technological and clinical skills to ensure patients' presenting problems are accurately captured within the CDSS to determine safe triage outcomes. Dilemmas around how nurses manage and record information, and the issues of professional accountability that may ensue, raise questions about the impact of CDSS and its use in supporting nurses to deliver safe and effective patient care. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. A survey on the high reliability software verification and validation technology for instrumentation and control in NPP.

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Lee, Chang Soo; Dong, In Sook [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-01-01

    This document presents the technical status of the software verification and validation (V and V) efforts to support developing and licensing digital instrumentation and control (I and C) systems in nuclear power plants. We have reviewed codes and standards intended to serve as consensus criteria among vendor, licensee and licensor. We then describe the software licensing procedures under 10 CFR 50 and 10 CFR 52 of the United States to cope with the licensing barrier. Finally, we survey the technical issues related to developing and licensing high integrity software for digital I and C systems. These technical issues indicate the direction for developing our own software V and V methodology. (Author) 13 refs., 2 figs.

  11. Testing on a Large Scale Running the ATLAS Data Acquisition and High Level Trigger Software on 700 PC Nodes

    CERN Document Server

    Burckhart-Chromek, Doris; Adragna, P; Alexandrov, L; Amorim, A; Armstrong, S; Badescu, E; Baines, J T M; Barros, N; Beck, H P; Bee, C; Blair, R; Bogaerts, J A C; Bold, T; Bosman, M; Caprini, M; Caramarcu, C; Ciobotaru, M; Comune, G; Corso-Radu, A; Cranfield, R; Crone, G; Dawson, J; Della Pietra, M; Di Mattia, A; Dobinson, Robert W; Dobson, M; Dos Anjos, A; Dotti, A; Drake, G; Ellis, Nick; Ermoline, Y; Ertorer, E; Falciano, S; Ferrari, R; Ferrer, M L; Francis, D; Gadomski, S; Gameiro, S; Garitaonandia, H; Gaudio, G; George, S; Gesualdi-Mello, A; Gorini, B; Green, B; Haas, S; Haberichter, W N; Hadavand, H; Haeberli, C; Haller, J; Hansen, J; Hauser, R; Hillier, S J; Höcker, A; Hughes-Jones, R E; Joos, M; Kazarov, A; Kieft, G; Klous, S; Kohno, T; Kolos, S; Korcyl, K; Kordas, K; Kotov, V; Kugel, A; Landon, M; Lankford, A; Leahu, L; Leahu, M; Lehmann-Miotto, G; Le Vine, M J; Liu, W; Maeno, T; Männer, R; Mapelli, L; Martin, B; Masik, J; McLaren, R; Meessen, C; Meirosu, C; Mineev, M; Misiejuk, A; Morettini, P; Mornacchi, G; Müller, M; Garcia-Murillo, R; Nagasaka, Y; Negri, A; Padilla, C; Pasqualucci, E; Pauly, T; Perera, V; Petersen, J; Pope, B; Albuquerque-Portes, M; Pretzl, K; Prigent, D; Roda, C; Ryabov, Yu; Salvatore, D; Schiavi, C; Schlereth, J L; Scholtes, I; Sole-Segura, E; Seixas, M; Sloper, J; Soloviev, I; Spiwoks, R; Stamen, R; Stancu, S; Strong, S; Sushkov, S; Szymocha, T; Tapprogge, S; Teixeira-Dias, P; Torres, R; Touchard, F; Tremblet, L; Ünel, G; Van Wasen, J; Vandelli, W; Vaz-Gil-Lopes, L; Vermeulen, J C; von der Schmitt, H; Wengler, T; Werner, P; Wheeler, S; Wickens, F; Wiedenmann, W; Wiesmann, M; Wu, X; Yasu, Y; Yu, M; Zema, F; Zobernig, H; Computing In High Energy and Nuclear Physics

    2006-01-01

    The ATLAS Data Acquisition (DAQ) and High Level Trigger (HLT) software system will initially comprise 2000 PC nodes which take part in the control, event readout, second level trigger and event filter operations. This large number of PCs will only be purchased before data taking in 2007. The large CERN IT LXBATCH facility provided the opportunity to run online functionality tests in July 2005 over a period of 5 weeks on a stepwise increasing farm size from 100 up to 700 PC dual nodes. The interplay of the control and monitoring software with the event readout, event building and the trigger software was exercised for the first time as an integrated system on this large scale. It was also new to run the trigger selection algorithms in the online environment and in the event filter processing tasks on a larger scale. A mechanism has been developed to package the offline software together with the DAQ/HLT software and to distribute it efficiently via peer-to-peer software to this large PC cluster. T...

  13. Biomarker Discovery Using New Metabolomics Software for Automated Processing of High Resolution LC-MS Data

    Science.gov (United States)

    Hnatyshyn, S.; Reily, M.; Shipkova, P.; McClure, T.; Sanders, M.; Peake, D.

    2011-01-01

    Robust biomarkers of target engagement and efficacy are required in different stages of drug discovery. Liquid chromatography coupled to high resolution mass spectrometry provides the sensitivity, accuracy and wide dynamic range required for identification of endogenous metabolites in biological matrices, and LC-MS is a widely used tool for biomarker identification and validation. Typical high resolution LC-MS profiles from biological samples may contain greater than a million mass spectral peaks corresponding to several thousand endogenous metabolites. Reduction of the total number of peaks, component identification and statistical comparison across sample groups remains a difficult and time-consuming challenge. Blood samples from four groups of rats (male vs. female, fully satiated and food deprived) were analyzed using high resolution accurate mass (HRAM) LC-MS. All samples were separated using a 15 minute reversed-phase C18 LC gradient and analyzed in both positive and negative ion modes. Data were acquired using 15K resolution and 5 ppm mass measurement accuracy. The entire data set was analyzed using software developed in collaboration between Bristol-Myers Squibb and Thermo Fisher Scientific to determine the metabolic effects of food deprivation on rats. Metabolomic LC-MS data files are extraordinarily complex, and appropriate reduction of the number of spectral peaks via identification of related peaks and background removal is essential. A single component such as hippuric acid generates more than 20 related peaks including isotopic clusters, adducts and dimers. Plasma and urine may contain 500-1500 unique quantifiable metabolites. Noise filtering approaches including blank subtraction were used to reduce the number of irrelevant peaks. By grouping related signals such as isotopic peaks and alkali adducts, data processing was greatly simplified, reducing the total number of components by 10-fold. The software processes 48 samples in under 60 minutes. Principle
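
    One simple way such grouping of related signals can work (an illustrative sketch only, not the algorithm of the software described above) is to collapse peaks that co-elute and are spaced by the 13C isotope mass difference into a single component:

      # Group co-eluting peaks separated by multiples of the 13C-12C mass difference.
      C13_SPACING = 1.00335   # Da

      def group_isotope_peaks(peaks, mz_tol=0.01, rt_tol=0.1):
          """peaks: list of (mz, rt, intensity); returns list of components (lists of peaks)."""
          peaks = sorted(peaks)                     # sort by m/z
          components, used = [], set()
          for i, (mz, rt, inten) in enumerate(peaks):
              if i in used:
                  continue
              comp = [(mz, rt, inten)]
              used.add(i)
              for j in range(i + 1, len(peaks)):
                  mz_j, rt_j, _ = peaks[j]
                  k = round((mz_j - mz) / C13_SPACING)   # how many 13C away?
                  if (k >= 1 and abs((mz_j - mz) - k * C13_SPACING) < mz_tol
                          and abs(rt_j - rt) < rt_tol):
                      comp.append(peaks[j])
                      used.add(j)
              components.append(comp)
          return components

      peaks = [(180.063, 5.2, 1e6), (181.066, 5.2, 6e4), (182.070, 5.2, 4e3), (212.010, 7.9, 5e5)]
      print(len(group_isotope_peaks(peaks)))   # 2 components instead of 4 peaks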

  14. Development of expert system software to improve performance of high-voltage arresters in substations

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Andre Nunes de; Oltremari, Anderson; Zago, Maria Goretti; Silva, Paulo Sergio da; Costa Junior, Pedro da; Ferraz, Kleber [Sao Paulo State Univ. (UNESP), Bauru, SP (Brazil). Lab. of Power Systems and Intelligent Techniques], E-mail: andrejau@feb.unesp.br; Gusmao, Euripedes Silva; Prado, Jose Martins [ELETRONORTE, MT (Brazil)], E-mail: euripedes.gusmao@eln.gov.br

    2007-07-01

    One of the main causes of interruption and power outage on the energy distribution system in Brazil is lightning, which is also the main cause of reduced service life and destruction of consumers' and utilities' equipment. As a way of improving the protection of the energy distribution system, the utilities have focused on establishing maintenance techniques, both preventive and predictive, for the high-voltage arresters in substations. Currently, one of the main ways to obtain the characteristics of installed arresters involves high-cost equipment such as leakage current meters. This paper therefore aims to obtain reliable results using lower-cost equipment, proposing expert system software for diagnosis and decision support based on intelligent techniques, which makes it possible to monitor service life and identify aged arresters, allowing a reliable schedule to be established for removing equipment, whether for maintenance or for replacement. (author)

  15. Software implementation of a high speed interface between a PDP-10 and several PDP-11s

    International Nuclear Information System (INIS)

    De Mesa, N.P. III.

    1975-01-01

    The DMA10 is a high speed link between a PDP-10 and up to eight PDP-11s; specifically, the PDP-10 shares sections of its memory with the PDP-11s. The two-segment concept on the PDP-10 of shared/reentrant code and non-shared code is implemented. The inclusion of read-only memory on the PDP-11s allows for the development of 'PROM' software which all the PDP-11s may share. The principal difference between the DMA10 and other communications interfaces is that it is not a block transfer device. Because of the shared memory concept the features of the DMA10 are high data bandwidth and minimal processor intervention between data transfers. Communication programs between the PDP-10 and the PDP-11 may be tested wholly in either processor, independent of the DMA10 interface. In the current mode of operation the PDP-11s simply act as device controllers. Future plans include separate operating systems in various PDP-11s

  16. Verification of computer system PROLOG - software tool for rapid assessments of consequences of short-term radioactive releases to the atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Kiselev, Alexey A.; Krylov, Alexey L.; Bogatov, Sergey A. [Nuclear Safety Institute (IBRAE), Bolshaya Tulskaya st. 52, 115191, Moscow (Russian Federation)

    2014-07-01

    In case of nuclear and radiation accidents emergency response authorities require a tool for rapid assessments of possible consequences. One of the most significant problems is lack of data on initial state of an accident. The lack can be especially critical in case the accident occurred in a location that was not thoroughly studied beforehand (during transportation of radioactive materials for example). One of possible solutions is the hybrid method when a model that enables rapid assessments with the use of reasonable minimum of input data is used conjointly with an observed data that can be collected shortly after accidents. The model is used to estimate parameters of the source and uncertain meteorological parameters on the base of some observed data. For example, field of fallout density can be observed and measured within hours after an accident. After that the same model with the use of estimated parameters is used to assess doses and necessity of recommended and mandatory countermeasures. The computer system PROLOG was designed to solve the problem. It is based on the widely used Gaussian model. The standard Gaussian model is supplemented with several sub-models that allow to take into account: polydisperse aerosols, aerodynamic shade from buildings in the vicinity of the place of accident, terrain orography, initial size of the radioactive cloud, effective height of the release, influence of countermeasures on the doses of radioactive exposure of humans. It uses modern GIS technologies and can use web map services. To verify ability of PROLOG to solve the problem it is necessary to test its ability to assess necessary parameters of real accidents in the past. Verification of the computer system on the data of Chazhma Bay accident (Russian Far East, 1985) was published previously. In this work verification was implemented on the base of observed contamination from the Kyshtym disaster (PA Mayak, 1957) and the Tomsk accident (1993). Observations of Sr-90
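
    For orientation, the standard Gaussian plume model on which PROLOG is reported to be based has the familiar form below (the sub-models listed above modify it, e.g. for polydisperse aerosols, building wakes and terrain). For the concentration at a point (x, y, z) downwind of a continuous release of rate Q at effective height H:

      C(x, y, z) = \frac{Q}{2 \pi u \, \sigma_y(x) \, \sigma_z(x)}
        \exp\!\left( -\frac{y^2}{2 \sigma_y^2} \right)
        \left[ \exp\!\left( -\frac{(z - H)^2}{2 \sigma_z^2} \right)
             + \exp\!\left( -\frac{(z + H)^2}{2 \sigma_z^2} \right) \right]

    where u is the wind speed and \sigma_y, \sigma_z are the lateral and vertical dispersion coefficients; estimating source parameters from observed fallout then amounts to fitting Q, H and the uncertain meteorological inputs of this expression.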

  17. Development of a software system for spatial resolved trace analysis of high performance materials with SIMS

    International Nuclear Information System (INIS)

    Brunner, Ch. H.

    1997-09-01

    The following work is separated into two distinct parts. The first deals with the SIMSScan software project, an application system for secondary ion mass spectrometry. This application system primarily lays down the foundation for the research activity introduced in the second part of this work. SIMSScan is an application system designed to provide data acquisition routines for different requirements in the field of secondary ion mass spectrometry. The whole application package is divided into three major sections, each one dealing with specific measurement tasks. Various supporting clients and wizards, providing extended functionality to the main application, build the core of the software. The MassScan and DepthScan modules operate the SIMS in the direct imaging or stigmatic mode and provide the capabilities for mass spectrum recording and depth profile analysis. In combination with an image recording facility, the DepthScan module offers spatially resolved material analysis - 3D SIMS. The RasterScan module operates the SIMS in scanning mode and supports a fiber-optic link for optimized data transfer. The primary goal of this work is to introduce the basic ideas behind the implementation of the main application modules and the supporting clients, and to lay down the foundation for further developments. At the beginning a short introduction to the paradigms of object-oriented programming and Windows programming is given. Besides explaining the basic ideas behind the Doc/View application architecture, the focus is mainly on the routines controlling the SIMS hardware and the basic concepts of multithreaded programming. The elementary structures of the view and document objects are discussed in detail only for the MassScan module, because the ideas behind data abstraction and encapsulation are quite similar in the other modules. The second part introduces the research activities

  18. High-resolution gamma-ray measurement systems using a compact electro- mechanically cooled detector system and intelligent software

    International Nuclear Information System (INIS)

    Buckley, W.M.; Carlson, J.B.; Neufeld, K.W.

    1995-01-01

    Obtaining high-resolution gamma-ray measurements using high-purity germanium (HPGe) detectors in the field has been of limited practicality due to the need to use and maintain a supply of liquid nitrogen (LN2). This same constraint limits high-resolution gamma measurements in unattended safeguards or treaty verification applications. We are developing detectors and software to greatly extend the applicability of high-resolution germanium-based measurements for these situations

  19. The blues broaden, but the nasty narrows: attentional consequences of negative affects low and high in motivational intensity.

    Science.gov (United States)

    Gable, Philip; Harmon-Jones, Eddie

    2010-02-01

    Positive and negative affects high in motivational intensity cause a narrowing of attentional focus. In contrast, positive affects low in motivational intensity cause a broadening of attentional focus. The attentional consequences of negative affects low in motivational intensity have not been experimentally investigated. Experiment 1 compared the attentional consequences of negative affect low in motivational intensity (sadness) relative to a neutral affective state. Results indicated that low-motivation negative affect caused attentional broadening. Experiment 2 found that disgust, a high-motivation negative affect not previously investigated in attentional studies, narrowed attentional focus. These experiments support the conceptual model linking high-motivation affective states to narrowed attention and low-motivation affective states to broadened attention.

  20. ACHIEVING HIGH INTEGRITY OF PROCESS-CONTROL SOFTWARE BY GRAPHICAL DESIGN AND FORMAL VERIFICATION

    NARCIS (Netherlands)

    HALANG, WA; Kramer, B.J.

    The International Electrotechnical Commission is currently standardising four compatible languages for designing and implementing programmable logic controllers (PLCs). The language family includes a diagrammatic notation that supports the idea of software ICs to encourage graphical design

  1. A NASA-wide approach toward cost-effective, high-quality software through reuse

    Science.gov (United States)

    Scheper, Charlotte O. (Editor); Smith, Kathryn A. (Editor)

    1993-01-01

    NASA Langley Research Center sponsored the second Workshop on NASA Research in Software Reuse on May 5-6, 1992 at the Research Triangle Park, North Carolina. The workshop was hosted by the Research Triangle Institute. Participants came from the three NASA centers, four NASA contractor companies, two research institutes and the Air Force's Rome Laboratory. The purpose of the workshop was to exchange information on software reuse tool development, particularly with respect to tool needs, requirements, and effectiveness. The participants presented the software reuse activities and tools being developed and used by their individual centers and programs. These programs address a wide range of reuse issues. The group also developed a mission and goals for software reuse within NASA. This publication summarizes the presentations and the issues discussed during the workshop.

  2. Transformations, transport, and potential unintended consequences of high sulfur inputs to Napa Valley vineyards.

    Science.gov (United States)

    Hinckley, Eve-Lyn S; Matson, Pamela A

    2011-08-23

    Unintended anthropogenic deposition of sulfur (S) to forest ecosystems has a range of negative consequences, identified through decades of research. There has been far less study of purposeful S use in agricultural systems around the world, including the application of elemental sulfur (S(0)) as a quick-reacting fungicide to prevent damage to crops. Here we report results from a three-year study of the transformations and flows of applied S(0) in soils, vegetation, and hydrologic export pathways of Napa Valley, CA vineyards, documenting that all applied S is lost from the vineyard ecosystem on an annual basis. We found that S(0) oxidizes rapidly to sulfate (SO4(2-)) on the soil surface where it then accumulates over the course of the growing season. Leaf and grape tissues accounted for only 7-13% of applied S whereas dormant season cover crops accounted for 4-10% of applications. Soil S inventories were largely SO4(2-) and ester-bonded sulfates; they decreased from 1,623 ± 354 kg ha(-1) during the dry growing season to 981 ± 526 kg ha(-1) (0-0.5 m) during the dormant wet season. Nearly all S applied to the vineyard soils is transported offsite in dissolved oxidized forms during dormant season rainstorms. Thus, the residence time of reactive S is brief in these systems, and largely driven by hydrology. Our results provide new insight into how S use in vineyards constitutes a substantial perturbation of the S cycle in Northern California winegrowing regions and points to the unintended consequences that agricultural S use may have at larger scales.

  3. CAUSES AND CONSEQUENCES OF THE SCHOOL IN HIGH SCHOOL DROPOUT: CASE UNIVERSIDAD AUTÓNOMA DE SINALOA

    Directory of Open Access Journals (Sweden)

    Rosalva Ruiz-Ramírez

    2014-07-01

    Full Text Available The present investigation aims to establish the personal, economic and social causes and consequences of high school dropout at the Universidad Autónoma de Sinaloa (UAS). The investigation took place in 2013 in the high school located in the municipality of El Fuerte, Sinaloa, in the academic unit (UA) of San Blas and its extensions The Constancia and The Higueras of the Natoches. A mixed approach was used to analyze qualitative and quantitative information; the study population comprised 18 women and 17 men who dropped out during the 2011-2012 school cycle, ten teachers, four directors and twenty non-deserting students. The results show that the principal factors in dropping out were personal: being married and failing classes. The main consequence was economic, highlighting that the poverty cycle is hard to break.

  4. The Consequences of Commercialization Choices for New Entrants in High-Tech Industries: A Venture Emergence Perspective

    DEFF Research Database (Denmark)

    Giones, Ferran; Gurses, Kerem

    for these different markets. We test our hypotheses on a longitudinal dataset of 453 new firms started in 2004 in different high-tech industries in the US. We find that technology and human capital resources favor the adoption of alternative commercialization strategies; nevertheless, we do not observe significant differences in the venture emergence or survival likelihood. Our findings offer a closer view of the venture emergence process of new firms, clarifying the causes and consequences of the technology commercialization choices....

  5. Challenges in Mentoring Software Development Projects in the High School: Analysis According to Shulman's Teacher Knowledge Base Model

    Science.gov (United States)

    Meerbaum-Salant, Orni; Hazzan, Orit

    2009-01-01

    This paper focuses on challenges in mentoring software development projects in the high school and analyzes difficulties encountered by Computer Science teachers in the mentoring process according to Shulman's Teacher Knowledge Base Model. The main difficulties that emerged from the data analysis belong to the following knowledge sources of…

  6. Infrastructure for Multiphysics Software Integration in High Performance Computing-Aided Science and Engineering

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Michael T. [Illinois Rocstar LLC, Champaign, IL (United States); Safdari, Masoud [Illinois Rocstar LLC, Champaign, IL (United States); Kress, Jessica E. [Illinois Rocstar LLC, Champaign, IL (United States); Anderson, Michael J. [Illinois Rocstar LLC, Champaign, IL (United States); Horvath, Samantha [Illinois Rocstar LLC, Champaign, IL (United States); Brandyberry, Mark D. [Illinois Rocstar LLC, Champaign, IL (United States); Kim, Woohyun [Illinois Rocstar LLC, Champaign, IL (United States); Sarwal, Neil [Illinois Rocstar LLC, Champaign, IL (United States); Weisberg, Brian [Illinois Rocstar LLC, Champaign, IL (United States)

    2016-10-15

    There are over 100 unit tests provided that run through the Illinois Rocstar Application Development (IRAD) lightweight testing infrastructure that is also supplied along with IMPACT. The package as a whole provides an excellent base for developing high-quality multiphysics applications using modern software development practices. To facilitate understanding how to utilize IMPACT effectively, two multiphysics systems have been developed and are available open-source through GitHub. The simpler of the two systems, named ElmerFoamFSI in the repository, is a multiphysics, fluid-structure-interaction (FSI) coupling of the solid mechanics package Elmer with a fluid dynamics module from OpenFOAM. This coupling illustrates how to take software packages that are unrelated by either author or architecture and combine them into a robust, parallel multiphysics system. A more complex multiphysics tool is the Illinois Rocstar Rocstar Multiphysics code that was rebuilt during the project around IMPACT. Rocstar Multiphysics was already an HPC multiphysics tool, but now that it has been rearchitected around IMPACT, it can be readily expanded to capture new and different physics in the future. In fact, during this project, the Elmer and OpenFOAM tools were also coupled into Rocstar Multiphysics and demonstrated. The full Rocstar Multiphysics codebase is also available on GitHub, and licensed for any organization to use as they wish. Finally, the new IMPACT product is already being used in several multiphysics code coupling projects for the Air Force, NASA and the Missile Defense Agency, and initial work on expansion of the IMPACT-enabled Rocstar Multiphysics has begun in support of a commercial company. These initiatives promise to expand the interest and reach of IMPACT and Rocstar Multiphysics, ultimately leading to the envisioned standardization and consortium of users that was one of the goals of this project.

  7. Laboratory Information Management Software for genotyping workflows: applications in high throughput crop genotyping

    Directory of Open Access Journals (Sweden)

    Prasanth VP

    2006-08-01

    Full Text Available Abstract Background With the advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process leading to increased throughput. To efficiently handle the large amounts of genotypic data generated and help with quality control, there is a strong need for a software system that can help with the tracking of samples and capture and management of data at different steps of the process. Such systems, while serving to manage the workflow precisely, also encourage good laboratory practice by standardizing protocols, recording and annotating data from every step of the workflow. Results A laboratory information management system (LIMS) has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) that meets the requirements of a moderately high throughput molecular genotyping facility. The application is designed as modules and is simple to learn and use. The application leads the user through each step of the process from starting an experiment to the storing of output data from the genotype detection step with auto-binning of alleles; thus ensuring that every DNA sample is handled in an identical manner and all the necessary data are captured. The application keeps track of DNA samples and generated data. Data entry into the system is through the use of forms for file uploads. The LIMS provides functions to trace back to the electrophoresis gel files or sample source for any genotypic data and for repeating experiments. The LIMS is being presently used for the capture of high throughput SSR (simple-sequence repeat) genotyping data from the legume (chickpea, groundnut and pigeonpea) and cereal (sorghum and millets) crops of importance in the semi-arid tropics. Conclusion A laboratory information management system is available that has been found useful in the management of microsatellite genotype data in a moderately high throughput genotyping
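
    As a toy illustration of the auto-binning step mentioned above (a sketch only; the LIMS's actual procedure is not described here), observed fragment sizes can be snapped to a ladder defined by a reference size and the repeat-unit length:

      # Illustrative auto-binning of SSR allele sizes; reference, repeat length and
      # tolerance are assumed values, not parameters of the LIMS described above.
      def auto_bin(sizes, reference=100.0, repeat=2, tol=0.6):
          """sizes: observed fragment lengths in base pairs; returns size -> bin (or None)."""
          bins = {}
          for s in sizes:
              n = round((s - reference) / repeat)                    # nearest repeat step
              binned = reference + n * repeat
              bins[s] = binned if abs(s - binned) <= tol else None   # None = flag for review
          return bins

      print(auto_bin([100.2, 102.1, 104.0, 105.3]))   # 105.3 is flagged for manual review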

  8. New Modelling Capabilities in Commercial Software for High-Gain Antennas

    DEFF Research Database (Denmark)

    Jørgensen, Erik; Lumholt, Michael; Meincke, Peter

    2012-01-01

    characterization of the reflectarray element, an initial phase-only synthesis, followed by a full optimization procedure taking into account the near-field from the feed and the finite extent of the array. Another interesting new modelling capability is made available through the DIATOOL software, which is a new type of EM software tool aimed at extending the ways engineers can use antenna measurements in the antenna design process. The tool allows reconstruction of currents and near fields on a 3D surface conformal to the antenna, by using the measured antenna field as input. The currents on the antenna surface can provide valuable information about the antenna performance, or undesired contributions, e.g. currents on a cable, can be artificially removed. Finally, the CHAMP software will be extended to cover reflector shaping and more complex materials, which combined with a much faster execution speed...

  9. The subthalamic nucleus keeps you high on emotion: behavioral consequences of its inactivation

    Directory of Open Access Journals (Sweden)

    Yann ePelloux

    2014-12-01

    Full Text Available The subthalamic nucleus (STN belongs to the basal ganglia and is the current target for the surgical treatment of neurological and psychiatric disorders such as Parkinson’s Disease (PD and obsessive compulsive disorders, but also a proposed site for the treatment of addiction. It is therefore very important to understand its functions in order to anticipate and prevent possible side-effects in the patients. Although the involvement of the STN is well documented in motor, cognitive and motivational processes, less is known regarding emotional processes. Here we have investigated the direct consequences of STN inactivation by excitotoxic lesions on emotional processing and reinforcement in the rat. We have used various behavioral procedures to assess affect for neutral, positive and negative reinforcers in STN lesioned rats. STN lesions reduced affective responses for positive (sweet solutions and negative (electric foot shock, Lithium Chloride-induced sickness reinforcers while they had no effect on responses for a more neutral reinforcer (novelty induced place preference. Furthermore, when given the choice between saccharine, a sweet but non caloric solution, and glucose, a more bland but caloric solution, in contrast to sham animals that preferred saccharine, STN lesioned animals preferred glucose over saccharine. Taken altogether these results reveal that STN plays a critical role in emotional processing. These results, in line with some clinical observations in PD patients subjected to STN surgery, suggest possible emotional side-effects of treatments targeting the STN. They also suggest that the increased motivation for sucrose previously reported cannot be due to increased pleasure, but could be responsible for the decreased motivation for cocaine reported after STN inactivation.

  10. High capacity hydrogen absorption in transition-metal ethylene complexes: consequences of nanoclustering

    International Nuclear Information System (INIS)

    Phillips, A B; Shivaram, B S

    2009-01-01

    We have recently shown that organo-metallic complexes formed by laser ablating transition metals in ethylene are high hydrogen absorbers at room temperature (Phillips and Shivaram 2008 Phys. Rev. Lett. 100 105505). Here we show that the absorption percentage depends strongly on the ethylene pressure. High ethylene pressures (>100 mTorr) result in a lowered hydrogen uptake. Transmission electron microscopy measurements reveal that while low pressure ablations result in metal atoms dispersed uniformly on a near atomic scale, high pressure ones yield distinct nanoparticles with electron energy-loss spectroscopy demonstrating that the metal atoms are confined solely to the nanoparticles.

  11. EMBRYONIC VASCULAR DISRUPTION ADVERSE OUTCOMES: LINKING HIGH THROUGHPUT SIGNALING SIGNATURES WITH FUNCTIONAL CONSEQUENCES

    Science.gov (United States)

    Embryonic vascular disruption is an important adverse outcome pathway (AOP) given the knowledge that chemical disruption of early cardiovascular system development leads to broad prenatal defects. High throughput screening (HTS) assays provide potential building blocks for AOP d...

  12. Software Aspects of IEEE Floating-Point Computations for Numerical Applications in High Energy Physics

    CERN Multimedia

    CERN. Geneva

    2010-01-01

    hazards to be avoided. About the speaker: Jeffrey M Arnold is a Senior Software Engineer in the Intel Compiler and Languages group at Intel Corporation. He has been part of the Digital->Compaq->Intel compiler organization for nearly 20 years; part of that time, he worked on both lo...

  13. High-Q Variable Bandwidth Passive Filters for Software Defined Radio

    NARCIS (Netherlands)

    Arkesteijn, V.J.; Klumperink, Eric A.M.; Nauta, Bram

    2001-01-01

    An important aspect of Software Defined Radio is the ability to define the bandwidth of the filter that selects the desired channel. This paper describes a technique for channel filtering, in which two passive filters are combined to obtain a variable bandwidth. Passive filters have the advantage of

  15. High Technology Systems with Low Technology Failures: Some Experiences with Rockets on Software Quality and Integration

    Science.gov (United States)

    Craig, Larry G.

    2010-01-01

    This slide presentation reviews three failures of software and how the failures contributed to or caused the failure of a launch or payload insertion into orbit. In order to avoid these systematic failures in the future, failure mitigation strategies are suggested for use.

  16. Experimental Consequences of Mottness in High-Temperature Copper-Oxide Superconductors

    Science.gov (United States)

    Chakraborty, Shiladitya

    2009-01-01

    It has been more than two decades since the copper-oxide high temperature superconductors were discovered. However, building a satisfactory theoretical framework to study these compounds still remains one of the major challenges in condensed matter physics. In addition to the mechanism of superconductivity, understanding the properties of the…

  17. High-intensity interval exercise and cerebrovascular health: curiosity, cause, and consequence

    OpenAIRE

    Lucas, Samuel J E; Cotter, James D; Brassard, Patrice; Bailey, Damian M

    2015-01-01

    Exercise is a uniquely effective and pluripotent medicine against several noncommunicable diseases of westernised lifestyles, including protection against neurodegenerative disorders. High-intensity interval exercise training (HIT) is emerging as an effective alternative to current health-related exercise guidelines. Compared with traditional moderate-intensity continuous exercise training, HIT confers equivalent if not indeed superior metabolic, cardiac, and systemic vascular adaptation. Con...

  18. High Risk Behaviors in Marine Mammals: Linking Behavioral Responses to Anthropogenic Disturbance to Biological Consequences

    Science.gov (United States)

    2015-09-30

    ...comprehensive evaluation of biological safety zones for diving marine mammals. In this way we intend to identify those marine mammal species or specific...improving the protection of marine mammals during naval operations. OBJECTIVES: We are testing the hypothesis that extreme behaviors requiring

  19. Portal hypertension in children: High-risk varices, primary prophylaxis and consequences of bleeding.

    Science.gov (United States)

    Duché, Mathieu; Ducot, Béatrice; Ackermann, Oanez; Guérin, Florent; Jacquemin, Emmanuel; Bernard, Olivier

    2017-02-01

    Primary prophylaxis of bleeding is debated for children with portal hypertension because of the limited number of studies on its safety and efficacy, the lack of a known endoscopic pattern carrying a high-risk of bleeding for all causes, and the assumption that the mortality of a first bleed is low. We report our experience with these issues. From 1989 to 2014, we managed 1300 children with portal hypertension. Endoscopic features were recorded; high-risk varices were defined as: grade 3 esophageal varices, grade 2 varices with red wale markings, or gastric varices. Two hundred forty-six children bled spontaneously and 182 underwent primary prophylaxis. The results of primary prophylaxis were reviewed as well as bleed-free survival, overall survival and life-threatening complications of bleeding. High-risk varices were found in 96% of children who bled spontaneously and in 11% of children who did not bleed without primary prophylaxis (pportal hypertension. Life-threatening complications of bleeding were recorded in 19% of children with cirrhosis and high-risk varices who bled spontaneously. Ten-year probabilities of bleed-free survival after primary prophylaxis in children with high-risk varices were 96% and 72% for non-cirrhotic causes and cirrhosis respectively. Ten-year probabilities of overall survival after primary prophylaxis were 100% and 93% in children with non-cirrhotic causes and cirrhosis respectively. In children with portal hypertension, bleeding is linked to the high-risk endoscopic pattern reported here. Primary prophylaxis of bleeding based on this pattern is fairly effective and safe. In children with liver disease, the risk of bleeding from varices in the esophagus is linked to their large size, the presence of congestion on their surface and their expansion into the stomach but not to the child's age nor to the cause of portal hypertension. Prevention of the first bleed in children with high-risk varices can be achieved by surgery or endoscopic

  20. High pressure single-crystal micro X-ray diffraction analysis with GSE_ADA/RSV software

    Science.gov (United States)

    Dera, Przemyslaw; Zhuravlev, Kirill; Prakapenka, Vitali; Rivers, Mark L.; Finkelstein, Gregory J.; Grubor-Urosevic, Ognjen; Tschauner, Oliver; Clark, Simon M.; Downs, Robert T.

    2013-08-01

    GSE_ADA/RSV is a free software package for custom analysis of single-crystal micro X-ray diffraction (SCμXRD) data, developed with particular emphasis on data from samples enclosed in diamond anvil cells and subject to high pressure conditions. The package has been in extensive use at the high pressure beamlines of Advanced Photon Source (APS), Argonne National Laboratory and Advanced Light Source (ALS), Lawrence Berkeley National Laboratory. The software is optimized for processing of wide-rotation images and includes a variety of peak intensity corrections and peak filtering features, which are custom-designed to make processing of high pressure SCμXRD easier and more reliable.

  1. [Extraction of buildings three-dimensional information from high-resolution satellite imagery based on Barista software].

    Science.gov (United States)

    Zhang, Pei-feng; Hu, Yuan-man; He, Hong-shi

    2010-05-01

    The demand for accurate and up-to-date spatial information on urban buildings is becoming more and more important for urban planning, environmental protection, and other applications. Today's commercial high-resolution satellite imagery offers the potential to extract the three-dimensional information of urban buildings. This paper extracted the three-dimensional information of urban buildings from QuickBird imagery and validated the precision of the extraction based on Barista software. It was shown that extracting three-dimensional building information from high-resolution satellite imagery with Barista software has the advantages of requiring little specialist expertise, broad applicability, simple operation, and high precision. Point positioning and height determination accuracy at the one-pixel level could be achieved provided the digital elevation model (DEM) and sensor orientation model were sufficiently precise and the off-nadir view angle was favourable.

  2. Genetic consequences of forest fragmentation for a highly specialized arboreal mammal--the edible dormouse.

    Directory of Open Access Journals (Sweden)

    Joanna Fietz

    Full Text Available Habitat loss and fragmentation represent the most serious extinction threats for many species and have been demonstrated to be especially detrimental for mammals. Particularly, highly specialized species with low dispersal abilities will encounter a high risk of extinction in fragmented landscapes. Here we studied the edible dormouse (Glis glis), a small arboreal mammal that is distributed throughout Central Europe, where forests are mostly fragmented at different spatial scales. The aim of this study was to investigate the effect of habitat fragmentation on genetic population structures using the example of edible dormouse populations inhabiting forest fragments in south western Germany. We genotyped 380 adult individuals captured between 2001 and 2009 in four different forest fragments and one large continuous forest using 14 species-specific microsatellites. We hypothesised that populations in small forest patches have a lower genetic diversity and are more isolated compared to populations living in continuous forests. In accordance with our expectations we found that dormice inhabiting forest fragments were isolated from each other. Furthermore, their genetic population structure was more unstable over the study period than in the large continuous forest. Even though we could not detect lower genetic variability within individuals inhabiting forest fragments, strong genetic isolation and an overall high risk to mate with close relatives might be precursors to a reduced genetic variability and the onset of inbreeding depression. Results of this study highlight that connectivity among habitat fragments can already be strongly hampered before genetic erosion within small and isolated populations becomes evident.

  3. Choices and Consequences.

    Science.gov (United States)

    Thorp, Carmany

    1995-01-01

    Describes student use of Hyperstudio computer software to create history adventure games. History came alive while students learned efficient writing skills; learned to understand and manipulate cause, effect, choice, and consequence; and learned to incorporate succinct locational, climatic, and historical detail. (ET)

  4. The consequences of high cigarette excise taxes for low-income smokers.

    Directory of Open Access Journals (Sweden)

    Matthew C Farrelly

    Full Text Available BACKGROUND: To illustrate the burden of high cigarette excise taxes on low-income smokers. METHODOLOGY/PRINCIPAL FINDINGS: Using data from the New York and national Adult Tobacco Surveys from 2010-2011, we estimated how smoking prevalence, daily cigarette consumption, and share of annual income spent on cigarettes vary by annual income (less than $30,000; $30,000-$59,999; and more than $60,000). The 2010-2011 sample includes 7,536 adults and 1,294 smokers from New York and 3,777 adults and 748 smokers nationally. Overall, smoking prevalence is lower in New York (16.1%) than nationally (22.2%) and is strongly associated with income in New York and nationally (P<.001). Smoking prevalence ranges from 12.2% to 33.7% nationally and from 10.1% to 24.3% from the highest to lowest income group. In 2010-2011, the lowest income group spent 23.6% of annual household income on cigarettes in New York (up from 11.6% in 2003-2004) and 14.2% nationally. Daily cigarette consumption is not related to income. CONCLUSIONS/SIGNIFICANCE: Although high cigarette taxes are an effective method for reducing cigarette smoking, they can impose a significant financial burden on low-income smokers.
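
    For orientation, the share-of-income statistic reported above is simple arithmetic; a sketch with assumed consumption, price and income values (not the survey's data):

      # Hedged illustration of "share of annual income spent on cigarettes";
      # every figure below is an assumption, not a value from the study.
      packs_per_day  = 0.75        # assumed average daily consumption
      price_per_pack = 10.00       # assumed New York pack price, USD
      annual_income  = 25_000      # a low-income household (< $30,000)

      annual_spend = packs_per_day * price_per_pack * 365
      print(f"{annual_spend / annual_income:.1%} of annual income")   # about 11%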

  5. Software tool for representation and processing of experimental data on high energy interactions of elementary particles

    International Nuclear Information System (INIS)

    Cherepanov, E.O.; Skachkov, N.B.

    2002-01-01

    The software tool is developed for the detailed and intuitive display of information on the energy and spatial distributions of secondary particles produced in elementary-particle collisions. The components of the 4-momenta of the secondary particles serve as input; these data may come from different parts of a physics detector (for example, the calorimeter or the tracker) or from an event generator. The tool runs under the Windows operating system and is built with Borland Delphi. The mathematical architecture of the tool allows the user to obtain complete information without additional calculations: the program automatically analyses the structure and distributions of the signals and displays the results in a transparent form that allows their quick inspection. Three-dimensional graphics and colour schemes based on intuitive associations are used to present the information. (author)
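    The record above describes building energy and spatial distributions directly from secondary-particle 4-momenta. A minimal sketch of that bookkeeping is given below in Python (the tool itself is Delphi-based, and the event data here are synthetic), showing how transverse energy, pseudorapidity and azimuth are derived from (E, px, py, pz) and binned into a calorimeter-style map.

        import numpy as np

        # Synthetic "event": rows of (px, py, pz) for secondary particles, in GeV.
        rng = np.random.default_rng(0)
        p = rng.normal(size=(200, 3))
        E = np.sqrt((p ** 2).sum(axis=1) + 0.139 ** 2)     # assume pion mass

        px, py, pz = p.T
        pt = np.hypot(px, py)                              # transverse momentum
        phi = np.arctan2(py, px)                           # azimuthal angle
        theta = np.arctan2(pt, pz)
        eta = -np.log(np.tan(theta / 2.0))                 # pseudorapidity
        Et = E * np.sin(theta)                             # transverse energy

        # Energy-weighted eta-phi map, the kind of "space distribution"
        # a calorimeter-style display would show.
        H, eta_edges, phi_edges = np.histogram2d(
            eta, phi, bins=(25, 18),
            range=[[-3, 3], [-np.pi, np.pi]], weights=Et)
        print("total ET: %.1f GeV, hottest cell: %.1f GeV" % (Et.sum(), H.max()))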

  6. Isotope and fast ions turbulence suppression effects: Consequences for high-β ITER plasmas

    Science.gov (United States)

    Garcia, J.; Görler, T.; Jenko, F.

    2018-05-01

    The impact of isotope effects and fast ions on microturbulence is analyzed by means of non-linear gyrokinetic simulations for an ITER hybrid scenario at high beta obtained from previous integrated modelling simulations with simplified assumptions. Simulations show that ITER might work very close to threshold, and in these conditions, significant turbulence suppression is found from DD to DT plasmas. Electromagnetic effects are shown to play an important role in the onset of this isotope effect. Additionally, even external ExB flow shear, which is expected to be low in ITER, has a stronger impact on DT than on DD. The fast ions generated by fusion reactions can additionally reduce turbulence even more although the impact in ITER seems weaker than in present-day tokamaks.

  7. Molecular origins and consequences of High-800 LH2 in Roseobacter denitrificans.

    Science.gov (United States)

    Duquesne, Katia; Blanchard, Cecile; Sturgis, James N

    2011-08-09

    Roseobacter denitrificans is a marine bacterium capable of using a wide variety of different metabolic schemes and in particular is an anoxygenic aerobic photosynthetic bacterium. In the work reported here we use a deletion mutant that we have constructed to investigate the structural origin of the unusual High-800 light-harvesting complex absorption in this bacterium. We suggest that the structure is essentially unaltered when compared to the usual nonameric complexes but that a change in the environment of the C(13:1) carbonyl group is responsible for the change in spectrum. We tentatively relate this change to the presence of a serine residue in the α-polypeptide. Surprisingly, the low spectral overlap between the peripheral and core light-harvesting systems appears not to compromise energy collection efficiency too severely. We suggest that this may be at the expense of maintaining a low antenna size. © 2011 American Chemical Society

  8. Challenges of characterization of radioactive waste with high composition variability and their consequences on measurement methodology

    International Nuclear Information System (INIS)

    Lexa, D.

    2014-01-01

    Radioactive waste characterization is a key step in every nuclear decommissioning project. It normally relies on a combination of facility operational history with results of destructive and non-destructive analysis. A particularly challenging situation arises when historical waste from a nuclear research facility is to be characterized, meaning little or no radiological information is available and the composition of the waste is highly variable. The nuclide vector concept is of limited utility, resulting in increased requirements placed on both the extent and performance of destructive and non-destructive analysis. Specific challenges are illustrated on an example of the decommissioning project underway at the Joint Research Center of the European Commission in Ispra. (author)

  9. Evidence and Consequence of a Highly Adapted Clonal Haplotype within the Australian Ascochyta rabiei Population

    Directory of Open Access Journals (Sweden)

    Yasir Mehmood

    2017-06-01

    Full Text Available The Australian Ascochyta rabiei (Pass.) Labr. (syn. Phoma rabiei) population has low genotypic diversity with only one mating type detected to date, potentially precluding substantial evolution through recombination. However, a large diversity in aggressiveness exists. In an effort to better understand the risk from selective adaptation to currently used resistance sources and chemical control strategies, the population was examined in detail. For this, a total of 598 isolates were quasi-hierarchically sampled between 2013 and 2015 across all major Australian chickpea growing regions and commonly grown host genotypes. Although a large number of haplotypes were identified (66) through short sequence repeat (SSR) genotyping, overall low gene diversity (Hexp = 0.066) and genotypic diversity (D = 0.57) were detected. Almost 70% of the isolates assessed were of a single dominant haplotype (ARH01). Disease screening on a differential host set, including three commonly deployed resistance sources, revealed distinct aggressiveness among the isolates, with 17% of all isolates identified as highly aggressive. Almost 75% of these were of the ARH01 haplotype. A similar pattern was observed at the host level, with 46% of all isolates collected from the commonly grown host genotype Genesis090 (classified as “resistant” during the term of collection) identified as highly aggressive. Of these, 63% belonged to the ARH01 haplotype. In conclusion, the ARH01 haplotype represents a significant risk to the Australian chickpea industry, being not only widely adapted to the diverse agro-geographical environments of the Australian chickpea growing regions, but also containing a disproportionately large number of aggressive isolates, indicating fitness to survive and replicate on the best resistance sources in the Australian germplasm.

  10. Uremic anorexia: a consequence of persistently high brain serotonin levels? The tryptophan/serotonin disorder hypothesis.

    Science.gov (United States)

    Aguilera, A; Selgas, R; Codoceo, R; Bajo, A

    2000-01-01

    Anorexia is a frequent part of uremic syndrome, contributing to malnutrition in dialysis patients. Many factors have been suggested as responsible for uremic anorexia. In this paper we formulate a new hypothesis to explain the appetite disorders in dialysis patients: "the tryptophan/serotonin disorder hypothesis." We review current knowledge of normal hunger-satiety cycle control and the disorders described in uremic patients. There are four phases in food intake regulation: (1) the gastric phase, during which food induces satiety through gastric distention and satiety peptide release; (2) the post absorptive phase, during which circulating compounds, including glucose and amino acids, cause satiety by hepatic receptors via the vagus nerve; (3) the hepatic phase, during which adenosine triphosphate (ATP) concentration is the main stimulus inducing hunger or satiety, with cytokines inhibiting ATP production; and (4) the central phase, during which appetite is regulated through peripheral (circulating plasma substances and neurotransmitters) and brain stimuli. Brain serotonin is the final target for peripheral mechanisms controlling appetite. High brain serotonin levels and a lower serotonin/dopamine ratio cause anorexia. Plasma and brain amino acid concentrations are recognized factors involved in neurotransmitter synthesis and appetite control. Tryptophan is the substrate of serotonin synthesis. High plasma levels of anorectics such as tryptophan (plasma and brain), cholecystokinin, tumor necrosis factor alpha, interleukin-1, and leptin, and deficiencies of nitric oxide and neuropeptide Y have been described in uremia; all increase intracerebral serotonin. We suggest that brain serotonin hyperproduction due to a uremic-dependent excess of tryptophan may be the final common pathway involved in the genesis of uremic anorexia. Various methods of ameliorating anorexia by decreasing the central effects of serotonin are proposed.

  11. Specification Improvement Through Analysis of Proof Structure (SITAPS): High Assurance Software Development

    Science.gov (United States)

    2016-02-01

    proof in mathematics. For example, consider the proof of the Pythagorean Theorem illustrated at http://www.cut-the-knot.org/pythagoras/ where 112 ... methods and tools have made significant progress in their ability to model software designs and prove correctness theorems about the systems modeled ... “assumption criticality” or “theorem root set size”, SITAPS detects potentially brittle verification cases. SITAPS provides tools and techniques that ...

  12. LipiDex: An Integrated Software Package for High-Confidence Lipid Identification.

    Science.gov (United States)

    Hutchins, Paul D; Russell, Jason D; Coon, Joshua J

    2018-04-17

    State-of-the-art proteomics software routinely quantifies thousands of peptides per experiment with minimal need for manual validation or processing of data. For the emerging field of discovery lipidomics via liquid chromatography-tandem mass spectrometry (LC-MS/MS), comparably mature informatics tools do not exist. Here, we introduce LipiDex, a freely available software suite that unifies and automates all stages of lipid identification, reducing hands-on processing time from hours to minutes for even the most expansive datasets. LipiDex utilizes flexible in silico fragmentation templates and lipid-optimized MS/MS spectral matching routines to confidently identify and track hundreds of lipid species and unknown compounds from diverse sample matrices. Unique spectral and chromatographic peak purity algorithms accurately quantify co-isolation and co-elution of isobaric lipids, generating identifications that match the structural resolution afforded by the LC-MS/MS experiment. During final data filtering, ionization artifacts are removed to significantly reduce dataset redundancy. LipiDex interfaces with several LC-MS/MS software packages, enabling robust lipid identification to be readily incorporated into pre-existing data workflows. Copyright © 2018 Elsevier Inc. All rights reserved.
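    As a rough illustration of the spectral-matching step described above, the sketch below scores an observed MS/MS spectrum against an in silico fragment template with a cosine-style dot product and a fixed m/z tolerance. The masses, intensities and tolerance are invented for illustration and do not reproduce LipiDex's actual templates or scoring.

        import numpy as np

        def dot_product_score(obs_mz, obs_int, lib_mz, lib_int, tol=0.01):
            """Cosine-style match score between an observed MS/MS spectrum and
            an in silico fragment template (m/z tolerance in Th)."""
            matched_obs, matched_lib = [], []
            for mz, inten in zip(lib_mz, lib_int):
                diffs = np.abs(np.asarray(obs_mz) - mz)
                j = int(np.argmin(diffs))
                if diffs[j] <= tol:
                    matched_obs.append(obs_int[j])
                    matched_lib.append(inten)
            if not matched_obs:
                return 0.0
            a, b = np.sqrt(matched_obs), np.sqrt(matched_lib)   # soften intensity dominance
            return float(np.dot(a, b) / (np.linalg.norm(np.sqrt(obs_int)) *
                                         np.linalg.norm(np.sqrt(lib_int))))

        # Hypothetical lipid template vs. an observed spectrum (illustrative values).
        lib_mz, lib_int = [184.073, 478.330, 496.340], [100, 20, 15]
        obs_mz, obs_int = [184.074, 496.341, 520.500], [90, 10, 5]
        print(round(dot_product_score(obs_mz, obs_int, lib_mz, lib_int), 3))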

  13. High Resolution Topography of Polar Regions from Commercial Satellite Imagery, Petascale Computing and Open Source Software

    Science.gov (United States)

    Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, WIlliam; Bauer, Greg; Bates, Brian; Williamson, Cathleen

    2017-04-01

    Surface topography is among the most fundamental data sets for geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter commercial imagery licensed by the National Geospatial-Intelligence Agency and open source photogrammetry software to produce a time-tagged 2 m posting elevation model of the Arctic and an 8 m posting reference elevation model for the Antarctic. When complete, this publicly available data will be at higher resolution than any elevation models that cover the entirety of the Western United States. These two polar projects are made possible by three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data; and the management strategies the team needed in order to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.

  14. Peak fitting and identification software library for high resolution gamma-ray spectra

    International Nuclear Information System (INIS)

    Uher, Josef; Roach, Greg; Tickner, James

    2010-01-01

    A new gamma-ray spectral analysis software package is under development in our laboratory. It can be operated as a stand-alone program or called as a software library from Java, C, C++ and MATLAB environments. It provides an advanced graphical user interface for data acquisition, spectral analysis and radioisotope identification. The code uses a peak-fitting function that includes peak asymmetry, Compton continuum and flexible background terms. Peak fitting function parameters can be calibrated as functions of energy. Each parameter can be constrained to improve fitting of overlapping peaks. All of these features can be adjusted by the user. To assist with peak identification, the code can automatically measure half-lives of single or multiple overlapping peaks from a time series of spectra. It implements library-based peak identification, with options for restricting the search based on radioisotope half-lives and reaction types. The software also improves the reliability of isotope identification by utilizing Monte-Carlo simulation results.
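    The peak-fitting function described above (asymmetric photopeak plus continuum and background terms) can be illustrated with a generic model: a Gaussian with an exponential low-energy tail on top of a linear background, fitted to a synthetic spectrum with scipy. The functional form, parameters and data below are illustrative assumptions, not the package's actual model or calibration.

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.special import erfc

        def peak_model(E, area, mu, sigma, tail, slope, offset):
            """Gaussian photopeak + exponential low-energy tail (asymmetry term)
            + linear background, a common generic peak shape."""
            gauss = area * np.exp(-0.5 * ((E - mu) / sigma) ** 2)
            skew = 0.1 * area * np.exp((E - mu) / tail) * erfc(
                (E - mu) / (np.sqrt(2) * sigma) + sigma / (np.sqrt(2) * tail))
            return gauss + skew + slope * E + offset

        # Synthetic spectrum around a 661.7 keV peak with Poisson noise.
        E = np.linspace(650, 675, 200)
        truth = peak_model(E, 1000, 661.7, 0.9, 2.0, -0.5, 400)
        rng = np.random.default_rng(1)
        counts = rng.poisson(np.clip(truth, 1, None)).astype(float)

        p0 = [800, 662, 1.0, 2.0, 0.0, 300]
        bounds = ([0, 650, 0.1, 0.5, -10, 0], [1e5, 675, 5, 10, 10, 1e4])
        popt, pcov = curve_fit(peak_model, E, counts, p0=p0, bounds=bounds)
        print("fitted centroid: %.2f keV, FWHM: %.2f keV"
              % (popt[1], 2.355 * abs(popt[2])))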

  15. 33rd International School of Mathematics "G. Stampacchia": High Performance Algorithms and Software for Nonlinear Optimization, "Ettore Majorana" Centre

    CERN Document Server

    Murli, Almerico; High Performance Algorithms and Software for Nonlinear Optimization

    2003-01-01

    This volume contains the edited texts of the lectures presented at the Workshop on High Performance Algorithms and Software for Nonlinear Optimization held in Erice, Sicily, at the "G. Stampacchia" School of Mathematics of the "E. Majorana" Centre for Scientific Culture, June 30 - July 8, 2001. In the first year of the new century, the aim of the Workshop was to assess the past and to discuss the future of Nonlinear Optimization, and to highlight recent achievements and promising research trends in this field. An emphasis was requested on algorithmic and high performance software developments and on new computational experiences, as well as on theoretical advances. We believe that this goal was basically achieved. The Workshop was attended by 71 people from 22 countries. Although not all topics were covered, the presentations gave indeed a wide overview of the field, from different and complementary standpoints. Besides the lectures, several formal and informal discussions took place. We wish ...

  16. Characterization and consequences of intermittent sediment oxygenation by macrofauna: interpretation of high-resolution data sets

    Science.gov (United States)

    Meile, C. D.; Dwyer, I.; Zhu, Q.; Polerecky, L.; Volkenborn, N.

    2017-12-01

    Mineralization of organic matter in marine sediments leads to the depletion of oxygen, while the activities of infauna introduce oxygenated seawater to the subsurface. In permeable sediments, solutes can be transported from animals and their burrows into the surrounding sediment through advection over several centimeters. The intermittency of pumping leads to a spatially heterogeneous distribution of oxidants, with the temporal dynamics depending on sediment reactivity and the activity patterns of the macrofauna. Here, we present results from a series of experiments in which these dynamics are studied at high spatial and temporal resolution using planar optodes. From O2, pH and pCO2 optode data, we quantify rates of O2 consumption and dissolved inorganic carbon production, as well as alkalinity dynamics, with millimeter-scale resolution. Simulating intermittent irrigation by imposed pumping patterns in thin aquaria, we derive porewater flow patterns, which together with the production and consumption rates give rise to the observed chemical distributions and the establishment of reaction fronts. Our analysis thus establishes a quantitative connection between the locally dynamic redox conditions relevant for biogeochemical transformations and macroscopic observations commonly made with sediment cores.
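    As an illustration of how optode image stacks are turned into rate maps, the sketch below estimates a pixel-wise O2 consumption rate from the slope of the concentration time series. The array shapes, sampling interval and concentration values are synthetic placeholders, not the experimental data.

        import numpy as np

        # Synthetic O2 image stack: (time, rows, cols) in micromol/L,
        # sampled every 60 s; values decay where sediment consumes oxygen.
        rng = np.random.default_rng(2)
        t = np.arange(0, 30 * 60, 60.0)                    # 30 frames, 1 min apart
        base = 250 * np.exp(-0.0005 * t)[:, None, None]    # consumption trend
        o2 = base + rng.normal(0, 2, size=(t.size, 40, 60))

        # Pixel-wise consumption rate from a linear fit of C(t):
        # R = -dC/dt (micromol L^-1 s^-1), positive where O2 is consumed.
        flat = o2.reshape(t.size, -1)
        slope = np.polyfit(t, flat, deg=1)[0]              # dC/dt per pixel
        rate_map = (-slope).reshape(40, 60)
        print("median consumption rate: %.3g umol/(L s)" % np.median(rate_map))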

  17. Uncertainties in geologic disposal of high-level wastes - groundwater transport of radionuclides and radiological consequences

    International Nuclear Information System (INIS)

    Kocher, D.C.; Sjoreen, A.L.; Bard, C.S.

    1983-01-01

    The analysis for radionuclide transport in groundwater considers models and methods for characterizing (1) the present geologic environment and its future evolution due to natural geologic processes and to repository development and waste emplacement, (2) groundwater hydrology, (3) radionuclide geochemistry, and (4) the interactions among these phenomena. The discussion of groundwater transport focuses on the nature of the sources of uncertainty rather than on quantitative estimates of their magnitude, because of the lack of evidence that current models can provide realistic quantitative predictions of radionuclide transport in groundwater for expected repository environments. The analysis for the long-term health risk to man following releases of long-lived radionuclides to the biosphere is more quantitative and involves estimates of uncertainties in (1) radionuclide concentrations in man's exposure environment, (2) radionuclide intake by exposed individuals per unit concentration in the environment, (3) the dose per unit intake, (4) the number of exposed individuals, and (5) the health risk per unit dose. For the important long-lived radionuclides in high-level waste, uncertainties in most of the different components of a calculation of individual and collective dose per unit release appear to be no more than two or three orders of magnitude; these uncertainties are certainly much less than uncertainties in predicting groundwater transport of radionuclides between a repository and the biosphere. Several limitations in current models for predicting the health risk to man per unit release to the biosphere are discussed
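    The collective-dose estimate described above is essentially a product of uncertain factors, and the record's point that most components contribute no more than two or three orders of magnitude can be illustrated with a small Monte Carlo sketch. The medians, uncertainty factors and lognormal form below are illustrative assumptions, not values from the study.

        import numpy as np

        rng = np.random.default_rng(3)
        N = 100_000

        def lognormal_from_range(median, factor, size):
            """Lognormal samples whose 2.5-97.5% range spans median/factor to median*factor."""
            sigma = np.log(factor) / 1.96
            return rng.lognormal(mean=np.log(median), sigma=sigma, size=size)

        # Dose per unit release = concentration factor * intake per unit concentration
        #                         * dose per unit intake * persons exposed * risk per dose.
        # Medians and uncertainty factors below are purely illustrative.
        conc   = lognormal_from_range(1e-6, 10, N)   # Bq/m^3 per Bq released
        intake = lognormal_from_range(8e3, 3, N)     # m^3/yr inhaled or ingested
        dose   = lognormal_from_range(5e-8, 3, N)    # Sv per Bq intake
        people = lognormal_from_range(1e4, 3, N)     # exposed individuals
        risk   = lognormal_from_range(5e-2, 2, N)    # health risk per Sv

        product = conc * intake * dose * people * risk
        lo, med, hi = np.percentile(product, [2.5, 50, 97.5])
        print("combined spread: %.1f orders of magnitude" % np.log10(hi / lo))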

  18. Metal stress consequences on frost hardiness of plants at northern high latitudes: a review and hypothesis

    International Nuclear Information System (INIS)

    Taulavuori, Kari; Prasad, M.N.V.; Taulavuori, Erja; Laine, Kari

    2005-01-01

    This paper reviews the potential of trace/heavy metal-induced stress to reduce plant frost hardiness at northern high latitudes. The scientific questions are first outlined prior to a brief summary of heavy metal tolerance. The concepts of plant capacity and survival adaptation were used to formulate a hypothesis, according to which heavy metal stress may reduce plant frost hardiness for the following reasons: (1) Heavy metals change membrane properties through impaired resource acquisition and subsequent diminution of the cryoprotectant pool. (2) Heavy metals change membrane properties directly through oxidative stress, i.e. an increase of active oxygen species. (3) The involved co-stress may further increase oxidative stress. (4) The risk of frost injury increases due to membrane alterations. An opposite perspective was also discussed: could metal stress result in enhanced plant frost hardiness? This phenomenon could be based on the metabolism (i.e. glutathione, polyamines, proline, heat shock proteins) underlying a possible general adaptation syndrome of stress (GAS). As a result of the review it was suggested that metal-induced stress seems to reduce rather than increase plant frost hardiness. - Metal stress may reduce plant frost hardiness

  19. Metal stress consequences on frost hardiness of plants at northern high latitudes: a review and hypothesis.

    Science.gov (United States)

    Taulavuori, Kari; Prasad, M N V; Taulavuori, Erja; Laine, Kari

    2005-05-01

    This paper reviews the potential of trace/heavy metal-induced stress to reduce plant frost hardiness at northern high latitudes. The scientific questions are first outlined prior to a brief summary of heavy metal tolerance. The concepts of plant capacity and survival adaptation were used to formulate a hypothesis, according to which heavy metal stress may reduce plant frost hardiness for the following reasons: (1) Heavy metals change membrane properties through impaired resource acquisition and subsequent diminution of the cryoprotectant pool. (2) Heavy metals change membrane properties directly through oxidative stress, i.e. an increase of active oxygen species. (3) The involved co-stress may further increase oxidative stress. (4) The risk of frost injury increases due to membrane alterations. An opposite perspective was also discussed: could metal stress result in enhanced plant frost hardiness? This phenomenon could be based on the metabolism (i.e. glutathione, polyamines, proline, heat shock proteins) underlying a possible general adaptation syndrome of stress (GAS). As a result of the review it was suggested that metal-induced stress seems to reduce rather than increase plant frost hardiness.

  20. Metal stress consequences on frost hardiness of plants at northern high latitudes: a review and hypothesis

    Energy Technology Data Exchange (ETDEWEB)

    Taulavuori, Kari [Department of Biology, University of Oulu, PO Box 3000, FIN-90014, Oulu (Finland)]. E-mail: kari.taulavuori@oulu.fi; Prasad, M.N.V. [Department of Plant Sciences, University of Hyderabad, Hyderabad 500 046, Andhra Pradesh (India); Taulavuori, Erja [Department of Biology, University of Oulu, PO Box 3000, FIN-90014, Oulu (Finland); Laine, Kari [Department of Biology, University of Oulu, PO Box 3000, FIN-90014, Oulu (Finland)

    2005-05-01

    This paper reviews the potential of trace/heavy metal-induced stress to reduce plant frost hardiness at northern high latitudes. The scientific questions are first outlined prior to a brief summary of heavy metal tolerance. The concepts of plant capacity and survival adaptation were used to formulate a hypothesis, according to which heavy metal stress may reduce plant frost hardiness for the following reasons: (1) Heavy metals change membrane properties through impaired resource acquisition and subsequent diminution of the cryoprotectant pool. (2) Heavy metals change membrane properties directly through oxidative stress, i.e. an increase of active oxygen species. (3) The involved co-stress may further increase oxidative stress. (4) The risk of frost injury increases due to membrane alterations. An opposite perspective was also discussed: could metal stress result in enhanced plant frost hardiness? This phenomenon could be based on the metabolism (i.e. glutathione, polyamines, proline, heat shock proteins) underlying a possible general adaptation syndrome of stress (GAS). As a result of the review it was suggested that metal-induced stress seems to reduce rather than increase plant frost hardiness. - Metal stress may reduce plant frost hardiness.

  1. Behavioral and cellular consequences of high-electrode count Utah Arrays chronically implanted in rat sciatic nerve

    Science.gov (United States)

    Wark, H. A. C.; Mathews, K. S.; Normann, R. A.; Fernandez, E.

    2014-08-01

    Objective. Before peripheral nerve electrodes can be used for the restoration of sensory and motor functions in patients with neurological disorders, the behavioral and histological consequences of these devices must be investigated. These indices of biocompatibility can be defined in terms of desired functional outcomes; for example, a device may be considered for use as a therapeutic intervention if the implanted subject retains functional neurons post-implantation even in the presence of a foreign body response. The consequences of an indwelling device may remain localized to cellular responses at the device-tissue interface, such as fibrotic encapsulation of the device, or they may affect the animal more globally, such as impacting behavioral or sensorimotor functions. The objective of this study was to investigate the overall consequences of implantation of high-electrode count intrafascicular peripheral nerve arrays, High Density Utah Slanted Electrode Arrays (HD-USEAs; 25 electrodes mm-2). Approach. HD-USEAs were implanted in rat sciatic nerves for one and two month periods. We monitored wheel running, noxious sensory paw withdrawal reflexes, footprints, nerve morphology and macrophage presence at the tissue-device interface. In addition, we used a novel approach to contain the arrays in actively behaving animals that consisted of an organic nerve wrap. A total of 500 electrodes were implanted across all ten animals. Main results. The results demonstrated that chronic implantation (⩽8 weeks) of HD-USEAs into peripheral nerves can evoke behavioral deficits that recover over time. Morphology of the nerve distal to the implantation site showed variable signs of nerve fiber degeneration and regeneration. Cytology adjacent to the device-tissue interface also showed a variable response, with some electrodes having many macrophages surrounding the electrodes, while other electrodes had few or no macrophages present. This variability was also seen along the length

  2. High temperature oxidation of metals: vacancy injection and consequences on the mechanical properties

    International Nuclear Information System (INIS)

    Perusin, S.

    2004-11-01

    The aim of this work is to account for the effects of the high temperature oxidation of metals on their microstructure and their mechanical properties. 'Model' materials like pure nickel, pure iron and the Ni-20Cr alloy are studied. Nickel foils were oxidised at 1000 °C on one side only in laboratory air, the other side being protected from oxidation by a reducing atmosphere. After the oxidation treatment, the unoxidised face was carefully examined using an Atomic Force Microscope (AFM). Grain boundary grooves were characterised and their depths compared to those obtained on the same sample heat treated in the reducing atmosphere for the same time. They are found to be much deeper in the case of the single-side-oxidised samples. It is shown that this additional grooving is directly linked to the growth of the oxide scale on the opposite side and that it can be explained by the diffusion of the vacancies produced at the oxide scale-metal interface across the entire sample through grain boundaries. Moreover, the comparison between single-side-oxidised samples and samples oxidised on both sides shows that voids in grain boundaries are only observed in the latter case, proving vacancy condensation in the metal when the two faces are oxidised. The roles of carbon content and sample geometry in this phenomenon are examined in detail. The diffusion of vacancies is coupled with the transport of oxygen, so that a mechanism of oxygen transport by vacancies is suggested. Tensile tests performed at room temperature on nickel foils (bamboo microstructure) show that the oxide scale can constitute a barrier to the emergence of dislocations at the metal surface. Finally, the Ni-20Cr alloy is tested in tensile and creep tests between 25 and 825 °C in oxidising or reducing atmospheres. (author)

  3. Consequences of high effective Prandtl number on solar differential rotation and convective velocity

    Science.gov (United States)

    Karak, Bidya Binay; Miesch, Mark; Bekki, Yuto

    2018-04-01

    Observations suggest that the large-scale convective velocities obtained by solar convection simulations might be over-estimated (convective conundrum). One plausible solution to this could be the small-scale dynamo which cannot be fully resolved by global simulations. The small-scale Lorentz force suppresses the convective motions and also the turbulent mixing of entropy between upflows and downflows, leading to a large effective Prandtl number (Pr). We explore this idea in three-dimensional global rotating convection simulations at different thermal conductivity (κ), i.e., at different Pr. In agreement with previous non-rotating simulations, the convective velocity is reduced with the increase of Pr as long as the thermal conductive flux is negligible. A subadiabatic layer is formed near the base of the convection zone due to continuous deposition of low entropy plumes in low-κ simulations. The most interesting result of our low-κ simulations is that the convective motions are accompanied by a change in the convection structure that is increasingly influenced by small-scale plumes. These plumes tend to transport angular momentum radially inward and thus establish an anti-solar differential rotation, in striking contrast to the solar rotation profile. If such low diffusive plumes, driven by the radiative-surface cooling, are present in the Sun, then our results cast doubt on the idea that a high effective Pr may be a viable solution to the solar convective conundrum. Our study also emphasizes that any resolution of the conundrum that relies on the downward plumes must take into account the angular momentum transport and heat transport.

  4. The State of Software for Evolutionary Biology.

    Science.gov (United States)

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  5. The Convergence of High-Consequence Livestock and Human Pathogen Research and Development: A Paradox of Zoonotic Disease

    Directory of Open Access Journals (Sweden)

    Julia M. Michelotti

    2018-05-01

    Full Text Available The World Health Organization (WHO) estimates that zoonotic diseases transmitted from animals to humans account for 75 percent of new and emerging infectious diseases. Globally, high-consequence pathogens that impact livestock and have the potential for human transmission create research paradoxes and operational challenges for the high-containment laboratories that conduct work with them. These specialized facilities are required for conducting all phases of research on high-consequence pathogens (basic, applied, and translational) with an emphasis on both the generation of fundamental knowledge and product development. To achieve this research mission, a highly-trained workforce is required and flexible operational methods are needed. In addition, working with certain pathogens requires compliance with regulations such as the Centers for Disease Control (CDC) and the U.S. Department of Agriculture (USDA) Select Agent regulations, which adds to the operational burden. The vast experience from the existing studies at Plum Island Animal Disease Center, other U.S. laboratories, and those in Europe and Australia with biosafety level 4 (BSL-4) facilities designed for large animals, clearly demonstrates the valuable contribution this capability brings to the efforts to detect, prepare, prevent and respond to livestock and potential zoonotic threats. To raise awareness of these challenges, which include biosafety and biosecurity issues, we held a workshop at the 2018 American Society for Microbiology (ASM) Biothreats conference to further discuss the topic with invited experts and audience participants. The workshop covered the subjects of research funding and metrics, economic sustainment of drug and vaccine development pipelines, workforce turnover, and the challenges of maintaining operational readiness of high containment laboratories.

  6. MASH Suite: a user-friendly and versatile software interface for high-resolution mass spectrometry data interpretation and visualization.

    Science.gov (United States)

    Guner, Huseyin; Close, Patrick L; Cai, Wenxuan; Zhang, Han; Peng, Ying; Gregorich, Zachery R; Ge, Ying

    2014-03-01

    The rapid advancements in mass spectrometry (MS) instrumentation, particularly in Fourier transform (FT) MS, have made the acquisition of high-resolution and high-accuracy mass measurements routine. However, the software tools for the interpretation of high-resolution MS data are underdeveloped. Although several algorithms for the automatic processing of high-resolution MS data are available, there is still an urgent need for a user-friendly interface with functions that allow users to visualize and validate the computational output. Therefore, we have developed MASH Suite, a user-friendly and versatile software interface for processing high-resolution MS data. MASH Suite contains a wide range of features that allow users to easily navigate through data analysis, visualize complex high-resolution MS data, and manually validate automatically processed results. Furthermore, it provides easy, fast, and reliable interpretation of top-down, middle-down, and bottom-up MS data. MASH Suite is convenient, easily operated, and freely available. It can greatly facilitate the comprehensive interpretation and validation of high-resolution MS data with high accuracy and reliability.

  7. Effects of strength training on muscle fiber types and size; consequences for athletes training for high-intensity sport

    DEFF Research Database (Denmark)

    Andersen, J L; Aagaard, P

    2010-01-01

    Training toward improving performance in sports involving high-intensity exercise can be, and is, done in many different ways based on a mixture of tradition in the specific sport, coaches' experience and scientific recommendations. Strength training is a form of training that nowadays has found its way into almost all sports in which high-intensity work is conducted. In this review we focus on a few selected aspects and consequences of strength training; namely, what effects strength training has on muscle fiber type composition, and how these effects may change the contractile properties ... functional training advice can be made. Thus, more than a review in the traditional sense, this review should be viewed as an attempt to bring sports physiologists and coaches or others working directly with the athletes together for a mutual discussion on how recently acquired physiological knowledge ...

  8. Economic consequences of mastitis and withdrawal of milk with high somatic cell count in Swedish dairy herds

    DEFF Research Database (Denmark)

    Nielsen, C; Østergaard, Søren; Emanuelson, U

    2010-01-01

    The main aim was to assess the impact of mastitis on technical and economic results of a dairy herd under current Swedish farming conditions. The second aim was to investigate the effects obtained by withdrawing milk with high somatic cell count (SCC). A dynamic and stochastic simulation model, SimHerd, was used to study the effects of mastitis in a herd with 150 cows. Results given the initial incidence of mastitis (32 and 33 clinical and subclinical cases per 100 cow-years, respectively) were studied, together with the consequences of reducing or increasing the incidence of mastitis by 50%, modelling ... % of the herd net return given the initial incidence of mastitis. Expressed per cow-year, the avoidable cost of mastitis was €55. The costs per case of CM and SCM were estimated at €278 and €60, respectively. Withdrawing milk with high SCC was never profitable because this generated a substantial amount of milk ...

  9. Effects of strength training on muscle fiber types and size; consequences for athletes training for high-intensity sport

    DEFF Research Database (Denmark)

    Andersen, J L; Aagaard, P

    2010-01-01

    Training toward improving performance in sports involving high-intensity exercise can be, and is, done in many different ways based on a mixture of tradition in the specific sport, coaches' experience and scientific recommendations. Strength training is a form of training that nowadays has found its way into almost all sports in which high-intensity work is conducted. In this review we focus on a few selected aspects and consequences of strength training; namely, what effects strength training has on muscle fiber type composition, and how these effects may change the contractile properties ... functional training advice can be made. Thus, more than a review in the traditional sense, this review should be viewed as an attempt to bring sports physiologists and coaches or others working directly with the athletes together for a mutual discussion on how recently acquired physiological knowledge ...

  10. Safety and reliability of automatization software

    Energy Technology Data Exchange (ETDEWEB)

    Kapp, K; Daum, R [Karlsruhe Univ. (TH) (Germany, F.R.). Lehrstuhl fuer Angewandte Informatik, Transport- und Verkehrssysteme

    1979-02-01

    Automated technical systems have to meet very high requirements concerning safety, security and reliability. Today, modern computers, especially microcomputers, are used as integral parts of those systems. In consequence, computer programs must work in a safe and reliable manner. Methods are discussed which allow the construction of safe and reliable software for automatic systems such as reactor protection systems, and which allow one to prove that the safety requirements are met. As a result it is shown that only the method of total software diversification can satisfy all safety requirements at tolerable cost. In order to achieve a high degree of reliability, structured and modular programming in conjunction with high-level programming languages are recommended.

  11. ATAQS: A computational software tool for high throughput transition optimization and validation for selected reaction monitoring mass spectrometry

    Directory of Open Access Journals (Sweden)

    Ramos Hector

    2011-03-01

    Full Text Available Abstract Background Since its inception, proteomics has essentially operated in a discovery mode with the goal of identifying and quantifying the maximal number of proteins in a sample. Increasingly, proteomic measurements are also supporting hypothesis-driven studies, in which a predetermined set of proteins is consistently detected and quantified in multiple samples. Selected reaction monitoring (SRM) is a targeted mass spectrometric technique that supports the detection and quantification of specific proteins in complex samples at high sensitivity and reproducibility. Here, we describe ATAQS, an integrated software platform that supports all stages of targeted, SRM-based proteomics experiments including target selection, transition optimization and post acquisition data analysis. This software will significantly facilitate the use of targeted proteomic techniques and contribute to the generation of highly sensitive, reproducible and complete datasets that are particularly critical for the discovery and validation of targets in hypothesis-driven studies in systems biology. Result We introduce a new open source software pipeline, ATAQS (Automated and Targeted Analysis with Quantitative SRM), which consists of a number of modules that collectively support the SRM assay development workflow for targeted proteomic experiments (project management and generation of protein, peptide and transitions, and the validation of peptide detection by SRM). ATAQS provides a flexible pipeline for end-users by allowing the workflow to start or end at any point of the pipeline, and for computational biologists, by enabling the easy extension of java algorithm classes for their own algorithm plug-in or connection via an external web site. This integrated system supports all steps in a SRM-based experiment and provides a user-friendly GUI that can be run by any operating system that allows the installation of the Mozilla Firefox web browser. Conclusions Targeted ...

  12. Analysis of lipid experiments (ALEX): a software framework for analysis of high-resolution shotgun lipidomics data.

    Directory of Open Access Journals (Sweden)

    Peter Husen

    Full Text Available Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.

  13. Analysis of lipid experiments (ALEX): a software framework for analysis of high-resolution shotgun lipidomics data.

    Science.gov (United States)

    Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S

    2013-01-01

    Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.
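    The "database table format" mentioned in the two ALEX records above can be pictured as one long table with a row per lipid species per sample, which is then filtered, normalised and pivoted for lipidome navigation. The sketch below uses pandas with invented column names and values purely for illustration; it is not the ALEX schema.

        import pandas as pd

        # Long ("database table") layout: one row per lipid species per sample.
        # Column names and values are illustrative, not the ALEX schema.
        rows = [
            {"sample": "cerebellum_wt1", "genotype": "WT", "lipid_class": "PC",
             "species": "PC 34:1", "intensity": 1.8e6},
            {"sample": "cerebellum_ko1", "genotype": "KO", "lipid_class": "PC",
             "species": "PC 34:1", "intensity": 1.1e6},
            {"sample": "cerebellum_wt1", "genotype": "WT", "lipid_class": "PE",
             "species": "PE 38:4", "intensity": 9.2e5},
            {"sample": "cerebellum_ko1", "genotype": "KO", "lipid_class": "PE",
             "species": "PE 38:4", "intensity": 1.4e6},
        ]
        df = pd.DataFrame(rows)

        # Fraction of total intensity within each sample, then a pivot
        # for quick navigation by species vs. genotype.
        df["fraction"] = df.groupby("sample")["intensity"].transform(lambda x: x / x.sum())
        print(df.pivot_table(index="species", columns="genotype", values="fraction"))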

  14. Experience with highly-parallel software for the storage system of the ATLAS Experiment at CERN

    CERN Document Server

    Colombo, T; The ATLAS collaboration

    2012-01-01

    The ATLAS experiment is observing proton-proton collisions delivered by the LHC accelerator. The ATLAS Trigger and Data Acquisition (TDAQ) system selects interesting events on-line in a three-level trigger system in order to store them at a budgeted rate of several hundred Hz. This paper focuses on the TDAQ data-logging system and in particular on the implementation and performance of a novel parallel software design. In this respect, the main challenge presented by the data-logging workload is the conflict between the largely parallel nature of the event processing, especially the recently introduced event compression, and the constraint of sequential file writing and checksum evaluation. This is further complicated by the necessity of operating in a fully data-driven mode, to cope with continuously evolving trigger and detector configurations. In this paper we report on the design of the new ATLAS on-line storage software. In particular we will discuss our development experience using recent concurrency-ori...
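    The design tension described above, parallel per-event processing (including compression) feeding strictly sequential file writing and checksum evaluation, maps naturally onto a producer-consumer pattern with a single writer. The sketch below illustrates that structure in Python with fake event payloads; the real system is C++ and far more elaborate, so this is only a schematic of the pattern.

        import queue, threading, zlib

        events = [bytes([i % 256]) * 10_000 for i in range(64)]   # fake event payloads
        compressed = queue.Queue(maxsize=16)

        def compressor(chunk):
            # CPU-heavy part: many events can be compressed in parallel.
            for ev in chunk:
                compressed.put(zlib.compress(ev))

        def writer(n_events, path="events.dat"):
            # Sequential part: a single thread writes and checksums in order of arrival.
            crc = 0
            with open(path, "wb") as f:
                for _ in range(n_events):
                    blob = compressed.get()
                    f.write(blob)
                    crc = zlib.crc32(blob, crc)
            print("file checksum: %08x" % crc)

        workers = [threading.Thread(target=compressor, args=(events[i::4],))
                   for i in range(4)]
        wt = threading.Thread(target=writer, args=(len(events),))
        for t in workers + [wt]:
            t.start()
        for t in workers + [wt]:
            t.join()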

  15. Implementation of highly parallel and large scale GW calculations within the OpenAtom software

    Science.gov (United States)

    Ismail-Beigi, Sohrab

    The need to describe electronic excitations with better accuracy than provided by band structures produced by Density Functional Theory (DFT) has been a long-term enterprise for the computational condensed matter and materials theory communities. In some cases, appropriate theoretical frameworks have existed for some time but have been difficult to apply widely due to computational cost. For example, the GW approximation incorporates a great deal of important non-local and dynamical electronic interaction effects but has been too computationally expensive for routine use in large materials simulations. OpenAtom is an open source massively parallel ab initio density functional software package based on plane waves and pseudopotentials (http://charm.cs.uiuc.edu/OpenAtom/) that takes advantage of the Charm++ parallel framework. At present, it is developed via a three-way collaboration, funded by an NSF SI2-SSI grant (ACI-1339804), between Yale (Ismail-Beigi), IBM T. J. Watson (Glenn Martyna) and the University of Illinois at Urbana-Champaign (Laxmikant Kale). We will describe the project and our current approach towards implementing large scale GW calculations with OpenAtom. Potential applications of large scale parallel GW software for problems involving electronic excitations in semiconductor and/or metal oxide systems will also be pointed out.

  16. Intelligent software solution for reliable high efficiency/low false alarm border monitoring

    International Nuclear Information System (INIS)

    Rieck, W.; Iwatschenko, M.

    2001-01-01

    Full text: Radioactivity monitoring at border stations requires detection systems that operate reliably under special conditions such as: different types and shapes of vehicles; different velocities; stop-and-go traffic. ESM has developed a solution that achieves the lowest possible detection limit under all such conditions and avoids false alarms generated by naturally occurring radioactive material (NORM). NBR (Natural Background Reduction) data evaluation - One of the main reasons for the success of the ESM gate monitors is the unique and proprietary NBR technology of instantaneous discrimination between artificial and natural gamma radiation using large area plastic scintillators. Thus the FHT 1388 gate monitors offer two unique features: possible setting of different alarm levels for NORM and artificial gamma sources; self-adjusting compensation of the background shielding of the truck with respect to the detection of artificial sources. Both properties are a precondition for the highly sensitive detection of artificial gamma sources. While at scrap yards and steel mills usually all radioactivity (including NORM) must be detected, the main object of interest with respect to the measuring task at border stations, airports or harbours is clearly the detection of even very small signals of artificial radioactivity. The reliable rejection of the influence of natural radioactivity is of special importance in the case of detection of illicit trafficking, since construction material, fertilisers or soil often lead to much higher detector signals than the alarm levels for dangerous sources of interest. Besides the varying content of natural radioactivity in the load of a truck, different loads and trucks show different influence on the reduction of the ambient radiation due to the passing vehicle. Thus software approaches assuming a specific reduction of the background count rate (regarding relative magnitude and shape) must fail when trucks of different shape and load ...
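    The decision logic implied above is to compare the measured count rate against a vehicle-dependent expectation of the suppressed natural background rather than against a fixed gross-rate threshold. A schematic sketch of such a test follows; the suppression factors, count rates and alarm constant are invented placeholders, not the NBR algorithm.

        import math

        def alarm_decision(gross_cps, background_cps, suppression, k=4.0):
            """Flag an artificial-source alarm if the gross count rate exceeds the
            expected (shield-suppressed) natural background by k standard deviations.
            'suppression' is the fraction of natural background still seen with the
            vehicle present; a real system would estimate it per vehicle."""
            expected = background_cps * suppression
            excess = gross_cps - expected
            return excess > k * math.sqrt(expected), excess

        # Same gross rate, two different vehicles: a fixed threshold on the gross
        # rate would treat them identically, the background-aware test does not.
        for suppression in (0.95, 0.60):
            alarm, excess = alarm_decision(gross_cps=1150, background_cps=1200,
                                           suppression=suppression)
            print(f"suppression {suppression:.2f}: excess {excess:+.0f} cps, alarm={alarm}")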

  17. The Auto control System Based on InTouch Configuration software for High-gravity Oil Railway Tank Feeding

    Directory of Open Access Journals (Sweden)

    Xu De-Kai

    2015-01-01

    Full Text Available This paper presents the automation design for the high-gravity oil railway tank feeding system of a refinery using a distributed control system. The system adopts a Modicon TSX Quantum PLC as the monitoring and control level and uses a PC-based platform running Microsoft Windows 2000 as the principal computer. An automatic control system is developed in the environment of the InTouch configuration software. The system implements automatic high-gravity oil tank feeding with a pump control function, and it combines automatic oil feeding control, pump control and tank monitoring to implement automated, metered oil feeding.

  18. Marijuana usage in relation to harmfulness ratings, perceived likelihood of negative consequences, and defense mechanisms in high school students.

    Science.gov (United States)

    Como-Lesko, N; Primavera, L H; Szeszko, P R

    1994-08-01

    This study investigated high school students' marijuana usage patterns in relation to their harmfulness ratings of 15 licit and illicit drugs, perceived negative consequences of using marijuana, and the types of defense mechanisms employed. Subjects were classified into one of five pattern-of-use groups based on marijuana usage: principled nonusers, nonusers, light users, moderate users, and heavy users. Principled nonusers (individuals who have never used marijuana and would not do so if it were legalized) rated marijuana, hashish, cocaine, and alcohol as significantly more harmful than heavy users did. A cluster analysis of the drugs' harmfulness ratings best fit a three-cluster solution; the clusters were named medicinal drugs, recreational drugs, and hard drugs. In general, principled nonusers rated negative consequences from using marijuana as significantly more likely to occur than other groups did. Principled nonusers and heavy users utilized reversal from the Defense Mechanism Inventory, which includes repression and denial, significantly more than nonusers, indicating some trait common to the two extreme pattern-of-use groups.

  19. Hadronic energy resolution of a highly granular scintillator-steel hadron calorimeter using software compensation techniques

    CERN Document Server

    Adloff, C.; Blaising, J.J.; Drancourt, C.; Espargiliere, A.; Gaglione, R.; Geffroy, N.; Karyotakis, Y.; Prast, J.; Vouters, G.; Francis, K.; Repond, J.; Smith, J.; Xia, L.; Baldolemar, E.; Li, J.; Park, S.T.; Sosebee, M.; White, A.P.; Yu, J.; Buanes, T.; Eigen, G.; Mikami, Y.; Watson, N.K.; Goto, T.; Mavromanolakis, G.; Thomson, M.A.; Ward, D.R.; Yan, W.; Benchekroun, D.; Hoummada, A.; Khoulaki, Y.; Benyamna, M.; Carloganu, C.; Fehr, F.; Gay, P.; Manen, S.; Royer, L.; Blazey, G.C.; Dyshkant, A.; Lima, J.G.R.; Zutshi, V.; Hostachy, J.Y.; Morin, L.; Cornett, U.; David, D.; Falley, G.; Gadow, K.; Gottlicher, P.; Gunter, C.; Hermberg, B.; Karstensen, S.; Krivan, F.; Lucaci-Timoce, A.I.; Lu, S.; Lutz, B.; Morozov, S.; Morgunov, V.; Reinecke, M.; Sefkow, F.; Smirnov, P.; Terwort, M.; Vargas-Trevino, A.; Feege, N.; Garutti, E.; Marchesini, I.; Ramilli, M.; Eckert, P.; Harion, T.; Kaplan, A.; Schultz-Coulon, H.Ch; Shen, W.; Stamen, R.; Tadday, A.; Bilki, B.; Norbeck, E.; Onel, Y.; Wilson, G.W.; Kawagoe, K.; Dauncey, P.D.; Magnan, A.M.; Wing, M.; Salvatore, F.; Calvo Alamillo, E.; Fouz, M.C.; Puerta-Pelayo, J.; Balagura, V.; Bobchenko, B.; Chadeeva, M.; Danilov, M.; Epifantsev, A.; Markin, O.; Mizuk, R.; Novikov, E.; Rusinov, V.; Tarkovsky, E.; Kirikova, N.; Kozlov, V.; Smirnov, P.; Soloviev, Y.; Buzhan, P.; Dolgoshein, B.; Ilyin, A.; Kantserov, V.; Kaplin, V.; Karakash, A.; Popova, E.; Smirnov, S.; Kiesling, C.; Pfau, S.; Seidel, K.; Simon, F.; Soldner, C.; Szalay, M.; Tesar, M.; Weuste, L.; Bonis, J.; Bouquet, B.; Callier, S.; Cornebise, P.; Doublet, Ph; Dulucq, F.; Faucci Giannelli, M.; Fleury, J.; Li, H.; Martin-Chassard, G.; Richard, F.; de la Taille, Ch.; Poschl, R.; Raux, L.; Seguin-Moreau, N.; Wicek, F.; Anduze, M.; Boudry, V.; Brient, J.C.; Jeans, D.; Mora de Freitas, P.; Musat, G.; Reinhard, M.; Ruan, M.; Videau, H.; Bulanek, B.; Zacek, J.; Cvach, J.; Gallus, P.; Havranek, M.; Janata, M.; Kvasnicka, J.; Lednicky, D.; Marcisovsky, M.; Polak, I.; Popule, J.; Tomasek, L.; Tomasek, M.; Ruzicka, P.; Sicho, P.; Smolik, J.; Vrba, V.; Zalesak, J.; Belhorma, B.; Ghazlane, H.; Takeshita, T.; Uozumi, S.; Sauer, J.; Weber, S.; Zeitnitz, C.

    2012-01-01

    ... at the CERN SPS. The energy resolution for single hadrons is determined to be approximately 58%/√(E/GeV). This resolution is improved to approximately 45%/√(E/GeV) with software compensation techniques. These techniques take advantage of the event-by-event information about the substructure of hadronic showers which is provided by the imaging capabilities of the calorimeter. The energy reconstruction is improved either with corrections based on the local energy density or by applying a single correction factor to the event energy sum derived from a global measure of the shower energy density. The application of the compensation algorithms to GEANT4 simulations yields resolution improvements comparable to those observed for real data.
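    The local software compensation idea described above, re-weighting hits according to their energy density so that dense electromagnetic-like deposits receive lower weights, can be sketched as follows. The weighting function, cell volume and shower composition below are toy assumptions, not the CALICE parametrisation.

        import numpy as np

        def compensate(hit_energies_gev, cell_volume_cm3=9.0,
                       w_min=0.8, w_max=1.3, rho_scale=0.05):
            """Toy local software compensation: weight each hit by a decreasing
            function of its energy density (GeV/cm^3), then sum."""
            rho = np.asarray(hit_energies_gev) / cell_volume_cm3
            weights = w_min + (w_max - w_min) * np.exp(-rho / rho_scale)
            return float(np.sum(weights * hit_energies_gev))

        rng = np.random.default_rng(4)
        # Fake shower: many low-density hadronic hits plus a few dense EM-like hits.
        hits = np.concatenate([rng.exponential(0.02, 300),   # diffuse hadronic part
                               rng.exponential(0.8, 10)])    # dense EM sub-showers
        print("plain sum      : %.2f GeV" % hits.sum())
        print("compensated sum: %.2f GeV" % compensate(hits))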

  20. Applications of artificial intelligence to space station and automated software techniques: High level robot command language

    Science.gov (United States)

    Mckee, James W.

    1989-01-01

    The objective is to develop a system that will allow a person not necessarily skilled in the art of programming robots to quickly and naturally create the necessary data and commands to enable a robot to perform a desired task. The system will use a menu-driven graphical user interface. This interface will allow the user to input data to select objects to be moved. There will be an embedded expert system to process the knowledge about the objects and the robot and to determine how they are to be moved. There will be automatic path planning to avoid obstacles in the work space and to create a near-optimum path. The system will contain the software to generate the required robot instructions.

  1. Toward high-speed 3D nonlinear soft tissue deformation simulations using Abaqus software.

    Science.gov (United States)

    Idkaidek, Ashraf; Jasiuk, Iwona

    2015-12-01

    We aim to achieve a fast and accurate three-dimensional (3D) simulation of a porcine liver deformation under a surgical tool pressure using the commercial finite element software Abaqus. The liver geometry is obtained using magnetic resonance imaging, and a nonlinear constitutive law is employed to capture large deformations of the tissue. Effects of implicit versus explicit analysis schemes, element type, and mesh density on computation time are studied. We find that Abaqus explicit and implicit solvers are capable of simulating nonlinear soft tissue deformations accurately using first-order tetrahedral elements in a relatively short time by optimizing the element size. This study provides new insights and guidance on accurate and relatively fast nonlinear soft tissue simulations. Such simulations can provide force feedback during robotic surgery and allow visualization of tissue deformations for surgery planning and training of surgical residents.
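    One reason element type and mesh density drive computation time, especially for the explicit solver discussed above, is the stable time increment, roughly the smallest element length divided by the material's dilatational wave speed. The sketch below estimates that increment for made-up soft-tissue-like properties; the actual increment used by Abaqus also depends on element formulation and damping.

        import math

        def stable_time_increment(min_elem_length_m, E_pa, nu, rho_kg_m3):
            """Rough explicit stable time increment: L_min / c_d, with c_d the
            dilatational wave speed of a linear elastic material."""
            c_d = math.sqrt(E_pa * (1 - nu) / (rho_kg_m3 * (1 + nu) * (1 - 2 * nu)))
            return min_elem_length_m / c_d

        # Illustrative soft-tissue-like properties (not measured liver values).
        E, nu, rho, t_sim = 10e3, 0.45, 1.05e3, 1.0   # Pa, -, kg/m^3, seconds simulated
        for h in (4e-3, 2e-3, 1e-3):                  # element edge lengths in metres
            dt = stable_time_increment(h, E, nu, rho)
            print(f"h = {h*1e3:.0f} mm: dt ≈ {dt*1e6:.1f} us, "
                  f"~{t_sim/dt:,.0f} increments per second of simulated time")

    Under these assumptions, halving the element edge length halves the stable increment while roughly multiplying the element count by eight, which is why optimizing element size matters so much for explicit run times.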

  2. ATLAS High Level Calorimeter Trigger Software Performance for Cosmic Ray Events

    CERN Document Server

    Oliveira Damazio, Denis; The ATLAS collaboration

    2009-01-01

    The ATLAS detector is undergoing an intense commissioning effort with cosmic rays in preparation for the first LHC collisions next spring. Combined runs with all of the ATLAS subsystems are being taken in order to evaluate the detector performance. This is also a unique opportunity for the trigger system to be studied with different detector operation modes, such as different event rates and detector configurations. The ATLAS trigger starts with a hardware-based system which tries to identify detector regions where interesting physics objects may be found (e.g. large energy depositions in the calorimeter system). An approved event will be further processed by more complex software algorithms at the second level, where detailed features are extracted (full detector granularity data for small portions of the detector is available). Events accepted at this level will be further processed at the so-called event filter level. Full detector data at full granularity is available for offline-like processing with complete calib...

  3. Research on the verification of highly parallel and concurrent embedded software

    OpenAIRE

    青木, 利晃

    2012-01-01

    This research targeted concurrent/parallel software that involves scheduling, and the real-time operating systems (RTOS) that provide that scheduling. As results, for the former we proposed an algorithm and tools for verifying behavior that includes real time, and for the latter we proposed a method and tools for verifying the design and implementation of an RTOS, and we also verified an RTOS that is actually in use. Through this, we succeeded in proposing model-checking-based methods in a realistic setting and found that they can indeed be applied to real-world problems. : We focus on parallel/concurrent software which is controlled by real-time operating system(RTOS) and RTOS itself. We have proposed an algorithm and tool to verify the behavior of parallel/concurrent software which contains scheduling by RTOS and real-...

  4. Actual directions in study of ecological consequences of a highly toxic 1,1-dimethylhydrazine-based rocket fuel spills

    Directory of Open Access Journals (Sweden)

    Bulat Kenessov

    2012-05-01

    Full Text Available The paper reviews current directions in the study of the ecological consequences of spills of highly toxic 1,1-dimethylhydrazine-based rocket fuel. Recent results on the transformation processes of 1,1-dimethylhydrazine, the identification of its main metabolites and the development of analytical methods for their determination are summarized. Modern analytical methods for determining 1,1-dimethylhydrazine and its transformation products in environmental samples are characterized. It is shown that in recent years, through the use of the most modern methods of physico-chemical analysis and sample preparation, work in this direction has made significant progress and contributed to the development of studies in adjacent areas. The distribution of transformation products in soils at the impact sites of carrier-rocket first stages is described, and the available methods for their remediation are characterized.

  5. Magnitude and reactivity consequences of accidental moisture ingress into the Modular High-Temperature Gas-Cooled Reactor core

    International Nuclear Information System (INIS)

    Smith, O.L.

    1992-01-01

    Accidental admission of moisture into the primary system of a Modular High-Temperature Gas-Cooled Reactor (MHTGR) has been identified in US Department of Energy-sponsored studies as an important safety concern. The work described here develops an analytical methodology to quantify the pressure and reactivity consequences of steam-generator tube rupture and other moisture-ingress-related incidents. Important neutronic and thermohydraulic processes are coupled with reactivity feedback and safety and control system responses. Rate and magnitude of steam buildup are found to be dominated by major system features such as break size in comparison with safety valve capacity and reliability, while being less sensitive to factors such as heat transfer coefficients. The results indicate that ingress transients progress at a slower pace than previously predicted by bounding analyses, with milder power overshoots and more time for operator or automatic corrective actions

  6. Magnitude and reactivity consequences of moisture ingress into the modular High-Temperature Gas-Cooled Reactor core

    International Nuclear Information System (INIS)

    Smith, O.L.

    1992-12-01

    Inadvertent admission of moisture into the primary system of a modular high-temperature gas-cooled reactor has been identified in US Department of Energy-sponsored studies as an important safety concern. The work described here develops an analytical methodology to quantify the pressure and reactivity consequences of steam-generator tube rupture and other moisture-ingress-related incidents. Important neutronic and thermohydraulic processes are coupled with reactivity feedback and safety and control system responses. The rate and magnitude of steam buildup are found to be dominated by major system features such as break size compared with safety valve capacity and reliability and less sensitive to factors such as heat transfer coefficients. The results indicate that ingress transients progress at a slower pace than previously predicted by bounding analyses, with milder power overshoots and more time for operator or automatic corrective actions

  7. High integrity software for nuclear power plants: Candidate guidelines, technical basis and research needs. Executive summary: Volume 1

    International Nuclear Information System (INIS)

    Seth, S.; Bail, W.; Cleaves, D.; Cohen, H.; Hybertson, D.; Schaefer, C.; Stark, G.; Ta, A.; Ulery, B.

    1995-06-01

    The work documented in this report was performed in support of the US Nuclear Regulatory Commission to examine the technical basis for candidate guidelines that could be considered in reviewing and evaluating high integrity computer software used in the safety systems of nuclear power plants. The framework for the work consisted of the following software development and assurance activities: requirements specification; design; coding; verification and validation, including static analysis and dynamic testing; safety analysis; operation and maintenance; configuration management; quality assurance; and planning and management. Each activity (framework element) was subdivided into technical areas (framework subelements). The report describes the development of approximately 200 candidate guidelines that span the entire range of software life-cycle activities; the assessment of the technical basis for those candidate guidelines; and the identification, categorization and prioritization of research needs for improving the technical basis. The report has two volumes: Volume 1, Executive Summary, includes an overview of the framework and of each framework element, the complete set of candidate guidelines, the results of the assessment of the technical basis for each candidate guideline, and a discussion of research needs that support the regulatory function; Volume 2 is the main report

  8. Proposing a Qualitative Approach for Corporate Competitive Capability Modeling in High-Tech Business (Case study: Software Industry

    Directory of Open Access Journals (Sweden)

    Mahmoud Saremi Saremi

    2010-09-01

    Full Text Available The evolution of global business trends for ICT-based products in recent decades shows the intensive activity of pioneering developing countries to gain a powerful competitive position in the global software industry. In this research, given the importance of the competition issue for top managers of Iranian software companies, a conceptual model has been developed for the concept of Corporate Competitive Capability. First, after describing the research problem, we present a comparative review of recent theories of the firm and of competition that have been applied by different researchers in the high-tech and knowledge-intensive organization field. Afterwards, based on a detailed review of the literature and previous research papers, an initial research framework and the applied research method are proposed. The main and final section of the paper is devoted to describing the results of the different steps of the qualitative modeling process. The agreed concepts related to corporate competitive capability, the elicited and analyzed expert cause maps, the elicited collective causal maps, and the final model proposed for the software industry are the modeling results of this paper.

  9. Advanced communications technology satellite high burst rate link evaluation terminal power control and rain fade software test plan, version 1.0

    Science.gov (United States)

    Reinhart, Richard C.

    1993-01-01

    The Power Control and Rain Fade Software was developed at the NASA Lewis Research Center to support the Advanced Communications Technology Satellite High Burst Rate Link Evaluation Terminal (ACTS HBR-LET). The HBR-LET is an experimenters terminal to communicate with the ACTS for various experiments by government, university, and industry agencies. The Power Control and Rain Fade Software is one segment of the Control and Performance Monitor (C&PM) Software system of the HBR-LET. The Power Control and Rain Fade Software automatically controls the LET uplink power to compensate for signal fades. Besides power augmentation, the C&PM Software system is also responsible for instrument control during HBR-LET experiments, control of the Intermediate Frequency Switch Matrix on board the ACTS to yield a desired path through the spacecraft payload, and data display. The Power Control and Rain Fade Software User's Guide, Version 1.0 outlines the commands and procedures to install and operate the Power Control and Rain Fade Software. The Power Control and Rain Fade Software Maintenance Manual, Version 1.0 is a programmer's guide to the Power Control and Rain Fade Software. This manual details the current implementation of the software from a technical perspective. Included is an overview of the Power Control and Rain Fade Software, computer algorithms, format representations, and computer hardware configuration. The Power Control and Rain Fade Test Plan provides a step-by-step procedure to verify the operation of the software using a predetermined signal fade event. The Test Plan also provides a means to demonstrate the capability of the software.
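
    For illustration of the fade-compensation logic such software implements, here is a hedged sketch of a simple uplink power control rule; the clear-sky reference, nominal power and headroom are hypothetical values, not taken from the C&PM documentation.

        # Simple fade-compensation loop: raise uplink power when the measured downlink
        # beacon drops below a clear-sky reference, within the transmitter's limits.
        CLEAR_SKY_BEACON_DBM = -60.0   # hypothetical clear-sky reference level
        NOMINAL_UPLINK_DBM   = 40.0    # hypothetical nominal uplink power
        MAX_AUGMENTATION_DB  = 10.0    # hypothetical power headroom

        def uplink_power(beacon_dbm):
            """Return the commanded uplink power for a measured beacon level."""
            fade_db = max(0.0, CLEAR_SKY_BEACON_DBM - beacon_dbm)
            return NOMINAL_UPLINK_DBM + min(fade_db, MAX_AUGMENTATION_DB)

        if __name__ == "__main__":
            for beacon in (-60.0, -63.5, -75.0):   # clear sky, light fade, deep fade
                print(f"beacon {beacon:6.1f} dBm -> uplink {uplink_power(beacon):.1f} dBm")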

  10. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    Directory of Open Access Journals (Sweden)

    Malia A. Gehan

    2017-12-01

    Full Text Available Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  11. An assessment of the radiological consequences of disposal of high-level waste in coastal geologic formations

    International Nuclear Information System (INIS)

    Hill, M.D.; Lawson, G.

    1980-11-01

    This study was carried out with the objectives of assessing the potential radiological consequences of entry of circulating ground-water into a high-level waste repository sited on the coast; and comparing the results with those of previous assessments for a repository sited inland. Mathematical models are used to calculate the rate of release of radioactivity into ground-water by leaching, the rates of migration of radionuclides with ground-water from the repository to the sea and the concentrations of radionuclides in sea-water and sea-food as a function of time. Estimates are made of the peak annual collective doses and collective dose commitments which could be received as a result of sea-food consumption. Since there are considerable uncertainties associated with the values of many of the parameters used in the calculations the broad features of the results are more significant than the numerical values of predicted annual doses and collective dose commitments. The results of the assessment show that the rates of migration of radionuclides with ground-water are of primary importance in determining the radiological impact of ground-water ingress. The implications of this result for selection of coastal sites and allocation of research effort are discussed. The comparison of coastal and inland sites suggest that coastal siting may have substantial advantages in terms of the radiological consequences to the public after disposal and that a significant fraction of available research effort should therefore be directed towards investigation of coastal sites. This study has been carried out under contract to the United Kingdom Atomic Energy Authority, Harwell, on behalf of the Commission of the European Communities. (author)
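
    As a worked illustration of the kind of calculation underlying such assessments, the sketch below computes how much of a leached radionuclide survives radioactive decay during retarded groundwater transit to the sea; the half-lives are standard values, while the path length, water velocity and retardation factor are hypothetical placeholders rather than parameters of the study.

        # Attenuation of a radionuclide by decay during retarded groundwater transport.
        import math

        def arrival_fraction(half_life_yr, path_length_m, water_velocity_m_per_yr, retardation):
            """Fraction of the leached activity that survives transit to the biosphere."""
            travel_time_yr = retardation * path_length_m / water_velocity_m_per_yr
            decay_constant = math.log(2.0) / half_life_yr
            return math.exp(-decay_constant * travel_time_yr)

        if __name__ == "__main__":
            # Hypothetical transport parameters, for illustration only.
            for name, half_life in (("Cs-135", 2.3e6), ("I-129", 1.57e7), ("Sr-90", 28.8)):
                frac = arrival_fraction(half_life, path_length_m=5000.0,
                                        water_velocity_m_per_yr=10.0, retardation=100.0)
                print(f"{name}: surviving fraction after transit = {frac:.3e}")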

  12. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  13. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    Seo, Yong Seok; Seong, Seung Hwan; Park, Keun Ok; Hur, Sub; Kim, Dong Hoon

    2001-01-01

    In order to achieve high quality SMART MMIS software, a well-constructed software testing concept is required. This paper establishes the software testing concept to be applied to the SMART MMIS software in terms of software testing organization, documentation, procedures, and methods. The software testing methods are classified into source code static analysis and dynamic testing. The software dynamic testing methods are discussed in two aspects: white-box and black-box testing. As the software testing concept introduced in this paper is applied to the SMART MMIS software, high quality software will be produced. In the future, software failure data will be collected through the construction of a SMART MMIS prototyping facility to which the software testing concept of this paper is applied

  14. Cooperation on Improved Isotopic Identification and Analysis Software for Portable, Electrically Cooled High-Resolution Gamma Spectrometry Systems Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Dreyer, Jonathan G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wang, Tzu-Fang [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Vo, Duc T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Funk, Pierre F. [Inst. for Radiological Protection and Nuclear Safety (IRSN), Fontenay-aux-Roses (France); Weber, Anne-Laure [Inst. for Radiological Protection and Nuclear Safety (IRSN), Fontenay-aux-Roses (France)

    2017-07-20

    Under a 2006 agreement between the Department of Energy (DOE) of the United States of America and the Institut de Radioprotection et de Sûreté Nucléaire (IRSN) of France, the National Nuclear Security Administration (NNSA) within DOE and IRSN initiated a collaboration to improve isotopic identification and analysis of nuclear material [i.e., plutonium (Pu) and uranium (U)]. The specific aim of the collaborative project was to develop new versions of two types of isotopic identification and analysis software: (1) the fixed-energy response-function analysis for multiple energies (FRAM) codes and (2) multi-group analysis (MGA) codes. The project is entitled Action Sheet 4 – Cooperation on Improved Isotopic Identification and Analysis Software for Portable, Electrically Cooled, High-Resolution Gamma Spectrometry Systems (Action Sheet 4). FRAM and MGA/U235HI are software codes used to analyze isotopic ratios of U and Pu. FRAM is an application that uses parameter sets for the analysis of U or Pu. MGA and U235HI are two separate applications that analyze Pu or U, respectively. They have traditionally been used by safeguards practitioners to analyze gamma spectra acquired with high-resolution gamma spectrometry (HRGS) systems that are cooled by liquid nitrogen. However, it was discovered that these analysis programs were not as accurate when used on spectra acquired with a newer generation of more portable, electrically cooled HRGS (ECHRGS) systems. In response to this need, DOE/NNSA and IRSN collaborated to update the FRAM and U235HI codes to improve their performance with newer ECHRGS systems. Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL) performed this work for DOE/NNSA.

  15. The effects of actinide separation on the radiological consequences of geologic disposal of high-level waste

    International Nuclear Information System (INIS)

    Hill, M.D.; White, I.F.; Fleishman, A.B.

    1980-01-01

    It has often been suggested that the potential hazard to man from the disposal of high-level radioactive waste could be reduced by removing a substantial fraction of the actinide elements. In this report the effects of actinide separation on the radiological consequences of one of the disposal options currently under consideration, that of burial in deep geologic formations, are examined. The results show that the potential radiological impact of geologic disposal of high-level waste arises from both long-lived fission products and actinides (and their daughter radionuclides). Neither class of radionuclides is of overriding importance and actinide separation would therefore reduce the radiological impact to only a limited extent and over limited periods. There might be a case for attempting to reduce doses from 237 Np. To achieve this it appears to be necessary to separate both neptunium and its precursor element americium. However, there are major uncertainties in the data needed to predict doses from 237 Np; further research is required to resolve these uncertainties. In addition, consideration should be given to alternative methods of reducing the radiological impact of geologic disposal. The conclusions of this assessment differ considerably from those of similar studies based on the concept of toxicity indices. Use of these indices can lead to incorrect allocation of research and development effort. (author)

  16. Highly-optimized TWSM software package for seismic diffraction modeling adapted for GPU-cluster

    Science.gov (United States)

    Zyatkov, Nikolay; Ayzenberg, Alena; Aizenberg, Arkady

    2015-04-01

    Oil producing companies are concerned with increasing the resolution of seismic data for complex oil-and-gas bearing deposits connected with salt domes, basalt traps, reefs, lenses, etc. Known methods of seismic wave theory define the shape of a hydrocarbon accumulation with insufficient resolution, since they do not account for multiple diffractions explicitly. We elaborate an alternative seismic wave theory in terms of operators of propagation in layers and reflection-transmission at curved interfaces. An approximation of this theory is realized in the seismic frequency range as the Tip-Wave Superposition Method (TWSM). TWSM, based on the operator theory, allows evaluation of the wavefield in bounded domains/layers with geometrical shadow zones (in nature these can be salt domes, basalt traps, reefs, lenses, etc.), accounting for so-called cascade diffraction. Cascade diffraction includes edge waves from sharp edges, creeping waves near concave parts of interfaces, waves of the whispering galleries near convex parts of interfaces, etc. The basic algorithm of the TWSM package is based on multiplication of large-size matrices (hundreds of terabytes in size). We use advanced information technologies for effective realization of the numerical procedures of the TWSM. In particular, we actively use NVIDIA CUDA technology and GPU accelerators, which significantly improve the performance of the TWSM software package; this is important when using it for direct and inverse problems. The accuracy, stability and efficiency of the algorithm are justified by numerical examples with curved interfaces. The TWSM package and its separate components can be used in different modeling tasks, such as planning of acquisition systems, physical interpretation of laboratory modeling, and modeling of individual waves of different types, and in some inverse tasks such as imaging in the case of a laterally inhomogeneous overburden and AVO inversion.
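
    A small illustration of the block (tiled) matrix multiplication on which the package's core algorithm is said to rest; this NumPy sketch shows only the tiling idea and stands in for the CUDA/GPU implementation, with a hypothetical block size.

        # Blocked (tiled) matrix multiplication: process the product in sub-blocks so that
        # matrices too large for fast memory can be streamed through an accelerator.
        import numpy as np

        def blocked_matmul(a, b, block=256):
            n, k = a.shape
            k2, m = b.shape
            assert k == k2, "inner dimensions must match"
            c = np.zeros((n, m), dtype=np.result_type(a, b))
            for i in range(0, n, block):
                for j in range(0, m, block):
                    for p in range(0, k, block):
                        # NumPy clips out-of-range slices, so edge blocks need no special case.
                        c[i:i+block, j:j+block] += a[i:i+block, p:p+block] @ b[p:p+block, j:j+block]
            return c

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            a, b = rng.standard_normal((500, 300)), rng.standard_normal((300, 400))
            print(np.allclose(blocked_matmul(a, b, block=128), a @ b))  # True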

  17. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  18. Advanced Cell Classifier: User-Friendly Machine-Learning-Based Software for Discovering Phenotypes in High-Content Imaging Data.

    Science.gov (United States)

    Piccinini, Filippo; Balassa, Tamas; Szkalisity, Abel; Molnar, Csaba; Paavolainen, Lassi; Kujala, Kaisa; Buzas, Krisztina; Sarazova, Marie; Pietiainen, Vilja; Kutay, Ulrike; Smith, Kevin; Horvath, Peter

    2017-06-28

    High-content, imaging-based screens now routinely generate data on a scale that precludes manual verification and interrogation. Software applying machine learning has become an essential tool to automate analysis, but these methods require annotated examples to learn from. Efficiently exploring large datasets to find relevant examples remains a challenging bottleneck. Here, we present Advanced Cell Classifier (ACC), a graphical software package for phenotypic analysis that addresses these difficulties. ACC applies machine-learning and image-analysis methods to high-content data generated by large-scale, cell-based experiments. It features methods to mine microscopic image data, discover new phenotypes, and improve recognition performance. We demonstrate that these features substantially expedite the training process, successfully uncover rare phenotypes, and improve the accuracy of the analysis. ACC is extensively documented, designed to be user-friendly for researchers without machine-learning expertise, and distributed as a free open-source tool at www.cellclassifier.org. Copyright © 2017 Elsevier Inc. All rights reserved.
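
    A hedged sketch of the general machine-learning step that ACC automates (this is not ACC's own code or interface): a classifier is trained on annotated per-cell feature vectors and then applied to the unannotated cells of a screen. The feature values and class names below are synthetic.

        # Train a phenotype classifier on annotated per-cell features, then label the rest.
        # Feature values and class names are synthetic placeholders.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)
        # Hypothetical per-cell features (e.g. intensity, area, texture) for annotated cells.
        annotated_features = np.vstack([rng.normal(0.0, 1.0, (50, 3)),
                                        rng.normal(2.0, 1.0, (50, 3))])
        annotated_labels = np.array(["wild-type"] * 50 + ["phenotype-A"] * 50)

        classifier = RandomForestClassifier(n_estimators=100, random_state=0)
        classifier.fit(annotated_features, annotated_labels)

        # Apply the trained model to unannotated cells from the screen.
        unannotated_features = rng.normal(1.0, 1.5, (5, 3))
        print(classifier.predict(unannotated_features))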

  19. Software-controlled, highly automated intrafraction prostate motion correction with intrafraction stereographic targeting: System description and clinical results

    International Nuclear Information System (INIS)

    Mutanga, Theodore F.; Boer, Hans C. J. de; Rajan, Vinayakrishnan; Dirkx, Maarten L. P.; Os, Marjolein J. H. van; Incrocci, Luca; Heijmen, Ben J. M.

    2012-01-01

    Purpose: A new system for software-controlled, highly automated correction of intrafraction prostate motion, 'intrafraction stereographic targeting' (iSGT), is described and evaluated. Methods: At our institute, daily prostate positioning is routinely performed at the start of the treatment beam using stereographic targeting (SGT). iSGT was implemented by extension of the SGT software to facilitate fast and accurate intrafraction motion corrections with minimal user interaction. iSGT entails megavoltage (MV) image acquisitions with the first segment of selected IMRT beams, automatic registration of implanted markers, followed by remote couch repositioning to correct for intrafraction motion above a predefined threshold, prior to delivery of the remaining segments. For a group of 120 patients, iSGT with corrections for two nearly lateral beams was evaluated in terms of workload and impact on effective intrafraction displacements in the sagittal plane. Results: SDs of systematic (Σ) and random (σ) displacements relative to the planning CT, measured directly after initial SGT setup correction, were reduced by iSGT to effective values below 0.7 mm, with corrections required in 82.4% of the fractions. Because iSGT is highly automated, the extra time added by iSGT is <30 s if a correction is required. Conclusions: Without increasing imaging dose, iSGT successfully reduces intrafraction prostate motion with minimal workload and increase in fraction time. An action level of 2 mm is recommended.

  20. "Joint Workshop on High Confidence Medical Devices, Software, and Systems (HCMDSS) and Medical Device Plug-and-Play (MD PnP) Interoperability"

    National Research Council Canada - National Science Library

    Goldman, Julian M

    2008-01-01

    Partial support was requested from TATRC, with joint funding from NSF, for a joint workshop to bring together the synergistic efforts and communities of the High Confidence Medical Devices, Software, and Systems (HCMDSS...

  1. Computer based workstation for development of software for high energy physics experiments

    International Nuclear Information System (INIS)

    Ivanchenko, I.M.; Sedykh, Yu.V.

    1987-01-01

    Methodical principles and results of a successful attempt to create, on the basis of the IBM-PC/AT personal computer, effective means for the development of programs for high energy physics experiments are analysed. The results obtained make it possible to combine the best properties of, and the positive practical experience accumulated with, existing time-sharing collective systems with the high quality of data representation, reliability and convenience of personal computer applications

  2. Formal Verification of Digital Protection Logic and Automatic Testing Software

    Energy Technology Data Exchange (ETDEWEB)

    Cha, S. D.; Ha, J. S.; Seo, J. S. [KAIST, Daejeon (Korea, Republic of)

    2008-06-15

    - Technical aspect: It is intended that digital I and C software be safe and reliable, and the project results help such software acquire a license. The software verification techniques resulting from this project can be used for digital NPPs (nuclear power plants) in the future. The research presents many meaningful verification results for digital protection logic and suggests an I and C software testing strategy. These results can also be applied to verify nuclear fusion devices, accelerators, nuclear waste management systems and nuclear medical devices that require dependable software and highly reliable controllers, and they can likewise be used for military, medical or aerospace-related software. - Economic and industrial aspect: Since the safety of digital I and C software is highly important, it is essential that the software be verified, but verification and licence acquisition for digital I and C software are costly. This project benefits the domestic economy by using the verification and testing techniques introduced here instead of foreign techniques. The operation rate of NPPs will rise when NPP safety-critical software is verified with an intelligent V and V tool, and such software is expected to substitute for safety-critical software that currently depends wholly on foreign suppliers. Consequently, the results of this project have high commercial value, and recognition of the software development work can spread through industry. - Social and cultural aspect: People expect nuclear power generation to help relieve environmental problems because it does not emit as many harmful air pollutants as other forms of power generation. To give society more trust in nuclear power generation, we should convince people that NPPs are highly safe systems; from that point of view, highly reliable I and C proven by intelligent V and V techniques can be presented as evidence.

  3. High-precision atmospheric parameter and abundance determination of massive stars, and consequences for stellar and Galactic evolution

    International Nuclear Information System (INIS)

    Nieva, Maria-Fernanda; Przybilla, Norbert; Irrgang, Andreas

    2011-01-01

    The derivation of high precision/accuracy parameters and chemical abundances of massive stars is of utmost importance to the fields of stellar evolution and Galactic chemical evolution. We concentrate on the study of OB-type stars near the main sequence and their evolved progeny, the BA-type supergiants, covering masses of ∼6 to 25 solar masses and a range in effective temperature from ∼8000 to 35 000 K. The minimization of the main sources of systematic errors in the atmospheric model computation, the observed spectra and the quantitative spectral analysis plays a critical role in the final results. Our self-consistent spectrum analysis technique employing a robust non-LTE line formation allows precise atmospheric parameters of massive stars to be derived, achieving 1σ-uncertainties as low as 1% in effective temperature and ∼0.05–0.10 dex in surface gravity. Consequences for the behaviour of the chemical elements carbon, nitrogen and oxygen are discussed here in the context of massive star evolution and Galactic chemical evolution, showing tight relations that were obscured in previous work by overly large statistical and systematic uncertainties.

  4. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    Science.gov (United States)

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc, Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the
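
    One of the building blocks named above, theoretical isotope distribution modelling, can be illustrated compactly: the isotope pattern of a molecule is the convolution of the isotope abundance patterns of its constituent atoms. The sketch below uses a nominal-mass approximation and a hypothetical peptide composition; it is not Decon2LS code.

        # Theoretical isotope pattern of a molecule by convolving elemental isotope
        # abundance patterns (nominal-mass approximation, as used in deisotoping).
        import numpy as np

        # Natural isotopic abundances indexed by nominal mass offset from the lightest isotope.
        ELEMENT_PATTERNS = {
            "C": [0.9893, 0.0107],
            "H": [0.999885, 0.000115],
            "N": [0.99636, 0.00364],
            "O": [0.99757, 0.00038, 0.00205],
            "S": [0.9499, 0.0075, 0.0425, 0.0, 0.0001],
        }

        def isotope_pattern(formula, max_peaks=6):
            """formula: dict of element counts, e.g. {'C': 34, 'H': 53, ...}."""
            pattern = np.array([1.0])
            for element, count in formula.items():
                for _ in range(count):
                    pattern = np.convolve(pattern, ELEMENT_PATTERNS[element])
            pattern /= pattern.sum()
            return pattern[:max_peaks]

        if __name__ == "__main__":
            # Hypothetical small peptide composition, for illustration only.
            peptide = {"C": 34, "H": 53, "N": 7, "O": 15}
            for offset, abundance in enumerate(isotope_pattern(peptide)):
                print(f"M+{offset}: {abundance:.4f}")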

  5. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

    Full Text Available Abstract Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc, Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to

  6. Examining software complexity and quality for scientific software

    International Nuclear Information System (INIS)

    Kelly, D.; Shepard, T.

    2005-01-01

    Research has not found a simple relationship between software complexity and software quality, and particularly no relationship between commonly used software complexity metrics and the occurrence of software faults. A study with an example of scientific software from the nuclear power industry illustrates the importance of addressing cognitive complexity, the complexity related to understanding the intellectual content of the software. Simple practices such as aptly-named variables contribute more to high-quality software than limiting code size. This paper examines the research into complexity and quality and reports on a longitudinal study using the example of nuclear software. (author)

  7. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has opened up a new alternative: the use of open source software. The use of open source software is spreading in line with current global issues in Information and Communication Technology (ICT). Several organisations and companies have begun to take open source software into consideration. There are many concepts of open source software, ranging from software that is free of charge to software without a licence. These notions about open source software are not entirely correct, so the concept of open source software needs to be introduced, covering its history, licences and how to choose a licence, as well as the considerations in selecting among the available open source software. Keywords: licence, open source, HAKI

  8. High accuracy positioning using carrier-phases with the opensource GPSTK software

    OpenAIRE

    Salazar Hernández, Dagoberto José; Hernández Pajares, Manuel; Juan Zornoza, José Miguel; Sanz Subirana, Jaume

    2008-01-01

    The objective of this work is to show how using a proper GNSS data management strategy, combined with the flexibility provided by the open source "GPS Toolkit" (GPSTk), it is possible to easily develop both simple code-based processing strategies as well as basic high accuracy carrier-phase positioning techniques like Precise Point Positioning (PPP
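
    As a generic illustration of how carrier phases sharpen code-based processing (this is not GPSTk code), the sketch below implements a Hatch-style carrier-smoothing filter on synthetic observations; the noise levels and window length are assumptions.

        # Carrier-smoothed pseudorange (Hatch filter): noisy code observations are smoothed
        # using the much more precise epoch-to-epoch change of the carrier phase.
        def hatch_filter(code_obs, phase_obs, window=100):
            smoothed = [code_obs[0]]
            for k in range(1, len(code_obs)):
                n = min(k + 1, window)
                # Propagate the previous smoothed value with the carrier-phase delta.
                predicted = smoothed[-1] + (phase_obs[k] - phase_obs[k - 1])
                smoothed.append(code_obs[k] / n + predicted * (n - 1) / n)
            return smoothed

        if __name__ == "__main__":
            import random
            random.seed(0)
            true_range = [20_000_000.0 + 10.0 * k for k in range(300)]           # metres, synthetic
            code = [r + random.gauss(0.0, 3.0) for r in true_range]              # noisy code
            phase = [r + random.gauss(0.0, 0.003) + 12.345 for r in true_range]  # precise, biased
            sm = hatch_filter(code, phase)
            err_raw = sum(abs(c - r) for c, r in zip(code, true_range)) / len(code)
            err_sm = sum(abs(s - r) for s, r in zip(sm, true_range)) / len(sm)
            print(f"mean |error| raw code: {err_raw:.2f} m, smoothed: {err_sm:.2f} m")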

  9. Worked examples are more efficient for learning than high-assistance instructional software

    NARCIS (Netherlands)

    McLaren, Bruce M.; van Gog, Tamara; Ganoe, Craig; Yaron, David; Karabinos, Michael

    2015-01-01

    The ‘assistance dilemma’, an important issue in the Learning Sciences, is concerned with how much guidance or assistance should be provided to help students learn. A recent study comparing three high-assistance approaches (worked examples, tutored problems, and erroneous examples) and one

  10. High-throughput computational methods and software for quantitative trait locus (QTL) mapping

    NARCIS (Netherlands)

    Arends, Danny

    2014-01-01

    In recent years, many new technologies such as tiling arrays and high-throughput DNA sequencing have come to play an important role within the research field of systems genetics. For researchers it is extremely important to understand that these methods are going to

  11. Software Prototyping

    Science.gov (United States)

    Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R.

    2016-01-01

    Summary Background Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. Objective To describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Methods Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Results Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included having 1) subject-matter experts heavily involved with the design; 2) flexibility to make rapid changes, 3) the ability to minimize software development efforts early in the design stage; 4) rapid finalization of requirements; 5) early visualization of designs; 6) and a powerful vehicle for communication of the design to the programmers. Challenges included 1) time and effort to develop the prototypes and case scenarios; 2) no simulation of system performance; 3) not having all proposed functionality available in the final product; and 4) missing needed data elements in the PCC information system. PMID:27081404

  12. Software Epistemology

    Science.gov (United States)

    2016-03-01

    in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment... Acronyms: LTS - Label Transition System; MUSE - Mining and Understanding Software Enclaves; RTEMS - Real-Time Executive for Multi-processor Systems; SaaS - Software as a Service; SSA - Static Single Assignment; SWE - Software Epistemology; UD/DU - Def-Use/Use-Def Chains (Dataflow Graph)

  13. Final Report on XStack: Software Synthesis for High Productivity ExaScale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Solar-Lezama, Armando [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States). Computer Science and Artificial Intelligence Lab.

    2016-07-12

    The goal of the project was to develop a programming model that would significantly improve productivity in the high-performance computing domain by bringing together three components: a) Automated equivalence checking, b) Sketch-based program synthesis, and c) Autotuning. The report provides an executive summary of the research accomplished through this project. At the end of the report is appended a paper that describes in more detail the key technical accomplishments from this project, and which was published in SC 2014.

  14. Consequence analysis

    International Nuclear Information System (INIS)

    Woodard, K.

    1985-01-01

    The objectives of this paper are to: Provide a realistic assessment of consequences; Account for plant and site-specific characteristics; Adjust accident release characteristics to account for results of plant-containment analysis; Produce conditional risk curves for each of five health effects; and Estimate uncertainties

  15. Open Source Software Acquisition

    DEFF Research Database (Denmark)

    Holck, Jesper; Kühn Pedersen, Mogens; Holm Larsen, Michael

    2005-01-01

    Lately we have seen a growing interest from both public and private organisations to adopt Open Source Software (OSS), not only for a few, specific applications but also on a more general level throughout the organisation. As a consequence, the organisations' decisions on adoption of OSS are becoming...

  16. GERMINATOR: a software package for high-throughput scoring and curve fitting of Arabidopsis seed germination.

    Science.gov (United States)

    Joosen, Ronny V L; Kodde, Jan; Willems, Leo A J; Ligterink, Wilco; van der Plas, Linus H W; Hilhorst, Henk W M

    2010-04-01

    Over the past few decades seed physiology research has contributed to many important scientific discoveries and has provided valuable tools for the production of high quality seeds. An important instrument for this type of research is the accurate quantification of germination; however gathering cumulative germination data is a very laborious task that is often prohibitive to the execution of large experiments. In this paper we present the germinator package: a simple, highly cost-efficient and flexible procedure for high-throughput automatic scoring and evaluation of germination that can be implemented without the use of complex robotics. The germinator package contains three modules: (i) design of experimental setup with various options to replicate and randomize samples; (ii) automatic scoring of germination based on the color contrast between the protruding radicle and seed coat on a single image; and (iii) curve fitting of cumulative germination data and the extraction, recap and visualization of the various germination parameters. The curve-fitting module enables analysis of general cumulative germination data and can be used for all plant species. We show that the automatic scoring system works for Arabidopsis thaliana and Brassica spp. seeds, but is likely to be applicable to other species, as well. In this paper we show the accuracy, reproducibility and flexibility of the germinator package. We have successfully applied it to evaluate natural variation for salt tolerance in a large population of recombinant inbred lines and were able to identify several quantitative trait loci for salt tolerance. Germinator is a low-cost package that allows the monitoring of several thousands of germination tests, several times a day by a single person.
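
    A minimal sketch of the curve-fitting module's idea: fit a sigmoidal (Hill-type) curve to cumulative germination counts and read off parameters such as maximum germination and the time to reach half of it. The data below are synthetic and the three-parameter form is an illustrative simplification, not the package's exact model.

        # Fit a Hill-type curve to cumulative germination counts and extract
        # max germination (Gmax), time to 50% of Gmax (t50) and steepness (n).
        import numpy as np
        from scipy.optimize import curve_fit

        def hill(t, g_max, t50, n):
            return g_max * t**n / (t50**n + t**n)

        if __name__ == "__main__":
            hours = np.array([0, 12, 24, 36, 48, 60, 72, 96, 120], dtype=float)
            germinated = np.array([0, 2, 10, 28, 41, 46, 48, 49, 49], dtype=float)  # synthetic counts
            (g_max, t50, n), _ = curve_fit(hill, hours, germinated,
                                           p0=[50.0, 40.0, 4.0], bounds=(0, np.inf))
            print(f"Gmax = {g_max:.1f} seeds, t50 = {t50:.1f} h, steepness n = {n:.1f}")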

  17. High porosity harzburgite and dunite channels for the transport of compositionally heterogeneous melts in the mantle: II. Geochemical consequences

    Science.gov (United States)

    Liang, Y.; Schiemenz, A.; Xia, Y.; Parmentier, E.

    2009-12-01

    In a companion numerical study [1], we explored the spatial distribution of high porosity harzburgite and dunite channels produced by reactive dissolution of orthopyroxene (opx) in an upwelling mantle column and identified a number of new features. In this study, we examine the geochemical consequences of channelized melt flow under the settings outlined in [1] with special attention to the transport of compositionally heterogeneous melts and their interactions with the surrounding peridotite matrix during melt migration in the mantle. Time-dependent transport equations for a trace element in the interstitial melt and solids that include advection, dispersion, and melt-rock reaction were solved in a 2-D upwelling column using the high-order numerical methods outlined in [1]. The melt and solid velocities were taken from the steady state or quasi-steady state solutions of [1]. In terms of trace element fractionation, the simulation domain can be divided into 4 distinct regions: (a) a high porosity harzburgite channel, overlain by (b) a high porosity dunite channel; (c) a low porosity compacting boundary layer surrounding the melt channels; and (d) inter-channel regions outside (c). In the limit of local chemical equilibrium, melting in region (d) is equivalent to batch melting, whereas melting and melt extraction in (c) are closer to fractional melting, with the melt suction rate first increasing from the bottom of the melting column to a maximum near the bottom of the dunite channel and then decreasing upward in the compacting boundary layer. The melt composition in the high porosity harzburgite channel is similar to that produced by high-degree batch melting (up to opx exhaustion), whereas the melt composition in the dunite is a weighted average of the ultra-depleted melt from the harzburgite channel below, the expelled melt from the compacting boundary layer, and melt produced by opx dissolution along the sidewalls of the dunite channel. Compaction within the dunite

  18. Cardiovascular and metabolic consequences of the association between chronic stress and high-fat diet in rats.

    Science.gov (United States)

    Simas, Bruna B; Nunes, Everson A; Crestani, Carlos C; Speretta, Guilherme F

    2018-05-01

    Obesity and chronic stress are considered independent risk factors for the development of cardiovascular diseases and changes in autonomic system activity. However, the cardiovascular consequences induced by the association between high-fat diet (HFD) and chronic stress are not fully understood. We hypothesized that the association between HFD and exposure to a chronic variable stress (CVS) protocol for four weeks might exacerbate the cardiovascular and metabolic disturbances in rats when compared to these factors singly. To test this hypothesis, male Wistar rats were divided into four groups: control-standard chow diet (SD; n = 8); control-HFD (n = 8); CVS-SD (n = 8); and CVS-HFD (n = 8). The CVS consisted of repeated exposure of the rats to different inescapable and unpredictable stressors (restraint stress, damp sawdust, cold, swim stress and light cycle inversion). We evaluated cardiovascular function, autonomic activity, dietary intake, adiposity and metabolism. The HFD increased body weight, adiposity and blood glucose concentration (∼15%) in both control and CVS rats. The CVS-HFD rats showed decreased insulin sensitivity (25%) compared to CVS-SD rats. The control-HFD and CVS-HFD rats presented increased intrinsic heart rate (HR) values (∼8%). CVS increased cardiac sympathetic activity (∼65%) in both SD- and HFD-fed rats. The HFD increased basal HR (∼10%). Blood pressure and baroreflex analyses showed no differences among the experimental groups. In conclusion, the present data indicate an absence of interaction in the autonomic imbalance evoked by either CVS or HFD. Additionally, HFD increased HR and evoked metabolic disruptions which are independent of stress exposure.

  19. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  20. High Performance Electrical Modeling and Simulation Software Normal Environment Verification and Validation Plan, Version 1.0; TOPICAL

    International Nuclear Information System (INIS)

    WIX, STEVEN D.; BOGDAN, CAROLYN W.; MARCHIONDO JR., JULIO P.; DEVENEY, MICHAEL F.; NUNEZ, ALBERT V.

    2002-01-01

    The requirements in modeling and simulation are driven by two fundamental changes in the nuclear weapons landscape: (1) The Comprehensive Test Ban Treaty and (2) The Stockpile Life Extension Program which extends weapon lifetimes well beyond their originally anticipated field lifetimes. The move from confidence based on nuclear testing to confidence based on predictive simulation forces a profound change in the performance asked of codes. The scope of this document is to improve the confidence in the computational results by demonstration and documentation of the predictive capability of electrical circuit codes and the underlying conceptual, mathematical and numerical models as applied to a specific stockpile driver. This document describes the High Performance Electrical Modeling and Simulation software normal environment Verification and Validation Plan

  1. Development of new generation software tools for simulation of electron beam formation in novel high power gyrotrons

    Science.gov (United States)

    Sabchevski, S.; Zhelyazkov, I.; Benova, E.; Atanassov, V.; Dankov, P.; Thumm, M.; Dammertz, G.; Piosczyk, B.; Illy, S.; Tran, M. Q.; Alberti, S.; Hogge, J.-Ph

    2006-07-01

    Computer aided design (CAD) based on numerical experiments performed by using adequate physical models and efficient simulation codes is an indispensable tool for development, investigation, and optimization of gyrotrons used as radiation sources for electron cyclotron resonance heating (ECRH) of fusion plasmas. In this paper, we review briefly the state-of-the-art in the field of modelling and simulation of intense, relativistic, helical electron beams formed in the electron-optical systems (EOS) of powerful gyrotrons. We discuss both the limitations of the known computer codes and the requirements for increasing their capabilities for solution of various design problems that are being envisaged in the development of the next generation gyrotrons for ECRH. Moreover, we present the concept followed by us in an attempt to unite the advantages of the modern programming techniques with self-consistent, first-principles 3D physical models in the creation of a new highly efficient and versatile software package for simulation of powerful gyrotrons.

  2. 3D-SURFER: software for high-throughput protein surface comparison and analysis.

    Science.gov (United States)

    La, David; Esquivel-Rodríguez, Juan; Venkatraman, Vishwesh; Li, Bin; Sael, Lee; Ueng, Stephen; Ahrendt, Steven; Kihara, Daisuke

    2009-11-01

    We present 3D-SURFER, a web-based tool designed to facilitate high-throughput comparison and characterization of proteins based on their surface shape. As each protein is effectively represented by a vector of 3D Zernike descriptors, comparison times for a query protein against the entire PDB take, on average, only a couple of seconds. The web interface has been designed to be as interactive as possible, with displays showing animated protein rotations, CATH codes and structural alignments using the CE program. In addition, geometrically interesting local features of the protein surface, such as pockets that often correspond to ligand binding sites, as well as protrusions and flat regions, can also be identified and visualized. 3D-SURFER is a web application that can be freely accessed from: http://dragon.bio.purdue.edu/3d-surfer. Contact: dkihara@purdue.edu. Supplementary data are available at Bioinformatics online.
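
    A toy sketch of the retrieval idea described above (not 3D-SURFER's implementation): each structure is reduced to a fixed-length descriptor vector and a query is ranked against the database by Euclidean distance. The vector length and entry names below are placeholders.

        # Rank database entries by Euclidean distance between fixed-length shape descriptors.
        import numpy as np

        def rank_by_shape(query_descriptor, database):
            """database: dict mapping an ID to its descriptor vector (same length as query)."""
            distances = {entry_id: float(np.linalg.norm(query_descriptor - vec))
                         for entry_id, vec in database.items()}
            return sorted(distances.items(), key=lambda item: item[1])

        if __name__ == "__main__":
            rng = np.random.default_rng(2)
            db = {f"entry_{i:03d}": rng.random(121) for i in range(1000)}   # hypothetical 121-D vectors
            query = db["entry_042"] + rng.normal(0.0, 0.01, 121)            # near-duplicate shape
            print(rank_by_shape(query, db)[:3])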

  3. Development of High Level Trigger Software for Belle II at SuperKEKB

    International Nuclear Information System (INIS)

    Lee, S; Itoh, R; Katayama, N; Mineo, S

    2011-01-01

    The Belle collaboration has been trying for 10 years to reveal the mystery of the current matter-dominated universe. However, much more statistics is required to search for New Physics through quantum loops in decays of B mesons. In order to increase the experimental sensitivity, the next generation B-factory, SuperKEKB, is planned. The design luminosity of SuperKEKB is 8 × 10^35 cm^-2 s^-1, a factor 40 above KEKB's peak luminosity. At this high luminosity, the level 1 trigger of the Belle II experiment will stream events of 300 kB size at a 30 kHz rate. To reduce the data flow to a manageable level, a high-level trigger (HLT) is needed, which will be implemented using the full offline reconstruction on a large scale PC farm. There, physics level event selection is performed, reducing the event rate by ∼10 to a few kHz. To execute the reconstruction the HLT uses the offline event processing framework basf2, which has parallel processing capabilities used for multi-core processing and PC clusters. The event data handling in the HLT is totally object oriented utilizing ROOT I/O with a new method of object passing over the UNIX socket connection. Also under consideration is the use of the HLT output as well to reduce the pixel detector event size by only saving hits associated with a track, resulting in an additional data reduction of ∼100 for the pixel detector. In this contribution, the design and implementation of the Belle II HLT are presented together with a report of preliminary testing results.

  4. Hardware and software system for monitoring oil pump operation in power high-voltage transformers

    Directory of Open Access Journals (Sweden)

    Михайло Дмитрович Дяченко

    2017-07-01

    Full Text Available The article considers the basic prerequisites for the creation of an automated monitoring system for the oil pumps of high-voltage transformers. Long operation of oil pumps results in deterioration and destruction of bearings, rubbing of the rotor, breakage and damage to the impeller, leakage, etc., which inevitably causes a significant decrease in the insulating properties of the transformer oil and leads to expenditures for its subsequent recovery; false triggerings of the gas protection also sometimes occur. Continuous operation of the electric motor likewise requires additional equipment to protect the motor itself from various emergency situations, such as a short circuit in the stator winding, a housing breakdown, an open-phase mode, etc. The use of stationary systems provides diagnosis of defects at an early stage of their development, increased reliability and longevity of the equipment components, a longer overhaul period, fewer emergency stops, and adjustment of the preventative maintenance schedule. The basic principles for identifying the damaged part of the oil pump are given, and the hardware and algorithmic solutions are considered. Full-scale tests of a model sample on a power transformer at a high-voltage substation confirmed that a damaged unit can be detected and distinguished from the other units connected in the same mechanical structure. A detailed analysis of the operation of each of the units is carried out by means of the general substation switchboard and displayed as graphs, diagrams and text messages. When limit values of vibration or over-limit current values are reached, faults in the operation of the unit are detected, a warning alarm is activated, and the command to disconnect the damaged unit is issued. The optimal solution for organizing the information collection system uses the principle of sensor networks, but combined

  5. Workflow for high-content, individual cell quantification of fluorescent markers from universal microscope data, supported by open source software.

    Science.gov (United States)

    Stockwell, Simon R; Mittnacht, Sibylle

    2014-12-16

    Advances in understanding the control mechanisms governing the behavior of cells in adherent mammalian tissue culture models are becoming increasingly dependent on modes of single-cell analysis. Methods which deliver composite data reflecting the mean values of biomarkers from cell populations risk losing subpopulation dynamics that reflect the heterogeneity of the studied biological system. In keeping with this, traditional approaches are being replaced by, or supported with, more sophisticated forms of cellular assay developed to allow assessment by high-content microscopy. These assays potentially generate large numbers of images of fluorescent biomarkers which, when processed by accompanying proprietary software packages, allow multi-parametric measurements per cell. However, the relatively high capital costs and overspecialization of many of these devices have prevented their accessibility to many investigators. Described here is a universally applicable workflow for the quantification of multiple fluorescent marker intensities from specific subcellular regions of individual cells, suitable for use with images from most fluorescence microscopes. Key to this workflow is the implementation of the freely available CellProfiler software (1) to distinguish individual cells in these images, segment them into defined subcellular regions and deliver fluorescence marker intensity values specific to these regions. The extraction of individual cell intensity values from image data is the central purpose of this workflow and will be illustrated with the analysis of control data from a siRNA screen for G1 checkpoint regulators in adherent human cells. However, the workflow presented here can be applied to analysis of data from other means of cell perturbation (e.g., compound screens) and other forms of fluorescence based cellular markers and thus should be useful for a wide range of laboratories.
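
    The published workflow is built on CellProfiler; purely as an analogous illustration, the sketch below performs the same basic steps (segment cells from one channel, then report per-cell marker intensities) with scikit-image. The file names and the simple Otsu threshold are assumptions for the example, not part of the original workflow.

```python
# Analogous single-cell intensity extraction with scikit-image (not the CellProfiler pipeline itself).
from skimage import io, filters, measure

nuclei = io.imread("dapi_channel.tif")      # placeholder: nuclear stain used for segmentation
marker = io.imread("gfp_channel.tif")       # placeholder: fluorescent marker of interest

mask = nuclei > filters.threshold_otsu(nuclei)   # crude global threshold on the nuclear channel
labels = measure.label(mask)                     # one integer label per detected cell/nucleus

# Per-cell mean marker intensity within each segmented region.
for region in measure.regionprops(labels, intensity_image=marker):
    print(region.label, region.area, region.mean_intensity)
```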

  6. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  7. Optimized Architectural Approaches in Hardware and Software Enabling Very High Performance Shared Storage Systems

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    There are issues encountered in high performance storage systems that normally lead to compromises in architecture. Compute clusters tend to have compute phases followed by an I/O phase that must move data from the entire cluster in one operation. That data may then be shared by a large number of clients creating unpredictable read and write patterns. In some cases the aggregate performance of a server cluster must exceed 100 GB/s to minimize the time required for the I/O cycle thus maximizing compute availability. Accessing the same content from multiple points in a shared file system leads to the classical problems of data "hot spots" on the disk drive side and access collisions on the data connectivity side. The traditional method for increasing apparent bandwidth usually includes data replication which is costly in both storage and management. Scaling a model that includes replicated data presents additional management challenges as capacity and bandwidth expand asymmetrically while the system is scaled. ...

  8. Development of interactive hypermedia software for high school biology: A research and development study

    Science.gov (United States)

    Alturki, Uthman T.

    The goal of this research was to research, design, and develop a hypertext program for students who study biology. The Ecology Hypertext Program was developed using Research and Development (R&D) methodology. The purpose of this study was to place the final "product", a CD-ROM for learning biology concepts, in the hands of teachers and students to help them in the learning and teaching process. The product was created through a cycle of literature review, needs assessment, development, and a cycle of field tests and revisions. I applied the ten steps of the R&D process suggested by Borg and Gall (1989), which consisted of: (1) Literature review, (2) Needs assessment, (3) Planning, (4) Develop preliminary product, (5) Preliminary field-testing, (6) Preliminary revision, (7) Main field-testing, (8) Main revision, (9) Final field-testing, and (10) Final product revision. The literature review and needs assessment provided a support and foundation for designing the preliminary product---the Ecology Hypertext Program. Participants in the needs assessment joined a focus group discussion. They were a group of graduate students in education who suggested the importance of designing this product. For the preliminary field test, the participants were a group of high school students studying biology. They were the potential users of the product. They reviewed the preliminary product and then filled out a questionnaire. Their feedback and suggestions were used to develop and improve the product in a step called preliminary revision. The second round of field testing was the main field test, in which the participants joined a focus group discussion. They were the same group who participated in the needs assessment task. They reviewed the revised product and then provided ideas and suggestions to improve the product. Their feedback was categorized and implemented to develop the product in the main revision task. Finally, a group of science teachers participated in this study by reviewing

  9. Software engineering in industry

    Science.gov (United States)

    Story, C. M.

    1989-12-01

    Can software be "engineered"? Can a few people with limited resources and a negligible budget produce high quality software solutions to complex software problems? Is it possible to resolve the conflict between research activities and the necessity to view software development as a means to an end rather than as an end in itself? The aim of this paper is to encourage further thought and discussion on various topics which, in the author's experience, are becoming increasingly critical in current large software production and development projects, inside and outside high energy physics (HEP). This is done by briefly exploring some of the software engineering ideas and technologies now used in the information industry, using, as a case study, a project with many similarities to those currently under way in HEP.

  10. Track and mode controller (TMC): a software executive for a high-altitude pointing and tracking experiment

    Science.gov (United States)

    Michnovicz, Michael R.

    1997-06-01

    A real-time executive has been implemented to control a high-altitude pointing and tracking experiment. The track and mode controller (TMC) implements a table-driven design, in which the track mode logic for a tracking mission is defined within a state transition diagram (STD). The STD is implemented as a state transition table in the TMC software. Status events trigger the state transitions in the STD. Each state, as it is entered, causes a number of processes to be activated within the system. As these processes propagate through the system, the status of key processes is monitored by the TMC, allowing further transitions within the STD. This architecture is implemented in real time, using the VxWorks operating system. VxWorks message queues allow communication of status events from the Event Monitor task to the STD task. Process commands are propagated to the rest of the system processors by means of the SCRAMNet shared memory network. The system mode logic contained in the STD will autonomously sequence an acquisition, tracking and pointing system through an entire engagement sequence, starting with target detection and ending with aimpoint maintenance. Simulation results and lab test results will be presented to verify the mode controller. In addition to implementing the system mode logic with the STD, the TMC can process prerecorded time sequences of commands required during startup operations. It can also process single commands from the system operator. In this paper, the author presents (1) an overview, in which he describes the TMC architecture, the relationship of an end-to-end simulation to the flight software and the laboratory testing environment, (2) implementation details, including information on the VxWorks message queues and the SCRAMNet shared memory network, (3) simulation results and lab test results which verify the mode controller, and (4) plans for the future, specifically as to how this executive will expedite transition to a fully
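
    A table-driven state transition design of the kind described can be illustrated compactly; the sketch below is a generic Python rendering with invented states, events and entry actions, not the TMC's actual STD or its VxWorks implementation.

```python
# Minimal table-driven mode controller: states, status events, and the processes
# activated on state entry are illustrative placeholders.
STD = {
    ("IDLE",        "TARGET_DETECTED"): "ACQUISITION",
    ("ACQUISITION", "TRACK_LOCKED"):    "TRACKING",
    ("TRACKING",    "AIMPOINT_SET"):    "POINTING",
    ("POINTING",    "TARGET_LOST"):     "ACQUISITION",
}

ON_ENTRY = {
    "ACQUISITION": ["start_coarse_sensor"],
    "TRACKING":    ["start_fine_track_filter"],
    "POINTING":    ["start_aimpoint_maintenance"],
}

def run(events, state="IDLE"):
    """Consume status events (as if drained from a message queue) and walk the state table."""
    for event in events:
        next_state = STD.get((state, event))
        if next_state is None:
            continue                      # event not relevant in the current state
        state = next_state
        for process in ON_ENTRY.get(state, []):
            print(f"activate {process}")  # placeholder for commanding system processes
    return state

print(run(["TARGET_DETECTED", "TRACK_LOCKED", "AIMPOINT_SET"]))
```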

  11. Software quality management

    International Nuclear Information System (INIS)

    Bishop, D.C.; Pymm, P.

    1991-01-01

    As programmable electronic (software-based) systems are increasingly being proposed as design solutions for high integrity applications in nuclear power stations, the need to adopt suitable quality management arrangements is paramount. The authors describe Scottish Nuclear's strategy for software quality management and, using the main on-line monitoring system at Torness Power Station as an example, explain how this strategy is put into practice. Particular attention is given to the topics of software quality planning and change control. (author)

  12. Assuring Software Reliability

    Science.gov (United States)

    2014-08-01

    technologies and processes to achieve a required level of confidence that software systems and services function in the intended manner. 1.3 Security Example... that took three high-voltage lines out of service and a software failure (a race condition) that disabled the computing service that notified the... service had failed. Instead of analyzing the details of the alarm server failure, the reviewers asked why the following software assurance claim had

  13. Modelling of Consequences of Biogas Leakage from Gasholder

    Directory of Open Access Journals (Sweden)

    Petr Trávníček

    2017-03-01

    Full Text Available This paper describes the modelling of the consequences of a biogas leakage from a gasholder at an agricultural biogas station. Four scenarios were selected for the purpose of this work: a rupture of the gasholder membrane with instantaneous explosion of the gas cloud, a delayed blast of the gas, emptying of the whole volume of gas (without initiation), and initiation of the gas with a jet fire. The gas leakage is modelled with specialized software and the consequences are determined on the basis of the results. The first scenario was modelled with the help of equations because the software used does not include an appropriate model. A farm with high building density was chosen as the model case. Biogas is replaced by methane because the software used does not support modelling the dispersion of mixtures. From this viewpoint, a conservative approach is applied because biogas contains "only" approximately 60% methane (depending on the technology and the processed material).
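
    For orientation only, the sketch below estimates the methane inventory and combustion energy of a hypothetical gasholder using rounded textbook constants; it is not the consequence model used by the authors, and the volume is an invented example value.

```python
# Illustrative back-of-envelope inventory, not the authors' consequence model.
GASHOLDER_VOLUME_M3 = 1000.0      # hypothetical gasholder volume
METHANE_FRACTION = 0.60           # biogas is roughly 60 % methane (per the abstract)
RHO_CH4_KG_M3 = 0.668             # methane density at ~20 degC, 1 atm (approx.)
HEAT_OF_COMBUSTION_MJ_KG = 50.0   # lower heating value of methane (approx.)

methane_mass_kg = GASHOLDER_VOLUME_M3 * METHANE_FRACTION * RHO_CH4_KG_M3
energy_mj = methane_mass_kg * HEAT_OF_COMBUSTION_MJ_KG
print(f"methane inventory: {methane_mass_kg:.0f} kg, ~{energy_mj:.0f} MJ of combustion energy")
```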

  14. Perceived vulnerability moderates the relations between the use of protective behavioral strategies and alcohol use and consequences among high-risk young adults.

    Science.gov (United States)

    Garcia, Tracey A; Fairlie, Anne M; Litt, Dana M; Waldron, Katja A; Lewis, Melissa A

    2018-06-01

    Drinking protective behavioral strategies (PBS) have been associated with reductions in alcohol use and alcohol-related consequences in young adults. PBS subscales, Limiting/Stopping (LS), Manner of Drinking (MOD), and Serious Harm Reduction (SHR), have been examined in the literature; LS, MOD, and SHR have mixed support as protective factors. Understanding moderators between PBS and alcohol use and related consequences is an important development in PBS research in order to delineate when and for whom PBS use is effective in reducing harm from alcohol use. Perceptions of vulnerability to negative consequences, included in health-risk models, may be one such moderator. The current study examined whether two types of perceived vulnerability (perceived vulnerability when drinking; perceived vulnerability in uncomfortable/unfamiliar situations) moderated the relations between LS, MOD, SHR strategies and alcohol use and related negative consequences. High-risk young adults (N = 400; 53.75% female) recruited nationally completed measures of PBS, alcohol use and related consequences, and measures of perceived vulnerability. Findings demonstrated that perceived vulnerability when drinking moderated the relations between MOD strategies and alcohol use. The interactions between perceived vulnerability when drinking and PBS did not predict alcohol-related consequences. Perceived vulnerability in unfamiliar/uncomfortable social situations moderated relations between MOD strategies and both alcohol use and related negative consequences; no other significant interactions emerged. Across both perceived vulnerability types and MOD strategies, those with the highest levels of perceived vulnerability and who used MOD strategies the most had the greatest decrements in alcohol use and related negative consequences. Prevention and intervention implications are discussed. Copyright © 2018. Published by Elsevier Ltd.

  15. Constrained consequence

    CSIR Research Space (South Africa)

    Britz, K

    2011-09-01

    Full Text Available their basic properties and relationship. In Section 3 we present a modal instance of these constructions which also illustrates with an example how to reason abductively with constrained entailment in a causal or action-oriented context. In Section 4 we... of models with the former approach, whereas in Section 3.3 we give an example illustrating ways in which C can be defined with both. Here we employ the following versions of local consequence: Definition 3.4. Given a model M = ⟨W, R, V⟩ and formulas...

  16. Fault-specific verification (FSV) - An alternative VV&T strategy for high reliability nuclear software systems

    International Nuclear Information System (INIS)

    Miller, L.A.

    1994-01-01

    The author puts forth an argument that digital instrumentation and control systems can be safely applied in the nuclear industry, but it will require changes to the way software for such systems is developed and tested. He argues for a fault-specific verification procedure to be applied to software development. This plan includes enumerating and classifying all software faults at all levels of the product development, over the whole development process. While collecting this data, develop and validate different methods for software verification, validation and testing, and apply them against all the detected faults. Force all of this development toward an automated product for doing this testing. Continue to develop, expand, test, and share these testing methods across a wide array of software products

  17. An Interactive and Comprehensive Working Environment for High-Energy Physics Software with Python and Jupyter Notebooks

    Science.gov (United States)

    Braun, N.; Hauth, T.; Pulvermacher, C.; Ritter, M.

    2017-10-01

    Today’s analyses for high-energy physics (HEP) experiments involve processing a large amount of data with highly specialized algorithms. The contemporary workflow from recorded data to final results is based on the execution of small scripts - often written in Python or as ROOT macros which call complex compiled algorithms in the background - to perform fitting procedures and generate plots. During recent years interactive programming environments, such as Jupyter, have become popular. Jupyter allows the development of Python-based applications, so-called notebooks, which bundle code, documentation and results, e.g. plots. Advantages over classical script-based approaches are the ability to recompute only parts of the analysis code, which allows for fast and iterative development, and a web-based user frontend, which can be hosted centrally and requires only a browser on the user side. In our novel approach, Python and Jupyter are tightly integrated into the Belle II Analysis Software Framework (basf2), currently being developed for the Belle II experiment in Japan. This allows code to be developed in Jupyter notebooks for every aspect of the event simulation, reconstruction and analysis chain. These interactive notebooks can be hosted as a centralized web service via JupyterHub with Docker and used by all scientists of the Belle II Collaboration. Because of its generality and encapsulation, the setup can easily be scaled to large installations.

  18. Tools & training for more secure software

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Just by fate of nature, software today is shipped out as “beta”, coming with vulnerabilities and weaknesses which should already have been fixed at the programming stage. This presentation will show the consequences of suboptimal software, why good programming, thorough software design, and a proper software development process are imperative for the overall security of the Organization, and how a few simple tools and training are supposed to make CERN software more secure.

  19. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than...... and experience reported at the IEEE International Conference on Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage...

  20. Computational Fluid Dynamics (CFD) Computations With Zonal Navier-Stokes Flow Solver (ZNSFLOW) Common High Performance Computing Scalable Software Initiative (CHSSI) Software

    National Research Council Canada - National Science Library

    Edge, Harris

    1999-01-01

    ...), computational fluid dynamics (CFD) 6 project. Under the project, a proven zonal Navier-Stokes solver was rewritten for scalable parallel performance on both shared memory and distributed memory high performance computers...

  1. SU-E-J-04: Integration of Interstitial High Intensity Therapeutic Ultrasound Applicators On a Clinical MRI-Guided High Intensity Focused Ultrasound Treatment Planning Software Platform

    Energy Technology Data Exchange (ETDEWEB)

    Ellens, N [Johns Hopkins University, Baltimore, Maryland (United States); Partanen, A [Philips Healthcare, Andover, Massachusetts (United States); Ghoshal, G; Burdette, E [Acoustic MedSystems Inc., Savoy, IL (United States); Farahani, K [National Cancer Institute, Bethesda, MD (United States)

    2015-06-15

    Purpose: Interstitial high intensity therapeutic ultrasound (HITU) applicators can be used to ablate tissue percutaneously, allowing for minimally-invasive treatment without ionizing radiation [1,2]. The purpose of this study was to evaluate the feasibility and usability of combining multielement interstitial HITU applicators with a clinical magnetic resonance imaging (MRI)-guided focused ultrasound software platform. Methods: The Sonalleve software platform (Philips Healthcare, Vantaa, Finland) combines anatomical MRI for target selection and multi-planar MRI thermometry to provide real-time temperature information. The MRI-compatible interstitial US applicators (Acoustic MedSystems, Savoy, IL, USA) had 1–4 cylindrical US elements, each 1 cm long with either 180° or 360° of active surface. Each applicator (4 Fr diameter, enclosed within a 13 Fr flexible catheter) was inserted into a tissue-mimicking agar-silica phantom. Degassed water was circulated around the transducers for cooling and coupling. Based on the location of the applicator, a virtual transducer overlay was added to the software to assist targeting and to allow automatic thermometry slice placement. The phantom was sonicated at 7 MHz for 5 minutes with 6–8 W of acoustic power for each element. MR thermometry data were collected during and after sonication. Results: Preliminary testing indicated that the applicator location could be identified in the planning images and the transducer locations predicted within 1 mm accuracy using the overlay. Ablation zones (thermal dose ≥ 240 CEM43) for 2 active, adjacent US elements ranged from 18 mm × 24 mm (width × length) to 25 mm × 25 mm for the 6 W and 8 W sonications, respectively. Conclusion: The combination of interstitial HITU applicators and this software platform holds promise for novel approaches in minimally-invasive MRI-guided therapy, especially when bony structures or air-filled cavities may preclude extracorporeal HIFU.[1] Diederich et al
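
    The 240 CEM43 ablation criterion quoted above follows the standard cumulative-equivalent-minutes formulation; the sketch below computes it for a hypothetical single-voxel temperature history and is not part of the described software platform.

```python
# Illustrative CEM43 thermal-dose accumulation (Sapareto-Dewey formulation) for one voxel
# of an MR-thermometry time series; the temperature samples below are hypothetical.
def cem43(temperatures_c, dt_min):
    """Cumulative equivalent minutes at 43 degC for evenly sampled temperatures."""
    dose = 0.0
    for t in temperatures_c:
        r = 0.5 if t >= 43.0 else 0.25
        dose += dt_min * r ** (43.0 - t)
    return dose

samples = [41.0, 44.5, 52.0, 55.0, 48.0]   # hypothetical voxel temperatures, degC
dose = cem43(samples, dt_min=0.1)          # 0.1 min between thermometry frames (assumed)
print(f"CEM43 = {dose:.1f}; ablated: {dose >= 240}")
```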

  2. Science and Software

    Science.gov (United States)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  3. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

      Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application.  Traditional forms of technical education pay little attention to creativity - often encouraging overly...

  4. Choice & Consequence

    DEFF Research Database (Denmark)

    Khan, Azam

    to support hypothesis generation, hypothesis testing, and decision making. In addition to sensors in buildings, infrastructure, or the environment, we also propose the instrumentation of user interfaces to help measure performance in decision making applications. We show the benefits of applying principles...... between cause and effect in complex systems complicates decision making. To address this issue, we examine the central role that data-driven decision making could play in critical domains such as sustainability or medical treatment. We developed systems for exploratory data analysis and data visualization...... of data analysis and instructional interface design, to both simulation systems and decision support interfaces. We hope that projects such as these will help people to understand the link between their choices and the consequences of their decisions....

  5. Adoption of High Performance Computational (HPC) Modeling Software for Widespread Use in the Manufacture of Welded Structures

    Energy Technology Data Exchange (ETDEWEB)

    Brust, Frederick W. [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States); Punch, Edward F. [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States); Twombly, Elizabeth Kurth [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States); Kalyanam, Suresh [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States); Kennedy, James [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States); Hattery, Garty R. [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States); Dodds, Robert H. [Professional Consulting Services, Inc., Lisle, IL (United States); Mach, Justin C [Caterpillar, Peoria, IL (United States); Chalker, Alan [Ohio Supercomputer Center (OSC), Columbus, OH (United States); Nicklas, Jeremy [Ohio Supercomputer Center (OSC), Columbus, OH (United States); Gohar, Basil M [Ohio Supercomputer Center (OSC), Columbus, OH (United States); Hudak, David [Ohio Supercomputer Center (OSC), Columbus, OH (United States)

    2016-12-30

    This report summarizes the final product developed for the US DOE Small Business Innovation Research (SBIR) Phase II grant made to Engineering Mechanics Corporation of Columbus (Emc2) between April 16, 2014 and August 31, 2016 titled ‘Adoption of High Performance Computational (HPC) Modeling Software for Widespread Use in the Manufacture of Welded Structures’. Many US companies have moved fabrication and production facilities off shore because of cheaper labor costs. A key aspect in bringing these jobs back to the US is the use of technology to render US-made fabrications more cost-efficient overall with higher quality. One significant advantage that has emerged in the US over the last two decades is the use of virtual design for fabrication of small and large structures in weld fabrication industries. Industries that use virtual design and analysis tools have reduced material part size, developed environmentally-friendly fabrication processes, improved product quality and performance, and reduced manufacturing costs. Indeed, Caterpillar Inc. (CAT), one of the partners in this effort, continues to have a large fabrication presence in the US because of the use of weld fabrication modeling to optimize fabrications by controlling weld residual stresses and distortions and improving fatigue, corrosion, and fracture performance. This report describes Emc2’s DOE SBIR Phase II final results to extend an existing, state-of-the-art software code, Virtual Fabrication Technology (VFT®), currently used to design and model large welded structures prior to fabrication - to a broader range of products with widespread applications for small and medium-sized enterprises (SMEs). VFT® helps control distortion, can minimize and/or control residual stresses, control welding microstructure, and pre-determine welding parameters such as weld-sequencing, pre-bending, thermal-tensioning, etc. VFT® uses material properties, consumable properties, etc. as inputs

  6. Software refactoring at the package level using clustering techniques

    KAUST Repository

    Alkhalid, A.; Alshayeb, M.; Mahmoud, S. A.

    2011-01-01

    Enhancing, modifying or adapting the software to new requirements increases the internal software complexity. Software with a high level of internal complexity is difficult to maintain. Software refactoring reduces software complexity and hence

  7. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data.

    Science.gov (United States)

    Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and
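
    HAPPE itself is distributed for MATLAB/EEGLAB; the sketch below only mirrors the general sequence the record describes (band-pass filtering, ICA-based artifact removal, re-referencing) using MNE-Python, with a placeholder file name and hypothetical component choices.

```python
# Not HAPPE itself; a minimal MNE-Python sketch of an analogous filter / artifact-removal /
# re-reference sequence for a developmental EEG recording.
import mne

raw = mne.io.read_raw_fif("infant_resting_state_raw.fif", preload=True)  # placeholder file
raw.filter(l_freq=1.0, h_freq=40.0)                 # band-pass suitable for time-frequency analyses

ica = mne.preprocessing.ICA(n_components=20, random_state=0)
ica.fit(raw)
ica.exclude = [0, 3]       # hypothetical components judged to be artifact
ica.apply(raw)

raw.set_eeg_reference("average")                    # re-reference to the common average
raw.save("infant_resting_state_clean_raw.fif", overwrite=True)
```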

  8. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data

    Directory of Open Access Journals (Sweden)

    Laurel J. Gabard-Durnam

    2018-02-01

    Full Text Available Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact

  9. Custom database development and biomarker discovery methods for MALDI-TOF mass spectrometry-based identification of high-consequence bacterial pathogens.

    Science.gov (United States)

    Tracz, Dobryan M; Tyler, Andrea D; Cunningham, Ian; Antonation, Kym S; Corbett, Cindi R

    2017-03-01

    A high-quality custom database of MALDI-TOF mass spectral profiles was developed with the goal of improving clinical diagnostic identification of high-consequence bacterial pathogens. A biomarker discovery method is presented for identifying and evaluating MALDI-TOF MS spectra to potentially differentiate biothreat bacteria from less-pathogenic near-neighbour species. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.

  10. High integrity software for nuclear power plants: Candidate guidelines, technical basis and research needs. Main report, Volume 2

    International Nuclear Information System (INIS)

    Seth, S.; Bail, W.; Cleaves, D.; Cohen, H.; Hybertson, D.; Schaefer, C.; Stark, G.; Ta, A.; Ulery, B.

    1995-06-01

    The work documented in this report was performed in support of the US Nuclear Regulatory Commission to examine the technical basis for candidate guidelines that could be considered in reviewing and evaluating high integrity computer software. The framework addressed the following software development and assurance activities: Requirements specification; design; coding; verification and validation, including static analysis and dynamic testing; safety analysis; operation and maintenance; configuration management; quality assurance; and planning and management. Each activity (framework element) was subdivided into technical areas (framework subelements). The report describes the development of approximately 200 candidate guidelines that span the entire range of framework elements; the identification, categorization and prioritization of the technical basis for those candidate guidelines; and the identification, categorization and prioritization of research needs for improving the technical basis. The report has two volumes: Volume 1, Executive Summary, includes an overview of the framework and of each framework element, the complete set of candidate guidelines, the results of the assessment of the technical basis for each candidate guideline, and a discussion of research needs that support the regulatory function; this document, Volume 2, is the main report

  11. High integrity software for nuclear power plants: Candidate guidelines, technical basis and research needs. Main report, Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Seth, S.; Bail, W.; Cleaves, D.; Cohen, H.; Hybertson, D.; Schaefer, C.; Stark, G.; Ta, A.; Ulery, B. [Mitre Corp., McLean, VA (United States)

    1995-06-01

    The work documented in this report was performed in support of the US Nuclear Regulatory Commission to examine the technical basis for candidate guidelines that could be considered in reviewing and evaluating high integrity computer software. The framework addressed the following software development and assurance activities: Requirements specification; design; coding; verification and validation, including static analysis and dynamic testing; safety analysis; operation and maintenance; configuration management; quality assurance; and planning and management. Each activity (framework element) was subdivided into technical areas (framework subelements). The report describes the development of approximately 200 candidate guidelines that span the entire range of framework elements; the identification, categorization and prioritization of the technical basis for those candidate guidelines; and the identification, categorization and prioritization of research needs for improving the technical basis. The report has two volumes: Volume 1, Executive Summary, includes an overview of the framework and of each framework element, the complete set of candidate guidelines, the results of the assessment of the technical basis for each candidate guideline, and a discussion of research needs that support the regulatory function; this document, Volume 2, is the main report.

  12. Design and simulation study of high frequency response for surface acoustic wave device by using CST software

    Science.gov (United States)

    Zakaria, M. R.; Hashim, U.; Amin, Mohd Hasrul I. M.; Ayub, R. Mat; Hashim, M. N.; Adam, T.

    2015-05-01

    This paper focuses on the enhancement and improvement of surface acoustic wave (SAW) device performance. Due to increased demand in the international market for biosensor products, product quality must be emphasized; at the same time, devices that are low-cost, highly efficient and user-friendly are preferred. A SAW device with a pair of comb electrodes, known as an interdigital transducer (IDT), was fabricated on a piezoelectric substrate. The IDT design parameters were varied over several sizes and values so that the device provides greater sensing sensitivity, using process simulation with the CST STUDIO Suite software. In addition, the IDT parameters were also changed to create a product that is smaller and easier to handle, which also reduces its cost. The IDT parameter values changed in the design are the total number of finger pairs, the finger length, the finger width and spacing, the aperture, and the thickness of the IDT. The results show that the performance of the sensor improved significantly after the modifications were made.
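
    The link between IDT geometry and device response can be sketched with the usual synchronous-frequency relation f0 = v / wavelength, where the wavelength equals four times the finger width for a single-electrode IDT whose finger width equals the gap; the substrate velocity and finger width below are example values, not taken from the paper's design.

```python
# Illustrative IDT geometry -> SAW centre-frequency estimate (not the paper's CST model).
SAW_VELOCITY_M_S = 3980.0       # approx. Rayleigh-wave velocity on 128-deg YX LiNbO3 (assumed substrate)
finger_width_um = 5.0           # example finger width (= finger spacing)

wavelength_m = 4.0 * finger_width_um * 1e-6
f0_hz = SAW_VELOCITY_M_S / wavelength_m
print(f"centre frequency ~ {f0_hz / 1e6:.0f} MHz")
```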

  13. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what...... makes these countries an attractive host for software companies. Often, high-quality engineering and excellent quality of products, e.g., machinery and equipment, are mentioned. Yet, the question is: Can such arguments be also found for the software industry? We aim at investigating the degree...... of professionalism and systematization of software development to draw a map of strengths and weaknesses. To this end, we conducted as a first step an exploratory survey in Germany, presented in this paper. In this survey, we focused on the perceived importance of the two general software engineering process areas...

  14. Global Ionosphere Mapping and Differential Code Bias Estimation during Low and High Solar Activity Periods with GIMAS Software

    Directory of Open Access Journals (Sweden)

    Qiang Zhang

    2018-05-01

    Full Text Available Ionosphere research using Global Navigation Satellite System (GNSS) techniques is a hot topic, given their unprecedentedly high temporal and spatial sampling rates. We introduced a new GNSS Ionosphere Monitoring and Analysis Software (GIMAS) in order to model global ionosphere vertical total electron content (VTEC) maps and to estimate the GPS and GLObalnaya NAvigatsionnaya Sputnikovaya Sistema (GLONASS) satellite and receiver differential code biases (DCBs). The GIMAS-based Global Ionosphere Map (GIM) products during low (day of year 202 to 231, in 2008) and high (day of year 050 to 079, in 2014) solar activity periods were investigated and assessed. The results showed that the biases of the GIMAS-based VTEC maps relative to the International GNSS Service (IGS) Ionosphere Associate Analysis Centers (IAACs) VTEC maps ranged from −3.0 to 1.0 TECU (TEC unit; 1 TECU = 1 × 10^16 electrons/m^2). The standard deviations (STDs) ranged from 0.7 to 1.9 TECU in 2008, and from 2.0 to 8.0 TECU in 2014. The STDs at low latitude were significantly larger than those at middle and high latitudes, as a result of the ionospheric latitudinal gradients. When compared with the Jason-2 VTEC measurements, the GIMAS-based VTEC maps showed a negative systematic bias of about −1.8 TECU in 2008, and a positive systematic bias of about +2.2 TECU in 2014. The STDs were about 2.0 TECU in 2008, and ranged from 2.2 to 8.5 TECU in 2014. Furthermore, the aforementioned characteristics were strongly related to the conditions of the ionosphere variation and the geographic latitude. The GPS and GLONASS satellite and receiver P1-P2 DCBs were compared with the IAACs DCBs. The root mean squares (RMSs) were 0.16–0.20 ns in 2008 and 0.13–0.25 ns in 2014 for the GPS satellites and 0.26–0.31 ns in 2014 for the GLONASS satellites. The RMSs of receiver DCBs were 0.21–0.42 ns in 2008 and 0.33–1.47 ns in 2014 for GPS and 0.67–0.96 ns in 2014 for GLONASS. The monthly
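
    The bias, STD and RMS figures quoted above are straightforward statistics of map differences; the sketch below shows the computation on a small placeholder grid, not on the actual GIMAS or IAAC products.

```python
# Bias, standard deviation, and RMS of the differences between two gridded VTEC maps (TECU).
import numpy as np

gimas_vtec = np.array([[12.1, 14.3], [18.7, 25.2]])   # placeholder grid, TECU
iaac_vtec  = np.array([[12.9, 14.0], [19.5, 26.8]])   # placeholder reference grid, TECU

diff = gimas_vtec - iaac_vtec
print(f"bias = {diff.mean():.2f} TECU, STD = {diff.std(ddof=1):.2f} TECU, "
      f"RMS = {np.sqrt(np.mean(diff**2)):.2f} TECU")
```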

  15. The consequences of "Culture's consequences"

    DEFF Research Database (Denmark)

    Knudsen, Fabienne; Froholdt, Lisa Loloma

    2009-01-01

    In this article, it is claimed that research on cross-cultural crews is dominated by one specific understanding of the concept of culture, which is static, evenly distributed and context-independent. Such a conception of culture may bring some basic order while facing an unknown culture...... review of the theory of Geert Hofstede, the most renowned representative of this theoretical approach. The practical consequences of using such a concept of culture are then analysed by means of a critical review of an article applying Hofstede to cross-cultural crews in seafaring. Finally, alternative...... views on culture are presented. The aim of the article is, rather than to promote any specific theory, to reflect about diverse perspectives of cultural sense-making in cross-cultural encounters. Publication date: October...

  16. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  17. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T. [Universities Space Research Association, Washington, DC (United States); Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States); Chen, M. [Yale Univ., New Haven, CT (United States)] [and others

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  18. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  19. Software Reviews.

    Science.gov (United States)

    Davis, Shelly J., Ed.; Knaupp, Jon, Ed.

    1984-01-01

    Reviewed is computer software on: (1) classification of living things, a tutorial program for grades 5-10; and (2) polynomial practice using tiles, a drill-and-practice program for algebra students. (MNS)

  20. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Three pieces of computer software are described and reviewed: HyperCard, to build and use varied applications; Iggy's Gnees, for problem solving with shapes in grades kindergarten-two; and Algebra Shop, for practicing skills and problem solving. (MNS)

  1. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960s there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent cases where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  2. inventory management, VMI, software agents, MDV model

    Directory of Open Access Journals (Sweden)

    Waldemar Wieczerzycki

    2012-03-01

    Full Text Available Background: As is well known, the implementation of logistics management instruments is only possible with the use of the latest information technology. So-called agent technology is one of the most promising solutions in this area. Its essence consists in an entirely new way of distributing software on the computer network platform, in which computers exchange among themselves not only data but also software modules, called agents. The first aim is to propose an alternative method of implementing the concept of inventory management by the supplier with the use of intelligent software agents, which are able not only to transfer information but also to make autonomous decisions based on the privileges given to them. The second aim of this research was to propose a new model of a software agent that offers both high mobility and high intelligence. Methods: After a brief discussion of the nature of agent technology, the most important benefits of using it to build platforms to support business are given. Then the original model of a polymorphic software agent, called the Multi-Dimensionally Versioned Software Agent (MDV), is presented, which is oriented toward the specifics of IT applications in business. The MDV agent is polymorphic, which allows only the most relevant parts of its code to be transmitted through the network, and only when necessary. Consequently, the network nodes exchange small amounts of software code, which ensures high mobility of software agents, and thus highly efficient operation of IT platforms built on the proposed model. Next, the adaptation of MDV software agents to the implementation of the well-known logistics management instrument VMI (Vendor Managed Inventory) is illustrated. Results: The key benefits of this approach are identified, among which one can distinguish: reduced costs, higher flexibility and efficiency, new functionality - especially addressed to business negotiation, full automation
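
    Purely as a toy illustration of the idea that only the relevant part of an agent's code crosses the network, the sketch below serializes a single module on request; it is not the MDV agent model or its implementation, and all names and module contents are invented.

```python
# Toy sketch: ship only the module a remote node actually needs, not the whole agent.
import pickle

class Agent:
    def __init__(self, modules):
        self.modules = modules                  # module name -> source (or bytecode) variant

    def payload_for(self, task):
        """Serialize only the module variant relevant to the requested task."""
        return pickle.dumps({task: self.modules[task]})

agent = Agent({
    "negotiate": "def negotiate(offer): return offer * 0.95   # hypothetical logic",
    "replenish": "def replenish(stock, reorder): return max(reorder - stock, 0)",
})

wire_bytes = agent.payload_for("replenish")     # only the inventory module crosses the network
print(len(wire_bytes), "bytes sent instead of the full agent")
```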

  3. A Software Module for High-Accuracy Calibration of Rings and Cylinders on CMM using Multi-Orientation Techniques (Multi-Step and Reversal methods)

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    The Centre for Geometrical Metrology (CGM) at the Technical University of Denmark takes care of free form measurements, in collaboration with DIMEG, University of Padova, Italy. The present report describes a software module, ROUNDCAL, to be used for high-accuracy calibration of rings and cylinders.... The purpose of the software is to calculate the form error and the least-squares circle of rings and cylinders by averaging pointwise measuring results coming from so-called multi-orientation techniques (both reversal and multi-step methods) in order to eliminate systematic errors of the CMM.
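
    A least-squares circle fit combined with point-wise averaging over orientations can be sketched as follows; the probed coordinates are hypothetical and the code is not ROUNDCAL itself.

```python
# Algebraic (Kasa) least-squares circle fit applied to a profile averaged over two orientations.
import numpy as np

def lsq_circle(x, y):
    """Least-squares circle: returns centre (a, b) and radius r."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return a, b, np.sqrt(c + a**2 + b**2)

# Two hypothetical probing runs of the same ring in different orientations on the CMM;
# averaging point-wise (after alignment) is intended to suppress systematic machine errors.
run1 = np.array([[10.002, 0.001], [0.000, 9.999], [-10.001, 0.002], [0.001, -10.003]])
run2 = np.array([[9.998, -0.001], [0.002, 10.001], [-9.999, -0.002], [-0.001, -9.997]])
mean_profile = (run1 + run2) / 2.0

a, b, r = lsq_circle(mean_profile[:, 0], mean_profile[:, 1])
print(f"centre = ({a:.4f}, {b:.4f}), radius = {r:.4f}")
```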

  4. and consequences

    Directory of Open Access Journals (Sweden)

    P. Athanasopoulou

    2011-01-01

    Full Text Available (a) Purpose: The purpose of this research is to identify the types of CSR initiatives employed by sports organisations; their antecedents, and their consequences for the company and society. (b) Design/methodology/approach: This study is exploratory in nature. Two detailed case studies were conducted involving the football team and the basketball team of one professional, premier league club in Greece and their CSR initiatives. Both teams have the same name, they belong to one of the most popular teams in Greece with a large fan population; both have competed in international competitions (UEFA's Champions League; the Final Four of the European Tournament) and have realised many CSR initiatives in the past. The case studies involved in-depth, personal interviews of managers responsible for CSR in each team. Case study data was triangulated with documentation and a search of published material concerning CSR actions. Data was analysed with content analysis. (c) Findings: Both teams investigated have undertaken various CSR activities in the last 5 years, the football team significantly more than the basketball team. Major factors that affect CSR activity include pressure from leagues; sponsors; the local community, and global organisations; orientation towards fulfilling their duty to society, and team CSR strategy. Major benefits from CSR include relief of vulnerable groups and philanthropy as well as a better reputation for the firm; an increase in the fan base; and finding sponsors more easily due to the social profile of the team. However, those benefits are not measured in any way, although both teams observe increases in tickets sold, web site traffic and TV viewing statistics after CSR activities. Finally, promotion of CSR is mainly done through web sites; press releases; newspapers, and word-of-mouth communications. (d) Research limitations/implications: This study involves only two case studies and has limited generalisability. Future research can extend the

  5. The Effect of Dynamic Geometry Software on the Vocational High School Students' Success for Teaching Bisector and the Median Concepts

    Directory of Open Access Journals (Sweden)

    Mihriban HACISALİHOĞLU KARADENİZ

    2014-12-01

    Full Text Available The aim of this study is to examine the effect of the computer-assisted teaching program Dynamic Geometer's Sketchpad on students' success with the geometry curriculum objective that the bisectors and the medians of a triangle each meet at a point. A quasi-experimental design with pretest and posttest groups was used. The research was conducted with twenty-five 10th grade students studying Computer Information Systems (CIS) at a Technical and Vocational High School. A success test consisting of 12 questions, previously formed through the observations of specialized teachers, was used as the pretest and posttest. The groups were formed with no meaningful difference between their pretest results: an experiment group, class T-10 (13 students), and a control group, class A-10A (12 students). The experiment group was taught using materials developed with the Dynamic Geometer's Sketchpad program, while the control group received no intervention. An independent samples t-test was used for comparisons between the groups, and a paired samples t-test was used for comparisons within each group. The findings revealed that Dynamic Geometer's Sketchpad was more effective on students' success than traditional teaching methods. A meaningful difference in favor of the experiment group was found in the independent samples t-test conducted on the posttest results [t(23) = 3.176, p < .05]. These findings indicate that the Dynamic Geometer's Sketchpad software used in the experiment group is more effective on students' success than the traditional teaching methods used in the control group.
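
    The two comparisons described above correspond to standard independent and paired t-tests; the sketch below shows them with SciPy on placeholder scores (chosen only so the group sizes match the reported degrees of freedom), not the study's data.

```python
# Independent and paired t-tests with SciPy; all scores are hypothetical placeholders.
from scipy import stats

experiment_post = [78, 85, 90, 72, 88, 81, 79, 92, 86, 80, 84, 77, 83]  # 13 placeholder scores
control_post    = [65, 70, 62, 74, 68, 71, 66, 69, 73, 64, 67, 72]      # 12 placeholder scores

# Independent samples t-test: experiment vs. control post-test scores (df = 23 here).
t_ind, p_ind = stats.ttest_ind(experiment_post, control_post)

# Paired samples t-test: pre- vs. post-test within the experiment group (placeholder pre-test scores).
experiment_pre = [60, 66, 71, 58, 69, 63, 61, 74, 67, 62, 65, 59, 64]
t_rel, p_rel = stats.ttest_rel(experiment_pre, experiment_post)

print(f"between groups: t = {t_ind:.2f}, p = {p_ind:.3f}")
print(f"within group:   t = {t_rel:.2f}, p = {p_rel:.3f}")
```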

  6. Trends in software testing

    CERN Document Server

    Mohanty, J; Balakrishnan, Arunkumar

    2017-01-01

    This book is focused on the advancements in the field of software testing and the innovative practices that the industry is adopting. Considering the widely varied nature of software testing, the book addresses contemporary aspects that are important for both academia and industry. There are dedicated chapters on seamless high-efficiency frameworks, automation on regression testing, software by search, and system evolution management. There are a host of mathematical models that are promising for software quality improvement by model-based testing. There are three chapters addressing this concern. Students and researchers in particular will find these chapters useful for their mathematical strength and rigor. Other topics covered include uncertainty in testing, software security testing, testing as a service, test technical debt (or test debt), disruption caused by digital advancement (social media, cloud computing, mobile application and data analytics), and challenges and benefits of outsourcing. The book w...

  7. High-risk versus low-risk football game weekends: differences in problem drinking and alcohol-related consequences on college campuses in the United States.

    Science.gov (United States)

    Champion, Heather; Blocker, Jill N; Buettner, Cynthia K; Martin, Barbara A; Parries, Maria; Mccoy, Thomas P; Mitra, Ananda; Andrews, David W; Rhodes, Scott D

    2009-01-01

    Collegiate football games provide multiple social opportunities for alcohol use by students over the course of the weekend. The goal of this study was to examine alcohol use and alcohol-related consequences on football game weekends to determine differences based on characteristics of the game. A random sample of students from two large, public universities in the United States completed a survey on the Sunday-Friday following a high-risk weekend (HRW, important, home game) and low-risk weekend (LRW, no home game or game of importance) (N = 3,238 total). The survey measured the number of days students drank (0-3) and got drunk (0-3) over the weekend and whether 1+ consequences were experienced due to one's own drinking (yes/no) and due to others' drinking (yes/no). Ordinal logistic regression analyses revealed greater odds of drinking alcohol (OR = 1.70, CI = 1.46-1.97) and getting drunk (OR = 1.49, CI = 1.27-1.76) on HRW versus LRW. Logistic regression analyses revealed greater odds of experiencing 1+ consequences as a result of one's own drinking (OR = 1.38, CI = 1.16-1.63) and experiencing 1+ consequences as a result of others' drinking (OR = 1.52, CI = 1.30-1.78) on HRW versus LRW. These findings suggest that additional prevention efforts aimed at reducing risky drinking are needed over HRW and have implications for campus administrators, law enforcement, and substance abuse program coordinators.
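
    As a rough, hypothetical illustration of the second set of analyses described above (binary outcome, odds ratio with confidence interval), the sketch below fits a logistic regression of a consequence indicator on a high-risk-weekend indicator; the synthetic data and the use of statsmodels are our assumptions, not part of the study.

        # Hedged sketch: odds of experiencing 1+ drinking consequences on a
        # high-risk weekend (HRW) versus a low-risk weekend, with synthetic data.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 1000
        hrw = rng.integers(0, 2, size=n)                 # 1 = high-risk weekend
        p = np.where(hrw == 1, 0.35, 0.25)               # higher consequence rate on HRW
        consequence = rng.binomial(1, p)
        data = pd.DataFrame({"hrw": hrw, "consequence": consequence})

        result = smf.logit("consequence ~ hrw", data=data).fit(disp=False)
        odds_ratio = np.exp(result.params["hrw"])        # exponentiated coefficient
        ci_low, ci_high = np.exp(result.conf_int().loc["hrw"])
        print(f"OR = {odds_ratio:.2f}, 95% CI = {ci_low:.2f}-{ci_high:.2f}")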

  8. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature of a viable high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements and software design, and verifying the code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)
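
    The information-hiding decomposition mentioned above can be pictured with a small, purely illustrative sketch (our own, not the cited shutdown-system code): callers depend only on a narrow interface, so a change to the hidden setpoint representation stays confined to one module.

        # Illustrative sketch of information hiding (not the actual Wolsung code):
        # the trip decision is exposed through a narrow interface, while the
        # setpoint representation stays hidden inside the module.
        class TripLogic:
            """Hides how trip setpoints are stored and interpolated."""

            def __init__(self):
                # Hidden design decision: a simple table of (power_level, setpoint_kPa).
                self._setpoints = [(0.0, 12000.0), (0.5, 11500.0), (1.0, 11000.0)]

            def _setpoint_for(self, power_level: float) -> float:
                # Hidden detail: nearest lower entry; could change to interpolation
                # without affecting any caller.
                applicable = [s for p, s in self._setpoints if p <= power_level]
                return applicable[-1] if applicable else self._setpoints[0][1]

            def is_trip_required(self, power_level: float, pressure_kPa: float) -> bool:
                """The only fact callers may rely on."""
                return pressure_kPa < self._setpoint_for(power_level)

        # A caller never sees the table, so replacing it affects only this module.
        logic = TripLogic()
        print(logic.is_trip_required(power_level=0.8, pressure_kPa=11200.0))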

  9. Phenomenological consequences of supersymmetry

    International Nuclear Information System (INIS)

    Hinchliffe, I.; Littenberg, L.

    1982-01-01

    This paper deals with the phenomenological consequences of supersymmetric theories, and with the implications of such theories for future high energy machines. The paper represents the work of a subgroup at the meeting. The authors are concerned only with high energy predictions of supersymmetry; low energy consequences (for example in the K⁰K̄⁰ system) are discussed in the context of future experiments by another group, and will be mentioned briefly only in the context of constraining existing models. However a brief section is included on the implication for proton decay, although detailed experimental questions are not discussed

  10. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  11. Low-Probability High-Consequence (LPHC) Failure Events in Geologic Carbon Sequestration Pipelines and Wells: Framework for LPHC Risk Assessment Incorporating Spatial Variability of Risk

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Curtis M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Budnitz, Robert J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-08-31

    If Carbon dioxide Capture and Storage (CCS) is to be effective in mitigating climate change, it will need to be carried out on a very large scale. This will involve many thousands of miles of dedicated high-pressure pipelines in order to transport many millions of tonnes of CO2 annually, with the CO2 delivered to many thousands of wells that will inject the CO2 underground. The new CCS infrastructure could rival in size the current U.S. upstream natural gas pipeline and well infrastructure. This new infrastructure entails hazards for life, health, animals, the environment, and natural resources. Pipelines are known to rupture due to corrosion, from external forces such as impacts by vehicles or digging equipment, by defects in construction, or from the failure of valves and seals. Similarly, wells are vulnerable to catastrophic failure due to corrosion, cement degradation, or operational mistakes. While most accidents involving pipelines and wells will be minor, there is the inevitable possibility of accidents with very high consequences, especially to public health. The most important consequence of concern is CO2 release to the environment in concentrations sufficient to cause death by asphyxiation to nearby populations. Such accidents are thought to be very unlikely, but of course they cannot be excluded, even if major engineering effort is devoted (as it will be) to keeping their probability low and their consequences minimized. This project has developed a methodology for analyzing the risks of these rare but high-consequence accidents, using a step-by-step probabilistic methodology. A key difference between risks for pipelines and wells is that the former are spatially distributed along the pipe whereas the latter are confined to the vicinity of the well. Otherwise, the methodology we develop for risk assessment of pipeline and well failures is similar and provides an analysis both of the annual probabilities of
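
    One simple way to see the spatial character of pipeline risk noted above is a Poisson-style sketch (our own illustration, not the project's methodology): with a per-kilometre annual rupture rate, the probability of at least one rupture somewhere along a segment grows with its length, whereas a well is treated as a point source with a single annual failure probability. The rates below are made-up placeholders.

        import math

        # Hypothetical annual rupture rate per kilometre of CO2 pipeline and a
        # hypothetical annual well-failure probability (placeholder values only).
        rate_per_km_per_year = 1.0e-4
        well_failure_prob_per_year = 5.0e-5

        def pipeline_rupture_prob(length_km: float, years: float = 1.0) -> float:
            """P(at least one rupture) for a homogeneous Poisson failure model."""
            return 1.0 - math.exp(-rate_per_km_per_year * length_km * years)

        # Risk is spatially distributed along the pipe: longer segments carry
        # proportionally more annual probability of a rupture somewhere.
        for length in (10, 100, 1000):
            print(f"{length:5d} km: {pipeline_rupture_prob(length):.4%} per year")
        print(f"single well: {well_failure_prob_per_year:.4%} per year")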

  12. Can everyone become highly intelligent? Cultural differences in and societal consequences of beliefs about the universal potential for intelligence.

    Science.gov (United States)

    Rattan, Aneeta; Savani, Krishna; Naidu, N V R; Dweck, Carol S

    2012-11-01

    We identify a novel dimension of people's beliefs about intelligence: beliefs about the potential to become highly intelligent. Studies 1-3 found that in U.S. American contexts, people tend to believe that only some people have the potential to become highly intelligent. In contrast, in South Asian Indian contexts, people tend to believe that most people have the potential to become highly intelligent. To examine the implications of these beliefs, Studies 4-6 measured and manipulated Americans' beliefs about the potential for intelligence and found that the belief that everyone can become highly intelligent predicted increased support for policies that distribute resources more equally across advantaged and disadvantaged social groups. These findings suggest that the belief that only some people have the potential to become highly intelligent is a culturally shaped belief, and one that can lead people to oppose policies aimed at redressing social inequality. (c) 2012 APA, all rights reserved.

  13. Leading Change: College-School Collaboration to Assimilate New Administrative Software for an Arab High School in Israel

    Science.gov (United States)

    Arar, Khalid

    2016-01-01

    The study traced the assimilation of new administrative software in an Arab school, assisted by collaboration between the school and an Arab academic teacher-training college in Israel. The research used a mixed-method paradigm. A questionnaire consisting of 81 items was administered to 55 of the school's teachers in two stages to elicit their…

  14. Climate change consequences for terrestrial ecosystem processes in NW Greenland: Results from the High Arctic Biocomplexity project

    Science.gov (United States)

    Welker, J. M.; Sullivan, P.; Rogers, M.; Sharp, E. D.; Sletten, R.; Burnham, J. L.; Hallet, B.; Hagedorn, B.; Czimiczk, C.

    2009-12-01

    Greenland is experiencing some of the fastest rates of climate warming across the Arctic, including warmer summers and increases in snowfall. The effects of these new states of Greenland are, however, uncertain, especially for carbon, nitrogen and water biogeochemical processes, soil traits, vegetation growth patterns, mineral nutrition and plant ecophysiological processes. Since 2003 we have conducted a suite of observational and experimental measurements that have been designed to understand the fundamental nature of polar desert, polar semi-desert and fen landscapes in NW Greenland. In addition, we have established a suite of experiments to ascertain ecosystem responses to warming at multiple levels (~2030 and 2050), in conjunction with added summer rain; the consequences of added snowfall (ambient, intermediate and deep); and the effects of increases in nutrient additions (added N, P and N+P), which represent extreme warming conditions. We find that: a) the soil C pools are 6-fold larger than previously measured, b) extremely old C (up to ~30k bp) which has been buried by frost cracking and frost heaving is reaching the modern atmosphere, but in only trace amounts as measured by respired 14CO2, c) warming that simulates 2030 has only a small effect on net C sequestration, but warming that simulates 2050, when combined with added summer rain, increases C sequestration by 300%, d) increases in N deposition almost immediately and completely change the vegetation composition of polar semi-deserts, shifting the NDVI values from 0.2 to 0.5 within 2 years. Our findings depict a system that is poised to contribute stronger feedbacks than previously expected as climates in NW Greenland change.

  15. Proximity to a high traffic road: glucocorticoid and life history consequences for nestling white-crowned sparrows.

    Science.gov (United States)

    Crino, O L; Van Oorschot, B Klaassen; Johnson, E E; Malisch, J L; Breuner, C W

    2011-09-01

    Roads have been associated with decreased reproductive success and biodiversity in avian communities and increased physiological stress in adult birds. Alternatively, roads may also increase food availability and reduce predator pressure. Previous studies have focused on adult birds, but nestlings may also be susceptible to the detrimental impacts of roads. We examined the effects of proximity to a road on nestling glucocorticoid activity and growth in the mountain white-crowned sparrow (Zonotrichia leucophrys oriantha). Additionally, we examined several possible indirect factors that may influence nestling corticosterone (CORT) activity secretion in relation to roads. These indirect effects include parental CORT activity, nest-site characteristics, and parental provisioning. And finally, we assessed possible fitness consequences of roads through measures of fledging success. Nestlings near roads had increased CORT activity, elevated at both baseline and stress-induced levels. Surprisingly, these nestlings were also bigger. Generally, greater corticosterone activity is associated with reduced growth. However, the hypothalamic-pituitary-adrenal axis matures through the nestling period (as nestlings get larger, HPA-activation is greater). Although much of the variance in CORT responses was explained by body size, nestling CORT responses were higher close to roads after controlling for developmental differences. Indirect effects of roads may be mediated through paternal care. Nestling CORT responses were correlated with paternal CORT responses and paternal provisioning increased near roads. Hence, nestlings near roads may be larger due to increased paternal attentiveness. And finally, nest predation was higher for nests close to the road. Roads have apparent costs for white-crowned sparrow nestlings--increased predation, and apparent benefits--increased size. The elevation in CORT activity seems to reflect both increased size (benefit) and elevation due to road

  16. Software Authentication

    International Nuclear Information System (INIS)

    Wolford, J.K.; Geelhood, B.D.; Hamilton, V.A.; Ingraham, J.; MacArthur, D.W.; Mitchell, D.J.; Mullens, J.A.; Vanier, P. E.; White, G.K.; Whiteson, R.

    2001-01-01

    The effort to define guidance for authentication of software for arms control and nuclear material transparency measurements draws on a variety of disciplines and has involved synthesizing established criteria and practices with newer methods. Challenges include the need to protect classified information that the software manipulates as well as deal with the rapid pace of innovation in the technology of nuclear material monitoring. The resulting guidance will shape the design of future systems and inform the process of authentication of instruments now being developed. This paper explores the technical issues underlying the guidance and presents its major tenets

  17. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  18. Visualization of scientific data for high energy physics: PAW, a general-purpose portable software tool for data analysis and presentation

    International Nuclear Information System (INIS)

    Brun, R.; Couet, O.; Vandoni, C.E.; Zanarini, P.

    1990-01-01

    Visualization of scientific data, although a fashionable word in the world of computer graphics, is not a new invention but is hundreds of years old. With the advent of computer graphics the visualization of scientific data has now become a well understood and widely used technology, with hundreds of applications in the most different fields, ranging from media applications to real scientific ones. In the present paper, we shall discuss the design concepts of Visualization of Scientific Data systems, in particular in the specific field of High Energy Physics. During the last twenty years, CERN has played a leading role as the focus for development of packages and software libraries to solve problems related to High Energy Physics (HEP). The results of the integration of resources from many different Laboratories can be expressed in several million lines of code written at CERN during this period of time, used at CERN and distributed to collaborating laboratories. Nowadays, this role of software developer is considered very important by the entire HEP community. In this paper a large software package, where man-machine interaction and graphics play a key role (PAW - Physics Analysis Workstation), is described. PAW is essentially an interactive system which includes many different software tools, strongly oriented towards data analysis and data presentation. Some of these tools have been available in different forms and with different human interfaces for several years. 6 figs

  19. Consequences of exchanging carbohydrates for proteins in the cholesterol metabolism of mice fed a high-fat diet.

    Directory of Open Access Journals (Sweden)

    Frédéric Raymond

    Full Text Available Consumption of low-carbohydrate, high-protein, high-fat diets lead to rapid weight loss but the cardioprotective effects of these diets have been questioned. We examined the impact of high-protein and high-fat diets on cholesterol metabolism by comparing the plasma cholesterol and the expression of cholesterol biosynthesis genes in the liver of mice fed a high-fat (HF) diet that has a high (H) or a low (L) protein-to-carbohydrate (P/C) ratio. H-P/C-HF feeding, compared with L-P/C-HF feeding, decreased plasma total cholesterol and increased HDL cholesterol concentrations at 4-wk. Interestingly, the expression of genes involved in hepatic steroid biosynthesis responded to an increased dietary P/C ratio by first down-regulation (2-d) followed by later up-regulation at 4-wk, and the temporal gene expression patterns were connected to the putative activity of SREBF1 and 2. In contrast, Cyp7a1, the gene responsible for the conversion of cholesterol to bile acids, was consistently up-regulated in the H-P/C-HF liver regardless of feeding duration. Over expression of Cyp7a1 after 2-d and 4-wk H-P/C-HF feeding was connected to two unique sets of transcription regulators. At both time points, up-regulation of the Cyp7a1 gene could be explained by enhanced activations and reduced suppressions of multiple transcription regulators. In conclusion, we demonstrated that the hypocholesterolemic effect of H-P/C-HF feeding coincided with orchestrated changes of gene expressions in lipid metabolic pathways in the liver of mice. Based on these results, we hypothesize that the cholesterol lowering effect of high-protein feeding is associated with enhanced bile acid production but clinical validation is warranted. (246 words).

  20. Consequences of exchanging carbohydrates for proteins in the cholesterol metabolism of mice fed a high-fat diet.

    Science.gov (United States)

    Raymond, Frédéric; Wang, Long; Moser, Mireille; Metairon, Sylviane; Mansourian, Robert; Zwahlen, Marie-Camille; Kussmann, Martin; Fuerholz, Andreas; Macé, Katherine; Chou, Chieh Jason

    2012-01-01

    Consumption of low-carbohydrate, high-protein, high-fat diets lead to rapid weight loss but the cardioprotective effects of these diets have been questioned. We examined the impact of high-protein and high-fat diets on cholesterol metabolism by comparing the plasma cholesterol and the expression of cholesterol biosynthesis genes in the liver of mice fed a high-fat (HF) diet that has a high (H) or a low (L) protein-to-carbohydrate (P/C) ratio. H-P/C-HF feeding, compared with L-P/C-HF feeding, decreased plasma total cholesterol and increased HDL cholesterol concentrations at 4-wk. Interestingly, the expression of genes involved in hepatic steroid biosynthesis responded to an increased dietary P/C ratio by first down-regulation (2-d) followed by later up-regulation at 4-wk, and the temporal gene expression patterns were connected to the putative activity of SREBF1 and 2. In contrast, Cyp7a1, the gene responsible for the conversion of cholesterol to bile acids, was consistently up-regulated in the H-P/C-HF liver regardless of feeding duration. Over expression of Cyp7a1 after 2-d and 4-wk H-P/C-HF feeding was connected to two unique sets of transcription regulators. At both time points, up-regulation of the Cyp7a1 gene could be explained by enhanced activations and reduced suppressions of multiple transcription regulators. In conclusion, we demonstrated that the hypocholesterolemic effect of H-P/C-HF feeding coincided with orchestrated changes of gene expressions in lipid metabolic pathways in the liver of mice. Based on these results, we hypothesize that the cholesterol lowering effect of high-protein feeding is associated with enhanced bile acid production but clinical validation is warranted. (246 words).

  1. Software Innovation - Values for a Methodology

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2013-01-01

    Innovation is a recurrent theme in public as well as academic debate, and software development plays a major role for innovation in about every sector of our economy. As a consequence, software innovation will play an increasingly important role in software development. The focus in this paper is...... for experimentation, learning, and flexibility in software projects, but how can this change be used to facilitate software innovation? This paper proposes a set of values to guide the development of a methodology to facilitate software innovation....

  2. Reviews, Software.

    Science.gov (United States)

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  3. Software Reviews.

    Science.gov (United States)

    Kinnaman, Daniel E.; And Others

    1988-01-01

    Reviews four educational software packages for Apple, IBM, and Tandy computers. Includes "How the West was One + Three x Four,""Mavis Beacon Teaches Typing,""Math and Me," and "Write On." Reviews list hardware requirements, emphasis, levels, publisher, purchase agreements, and price. Discusses the strengths…

  4. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game…

  5. Software Reviews.

    Science.gov (United States)

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  6. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    is automatically generated. Furthermore, MIAWARE software is accompanied with an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through the specially developed ontology. As a result...

  7. High-Level Design for Ultra-Fast Software Defined Radio Prototyping on Multi-Processors Heterogeneous Platforms

    OpenAIRE

    Moy , Christophe; Raulet , Mickaël

    2010-01-01

    International audience; The design of Software Defined Radio (SDR) equipment (terminals, base stations, etc.) is still very challenging. We propose here a design methodology for ultra-fast prototyping on heterogeneous platforms made of GPPs (General Purpose Processors), DSPs (Digital Signal Processors) and FPGAs (Field Programmable Gate Arrays). Relying on a component-based approach, the methodology mainly aims at automating as much as possible the design from an algorithmic validation to a mul...

  8. Server-based enterprise collaboration software improves safety and quality in high-volume PET/CT practice.

    Science.gov (United States)

    McDonald, James E; Kessler, Marcus M; Hightower, Jeremy L; Henry, Susan D; Deloney, Linda A

    2013-12-01

    With increasing volumes of complex imaging cases and rising economic pressure on physician staffing, timely reporting will become progressively challenging. Current and planned iterations of PACS and electronic medical record systems do not offer workflow management tools to coordinate delivery of imaging interpretations with the needs of the patient and ordering physician. The adoption of a server-based enterprise collaboration software system by our Division of Nuclear Medicine has significantly improved our efficiency and quality of service.

  9. When do ego threats lead to self-regulation failure? Negative consequences of defensive high self-esteem.

    Science.gov (United States)

    Lambird, Kathleen Hoffman; Mann, Traci

    2006-09-01

    High self-esteem (HSE) is increasingly recognized as heterogeneous. By measuring subtypes of HSE, the present research reevaluates the finding that HSE individuals show poor self-regulation following ego threat (Baumeister, Heatherton, & Tice, 1993). In Experiment 1, participants with HSE showed poor self-regulation after ego threat only if they also were defensive (high in self-presentation bias). In Experiment 2, two measures--self-presentation bias and implicit self-esteem--were used to subtype HSE individuals as defensive. Both operationalizations of defensive HSE predicted poor self-regulation after ego threat. The results indicate that (a) only defensive HSE individuals are prone to self-regulation failure following ego threat and (b) measures of self-presentation bias and implicit self-esteem can both be used to detect defensiveness.

  10. Noble gases in basalt glasses from a Mid-Atlantic Ridge topographic high at 14°N - geodynamic consequences

    International Nuclear Information System (INIS)

    Staudacher, T.; Sarda, P.; Richardson, S.H.; Allegre, C.J.; Sagna, I.; Dmitriev, L.V.

    1989-01-01

    We present a complete noble gas study of mid-oceanic ridge basalt glasses (MORB) from a small ridge segment, centered on an along-strike topographic elevation of the Mid-Atlantic Ridge at about 14°N. We have found the highest ⁴⁰Ar/³⁶Ar ratio ever observed for a MORB glass, i.e. 28,150±330 for sample 2ΠD40, correlated with high ¹²⁹Xe/¹³⁰Xe ratios and the highest noble gas concentrations in a so-called popping-rock, labeled 2ΠD43. The latter sample displays a ⁴He/⁴⁰Ar* ratio of 2.0-2.7, which is close to the production ratio in the mantle due to the radioactive decay of U, Th and K. Hence, this sample probably best represents the elemental noble gas ratios in the mantle, from which we have computed the ⁴He concentration in the mantle source of MORB to be 1.5×10⁻⁵ cm³ STP g⁻¹. High ⁴He/³He ratios in two of the samples from the summit of the topographic high indicate the presence of a U, Th-rich component in the mantle source, possibly old subducted oceanic crust and/or sediments, which could originate in the so-called mesosphere boundary layer. (orig.)

  11. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    Science.gov (United States)

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application - SpheroidSizer, which measures the major and minor axial length of the imaged 3D tumor spheroids automatically and accurately; calculates the volume of each individual 3D tumor spheroid; then outputs the results in two different forms in spreadsheets for easy manipulations in the subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and noisy background that often plagues automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools provide the flexibility to SpheroidSizer in dealing with various types of spheroids and diverse quality images. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model
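
    The active contour ('Snakes') idea credited above can be sketched as follows with scikit-image, which provides an active_contour implementation; this is a simplified stand-in for SpheroidSizer rather than its actual code, and the synthetic ellipse is a placeholder for a real spheroid micrograph.

        # Simplified stand-in for spheroid sizing with an active contour
        # (scikit-image assumed available); not the SpheroidSizer implementation.
        import numpy as np
        from skimage.draw import ellipse
        from skimage.filters import gaussian
        from skimage.segmentation import active_contour

        # Synthetic "spheroid": a bright ellipse on a noisy background.
        image = np.zeros((200, 200))
        rr, cc = ellipse(100, 100, 45, 60)
        image[rr, cc] = 1.0
        image = gaussian(image, sigma=3) + 0.05 * np.random.rand(200, 200)

        # Initialize the snake as a circle around the object and let it shrink
        # onto the spheroid boundary despite uneven illumination and noise.
        theta = np.linspace(0, 2 * np.pi, 200)
        init = np.column_stack([100 + 80 * np.sin(theta), 100 + 80 * np.cos(theta)])
        snake = active_contour(image, init, alpha=0.015, beta=10, gamma=0.001)

        # Estimate major/minor axial lengths from the converged contour.
        rows, cols = snake[:, 0], snake[:, 1]
        major = max(rows.max() - rows.min(), cols.max() - cols.min())
        minor = min(rows.max() - rows.min(), cols.max() - cols.min())
        print(f"major axis ~ {major:.1f} px, minor axis ~ {minor:.1f} px")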

  12. Myeloid-specific deletion of NOX2 prevents the metabolic and neurologic consequences of high fat diet.

    Directory of Open Access Journals (Sweden)

    Jennifer K Pepping

    Full Text Available High fat diet-induced obesity is associated with inflammatory and oxidative signaling in macrophages that likely participates in metabolic and physiologic impairment. One key factor that could drive pathologic changes in macrophages is the pro-inflammatory, pro-oxidant enzyme NADPH oxidase. However, NADPH oxidase is a pleiotropic enzyme with both pathologic and physiologic functions, ruling out indiscriminant NADPH oxidase inhibition as a viable therapy. To determine if targeted inhibition of monocyte/macrophage NADPH oxidase could mitigate obesity pathology, we generated mice that lack the NADPH oxidase catalytic subunit NOX2 in myeloid lineage cells. C57Bl/6 control (NOX2-FL) and myeloid-deficient NOX2 (mNOX2-KO) mice were given a high fat diet for 16 weeks, and subjected to comprehensive metabolic, behavioral, and biochemical analyses. Data show that mNOX2-KO mice had lower body weight, delayed adiposity, attenuated visceral inflammation, and decreased macrophage infiltration and cell injury in visceral adipose relative to control NOX2-FL mice. Moreover, the effects of high fat diet on glucose regulation and circulating lipids were attenuated in mNOX2-KO mice. Finally, memory was impaired and markers of brain injury increased in NOX2-FL, but not mNOX2-KO mice. Collectively, these data indicate that NOX2 signaling in macrophages participates in the pathogenesis of obesity, and reinforce a key role for macrophage inflammation in diet-induced metabolic and neurologic decline. Development of macrophage/immune-specific NOX-based therapies could thus potentially be used to preserve metabolic and neurologic function in the context of obesity.

  13. On the counterintuitive consequences of high-performance work practices in cross-border post-merger human integration

    DEFF Research Database (Denmark)

    Vasilaki, A.; Smith, Pernille; Giangreco, A.

    2012-01-01

    , such as communication, employee involvement, and team building, may not always produce the expected effects on human integration; rather, it can have the opposite effects if top management does not closely monitor the immediate results of deploying such practices. Implications for managers dealing with post......, this article investigates the impact of systemic and integrated human resource practices [i.e., high-performance work practices (HPWPs)] on human integration and how their implementation affects employees' behaviours and attitudes towards post-merger human integration. We find that the implementation of HPWPs...

  14. High-Resolution C-Arm CT and Metal Artifact Reduction Software: A Novel Imaging Modality for Analyzing Aneurysms Treated with Stent-Assisted Coil Embolization.

    Science.gov (United States)

    Yuki, I; Kambayashi, Y; Ikemura, A; Abe, Y; Kan, I; Mohamed, A; Dahmani, C; Suzuki, T; Ishibashi, T; Takao, H; Urashima, M; Murayama, Y

    2016-02-01

    Combination of high-resolution C-arm CT and novel metal artifact reduction software may contribute to the assessment of aneurysms treated with stent-assisted coil embolization. This study aimed to evaluate the efficacy of a novel Metal Artifact Reduction prototype software combined with the currently available high spatial-resolution C-arm CT prototype implementation by using an experimental aneurysm model treated with stent-assisted coil embolization. Eight experimental aneurysms were created in 6 swine. Coil embolization of each aneurysm was performed by using a stent-assisted technique. High-resolution C-arm CT with intra-arterial contrast injection was performed immediately after the treatment. The obtained images were processed with Metal Artifact Reduction. Five neurointerventional specialists reviewed the image quality before and after Metal Artifact Reduction. Observational and quantitative analyses (via image analysis software) were performed. Every aneurysm was successfully created and treated with stent-assisted coil embolization. Before Metal Artifact Reduction, coil loops protruding through the stent lumen were not visualized due to the prominent metal artifacts produced by the coils. These became visible after Metal Artifact Reduction processing. Contrast filling in the residual aneurysm was also visualized after Metal Artifact Reduction in every aneurysm. Both the observational and the quantitative analyses showed significant improvement in image quality after processing with the software. The combination of high-resolution C-arm CT and Metal Artifact Reduction enables differentiation of the coil mass, stent, and contrast material on the same image by significantly reducing the metal artifacts produced by the platinum coils. This novel image technique may improve the assessment of aneurysms treated with stent-assisted coil embolization. © 2016 by American Journal of Neuroradiology.

  15. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01

    AIRMICS (Army Institute for Research in Management Information, Communications, and Computer Sciences) report ASQBG-1-89-001, October 1988: a survey of software tools for software maintenance, including program analyzers, COBOL structuring facilities, and static code analyzers for COBOL and FORTRAN.

  16. The behavioral and health consequences of sleep deprivation among U.S. high school students: relative deprivation matters.

    Science.gov (United States)

    Meldrum, Ryan Charles; Restivo, Emily

    2014-06-01

    To evaluate whether the strength of the association between sleep deprivation and negative behavioral and health outcomes varies according to the relative amount of sleep deprivation experienced by adolescents. 2011 Youth Risk Behavior Survey data of high school students (N=15,364) were analyzed. Associations were examined on weighted data using logistic regression. Twelve outcomes were examined, ranging from weapon carrying to obesity. The primary independent variable was a self-reported measure of average number of hours slept on school nights. Participants who reported deprivations in sleep were at an increased risk of a number of negative outcomes. However, this varied considerably across different degrees of sleep deprivation. For each of the outcomes considered, those who slept less than 5 h were more likely to report negative outcomes (adjusted odds ratios ranging from 1.38 to 2.72) than those sleeping 8 or more hours. However, less extreme forms of sleep deprivation were, in many instances, unrelated to the outcomes considered. Among U.S. high school students, deficits in sleep are significantly and substantively associated with a variety of negative outcomes, and this association is particularly pronounced for students achieving fewer than 5 h of sleep at night. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Radiological assessment of the consequences of the disposal of high-level radioactive waste in subseabed sediments

    International Nuclear Information System (INIS)

    de Marsily, G.; Behrendt, V.; Ensminger, D.A.

    1987-01-01

    The radiological assessment of the seabed option consists in estimating the detriment to man and to the environment that could result from the disposal of high-level waste (HLW) within the seabed sediments in deep oceans. The assessment is made for the high-level waste (vitrified glass) produced by the reprocessing of 10⁵ tons of heavy metal from spent fuel, which represents the amount of waste generated by 3333 reactor-yr of 900-MW(electric) reactors, i.e., 3000 GW(electric) × yr. The disposal option considered is to use 14,667 steel penetrators, each of them containing five canisters of HLW glass (0.15 m³ each). These penetrators would reach a depth of 50 m in the sediments and would be placed at an average distance of 180 m from each other, requiring a disposal area on the order of 22 × 22 km. Two such potential disposal areas in the Atlantic Ocean were studied, Great Meteor East (GME) and South Nares Abyssal Plains (SNAP). A special ship design is proposed to minimize transportation accidents. Approximately 100 shipments would be necessary to dispose of the proposed amount of waste. The results of this radiological assessment seem to show that the disposal of HLW in subseabed sediments is radiologically a very acceptable option
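
    A quick consistency check on the figures quoted above (our own back-of-the-envelope arithmetic, not part of the assessment): the total glass volume carried by the penetrators and the spent fuel per reactor-year follow directly from the numbers in the abstract.

        # Back-of-the-envelope check of the abstract's figures (not from the report).
        penetrators = 14_667
        canisters_per_penetrator = 5
        canister_volume_m3 = 0.15

        glass_volume_m3 = penetrators * canisters_per_penetrator * canister_volume_m3
        print(f"total HLW glass volume: {glass_volume_m3:,.0f} m3")   # about 11,000 m3

        heavy_metal_tonnes = 1e5
        reactor_years = 3333
        print(f"spent fuel per reactor-year: {heavy_metal_tonnes / reactor_years:.0f} t HM")  # about 30 t
        print(f"electrical output: {reactor_years * 0.9:.0f} GW(electric)-yr")                # about 3000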

  18. The health of homeless people in high-income countries: descriptive epidemiology, health consequences, and clinical and policy recommendations

    Science.gov (United States)

    Fazel, Seena; Geddes, John R; Kushel, Margot

    2015-01-01

    In the European Union, more than 400 000 individuals are homeless on any one night and more than 600 000 are homeless in the USA. The causes of homelessness are an interaction between individual and structural factors. Individual factors include poverty, family problems, and mental health and substance misuse problems. The availability of low-cost housing is thought to be the most important structural determinant for homelessness. Homeless people have higher rates of premature mortality than the rest of the population, especially from suicide and unintentional injuries, and an increased prevalence of a range of infectious diseases, mental disorders, and substance misuse. High rates of non-communicable diseases have also been described with evidence of accelerated ageing. Although engagement with health services and adherence to treatments is often compromised, homeless people typically attend the emergency department more often than non-homeless people. We discuss several recommendations to improve the surveillance of morbidity and mortality in homeless people. Programmes focused on high-risk groups, such as individuals leaving prisons, psychiatric hospitals, and the child welfare system, and the introduction of national and state-wide plans that target homeless people are likely to improve outcomes. PMID:25390578

  19. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines...... the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides...... essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  20. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

    The EPIQR method is supported by a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  1. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, communication switchboards). The company has also acquired extensive knowledge and practical experience of digital long-term preservation technologies. This wide spectrum of activities puts us in a position to discuss an often overlooked aspect of digital preservation - the preservation of software programs. There are many resources dedicated to the digital preservation of data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - dynamic response to inputs - make computer programs rich compared to documents or linear multimedia. The article opens the questions that stand at the beginning of the road to permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ..., where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  2. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

    This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  3. Integrating existing software toolkits into VO system

    Science.gov (United States)

    Cui, Chenzhou; Zhao, Yong-Heng; Wang, Xiaoqian; Sang, Jian; Luo, Ze

    2004-09-01

    Virtual Observatory (VO) is a collection of interoperating data archives and software tools. Taking advantage of the latest information technologies, it aims to provide a data-intensive online research environment for astronomers all around the world. A large number of high-quality astronomical software packages and libraries are powerful and easy to use, and have been widely used by astronomers for many years. Integrating those toolkits into the VO system is a necessary and important task for the VO developers. VO architecture greatly depends on Grid and Web services, consequently the general VO integration route is "Java Ready - Grid Ready - VO Ready". In the paper, we discuss the importance of VO integration for existing toolkits and discuss the possible solutions. We introduce two efforts in the field from the China-VO project, "gImageMagick" and "Galactic abundance gradients statistical research under grid environment". We also discuss what additional work should be done to convert a Grid service to a VO service.
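
    The "Java Ready - Grid Ready - VO Ready" route is essentially about wrapping proven legacy toolkits behind service interfaces. Below is a minimal, hypothetical sketch (standard library only) of the first step: exposing an existing command-line astronomy tool as an HTTP service that a Grid or VO layer could then call; the tool name "legacy_phot" and its flag are placeholders, not a real package.

        # Hypothetical wrapper exposing a legacy command-line toolkit as an HTTP
        # service; "legacy_phot" and its flags are placeholders, not a real tool.
        import json
        import subprocess
        from http.server import BaseHTTPRequestHandler, HTTPServer
        from urllib.parse import urlparse, parse_qs

        class ToolkitHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                query = parse_qs(urlparse(self.path).query)
                target = query.get("target", ["M31"])[0]
                # Delegate the real work to the existing, well-tested toolkit.
                try:
                    proc = subprocess.run(["legacy_phot", "--target", target],
                                          capture_output=True, text=True)
                    result = {"target": target, "returncode": proc.returncode,
                              "output": proc.stdout}
                except FileNotFoundError:
                    result = {"target": target,
                              "error": "legacy_phot is not installed on this host"}
                body = json.dumps(result).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)

        if __name__ == "__main__":
            # A Grid/VO layer can now call http://localhost:8000/?target=M31
            HTTPServer(("localhost", 8000), ToolkitHandler).serve_forever()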

  4. High levels of intravenous mephedrone (4-methylmethcathinone) self-administration in rats: neural consequences and comparison with methamphetamine.

    Science.gov (United States)

    Motbey, Craig P; Clemens, Kelly J; Apetz, Nadine; Winstock, Adam R; Ramsey, John; Li, Kong M; Wyatt, Naomi; Callaghan, Paul D; Bowen, Michael T; Cornish, Jennifer L; McGregor, Iain S

    2013-09-01

    Mephedrone (MMC) is a relatively new recreational drug that has rapidly increased in popularity in recent years. This study explored the characteristics of intravenous MMC self-administration in the rat, with methamphetamine (METH) used as a comparator drug. Male Sprague-Dawley rats were trained to nose poke for intravenous MMC or METH in daily 2 h sessions over a 10 d acquisition period. Dose-response functions were then established under fixed- and progressive-ratio (FR and PR) schedules over three subsequent weeks of testing. Brains were analyzed ex vivo for striatal serotonin (5-HT) and dopamine (DA) levels and metabolites, while autoradiography assessed changes in the regional density of 5-HT and serotonin transporter (SERT) and DA transporter (DAT) and induction of the inflammation marker translocator protein (TSPO). Results showed that MMC was readily and vigorously self-administered via the intravenous route. Under a FR1 schedule, peak responding for MMC was obtained at 0.1 mg/kg/infusion, versus 0.01 mg/kg/infusion for METH. Break points under a PR schedule peaked at 1 mg/kg/infusion MMC versus 0.3 mg/kg/infusion for METH. Final intakes of MMC were 31.3 mg/kg/d compared to 4 mg/kg/d for METH. Rats self-administering MMC, but not METH, gained weight at a slower rate than control rats. METH, but not MMC, self-administration elevated TSPO receptor density in the nucleus accumbens and hippocampus, while MMC, but not METH, self-administration decreased striatal 5-hydroxyindolacetic acid (5-HIAA) concentrations. In summary, MMC supported high levels of self-administration, matching or exceeding those previously reported with other drugs of abuse.

  5. Adding Cross-Platform Support to a High-Throughput Software Stack and Exploration of Vectorization Libraries

    CERN Document Server

    AUTHOR|(CDS)2258962

    This master thesis is written at the LHCb experiment at CERN. It is part of the initiative for improving software in view of the upcoming upgrade in 2021 which will significantly increase the amount of acquired data. This thesis consists of two parts. The first part is about the exploration of different vectorization libraries and their usefulness for the LHCb collaboration. The second part is about adding cross-platform support to the LHCb software stack. Here, the LHCb stack is successfully ported to ARM (aarch64) and its performance is analyzed. At the end of the thesis, the port to PowerPC(ppc64le) awaits the performance analysis. The main goal of porting the stack is the cost-performance evaluation for the different platforms to get the most cost efficient hardware for the new server farm for the upgrade. For this, selected vectorization libraries are extended to support the PowerPC and ARM platform. And though the same compiler is used, platform-specific changes to the compilation flags are required. In...

  6. Factors That Affect Software Testability

    Science.gov (United States)

    Voas, Jeffrey M.

    1991-01-01

    Software faults that infrequently affect software's output are dangerous. When a software fault causes frequent software failures, testing is likely to reveal the fault before the software is released; when the fault remains undetected during testing, it can cause disaster after the software is installed. A technique for predicting whether a particular piece of software is likely to reveal faults within itself during testing is found in [Voas91b]. A piece of software that is likely to reveal faults within itself during testing is said to have high testability. A piece of software that is not likely to reveal faults within itself during testing is said to have low testability. It is preferable to design software with higher testabilities from the outset, i.e., create software with as high a degree of testability as possible to avoid the problems of having undetected faults that are associated with low testability. Information loss is a phenomenon that occurs during program execution that increases the likelihood that a fault will remain undetected. In this paper, I identify two broad classes of information loss, define them, and suggest ways of predicting the potential for information loss to occur. This is done in order to decrease the likelihood that faults will remain undetected during testing.
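
    As a concrete, hypothetical illustration of the low-testability case described above (our example, not one from [Voas91b]), the function below contains a fault whose effect on the output is masked for almost every input, so random testing is unlikely to reveal it.

        # Hypothetical low-testability example: the faulty leap-year test only
        # changes the output when year is a century non-leap year AND day == 366,
        # so the fault hides behind information loss for nearly all inputs.
        def day_is_valid(day: int, year: int) -> bool:
            leap = (year % 4 == 0)            # FAULT: ignores the 100/400-year rule
            # Correct rule: leap = (year % 4 == 0 and year % 100 != 0) or year % 400 == 0
            limit = 366 if leap else 365
            return 1 <= day <= limit          # many internal states map to one boolean

        # The fault is revealed only for inputs like (366, 1900) or (366, 2100):
        print(day_is_valid(366, 2000))   # True  (correct: 2000 is a leap year)
        print(day_is_valid(366, 1900))   # True  (wrong: 1900 was not a leap year)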

  7. Software quality assurance: in large scale and complex software-intensive systems

    NARCIS (Netherlands)

    Mistrik, I.; Soley, R.; Ali, N.; Grundy, J.; Tekinerdogan, B.

    2015-01-01

    Software Quality Assurance in Large Scale and Complex Software-intensive Systems presents novel and high-quality research related approaches that relate the quality of software architecture to system requirements, system architecture and enterprise-architecture, or software testing. Modern software

  8. The Ettention software package

    International Nuclear Information System (INIS)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-01-01

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.

  9. The Ettention software package

    Energy Technology Data Exchange (ETDEWEB)

    Dahmen, Tim, E-mail: Tim.Dahmen@dfki.de [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany); Marsalek, Lukas [Eyen SE, Na Nivách 1043/16, 141 00 Praha 4 (Czech Republic); Saarland University, 66123 Saarbrücken (Germany); Marniok, Nico [Saarland University, 66123 Saarbrücken (Germany); Turoňová, Beata [Saarland University, 66123 Saarbrücken (Germany); IMPRS-CS, Max-Planck Institute for Informatics, Campus E 1.4, 66123 Saarbrücken (Germany); Bogachev, Sviatoslav [Saarland University, 66123 Saarbrücken (Germany); Trampert, Patrick; Nickels, Stefan [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Slusallek, Philipp [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany)

    2016-02-15

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.
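
    The block-iterative scheme based on the Kaczmarz algorithm named above solves the tomographic system A x = b by cyclically projecting the estimate onto the hyperplane of each row. The sketch below is a generic, dense-matrix illustration of that iteration (not Ettention's GPU implementation), with a toy system standing in for the real projection operator.

        import numpy as np

        def kaczmarz(A, b, sweeps=50, relaxation=1.0, x0=None):
            """Generic Kaczmarz iteration for A x = b (dense illustration only)."""
            m, n = A.shape
            x = np.zeros(n) if x0 is None else x0.astype(float).copy()
            row_norms = np.einsum("ij,ij->i", A, A)           # ||a_i||^2 per row
            for _ in range(sweeps):
                for i in range(m):                            # one projection per row
                    if row_norms[i] == 0:
                        continue
                    residual = b[i] - A[i] @ x
                    x += relaxation * (residual / row_norms[i]) * A[i]
            return x

        # Toy "projection" system standing in for the tomography operator.
        rng = np.random.default_rng(1)
        A = rng.random((60, 30))
        x_true = rng.random(30)
        b = A @ x_true
        x_rec = kaczmarz(A, b, sweeps=200)
        print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))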

  10. MEGADOCK 4.0: an ultra-high-performance protein-protein docking software for heterogeneous supercomputers.

    Science.gov (United States)

    Ohue, Masahito; Shimoda, Takehiro; Suzuki, Shuji; Matsuzaki, Yuri; Ishida, Takashi; Akiyama, Yutaka

    2014-11-15

    The application of protein-protein docking in large-scale interactome analysis is a major challenge in structural bioinformatics and requires huge computing resources. In this work, we present MEGADOCK 4.0, an FFT-based docking software that makes extensive use of recent heterogeneous supercomputers and shows powerful, scalable performance of >97% strong scaling. MEGADOCK 4.0 is written in C++ with OpenMPI and NVIDIA CUDA 5.0 (or later) and is freely available to all academic and non-profit users at: http://www.bi.cs.titech.ac.jp/megadock. akiyama@cs.titech.ac.jp Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
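
    The FFT trick that underlies docking codes of this kind scores every rigid translation of a ligand grid against a receptor grid at once, by computing their correlation in Fourier space. The sketch below is a generic illustration of that idea with random 3-D grids; it is not MEGADOCK's actual scoring function.

        import numpy as np

        # Generic FFT-based translational scan (illustration only, not MEGADOCK's
        # scoring model): score(t) = sum_r R(r) * L(r - t) for all shifts t at once.
        rng = np.random.default_rng(7)
        N = 32
        receptor = rng.random((N, N, N))
        ligand = np.zeros((N, N, N))
        ligand[:8, :8, :8] = rng.random((8, 8, 8))            # small ligand grid

        # Cyclic cross-correlation of receptor and ligand over all translations.
        scores = np.real(np.fft.ifftn(np.fft.fftn(receptor) *
                                      np.conj(np.fft.fftn(ligand))))

        best = np.unravel_index(np.argmax(scores), scores.shape)
        print("best translation (voxels):", best, "score:", scores[best])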

  11. High-efficiency space-based software radio architectures & algorithms (a minimum size, weight, and power TeraOps processor)

    Energy Technology Data Exchange (ETDEWEB)

    Dunham, Mark Edward [Los Alamos National Laboratory; Baker, Zachary K [Los Alamos National Laboratory; Stettler, Matthew W [Los Alamos National Laboratory; Pigue, Michael J [Los Alamos National Laboratory; Schmierer, Eric N [Los Alamos National Laboratory; Power, John F [Los Alamos National Laboratory; Graham, Paul S [Los Alamos National Laboratory

    2009-01-01

    Los Alamos has recently completed the latest in a series of Reconfigurable Software Radios, which incorporates several key innovations in both hardware design and algorithms. Due to our focus on satellite applications, each design must extract the best size, weight, and power performance possible from the ensemble of Commodity Off-the-Shelf (COTS) parts available at the time of design. In this case we have achieved 1 TeraOps/second of signal processing on a 1920 Megabit/second datastream, while requiring only 53 W of mains power, 5.5 kg of mass, and 3 liters of volume. This processing capability enables very advanced algorithms such as our wideband RF compression scheme to operate remotely, allowing network-bandwidth-constrained applications to deliver previously unattainable performance.

  12. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  13. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-09-18

    The PEST++ Version 1 object-oriented parameter estimation code is here extended to Version 3 to incorporate additional algorithms and tools to further improve support for large and complex environmental modeling problems. PEST++ Version 3 includes the Gauss-Marquardt-Levenberg (GML) algorithm for nonlinear parameter estimation, Tikhonov regularization, integrated linear-based uncertainty quantification, options for integrated TCP/IP-based parallel run management or external independent run management by use of a Version 2 update of the GENIE Version 1 software code, and utilities for global sensitivity analyses. The Version 3 code design is consistent with PEST++ Version 1 and continues to be designed to lower the barriers to entry for users as well as developers while providing efficient and optimized algorithms capable of accommodating large, highly parameterized inverse problems. As such, this effort continues the original focus of (1) implementing the most popular and powerful features of the PEST software suite in a fashion that is easy for novice or experienced modelers to use and (2) developing a software framework that is easy to extend.
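
    A hedged sketch of the kind of Gauss-Marquardt-Levenberg step with optional zero-order Tikhonov regularization that the abstract mentions; the function name, damping scheme, and toy model below are assumptions for illustration, not PEST++ internals.

```python
import numpy as np

def gml_update(params, jacobian, residual, lam=1.0, tikhonov_weight=0.0,
               preferred=None):
    """One Gauss-Marquardt-Levenberg step, optionally Tikhonov-regularised
    towards a preferred parameter set.

    jacobian : (n_obs x n_par) sensitivities of simulated values to parameters
    residual : observed minus simulated values (length n_obs)
    lam      : Marquardt damping parameter (larger = more gradient-descent-like)
    """
    n_par = jacobian.shape[1]
    preferred = params if preferred is None else preferred
    JtJ = jacobian.T @ jacobian
    rhs = jacobian.T @ residual
    if tikhonov_weight > 0.0:
        JtJ = JtJ + tikhonov_weight * np.eye(n_par)        # zero-order regularisation
        rhs = rhs + tikhonov_weight * (preferred - params)
    step = np.linalg.solve(JtJ + lam * np.diag(np.diag(JtJ)), rhs)
    return params + step

# usage sketch on a toy 2-parameter linear model y = p0 + p1 * t
t = np.linspace(0, 10, 20)
y_obs = 1.0 + 0.5 * t
p = np.array([0.0, 0.0])
for _ in range(5):
    y_sim = p[0] + p[1] * t
    J = np.column_stack([np.ones_like(t), t])
    p = gml_update(p, J, y_obs - y_sim, lam=0.01)
```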

  14. High Efficiency Traveling-Wave Tube Power Amplifier for Ka-Band Software Defined Radio on International Space Station-A Platform for Communications Technology Development

    Science.gov (United States)

    Simons, Rainee N.; Force, Dale A.; Kacpura, Thomas J.

    2013-01-01

    The design, fabrication and RF performance of the output traveling-wave tube amplifier (TWTA) for a space-based Ka-band software defined radio (SDR) is presented. The TWTA, the SDR and the supporting avionics are integrated to form a testbed, which is currently located on an exterior truss of the International Space Station (ISS). The SDR in the testbed communicates at Ka-band frequencies through a high-gain antenna directed to NASA's Tracking and Data Relay Satellite System (TDRSS), which communicates to the ground station located at White Sands Complex. The application of the testbed is to demonstrate new waveforms and software designed to enhance data delivery from scientific spacecraft, and the waveforms and software can be upgraded and reconfigured from the ground. The construction and the salient features of the Ka-band SDR are discussed. The testbed is currently undergoing on-orbit checkout and commissioning and is expected to operate for 3 to 5 years in space.

  15. Consequences of long-term power outages and high electricity prices lasting for months; Konsekvenser av langvarige stroemutfall og hoeye kraftpriser i flere maaneder

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    Several areas in the world have experienced electricity outages for longer periods of time, but the consequences of these are sparsely documented. There is a need for further analysis of the socioeconomic consequences of the outages. In addition to KILE (quality-adjusted revenue framework for unsupplied energy) costs, one has to take into account that the costs often increase proportionally with the duration of the outage, and that KILE tariffs do not reflect the lost consumer surplus for products that are not produced during an outage. A good example is public underground transport, where the company's economic loss can be significantly smaller than the loss of utility value for the travellers. If the authorities act reasonably, it is difficult to see that periods of very high prices represent a major problem. The most important problems are related to distributional effects, especially for households with a weak economy. These problems can be solved with improved contractual forms (price guarantees) or by transfers to the households, without weakening the incentives for electricity economising. (ml)

  16. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  17. Criteria for the selection of ERP software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    The implementation of an ERP software package is an important investment for an organization, and one that also carries a high degree of risk. Selecting the most appropriate software is a necessary condition for a successful implementation. This paper describes the major aspects of software selection in general and the relevant criteria in the case of ERP software.

  18. Phenomenological consequences of supersymmetry

    International Nuclear Information System (INIS)

    Hinchliffe, I.; Littenberg, L.

    1982-01-01

    This report deals with the phenomenological consequences of supersymmetric theories, and with the implications of such theories for future high energy machines. It is concerned only with high energy predictions of supersymmetry; low energy consequences (for example in the K⁰–K̄⁰ system) are discussed in the context of future experiments by another group, and will be mentioned briefly only in the context of constraining existing models. However a brief section is included on the implication for proton decay, although detailed experimental questions are not discussed. The report is organized as follows. Section 1 consists of a brief review of supersymmetry and the salient features of existing supersymmetric models; this section can be ignored by those familiar with such models since it contains nothing new. Section 2 deals with the consequences for nucleon decay of SUSY. The remaining sections then discuss the physics possibilities of various machines; e⁺e⁻ in Section 3, ep in Section 4, pp (or p̄p) colliders in Section 5 and fixed target hadron machines in Section 6

  19. Failure to Respond to Food Resource Decline Has Catastrophic Consequences for Koalas in a High-Density Population in Southern Australia.

    Directory of Open Access Journals (Sweden)

    Desley A Whisson

    Understanding the ability of koalas to respond to changes in their environment is critical for conservation of the species and their habitat. We monitored the behavioural response of koalas to declining food resources in manna gum (Eucalyptus viminalis) woodland at Cape Otway, Victoria, Australia, from September 2011 to November 2013. Over this period, koala population density increased from 10.1 to 18.4 koalas ha⁻¹. As a result of the high browsing pressure of this population, manna gum canopy condition declined, with 71.4% of manna gums being completely or highly defoliated in September 2013. Despite declining food resources, radio-collared koalas (N = 30) exhibited high fidelity to small ranges (0.4-1.2 ha). When trees became severely defoliated in September 2013, koalas moved relatively short distances from their former ranges (mean predicted change in range centroid = 144 m) and remained in areas of 0.9 to 1.0 ha. This was despite the high connectivity of most manna gum woodland, and the close proximity of the study site (< 3 km) to the contiguous mixed forest of the Great Otway National Park. Limited movement had catastrophic consequences for koalas, with 71% (15/21) of radio-collared koalas dying from starvation or being euthanased due to their poor condition between September and November 2013.

  20. Novel, Highly-Parallel Software for the Online Storage System of the ATLAS Experiment at CERN: Design and Performances

    CERN Document Server

    Colombo, T; The ATLAS collaboration

    2012-01-01

    The ATLAS experiment observes proton-proton collisions delivered by the LHC accelerator at CERN. The ATLAS Trigger and Data Acquisition (TDAQ) system selects interesting events on-line in a three-level trigger system in order to store them at a budgeted rate of several hundred Hz, for an average event size of ~1.5 MB. This paper focuses on the TDAQ data-logging system and in particular on the implementation and performance of a novel software design, reporting on the effort of exploiting the full power of multi-core hardware. In this respect, the main challenge presented by the data-logging workload is the conflict between the largely parallel nature of the event processing, including the recently introduced on-line event-compression, and the constraint of sequential file writing and checksum evaluation. This is further complicated by the necessity of operating in a fully data-driven mode, to cope with continuously evolving trigger and detector configurations. In this paper we will briefly discuss...
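
    The central tension described above, parallel event compression against strictly sequential file writing and checksum evaluation, can be illustrated with a small Python sketch; the event sizes, compressor, and checksum choice are assumptions, not the ATLAS implementation.

```python
import concurrent.futures, hashlib, zlib

def compress_event(event: bytes) -> bytes:
    return zlib.compress(event, level=6)

def log_events(events, path, n_workers=4):
    """Compress events in parallel, but keep file writing and checksumming
    strictly sequential and in submission order."""
    checksum = hashlib.md5()
    with concurrent.futures.ThreadPoolExecutor(max_workers=n_workers) as pool, \
            open(path, "wb") as out:
        # map() yields results in submission order, so the on-disk layout
        # and the running checksum stay deterministic
        for blob in pool.map(compress_event, events):
            out.write(blob)
            checksum.update(blob)
    return checksum.hexdigest()

# usage sketch with ~1.5 MB dummy events
events = [bytes([i % 256]) * 1_500_000 for i in range(8)]
print(log_events(events, "run.dat"))
```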

  1. Software product quality control

    CERN Document Server

    Wagner, Stefan

    2013-01-01

    Quality is not a fixed or universal property of software; it depends on the context and goals of its stakeholders. Hence, when you want to develop a high-quality software system, the first step must be a clear and precise specification of quality. Yet even if you get it right and complete, you can be sure that it will become invalid over time. So the only solution is continuous quality control: the steady and explicit evaluation of a product's properties with respect to its updated quality goals. This book guides you in setting up and running continuous quality control in your environment. Star

  2. Software Safety and Security

    CERN Document Server

    Nipkow, T; Hauptmann, B

    2012-01-01

    Recent decades have seen major advances in methods and tools for checking the safety and security of software systems. Automatic tools can now detect security flaws not only in programs of the order of a million lines of code, but also in high-level protocol descriptions. There has also been something of a breakthrough in the area of operating system verification. This book presents the lectures from the NATO Advanced Study Institute on Tools for Analysis and Verification of Software Safety and Security, a summer school held at Bayrischzell, Germany, in 2011. This Advanced Study Institute was

  3. Possibilities and limitations of applying software reliability growth models to safety-critical software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

    2007-01-01

    It is generally known that software reliability growth models such as the Jelinski-Moranda model and the Goel-Okumoto Non-Homogeneous Poisson Process (NHPP) model cannot be applied to safety-critical software due to a lack of software failure data. In this paper, by applying two of the most widely known software reliability growth models to sample software failure data, we demonstrate the possibility of using software reliability growth models to prove the high reliability of safety-critical software. The high sensitivity of a piece of software's reliability to software failure data, as well as a lack of sufficient software failure data, are also identified as possible limitations when applying software reliability growth models to safety-critical software.
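
    As an illustration of how such a model is applied, the sketch below fits the Goel-Okumoto NHPP mean value function to a small, invented failure history; the data and starting values are purely illustrative and are not taken from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    """NHPP mean value function: expected cumulative failures by test time t."""
    return a * (1.0 - np.exp(-b * t))

# illustrative (not real) failure data: cumulative failures vs. test hours
t = np.array([10., 20., 40., 80., 160., 320.])
cum_failures = np.array([3., 5., 8., 11., 13., 14.])

(a_hat, b_hat), _ = curve_fit(goel_okumoto, t, cum_failures, p0=(20.0, 0.01))
remaining = a_hat - cum_failures[-1]                   # expected residual faults
intensity = a_hat * b_hat * np.exp(-b_hat * t[-1])     # current failure rate
print(f"a = {a_hat:.1f}, b = {b_hat:.4f}, expected remaining faults = {remaining:.1f}")
```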

  4. Aircraft Design Software

    Science.gov (United States)

    1997-01-01

    Successful commercialization of the AirCraft SYNThesis (ACSYNT) tool has resulted in the creation of Phoenix Integration, Inc. ACSYNT has been exclusively licensed to the company, an outcome of a seven year, $3 million effort to provide unique software technology to a focused design engineering market. Ames Research Center formulated ACSYNT and in working with the Virginia Polytechnic Institute CAD Laboratory, began to design and code a computer-aided design for ACSYNT. Using a Joint Sponsored Research Agreement, Ames formed an industry-government-university alliance to improve and foster research and development for the software. As a result of the ACSYNT Institute, the software is becoming a predominant tool for aircraft conceptual design. ACSYNT has been successfully applied to high-speed civil transport configuration, subsonic transports, and supersonic fighters.

  5. Lecture 2: Software Security

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development, testing and deployment. Sebastian Lopienski is CERN’s deputy Computer Security Officer. He works on security strategy and policies; offers internal consultancy and audit services; develops and ...

  6. ACTS: from ATLAS software towards a common track reconstruction software

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00349786; The ATLAS collaboration; Salzburger, Andreas; Kiehn, Moritz; Hrdinka, Julia; Calace, Noemi

    2017-01-01

    Reconstruction of charged particles' trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is de...

  7. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  8. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    software license, software usage, ELA, Software as a Service, SaaS, Software Asset... PaaS Platform as a Service; SaaS Software as a Service; SAM Software Asset Management; SMS System Management Server; SEWP Solutions for Enterprise Wide... delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service. Software

  9. Factors that motivate software developers in Nigerian's software ...

    African Journals Online (AJOL)

    It was also observed that courtesy, good reward systems, regular training, recognition, tolerance of mistakes and good leadership were strong motivators of software developers. Keywords: Software developers, information technology, project managers, Nigeria International Journal of Natural and Applied Sciences, 6(4): ...

  10. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase[1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2,3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2,3 and 4 project will be described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designers. Outputs from the Wolsung 2,3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  11. Single software platform used for high speed data transfer implementation in a 65k pixel camera working in single photon counting mode

    International Nuclear Information System (INIS)

    Maj, P.; Kasiński, K.; Gryboś, P.; Szczygieł, R.; Kozioł, A.

    2015-01-01

    Integrated circuits designed for specific applications generally use non-standard communication methods. Hybrid pixel detector readout electronics produces a huge amount of data as a result of the high number of frames per second. The data needs to be transmitted to a higher-level system without limiting the ASIC's capabilities. Nowadays, the Camera Link interface is still one of the fastest communication methods, allowing transmission speeds up to 800 MB/s. In order to communicate between a higher-level system and the ASIC with a dedicated protocol, an FPGA with dedicated code is required. The configuration data is received from the PC and written to the ASIC. At the same time, the same FPGA should be able to transmit the data from the ASIC to the PC at very high speed. The camera should be an embedded system enabling autonomous operation and self-monitoring. In the presented solution, at least three different hardware platforms are used: an FPGA, a microprocessor with a real-time operating system, and a PC with end-user software. We present the use of a single software platform for high speed data transfer from a 65k pixel camera to a personal computer

  12. Single software platform used for high speed data transfer implementation in a 65k pixel camera working in single photon counting mode

    Science.gov (United States)

    Maj, P.; Kasiński, K.; Gryboś, P.; Szczygieł, R.; Kozioł, A.

    2015-12-01

    Integrated circuits designed for specific applications generally use non-standard communication methods. Hybrid pixel detector readout electronics produces a huge amount of data as a result of the high number of frames per second. The data needs to be transmitted to a higher-level system without limiting the ASIC's capabilities. Nowadays, the Camera Link interface is still one of the fastest communication methods, allowing transmission speeds up to 800 MB/s. In order to communicate between a higher-level system and the ASIC with a dedicated protocol, an FPGA with dedicated code is required. The configuration data is received from the PC and written to the ASIC. At the same time, the same FPGA should be able to transmit the data from the ASIC to the PC at very high speed. The camera should be an embedded system enabling autonomous operation and self-monitoring. In the presented solution, at least three different hardware platforms are used: an FPGA, a microprocessor with a real-time operating system, and a PC with end-user software. We present the use of a single software platform for high speed data transfer from a 65k pixel camera to a personal computer.
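
    A hedged sketch of the PC-side receive path only: fixed-size frames are reassembled from a stream and buffered so that disk writes never stall the link. The frame size, endpoint, and transport are assumptions; neither the Camera Link interface nor the authors' protocol is reproduced here.

```python
import queue, socket, threading

FRAME_BYTES = 65536 * 2              # assumption: 65k pixels, 2 bytes per pixel
frames = queue.Queue(maxsize=256)    # decouples link speed from disk speed

def receive_frames(host, port):
    """Reassemble fixed-size frames from a TCP stream and queue them."""
    with socket.create_connection((host, port)) as sock:
        while True:
            buf = bytearray()
            while len(buf) < FRAME_BYTES:
                chunk = sock.recv(FRAME_BYTES - len(buf))
                if not chunk:          # link closed
                    return
                buf.extend(chunk)
            frames.put(bytes(buf))

def write_frames(path):
    """Drain the queue sequentially so slow disk writes never block the receiver."""
    with open(path, "ab") as out:
        while True:
            out.write(frames.get())

# usage sketch (hypothetical endpoint and output file)
threading.Thread(target=receive_frames, args=("192.168.1.10", 5000), daemon=True).start()
threading.Thread(target=write_frames, args=("frames.raw",), daemon=True).start()
```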

  13. Modularity analysis of automotive control software

    OpenAIRE

    Dajsuren, Y.; Brand, van den, M.G.J.; Serebrenik, A.

    2013-01-01

    A design language and tool like MATLAB/Simulink is used for the graphical modelling and simulation of automotive control software. As the functionality based on electronics and software systems increases in motor vehicles, it is becoming increasingly important for system/software architects and control engineers in the automotive industry to ensure the quality of the highly complex MATLAB/Simulink control software. For automotive software, modularity is recognized as being a crucial quality a...

  14. The software development process in worldwide collaborations

    International Nuclear Information System (INIS)

    Amako, K.

    1998-01-01

    High energy physics experiments at future colliders are inevitably large scale international collaborations. In these experiments, software development has to be done by a large number of physicists, software engineers and computer scientists, dispersed all over the world. The major subject of this paper is to discuss various aspects of software development in this worldwide environment. These include software engineering and methodology, software development process and management. (orig.)

  15. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  16. High feather mercury concentrations in the wandering albatross are related to sex, breeding status and trophic ecology with no demographic consequences

    Energy Technology Data Exchange (ETDEWEB)

    Bustamante, Paco, E-mail: pbustama@univ-lr.fr [Littoral Environnement et Sociétés (LIENSs), UMR 7266 CNRS-Université de la Rochelle, 2 rue Olympe de Gouges, 17000 La Rochelle (France); Carravieri, Alice [Littoral Environnement et Sociétés (LIENSs), UMR 7266 CNRS-Université de la Rochelle, 2 rue Olympe de Gouges, 17000 La Rochelle (France); Centre d’Etudes Biologiques de Chizé (CEBC), UMR 7372 du Centre National de la Recherche Scientifique-Université de La Rochelle, 79360 Villiers-en-Bois (France); Goutte, Aurélie [Centre d’Etudes Biologiques de Chizé (CEBC), UMR 7372 du Centre National de la Recherche Scientifique-Université de La Rochelle, 79360 Villiers-en-Bois (France); École Pratique des Hautes Études (EPHE), SPL, UPMC Université Paris 06, UMR 7619 METIS, F-75005, 4 place Jussieu, Paris (France); Barbraud, Christophe; Delord, Karine; Chastel, Olivier; Weimerskirch, Henri; Cherel, Yves [Centre d’Etudes Biologiques de Chizé (CEBC), UMR 7372 du Centre National de la Recherche Scientifique-Université de La Rochelle, 79360 Villiers-en-Bois (France)

    2016-01-15

    Hg can affect the physiology of seabirds and ultimately their demography, particularly if they are top consumers. In the present study, body feathers of >200 wandering albatrosses from Possession Island in the Crozet archipelago were used to explore the potential demographic effects of the long-term exposure to Hg on an apex predator. Variations of Hg with sex, age class, foraging habitat (inferred from δ¹³C values), and feeding habits (inferred from δ¹⁵N values) were examined as well as the influence of Hg on current breeding output, long-term fecundity and survival. Wandering albatrosses displayed among the highest Hg feather concentrations reported for seabirds, ranging from 5.9 to 95 µg g⁻¹, as a consequence of their high trophic position (δ¹⁵N values). These concentrations fall within the same range as those of other wandering albatross populations from subantarctic sites, suggesting that this species has similar exposure to Hg all around the Southern Ocean. In both immature and adult albatrosses, females had higher Hg concentrations than males (28 vs. 20 µg g⁻¹ dw on average, respectively), probably as a consequence of females foraging at lower latitudes than males (δ¹³C values). Hg concentrations were higher in immature than in adult birds, and they remained fairly constant across a wide range of ages in adults. Such high levels in immature individuals question (i) the frequency of moult in young birds, (ii) the efficiency of Hg detoxification processes in immatures compared to adults, and (iii) importantly the potential detrimental effects of Hg in early life. Despite very high Hg concentrations in their feathers, neither effects on adults' breeding probability, hatching failure and fledgling failure, nor on adults' survival rate were detected, suggesting that long-term bioaccumulated Hg was not under a chemical form leading to deleterious effects on reproductive parameters in adult individuals

  17. High feather mercury concentrations in the wandering albatross are related to sex, breeding status and trophic ecology with no demographic consequences

    International Nuclear Information System (INIS)

    Bustamante, Paco; Carravieri, Alice; Goutte, Aurélie; Barbraud, Christophe; Delord, Karine; Chastel, Olivier; Weimerskirch, Henri; Cherel, Yves

    2016-01-01

    Hg can affect the physiology of seabirds and ultimately their demography, particularly if they are top consumers. In the present study, body feathers of >200 wandering albatrosses from Possession Island in the Crozet archipelago were used to explore the potential demographic effects of the long-term exposure to Hg on an apex predator. Variations of Hg with sex, age class, foraging habitat (inferred from δ¹³C values), and feeding habits (inferred from δ¹⁵N values) were examined as well as the influence of Hg on current breeding output, long-term fecundity and survival. Wandering albatrosses displayed among the highest Hg feather concentrations reported for seabirds, ranging from 5.9 to 95 µg g⁻¹, as a consequence of their high trophic position (δ¹⁵N values). These concentrations fall within the same range as those of other wandering albatross populations from subantarctic sites, suggesting that this species has similar exposure to Hg all around the Southern Ocean. In both immature and adult albatrosses, females had higher Hg concentrations than males (28 vs. 20 µg g⁻¹ dw on average, respectively), probably as a consequence of females foraging at lower latitudes than males (δ¹³C values). Hg concentrations were higher in immature than in adult birds, and they remained fairly constant across a wide range of ages in adults. Such high levels in immature individuals question (i) the frequency of moult in young birds, (ii) the efficiency of Hg detoxification processes in immatures compared to adults, and (iii) importantly the potential detrimental effects of Hg in early life. Despite very high Hg concentrations in their feathers, neither effects on adults' breeding probability, hatching failure and fledgling failure, nor on adults' survival rate were detected, suggesting that long-term bioaccumulated Hg was not under a chemical form leading to deleterious effects on reproductive parameters in adult individuals. - Highlights: • Immature

  18. Future of Software Engineering Standards

    Science.gov (United States)

    Poon, Peter T.

    1997-01-01

    In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools as well as consumer-oriented systems.

  19. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their

  20. Consequences of high-x proton size fluctuations in small collision systems at √s_NN = 200 GeV

    Science.gov (United States)

    McGlinchey, D.; Nagle, J. L.; Perepelitsa, D. V.

    2016-08-01

    Recent measurements of jet production rates at large transverse momentum (pT) in the collisions of small projectiles with large nuclei at the BNL Relativistic Heavy Ion Collider (RHIC) and the CERN Large Hadron Collider indicate that they have an unexpected relationship with estimates of the collision centrality. One compelling interpretation of the data is that they capture an x_p-dependent decrease in the average interaction strength of the nucleon in the projectile undergoing a hard scattering. A weakly interacting or "shrinking" nucleon in the projectile strikes fewer nucleons in the nucleus, resulting in a particular pattern of centrality-dependent modifications to high-pT processes. We describe a simple one-parameter geometric implementation of this picture within a modified Monte Carlo Glauber model tuned to d+Au jet data, and explore two of its major consequences. First, the model predicts a particular projectile-species effect on the centrality dependence at high x_p, opposite to that expected from a final-state energy loss effect. Second, we find that some of the large centrality dependence observed for forward dihadron production in d+Au collisions at RHIC may arise from the physics of the "shrinking" projectile nucleon, in addition to impact-parameter-dependent shadowing or saturation effects at low nuclear x. We conclude that analogous measurements in recently collected p+Au and ³He+Au collision data at RHIC can provide a unique test of these predictions.
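
    The one-parameter geometric picture can be caricatured in a few lines of Monte Carlo Glauber code: sample a Woods-Saxon target and shrink the projectile nucleon's effective cross-section with x_p. The parametrisation and all numerical values below are toy assumptions, not the authors' tuned model.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_nucleus(A=197, R=6.38, a=0.54):
    """Sample transverse nucleon positions from a Woods-Saxon density (Au-like)."""
    pts = []
    r_max = 2.5 * R
    while len(pts) < A:
        r = rng.uniform(0.0, r_max)
        # rejection sampling on r^2 * rho(r), normalised by an upper bound
        if rng.uniform() < (r / r_max) ** 2 / (1.0 + np.exp((r - R) / a)):
            cos_t = rng.uniform(-1.0, 1.0)
            phi = rng.uniform(0.0, 2.0 * np.pi)
            rho = r * np.sqrt(1.0 - cos_t ** 2)
            pts.append((rho * np.cos(phi), rho * np.sin(phi)))
    return np.array(pts)

def n_struck(xp, b, sigma_nn=4.2, alpha=0.5):
    """Struck target nucleons for one projectile nucleon at impact parameter b (fm),
    with a single parameter alpha shrinking the cross-section at high x_p."""
    sigma_eff = sigma_nn * (1.0 - alpha * xp)      # fm^2, toy parametrisation
    d2_max = sigma_eff / np.pi                     # black-disk collision criterion
    nucleus = sample_nucleus()
    d2 = (nucleus[:, 0] - b) ** 2 + nucleus[:, 1] ** 2
    return int(np.sum(d2 < d2_max))

print(n_struck(xp=0.1, b=2.0), n_struck(xp=0.7, b=2.0))
```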

  1. Towards an Evaluation Framework for Software Process Improvement

    OpenAIRE

    Cheng, Chow Kian; Permadi, Rahadian Bayu

    2009-01-01

    Software has gained an essential role in our daily life in the last decades. This condition demands high quality software. To produce high quality software, many practitioners and researchers put more attention on the software development process. Large investments are poured into improving the software development process. Software Process Improvement (SPI) is a research area which is aimed at addressing the assessment and improvement issues in the software development process. One of the most impor...

  2. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  3. Multi-atom resonant photoemission and the development of next-generation software and high-speed detectors for electron spectroscopy

    International Nuclear Information System (INIS)

    Kay, Alexander William

    2000-01-01

    This dissertation has involved the exploration of a new effect in photoelectron emission, multi-atom resonant photoemission (MARPE), as well as the development of new software, data analysis techniques, and detectors of general use in such research. We present experimental and theoretical results related to MARPE, in which the photoelectron intensity from a core level on one atom is influenced by a core-level absorption resonance on another. We point out that some of our and others' prior experimental data has been strongly influenced by detector non-linearity and that the effects seen in new corrected data are smaller and of different form. Corrected data for the MnO(001) system with resonance between the O 1s and Mn 2p energy levels are found to be well described by an extension of well-known intraatomic resonant photoemission theory to the interatomic case, provided that interactions beyond the usual second-order Kramers-Heisenberg treatment are included. This theory is also found to simplify under certain conditions so as to yield results equivalent to a classical x-ray optical approach, with the latter providing an accurate and alternative, although less detailed and general, physical picture of these effects. Possible future applications of MARPE as a new probe of near-neighbor identities and bonding and its relationship to other known effects are also discussed. We also consider in detail specially written data acquisition software that has been used for most of the measurements reported here. This software has been used with an existing experimental system to develop the method of detector characterization and then data correction required for the work described above. The development of a next generation one-dimensional, high-speed, electron detector is also discussed. Our goal has been to design, build and test a prototype high-performance, one-dimensional pulse-counting detector that represents a significant advancement in detector technology and is well

  4. Multi-atom resonant photoemission and the development of next-generation software and high-speed detectors for electron spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Kay, Alexander William [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2000-09-01

    This dissertation has involved the exploration of a new effect in photoelectron emission, multi-atom resonant photoemission (MARPE), as well as the development of new software, data analysis techniques, and detectors of general use in such research. We present experimental and theoretical results related to MARPE, in which the photoelectron intensity from a core level on one atom is influenced by a core-level absorption resonance on another. We point out that some of our and others' prior experimental data has been strongly influenced by detector non-linearity and that the effects seen in new corrected data are smaller and of different form. Corrected data for the MnO(001) system with resonance between the O 1s and Mn 2p energy levels are found to be well described by an extension of well-known intraatomic resonant photoemission theory to the interatomic case, provided that interactions beyond the usual second-order Kramers-Heisenberg treatment are included. This theory is also found to simplify under certain conditions so as to yield results equivalent to a classical x-ray optical approach, with the latter providing an accurate and alternative, although less detailed and general, physical picture of these effects. Possible future applications of MARPE as a new probe of near-neighbor identities and bonding and its relationship to other known effects are also discussed. We also consider in detail specially written data acquisition software that has been used for most of the measurements reported here. This software has been used with an existing experimental system to develop the method of detector characterization and then data correction required for the work described above. The development of a next generation one-dimensional, high-speed, electron detector is also discussed. Our goal has been to design, build and test a prototype high-performance, one-dimensional pulse-counting detector that represents a significant advancement in detector technology and is well
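
    The detector characterization and data correction step mentioned above amounts, in its simplest form, to inverting a measured response curve; the sketch below uses invented characterization data and is not the dissertation's software.

```python
import numpy as np

# illustrative characterization data (not from the dissertation):
# detector-reported count rate vs. the rate measured with a linear reference
reported = np.array([0.0, 1e3, 5e3, 2e4, 5e4, 9e4])
reference = np.array([0.0, 1e3, 5.1e3, 2.2e4, 6.1e4, 1.3e5])

def correct_counts(counts):
    """Map reported counts back onto the linear scale by interpolating the
    measured response curve."""
    return np.interp(counts, reported, reference)

raw_spectrum = np.array([8.0e2, 4.2e4, 8.8e4])
print(correct_counts(raw_spectrum))
```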

  5. Comparison of performance of object-based image analysis techniques available in open source software (Spring and Orfeo Toolbox/Monteverdi) considering very high spatial resolution data

    Science.gov (United States)

    Teodoro, Ana C.; Araujo, Ricardo

    2016-01-01

    The use of unmanned aerial vehicles (UAVs) for remote sensing applications is becoming more frequent. However, this type of information can result in several software problems related to the huge amount of data available. Object-based image analysis (OBIA) has proven to be superior to pixel-based analysis for very high-resolution images. The main objective of this work was to explore the potentialities of the OBIA methods available in two different open source software applications, Spring and OTB/Monteverdi, in order to generate an urban land cover map. An orthomosaic derived from UAVs was considered, 10 different regions of interest were selected, and two different approaches were followed. The first one (Spring) uses the region growing segmentation algorithm followed by the Bhattacharya classifier. The second approach (OTB/Monteverdi) uses the mean shift segmentation algorithm followed by the support vector machine (SVM) classifier. Two strategies were followed: four classes were considered using Spring and thereafter seven classes were considered for OTB/Monteverdi. The SVM classifier produces slightly better results and presents a shorter processing time. However, the poor spectral resolution of the data (only RGB bands) is an important factor that limits the performance of the classifiers applied.
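
    The OTB/Monteverdi-style pipeline described above (mean-shift segmentation followed by SVM classification of object-level features) can be caricatured with scikit-learn standing in for the remote-sensing tools; the bandwidth, features, and stand-in labels are assumptions, not the study's actual parameterisation.

```python
import numpy as np
from sklearn.cluster import MeanShift
from sklearn.svm import SVC

def mean_shift_segments(image, bandwidth=0.2):
    """Toy mean-shift segmentation: cluster pixels in (x, y, R, G, B) space."""
    h, w, _ = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    feats = np.column_stack([xx.ravel() / w, yy.ravel() / h, image.reshape(-1, 3)])
    labels = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit_predict(feats)
    return labels.reshape(h, w)

def object_features(image, segments):
    """Per-segment mean RGB: the object-level features fed to the classifier."""
    ids = np.unique(segments)
    return ids, np.array([image[segments == i].mean(axis=0) for i in ids])

# usage sketch on a small synthetic RGB tile (values in [0, 1])
img = np.random.rand(32, 32, 3)
segs = mean_shift_segments(img)
ids, X = object_features(img, segs)
y_train = ids % 4                       # stand-in labels for 4 land-cover classes
clf = SVC(kernel="rbf").fit(X, y_train)
predicted = clf.predict(X)
```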

  6. Software Quality Assurance activities of ITER CODAC

    Energy Technology Data Exchange (ETDEWEB)

    Pande, Sopan, E-mail: sopan.pande@iter.org [ITER Organization, Route de Vinon sur Verdon, 13115 St Paul Lez Durance (France); DiMaio, Franck; Kim, Changseung; Kim, Joohan; Klotz, Wolf-Dieter; Makijarvi, Petri; Stepanov, Denis; Wallander, Anders [ITER Organization, Route de Vinon sur Verdon, 13115 St Paul Lez Durance (France)

    2013-10-15

    Highlights: ► Comprehensive and consistent software engineering and quality assurance of CODAC. ► Applicable to all CODAC software projects executed by ITER DAs and contractors. ► Configurable plans for cost effective application of SQA processes. ► CODAC software plans SQAP, SVVP, SDP, and SCMP. ► CODAC software processes based on IEEE 12207-2008. -- Abstract: Software as an integral part of the plant system I and C is crucial in the manufacturing and integrated operation of ITER plant systems. Software Quality Assurance is necessary to ensure the development and maintenance of consistently high quality I and C software throughout the lifetime of ITER. CODAC decided to follow IEEE 12207-2008 software lifecycle processes for Software Engineering and Software Quality Assurance. The Software Development Plan, Software Configuration Management Plan and Software Verification and Validation Plan are the mainstay of Software Quality Assurance, which is documented in the Software Quality Assurance Plan. This paper describes the Software Quality Assurance (SQA) activities performed by CODAC. The SQA includes development and maintenance of the above plans, processes and resources. With the help of Verification and Validation Teams, they gather evidence of process conformance and product conformance, record process data for quality audits, and perform process improvements.

  7. Software Quality Assurance activities of ITER CODAC

    International Nuclear Information System (INIS)

    Pande, Sopan; DiMaio, Franck; Kim, Changseung; Kim, Joohan; Klotz, Wolf-Dieter; Makijarvi, Petri; Stepanov, Denis; Wallander, Anders

    2013-01-01

    Highlights: ► Comprehensive and consistent software engineering and quality assurance of CODAC. ► Applicable to all CODAC software projects executed by ITER DAs and contractors. ► Configurable plans for cost effective application of SQA processes. ► CODAC software plans SQAP, SVVP, SDP, and SCMP. ► CODAC software processes based on IEEE 12207-2008. -- Abstract: Software as an integral part of the plant system I and C is crucial in the manufacturing and integrated operation of ITER plant systems. Software Quality Assurance is necessary to ensure the development and maintenance of consistently high quality I and C software throughout the lifetime of ITER. CODAC decided to follow IEEE 12207-2008 software lifecycle processes for Software Engineering and Software Quality Assurance. The Software Development Plan, Software Configuration Management Plan and Software Verification and Validation Plan are the mainstay of Software Quality Assurance, which is documented in the Software Quality Assurance Plan. This paper describes the Software Quality Assurance (SQA) activities performed by CODAC. The SQA includes development and maintenance of the above plans, processes and resources. With the help of Verification and Validation Teams, they gather evidence of process conformance and product conformance, record process data for quality audits, and perform process improvements.

  8. Producing and supporting sharable software

    International Nuclear Information System (INIS)

    Johnstad, H.; Nicholls, J.

    1987-02-01

    A survey is reported that addressed the question of shareable software for the High Energy Physics community. Statistics are compiled for the responses of 54 people attending a conference on the subject of shareable software to a questionnaire which addressed the usefulness of shareable software, preference of programming language, and source management tools. The results are found to reflect a continued need for shareable software in the High Energy Physics community and a desire that this effort be performed in a coordinated way. A strong mandate is also claimed for large facilities to support the community with software and that these facilities should act as distribution points. Considerable interest is expressed in languages other than FORTRAN, and the desire for standards or rules in programming is expressed. A need is identified for source management tools.

  9. System support software for TSTA

    International Nuclear Information System (INIS)

    Claborn, G.W.; Mann, L.W.; Nielson, C.W.

    1987-01-01

    The software at the Tritium Systems Test Assembly (TSTA) is logically broken into two parts, the system support software and the subsystem software. The purpose of the system support software is to isolate the subsystem software from the physical hardware. In this sense the system support software forms the kernel of the software at TSTA. The kernel software performs several functions. It gathers data from CAMAC modules and makes that data available for subsystem processes. It services requests to send commands to CAMAC modules. It provides a system of logging functions and provides for a system-wide global program state that allows highly structured interaction between subsystem processes. The kernel's most visible function is to provide the Man-Machine Interface (MMI). The MMI allows the operators a window into the physical hardware and subsystem process state. Finally the kernel provides a data archiving and compression function that allows archival data to be accessed and plotted. Such kernel software as developed and implemented at TSTA is described

  10. Analysis of the effect of variations in parameter values on the predicted radiological consequences of geologic disposal of high-level waste

    International Nuclear Information System (INIS)

    Hill, M.D.

    1979-06-01

    A preliminary assessment of the radiological consequences of geologic disposal of high-level waste (Hill and Grimwood, NRPB-R69 (1978)) identified several areas where further research is required before this disposal option can be fully evaluated. This report is an analysis of the sensitivity of the results of the preliminary assessment to the assumptions made and the values of the parameters used. The parameters considered include the leach rate of the waste, the groundwater velocity, the length of the flow path from the repository to a source of drinking water and the sorption constants of the principal radionuclides. The results obtained by varying these parameters are used to examine the effects of assumptions such as the time at which leaching of the waste begins. The sensitivity analysis shows the relative importance of the waste canisters, the waste form and the geologic barrier to radionuclide migration in determining potential doses. These results are used to identify research priorities, establish preliminary design criteria and indicate developments needed in the mathematical modelling of the movement of radionuclides from a repository to the biosphere. (author)
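
    The kind of one-at-a-time parameter variation described above can be illustrated with a simple retarded-advection travel-time model; the parameter values and the choice of Np-237 below are illustrative assumptions, not figures from the report.

```python
import numpy as np

def travel_time_years(L, v, Kd, rho_b=2.0, porosity=0.3):
    """Advective travel time of a sorbing radionuclide along a flow path.

    L  : path length from repository to a drinking-water source (m)
    v  : groundwater velocity (m/yr)
    Kd : sorption coefficient (mL/g); R = 1 + (rho_b / porosity) * Kd
    """
    R = 1.0 + (rho_b / porosity) * Kd      # retardation factor
    return L * R / v

def surviving_fraction(t_years, half_life_years):
    return 0.5 ** (t_years / half_life_years)

# one-at-a-time sensitivity: vary groundwater velocity, hold the rest fixed
for v in (0.1, 1.0, 10.0):                 # m/yr
    t = travel_time_years(L=500.0, v=v, Kd=50.0)
    print(f"v = {v:5.1f} m/yr  travel time = {t:9.3e} yr  "
          f"Np-237 fraction remaining = {surviving_fraction(t, 2.14e6):.3f}")
```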

  11. MALDI-TOF mass spectrometry and high-consequence bacteria: safety and stability of biothreat bacterial sample testing in clinical diagnostic laboratories.

    Science.gov (United States)

    Tracz, Dobryan M; Tober, Ashley D; Antonation, Kym S; Corbett, Cindi R

    2018-03-01

    We considered the application of MALDI-TOF mass spectrometry for BSL-3 bacterial diagnostics, with a focus on the biosafety of live-culture direct-colony testing and the stability of stored extracts. Biosafety level 2 (BSL-2) bacterial species were used as surrogates for BSL-3 high-consequence pathogens in all live-culture MALDI-TOF experiments. Viable BSL-2 bacteria were isolated from MALDI-TOF mass spectrometry target plates after 'direct-colony' and 'on-plate' extraction testing, suggesting that the matrix chemicals alone cannot be considered sufficient to inactivate bacterial culture and spores in all samples. Sampling of the instrument interior after direct-colony analysis did not recover viable organisms, suggesting that any potential risks to the laboratory technician are associated with preparation of the MALDI-TOF target plate before or after testing. Secondly, a long-term stability study (3 years) of stored MALDI-TOF extracts showed that match scores can decrease below the threshold for reliable species identification (<1.7), which has implications for proficiency test panel item storage and distribution.

  12. Software testing and global industry future paradigms

    CERN Document Server

    Casey, Valentine; Richardson, Ita

    2009-01-01

    Today software development has truly become a globally sourced commodity. This trend has been facilitated by the availability of highly skilled software professionals in low cost locations in Eastern Europe, Latin America and the Far East. Organisations

  13. ESTSC - Software Best Practices

    Science.gov (United States)

    DOE Scientific and Technical Software Best Practices, December 2010. Table of Contents: 1.0 Introduction; 2.0 Responsibilities; 2.1 OSTI/ESTSC; 2.2 SIACs; 2.3 Software Submitting Sites/Creators; 2.4 Software Sensitivity Review; 3.0 Software Announcement and Submission; 3.1 STI Software Appropriate for Announcement; 3.2

  14. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS ...2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the...

  15. Using the PSCPCSP computer software for optimization of the composition of industrial alloys and development of new high-temperature nickel-base alloys

    Science.gov (United States)

    Rtishchev, V. V.

    1995-11-01

    Using computer programs, some foreign firms have developed new deformable and castable high-temperature nickel-base alloys such as IN, Rene, Mar-M, Udimet, TRW, TM, TMS, TUT, with equiaxial, columnar, and single-crystal structures for manufacturing functional and nozzle blades and other parts of the hot duct of transport and stationary gas-turbine installations (GTI). Similar investigations have been carried out in Russia. This paper presents examples of the use of the PSCPCSP computer software for a quantitative analysis of structural and phase characteristics and properties of industrial alloys with change (within the grade range) in the concentrations of the alloying elements for optimizing the composition of the alloys and the regimes of their heat treatment.

  16. Reliability of the imaging software in the preoperative planning of the open-wedge high tibial osteotomy.

    Science.gov (United States)

    Lee, Yong Seuk; Kim, Min Kyu; Byun, Hae Won; Kim, Sang Bum; Kim, Jin Goo

    2015-03-01

    The purpose of this study was to verify a recently developed picture-archiving and communications system-photoshop method by comparing reliabilities between the real-size paper template and the PACS-photoshop methods in preoperative planning of open-wedge high tibial osteotomy. A prospective case series was conducted, including patients with medial osteoarthritis undergoing open-wedge high tibial osteotomy. In the preoperative planning, the picture-archiving and communications system-photoshop method and the real-size paper template method were used simultaneously in all patients. Preoperative hip-knee-ankle angle, height, and angle of the osteotomy were evaluated. The reliability of this newly devised method was evaluated, and the consistency between the two methods was also evaluated using the intra-class correlation coefficient. Using the picture-archiving and communications system-photoshop method, the mean correction angle and height of the osteotomy gap for rater-1 were 11.7° ± 3.6° and 10.7 ± 3.6 mm, respectively. The mean correction angle and height of the osteotomy gap for rater-2 were 12.0° ± 2.6° and 10.8 ± 3.6 mm, respectively. The inter- and intra-rater reliabilities of the correction angle were 0.956 ~ 0.979 and 0.980 ~ 0.992, respectively. The inter- and intra-rater reliabilities of the height of the osteotomy gap were 0.968 ~ 0.985 and 0.971 ~ 0.994, respectively. Using the real-size paper template method, mean values of the correction angle and height of the osteotomy gap were 11.9° ± 3.6° and 10.8 ± 3.6 mm, respectively. Consistency between the two methods, assessed by comparing the means of the correction angle and the height of the osteotomy gap, was 0.985 and 0.985, respectively. The picture-archiving and communications system-photoshop method enables direct measurement of the height of the osteotomy gap with high reliability.

  17. Relationship intimacy in software ecosystems : a survey of the dutch software industry

    NARCIS (Netherlands)

    Angeren, van J.; Blijleven, V.; Jansen, S.

    2011-01-01

    Software vendors depend on suppliers to provide the underlying technology for domain specific solutions. As a consequence, software vendors cooperate with suppliers to deliver a product. This cooperation results in supplier dependence, but also leads to opportunities. We present the results of an

  18. Toward objective software process information : experiences from a case study

    NARCIS (Netherlands)

    Samalikova, J.; Kusters, R.J.; Trienekens, J.J.M.; Weijters, A.J.M.M.; Siemons, P.

    2011-01-01

    A critical problem in software development is the monitoring, control and improvement of the processes of software developers. Software processes are often not explicitly modeled, and manuals to support the development work contain abstract guidelines and procedures. Consequently, there are huge

  19. Childhood Obesity Causes & Consequences

    Science.gov (United States)

    Childhood Obesity Causes & Consequences. ... determine how a community is designed. Consequences of Obesity: More Immediate Health Risks. Obesity during childhood can ...

  20. Software: our quest for excellence. Honoring 50 years of software history, progress, and process

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    The Software Quality Forum was established by the Software Quality Assurance (SQA) Subcommittee, which serves as a technical advisory group on software engineering and quality initiatives and issues for DOE's quality managers. The forum serves as an opportunity for all those involved in implementing SQA programs to meet and share ideas and concerns. Participation from managers, quality engineers, and software professionals provides an ideal environment for identifying and discussing issues and concerns. The interaction provided by the forum contributes to the realization of a shared goal--a high quality software product. Topics include: testing, software measurement, software surety, software reliability, SQA practices, assessments, software process improvement, certification and licensing of software professionals, CASE tools, software project management, inspections, and management's role in ensuring SQA. The bulk of this document consists of vugraphs. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  1. Software Project Management Plan for the Integrated Systems Code (ISC) of New Production Reactor -- Modular High Temperature Gas Reactor

    International Nuclear Information System (INIS)

    Taylor, D.

    1990-11-01

    The United States Department of Energy (DOE) has selected the Modular High Temperature Gas-Cooled Reactor (MHTGR) as one of the concepts for the New Production Reactor (NPR). DOE has also established several Technical Working Groups (TWG's) at the national laboratories to provide independent design confirmation of the NPR-MHTGR design. One of those TWG's is concerned with Thermal Fluid Flow (TFF) and analysis methods to provide independent design confirmation of the NPR-MHTGR. Analysis methods are also needed for operational safety evaluations, performance monitoring, sensitivity studies, and operator training. The TFF Program Plan includes, as one of its principal tasks, the development of a computer program (called the Integrated Systems Code, or ISC). This program will provide the needed long-term analysis capabilities for the MHTGR and its subsystems. This document presents the project management plan for development of the ISC. It includes the associated quality assurance tasks, and the schedule and resource requirements to complete these activities. The document conforms to the format of ANSI/IEEE Std. 1058.1-1987. 2 figs

  2. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long-term operating behavior. (HP) [de]

  3. reSpect: Software for Identification of High and Low Abundance Ion Species in Chimeric Tandem Mass Spectra

    Science.gov (United States)

    Shteynberg, David; Mendoza, Luis; Hoopmann, Michael R.; Sun, Zhi; Schmidt, Frank; Deutsch, Eric W.; Moritz, Robert L.

    2015-11-01

    Most shotgun proteomics data analysis workflows are based on the assumption that each fragment ion spectrum is explained by a single species of peptide ion isolated by the mass spectrometer; however, in reality mass spectrometers often isolate more than one peptide ion within the window of isolation that contribute to additional peptide fragment peaks in many spectra. We present a new tool called reSpect, implemented in the Trans-Proteomic Pipeline (TPP), which enables an iterative workflow whereby fragment ion peaks explained by a peptide ion identified in one round of sequence searching or spectral library search are attenuated based on the confidence of the identification, and then the altered spectrum is subjected to further rounds of searching. The reSpect tool is not implemented as a search engine, but rather as a post-search engine processing step where only fragment ion intensities are altered. This enables the application of any search engine combination in the iterations that follow. Thus, reSpect is compatible with all other protein sequence database search engines as well as peptide spectral library search engines that are supported by the TPP. We show that while some datasets are highly amenable to chimeric spectrum identification and lead to additional peptide identification boosts of over 30% with as many as four different peptide ions identified per spectrum, datasets with narrow precursor ion selection only benefit from such processing at the level of a few percent. We demonstrate a technique that facilitates the determination of the degree to which a dataset would benefit from chimeric spectrum analysis. The reSpect tool is free and open source, provided within the TPP and available at the TPP website.
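
    The attenuation step described above can be pictured with a small sketch. The following is an illustrative reading of the idea, not the reSpect implementation; the peak-matching tolerance, the attenuation rule, and all names are assumptions.

```python
# Illustrative sketch of confidence-weighted peak attenuation (not the reSpect code).
# A fragment peak matched to an identified peptide is scaled down by the identification
# probability before the residual spectrum is searched again.

def attenuate_matched_peaks(spectrum, matched_mz, probability, tol=0.02):
    """spectrum: list of (mz, intensity); matched_mz: fragment m/z values explained
    by the identified peptide; probability: identification confidence in [0, 1]."""
    residual = []
    for mz, intensity in spectrum:
        if any(abs(mz - m) <= tol for m in matched_mz):
            intensity *= (1.0 - probability)   # a 95%-confident match removes 95% of the peak
        if intensity > 0.0:
            residual.append((mz, intensity))
    return residual

# Two of three peaks are explained by a peptide identified with probability 0.95.
spectrum = [(175.119, 1200.0), (262.151, 800.0), (389.204, 300.0)]
print(attenuate_matched_peaks(spectrum, matched_mz=[175.119, 389.204], probability=0.95))
```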

  4. reSpect: software for identification of high and low abundance ion species in chimeric tandem mass spectra.

    Science.gov (United States)

    Shteynberg, David; Mendoza, Luis; Hoopmann, Michael R; Sun, Zhi; Schmidt, Frank; Deutsch, Eric W; Moritz, Robert L

    2015-11-01

    Most shotgun proteomics data analysis workflows are based on the assumption that each fragment ion spectrum is explained by a single species of peptide ion isolated by the mass spectrometer; however, in reality mass spectrometers often isolate more than one peptide ion within the window of isolation that contribute to additional peptide fragment peaks in many spectra. We present a new tool called reSpect, implemented in the Trans-Proteomic Pipeline (TPP), which enables an iterative workflow whereby fragment ion peaks explained by a peptide ion identified in one round of sequence searching or spectral library search are attenuated based on the confidence of the identification, and then the altered spectrum is subjected to further rounds of searching. The reSpect tool is not implemented as a search engine, but rather as a post-search engine processing step where only fragment ion intensities are altered. This enables the application of any search engine combination in the iterations that follow. Thus, reSpect is compatible with all other protein sequence database search engines as well as peptide spectral library search engines that are supported by the TPP. We show that while some datasets are highly amenable to chimeric spectrum identification and lead to additional peptide identification boosts of over 30% with as many as four different peptide ions identified per spectrum, datasets with narrow precursor ion selection only benefit from such processing at the level of a few percent. We demonstrate a technique that facilitates the determination of the degree to which a dataset would benefit from chimeric spectrum analysis. The reSpect tool is free and open source, provided within the TPP and available at the TPP website.

  5. Managing and understanding risk perception of surface leaks from CCS sites: risk assessment for emerging technologies and low-probability, high-consequence events

    Science.gov (United States)

    Augustin, C. M.

    2015-12-01

    Carbon capture and storage (CCS) has been suggested by the Intergovernmental Panel on Climate Change as a partial solution to the greenhouse gas emissions problem. As CCS has become mainstream, researchers have raised multiple risk assessment issues typical of emerging technologies. In our research, we examine issues occurring when stored carbon dioxide (CO2) migrates to the near-surface or surface. We believe that both the public misperception and the physical reality of potential environmental, health, and commercial impacts of leak events from such subsurface sites have prevented widespread adoption of CCS. This paper is presented in three parts; the first is an evaluation of the systemic risk of a CCS site CO2 leak and models indicating potential likelihood of a leakage event. As the likelihood of a CCS site leak is stochastic and nonlinear, we present several Bayesian simulations for leak events based on research done with other low-probability, high-consequence gaseous pollutant releases. Though we found a large, acute leak to be exceptionally rare, we demonstrate potential for a localized, chronic leak at a CCS site. To that end, we present the second piece of this paper. Using a combination of spatio-temporal models and reaction-path models, we demonstrate the interplay between leak migrations, material interactions, and atmospheric dispersion for leaks of various duration and volume. These leak-event scenarios have implications for human, environmental, and economic health; they also have a significant impact on implementation support. Public acceptance of CCS is essential for a national low-carbon future, and this is what we address in the final part of this paper. We demonstrate that CCS remains unknown to the general public in the United States. Despite its unknown state, we provide survey findings - analyzed in Slovic and Weber's 2002 framework - that show a high unknown, high dread risk perception of leaks from a CCS site. Secondary findings are a
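
    As a rough illustration of the kind of simulation mentioned above, the sketch below propagates uncertainty in a small annual leak probability over a multi-decade storage horizon. It is not the authors' model; the Beta prior, the 50-year horizon, and the independence assumption are placeholders.

```python
# Minimal Monte Carlo sketch (not the authors' Bayesian model): draw an uncertain
# annual leak probability from an assumed Beta prior and average the chance of at
# least one leak over an assumed 50-year storage horizon.
import random

def prob_at_least_one_leak(alpha=1.0, beta=99.0, years=50, draws=100_000):
    total = 0.0
    for _ in range(draws):
        p_annual = random.betavariate(alpha, beta)   # uncertain annual leak probability
        total += 1.0 - (1.0 - p_annual) ** years     # P(at least one leak | p_annual)
    return total / draws

print(f"P(at least one leak in 50 years) ~ {prob_at_least_one_leak():.3f}")
```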

  6. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  7. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  8. Communicating Low-Probability High-Consequence Risk, Uncertainty and Expert Confidence: Induced Seismicity of Deep Geothermal Energy and Shale Gas.

    Science.gov (United States)

    Knoblauch, Theresa A K; Stauffacher, Michael; Trutnevyte, Evelina

    2018-04-01

    Subsurface energy activities entail the risk of induced seismicity including low-probability high-consequence (LPHC) events. For designing respective risk communication, the scientific literature lacks empirical evidence of how the public reacts to different written risk communication formats about such LPHC events and to related uncertainty or expert confidence. This study presents findings from an online experiment (N = 590) that empirically tested the public's responses to risk communication about induced seismicity and to different technology frames, namely deep geothermal energy (DGE) and shale gas (between-subject design). Three incrementally different formats of written risk communication were tested: (i) qualitative, (ii) qualitative and quantitative, and (iii) qualitative and quantitative with risk comparison. Respondents found the latter two the easiest to understand, the most exact, and liked them the most. Adding uncertainty and expert confidence statements made the risk communication less clear, less easy to understand and increased concern. Above all, the technology for which risks are communicated and its acceptance mattered strongly: respondents in the shale gas condition found the identical risk communication less trustworthy and more concerning than in the DGE conditions. They also liked the risk communication overall less. For practitioners in DGE or shale gas projects, the study shows that the public would appreciate efforts in describing LPHC risks with numbers and optionally risk comparisons. However, there seems to be a trade-off between aiming for transparency by disclosing uncertainty and limited expert confidence, and thereby decreasing clarity and increasing concern in the view of the public. © 2017 Society for Risk Analysis.

  9. Software quality - how is it achieved?

    International Nuclear Information System (INIS)

    Straker, E.A.

    1986-01-01

    Although software quality can't be quantified, the tools and techniques to achieve high quality are available. As management stresses the need for definable software quality programs from vendors and subcontractors and provides the incentives for these programs, the quality of software will improve. EPRI could provide the leadership in establishing guidelines for a balanced software quality program and through workshops provide training to utility staff and management on the methods for evaluating the characteristics of quality software. With the more complex systems discussed at this workshop and particularly with the trend toward the use of artificial intelligence, the importance of quality software will grow dramatically

  10. The Ettention software package.

    Science.gov (United States)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

    We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.
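
    The block iterative method mentioned above builds on the classical Kaczmarz iteration. The sketch below shows the textbook algorithm on a tiny dense system; it is a conceptual illustration, not Ettention's GPU or Xeon Phi implementation.

```python
# Textbook Kaczmarz iteration for "reconstruction from projections", A x = b,
# where each row of A represents one projection measurement (conceptual sketch only).
import numpy as np

def kaczmarz(A, b, sweeps=50, relax=1.0):
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for i in range(A.shape[0]):                      # sweep over the measurement rows
            a, denom = A[i], A[i] @ A[i]
            if denom > 0.0:
                x += relax * (b[i] - a @ x) / denom * a  # project x onto the hyperplane a.x = b[i]
    return x

# Consistent toy system: the iterate converges toward the true solution.
A = np.array([[1.0, 2.0], [3.0, 1.0], [1.0, -1.0]])
x_true = np.array([2.0, -1.0])
print(kaczmarz(A, A @ x_true))                           # approximately [2, -1]
```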

  11. FPGAs for software programmers

    CERN Document Server

    Hannig, Frank; Ziener, Daniel

    2016-01-01

    This book makes powerful Field Programmable Gate Array (FPGA) and reconfigurable technology accessible to software engineers by covering different state-of-the-art high-level synthesis approaches (e.g., OpenCL and several C-to-gates compilers). It introduces FPGA technology, its programming model, and how various applications can be implemented on FPGAs without going through low-level hardware design phases. Readers will get a realistic sense for problems that are suited for FPGAs and how to implement them from a software designer’s point of view. The authors demonstrate that FPGAs and their programming model reflect the needs of stream processing problems much better than traditional CPU or GPU architectures, making them well-suited for a wide variety of systems, from embedded systems performing sensor processing to large setups for Big Data number crunching. This book serves as an invaluable tool for software designers and FPGA design engineers who are interested in high design productivity through behavi...

  12. TOUGH2 software qualification

    International Nuclear Information System (INIS)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2

  13. TOUGH2 software qualification

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  14. The ATLAS Software Installation System v2: a highly available system to install and validate Grid and Cloud sites via Panda

    Science.gov (United States)

    De Salvo, A.; Kataoka, M.; Sanchez Pineda, A.; Smirnov, Y.

    2015-12-01

    The ATLAS Installation System v2 is the evolution of the original system, used since 2003. The original tool has been completely re-designed in terms of database backend and components, adding support for submission to multiple backends, including the original Workload Management Service (WMS) and the new PanDA modules. The database engine has been changed from plain MySQL to Galera/Percona and the table structure has been optimized to allow a full High-Availability (HA) solution over a Wide Area Network. The servlets, running on each frontend, have also been decoupled from local settings to allow easy scalability of the system, including the possibility of an HA system with multiple sites. The clients can also be run in multiple copies and in different geographical locations, and take care of sending the installation and validation jobs to the target Grid or Cloud sites. Moreover, the Installation Database is used as the source of parameters by the automatic agents running in CVMFS, in order to install the software and distribute it to the sites. The system has been in production for ATLAS since 2013, with the INFN Roma Tier 2 and the CERN Agile Infrastructure as its main HA sites. The Light Job Submission Framework for Installation (LJSFi) v2 engine interfaces directly with PanDA for job management, the ATLAS Grid Information System (AGIS) for site parameter configurations, and CVMFS for both core components and the installation of the software itself. LJSFi2 is also able to use other plugins, and is essentially Virtual Organization (VO) agnostic, so it can be directly used and extended to cope with the requirements of any Grid- or Cloud-enabled VO. In this work we present the architecture, performance, status, and possible evolutions of the system for LHC Run 2 and beyond.

  15. Survey on Projects at DLR Simulation and Software Technology with Focus on Software Engineering and HPC

    OpenAIRE

    Schreiber, Andreas; Basermann, Achim

    2013-01-01

    We introduce the DLR institute “Simulation and Software Technology” (SC) and present current activities regarding software engineering and high performance computing (HPC) in German or international projects. Software engineering at SC focusses on data and knowledge management as well as tools for studies and experiments. We discuss how we apply software configuration management, validation and verification in our projects. Concrete research topics are traceability of (software devel...

  16. Antenna Controller Replacement Software

    Science.gov (United States)

    Chao, Roger Y.; Morgan, Scott C.; Strain, Martha M.; Rockwell, Stephen T.; Shimizu, Kenneth J.; Tehrani, Barzia J.; Kwok, Jaclyn H.; Tuazon-Wong, Michelle; Valtier, Henry; Nalbandi, Reza; hide

    2010-01-01

    The Antenna Controller Replacement (ACR) software accurately points and monitors the Deep Space Network (DSN) 70-m and 34-m high-efficiency (HEF) ground-based antennas that are used to track primarily spacecraft and, periodically, celestial targets. To track a spacecraft, or other targets, the antenna must be accurately pointed at the spacecraft, which can be very far away with very weak signals. ACR's conical scanning capability collects the signal in a circular pattern around the target, calculates the location of the strongest signal, and adjusts the antenna pointing to point directly at the spacecraft. A real-time, closed-loop servo control algorithm performed every 0.02 second allows accurate positioning of the antenna in order to track these distant spacecraft. Additionally, this advanced servo control algorithm provides better antenna pointing performance in windy conditions. The ACR software provides high-level commands that provide a very easy user interface for the DSN operator. The operator only needs to enter two commands to start the antenna and subreflector, and Master Equatorial tracking. The most accurate antenna pointing is accomplished by aligning the antenna to the Master Equatorial, which, because of its small size and sheltered location, has the most stable pointing. The antenna has hundreds of digital and analog monitor points. The ACR software provides compact displays to summarize the status of the antenna, subreflector, and the Master Equatorial. The ACR software has two major functions. First, it performs all of the steps required to accurately point the antenna (and subreflector and Master Equatorial) at the spacecraft (or celestial target). This involves controlling the antenna/subreflector/Master-Equatorial hardware, initiating and monitoring the correct sequence of operations, calculating the position of the spacecraft relative to the antenna, executing the real-time servo control algorithm to maintain the correct position, and
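
    The conical-scan idea described above can be sketched as a small calculation: sample the received power around the scan circle and take its first Fourier component to estimate where the target lies relative to boresight. This is only an illustration under assumed conditions, not the ACR flight algorithm; converting the modulation depth to an angular offset would require the actual beam-pattern slope.

```python
# Illustrative conical-scan estimate (not the ACR flight code): the phase of the first
# Fourier component of the power modulation gives the direction of the pointing error,
# and its amplitude, scaled by an assumed beam slope, gives a relative magnitude.
import math

def conical_scan_offset(powers, beam_slope=1.0):
    n = len(powers)
    ac = 2.0 / n * sum(p * math.cos(2 * math.pi * k / n) for k, p in enumerate(powers))
    as_ = 2.0 / n * sum(p * math.sin(2 * math.pi * k / n) for k, p in enumerate(powers))
    direction = math.atan2(as_, ac)               # scan angle at which the signal peaks
    magnitude = math.hypot(ac, as_) / beam_slope  # modulation depth in power units
    return direction, magnitude

# Synthetic scan: strongest signal near a scan angle of 45 degrees.
samples = [10.0 + 2.0 * math.cos(2 * math.pi * k / 16 - math.pi / 4) for k in range(16)]
direction, magnitude = conical_scan_offset(samples)
print(math.degrees(direction), magnitude)         # ~45.0 degrees, ~2.0
```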

  17. FASTBUS software workshop

    International Nuclear Information System (INIS)

    1985-01-01

    FASTBUS is a standard for modular high-speed data acquisition, data-processing and control, development for use in high-energy physics experiments incorporating different types of computers and microprocessors. This Workshop brought together users from different laboratories for a review of current software activities, using the standard both in experiments and for test equipment. There are also papers on interfacing and the present state of systems being developed for use in future LEP experiments. Also included is a discussion on the proposed revision of FASTBUS Standard Routines. (orig.)

  18. Computer systems and software engineering

    Science.gov (United States)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  19. Process mining application in software process assessment

    NARCIS (Netherlands)

    Samalikova, J.

    2012-01-01

    Nowadays, our daily life heavily depends on software. Software is everywhere, from appliances in our homes, to safety-critical systems such as medical equipment. The failure of these software-intensive systems results in high financial losses, environmental or property damages, or even loss of life.

  20. Software testing for evolutionary iterative rapid prototyping

    OpenAIRE

    Davis, Edward V., Jr.

    1990-01-01

    Approved for public release; distribution unlimited. Rapid prototyping is emerging as a promising software development paradigm. It provides a systematic and automatable means of developing a software system under circumstances where initial requirements are not well known or where requirements change frequently during development. To provide high software quality assurance requires sufficient software testing. The unique nature of evolutionary iterative prototyping is not well-suited for ...

  1. Demonstration of Multi-Gbps Data Rates at Ka-Band Using Software-Defined Modem and Broadband High Power Amplifier for Space Communications

    Science.gov (United States)

    Simons, Rainee N.; Wintucky, Edwin G.; Landon, David G.; Sun, Jun Y.; Winn, James S.; Laraway, Stephen; McIntire, William K.; Metz, John L.; Smith, Francis J.

    2011-01-01

    The paper presents the first ever research and experimental results regarding the combination of a software-defined multi-Gbps modem and a broadband high power space amplifier when tested with an extended form of the industry standard DVB-S2 and LDPC rate 9/10 FEC codec. The modem supports waveforms including QPSK, 8-PSK, 16-APSK, 32-APSK, 64-APSK, and 128-QAM. The broadband high power amplifier is a space qualified traveling-wave tube (TWT), which has a passband greater than 3 GHz at 33 GHz, output power of 200 W and efficiency greater than 60 percent. The modem and the TWTA together enabled an unprecedented data rate of 20 Gbps with a low BER of 10^-9. The presented results include a plot of the received waveform constellation, BER vs. Eb/N0, and implementation loss for each of the modulation types tested. The above results, when included in an RF link budget analysis, show that NASA's payload data rate can be increased by at least an order of magnitude (greater than 10X) over current state-of-practice, limited only by the spacecraft EIRP, ground receiver G/T, range, and available spectrum or bandwidth.
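
    The link-budget arithmetic referred to above follows the standard form sketched below. The 33 GHz frequency and 20 Gbps rate come from the abstract; the EIRP, G/T, range, and required Eb/N0 are placeholder assumptions, not values from the experiment.

```python
# Standard RF link-budget sketch: C/N0 = EIRP + G/T - FSPL - 10*log10(k), then
# Eb/N0 = C/N0 - 10*log10(data rate). All input values below are placeholders.
import math

BOLTZMANN_DB = -228.6  # 10*log10(Boltzmann constant), dBW/(K*Hz)

def link_margin_db(eirp_dbw, gt_dbk, range_km, freq_ghz, rate_bps, required_ebn0_db):
    fspl_db = 92.45 + 20 * math.log10(range_km) + 20 * math.log10(freq_ghz)  # free-space path loss
    cn0_dbhz = eirp_dbw + gt_dbk - fspl_db - BOLTZMANN_DB
    ebn0_db = cn0_dbhz - 10 * math.log10(rate_bps)
    return ebn0_db - required_ebn0_db

# Hypothetical 33 GHz link carrying 20 Gbps over a GEO-like range.
print(link_margin_db(eirp_dbw=60.0, gt_dbk=40.0, range_km=40_000,
                     freq_ghz=33.0, rate_bps=20e9, required_ebn0_db=4.0))
```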

  2. Software problems in magnetic fusion research

    International Nuclear Information System (INIS)

    Gruber, R.

    1982-01-01

    The main world effort in magnetic fusion research involves studying the plasma in a Tokamak device. Four large Tokamaks are under construction (TFTR in USA, JET in Europe, T15 in USSR and JT60 in Japan). To understand the physical phenomena that occur in these costly devices, it is generally necessary to carry out extensive numerical calculations. These computer simulations make use of sophisticated numerical methods and demand high power computers. As a consequence they represent a substantial investment. To reduce software costs, the computer codes are more and more often exchanged among scientists. Standardization (STANDARD FORTRAN, OLYMPUS system) and good documentation (CPC program library) are proposed to make codes exportable. Centralized computing centers would also help in the exchange of codes and ease communication between the staff at different laboratories. (orig.)

  3. Vertical Equity Consequences of Very High Cigarette Tax Increases: If the Poor Are the Ones Smoking, How Could Cigarette Tax Increases Be Progressive?

    Science.gov (United States)

    Colman, Gregory J.; Remler, Dahlia K.

    2008-01-01

    Cigarette smoking is concentrated among low-income groups. Consequently, cigarette taxes are considered regressive. However, if poorer individuals are much more price sensitive than richer individuals, then tax increases would reduce smoking much more among the poor and their cigarette tax expenditures as a share of income would rise by much less…

  4. High environmental ammonia exposure has developmental-stage specific and long-term consequences on the cortisol stress response in zebrafish.

    Science.gov (United States)

    Williams, Tegan A; Bonham, Luke A; Bernier, Nicholas J

    2017-12-01

    The capacity for early life environmental stressors to induce programming effects on the endocrine stress response in fish is largely unknown. In this study we determined the effects of high environmental ammonia (HEA) exposure on the stress response in larval zebrafish, assessed the tolerance of embryonic and larval stages to HEA, and evaluated whether early life HEA exposure has long-term consequences on the cortisol response to a novel stressor. Exposure to 500-2000 μM NH4Cl for 16 h did not affect the gene expression of corticotropin-releasing factor (CRF) system components in 1 day post-fertilization (dpf) embryos, but differentially increased crfa, crfb and CRF binding protein (crfbp) expression and stimulated both dose- and time-dependent increases in the whole body cortisol of 5 dpf larvae. Pre-acclimation to HEA at 1 dpf did not affect the cortisol response to a subsequent NH4Cl exposure at 5 dpf. In contrast, pre-acclimation to HEA at 5 dpf caused a small but significant reduction in the cortisol response to a second NH4Cl exposure at 10 dpf. While continuous exposure to 500-2000 μM NH4Cl between 0 and 5 dpf had a modest effect on mean survival time, exposure to 400-1000 μM NH4Cl between 10 and 14 dpf decreased mean survival time in a dose-dependent manner. Moreover, pre-acclimation to HEA at 5 dpf significantly decreased the risk of mortality to continuous NH4Cl exposure between 10 and 14 dpf. Finally, while HEA at 1 dpf did not affect the cortisol stress response to a novel vortex stressor at 5 dpf, the same HEA treatment at 5 dpf abolished vortex stressor-induced increases in whole body cortisol at 10 and 60 dpf. Together these results show that the impact of HEA on the cortisol stress response during development is life-stage specific and closely linked to ammonia tolerance. Further, we demonstrate that HEA exposure at the larval stage can have persistent effects on the capacity to respond to stressors in later life. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  6. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes there is an enormous amount of software available for use by teachers of reading and literacy--whereas drill-and-practice software is the largest category of software available, large numbers of…

  7. Ensuring Software IP Cleanliness

    OpenAIRE

    Mahshad Koohgoli; Richard Mayer

    2007-01-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  8. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

  9. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  10. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  11. Software - Naval Oceanography Portal

    Science.gov (United States)

    USNO Earth Orientation software page (Home › USNO › Earth Orientation › Software), listing Earth Orientation products, auxiliary and supporting software, and the Earth Orientation Matrix Calculator.

  12. Software Engineering Education Directory

    Science.gov (United States)

    1990-04-01

    and Engineering (CMSC 735) Codes: GPEV2 * Textbooks: IEEE Tutorial on Models and Metrics for Software Management and Engineering by Basili, Victor R... Software Engineering (Comp 227) Codes: GPRY5 Textbooks: IEEE Tutorial on Software Design Techniques by Freeman, Peter and Wasserman, Anthony I. Software

  13. High temperature oxidation of metals: vacancy injection and consequences on the mechanical properties; Consequences de l'oxydation haute temperature sur l'injection de defauts et le comportement mecanique des materiaux metalliques

    Energy Technology Data Exchange (ETDEWEB)

    Perusin, S

    2004-11-15

    The aim of this work is to account for the effects of the high temperature oxidation of metals on their microstructure and their mechanical properties. 'Model' materials like pure nickel, pure iron and the Ni-20Cr alloy are studied. Nickel foils have been oxidised at 1000 C on one side only in laboratory air, the other side being protected from oxidation by a reducing atmosphere. After the oxidation treatment, the unoxidized face was carefully examined using an Atomic Force Microscope (AFM). Grain boundary grooves were characterised and their depths were compared to those obtained on the same sample heat treated in the reducing atmosphere for the same time. They are found to be much deeper in the case of the single side oxidised samples. It is shown that this additional grooving is directly linked to the growth of the oxide scale on the opposite side and that it can be explained by the diffusion of the vacancies produced at the oxide scale - metal interface across the entire sample through grain boundaries. Moreover, the comparison between single side oxidised samples and samples oxidised on both sides points out that voids in grain boundaries are only observed in the latter case, proving vacancy condensation in the metal when both faces are oxidised. The roles of carbon content and sample geometry in this phenomenon are examined in detail. The diffusion of vacancies is coupled with the transport of oxygen, so a mechanism of oxygen transport by vacancies is suggested. The tensile tests realised at room temperature on nickel foils (bamboo microstructure) show that the oxide scale can constitute a barrier to the emergence of dislocations at the metal surface. Finally, the Ni-20Cr alloy is tested in tension and creep between 25 and 825 C in oxidising or reducing atmospheres. (author)

  14. A major upgrade of the sediment echosounder ATLAS PARASOUND and the digital acquisition software ParaDigMA for high-resolution sea floor studies

    Science.gov (United States)

    Gerriets, A.; von Lom-Keil, H.; Spiess, V.; Zwanzig, C.; Bruns, R.

    2003-04-01

    The combination of the ATLAS PARASOUND sediment echosounder, designed by ATLAS Hydrographic, and the digital recording software package ParaDigMA (commercially available as ATLAS PARASTORE-3) for online digitisation, preprocessing and visualisation of recorded seismograms has proven to be a reliable system for high-resolution acoustic sea floor studies. During 10 years of successful operation aboard several research vessels, including R/V Meteor, R/V Sonne and R/V Polarstern, the system has been only slightly modified. Based on this experience, today's PARASOUND/ParaDigMA system has accomplished the step from DOS towards a Windows platform and network capability. In cooperation between ATLAS Hydrographic and the Department of Earth Sciences, University of Bremen, a major upgrade of the PARASOUND/ParaDigMA system has been developed that adds significant functionality for surveys of sediment structures and sea floor morphology. The innovations primarily concern the control section of the ATLAS PARASOUND echosounder and the ParaDigMA user front end. The previous analogue PARASOUND control terminal has been replaced by a small real-time control PC responsible for the control of the echosounder as well as for the continuous digitisation of the data. The control PC communicates metadata and data via standard network protocols with client applications that can display and store the acquired data on different computers on the network. The new network capabilities of the system overcome former limitations and allow a high degree of flexibility with respect to the numbers and locations of operator and recording/display PCs. The system now offers simultaneous parallel registration of the 2.5-5.5 kHz parametric signal and the 18 kHz NBS signal. This feature, in combination with the recording of complete soundings including the entire water column, provides the basis for evolving scientific research topics, e.g. gas venting. The ParaDigMA recording software now operates on Windows platforms which

  15. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  16. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

    The field of testability is an active, well-established part of engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or depreciate the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the softwa...

  17. Agile software assessment

    OpenAIRE

    Nierstrasz Oscar; Lungu Mircea

    2012-01-01

    Informed decision making is a critical activity in software development but it is poorly supported by common development environments which focus mainly on low level programming tasks. We posit the need for agile software assessment which aims to support decision making by enabling rapid and effective construction of software models and custom analyses. Agile software assessment entails gathering and exploiting the broader context of software information related to the system at hand as well ...

  18. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  19. Advancing the integration of hospital IT. Pitfalls and perspectives when replacing specialized software for high-risk environments with enterprise system extensions.

    Science.gov (United States)

    Engelmann, Carsten; Ametowobla, Dzifa

    2017-05-17

    Planning and controlling surgical operations hugely impacts upon productivity, patient safety, and surgeons' careers. Established, specialized software for this task is being increasingly replaced by "Operating Room (OR)-modules" appended to enterprise-wide resource planning (ERP) systems. As a result, usability problems are re-emerging and require developers' attention. Systematic evaluation of the functionality and social repercussions of a global, market-leading IT business control system (SAP R3, Germany), adapted for real-time OR process steering. Field study involving document analyses, interviews, and a 73-item survey addressed to 77 qualified (> 1-year system experience) senior planning executives (end users; "planners") working in surgical departments of university hospitals. Planners reported that 57% of electronic operation requests contained contradictory information. Key screens contained clinically irrelevant areas (36 +/- 29%). Compared to the legacy system, users reported either no improvements or worse performance, in regard to co-ordination of OR stakeholders, intra-day program changes, and safety. Planners concluded that the ERP-planning module was "non-intuitive" (66%), increased planning work (56%, p=0.002), and did not impact upon either organizational mishap spectrum or frequency. Interviews evidenced intra-institutional power shifts due to increased system complexity. Planners resented e.g. a trend towards increased personal culpability for mishap. Highly complex enterprise system extensions may not be directly suited to specific process steering tasks in a high risk/low error-environment like the OR. In view of surgeons' high primary task load, the repeated call for simpler IT is an imperative for ERP extensions. System design should consider a) that current OR IT suffers from an input limitation regarding planning-relevant real-time data, and b) that there are social processes that strongly affect planning and particularly ERP use beyond

  20. Application of Formal Methods in Software Engineering

    Directory of Open Access Journals (Sweden)

    Adriana Morales

    2011-12-01

    The purpose of this research work is to examine: (1) why formal methods are necessary for software systems today, (2) high-integrity systems built with the C-by-C (Correctness-by-Construction) methodology, and (3) an affordable methodology for applying formal methods in software engineering. The research process included reviews of the literature on the Internet and of publications and presentations at events. Among the research results: (1) nations, companies, and people depend increasingly on software systems, (2) there is a growing demand on software engineering to increase social trust in software systems, (3) methodologies exist, such as C-by-C, that can provide that level of trust, (4) formal methods constitute a principle of computer science that can be applied in software engineering to achieve reliable software development processes, (5) software users have the responsibility to demand reliable software products, and (6) software engineers have the responsibility to develop reliable software products. Furthermore, it is concluded that: (1) more research is needed to identify and analyze other methodologies and tools that provide processes for applying formal software engineering methods, (2) formal methods provide an unprecedented ability to increase trust in the correctness of software products, and (3) as new methodologies and tools are developed, cost is no longer a disadvantage for the application of formal methods.

  1. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster...... the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities...... of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business...

  2. A reflection on Software Engineering in HEP

    International Nuclear Information System (INIS)

    Carminati, Federico

    2012-01-01

    High Energy Physics (HEP) has been making very extensive usage of computers to achieve its research goals. Fairly large program suites have been developed, maintained and used over the years and it is fair to say that, overall, HEP has been successful in software development. Yet, HEP software development has not used classical Software Engineering techniques, which have been invented and refined to help the production of large programmes. In this paper we will review the development of HEP code with its strengths and weaknesses. Using several well-known HEP software projects as examples, we will try to demonstrate that our community has used a form of Software Engineering, albeit in an informal manner. The software development techniques employed in these projects are indeed very close in many aspects to the modern tendencies of Software Engineering itself, in particular the so-called “agile technologies”. The paper will conclude with an outlook on the future of software development in HEP.

  3. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research and collect, compile, and analyze SQA metrics that have been used in other projects and are not currently being used by the SA team, and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.
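
    A minimal example of the kind of product metric discussed above is defect density, sketched below with hypothetical module names and counts; it is an illustration, not a NASA SQA procedure.

```python
# Hypothetical defect-density metric: defects per thousand source lines (KLOC) per module,
# used to flag the modules contributing most of the defects.
defects = {"guidance": 12, "telemetry": 3, "ui": 21}    # assumed defect counts
kloc = {"guidance": 8.0, "telemetry": 6.5, "ui": 4.2}   # assumed module sizes in KLOC

density = {module: defects[module] / kloc[module] for module in defects}
for module, d in sorted(density.items(), key=lambda item: -item[1]):
    print(f"{module:10s} {d:5.2f} defects/KLOC")
```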

  4. The ATLAS Software Installation System v2: a highly available system to install and validate Grid and Cloud sites via Panda

    CERN Document Server

    De Salvo, Alessandro; The ATLAS collaboration; Sanchez, Arturo; Smirnov, Yuri

    2015-01-01

    The ATLAS Installation System v2 is the evolution of the original system, used since 2003. The original tool has been completely re-designed in terms of database backend and components, adding support for submission to multiple backends, including the original WMS and the new Panda modules. The database engine has been changed from plain MySQL to Galera/Percona and the table structure has been optimized to allow a full High-Availability (HA) solution over WAN. The servlets, running on each frontend, have been also decoupled from local settings, to allow an easy scalability of the system, including the possibility of an HA system with multiple sites. The clients can also be run in multiple copies and in different geographical locations, and take care of sending the installation and validation jobs to the target Grid or Cloud sites. Moreover, the Installation DB is used as source of parameters by the automatic agents running in CVMFS, in order to install the software and distribute it to the sites. The system i...

  5. The virtual double-slit experiment at the high school level (Part I: classical behavior analysis (with bullets and waves) and development of computational software)

    Directory of Open Access Journals (Sweden)

    Danilo Cardoso Ferreira

    2016-09-01

    http://dx.doi.org/10.5007/2175-7941.2016v33n2p697 This paper analyses the virtual double-slit experiment and is composed of two parts: Part I covers the classical theory (with bullets and waves) and Part II covers interference with electrons or photons. First, we analyze an experiment that shoots a stream of bullets. In front of the gun is a wall with two holes, each just big enough to let a bullet through. Beyond the wall is a backstop (say, a thick wall of wood) which absorbs the bullets when they hit it. In this case, the probabilities simply add together: the effect with both holes open is the sum of the effects with each hole open alone. We present this at the high school level. Next, we analyze the same experiment with water waves. The intensity observed when both holes are open is certainly not the sum of the intensity of the wave from hole 1 (found by measuring when hole 2 is blocked) and the intensity of the wave from hole 2 (seen when hole 1 is blocked). Finally, we present software developed by students for the double-slit experiment with bullets.
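
    The contrast between the two classical cases can be reproduced numerically, as in the brief sketch below; the geometry and amplitudes are arbitrary illustrations, not the authors' software.

```python
# Bullets versus waves at two holes: single-hole probabilities add, single-hole
# intensities do not, because the wave amplitudes carry an interference cross term.
import numpy as np

x = np.linspace(-5.0, 5.0, 201)        # positions along the backstop (arbitrary units)
d, k, L = 1.0, 3.0, 5.0                # hole half-separation, wavenumber, wall-to-backstop distance

# Bullets: illustrative single-hole hit distributions; the two-hole result is their sum.
p1 = np.exp(-(x - d) ** 2)
p2 = np.exp(-(x + d) ** 2)
p12 = p1 + p2                          # no interference term for classical particles

# Waves: add the complex amplitudes from each hole, then take the squared magnitude.
r1 = np.sqrt((x - d) ** 2 + L ** 2)
r2 = np.sqrt((x + d) ** 2 + L ** 2)
h1, h2 = np.exp(1j * k * r1) / r1, np.exp(1j * k * r2) / r2
i12 = np.abs(h1 + h2) ** 2
cross = 2 * np.real(h1 * np.conj(h2))

print(np.allclose(i12, np.abs(h1) ** 2 + np.abs(h2) ** 2 + cross))  # True: cross term present
print(np.allclose(i12, np.abs(h1) ** 2 + np.abs(h2) ** 2))          # False: intensities alone do not add
```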

  6. SSR_pipeline--computer software for the identification of microsatellite sequences from paired-end Illumina high-throughput DNA sequence data

    Science.gov (United States)

    Miller, Mark P.; Knaus, Brian J.; Mullins, Thomas D.; Haig, Susan M.

    2013-01-01

    SSR_pipeline is a flexible set of programs designed to efficiently identify simple sequence repeats (SSRs; for example, microsatellites) from paired-end high-throughput Illumina DNA sequencing data. The program suite contains three analysis modules along with a fourth control module that can be used to automate analyses of large volumes of data. The modules are used to (1) identify the subset of paired-end sequences that pass quality standards, (2) align paired-end reads into a single composite DNA sequence, and (3) identify sequences that possess microsatellites conforming to user specified parameters. Each of the three separate analysis modules also can be used independently to provide greater flexibility or to work with FASTQ or FASTA files generated from other sequencing platforms (Roche 454, Ion Torrent, etc). All modules are implemented in the Python programming language and can therefore be used from nearly any computer operating system (Linux, Macintosh, Windows). The program suite relies on a compiled Python extension module to perform paired-end alignments. Instructions for compiling the extension from source code are provided in the documentation. Users who do not have Python installed on their computers or who do not have the ability to compile software also may choose to download packaged executable files. These files include all Python scripts, a copy of the compiled extension module, and a minimal installation of Python in a single binary executable. See program documentation for more information.
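
    The repeat-finding step (module 3) can be approximated in a few lines of Python, as sketched below; the motif-length range and minimum repeat count are arbitrary assumptions, and the code is not part of SSR_pipeline.

```python
# Minimal SSR (microsatellite) finder: report perfect tandem repeats of 2-6 bp motifs
# occurring at least `min_repeats` times in a DNA sequence (illustrative sketch only).
import re

def find_ssrs(sequence, min_motif=2, max_motif=6, min_repeats=5):
    pattern = re.compile(r"([ACGT]{%d,%d}?)\1{%d,}" % (min_motif, max_motif, min_repeats - 1))
    hits = []
    for m in pattern.finditer(sequence.upper()):
        motif = m.group(1)
        hits.append((m.start(), motif, len(m.group(0)) // len(motif)))  # (position, motif, copies)
    return hits

print(find_ssrs("TTACACACACACACGGATCGATCGATCGATCGATCGAA"))
# [(2, 'AC', 6), (15, 'GATC', 5)]
```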

  7. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  8. Software testing in roughness calculation

    International Nuclear Information System (INIS)

    Chen, Y L; Hsieh, P F; Fu, W E

    2005-01-01

    A test method to determine the function quality provided by software for roughness measurement is presented in this study. The function quality of the software requirements should be part of, and assessed through, the entire life cycle of the software package. The specific function, or output accuracy, is crucial for the analysis of the experimental data. For scientific applications, however, commercial software is usually embedded within a specific instrument, which is used for measurement or analysis during the manufacturing process. In general, the error ratio caused by the software is more apparent when dealing with relatively small quantities, like measurements in the nanometer-scale range. The model of 'using a data generator' proposed by NPL of the UK was applied in this study. An example of roughness software is tested and analyzed by the above-mentioned process. After selecting the 'reference results', the 'reference data' were generated by a programmable 'data generator'. The filter function with a 0.8 mm cutoff value, defined in ISO 11562, was tested with 66 sinusoids at different wavelengths. Test results from the commercial software and a CMS-written program were compared to the theoretical data calculated from the ISO standards. For the filter function in this software, the results showed a significant disagreement between the reference and test results. The short-cutoff feature for filtering at high frequencies does not function properly, while the long-cutoff feature has the maximum difference in the filtering ratio, more than 70% between wavelengths of 300 μm and 500 μm. In conclusion, the commercial software needs to be tested more extensively for specific applications by appropriate design of reference datasets to ensure its function quality
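
    For a sinusoidal test profile, the nominal attenuation of the ISO 11562 Gaussian filter can be computed directly, which is the kind of reference value the test above compares against. The sketch below is a reference calculation, not the software under test.

```python
# Nominal amplitude transmission of the ISO 11562 Gaussian profile filter for a sinusoid
# of a given wavelength, with the standard 0.8 mm cutoff used above.
import math

ALPHA = math.sqrt(math.log(2.0) / math.pi)   # ~0.4697, as defined in ISO 11562

def gaussian_filter_transmission(wavelength_mm, cutoff_mm=0.8):
    mean_line = math.exp(-math.pi * (ALPHA * cutoff_mm / wavelength_mm) ** 2)
    return mean_line, 1.0 - mean_line        # (mean-line, roughness-profile) transmission

for wavelength in (0.3, 0.5, 0.8, 2.5):      # wavelengths in mm
    m, r = gaussian_filter_transmission(wavelength)
    print(f"{wavelength} mm: mean line {m:.3f}, roughness {r:.3f}")
# At the cutoff wavelength (0.8 mm) both transmissions equal 50%, as the standard requires.
```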

  9. Software for the LHCb experiment

    CERN Document Server

    Corti, Gloria; Belyaev, Ivan; Cattaneo, Marco; Charpentier, Philippe; Frank, Markus; Koppenburg, Patrick; Mato-Vila, P; Ranjard, Florence; Roiser, Stefan

    2006-01-01

    LHCb is an experiment for precision measurements of CP-violation and rare decays in B mesons at the LHC collider at CERN. The LHCb software development strategy follows an architecture-centric approach as a way of creating a resilient software framework that can withstand changes in requirements and technology over the expected long lifetime of the experiment. The software architecture, called GAUDI, supports event data processing applications that run in different processing environments ranging from the real-time high-level triggers in the online system to the final physics analysis performed by more than one hundred physicists. The major architectural design choices and the arguments that lead to these choices will be outlined. Object oriented technologies have been used throughout. Initially developed for the LHCb experiment, GAUDI has been adopted and extended by other experiments. Several iterations of the GAUDI software framework have been released and are now being used routinely by the physicists of...

  10. Development of Farm Records Software

    Directory of Open Access Journals (Sweden)

    M. S. Abubakar

    2017-12-01

    Full Text Available Farm records are mostly kept manually in paper notebooks and folders, where similar records are organized in one folder or spreadsheet. These records are usually kept for many years, so they become bulky and less organized. Consequently, they become difficult to search and update, and tedious and time-consuming to manage. This study was carried out to overcome the problems associated with manual farm record keeping by developing user-friendly, easily accessible, reliable and secure software. The software was limited to record keeping in crop production, livestock production, poultry production, employees, income and expenditure. The system was implemented using JavaServer Faces (JSF) for designing the graphical user interface (GUI), Enterprise JavaBeans (EJB) for the logic tier, and a MySQL database for storing farm records.
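    As a rough illustration of the kind of relational schema such a tool relies on, the sketch below sets up a few of the record categories listed above. The table and column names are hypothetical, and SQLite stands in for the MySQL store used in the actual system.

      # Hypothetical sketch of a farm-records schema.  Table and column names are
      # invented; SQLite stands in here for the MySQL database used by the system.
      import sqlite3

      conn = sqlite3.connect("farm_records.db")
      conn.executescript("""
      CREATE TABLE IF NOT EXISTS crop_production (
          id INTEGER PRIMARY KEY, season TEXT, crop TEXT, area_ha REAL, yield_kg REAL);
      CREATE TABLE IF NOT EXISTS livestock (
          id INTEGER PRIMARY KEY, species TEXT, head_count INTEGER, notes TEXT);
      CREATE TABLE IF NOT EXISTS transactions (   -- income and expenditure
          id INTEGER PRIMARY KEY, date TEXT,
          kind TEXT CHECK (kind IN ('income', 'expense')),
          description TEXT, amount REAL);
      """)
      conn.execute("INSERT INTO transactions (date, kind, description, amount) "
                   "VALUES (?, ?, ?, ?)", ("2017-11-02", "expense", "fertilizer", 150.0))
      conn.commit()

      # Queries replace leafing through paper folders: totals per category, searches, etc.
      print(conn.execute("SELECT kind, SUM(amount) FROM transactions GROUP BY kind").fetchall())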

  11. ACTS: from ATLAS software towards a common track reconstruction software

    Science.gov (United States)

    Gumpert, C.; Salzburger, A.; Kiehn, M.; Hrdinka, J.; Calace, N.; ATLAS Collaboration

    2017-10-01

    Reconstruction of charged particles’ trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is developed with special emphasis on thread-safety to support parallel execution of the code and data structures are optimised for vectorisation to speed up linear algebra operations. The implementation is agnostic to the details of the detection technologies and magnetic field configuration which makes it applicable to many different experiments.

  12. A maternal high-fat, high-sucrose diet has sex-specific effects on fetal glucocorticoids with little consequence for offspring metabolism and voluntary locomotor activity in mice.

    Directory of Open Access Journals (Sweden)

    Eunice H Chin

    Full Text Available Maternal overnutrition and obesity during pregnancy can have long-term effects on offspring physiology and behaviour. These developmental programming effects may be mediated by fetal exposure to glucocorticoids, which is regulated in part by placental 11β-hydroxysteroid dehydrogenase (11β-HSD type 1 and 2. We tested whether a maternal high-fat, high-sucrose diet would alter expression of placental 11β-HSD1 and 2, thereby increasing fetal exposure to maternal glucocorticoids, with downstream effects on offspring physiology and behaviour. C57BL/6J mice were fed a high-fat, high-sucrose (HFHS diet or a nutrient-matched low-fat, no-sucrose control diet prior to and during pregnancy and lactation. At day 17 of gestation, HFHS dams had ~20% lower circulating corticosterone levels than controls. Furthermore, there was a significant interaction between maternal diet and fetal sex for circulating corticosterone levels in the fetuses, whereby HFHS males tended to have higher corticosterone than control males, with no effect in female fetuses. However, placental 11β-HSD1 or 11β-HSD2 expression did not differ between diets or show an interaction between diet and sex. To assess potential long-term consequences of this sex-specific effect on fetal corticosterone, we studied locomotor activity and metabolic traits in adult offspring. Despite a sex-specific effect of maternal diet on fetal glucocorticoids, there was little evidence of sex-specific effects on offspring physiology or behaviour, although HFHS offspring of both sexes had higher circulating corticosterone at 9 weeks of age. Our results suggest the existence of as yet unknown mechanisms that mitigate the effects of altered glucocorticoid exposure early in development, making offspring resilient to the potentially negative effects of a HFHS maternal diet.

  13. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  14. Spotting software errors sooner

    International Nuclear Information System (INIS)

    Munro, D.

    1989-01-01

    Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)

  15. Avionics and Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the AES Avionics and Software (A&S) project is to develop a reference avionics and software architecture that is based on standards and that can be...

  16. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square-feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  17. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets......, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation......, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  18. Process mining software repositories

    NARCIS (Netherlands)

    Poncin, W.; Serebrenik, A.; Brand, van den M.G.J.

    2011-01-01

    Software developers' activities are in general recorded in software repositories such as version control systems, bug trackers and mail archives. While abundant information is usually present in such repositories, successful information extraction is often challenged by the necessity to

  19. Optimization of Antivirus Software

    OpenAIRE

    Catalin BOJA; Adrian VISOIU

    2007-01-01

    The paper describes the main techniques used in the development of antivirus software applications. For this particular category of software, optimum criteria are identified and defined to help determine which solution is better and what the objectives of the optimization process are. Methods and techniques applied at the code development level are presented from the general viewpoint of software optimization. Regarding the particularities of antivirus software, the paper analyzes...

  20. Open Source Software Development

    Science.gov (United States)

    2011-01-01

    Open Source Software Development, by Walt Scacchi, Institute for Software Research, University of California, Irvine, CA 92697-3455, USA. Only fragments of the abstract survive; they note that it is sometimes appropriate to refer to FOSS or FLOSS (L for Libre, where the alternative term "libre software" has popularity in some parts of the world), and cite: Applying Social Network Analysis to Community-Driven Libre Software Projects, Intern. J. Info. Tech. and Web Engineering, 2006, 1(3), 27-28.

  1. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  2. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concerns in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and qualit

  3. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic...... impact in software production. As is also apparent from the call for papers these perspectives focus on qualities such as reuse, adaptability, and maintainability....

  4. Knowledge-Based Software Management

    International Nuclear Information System (INIS)

    Sally Schaffner; Matthew Bickley; Brian Bevins; Leon Clancy; Karen White

    2003-01-01

    Management of software in a dynamic environment such as is found at Jefferson Lab can be a daunting task. Software development tasks are distributed over a wide range of people with varying skill levels. The machine configuration is constantly changing, requiring upgrades to software at both the hardware control level and the operator control level. In order to obtain high-quality support from vendor service agreements, which is vital to maintaining 24/7 operations, hardware and software must be kept at industry's current levels. This means that periodic upgrades independent of machine configuration changes must take place. It is often difficult to identify and organize the information needed to guide the process of development, upgrades and enhancements. Dependencies between support software and applications need to be consistently identified to prevent introducing errors during upgrades and to allow adequate testing to be planned and performed. Developers also need access to information regarding compilers, makefiles and organized distribution directories. This paper describes a system under development at Jefferson Lab which will provide software developers and managers this type of information in a timely, user-friendly fashion. The current status and future plans for the system will be detailed.
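    A toy sketch of the dependency bookkeeping such a system maintains is shown below: it records which applications depend on which support packages so that the impact of an upgrade can be listed before it happens. The application names, package names, versions and API are invented for illustration.

      # Toy sketch of dependency bookkeeping: record which applications depend on
      # which support packages, then ask what must be retested before an upgrade.
      # All application and package names here are invented.
      DEPENDS_ON = {
          "beam_orbit_display": {"control_lib": "2.4", "plot_toolkit": "1.1"},
          "rf_monitor":         {"control_lib": "2.4"},
          "archiver_viewer":    {"plot_toolkit": "1.0"},
      }

      def impacted_by(package):
          """Applications that must be rebuilt and retested when `package` is upgraded."""
          return sorted(app for app, deps in DEPENDS_ON.items() if package in deps)

      print(impacted_by("control_lib"))   # ['beam_orbit_display', 'rf_monitor']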

  5. Testing Object-Oriented Software

    DEFF Research Database (Denmark)

    Caspersen, Michael Edelgaard; Madsen, Ole Lehrmann; Skov, Stefan H.

    The report is a result of an activity within the project Centre for Object Technology (COT), case 2. In case 2 a number of pilot projects have been carried out to test the feasibility of using object technology within embedded software. Some of the pilot projects have resulted in prototypes that are currently being developed into production versions. To assure a high quality in the product it was decided to carry out an activity regarding issues in testing OO software. The purpose of this report is to discuss the issues of testing object-oriented software. It is often claimed that testing of OO software is radically different from testing traditional software developed using imperative/procedural programming. Other authors claim that there is no difference. In this report we will attempt to give an answer to these questions (or at least initiate a discussion).

  6. BNL multiparticle spectrometer software

    International Nuclear Information System (INIS)

    Saulys, A.C.

    1984-01-01

    This paper discusses some solutions to problems common to the design, management and maintenance of a large high energy physics spectrometer software system. The experience of dealing with a large, complex program and the necessity of having the program controlled by various people at different levels of computer experience has led us to design a program control structure of mnemonic and self-explanatory nature. The use of this control language in both on-line and off-line operation of the program will be discussed. The solution of structuring a large program for modularity so that substantial changes to the program can be made easily for a wide variety of high energy physics experiments is discussed. Specialized tools for this type of large program management are also discussed

  7. Fostering successful scientific software communities

    Science.gov (United States)

    Bangerth, W.; Heister, T.; Hwang, L.; Kellogg, L. H.

    2016-12-01

    Developing sustainable open source software packages for the sciences appears at first to be primarily a technical challenge: How can one create stable and robust algorithms, appropriate software designs, sufficient documentation, quality assurance strategies such as continuous integration and test suites, or backward compatibility approaches that yield high-quality software usable not only by the authors, but also the broader community of scientists? However, our experience from almost two decades of leading the development of the deal.II software library (http://www.dealii.org, a widely-used finite element package) and the ASPECT code (http://aspect.dealii.org, used to simulate convection in the Earth's mantle) has taught us that technical aspects are not the most difficult ones in scientific open source software. Rather, it is the social challenge of building and maintaining a community of users and developers interested in answering questions on user forums, contributing code, and jointly finding solutions to common technical and non-technical challenges. These problems are posed in an environment where project leaders typically have no resources to reward the majority of contributors, where very few people are specifically paid for the work they do on the project, and with frequent turnover of contributors as project members rotate into and out of jobs. In particular, much software work is done by graduate students who may become fluent enough in a software only a year or two before they leave academia. We will discuss strategies we have found do and do not work in maintaining and growing communities around the scientific software projects we lead. Specifically, we will discuss the management style necessary to keep contributors engaged, ways to give credit where credit is due, and structuring documentation to decrease reliance on forums and thereby allow user communities to grow without straining those who answer questions.

  8. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  9. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  10. XES Software Communication Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  11. Neutron Scattering Software

    Science.gov (United States)

    A new portal for neutron scattering software has just been established. Packages listed include KUPLOT (data plotting and fitting software) and ILL/TAS (Matlab programs for analyzing triple-axis data).

  12. XES Software Event Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  13. ARC Software and Models

    Science.gov (United States)

    Most research conducted at the ARC produces software code and methodologies that are transferred to TARDEC and industry partners. These

  14. XES Software Telemetry Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  15. Specifications in software prototyping

    OpenAIRE

    Luqi; Chang, Carl K.; Zhu, Hong

    1998-01-01

    We explore the use of software specifications for software prototyping. This paper describes a process model for software prototyping, and shows how specifications can be used to support such a process via a cellular mobile phone switch example.

  16. Software Engineering for Portability.

    Science.gov (United States)

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programing; and suggestions for further research and…

  17. Physics detector simulation facility system software description

    International Nuclear Information System (INIS)

    Allen, J.; Chang, C.; Estep, P.; Huang, J.; Liu, J.; Marquez, M.; Mestad, S.; Pan, J.; Traversat, B.

    1991-12-01

    Large and costly detectors will be constructed during the next few years to study the interactions produced by the SSC. Efficient, cost-effective designs for these detectors will require careful thought and planning. Because it is not possible to fully test a proposed design in a scaled-down version, the adequacy of a proposed design will be determined by a detailed computer model of the detectors. Physics and detector simulations will be performed on the computer model using the high-powered computing system at the Physics Detector Simulation Facility (PDSF). The SSCL has particular computing requirements for high-energy physics (HEP) Monte Carlo calculations for the simulation of SSCL physics and detectors. The numerical calculations to be performed in each simulation are lengthy and detailed; they could require many months per run on a VAX 11/780 computer and may produce several gigabytes of data per run. Consequently, a distributed computing environment of several networked high-speed computing engines is envisioned to meet these needs. These networked computers will form the basis of a centralized facility for SSCL physics and detector simulation work. Our computer planning groups have determined that the most efficient, cost-effective way to provide these high-performance computing resources at this time is with RISC-based UNIX workstations. The modeling and simulation application software that will run on the computing system is usually written by physicists in FORTRAN and may need thousands of hours of supercomputing time. The system software is the "glue" which integrates the distributed workstations and allows them to be managed as a single entity. This report addresses the computing strategy for the SSC.

  18. Software Acquisition and Software Engineering Best Practices

    National Research Council Canada - National Science Library

    Eslinger, S

    1999-01-01

    The purpose of this white paper is to address the issues raised in the recently published Senate Armed Services Committee Report 106-50 concerning Software Management Improvements for the Department of Defense (DoD...

  19. Presenting Software Metrics Indicators- A Case Study

    DEFF Research Database (Denmark)

    Pantazos, Kostas; Shollo, Arisa; Staron, Miroslaw

    2010-01-01

    Industrial measurement systems in software projects can generate a large number of indicators (main measurements). Having a large number of indicators might result in failing to present an overview of the status of measured entities. In consequence, managers might experience problems when making...

  20. GRACAT, Software for grounding and collision analysis

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Simonsen, Bo Cerup

    2002-01-01

    and grounding accidents. The software consists of three basic analysis modules and one risk mitigation module: 1) frequency, 2) damage, and 3) consequence. These modules can be used individually or in series and the analyses can be performed in deterministic or probabilistic mode. Finally, in the mitigation...
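    The modular structure described above can be pictured as a small pipeline. The sketch below chains hypothetical frequency, damage and consequence modules in either deterministic or probabilistic mode; the interfaces and numbers are invented and bear no relation to the actual GRACAT models.

      # Invented illustration of chaining frequency -> damage -> consequence modules
      # in deterministic or probabilistic mode; not the actual GRACAT models.
      import random

      def frequency_module(route_traffic):
          return 1e-4 * route_traffic                 # toy: expected events per year

      def damage_module(impact_energy):
          return 0.05 * impact_energy ** 0.5          # toy: damage extent from impact energy

      def consequence_module(damage_extent):
          return 2.0e6 * damage_extent                # toy: cost of a given damage extent

      def annual_risk(route_traffic, impact_energy, probabilistic=False, samples=1000):
          if not probabilistic:                       # deterministic mode: single pass
              return (frequency_module(route_traffic) *
                      consequence_module(damage_module(impact_energy)))
          # probabilistic mode: sample an uncertain input and average the outcomes
          draws = (impact_energy * random.lognormvariate(0.0, 0.3) for _ in range(samples))
          return sum(frequency_module(route_traffic) *
                     consequence_module(damage_module(e)) for e in draws) / samples

      print(annual_risk(5000, 40.0), annual_risk(5000, 40.0, probabilistic=True))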

  1. STEM - software test and evaluation methods. A study of failure dependency in diverse software

    International Nuclear Information System (INIS)

    Bishop, P.G.; Pullen, F.D.

    1989-02-01

    STEM is a collaborative software reliability project undertaken in partnership with Halden Reactor Project, UKAEA, and the Finnish Technical Research Centre. The objective of STEM is to evaluate a number of fault detection and fault estimation methods which can be applied to high integrity software. This Report presents a study of the observed failure dependencies between faults in diversely produced software. (author)

  2. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is specific property which tells what kind of standard software should have. In a software project, quality is the key factor of success and decline of software related organization. Many researches have been done regarding software quality. Software related organization follows standards introduced by Capability Maturity Model Integration (CMMI) to achieve good quality software. Quality is divided into three main layers which are Software Quality Assurance (SQA), Software Qu...

  3. BioXTAS RAW, a software program for high-throughput automated small-angle X-ray scattering data reduction and preliminary analysis

    DEFF Research Database (Denmark)

    Nielsen, S.S.; Toft, K.N.; Snakenborg, Detlef

    2009-01-01

    A fully open source software program for automated two-dimensional and one-dimensional data reduction and preliminary analysis of isotropic small-angle X-ray scattering (SAXS) data is presented. The program is freely distributed, following the open-source philosophy, and does not rely on any...... commercial software packages. BioXTAS RAW is a fully automated program that, via an online feature, reads raw two-dimensional SAXS detector output files and processes and plots data as the data files are created during measurement sessions. The software handles all steps in the data reduction. This includes...... mask creation, radial averaging, error bar calculation, artifact removal, normalization and q calibration. Further data reduction such as background subtraction and absolute intensity scaling is fast and easy via the graphical user interface. BioXTAS RAW also provides preliminary analysis of one...
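    One central step in any 2D-to-1D SAXS reduction of this kind is radial (azimuthal) averaging around the beam centre. The sketch below illustrates that single step only; the beam centre, mask handling and binning are placeholders, and this is not the BioXTAS RAW implementation.

      # Illustration of the radial-averaging step in a 2D -> 1D SAXS reduction.
      # Beam centre, mask and binning are placeholders; not the BioXTAS RAW code.
      import numpy as np

      def radial_average(image, center, mask=None, n_bins=200):
          """Average detector counts in annular bins around the beam centre."""
          ny, nx = image.shape
          y, x = np.indices((ny, nx))
          r = np.hypot(x - center[0], y - center[1])        # radius in pixels
          if mask is None:
              mask = np.ones_like(image, dtype=bool)        # True = keep pixel
          bins = np.linspace(0.0, r[mask].max(), n_bins + 1)
          idx = np.clip(np.digitize(r[mask], bins) - 1, 0, n_bins - 1)
          counts = np.bincount(idx, weights=image[mask], minlength=n_bins)
          npix = np.bincount(idx, minlength=n_bins)
          intensity = counts / np.maximum(npix, 1)
          radius = 0.5 * (bins[:-1] + bins[1:])             # mapped to q via the geometry
          return radius, intensity

      img = np.random.poisson(5, (256, 256)).astype(float)  # toy isotropic image
      r_px, i_of_r = radial_average(img, center=(128.0, 128.0))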

  4. Software and Computing News

    CERN Multimedia

    Barberis, D

    The last several months have been very busy ones for the ATLAS software developers. They've been trying to cope with the competing demands of multiple software stress tests and testbeds. These include Data Challenge Two (DC2), the Combined Testbeam (CTB), preparations for the Physics Workshop to be held in Rome in June 2005, and other testbeds, primarily one for the High-Level Trigger. Data Challenge 2 (DC2) The primary goal of this was to validate the computing model and to provide a test of simulating a day's worth of ATLAS data (10 million events) and of fully processing it and making it available to the physicists within 10 days (i.e. a 10% scale test). DC2 consists of three parts - the generation, simulation, and mixing of a representative sample of physics events with background events; the reconstruction of the mixed samples with initial classification into the different physics signatures; and the distribution of the data to multiple remote sites (Tier-1 centers) for analysis by physicists. Figu...

  5. Hardware and software for automating the process of studying high-speed gas flows in wind tunnels of short-term action

    Science.gov (United States)

    Yakovlev, V. V.; Shakirov, S. R.; Gilyov, V. M.; Shpak, S. I.

    2017-10-01

    In this paper, we propose an approach to building automation systems for aerodynamic experiments based on modern, domestically developed hardware and software. The structure of a universal control and data collection system for performing experiments in wind tunnels of continuous, periodic, or short-term action is proposed. The proposed hardware and software development tools from ICT SB RAS and ITAM SB RAS, as well as subsystems based on them, can be widely applied to scientific and experimental installations and to the automation of technological processes in production.

  6. Licensing safety critical software

    International Nuclear Information System (INIS)

    Archinoff, G.H.; Brown, R.A.

    1990-01-01

    Licensing difficulties with the shutdown-system software at the Darlington Nuclear Generating Station contributed to delays in starting up the station. Even though the station has now been given approval by the Atomic Energy Control Board (AECB) to operate, the software issue has not disappeared: Ontario Hydro has been instructed by the AECB to redesign the software. This article attempts to explain why software-based shutdown systems were chosen for Darlington, why there was so much difficulty licensing them, and what the implications are for other safety-related software-based applications.

  7. Software-aided approach to investigate peptide structure and metabolic susceptibility of amide bonds in peptide drugs based on high resolution mass spectrometry.

    Directory of Open Access Journals (Sweden)

    Tatiana Radchenko

    Full Text Available Interest in using peptide molecules as therapeutic agents, due to their high selectivity and efficacy, is increasing within the pharmaceutical industry. However, most peptide-derived drugs cannot be administered orally because of low bioavailability and instability in the gastrointestinal tract due to protease activity. Therefore, structural modifications of peptides are required to improve their stability. For this purpose, several in-silico software tools have been developed, such as PeptideCutter or PoPS, which aim to predict peptide cleavage sites for different proteases. Moreover, several databases exist where this information is collected and stored from public sources, such as the MEROPS and ExPASy ENZYME databases. These tools can help design a peptide drug with increased stability against proteolysis, though they are limited to natural amino acids or cannot process cyclic peptides, for example. We worked to develop a new methodology to analyze peptide structure and amide bond metabolic stability based on the peptide structure (linear/cyclic, natural/unnatural amino acids). This approach used liquid chromatography / high-resolution mass spectrometry to obtain the analytical data from in vitro incubations. We collected experimental data for a set (linear/cyclic, natural/unnatural amino acids) of fourteen peptide drugs and four substrate peptides incubated with different proteolytic media: trypsin, chymotrypsin, pepsin, pancreatic elastase, dipeptidyl peptidase-4 and neprilysin. Mass spectrometry data were analyzed to find metabolites and determine their structures, then all the results were stored in a chemically aware manner, which allows us to compute peptide bond susceptibility by using a frequency analysis of the metabolically labile bonds. In total, 132 metabolites were found from the various in vitro conditions tested, resulting in 77 distinct cleavage sites. The most frequently observed cleavage sites agreed with those reported in the literature. The
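    The frequency analysis of labile bonds can be illustrated with a toy example: given the stretch of the parent peptide that a metabolite still spans, each truncation end marks a cleaved amide bond, and counting those ends across all observed metabolites gives a per-bond susceptibility. The sequence and metabolite list below are invented; real input would come from the LC/HRMS metabolite assignments described above.

      # Toy illustration of per-bond cleavage frequency analysis.  The parent
      # sequence and observed metabolites are invented; real data would come from
      # the LC/HRMS metabolite assignments.
      from collections import Counter

      parent = "AGFLRK"                        # hypothetical parent peptide
      # each metabolite is represented by the span of parent residues it retains
      observed_metabolites = [(0, 3), (3, 6), (0, 4), (4, 6), (0, 3)]

      cleaved = Counter()
      for start, end in observed_metabolites:
          if start > 0:
              cleaved[start - 1] += 1          # bond between residues start-1 and start
          if end < len(parent):
              cleaved[end - 1] += 1            # bond between residues end-1 and end

      for bond, count in sorted(cleaved.items()):
          share = count / len(observed_metabolites)
          print(f"{parent[bond]}-{parent[bond + 1]} bond (position {bond + 1}): "
                f"cleaved in {share:.0%} of observed metabolites")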

  8. Pharmacovigilance: the devastating consequences of not thinking ...

    African Journals Online (AJOL)

    Pharmacovigilance: the devastating consequences of not thinking about adverse drug reactions: The burden of adverse drug reactions (ADRs) on patient care has been found to be high globally and is particularly high in South Africa.

  9. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process

  10. Software Validation in ATLAS

    International Nuclear Information System (INIS)

    Hodgkinson, Mark; Seuster, Rolf; Simmons, Brinick; Sherwood, Peter; Rousseau, David

    2012-01-01

    The ATLAS collaboration operates an extensive set of protocols to validate the quality of the offline software in a timely manner. This is essential in order to process the large amounts of data being collected by the ATLAS detector in 2011 without complications on the offline software side. We will discuss a number of different strategies used to validate the ATLAS offline software; running the ATLAS framework software, Athena, in a variety of configurations daily on each nightly build via the ATLAS Nightly System (ATN) and Run Time Tester (RTT) systems; the monitoring of these tests and checking the compilation of the software via distributed teams of rotating shifters; monitoring of and follow up on bug reports by the shifter teams and periodic software cleaning weeks to improve the quality of the offline software further.

  11. Software Design for Smile Analysis

    Directory of Open Access Journals (Sweden)

    A. Sarkhosh

    2010-12-01

    Full Text Available Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record the "posed smile" as an intentional, non-pressured, static, natural and reproducible smile. The record should then be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software "Smile Analysis", which can receive patients' photographs and videographs. After loading the records into the software, the operator marks the points and lines displayed in the system's guide and defines the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (= 0.7-1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the smile analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of treatment progress.
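    A minimal sketch of how one such variable could be derived from operator-marked points and the image scale is given below; the landmark names, the calibration object and the chosen measurements are hypothetical and are not taken from the Smile Analysis software itself.

      # Hypothetical sketch: deriving two smile measurements from operator-marked
      # landmarks and a calibration scale on the same image.  Landmark names and
      # the calibration object are invented, not taken from Smile Analysis.
      from math import dist                      # Python 3.8+

      landmarks = {                              # pixel coordinates clicked by the operator
          "right_commissure":  (212.0, 340.0),
          "left_commissure":   (418.0, 338.0),
          "upper_lip_midline": (315.0, 318.0),
          "lower_lip_midline": (316.0, 365.0),
      }
      ruler_px = dist((100.0, 50.0), (300.0, 50.0))   # ruler marked on the image: 200 px
      ruler_mm = 20.0                                 # its known physical length
      mm_per_px = ruler_mm / ruler_px

      smile_width_mm = mm_per_px * dist(landmarks["right_commissure"],
                                        landmarks["left_commissure"])
      smile_height_mm = mm_per_px * dist(landmarks["upper_lip_midline"],
                                         landmarks["lower_lip_midline"])
      print(f"smile width  = {smile_width_mm:.1f} mm")
      print(f"smile height = {smile_height_mm:.1f} mm")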

  12. Strategies for successful software development risk management

    Directory of Open Access Journals (Sweden)

    Marija Boban

    2003-01-01

    Full Text Available Nowadays, software is becoming a major part of enterprise business. Software development is an activity that involves advanced technology and a high level of knowledge. Risks on software development projects must be successfully mitigated to produce successful software systems. Lack of a defined approach to risk management is one of the common causes of project failure. To improve a project's chances of success, this work investigates common risk impact areas to establish a foundation that can be used to define a common approach to software risk management. Based on typical risk impact areas on software development projects, we propose three risk management strategies suitable for a broad range of enterprises and software development projects with different levels of associated risk. The proposed strategies define activities that should be performed for successful risk management, enabling software development projects to identify risks as early as possible and to solve problems connected with risk materialization. We also propose a risk-based approach to software development planning and risk management as an attempt to address and retire the highest-impact risks as early as possible in the development process. The proposed strategies should improve risk management on software development projects and help create a successful software solution.
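    The 'retire the highest-impact risks first' idea can be made concrete with the classic exposure = probability × impact ranking, sketched below; the register entries and numbers are invented for illustration.

      # Invented illustration of ranking a risk register by exposure = probability x impact,
      # so the highest-exposure risks are addressed in the earliest iterations.
      risk_register = [
          # (risk, probability 0-1, impact in person-weeks if it materializes)
          ("unstable user requirements",        0.6, 12),
          ("underestimated integration effort", 0.4, 20),
          ("key developer unavailable",         0.2,  8),
          ("third-party component deprecated",  0.1,  6),
      ]

      for name, p, impact in sorted(risk_register, key=lambda r: r[1] * r[2], reverse=True):
          print(f"{name:35s} exposure = {p * impact:4.1f} person-weeks")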

  13. Programming Language Software For Graphics Applications

    Science.gov (United States)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  14. Understanding Acceptance of Software Metrics--A Developer Perspective

    Science.gov (United States)

    Umarji, Medha

    2009-01-01

    Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…

  15. MATLAB Software Versions and Licenses for the Peregrine System |

    Science.gov (United States)

    Learn about the MATLAB software versions and licenses for the Peregrine system. The MATLAB version on Peregrine is R2017b. Licenses: MATLAB is proprietary software. As such, users have access to a limited number

  16. Antecedents and consequences of work engagement among nurses.

    Science.gov (United States)

    Sohrabizadeh, Sanaz; Sayfouri, Nasrin

    2014-11-01

    Engaged nurses have high levels of energy and are enthusiastic about their work which impacts quality of health care services. However, in the context of Iran, due to observed burnout, work engagement among nurses necessitates immediate exploration. This investigation aimed to identify a suitable work engagement model in nursing profession in hospitals according to the hypothesized model and to determine antecedents and consequences related to work engagement among nurses. In this cross-sectional study, a questionnaire was given to 279 randomly-selected nurses working in two general teaching hospitals of Shiraz University of Medical Sciences (Shiraz, Iran) to measure antecedents and consequences of work engagement using the Saks's (2005) model. Structural Equation Modeling was used to examine the model fitness. Two paths were added using LISREL software. The resulting model showed good fitness indices (χ² = 23.62, AGFI = 0.93, CFI = 0.97, RMSEA = 0.07) and all the coefficients of the paths were significant (t ≥ 2 or t ≤ -2). A significant correlation was found between work engagement and model variables. Paying adequate attention to the antecedents of work engagement can enhance the quality of performance among nurses. Additionally, rewards, organizational and supervisory supports, and job characteristics should be taken into consideration to establish work engagement among nurses. Further researches are required to identify other probable antecedents and consequences of nursing work engagement, which might be related to specific cultural settings.

  17. A Venture Capital Mixed Model for the Acquisition of Defense Software Products

    National Research Council Canada - National Science Library

    Botsakos, Michael T

    2007-01-01

    .... Cost and schedule overruns are the consequence of the software development models selected, inaccurate estimation of size, time, and cost, the instability of user requirements, and poor decision...

  18. Adolescent childbearing: consequences and interventions.

    Science.gov (United States)

    Ruedinger, Emily; Cox, Joanne E

    2012-08-01

    Adolescent childbearing in the United States continues to occur at high rates compared with other industrialized nations, despite a recent decline. Adolescent mothers and their offspring are at risk for negative outcomes. Recent literature exploring the consequences of teenage childbearing and interventions to ameliorate these consequences are presented. Negative consequences of adolescent childbearing can impact mothers and their offspring throughout the lifespan. These consequences are likely attributable to social and environmental factors rather than solely to maternal age. Increasing educational attainment, preventing repeat pregnancy and improving mother-child interactions can improve outcomes for mothers and their children. Home, community, school and clinic-based programs are all viable models of service delivery to this population. Connecting teen mothers with comprehensive services to meet their social, economic, health and educational needs can potentially improve long-term outcomes for both mothers and their offspring. Programs that deliver care to this population in culturally sensitive, developmentally appropriate ways have demonstrated success. Future investigation of parenting interventions with larger sample sizes and that assess multiple outcomes will allow comparison among programs. Explorations of the role of the father and coparenting are also directions for future research.

  19. TMT approach to observatory software development process

    Science.gov (United States)

    Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder

    2016-07-01

    effective communications; adopting an agile-based software development process across the observatory to enable frequent software releases to help mitigate subsystem interdependencies; defining concise scope and work packages for each of the OSW subsystems to facilitate effective outsourcing of software deliverables to the ITCC partner, and to enable performance monitoring and risk management. At this stage, the architecture and high-level design of the software system has been established and reviewed. During construction each subsystem will have a final design phase with reviews, followed by implementation and testing. The results of the TMT approach to the Observatory Software development process will only be preliminary at the time of the submittal of this paper, but it is anticipated that the early results will be a favorable indication of progress.

  20. Software and Network Engineering

    CERN Document Server

    2012-01-01

    The series "Studies in Computational Intelligence" (SCI) publishes new developments and advances in the various areas of computational intelligence – quickly and with a high quality. The intent is to cover the theory, applications, and design methods of computational intelligence, as embedded in the fields of engineering, computer science, physics and life science, as well as the methodologies behind them. The series contains monographs, lecture notes and edited volumes in computational intelligence spanning the areas of neural networks, connectionist systems, genetic algorithms, evolutionary computation, artificial intelligence, cellular automata, self-organizing systems, soft computing, fuzzy systems, and hybrid intelligent systems. Critical to both contributors and readers are the short publication time and world-wide distribution - this permits a rapid and broad dissemination of research results.   The purpose of the first ACIS International Symposium on Software and Network Engineering held on Decembe...