WorldWideScience

Sample records for catissue core software

  1. Core Flight Software

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of the AES Core Flight Software (CFS) project is to analyze applicability, and to evolve and extend the reusability, of the CFS system originally developed by...

  2. SpaceCube Core Software

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a flexible, modular and user-friendly SpaceCube Core Software system that will dramatically simplify SpaceCube application development and enable any...

  3. Open core control software for surgical robots.

    Science.gov (United States)

    Arata, Jumpei; Kozuka, Hiroaki; Kim, Hyung Wook; Takesue, Naoyuki; Vladimirov, B; Sakaguchi, Masamichi; Tokuda, Junichi; Hata, Nobuhiko; Chinzei, Kiyoyuki; Fujimoto, Hideo

    2010-05-01

    Patients and doctors in today's operating rooms are surrounded by many medical devices as a result of recent advances in medical technology. However, these cutting-edge devices work independently and do not collaborate with each other, even though collaboration between devices such as navigation systems and medical imaging devices is becoming very important for accomplishing complex surgical tasks (for example, removing a tumor while checking its location in neurosurgery). Several surgical robots have been commercialized and are becoming common, but they are not currently open to collaboration with external medical devices. A cutting-edge "intelligent surgical robot" would become possible through collaboration between surgical robots, various kinds of sensors, navigation systems and so on. At the same time, most academic control software for surgical robots is "home-made" within individual research institutions and not open to the public. Open source control software for surgical robots can therefore be beneficial in this field. From these perspectives, we developed the Open Core Control software for surgical robots to overcome these challenges. In general, control software has hardware dependencies arising from actuators, sensors and various internal devices, and therefore cannot be used on different types of robots without modification. The structure of the Open Core Control software, however, can be reused for various types of robots by abstracting the hardware-dependent parts. In addition, network connectivity is crucial for collaboration between advanced medical devices. OpenIGTLink is adopted in the Interface class, which plays the role of communicating with external medical devices. At the same time, it is essential to maintain stable operation despite asynchronous data transactions over the network. In the Open Core Control software, several ...
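
    The hardware-abstraction idea described in this record can be sketched as follows. This is a minimal illustration in Python; the actual Open Core Control software is robot control middleware, and apart from the Interface concept the record gives no class names, so every name below is hypothetical. The point is that the control loop depends only on an abstract hardware interface, and only the concrete driver changes between robot types.

```python
from abc import ABC, abstractmethod

class RobotHardware(ABC):
    """Abstract layer hiding actuator/sensor specifics (hypothetical API)."""

    @abstractmethod
    def read_joint_angles(self) -> list:
        ...

    @abstractmethod
    def command_joint_torques(self, torques) -> None:
        ...

class SurgicalArmDriver(RobotHardware):
    """Concrete driver for one robot; only this class is hardware specific."""

    def read_joint_angles(self):
        return [0.0, 0.0, 0.0]              # would query the encoders here

    def command_joint_torques(self, torques):
        pass                                # would write to the motor amplifiers here

class Controller:
    """Hardware-independent control loop, reusable across robot types."""

    def __init__(self, hw: RobotHardware, gain: float = 5.0):
        self.hw, self.gain = hw, gain

    def step(self, target):
        q = self.hw.read_joint_angles()
        torques = [self.gain * (t - qi) for t, qi in zip(target, q)]
        self.hw.command_joint_torques(torques)

Controller(SurgicalArmDriver()).step([0.1, 0.2, 0.3])   # swap the driver, keep the controller
```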

  4. Core software security security at the source

    CERN Document Server

    Ransome, James

    2013-01-01

    First and foremost, Ransome and Misra have made an engaging book that will empower readers in both large and small software development and engineering organizations to build security into their products. This book clarifies to executives the decisions to be made on software security and then provides guidance to managers and developers on process and procedure. Readers are armed with firm solutions for the fight against cyber threats. - Dr. Dena Haritos Tsamitis, Carnegie Mellon University. In the wake of cloud computing and mobile apps, the issue of software security has never been more important.

  5. Core Flight Executive Software Radiation Mitigation Study

    Data.gov (United States)

    National Aeronautics and Space Administration — The reliability of SmallSat / CubeSat missions may be increased by using software radiation mitigation for single event upsets (SEUs). Implementing protection in...

  6. Core Flight Software (CFS) Maturation Towards Human Rating

    Data.gov (United States)

    National Aeronautics and Space Administration — The research performed under this proposal will assess the applicability of the Core Flight Software (CFS) within human-rated type architectures by prototyping and...

  7. Cronos 2: a neutronic simulation software for reactor core calculations

    International Nuclear Information System (INIS)

    Lautard, J.J.; Magnaud, C.; Moreau, F.; Baudron, A.M.

    1999-01-01

    The CRONOS2 software is the part of the SAPHYR code system dedicated to neutronic core calculations. CRONOS2 is a powerful tool for reactor design, fuel management and safety studies. Its modular structure and great flexibility make CRONOS2 a unique simulation tool for research and development on a wide variety of reactor systems. CRONOS2 is a versatile tool that covers a large range of applications, from the very fast calculations used in training simulators to the time- and memory-consuming reference calculations needed to understand complex physical phenomena. CRONOS2 has a procedure library named CPROC that allows users to create their own application environment fitted to a specific industrial use. (authors)

  8. Automated software analysis of nuclear core discharge data

    International Nuclear Information System (INIS)

    Larson, T.W.; Halbig, J.K.; Howell, J.A.; Eccleston, G.W.; Klosterbuer, S.F.

    1993-03-01

    Monitoring the fueling process of an on-load nuclear reactor is a full-time job for nuclear safeguards agencies. Nuclear core discharge monitors (CDMs) can provide continuous, unattended recording of the reactor's fueling activity for later, qualitative review by a safeguards inspector. A quantitative analysis of the collected data could prove to be a great asset to inspectors, because more information can be extracted from the data and the analysis time can be reduced considerably. This paper presents a prototype of an automated software analysis system capable of identifying when fuel bundle pushes occurred and of monitoring the power level of the reactor. Neural network models were developed for calculating the region of the reactor face from which the fuel was discharged and for predicting the burnup. These models were created and tested using actual data collected from a CDM system at an on-load reactor facility. Collectively, these automated quantitative analysis programs could help safeguards agencies gain a better perspective on the complete picture of the fueling activity of an on-load nuclear reactor. Such a system can provide a cost-effective solution for automated monitoring of on-load reactors, significantly reducing time and effort.
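
    As a rough illustration of the kind of quantitative analysis described above (the record's own models are neural networks trained on CDM data; the simple threshold rule and all numbers below are invented for the sketch), a fuel-bundle push can be flagged wherever the detector count rate jumps well above its trailing baseline:

```python
import statistics

def detect_pushes(count_rate, window=20, k=5.0):
    """Flag sample indices where the count rate jumps k sigma above a trailing baseline.

    count_rate : detector counts per time step (toy input)
    window     : number of preceding samples used as the baseline
    k          : threshold in baseline standard deviations
    """
    events = []
    for i in range(window, len(count_rate)):
        baseline = count_rate[i - window:i]
        mu = statistics.mean(baseline)
        sigma = statistics.pstdev(baseline) or 1.0   # avoid zero sigma on a flat baseline
        if count_rate[i] > mu + k * sigma:
            events.append(i)
    return events

# Toy trace: quiet background with two simulated fuel-bundle pushes.
trace = [100] * 50 + [400] * 3 + [100] * 50 + [380] * 3 + [100] * 20
print(detect_pushes(trace))
```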

  9. KNGR core protection calculator software verification and validation plan

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Park, Jong Kyun; Lee, Ki Young; Lee, Jang Soo; Cheon, Se Woo

    2001-05-01

    This document describes the Software Verification and Validation Plan (SVVP) guidance to be used in reviewing the Software Program Manual (SPM) in Korean Next Generation Reactor (KNGR) projects. It is intended for a verifier or reviewer who is involved in performing software verification and validation activities in KNGR projects. The document includes the basic philosophy, the performance of the V and V effort, software testing techniques, and criteria for review and audit of safety software V and V activities. Based on Standard Review Plan (SRP) Chapter 7, Branch Technical Position (BTP)-14, the major review topics on safety software address three kinds of characteristics when reviewing the SVVP: management characteristics, implementation characteristics and resource characteristics. Based on the major topics of this document, we have produced a list of evaluation items, such as the checklist in Appendix A.

  10. Core design methodology and software for Temelin NPP

    International Nuclear Information System (INIS)

    Havluj, F.; Hejzlar, J.; Klouzal, J.; Stary, V.; Vocka, R.

    2011-01-01

    In the frame of the fuel vendor change at Temelin NPP in the Czech Republic, where TVEL TVSA-T fuel has been loaded instead of Westinghouse VVANTAGE-6 fuel since 2010, new methodologies for core design and core reload safety evaluation have been developed. These documents are based on the methodologies delivered by TVEL within the fuel contract, further adapted to Temelin NPP operational needs and to current practice at the plant. Along with the methodology development, the 3D core analysis code ANDREA, licensed for core reload safety evaluation in 2010, has been upgraded in order to optimize the safety evaluation process. New calculation sequences were implemented to simplify the evaluation of different limiting parameters, and output visualization tools were developed to make the verification process user friendly. Interfaces to the fuel performance code TRANSURANUS and the sub-channel analysis code SUBCAL were developed as well. (authors)

  11. SecureCore Software Architecture: Trusted Path Application (TPA) Requirements

    National Research Council Canada - National Science Library

    Clark, Paul C; Irvine, Cynthia E; Levin, Timothy E; Nguyen, Thuy D; Vidas, Timothy M

    2007-01-01

    ... The purpose of the SecureCore research project is to investigate fundamental architectural features required for the trusted operation of mobile computing devices, so that security is built-in, transparent and flexible...

  12. Experience with Intel's Many Integrated Core Architecture in ATLAS Software

    CERN Document Server

    Fleischmann, S; The ATLAS collaboration; Lavrijsen, W; Neumann, M; Vitillo, R

    2014-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks. This should make it possible to develop for both throughput and latency devices using a single code base.

  13. Experience with Intel's Many Integrated Core Architecture in ATLAS Software

    CERN Document Server

    Fleischmann, S; The ATLAS collaboration; Lavrijsen, W; Neumann, M; Vitillo, R

    2013-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks. This should make it possible to develop for both throughput and latency devices using a single code base.

  14. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; core photo and image analysis; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board; the product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In ...

  15. The future of commodity computing and many-core versus the interests of HEP software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    As the mainstream computing world has shifted from multi-core to many-core platforms, the situation for software developers has changed as well. With the numerous hardware and software options available, choices balancing programmability and performance are becoming a significant challenge. The expanding multiplicative dimensions of performance offer a growing number of possibilities that need to be assessed and addressed on several levels of abstraction. This paper reviews the major tradeoffs forced upon the software domain by the changing landscape of parallel technologies – hardware and software alike. Recent developments, paradigms and techniques are considered with respect to their impact on the rather traditional HEP programming models. Other considerations addressed include aspects of efficiency and reasonably achievable targets for the parallelization of large scale HEP workloads.

  16. Core Community Specifications for Electron Microprobe Operating Systems: Software, Quality Control, and Data Management Issues

    Science.gov (United States)

    Fournelle, John; Carpenter, Paul

    2006-01-01

    Modern electron microprobe systems have become increasingly sophisticated. These systems utilize either UNIX or PC computer systems for measurement, automation, and data reduction. These systems have undergone major improvements in processing, storage, display, and communications, due to increased capabilities of hardware and software. Instrument specifications are typically utilized at the time of purchase and concentrate on hardware performance. The microanalysis community includes analysts, researchers, software developers, and manufacturers, who could benefit from exchange of ideas and the ultimate development of core community specifications (CCS) for hardware and software components of microprobe instrumentation and operating systems.

  17. Exploring the impact of socio-technical core-periphery structures in open source software development

    NARCIS (Netherlands)

    Amrit, Chintan; van Hillegersberg, Jos

    2010-01-01

    In this paper we apply the social network concept of core-periphery structure to the socio-technical structure of a software development team. We propose a socio-technical pattern that can be used to locate emerging coordination problems in Open Source projects. With the help of our tool and method ...
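
    A minimal sketch of the core-periphery idea applied to a developer network follows. This is illustrative only: the record does not give the authors' exact pattern or tooling, and the k-core split, warning heuristic and toy data below are assumptions.

```python
import networkx as nx

# Toy developer collaboration network (an edge means two developers touched the
# same file); names and the k-core threshold are invented for illustration.
G = nx.Graph()
G.add_edges_from([
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),   # densely connected core
    ("alice", "dave"), ("bob", "erin"),                       # sparsely connected periphery
])

# A crude core/periphery split: members of the 2-core versus everyone else.
core = set(nx.k_core(G, k=2).nodes())
periphery = set(G.nodes()) - core
print("core:", core, "periphery:", periphery)

# In the spirit of a socio-technical warning sign: peripheral developers whose
# work couples directly to core members may need extra coordination attention.
for dev in sorted(periphery):
    touching = core & set(G.neighbors(dev))
    if touching:
        print(dev, "couples to core members", sorted(touching))
```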

  18. Spent nuclear fuel application of CORE® systems engineering software

    International Nuclear Information System (INIS)

    Grimm, R.J.

    1996-01-01

    The Department of Energy (DOE) has adopted a systems engineering approach for the successful completion of the Spent Nuclear Fuel (SNF) Program mission. The DOE has utilized systems engineering principles to develop the SNF Program guidance documents and has held several systems engineering workshops to develop the functional hierarchies of both the programmatic and technical sides of the SNF Program. The sheer size and complexity of the SNF Program, however, has led to problems that the Westinghouse Savannah River Company (WSRC) is working to manage through the use of systems engineering software. WSRC began using CORE®, an off-the-shelf, PC-based software package, to assist the DOE in management of the SNF Program. This paper details the successful use of the CORE® systems engineering software to date and the proposed future activities.

  19. Spent nuclear fuel application of CORE® systems engineering software

    International Nuclear Information System (INIS)

    Grimm, R.J.

    1996-01-01

    The DOE has adopted a systems engineering approach for the successful completion of the Spent Nuclear Fuel (SNF) Program mission. The DOE has utilized systems engineering principles to develop the SNF program guidance documents and has held several systems engineering workshops to develop the functional hierarchies of both the programmatic and technical sides of the SNF program. The sheer size and complexity of the SNF program has led to problems that the Westinghouse Savannah River Company (WSRC) is working to manage through the use of systems engineering software. WSRC began using CORE®, an off-the-shelf, PC-based software package, to assist DOE in management of the SNF program. This paper details the successful use of the CORE® systems engineering software to date and the proposed future activities.

  20. New value added to network services through software-defined optical core networking

    Science.gov (United States)

    Yamada, Akiko; Nakatsugawa, Keiichi; Yamashita, Shinji; Soumiya, Toshio

    2016-02-01

    If an optical core network can be handled flexibly, it can be used not only as network infrastructure but also as a temporary broadband resource when customers have to transfer a large volume of data quickly, which will in turn lead to new WAN services. We propose "software-defined optical core networking", which achieves flexible optical network control, meaning it virtualizes optical transport network/wavelength-division multiplexing resources and controls them with resources from other layers, such as Ether/MPLS. We developed a testbed system and verified that users could request broadband resources easily, and our controller could quickly set up an optical channel data unit path for the request.

  1. The caCORE Software Development Kit: Streamlining construction of interoperable biomedical information services

    Directory of Open Access Journals (Sweden)

    Warzel Denise

    2006-01-01

    Background: Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs). The National Cancer Institute (NCI) developed the cancer common ontologic representation environment (caCORE) to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK) was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results: The caCORE SDK requires a Unified Modeling Language (UML) tool to begin the development workflow with the construction of a domain information model in the form of a UML class diagram. Models are annotated with concepts and definitions from a description-logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR) using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including participants in the cancer biomedical informatics grid (caBIG) program, to create compatible data services. caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has ...
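
    The kind of generated access layer described in the Results (object-oriented APIs with one consistent retrieval syntax for every class in the UML model) can be illustrated with a small sketch. Note that the real caCORE SDK generates Java middleware; all class and method names below are hypothetical Python stand-ins.

```python
from dataclasses import dataclass

# Hypothetical stand-ins for classes that the SDK would generate from a UML model.
@dataclass
class Gene:
    symbol: str

@dataclass
class Protein:
    name: str
    gene_symbol: str

class ApplicationService:
    """Illustrative query facade: one retrieval syntax for every generated class."""

    def __init__(self, store):
        self._store = store                      # {class: [instances]}, a toy backing store

    def query(self, klass, **criteria):
        return [obj for obj in self._store.get(klass, [])
                if all(getattr(obj, k) == v for k, v in criteria.items())]

svc = ApplicationService({
    Gene: [Gene("TP53"), Gene("BRCA1")],
    Protein: [Protein("Cellular tumor antigen p53", "TP53")],
})
print(svc.query(Gene, symbol="TP53"))            # same call shape for any model class
print(svc.query(Protein, gene_symbol="TP53"))
```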

  2. Geolocating thermal binoculars based on a software defined camera core incorporating HOT MCT grown by MOVPE

    Science.gov (United States)

    Pillans, Luke; Harmer, Jack; Edwards, Tim; Richardson, Lee

    2016-05-01

    Geolocation is the process of calculating a target position based on bearing and range relative to the known location of the observer. A high-performance thermal imager with integrated geolocation functions is a powerful long-range targeting device. Firefly is a software-defined camera core incorporating a system-on-a-chip processor running the Android™ operating system. The processor has a range of industry-standard serial interfaces which were used to interface to peripheral devices including a laser rangefinder and a digital magnetic compass. The core has built-in Global Positioning System (GPS) capability, which provides the third variable required for geolocation. The graphical capability of Firefly allowed flexibility in the design of the man-machine interface (MMI), so the finished system can give access to extensive functionality without appearing cumbersome or over-complicated to the user. This paper covers both the hardware and software design of the system, including how the camera core influenced the selection of peripheral hardware, and the MMI design process, which incorporated user feedback at various stages.
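
    The geolocation step itself reduces to the standard destination-point calculation from observer position (GPS), bearing (digital magnetic compass) and range (laser rangefinder). The sketch below uses the usual spherical-Earth formula and is a generic illustration, not the Firefly implementation:

```python
import math

def geolocate(lat_deg, lon_deg, bearing_deg, range_m, earth_radius_m=6371000.0):
    """Destination point on a spherical Earth from observer position, bearing and range."""
    lat1 = math.radians(lat_deg)
    lon1 = math.radians(lon_deg)
    theta = math.radians(bearing_deg)
    delta = range_m / earth_radius_m            # angular distance travelled

    lat2 = math.asin(math.sin(lat1) * math.cos(delta)
                     + math.cos(lat1) * math.sin(delta) * math.cos(theta))
    lon2 = lon1 + math.atan2(math.sin(theta) * math.sin(delta) * math.cos(lat1),
                             math.cos(delta) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# Observer at 51.5 N, 0.1 W; target at bearing 045 degrees, range 2 km.
print(geolocate(51.5, -0.1, 45.0, 2000.0))
```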

  3. Dynamic optical resource allocation for mobile core networks with software defined elastic optical networking.

    Science.gov (United States)

    Zhao, Yongli; Chen, Zhendong; Zhang, Jie; Wang, Xinbo

    2016-07-25

    Driven by the forthcoming 5G mobile communications, the all-IP architecture of mobile core networks, i.e. the evolved packet core (EPC) proposed by 3GPP, has been greatly challenged by users' demands for higher data rates and more reliable end-to-end connections, as well as operators' demands for low operational cost. These challenges can potentially be met by software defined optical networking (SDON), which enables dynamic resource allocation according to user requirements. In this article, a novel network architecture for the mobile core network is proposed based on SDON. A software defined network (SDN) controller is designed to realize coordinated control over different entities in EPC networks. We analyze the requirements of the EPC-lightpath (EPCL) in the data plane and propose an optical switch load balancing (OSLB) algorithm for resource allocation in the optical layer. The procedure for establishment and adjustment of EPCLs is demonstrated on an SDON-based EPC testbed with an extended OpenFlow protocol. We also evaluate the OSLB algorithm through simulation in terms of bandwidth blocking ratio, traffic load distribution, and resource utilization ratio, compared with link-based load balancing (LLB) and MinHops algorithms.
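
    The record names the OSLB and MinHops algorithms without giving their details, so the selection rules below are assumptions made purely for illustration: MinHops picks the candidate path with the fewest hops, while a load-balancing rule picks the path whose busiest optical switch is least loaded.

```python
def min_hops(paths, switch_load):
    """Baseline: choose the candidate path with the fewest hops."""
    return min(paths, key=len)

def load_balanced(paths, switch_load):
    """Illustrative balancing rule: minimise the busiest switch along the path,
    breaking ties by hop count (the paper's OSLB algorithm may differ)."""
    return min(paths, key=lambda p: (max(switch_load[n] for n in p), len(p)))

# Toy candidate paths between two EPC entities, and current optical switch loads.
candidate_paths = [["A", "B", "D"], ["A", "C", "E", "D"]]
switch_load = {"A": 0.2, "B": 0.9, "C": 0.3, "D": 0.2, "E": 0.25}

print("MinHops picks:       ", min_hops(candidate_paths, switch_load))
print("Load balancing picks:", load_balanced(candidate_paths, switch_load))
```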

  4. Evaluating the scalability of HEP software and multi-core hardware

    CERN Document Server

    Jarp, S; Leduc, J; Nowak, A

    2011-01-01

    As researchers have reached the practical limits of processor performance improvements by frequency scaling, it is clear that the future of computing lies in the effective utilization of parallel and multi-core architectures. Since this significant change in computing is well underway, it is vital for HEP programmers to understand the scalability of their software on modern hardware and the opportunities for potential improvements. This work aims to quantify the benefit of new mainstream architectures to the HEP community through practical benchmarking on recent hardware solutions, including the usage of parallelized HEP applications.

  5. The WhiteStar development project: Westinghouse's next generation core design simulator and core monitoring software to power the nuclear renaissance

    International Nuclear Information System (INIS)

    Boyd, W. A.; Mayhue, L. T.; Penkrot, V. S.; Zhang, B.

    2009-01-01

    The WhiteStar project has undertaken the development of the next-generation core analysis and monitoring system for Westinghouse Electric Company. This on-going project focuses on the development of the ANC core simulator, the BEACON core monitoring system and the NEXUS nuclear data generation system. This system contains many functional upgrades to the ANC core simulator and BEACON core monitoring products, as well as the release of the NEXUS family of codes. The NEXUS family of codes is an automated once-through cross section generation system designed for use in both PWR and BWR applications. ANC is a multi-dimensional nodal code for all nuclear core design calculations at a given condition. ANC predicts core reactivity, assembly power, rod power, detector thimble flux, and other relevant core characteristics. BEACON is an advanced core monitoring and support system which uses existing instrumentation data in conjunction with an analytical methodology for on-line generation and evaluation of 3D core power distributions. This new system is needed to design and monitor the Westinghouse AP1000 PWR. This paper provides an overview of the software system and the software development methodologies used, as well as some initial results. (authors)

  6. Extension of the AMBER molecular dynamics software to Intel's Many Integrated Core (MIC) architecture

    Science.gov (United States)

    Needham, Perri J.; Bhuiyan, Ashraf; Walker, Ross C.

    2016-04-01

    We present an implementation of explicit-solvent particle mesh Ewald (PME) classical molecular dynamics (MD) within the PMEMD molecular dynamics engine, part of the AMBER v14 MD software package, that makes use of Intel Xeon Phi coprocessors by offloading portions of the PME direct summation and neighbor-list build to the coprocessor. We refer to this implementation as pmemd MIC offload, and in this paper we present the technical details of the algorithm, including basic models for MPI and OpenMP configuration, and analyze the resulting performance. The algorithm provides the best performance improvement for large systems (>400,000 atoms), achieving a ∼35% performance improvement for satellite tobacco mosaic virus (1,067,095 atoms) when 2 Intel E5-2697 v2 processors (2 × 12 cores, 30M cache, 2.7 GHz) are coupled to an Intel Xeon Phi coprocessor (Model 7120P, 1.238/1.333 GHz, 61 cores). The implementation utilizes a two-fold decomposition strategy: spatial decomposition using an MPI library and thread-based decomposition using OpenMP. We also present compiler optimization settings that improve the performance on Intel Xeon processors, while retaining simulation accuracy.

  7. Experience with Intel's many integrated core architecture in ATLAS software

    International Nuclear Information System (INIS)

    Fleischmann, S; Neumann, M; Kama, S; Lavrijsen, W; Vitillo, R

    2014-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks (TBB). This should make it possible to develop for both throughput and latency devices using a single code base. In ATLAS Software, track reconstruction has been shown to be a good candidate for throughput computing on GPGPU devices. In addition, the newly proposed offline parallel event-processing framework, GaudiHive, uses TBB for task scheduling. The MIC is thus, in principle, a good fit for this domain. In this paper, we report our experiences of porting to and optimizing ATLAS tracking algorithms for the MIC, comparing the programmability and relative cost/performance of the MIC against those of current GPGPUs and latency-optimized CPUs.

  8. A Reusable and Adaptable Software Architecture for Embedded Space Flight System: The Core Flight Software System (CFS)

    Science.gov (United States)

    Wilmot, Jonathan

    2005-01-01

    The contents include the following: high availability; hardware in a harsh environment; flight processors that vary widely due to power and weight constraints; software that must be remotely modifiable and still operate while changes are being made; many custom, one-of-a-kind interfaces for one-of-a-kind missions; sustaining engineering; and a high price of failure, tens to hundreds of millions of dollars.

  9. Core Logistics Capability Policy Applied to USAF Combat Aircraft Avionics Software: A Systems Engineering Analysis

    Science.gov (United States)

    2010-06-01

    ... cannot make a distinction between software maintenance and development” (Sharma, 2004). ISO/IEC 12207 Software Lifecycle Processes offers a guide to... synopsis of ISO/IEC 12207, Raghu Singh of the Federal Aviation Administration states “Whenever a software product needs modifications, the development... Corporation. Singh, R. (1998). International Standard ISO/IEC 12207 Software Life Cycle Processes. Washington: Federal Aviation Administration. The Joint ...

  10. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores

    Directory of Open Access Journals (Sweden)

    Wang Kai

    2011-05-01

    Background: Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) also have hundreds of cores and have recently been used to implement faster scientific software. However, there are currently no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis of binary traits. Findings: Here we present a novel software package, GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes (1) the interaction of SNPs within it in parallel, and (2) the interaction between the SNPs of the current fragment and those of other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode run. Conclusions: GENIE is open source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
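
    The fragment-based parallelization described in the Findings can be sketched as follows. The toy interaction statistic and data below are invented stand-ins; the real GENIE runs proper statistical interaction tests on GPU or CPU cores.

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import combinations, product
import random

def interaction_stat(snp_a, snp_b, trait):
    """Toy stand-in for a pairwise SNP-SNP interaction test on a binary trait."""
    return sum(a * b * t for a, b, t in zip(snp_a, snp_b, trait)) / len(trait)

def analyse_pairs(args):
    """Worker: evaluate one batch of SNP pairs (within or between fragments)."""
    genotypes, trait, pairs = args
    return [(i, j, interaction_stat(genotypes[i], genotypes[j], trait)) for i, j in pairs]

if __name__ == "__main__":
    n_snps, n_samples, frag_size = 8, 100, 4
    rng = random.Random(0)
    genotypes = [[rng.randint(0, 2) for _ in range(n_samples)] for _ in range(n_snps)]
    trait = [rng.randint(0, 1) for _ in range(n_samples)]

    # Partition SNPs into fragments with non-overlapping index sets.
    fragments = [list(range(s, min(s + frag_size, n_snps)))
                 for s in range(0, n_snps, frag_size)]

    tasks = []
    for k, frag in enumerate(fragments):
        tasks.append((genotypes, trait, list(combinations(frag, 2))))     # within a fragment
        for other in fragments[k + 1:]:
            tasks.append((genotypes, trait, list(product(frag, other))))  # between fragments

    with ProcessPoolExecutor() as pool:                                   # one batch per worker
        results = [pair for chunk in pool.map(analyse_pairs, tasks) for pair in chunk]
    print(len(results), "SNP pairs tested")                               # 28 for this toy setup
```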

  11. The development of test software for the inadequate core cooling monitoring system

    International Nuclear Information System (INIS)

    Lee, Soon Sung.

    1996-06-01

    Test software, including the ICCMS simulator, which is necessary for dynamic testing of the ICCMS software in a PWR, has been developed. The developed dynamic test software consists of a module test simulator, an integration test simulator, and a test result analyser. The simulator was programmed in C according to the same algorithm requirements as the FORTRAN version of the ICCMS software, and also supports the Factory Acceptance Test (FAT). The simulator can also be used as a training tool for reactor operators and as a system development tool for performance improvement. (author). 4 tabs., 8 figs., 11 refs.

  12. An integrated software system for core design and safety analyses: Cascade-3D

    International Nuclear Information System (INIS)

    Wan De Velde, A.; Finnemann, H.; Hahn, T.; Merk, S.

    1999-01-01

    The new Siemens program system CASCADE-3D (Core Analysis and Safety Codes for Advanced Design Evaluation) links some of the most advanced code packages for in-core fuel management and accident analysis: SAV95, PANBOX/COBRA and RELAP5. Consequently, by using CASCADE-3D the potential of modern fuel assemblies and in-core fuel management strategies can be much better utilized, because safety margins which had been reduced due to conservative methods are now predicted more accurately. With this innovative code system, customers can now take full advantage of the recent progress in fuel assembly design and in-core fuel management. (authors)

  13. Coupling of the 3D neutron kinetic core model DYN3D with the CFD software ANSYS-CFX

    International Nuclear Information System (INIS)

    Grahn, Alexander; Kliem, Sören; Rohde, Ulrich

    2015-01-01

    Highlights: • Improved thermal hydraulic description of nuclear reactor cores. • Possibility to capture three-dimensional flow phenomena in the core, such as cross flow, flow reversal and flow around obstacles. • Simulation at higher spatial resolution than system codes. Abstract: This article presents the implementation of a coupling between the 3D neutron kinetic core model DYN3D and the commercial, general-purpose computational fluid dynamics (CFD) software ANSYS-CFX. In the coupling approach, parts of the thermal hydraulic calculation are transferred to CFX for its better ability to simulate the three-dimensional coolant redistribution in the reactor core region. The calculation of the heat transfer from the fuel into the coolant remains with DYN3D, which incorporates well-tested and validated heat transfer models for rod-type fuel elements. On the CFX side, the core region is modeled based on the porous body approach. The implementation of the code coupling is verified by comparing test case results with reference solutions of the DYN3D standalone version. Test cases cover mini and full core geometries, control rod movement and partial overcooling transients.
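
    The coupling described above (the kinetics code passes the volumetric heat release to the flow solver, which returns the coolant state as feedback, with one exchange per step) can be illustrated with a toy fixed-point loop. Both "solvers" below are single-line stand-ins with invented coefficients, not DYN3D or CFX physics.

```python
def neutron_kinetics_step(coolant_temp, power):
    """Toy stand-in for the neutron kinetics side: power responds to coolant-temperature
    feedback through an invented negative reactivity coefficient."""
    alpha = -0.002                              # 1/K, invented feedback coefficient
    return power * (1.0 + alpha * (coolant_temp - 300.0))

def thermal_hydraulics_step(heat_release, inlet_temp=290.0, m_cp=50.0):
    """Toy stand-in for the flow-solver side: coolant heats up in proportion to the
    volumetric heat release passed across the interface."""
    return inlet_temp + heat_release / m_cp

power, coolant_temp = 1000.0, 300.0             # arbitrary units and kelvin
for step in range(10):                          # explicit coupling: one data exchange per step
    power = neutron_kinetics_step(coolant_temp, power)       # kinetics side
    coolant_temp = thermal_hydraulics_step(power)            # thermal hydraulics side
    print(f"step {step}: power = {power:8.2f}, T_coolant = {coolant_temp:7.2f} K")
```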

  14. Incorporating Computer-Aided Software in the Undergraduate Chemical Engineering Core Courses

    Science.gov (United States)

    Alnaizy, Raafat; Abdel-Jabbar, Nabil; Ibrahim, Taleb H.; Husseini, Ghaleb A.

    2014-01-01

    Computer-aided software and simulators are introduced during the sophomore year of the chemical engineering (ChE) curriculum at the American University of Sharjah (AUS). Our faculty concurs that software integration within the curriculum is beneficial to our students, as evidenced by the positive feedback received from industry…

  15. Rapid Development of Guidance, Navigation, and Control Core Flight System Software Applications Using Simulink Models

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposal is to demonstrate a new Guidance, Navigation, and Control (GNC) Flight Software (FSW) application development paradigm which takes...

  16. SecureCore Software Architecture: Trusted Management Layer (TML) Kernel Extension Module Integration Guide

    National Research Council Canada - National Science Library

    Shifflett, David J; Clark, Paul C; Irvine, Cynthia E; Nguyen, Thuy D; Vidas, Timothy M; Levin, Timothy E

    2007-01-01

    .... The purpose of the SecureCore research project is to investigate fundamental architectural features required for the trusted operation of mobile computing devices such as smart cards, embedded...

  17. SecureCore Software Architecture: Trusted Management Layer (TML) Kernel Extension Module Interface Specification

    National Research Council Canada - National Science Library

    Shifflett, David J; Clark, Paul C; Irvine, Cynthia E; Nguyen, Thuy D; Vidas, Timothy M; Levin, Timothy E

    2008-01-01

    .... The purpose of the SecureCore research project is to investigate fundamental architectural features required for the trusted operation of mobile computing devices such as smart cards, embedded...

  18. Toroidal HTS transformer with cold magnetic core - analysis with FEM software

    International Nuclear Information System (INIS)

    Grzesik, B; Stepien, M; Jez, R

    2010-01-01

    The aim of this paper is to present a thorough characterization of the toroidal HTS transformer by means of FEM analysis. The analysis was a 2D/3D harmonic electromagnetic and thermal analysis. The toroidal transformer operated in LN2, immersed together with its magnetic core, for which the power losses were acceptable. Two extreme winding variants were analysed: the first, called parallel, and the second, called perpendicular. Three variants of the magnetic core were considered. In the first the core was outside the windings, in the second the core was inside the windings, and in the third the core was both outside and inside the windings. The windings were made of HTS tape BiSCCO-2223/Ag, while the magnetic core was made of the nanocrystalline material Finemet. The two windings, with a 1:1 turn-to-turn ratio, were uniformly distributed along the whole torus circumference. The output power, efficiency and power density are among the results of the analysis. The temperature distribution was also calculated. In summary, the performance of the transformer is better than that of transformers currently known.

  19. On the Design of Energy Efficient Optical Networks with Software Defined Networking Control Across Core and Access Networks

    DEFF Research Database (Denmark)

    Wang, Jiayuan; Yan, Ying; Dittmann, Lars

    2013-01-01

    This paper presents a Software Defined Networking (SDN) control plane based on an overlay GMPLS control model. The SDN control platform manages optical core networks (WDM/DWDM networks) and the associated access networks (GPON networks), which makes it possible to gather global information and enable energy-efficient networking over wider areas. The energy-related information of the networks and the types of the traffic flows are collected and utilized for end-to-end QoS provision. Dynamic network simulation results show that by applying different routing algorithms according to the type of traffic in the core networks, the energy efficiency of the network is improved without compromising the quality of service.

  20. Portable memory consistency for software managed distributed memory in many-core SoC

    NARCIS (Netherlands)

    Rutgers, J.H.; Bekooij, Marco Jan Gerrit; Smit, Gerardus Johannes Maria

    2013-01-01

    Porting software to different platforms can require modifications of the application. One of the issues is that the targeted hardware supports another memory consistency model. As a consequence, the completion order of reads and writes in a multi-threaded application can change, which may result in

  1. GRAPES: a software for parallel searching on biological graphs targeting multi-core architectures.

    Directory of Open Access Journals (Sweden)

    Rosalba Giugno

    Biological applications, from genomics to ecology, deal with graphs that represent the structure of interactions. Analyzing such data requires searching for subgraphs in collections of graphs. This task is computationally expensive. Even though multicore architectures, from commodity computers to more advanced symmetric multiprocessing (SMP) systems, offer scalable computing power, currently published software implementations for indexing and graph matching are fundamentally sequential. As a consequence, such software implementations (i) do not fully exploit available parallel computing power and (ii) do not scale with respect to the size of graphs in the database. We present GRAPES, software for parallel searching on databases of large biological graphs. GRAPES implements a parallel version of well-established graph searching algorithms, and introduces new strategies which naturally lead to a faster parallel searching system, especially for large graphs. GRAPES decomposes graphs into subcomponents that can be efficiently searched in parallel. We show the performance of GRAPES on representative biological datasets containing antiviral chemical compounds, DNA, RNA, proteins, protein contact maps and protein interaction networks.
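
    The decompose-then-search-in-parallel strategy can be sketched as follows. This is illustrative only: the decomposition here is a plain connected-component split, much cruder than the one GRAPES actually uses, and the matcher is networkx's generic subgraph isomorphism test rather than GRAPES's indexed algorithms.

```python
from concurrent.futures import ProcessPoolExecutor
import networkx as nx
from networkx.algorithms import isomorphism

def search_component(args):
    """Search one decomposed subcomponent for the query pattern."""
    component_edges, query_edges = args
    g, q = nx.Graph(component_edges), nx.Graph(query_edges)
    return isomorphism.GraphMatcher(g, q).subgraph_is_isomorphic()

if __name__ == "__main__":
    # Toy database graph and a triangle query.
    big = nx.Graph([(1, 2), (2, 3), (3, 1), (3, 4), (10, 11), (11, 12)])
    query = [(0, 1), (1, 2), (2, 0)]                        # triangle pattern

    components = [list(big.subgraph(c).edges()) for c in nx.connected_components(big)]
    with ProcessPoolExecutor() as pool:                     # search subcomponents in parallel
        hits = list(pool.map(search_component, [(c, query) for c in components]))
    print("components containing the pattern:", hits)       # expected: [True, False]
```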

  2. BROCCOLI: Software for Fast fMRI Analysis on Many-Core CPUs and GPUs

    Directory of Open Access Journals (Sweden)

    Anders Eklund

    2014-03-01

    Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm³ brain template in 4-6 seconds, and run a second-level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from GitHub (https://github.com/wanderine/BROCCOLI/).
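
    The second-level permutation test mentioned above is a standard non-parametric procedure; the sketch below is a serial NumPy rendering of a one-sample sign-flipping test with a max-statistic null distribution, not BROCCOLI's OpenCL implementation, and the data are made up.

```python
import numpy as np

def permutation_test_maxstat(contrasts, n_perm=10000, seed=0):
    """One-sample second-level test via sign flipping.

    contrasts : (subjects, voxels) array of first-level contrast values.
    Returns the observed group mean per voxel and the permutation null
    distribution of the maximum statistic (used for FWE-corrected thresholds).
    """
    rng = np.random.default_rng(seed)
    n_sub = contrasts.shape[0]
    observed = contrasts.mean(axis=0)
    null_max = np.empty(n_perm)
    for p in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=(n_sub, 1))   # random sign flip per subject
        null_max[p] = (signs * contrasts).mean(axis=0).max()
    return observed, null_max

# Toy data: 16 subjects, 500 voxels, a small true effect in the first 20 voxels.
data = np.random.default_rng(1).normal(size=(16, 500))
data[:, :20] += 1.0
obs, null_max = permutation_test_maxstat(data, n_perm=2000)
threshold = np.quantile(null_max, 0.95)                    # FWE-corrected 5% threshold
print("voxels above corrected threshold:", int((obs > threshold).sum()))
```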

  3. Using SAFRAN Software to Assess Radiological Hazards from Dismantling of Tammuz-2 Reactor Core at Al-tuwaitha Nuclear Site

    Science.gov (United States)

    Abed Gatea, Mezher; Ahmed, Anwar A.; Jundee Kadhum, Saad; Ali, Hasan Mohammed; Hussein Muheisn, Abbas

    2018-05-01

    The Safety Assessment Framework (SAFRAN) software has been applied here for radiological safety analysis, to verify that the dose acceptance criteria and safety goals are met with a high degree of confidence for the dismantling of the Tammuz-2 reactor core at the Al-tuwaitha nuclear site. The activities of characterizing, dismantling and packaging were carried out to manage the generated radioactive waste. Dose to the worker was considered an endpoint scenario, while dose to the public was neglected because the Tammuz-2 facility is located in a restricted zone and a 30 m berm surrounds the Al-tuwaitha site. The safety assessment for the dismantling-worker endpoint scenario is based on the maximum external dose at the component position level in the reactor pool and the internal dose via airborne activity, while the characterizing- and packaging-worker endpoint scenarios were assessed via external dose only, because there is no evidence of airborne radioactivity hazards outside the reactor pool. In-situ measurements confirmed that the reactor core components are radiologically activated by the Co-60 radioisotope. The SAFRAN results showed that the maximum received doses for workers are 1.85, 0.64 and 1.3 mSv/y for the dismantling, characterizing and packaging of reactor core components, respectively. Hence, the radiological hazards remain below the low-hazard level and within the acceptable annual dose for workers in a radiation field.

  4. Harmonic Domain Modelling of Transformer Core Nonlinearities Using the DIgSILENT PowerFactory Software

    DEFF Research Database (Denmark)

    Bak, Claus Leth; Bak-Jensen, Birgitte; Wiechowski, Wojciech

    2008-01-01

    This paper demonstrates the results of the implementation and verification of an existing algorithm that allows for calculating the saturation characteristics of single-phase power transformers. The algorithm was first described in 1993. It has now been implemented using the DIgSILENT Programming Language (DPL) as an external script in the harmonic domain calculations of the power system analysis tool PowerFactory [10]. The algorithm is verified by harmonic measurements on a single-phase power transformer. A theoretical analysis of the core nonlinearity phenomena in single- and three-phase transformers is also presented. This analysis leads to the conclusion that the method can be applied to modelling the nonlinearities of three-phase autotransformers.

  5. Development and Evaluation of Vectorised and Multi-Core Event Reconstruction Algorithms within the CMS Software Framework

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The processing of data acquired by the CMS detector at LHC is carried out with an object-oriented C++ software framework: CMSSW. With the increasing luminosity delivered by the LHC, the treatment of recorded data requires extraordinarily large computing resources, also in terms of CPU usage. A possible solution to cope with this task is the exploitation of the features offered by the latest microprocessor architectures. Modern CPUs present several vector units, the capacity of which is growing steadily with the introduction of new processor generations. Moreover, an increasing number of cores per die is offered by the main vendors, even on consumer hardware. Most recent C++ compilers provide facilities to take advantage of such innovations, either by explicit statements in the programs’ sources or by automatically adapting the generated machine instructions to the available hardware, without the need of modifying the existing code base. Programming techniques to implement reconstruction algorithms and optimised ...

  6. AthenaMT: upgrading the ATLAS software framework for the many-core world with multi-threading

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00100895; The ATLAS collaboration; Baines, John; Bold, Tomasz; Calafiura, Paolo; Farrell, Steven; Malon, David; Ritsch, Elmar; Stewart, Graeme; Snyder, Scott; Tsulaia, Vakhtang; Wynne, Benjamin; van Gemmeren, Peter

    2017-01-01

    ATLAS’s current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single threaded design has been recognized for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run2. After concluding a rigorous requirements phase, where many design components were examined in detail, ATLAS has begun the migration to a new data-flow driven, multi-threaded framework, which enables the simultaneous processing of singleton, thread unsafe legacy Algorithms, cloned Algorithms that execute concurrently in their own threads with different Event contexts, and fully re-entrant, thread safe Algorithms. In this paper we report on the process of modifying the framework to safely process multiple concurrent events in different threads, which entails significant changes in the underlying ha...
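
    The scheduling distinction described here (cloned, thread-unsafe legacy algorithms versus shared, re-entrant thread-safe ones, with several event contexts in flight at once) can be illustrated with a small sketch. The real framework is C++ on top of a task scheduler, so the Python below only mimics the idea and all names are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor
import threading

class ClonedAlgorithm:
    """Thread-unsafe legacy style: the framework would clone such an algorithm per
    thread; here per-thread scratch state stands in for the clones."""
    _local = threading.local()

    def execute(self, event_context):
        if not hasattr(self._local, "scratch"):
            self._local.scratch = []            # mutable state, isolated per thread
        self._local.scratch.append(event_context)
        return f"cloned-style alg processed event {event_context}"

class ReentrantAlgorithm:
    """Thread-safe, re-entrant style: no mutable state, one instance shared by all threads."""
    def execute(self, event_context):
        return f"re-entrant alg processed event {event_context}"

legacy, shared = ClonedAlgorithm(), ReentrantAlgorithm()

def process_event(event_context):
    # A real scheduler resolves data dependencies between algorithms; here the two
    # algorithms simply run in sequence for each event context.
    legacy.execute(event_context)
    return shared.execute(event_context)

with ThreadPoolExecutor(max_workers=4) as pool:  # several events in flight concurrently
    for result in pool.map(process_event, range(8)):
        print(result)
```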

  7. AthenaMT: Upgrading the ATLAS Software Framework for the Many-Core World with Multi-Threading

    CERN Document Server

    Leggett, Charles; The ATLAS collaboration; Bold, Tomasz; Calafiura, Paolo; Farrell, Steven; Malon, David; Ritsch, Elmar; Stewart, Graeme; Snyder, Scott; Tsulaia, Vakhtang; Wynne, Benjamin; van Gemmeren, Peter

    2016-01-01

    ATLAS's current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single threaded design has been recognised for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run2. After concluding a rigorous requirements phase, where many design components were examined in detail, ATLAS has begun the migration to a new data-flow driven, multi-threaded framework, which enables the simultaneous processing of singleton, thread unsafe legacy Algorithms, cloned Algorithms that execute concurrently in their own threads with different Event contexts, and fully re-entrant, thread safe Algorithms. In this paper we will report on the process of modifying the framework to safely process multiple concurrent events in different threads, which entails significant changes in the underlying...

  8. Validation of a new software version for monitoring of the core of the Unit 2 of the Laguna Verde power plant with ARTS

    International Nuclear Information System (INIS)

    Calleros, G.; Riestra, M.; Ibanez, C.; Lopez, X.; Vargas, A.; Mendez, A.; Gomez, R.

    2005-01-01

    In this work a methodology is proposed to validate a new version of the software used for monitoring the reactor core, which requires the evaluation of the thermal limits established in the Operational Technical Specifications, for Unit 2 of Laguna Verde with ARTS (improvements to the APRMs, Rod Block Monitor and Technical Specifications). In accordance with the proposed methodology, the differences found between the thermal limits determined with the new and the previous versions of the core monitoring software are shown. (Author)

  9. Development and Evaluation of Vectorised and Multi-Core Event Reconstruction Algorithms within the CMS Software Framework

    Science.gov (United States)

    Hauth, T.; Innocente, V.; Piparo, D.

    2012-12-01

    The processing of data acquired by the CMS detector at LHC is carried out with an object-oriented C++ software framework: CMSSW. With the increasing luminosity delivered by the LHC, the treatment of recorded data requires extraordinarily large computing resources, also in terms of CPU usage. A possible solution to cope with this task is the exploitation of the features offered by the latest microprocessor architectures. Modern CPUs present several vector units, the capacity of which is growing steadily with the introduction of new processor generations. Moreover, an increasing number of cores per die is offered by the main vendors, even on consumer hardware. Most recent C++ compilers provide facilities to take advantage of such innovations, either by explicit statements in the programs' sources or by automatically adapting the generated machine instructions to the available hardware, without the need of modifying the existing code base. Programming techniques to implement reconstruction algorithms and optimised data structures are presented, which aim at scalable vectorization and parallelization of the calculations. One of their features is the usage of new language features of the C++11 standard. Portions of the CMSSW framework are illustrated which have been found to be especially profitable for the application of vectorization and multi-threading techniques. Specific utility components have been developed to help vectorization and parallelization. They can easily become part of a larger common library. To conclude, careful measurements are described, which show the execution speedups achieved via vectorised and multi-threaded code in the context of CMSSW.
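
    The vectorization principle discussed above (expressing the same arithmetic over whole arrays so that the hardware's vector units can be exercised) can be illustrated outside C++ as well; the NumPy comparison below uses made-up track data and is only an analogy for what the compiler-level techniques achieve in CMSSW.

```python
import math
import time
import numpy as np

n = 1_000_000
rng = np.random.default_rng(0)
px, py = rng.normal(size=n), rng.normal(size=n)      # made-up track momentum components

# Scalar loop: one object at a time (the naive per-track style).
t0 = time.perf_counter()
pt_loop = [math.sqrt(x * x + y * y) for x, y in zip(px, py)]
t1 = time.perf_counter()

# Vectorised: the same arithmetic over whole arrays, letting the underlying
# library run tight SIMD-friendly loops.
pt_vec = np.sqrt(px * px + py * py)
t2 = time.perf_counter()

print(f"scalar loop: {t1 - t0:.3f} s   vectorised: {t2 - t1:.3f} s")
print("results agree:", np.allclose(pt_loop, pt_vec))
```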

  10. Co Modeling and Co Synthesis of Safety Critical Multi threaded Embedded Software for Multi Core Embedded Platforms

    Science.gov (United States)

    2017-03-20

    Kaiserslautern, Germany. Sandeep Shukla, FERMAT Lab, Electrical and Computer Engineering Department, Virginia Tech, 900 North Glebe Road... Software Engineering, Software Producibility, Component-based software design, behavioral types, behavioral type inference, Polychronous model of... near future, many embedded applications, including safety-critical ones as used in avionics, automotive and mission control systems, will run on ...

  11. Hardware/Software Co-design for Heterogeneous Multi-core Platforms: The hArtes Toolchain

    CERN Document Server

    2012-01-01

    This book describes the results and outcome of the FP6 project known as hArtes, which focuses on the development of an integrated tool chain targeting a heterogeneous multi-core platform comprising a general-purpose processor (ARM or PowerPC), a DSP (the Diopsis) and an FPGA. The tool chain takes existing source code and proposes transformations and mappings such that legacy code can easily be ported to a modern, multi-core platform. Benefits of the hArtes approach, described in this book, include: Uses a familiar programming paradigm: hArtes proposes a familiar programming paradigm which is compatible with widely used programming practice, irrespective of the target platform. Enables users to view multiple cores as a single processor: the hArtes approach abstracts away the heterogeneity as well as the multi-core aspect of the underlying hardware, so the developer can view the platform as consisting of a single, general-purpose processor. Facilitates easy porting of existing applications: hArtes provid...

  12. A Formal Approach to the Provably Correct Synthesis of Mission Critical Embedded Software for Multi Core Embedded Platforms

    Science.gov (United States)

    2014-04-01

    ... synchronization primitives based on preset templates can result in over-synchronization if unchecked, possibly creating deadlock situations. Further... inputs rather than enforcing synchronization with a global clock. MRICDF models software as a network of communicating actors. Four primitive actors... control wants to send an interrupt or not. Since this is a shared buffer, a semaphore mechanism is assumed to synchronize reads and writes of this buffer. The ...

  13. Core component integration tests for the back-end software sub-system in the ATLAS data acquisition and event filter prototype -1 project

    International Nuclear Information System (INIS)

    Badescu, E.; Caprini, M.; Niculescu, M.; Radu, A.

    2000-01-01

    The ATLAS data acquisition (DAQ) and Event Filter (EF) prototype -1 project was intended to produce a prototype system for evaluating candidate technologies and architectures for the final ATLAS DAQ system at the LHC accelerator at CERN. Within the prototype project, the back-end sub-system encompasses the software for configuring, controlling and monitoring the DAQ. The back-end sub-system includes core components and detector integration components. The core components provide the basic functionality and had priority in terms of development time-scale, in order to have a baseline sub-system that can be used for integration with the data-flow sub-system and the event filter. The following components are considered to be the core of the back-end sub-system: configuration databases, which describe a large number of parameters of the DAQ system architecture, hardware and software components, running modes and status; the message reporting system (MRS), which allows all software components to report messages to other components in the distributed environment; the information service (IS), which allows information exchange between software components; the process manager (PMG), which performs basic job control of software components (start, stop, monitoring of status); and run control (RC), which controls the data-taking activities by coordinating the operations of the DAQ sub-systems, back-end software and external systems. Performance and scalability tests have been made for individual components. The back-end sub-system integration tests bring together all the core components and several trigger/DAQ/detector integration components to simulate the control and configuration of data-taking sessions. A test plan was provided for the back-end integration tests. The tests have been done using a shell script that goes through different phases as follows: starting the back-end server processes to initialize communication services and the PMG; launching configuration-specific processes via the DAQ supervisor as ...

  14. Prototype coupling of the CFD software ansys CFX with the 3D neutron kinetic core model DYN3D - 249

    International Nuclear Information System (INIS)

    Kliem, S.; Rohde, U.; Schutze, J.; Frank, Th.

    2010-01-01

    The CFD code ANSYS CFX has been coupled with the neutron-kinetic core model DYN3D. ANSYS CFX calculates the fluid dynamics and related transport phenomena in the reactor's coolant and provides the corresponding data to DYN3D. In the fluid flow simulation of the coolant, the core itself is modeled within the porous body approach. DYN3D calculates the neutron kinetics and the fuel behavior including the heat transfer to the coolant. The physical data interface between the codes is the volumetric heat release rate into the coolant. In the prototype that is currently available, the coupling is restricted to single-phase flow problems. In the time domain, an explicit coupling of the codes has been implemented so far. Steady-state and transient verification calculations for a small-size test problem confirm the correctness of the implementation of the prototype coupling. This test problem was a mini-core consisting of nine real-size fuel assemblies. Comparison was performed with the DYN3D stand-alone code. In the steady state, the effective multiplication factor obtained by the ANSYS CFX/DYN3D codes shows a deviation of 9.8 pcm from the DYN3D stand-alone solution. This difference can be attributed to the use of different water property packages in the two codes. The transient test case simulated the withdrawal of the control rod from the central fuel assembly at hot zero power. Power increase during the introduction of positive reactivity and power reduction due to fuel temperature increase are calculated in the same manner by the coupled and the stand-alone codes. The maximum values reached during the power rise differ by about 1 MW at a power level of 50 MW. Besides the different water property packages, these differences are caused by the use of different flow solvers. (authors)
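
    A minimal sketch of what an explicit (operator-splitting) coupling of a thermal-hydraulics solver and a neutron-kinetics solver can look like is given below; the two solver stubs, the feedback coefficients and the exchanged quantities (volumetric heat release, coolant temperature) are stand-ins for illustration only, not the ANSYS CFX or DYN3D interfaces.

```python
# Illustrative explicit coupling loop; solver stubs replace CFX/DYN3D.
import numpy as np

N_CELLS, DT, N_STEPS = 9, 0.01, 100   # hypothetical mini-core discretisation

def neutron_kinetics_step(coolant_temp, dt):
    """Stub: return volumetric heat release [W/m3] for the next step."""
    # Simple negative feedback: power drops as coolant temperature rises.
    return 5.0e7 * (1.0 - 1.0e-4 * (coolant_temp - 560.0))

def thermal_hydraulics_step(heat_release, coolant_temp, dt):
    """Stub: advance coolant temperature given the heat release."""
    rho_cp = 4.0e6                       # J/(m3 K), hypothetical
    return coolant_temp + dt * heat_release / rho_cp

coolant_temp = np.full(N_CELLS, 560.0)   # K
for step in range(N_STEPS):
    q = neutron_kinetics_step(coolant_temp, DT)                   # kinetics -> CFD
    coolant_temp = thermal_hydraulics_step(q, coolant_temp, DT)   # CFD -> kinetics
print("final coolant temperatures [K]:", np.round(coolant_temp, 2))
```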

  15. Cronos 2: a neutronic simulation software for reactor core calculations; Cronos 2: un logiciel de simulation neutronique des coeurs de reacteurs

    Energy Technology Data Exchange (ETDEWEB)

    Lautard, J J; Magnaud, C; Moreau, F; Baudron, A M [CEA Saclay, Dept. de Mecanique et de Technologie (DMT/SERMA), 91 - Gif-sur-Yvette (France)

    1999-07-01

    The CRONOS2 software is the part of the SAPHYR code system dedicated to neutronic core calculations. CRONOS2 is a powerful tool for reactor design, fuel management and safety studies. Its modular structure and great flexibility make CRONOS2 a unique simulation tool for research and development for a wide variety of reactor systems. CRONOS2 is a versatile tool that covers a large range of applications, from very fast calculations used in training simulators to time- and memory-consuming reference calculations needed to understand complex physical phenomena. CRONOS2 has a procedure library named CPROC that allows users to create their own application environment fitted to a specific industrial use. (authors)

  16. The coupling of the Star-Cd software to a whole-core neutron transport code Decart for PWR applications

    International Nuclear Information System (INIS)

    Thomas, J.W.; Lee, H.C.; Downar, T.J.; Sofu, T.; Weber, D.P.; Joo, H.G.; Cho, J.Y.

    2003-01-01

    As part of a U.S.-Korea collaborative U.S. Department of Energy INERI project, a comprehensive high-fidelity reactor-core modeling capability is being developed for detailed analysis of existing and advanced PWR reactor designs. An essential element of the project has been the development of an interface between the computational fluid dynamics (CFD) module, STAR-CD, and the neutronics module, DeCART. Since the computational meshes for CFD and neutronics calculations are generally different, the capability to average and decompose data on these different meshes has been an important part of the code coupling activities. An averaging process has been developed to extract neutronics zone temperatures in the fuel and coolant and to generate appropriate multi-group cross sections and densities. Similar procedures have also been established to map the power distribution from the neutronics zones to the mesh structure used in the CFD module. Since MPI is used as the parallel model in STAR-CD and conflicts arise during initiation of a second level of MPI, the interface developed here is based on using TCP/IP protocol sockets to establish communication between the CFD and neutronics modules. Preliminary coupled calculations have been performed for PWR fuel assembly size problems and converged solutions have been achieved for a series of steady-state problems ranging from a single pin to a 1/8 model of a 17 x 17 PWR fuel assembly. (authors)
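
    The averaging step described above (collapsing many CFD cells onto coarser neutronics zones) can be illustrated with a volume-weighted average; the arrays, the zone assignment and the `zone_average` helper below are hypothetical and do not reflect the actual STAR-CD/DeCART data structures.

```python
# Illustrative volume-weighted averaging of CFD cell data onto neutronics zones.
import numpy as np

# Hypothetical CFD cell data: temperature [K], cell volume [m3], zone index.
cell_temp = np.array([565.0, 570.0, 580.0, 590.0, 600.0, 610.0])
cell_vol  = np.array([1.0e-5, 2.0e-5, 1.5e-5, 1.0e-5, 2.5e-5, 1.0e-5])
cell_zone = np.array([0, 0, 0, 1, 1, 1])   # which neutronics zone each cell feeds

def zone_average(values, volumes, zones, n_zones):
    """Volume-weighted average of a cell field per neutronics zone."""
    weighted = np.bincount(zones, weights=values * volumes, minlength=n_zones)
    total    = np.bincount(zones, weights=volumes, minlength=n_zones)
    return weighted / total

print(zone_average(cell_temp, cell_vol, cell_zone, n_zones=2))
```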

  17. Validation of a new software version for monitoring of the core of the Unit 2 of the Laguna Verde power plant with ARTS; Validacion de una nueva version del software para monitoreo del nucleo de la Unidad 2 de la Central Laguna Verde con ARTS

    Energy Technology Data Exchange (ETDEWEB)

    Calleros, G.; Riestra, M.; Ibanez, C.; Lopez, X.; Vargas, A.; Mendez, A.; Gomez, R. [CFE, Central Nucleoelectrica de Laguna Verde, Alto Lucero, Veracruz (Mexico)]. e-mail: gcm9acpp@cfe.gob.mx

    2005-07-01

    This work proposes a methodology to validate a new version of the software used for monitoring the reactor core, which requires the evaluation of the thermal limits established in the Operation Technical Specifications, for Unit 2 of Laguna Verde with ARTS (improvements to the APRMs, Rod Block Monitor and Technical Specifications). Following the proposed methodology, the differences found in the thermal limits determined with the new and previous versions of the core monitoring software are shown. (Author)
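
    A validation of this kind essentially compares the thermal limits reported by the two software versions at the same core conditions. The sketch below shows one way to tabulate the differences; the limit names are typical BWR thermal limits used here only as placeholders, and the values and tolerance are invented.

```python
# Hypothetical comparison of thermal limits reported by two monitoring versions.
old_version = {"MCPR": 1.32, "MFLPD": 0.86, "MAPRAT": 0.81}
new_version = {"MCPR": 1.31, "MFLPD": 0.87, "MAPRAT": 0.81}
tolerance = 0.02   # assumed acceptance criterion for the comparison

for limit in old_version:
    diff = new_version[limit] - old_version[limit]
    status = "OK" if abs(diff) <= tolerance else "REVIEW"
    print(f"{limit:7s} old={old_version[limit]:.2f} "
          f"new={new_version[limit]:.2f} diff={diff:+.3f} {status}")
```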

  18. Software concepts for the build-up of complex systems - selection and realization taking as example a program system for calculation of hypothetical core meltdown accidents

    International Nuclear Information System (INIS)

    Scheuermann, W.

    1994-10-01

    The development and application of simulation systems for the analysis of complex processes require, on the one hand, detailed engineering knowledge of the plant and the processes to be simulated and, on the other hand, detailed knowledge of software engineering, numerics and data structures. The cooperation of specialists from both areas becomes easier if it is possible to reduce the complexity of the problems to be solved in such a way that the analyses are not disturbed and the communication between the different disciplines does not become unnecessarily complicated. One solution to reduce the complexity is to consider computer science as an engineering discipline which provides mainly abstract elements and to allow engineers to build application systems based on these abstract elements. The principle of abstraction leads, through the processes of modularisation and the solution of the interface problem, to an almost problem-independent system architecture where the elements of the system (modules, model components and models) operate only on those data assigned to them. In addition, the development of abstract data types allows the formal description of the relations and interactions between system elements. This work describes how these ideas can be concretized to build complex systems which allow reliable and effective problem solutions. These ideas were applied successfully during the design, realization and application of the code system KESS, which allows the analysis of core meltdown accidents in pressurized water reactors. (orig.) [de
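
    The principle that system elements operate only on the data assigned to them can be sketched with a tiny registry in Python; this is a generic illustration of the idea, not the KESS architecture, and all class and variable names are invented.

```python
# Generic sketch: modules see only the data slice assigned to them.
class Module:
    def __init__(self, name, owned_keys):
        self.name = name
        self.owned_keys = set(owned_keys)

    def step(self, own_data):
        # Placeholder computation acting on the module's own data only.
        return {k: v + 1.0 for k, v in own_data.items()}

class System:
    def __init__(self, modules, data):
        self.modules, self.data = modules, data

    def step(self):
        for m in self.modules:
            own = {k: self.data[k] for k in m.owned_keys}   # restricted view
            self.data.update(m.step(own))                   # write back own keys only

system = System(
    modules=[Module("fuel_rod", ["fuel_temp"]), Module("coolant", ["coolant_temp"])],
    data={"fuel_temp": 900.0, "coolant_temp": 560.0},
)
system.step()
print(system.data)
```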

  19. Top-Level Software for VVER-1000 In-core Monitoring System under Implementation of Expanded Nuclear Fuel Diversification Program in Ukraine

    International Nuclear Information System (INIS)

    Khalimonchuk, V.A.

    2015-01-01

    The paper considers the possibility and expediency of developing mathematical software for VVER-1000 ICMS in Ukraine. This mathematical software is among the most important conditions for implementation of the expanded nuclear fuel diversification program. The top-level software is to be developed based on SSTC's own studies in the development of codes for power distribution recovery, which were previously used successfully for RBMK-1000 safety analysis

  20. Validation of a new version of software for monitoring the core of nuclear power plant of Laguna Verde Unit 2, at the end of Cycle 10

    International Nuclear Information System (INIS)

    Hernandez, G.; Calleros, G.; Mata, F.

    2009-10-01

    This work shows the differences observed in the thermal limits established in the technical specifications of operation between the new software, installed at the end of Cycle 10 of Unit 2 of the Laguna Verde nuclear power plant, and the old software that was installed from the beginning of the cycle. The methodology allowed validating the new software during the coastdown stage, before the end of the cycle, so that it could be used as a tool during the shutdown of Unit 2 at the end of Cycle 10. (Author)

  1. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets......, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation......, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  2. Simulation of an MSLB scenario using the 3D neutron kinetic core model DYN3D coupled with the CFD software Trio-U

    Energy Technology Data Exchange (ETDEWEB)

    Grahn, Alexander, E-mail: a.grahn@hzdr.de; Gommlich, André; Kliem, Sören; Bilodid, Yurii; Kozmenkov, Yaroslav

    2017-04-15

    Highlights: • Improved thermal-hydraulic description of nuclear reactor cores. • Providing the reactor dynamics code with realistic thermal-hydraulic boundary conditions. • Possibility of capturing three-dimensional flow phenomena in the core, such as cross flow and flow reversal. • Simulation at higher spatial resolution as compared to system codes. - Abstract: In the framework of the European project NURESAFE, the reactor dynamics code DYN3D, developed at Helmholtz-Zentrum Dresden-Rossendorf (HZDR), was coupled with the Computational Fluid Dynamics (CFD) solver Trio-U, developed at CEA France, in order to replace DYN3D’s one-dimensional hydraulic part with a full three-dimensional description of the coolant flow in the reactor core at higher spatial resolution. The present document gives an introduction to the coupling method and shows results of its application to the simulation of a Main Steamline Break (MSLB) accident of a Pressurised Water Reactor (PWR).

  3. Coupling of 3-D core computational codes and a reactor simulation software for the computation of PWR reactivity accidents induced by thermal-hydraulic transients

    International Nuclear Information System (INIS)

    Raymond, P.; Caruge, D.; Paik, H.J.

    1994-01-01

    The French CEA has recently developed a set of new computer codes for reactor physics computations called the Saphir system which includes CRONOS-2, a three-dimensional neutronic code, FLICA-4, a three-dimensional core thermal hydraulic code, and FLICA-S, a primary loops thermal-hydraulic transient computation code, which are coupled and applied to analyze a severe reactivity accident induced by a thermal hydraulic transient: the Steamline Break accident for a pressurized water reactor until soluble boron begins to accumulate in the core. The coupling of these codes has proved to be numerically stable. 15 figs., 7 refs

  4. FOR A SOFTWARE DESIGN AND DEVELOPMENT COMPANY FOCUSED ON VIRTUAL REALITY LEISURE. CREATION OF THE BRAND ''VIRTUAL CORE ENTERTAINMENT''

    OpenAIRE

    COLOMER RUBIO, GUILLERMO JOSÉ

    2017-01-01

    [EN] The present project deals with a business plan for the creation of a software studio dedicated to the production of videogames in virtual reality. The main motivation for the creation of this company is based on the great opportunity of the video games sector and, more specifically, the leisure based in Virtual Reality, due to the fact that in recent years this technology has been strongly developed and also the economic weight of the videogames sector is getting to surpas...

  5. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
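
    The behaviour described above (scan a directory tree for test directories matching a pattern, then run the tests, optionally spread over several CPU cores) can be sketched in a few lines of Python. This is only an illustration of the idea, not the actual dtest source; the directory pattern and the per-directory `run_test.sh` script are assumptions.

```python
# Illustration of a dtest-like scanner/runner, not the actual dtest code.
import fnmatch
import os
import subprocess
from multiprocessing import Pool

def find_test_dirs(root, pattern="test*"):
    """Yield directories under `root` whose name matches `pattern`."""
    for dirpath, dirnames, _ in os.walk(root):
        for d in dirnames:
            if fnmatch.fnmatch(d, pattern):
                yield os.path.join(dirpath, d)

def run_tests_in(test_dir):
    """Run a hypothetical per-directory test script if present."""
    script = os.path.join(test_dir, "run_test.sh")
    if not os.path.exists(script):
        return (test_dir, None)
    result = subprocess.run(["sh", script], capture_output=True, text=True)
    return (test_dir, result.returncode)

if __name__ == "__main__":
    dirs = list(find_test_dirs("."))
    with Pool() as pool:                      # distribute tests over available cores
        for test_dir, rc in pool.map(run_tests_in, dirs):
            print(f"{test_dir}: {'skipped' if rc is None else 'rc=%d' % rc}")
```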

  6. Upgrade Software and Computing

    CERN Document Server

    The LHCb Collaboration, CERN

    2018-01-01

    This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis.

  7. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  8. OPEN SOURCE SOFTWARE, FREE SOFTWARE?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: the use of open source software. The use of open source software is spreading along with the current global issues in Information and Communication Technology (ICT). Several organizations and companies have started to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software without a licence. Not everything said about open source software is true, so the concept of open source software needs to be introduced, starting from its history, licences and how to choose a licence, as well as the considerations in choosing among the available open source software. Keywords: Licence, Open Source, HAKI

  9. Software Epistemology

    Science.gov (United States)

    2016-03-01

    in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment...LTS Label Transition System; MUSE Mining and Understanding Software Enclaves; RTEMS Real-Time Executive for Multi-processor Systems; SaaS Software as a Service; SSA Static Single Assignment; SWE Software Epistemology; UD/DU Def-Use/Use-Def Chains (Dataflow Graph)

  10. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  11. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  12. Core analysis: new features and applications

    International Nuclear Information System (INIS)

    Edenius, M.; Kurcyusz, E.; Molina, D.; Wiksell, G.

    1995-01-01

    Today, core analysis may be performed with sophisticated software capable of both steady state and transient analysis using a common methodology for BWRs and PWRs. General trends in core analysis software development are: improved accuracy; automated engineering functions; three-dimensional transient capability; and graphical user interfaces. As a demonstration of such software, new features of Studsvik-CMS (Core Management System) and examples of applications are discussed in this article. 2 figs., 8 refs

  13. The STARLINK software collection

    Science.gov (United States)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  14. The dynamic of modern software development project management and the software crisis of quality. An integrated system dynamics approach towards software quality improvement

    OpenAIRE

    Nasirikaljahi, Armindokht

    2012-01-01

    The software industry is plagued by cost-overruns, delays, poor customer satisfaction and quality issues that are costing clients and customers world-wide billions of dollars each year. The phenomenon is coined "The Software Crisis", and poses a huge challenge for software project management. This thesis addresses one of the core issues of the software crisis, namely software quality. The challenges of software quality are central for understanding the other symptoms of the software crisis. Th...

  15. Proteomics Core

    Data.gov (United States)

    Federal Laboratory Consortium — Proteomics Core is the central resource for mass spectrometry based proteomics within the NHLBI. The Core staff help collaborators design proteomics experiments in a...

  16. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

      Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application.  Traditional forms of technical education pay little attention to creativity - often encouraging overly...

  17. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  18. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  19. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  20. Software Reviews.

    Science.gov (United States)

    Davis, Shelly J., Ed.; Knaupp, Jon, Ed.

    1984-01-01

    Reviewed is computer software on: (1) classification of living things, a tutorial program for grades 5-10; and (2) polynomial practice using tiles, a drill-and-practice program for algebra students. (MNS)

  1. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Three pieces of computer software are described and reviewed: HyperCard, to build and use varied applications; Iggy's Gnees, for problem solving with shapes in grades kindergarten-two; and Algebra Shop, for practicing skills and problem solving. (MNS)

  2. The Ettention software package

    International Nuclear Information System (INIS)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-01-01

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.
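
    For readers unfamiliar with the block-iterative Kaczmarz method mentioned above, the following small NumPy sketch shows the basic row-by-row Kaczmarz iteration for a linear system Ax = b, which is the mathematical core that ART-type tomographic reconstruction schemes build on. It is a textbook illustration, not Ettention code; the random test system stands in for a real projection matrix and measured projections.

```python
# Textbook Kaczmarz iteration for Ax = b; illustrative, not Ettention code.
import numpy as np

def kaczmarz(A, b, n_sweeps=50, relax=1.0):
    """Cyclically project the estimate onto the hyperplane of each equation."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            a_i = A[i]
            residual = b[i] - a_i @ x
            x += relax * residual / (a_i @ a_i) * a_i
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.normal(size=(20, 10))       # stand-in for a projection matrix
    x_true = rng.normal(size=10)
    b = A @ x_true                      # stand-in for measured projections
    x_rec = kaczmarz(A, b, n_sweeps=200)
    print("max reconstruction error:", np.abs(x_rec - x_true).max())
```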

  3. The Ettention software package

    Energy Technology Data Exchange (ETDEWEB)

    Dahmen, Tim, E-mail: Tim.Dahmen@dfki.de [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany); Marsalek, Lukas [Eyen SE, Na Nivách 1043/16, 141 00 Praha 4 (Czech Republic); Saarland University, 66123 Saarbrücken (Germany); Marniok, Nico [Saarland University, 66123 Saarbrücken (Germany); Turoňová, Beata [Saarland University, 66123 Saarbrücken (Germany); IMPRS-CS, Max-Planck Institute for Informatics, Campus E 1.4, 66123 Saarbrücken (Germany); Bogachev, Sviatoslav [Saarland University, 66123 Saarbrücken (Germany); Trampert, Patrick; Nickels, Stefan [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Slusallek, Philipp [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany)

    2016-02-15

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.

  4. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  5. Software Authentication

    International Nuclear Information System (INIS)

    Wolford, J.K.; Geelhood, B.D.; Hamilton, V.A.; Ingraham, J.; MacArthur, D.W.; Mitchell, D.J.; Mullens, J.A.; Vanier, P. E.; White, G.K.; Whiteson, R.

    2001-01-01

    The effort to define guidance for authentication of software for arms control and nuclear material transparency measurements draws on a variety of disciplines and has involved synthesizing established criteria and practices with newer methods. Challenges include the need to protect classified information that the software manipulates as well as deal with the rapid pace of innovation in the technology of nuclear material monitoring. The resulting guidance will shape the design of future systems and inform the process of authentication of instruments now being developed. This paper explores the technical issues underlying the guidance and presents its major tenets

  6. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  7. Reviews, Software.

    Science.gov (United States)

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  8. Software Reviews.

    Science.gov (United States)

    Kinnaman, Daniel E.; And Others

    1988-01-01

    Reviews four educational software packages for Apple, IBM, and Tandy computers. Includes "How the West was One + Three x Four,""Mavis Beacon Teaches Typing,""Math and Me," and "Write On." Reviews list hardware requirements, emphasis, levels, publisher, purchase agreements, and price. Discusses the strengths…

  9. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game…

  10. Software Reviews.

    Science.gov (United States)

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  11. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    is automatically generated. Furthermore, MIAWARE software is accompanied with an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through the specially developed ontology. As a result...

  12. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01

    COMMUNICATIONS, AND COMPUTER SCIENCES (AIRMICS), Software Tools for Software Maintenance (ASQBG-1-89-001), October 1988. [The remainder of this record is OCR-garbled; it appears to list maintenance and analysis tools, including a Cobol structuring facility, VS Cobol II, F-Scan, and Fortran static code analysers.]

  13. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

    The EPIQR method is supported by a multimedia computer program. Several modules help users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  14. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, communication switchboards). The company has also acquired extensive knowledge and practical experience of digital long-term preservation technologies. This wide spectrum of activities puts us in the position to discuss an often overlooked aspect of digital preservation - the preservation of software programs. There are many resources dedicated to the digital preservation of data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - dynamic response to inputs - render computer programs rich compared to documents or linear multimedia. The article opens the questions that stand at the beginning of the road towards permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...), where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  15. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

    This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  16. Software Engineering Education: Some Important Dimensions

    Science.gov (United States)

    Mishra, Alok; Cagiltay, Nergiz Ercil; Kilic, Ozkan

    2007-01-01

    Software engineering education has been emerging as an independent and mature discipline. Accordingly, various studies are being done to provide guidelines for curriculum design. The main focus of these guidelines is around core and foundation courses. This paper summarizes the current problems of software engineering education programs. It also…

  17. Transformer core

    NARCIS (Netherlands)

    Mehendale, A.; Hagedoorn, Wouter; Lötters, Joost Conrad

    2008-01-01

    A transformer core includes a stack of a plurality of planar core plates of a magnetically permeable material, which plates each consist of a first and a second sub-part that together enclose at least one opening. The sub-parts can be fitted together via contact faces that are located on either side

  18. Transformer core

    NARCIS (Netherlands)

    Mehendale, A.; Hagedoorn, Wouter; Lötters, Joost Conrad

    2010-01-01

    A transformer core includes a stack of a plurality of planar core plates of a magnetically permeable material, which plates each consist of a first and a second sub-part that together enclose at least one opening. The sub-parts can be fitted together via contact faces that are located on either side

  19. Software Prototyping

    Science.gov (United States)

    Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R.

    2016-01-01

    Summary Background Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. Objective To describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Methods Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Results Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included having 1) subject-matter experts heavily involved with the design; 2) flexibility to make rapid changes; 3) the ability to minimize software development efforts early in the design stage; 4) rapid finalization of requirements; 5) early visualization of designs; and 6) a powerful vehicle for communication of the design to the programmers. Challenges included 1) time and effort to develop the prototypes and case scenarios; 2) no simulation of system performance; 3) not having all proposed functionality available in the final product; and 4) missing needed data elements in the PCC information system. PMID:27081404
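
    Since the prototype was evaluated with the System Usability Scale (SUS), a short reminder of how a SUS score is computed may help. The responses below are invented, but the scoring rule (odd items contribute the response minus 1, even items contribute 5 minus the response, and the sum is scaled by 2.5) is the standard one.

```python
# Standard SUS scoring; the example responses are invented.
def sus_score(responses):
    """responses: list of 10 answers on a 1-5 scale, item 1 first."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)   # odd-numbered items sit at even indexes
        for i, r in enumerate(responses)
    ]
    return 2.5 * sum(contributions)

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```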

  20. SproutCore web application development

    CERN Document Server

    Keating, Tyler

    2013-01-01

    Written as a practical, step-by-step tutorial, Creating HTML5 Apps with SproutCore is full of engaging examples to help you learn in a practical context.This book is for any person looking to write software for the Web or already writing software for the Web. Whether your background is in web development or in software development, Creating HTML5 Apps with SproutCore will help you expand your skills so that you will be ready to apply the software development principles in the web development space.

  1. Validation of a new version of software for monitoring the core of nuclear power plant of Laguna Verde Unit 2, at the end of Cycle 10; Validacion de una nueva version del software para monitoreo del nucleo de la Central Laguna Verde Unidad 2, al final del Ciclo 10

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, G.; Calleros, G.; Mata, F. [Comision Federal de Electricidad, Central Nucleoelectrica de Laguna Verde, Carretera Cardel-Nautla Km 42.5, Veracruz (Mexico)], e-mail: gabriel.hernandez05@cfe.gob.mx

    2009-10-15

    This work shows the differences observed in the thermal limits established in the technical specifications of operation between the new software, installed at the end of Cycle 10 of Unit 2 of the Laguna Verde nuclear power plant, and the old software that was installed from the beginning of the cycle. The methodology allowed validating the new software during the coastdown stage, before the end of the cycle, so that it could be used as a tool during the shutdown of Unit 2 at the end of Cycle 10. (Author)

  2. The Ettention software package.

    Science.gov (United States)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

    We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Core lifter

    Energy Technology Data Exchange (ETDEWEB)

    Pavlov, N G; Edel' man, Ya A

    1981-02-15

    A core lifter is suggested which contains a housing and core-clamping elements installed in the housing depressions in the form of semirings with projections on the outer surface restricting the rotation of the semirings in the housing depressions. In order to improve the strength and reliability of the core lifter, the semirings have a variable transverse section formed on the outside by the surface of revolution of the inner arc of the semiring around the rotation axis and on the inside by a cylindrical surface which is concentric to the outer arc of the semiring. The core-clamping elements made in this manner can rotate freely in the housing depressions under their own weight and from contact with the core sample. These semirings do not have weakened sections, have sufficient strength, and are inserted into the limited ring section of the housing of the core lifter without reducing its through opening, and this improves the reliability of the core lifter in operation.

  4. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  5. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  6. Reactor core

    International Nuclear Information System (INIS)

    Azekura, Kazuo; Kurihara, Kunitoshi.

    1992-01-01

    In a BWR type reactor, a great number of pipes (spectral shift pipes) are disposed in the reactor core. Moderators having a small moderating cross section (heavy water) are circulated in the spectral shift pipes to suppress the excess reactivity while increasing the conversion ratio at the initial stage of the operation cycle. After the intermediate stage of the operation cycle, in which the reactor core reactivity is lowered, reactivity is increased by circulating moderators having a great moderating cross section (light water) to extend the attainable burnup. Further, neutron absorbers such as boron are mixed into the moderator in the spectral shift pipes, and their concentration is controlled. With such a constitution, control rods and their driving mechanisms are no longer necessary, which simplifies the structure of the reactor core. This can increase the fuel conversion ratio and control a great excess reactivity. Accordingly, a nuclear reactor core of high conversion and high burnup can be attained. (I.N.)

  7. Ice Cores

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Records of past temperature, precipitation, atmospheric trace gases, and other aspects of climate and environment derived from ice cores drilled on glaciers and ice...

  8. Core Flight System (CFS) Integrated Development Environment

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of this project is to create an Integrated Development Environment (IDE) for the Core Flight System (CFS) software to reduce the time it takes to...

  9. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  10. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    software license, software usage, ELA, Software as a Service, SaaS, Software Asset...PaaS Platform as a Service; SaaS Software as a Service; SAM Software Asset Management; SMS System Management Server; SEWP Solutions for Enterprise Wide...delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service. Software

  11. Risk reduction using DDP (Defect Detection and Prevention): Software support and software applications

    Science.gov (United States)

    Feather, M. S.

    2001-01-01

    Risk assessment and mitigation is the focus of the Defect Detection and Prevention (DDP) process, which has been applied to spacecraft technology assessments and planning, both hardware and software. DDP's major elements and their relevance to core requirement engineering concerns are summarized. The accompanying research demonstration illustrates DDP's tool support, and further customizations for application to software.
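
    The DDP idea of weighing risks against candidate mitigations can be pictured with a toy expected-impact calculation. The risk items, likelihoods, impacts and effectiveness factors below are hypothetical and only illustrate the kind of bookkeeping that DDP tool support automates; they are not drawn from the DDP tool itself.

```python
# Toy risk/mitigation bookkeeping in the spirit of DDP; all numbers hypothetical.
risks = {                      # risk: (likelihood, impact if it occurs)
    "sensor drift":      (0.30, 8.0),
    "software deadlock": (0.10, 9.0),
    "memory overrun":    (0.20, 6.0),
}
mitigations = {                # mitigation: {risk: fractional reduction in likelihood}
    "added watchdog":  {"software deadlock": 0.7},
    "static analysis": {"memory overrun": 0.5, "software deadlock": 0.2},
}

def residual_risk(risks, applied):
    """Expected impact remaining after applying the listed mitigations."""
    total = 0.0
    for name, (likelihood, impact) in risks.items():
        for m in applied:
            likelihood *= 1.0 - mitigations[m].get(name, 0.0)
        total += likelihood * impact
    return total

print("baseline expected impact:", residual_risk(risks, []))
print("with both mitigations:   ", residual_risk(risks, list(mitigations)))
```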

  12. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  13. Continuing progress on a lattice QCD software infrastructure

    International Nuclear Information System (INIS)

    Joo, B

    2008-01-01

    We report on the progress of the software effort in the QCD application area of SciDAC. In particular, we discuss how the software developed under SciDAC enabled the aggressive exploitation of leadership computers, and we report on progress in the area of QCD software for multi-core architectures

  14. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  15. Reactor core

    International Nuclear Information System (INIS)

    Matsuura, Tetsuaki; Nomura, Teiji; Tokunaga, Kensuke; Okuda, Shin-ichi

    1990-01-01

    Fuel assemblies in the portions where the gradient of the fast neutron flux between two opposing faces of a channel box is great are kept loaded at the outermost peripheral position of the reactor core also in the second operation cycle, in order to prevent interference between a control rod and the channel box due to bending deformation of the channel box. Further, the fuel assemblies in the second row from the outermost periphery in the first operation cycle are also kept loaded at the second row in the second operation cycle. Since the gradient of the fast neutron flux in the reactor core is especially great at the outer circumference of the reactor core, the channel box at the outer circumference bends such that the surface facing the center of the reactor core becomes convex, and the channel box in the second row also bends in the same direction, so the insertion of the control rod is not hindered. Further, if the positions of the fuels at the outermost periphery and the fuels in the second row are not altered in the second operation cycle, the gaps are not reduced, which prevents interference between the control rod and the channel box. (N.H.)

  16. Software Innovation in a Mission Critical Environment

    Science.gov (United States)

    Fredrickson, Steven

    2015-01-01

    Operating in mission-critical environments requires trusted solutions, and the preference for "tried and true" approaches presents a potential barrier to infusing innovation into mission-critical systems. This presentation explores opportunities to overcome this barrier in the software domain. It outlines specific areas of innovation in software development achieved by the Johnson Space Center (JSC) Engineering Directorate in support of NASA's major human spaceflight programs, including International Space Station, Multi-Purpose Crew Vehicle (Orion), and Commercial Crew Programs. Software engineering teams at JSC work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements for genuinely mission critical applications. The innovations described, including the use of NASA Core Flight Software and its associated software tool chain, can lead to software that is more affordable, more reliable, better modelled, more flexible, more easily maintained, better tested, and enabling of automation.

  17. Modelling characteristics of ferromagnetic cores with the influence of temperature

    International Nuclear Information System (INIS)

    Górecki, K; Rogalska, M; Zarȩbski, J; Detka, K

    2014-01-01

    The paper is devoted to modelling characteristics of ferromagnetic cores with the use of SPICE software. Some disadvantages of the selected literature models of such cores are discussed. A modified model of ferromagnetic cores taking into account the influence of temperature on the magnetizing characteristics and the core losses is proposed. The form of the elaborated model is presented and discussed. The correctness of this model is verified by comparing the calculated and the measured characteristics of the selected ferromagnetic cores.
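
    As a rough illustration of how temperature can enter a core-loss description (not the model proposed in the paper), the sketch below scales a Steinmetz-type loss estimate with a quadratic temperature factor of the kind found in ferrite datasheets; all coefficients and the function name are hypothetical.

```python
# Illustrative Steinmetz-type core loss with a quadratic temperature factor.
# Coefficients are hypothetical, not the parameters of the paper's model.
def core_loss(f_hz, b_peak_t, temp_c,
              k=1.5e-3, alpha=1.4, beta=2.3,
              ct0=1.97, ct1=1.92e-2, ct2=6.4e-5):
    pv_ref = k * f_hz**alpha * b_peak_t**beta             # loss density at a reference temperature
    temp_factor = ct0 - ct1 * temp_c + ct2 * temp_c**2    # typical U-shaped temperature behaviour
    return pv_ref * temp_factor                           # arbitrary units in this sketch

for t in (25, 60, 100):
    print(f"{t:3d} degC -> {core_loss(100e3, 0.2, t):.2f} (arbitrary units)")
```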

  18. ESTSC - Software Best Practices

    Science.gov (United States)

    DOE Scientific and Technical Software Best Practices, December 2010. Table of Contents: 1.0 Introduction; 2.0 Responsibilities; 2.1 OSTI/ESTSC; 2.2 SIACs; 2.3 Software Submitting Sites/Creators; 2.4 Software Sensitivity Review; 3.0 Software Announcement and Submission; 3.1 STI Software Appropriate for Announcement; 3.2

  19. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS...2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the

  20. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  1. Validation of reactor core protection system

    International Nuclear Information System (INIS)

    Lee, Sang-Hoon; Bae, Jong-Sik; Baeg, Seung-Yeob; Cho, Chang-Ho; Kim, Chang-Ho; Kim, Sung-Ho; Kim, Hang-Bae; In, Wang-Kee; Park, Young-Ho

    2008-01-01

    Reactor COre Protection System (RCOPS), an advanced core protection calculator system, is a digitized system which provides the core protection function based on two reactor core operating parameters, Departure from Nucleate Boiling Ratio (DNBR) and Local Power Density (LPD). It generates a reactor trip signal when the core condition exceeds the DNBR or LPD design limit. It consists of four independent channels adopting a two-out-of-four trip logic. System configuration, hardware platform and an improved algorithm of the newly designed core protection calculator system are described in this paper. One channel of RCOPS was implemented as a single-channel facility for this R and D project, where we performed final integration software testing. To implement custom function blocks, pSET is used. Software testing is performed by two methods: the first is a 'Software Module Test' and the second is a 'Software Unit Test'. New features include improvement of the core thermal margin through a revised on-line DNBR algorithm, resolution of the latching problem of the control element assembly signal, and addition of pre-trip alarm generation. The change of the on-line DNBR calculation algorithm is considered to improve the DNBR net margin by 2.5%-3.3%. (author)
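
    The RCOPS software itself is not reproduced here; the Python sketch below only illustrates the generic two-out-of-four trip voting on DNBR and LPD described in the abstract. The numeric design limits and channel readings are placeholders, not plant values.

```python
def channel_trip(dnbr: float, lpd: float,
                 dnbr_limit: float = 1.3, lpd_limit: float = 21.0) -> bool:
    """A channel demands a trip if either parameter violates its design limit.

    The numeric limits here are placeholders for illustration only.
    """
    return dnbr < dnbr_limit or lpd > lpd_limit

def two_out_of_four(channel_demands) -> bool:
    """Generate a reactor trip when at least 2 of the 4 channels demand one."""
    return sum(bool(d) for d in channel_demands) >= 2

if __name__ == "__main__":
    # (DNBR, LPD) readings from four independent channels; values are invented.
    channels = [(1.45, 18.0), (1.28, 19.5), (1.25, 20.0), (1.50, 17.0)]
    demands = [channel_trip(dnbr, lpd) for dnbr, lpd in channels]
    print("reactor trip:", two_out_of_four(demands))
```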

  2. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; increasing reliability by means of software redundancy methods. Maintenance of software for long-term operating behavior. (HP) [de

  3. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  4. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  5. Core BPEL

    DEFF Research Database (Denmark)

    Hallwyl, Tim; Højsgaard, Espen

    The Web Services Business Process Execution Language (WS-BPEL) is a language for expressing business process behaviour based on web services. The language is intentionally not minimal but provides a rich set of constructs, allows omission of constructs by relying on defaults, and supports language......, does not allow omissions, and does not contain ignorable elements. We do so by identifying syntactic sugar, including default values, and ignorable elements in WS-BPEL. The analysis results in a translation from the full language to the core subset. Thus, we reduce the effort needed for working...

  6. Packaging of control system software

    International Nuclear Information System (INIS)

    Zagar, K.; Kobal, M.; Saje, N.; Zagar, A.; Sabjan, R.; Di Maio, F.; Stepanov, D.

    2012-01-01

    Control system software consists of several parts - the core of the control system, drivers for integration of devices, configuration for user interfaces, alarm system, etc. Once the software is developed and configured, it must be installed to computers where it runs. Usually, it is installed on an operating system whose services it needs, and also in some cases dynamically links with the libraries it provides. Operating system can be quite complex itself - for example, a typical Linux distribution consists of several thousand packages. To manage this complexity, we have decided to rely on Red Hat Package Management system (RPM) to package control system software, and also ensure it is properly installed (i.e., that dependencies are also installed, and that scripts are run after installation if any additional actions need to be performed). As dozens of RPM packages need to be prepared, we are reducing the amount of effort and improving consistency between packages through a Maven-based infrastructure that assists in packaging (e.g., automated generation of RPM SPEC files, including automated identification of dependencies). So far, we have used it to package EPICS, Control System Studio (CSS) and several device drivers. We perform extensive testing on Red Hat Enterprise Linux 5.5, but we have also verified that packaging works on CentOS and Scientific Linux. In this article, we describe in greater detail the systematic system of packaging we are using, and its particular application for the ITER CODAC Core System. (authors)
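
    The Maven-based packaging infrastructure mentioned above is not reproduced in this record; as a minimal, hypothetical illustration of automated RPM SPEC file generation, the Python sketch below renders a skeleton SPEC from a small package description. The template fields are a reduced subset of a real SPEC, and the example package name and file paths are invented.

```python
SPEC_TEMPLATE = """\
Name:           {name}
Version:        {version}
Release:        1%{{?dist}}
Summary:        {summary}
License:        {license}
Requires:       {requires}

%description
{description}

%files
{files}
"""

def render_spec(pkg: dict) -> str:
    """Render a minimal RPM SPEC file from a package description dictionary."""
    return SPEC_TEMPLATE.format(
        name=pkg["name"],
        version=pkg["version"],
        summary=pkg["summary"],
        license=pkg.get("license", "Unknown"),
        requires=", ".join(pkg.get("requires", [])),
        description=pkg.get("description", pkg["summary"]),
        files="\n".join(pkg.get("files", [])),
    )

if __name__ == "__main__":
    # Hypothetical driver package; names and paths are invented for the example.
    print(render_spec({
        "name": "example-ioc-driver",
        "version": "1.0.0",
        "summary": "Example EPICS device support packaged as an RPM",
        "requires": ["epics-base"],
        "files": ["/opt/codac/drivers/example"],
    }))
```

    In practice such a generator would also derive the Requires entries automatically from the package's declared dependencies, in the spirit of the automated dependency identification the article describes.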

  7. Evolvable Neural Software System

    Science.gov (United States)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.

  8. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  9. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes there is an enormous amount of software available for use by teachers of reading and literacy--whereas drill-and-practice software is the largest category of software available, large numbers of…

  10. Ensuring Software IP Cleanliness

    OpenAIRE

    Mahshad Koohgoli; Richard Mayer

    2007-01-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  11. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

    multiversion software subject to coincident errors. IEEE Trans. Software Eng. SE-11:1511-1517. Eckhardt, D.E., A.K Caglayan, J.C. Knight, L.D. Lee, D.F...J.C. and N.G. Leveson. 1986. Experimental evaluation of the assumption of independence in multiversion software. IEEE Trans. Software

  12. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  13. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  14. Software - Naval Oceanography Portal

    Science.gov (United States)

    USNO Earth Orientation software products, including auxiliary and supporting software and an Earth Orientation Matrix Calculator.

  15. Software Engineering Education Directory

    Science.gov (United States)

    1990-04-01

    and Engineering (CMSC 735) Codes: GPEV2 * Textbooks: IEEE Tutorial on Models and Metrics for Software Management and Engineering by Basili, Victor R... Software Engineering (Comp 227) Codes: GPRY5 Textbooks: IEEE Tutorial on Software Design Techniques by Freeman, Peter and Wasserman, Anthony 1. Software

  16. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  17. The State of Software for Evolutionary Biology.

    Science.gov (United States)

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code and, consequently, also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  18. ATLAS Software Installation on Supercomputers

    CERN Document Server

    Undrus, Alexander; The ATLAS collaboration

    2018-01-01

    PowerPC and high performance computers (HPC) are important resources for computing in the ATLAS experiment. The future LHC data processing will require more resources than Grid computing, currently using approximately 100,000 cores at well over 100 sites, can provide. Supercomputers are extremely powerful as they use the resources of hundreds of thousands of CPUs joined together. However, their architectures have different instruction sets. ATLAS binary software distributions for x86 chipsets do not fit these architectures, as emulation of these chipsets results in a huge performance loss. This presentation describes the methodology of ATLAS software installation from source code on supercomputers. The installation procedure includes downloading the ATLAS code base as well as the source of about 50 external packages, such as ROOT and Geant4, followed by compilation, and rigorous unit and integration testing. The presentation reports the application of this procedure at Titan HPC and Summit PowerPC at Oak Ridge Computin...

  19. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

    The field of testability is an active, well-established part of engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or reduce the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the softwa...

  20. Agile software assessment

    OpenAIRE

    Nierstrasz Oscar; Lungu Mircea

    2012-01-01

    Informed decision making is a critical activity in software development but it is poorly supported by common development environments which focus mainly on low level programming tasks. We posit the need for agile software assessment which aims to support decision making by enabling rapid and effective construction of software models and custom analyses. Agile software assessment entails gathering and exploiting the broader context of software information related to the system at hand as well ...

  1. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  2. SMART core protection system design

    International Nuclear Information System (INIS)

    Lee, J. K.; Park, H. Y.; Koo, I. S.; Park, H. S.; Kim, J. S.; Son, C. H.

    2003-01-01

    SMART COre Protection System (SCOPS) is designed with a real-time Digital Signal Processor (DSP) board and a Network Interface Card (NIC) board. SCOPS has a Control Rod POSition (CRPOS) software module, whereas the Core Protection Calculator System (CPCS) consists of Core Protection Calculators (CPCs) and Control Element Assembly (CEA) Calculators (CEACs) in the commercial nuclear plant. It is not necessary to have independent cabinets for SCOPS because SCOPS is physically very small, so SCOPS is designed to share the cabinets with the Plant Protection System (PPS) of SMART. Therefore it is very easy to maintain the system, because the CRPOS module is used instead of a computer with an operating system

  3. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research and collect, compile, and analyze SQA Metrics that have been used in other projects that are not currently being used by the SA team and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  4. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  5. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  6. SLIMarray: Lightweight software for microarray facility management

    Directory of Open Access Journals (Sweden)

    Marzolf Bruz

    2006-10-01

    Background: Microarray core facilities are commonplace in biological research organizations, and need systems for accurately tracking various logistical aspects of their operation. Although these different needs could be handled separately, an integrated management system provides benefits in organization, automation and reduction in errors. Results: We present SLIMarray (System for Lab Information Management of Microarrays), an open source, modular database web application capable of managing microarray inventories, sample processing and usage charges. The software allows modular configuration and is well suited for further development, providing users the flexibility to adapt it to their needs. SLIMarray Lite, a version of the software that is especially easy to install and run, is also available. Conclusion: SLIMarray addresses the previously unmet need for free and open source software for managing the logistics of a microarray core facility.

  7. The Evolution of Software in High Energy Physics

    International Nuclear Information System (INIS)

    Brun, René

    2012-01-01

    The paper reviews the evolution of the software in High Energy Physics from the time of expensive mainframes to grids and clouds systems using thousands of multi-core processors. It focuses on the key parameters or events that have shaped the current software infrastructure.

  8. Imprinting Community College Computer Science Education with Software Engineering Principles

    Science.gov (United States)

    Hundley, Jacqueline Holliday

    2012-01-01

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and…

  9. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  10. Spotting software errors sooner

    International Nuclear Information System (INIS)

    Munro, D.

    1989-01-01

    Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)

  11. Avionics and Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the AES Avionics and Software (A&S) project is to develop a reference avionics and software architecture that is based on standards and that can be...

  12. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square-feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  13. Process mining software repositories

    NARCIS (Netherlands)

    Poncin, W.; Serebrenik, A.; Brand, van den M.G.J.

    2011-01-01

    Software developers' activities are in general recorded in software repositories such as version control systems, bug trackers and mail archives. While abundant information is usually present in such repositories, successful information extraction is often challenged by the necessity to

  14. A software quality model and metrics for risk assessment

    Science.gov (United States)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion on risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.
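
    The model's actual metrics and thresholds are not listed in this record; the Python sketch below is only a generic illustration of rolling a core set of process and product metrics into a single weighted risk indicator. The metric names, ranges, and weights are invented for the example.

```python
def normalize(value: float, worst: float, best: float) -> float:
    """Map a raw metric onto [0, 1], where 1 is the riskiest end of its range."""
    span = best - worst
    score = (best - value) / span if span else 0.0
    return min(1.0, max(0.0, score))

def risk_score(metrics: dict, weights: dict) -> float:
    """Weighted average of normalized metrics; higher means higher perceived risk."""
    total = sum(weights.values())
    return sum(weights[name] * normalize(*metrics[name]) for name in weights) / total

if __name__ == "__main__":
    # Each metric: (observed value, worst plausible value, best plausible value).
    metrics = {
        "requirements_volatility": (0.30, 1.0, 0.0),   # fraction changed per month
        "defect_density": (2.5, 10.0, 0.0),            # defects per KSLOC
        "test_coverage": (0.65, 0.0, 1.0),             # fraction of code covered
    }
    weights = {"requirements_volatility": 0.4, "defect_density": 0.4, "test_coverage": 0.2}
    print(f"risk score: {risk_score(metrics, weights):.2f}")
```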

  15. Side core lifter

    Energy Technology Data Exchange (ETDEWEB)

    Edelman, Ya A

    1982-01-01

    A side core lifter is proposed which contains a housing with guide slits and a removable core lifter with side projections on the support section connected to the core receiver. In order to preserve the structure of the rock in the core sample by guaranteeing rectilinear movement of the core lifter in the rock, the support and core receiver sections are hinged. The device is equipped with a spring for angular shift in the core-reception part.

  16. Optimization of Antivirus Software

    OpenAIRE

    Catalin BOJA; Adrian VISOIU

    2007-01-01

    The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined that help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques that are applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyze...

  17. Open Source Software Development

    Science.gov (United States)

    2011-01-01

    appropriate to refer to FOSS or FLOSS (L for Libre, where the alternative term "libre software" has popularity in some parts of the world) in order... Applying Social Network Analysis to Community-Driven Libre Software Projects, Intern. J. Info. Tech. and Web Engineering, 2006, 1(3), 27-28. 17... Open Source Software Development* Walt Scacchi, Institute for Software Research, University of California, Irvine, Irvine, CA 92697-3455 USA. Abstract

  18. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  19. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concerns in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and qualit

  20. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic...... impact in software production. As is also apparent from the call for papers these perspectives focus on qualities such as reuse, adaptability, and maintainability....

  1. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  2. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  3. XES Software Communication Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  4. Neutron Scattering Software

    Science.gov (United States)

    A new portal for neutron scattering software has just been established, listing packages such as KUPLOT (data plotting and fitting software) and ILL/TAS (Matlab programs for analyzing triple-axis data).

  5. XES Software Event Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  6. ARC Software and Models

    Science.gov (United States)

    Most research conducted at the ARC produces software code and methodologies that are transferred to TARDEC and industry partners. These

  7. XES Software Telemetry Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  8. Specifications in software prototyping

    OpenAIRE

    Luqi; Chang, Carl K.; Zhu, Hong

    1998-01-01

    We explore the use of software specifications for software prototyping. This paper describes a process model for software prototyping, and shows how specifications can be used to support such a process via a cellular mobile phone switch example.

  9. Software Engineering for Portability.

    Science.gov (United States)

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programing; and suggestions for further research and…

  10. Software Acquisition and Software Engineering Best Practices

    National Research Council Canada - National Science Library

    Eslinger, S

    1999-01-01

    The purpose of this white paper is to address the issues raised in the recently published Senate Armed Services Committee Report 106-50 concerning Software Management Improvements for the Department of Defense (DoD...

  11. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which describes the standard that software should meet. In a software project, quality is the key factor in the success or decline of a software organization. Much research has been done regarding software quality. Software organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  12. SecureCore Software Architecture: Trusted Path Application (TPA) Requirements

    National Research Council Canada - National Science Library

    Clark, Paul C; Irvine, Cynthia E; Levin, Timothy E; Nguyen, Thuy D; Vidas, Timothy M

    2007-01-01

    .... A high-level architecture is described to provide such features. In addition, a usage scenario is described for a potential use of the architecture, with emphasis on the trusted path, a non-spoofable user interface to the trusted components of the system. Detailed requirements for the trusted path are provided.

  13. Core Flight Software for Unmanned Aircraft Systems, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Use of Unmanned Aircraft Systems (UAS) is increasing worldwide, but multiple technical barriers restrict the greater use of UASs. The safe operation of UASs in the...

  14. Licensing safety critical software

    International Nuclear Information System (INIS)

    Archinoff, G.H.; Brown, R.A.

    1990-01-01

    Licensing difficulties with the shutdown system software at the Darlington Nuclear Generating Station contributed to delays in starting up the station. Even though the station has now been given approval by the Atomic Energy Control Board (AECB) to operate, the software issue has not disappeared - Ontario Hydro has been instructed by the AECB to redesign the software. This article attempts to explain why software based shutdown systems were chosen for Darlington, why there was so much difficulty licensing them, and what the implications are for other safety related software based applications

  15. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process

  16. Software Validation in ATLAS

    International Nuclear Information System (INIS)

    Hodgkinson, Mark; Seuster, Rolf; Simmons, Brinick; Sherwood, Peter; Rousseau, David

    2012-01-01

    The ATLAS collaboration operates an extensive set of protocols to validate the quality of the offline software in a timely manner. This is essential in order to process the large amounts of data being collected by the ATLAS detector in 2011 without complications on the offline software side. We will discuss a number of different strategies used to validate the ATLAS offline software; running the ATLAS framework software, Athena, in a variety of configurations daily on each nightly build via the ATLAS Nightly System (ATN) and Run Time Tester (RTT) systems; the monitoring of these tests and checking the compilation of the software via distributed teams of rotating shifters; monitoring of and follow up on bug reports by the shifter teams and periodic software cleaning weeks to improve the quality of the offline software further.

  17. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  18. Science and Software

    Science.gov (United States)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  19. G3: GENESIS software environment update

    OpenAIRE

    Castagné, Nicolas; Cadoz, Claude; Allaoui, Ali; Tache, Olivier Michel

    2009-01-01

    GENESIS3 is the new version of the GENESIS software environment for musical creation by means of mass-interaction physics network modeling. It was designed, and developed from scratch, in hindsight of more than 10 years working on and using the previous version. We take the opportunity of this birth to provide in this article (1) an analysis of the peculiarities in GENESIS, aiming at highlighting its core "software paradigm"; and (2) an update on the features of the new version as compared to...

  20. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    Software Maintenance and Evolution: The Implication for Software Development. ... Software maintenance is the process of modifying existing operational software by correcting errors, ...

  1. Animal MRI Core

    Data.gov (United States)

    Federal Laboratory Consortium — The Animal Magnetic Resonance Imaging (MRI) Core develops and optimizes MRI methods for cardiovascular imaging of mice and rats. The Core provides imaging expertise,...

  2. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performances....... This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper......, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. These issues we discuss along with use case scenarios. Here in this paper we aim to identify challenges...

  3. Software engineering in industry

    Science.gov (United States)

    Story, C. M.

    1989-12-01

    Can software be "engineered"? Can a few people with limited resources and a negligible budget produce high quality software solutions to complex software problems? Is it possible to resolve the conflict between research activities and the necessity to view software development as a means to an end rather than as an end in itself? The aim of this paper is to encourage further thought and discussion on various topics which, in the author's experience, are becoming increasingly critical in large current software production and development projects, inside and outside high energy physics (HEP). This is done by briefly exploring some of the software engineering ideas and technologies now used in the information industry, using, as a case-study, a project with many similarities to those currently under way in HEP.

  4. LANMAS core: Update and current directions

    International Nuclear Information System (INIS)

    Claborn, J.

    1995-01-01

    Local Area Network Material Accountability System (LANMAS) core software provides the framework of a material accountability system. It tracks the movement of material throughout a site and generates the required material accountability reports. LANMAS is a network-based nuclear material accountability system that runs in a client/server mode. The database of material type and location resides on the server, while the user interface runs on the client. The user interface accesses the data stored on the server via a network. The LANMAS core can be used as the foundation for building required materials control and accountability (MCA) functionality at any site requiring a new MCA system. An individual site will build on the LANMAS core by supplying site-specific software. This paper will provide an update on the current LANMAS development activities and discuss the current direction of the LANMAS project
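
    LANMAS itself is a client/server system backed by a server-side database; the Python sketch below is only a toy illustration of the kind of movement ledger such a core might maintain, recording transfers of material between site locations and reporting per-location balances. All class and field names are hypothetical and unrelated to the actual LANMAS schema.

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Transfer:
    """One recorded movement of material between two locations on a site."""
    material_type: str
    quantity_kg: float
    source: str
    destination: str
    timestamp: datetime

class AccountabilityLedger:
    """Tracks material movements and produces simple balance reports."""

    def __init__(self):
        self.transfers: list = []
        self.balances: dict = defaultdict(float)

    def record(self, t: Transfer) -> None:
        # Debit the source location and credit the destination for this material.
        self.transfers.append(t)
        self.balances[(t.source, t.material_type)] -= t.quantity_kg
        self.balances[(t.destination, t.material_type)] += t.quantity_kg

    def report(self) -> dict:
        return dict(self.balances)

if __name__ == "__main__":
    ledger = AccountabilityLedger()
    ledger.record(Transfer("U-235", 1.2, "vault-A", "lab-3", datetime.now()))
    print(ledger.report())
```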

  5. LANMAS core: Update and current directions

    International Nuclear Information System (INIS)

    Claborn, J.

    1994-01-01

    Local Area Network Material Accountability System (LANMAS) core software will provide the framework of a material accountability system. LANMAS is a network-based nuclear material accountability system. It tracks the movement of material throughout a site and generates the required reports on material accountability. LANMAS will run in a client/server mode. The database of material type and location will reside on the server, while the user interface runs on the client. The user interface accesses the server via a network. The LANMAS core can be used as the foundation for building required Materials Control and Accountability (MC&A) functionality at any site requiring a new MC&A system. An individual site will build on the LANMAS core by supplying site-specific software. This paper will provide an update on the current LANMAS development activities and discuss the current direction of the LANMAS project

  6. A software product certification model

    NARCIS (Netherlands)

    Heck, P.M.; Klabbers, M.D.; van Eekelen, Marko

    2010-01-01

    Certification of software artifacts offers organizations more certainty and confidence about software. Certification of software helps software sales, acquisition, and can be used to certify legislative compliance or to achieve acceptable deliverables in outsourcing. In this article, we present a

  7. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    Why verification of software products throughout the software life cycle is necessary is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  8. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. Explains two maintenance standards: IEEE/EIA 1219 and ISO/IEC 14764. Discusses several commercial reverse and domain engineering toolkits. Slides for instructors are available online. Information is based on the IEEE SWEBOK (Software Engineering Body of Knowledge)

  9. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-10-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programed in order to control the function that they perform. The basics of microprograming and new microcircuits have already been discussed. In this course, the methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogramed circuit itself. 15 figures, 2 tables

  10. Hospital Management Software Development

    OpenAIRE

    sobogunGod, olawale

    2012-01-01

    The purpose of this thesis was to implement hospital management software suitable for small private hospitals in Nigeria, especially for the ones that use a file-based system for storing information rather than having it stored in a more efficient and safer environment like databases or spreadsheet software such as Excel. The software developed within this thesis project was specifically designed for the Rainbow specialist hospital, which is based in Lagos, the commercial neurological cente...

  11. Computer software configuration management

    International Nuclear Information System (INIS)

    Pelletier, G.

    1987-08-01

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  12. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1993-01-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an object-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere)
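
    The UPAK-derived format is not described in this record, so the Python sketch below only illustrates what a minimal "universal histogram object" with a self-describing save/load format might look like. The JSON container and field names are assumptions made for this example.

```python
import json

class Histogram:
    """A minimal 1-D histogram object with a self-describing save/load format."""

    def __init__(self, name, nbins, lo, hi):
        self.name, self.nbins, self.lo, self.hi = name, nbins, lo, hi
        self.counts = [0] * nbins

    def fill(self, value, weight=1):
        # Only values inside [lo, hi) land in a bin; others are silently dropped.
        if self.lo <= value < self.hi:
            bin_index = int((value - self.lo) / (self.hi - self.lo) * self.nbins)
            self.counts[bin_index] += weight

    def save(self, path):
        with open(path, "w") as f:
            json.dump(self.__dict__, f)

    @classmethod
    def load(cls, path):
        with open(path) as f:
            data = json.load(f)
        h = cls(data["name"], data["nbins"], data["lo"], data["hi"])
        h.counts = data["counts"]
        return h

if __name__ == "__main__":
    h = Histogram("gamma_energy_keV", nbins=4096, lo=0.0, hi=4096.0)
    for e in (121.8, 344.3, 1408.0):
        h.fill(e)
    h.save("spectrum.json")
    print(sum(Histogram.load("spectrum.json").counts))
```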

  13. Software quality management

    International Nuclear Information System (INIS)

    Bishop, D.C.; Pymm, P.

    1991-01-01

    As programmable electronic (software-based) systems are increasingly being proposed as design solutions for high integrity applications in nuclear power stations, the need to adopt suitable quality management arrangements is paramount. The authors describe Scottish Nuclear's strategy for software quality management and, using the main on-line monitoring system at Torness Power Station as an example, explain how this strategy is put into practice. Particular attention is given to the topics of software quality planning and change control. (author)

  14. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The commitment to prescriptive approaches in software process improvement theory over the last one or two decades may contribute to the emergence of a gulf dividing theorists and practitioners....... It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice with a focus on the software process policymaking and process control aspects of improvement efforts...

  15. Assuring Software Reliability

    Science.gov (United States)

    2014-08-01

    technologies and processes to achieve a required level of confidence that software systems and services function in the intended manner. 1.3 Security Example... that took three high-voltage lines out of service and a software failure (a race condition) that disabled the computing service that notified the... service had failed. Instead of analyzing the details of the alarm server failure, the reviewers asked why the following software assurance claim had

  16. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu

    2004-01-01

    This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), which is an XML-based metaprogramming technique. As the software evolves, a large number of variants may arise, especially when such kinds of evolutions are related to multiple platforms as shown in our... case study. Handling variants and tracing the impact of variants across the development lifecycle is a challenge. This chapter shows how we can maintain different versions of software in a reuse-based way.

  17. FASTBUS software status

    International Nuclear Information System (INIS)

    Gustavson, D.B.

    1980-10-01

    Computer software will be needed in addition to the mechanical, electrical, protocol and timing specifications of the FASTBUS, in order to facilitate the use of this flexible new multiprocessor and multisegment data acquisition and processing system. Software considerations have been important in the FASTBUS design, but standard subroutines and recommended algorithms will be needed as the FASTBUS comes into use. This paper summarizes current FASTBUS software projects, goals and status

  18. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  19. Solar Asset Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Iverson, Aaron [Ra Power Management, Inc., Oakland, CA (United States); Zviagin, George [Ra Power Management, Inc., Oakland, CA (United States)

    2016-09-30

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  20. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  1. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model; Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models: Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept

  2. Agile software development

    CERN Document Server

    Dingsoyr, Torgeir; Moe, Nils Brede

    2010-01-01

    Agile software development has become an umbrella term for a number of changes in how software developers plan and coordinate their work, how they communicate with customers and external stakeholders, and how software development is organized in small, medium, and large companies, from the telecom and healthcare sectors to games and interactive media. Still, after a decade of research, agile software development is the source of continued debate due to its multifaceted nature and insufficient synthesis of research results. Dingsoyr, Dyba, and Moe now present a comprehensive snapshot of the kno

  3. Optimization of Antivirus Software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined that help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyzes some of the optimization concepts applied to this category of applications

  4. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. * Includes supplementary website with an instructor's guide and solutions * Applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI) * Illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors

  5. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  6. Software as quality product

    International Nuclear Information System (INIS)

    Enders, A.

    1975-01-01

    In many discussions on the reliability of computer systems, software is presented as the weak link in the chain. The contribution attempts to identify the reasons for this situation as seen from the software development. The concepts correctness and reliability of programmes are explained as they are understood in the specialist discussion of today. Measures and methods are discussed which are particularly relevant as far as the obtaining of fault-free and reliable programmes is concerned. Conclusions are drawn for the user of software so that he is in a position to judge for himself what can be justly expected from the product software compared to other products. (orig./LH) [de

  7. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

      This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives – Product, Project, Process, and People –to identify an outlook for software innovation. The paper then describes a new facility–Software Innovation Research Lab (SIRL......) – and a new method concept for software innovation – Essence – based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research....

  8. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than...... and experience reported at the IEEE International Conference on Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage...

  9. Software Intensive Systems

    National Research Council Canada - National Science Library

    Horvitz, E; Katz, D. J; Rumpf, R. L; Shrobe, H; Smith, T. B; Webber, G. E; Williamson, W. E; Winston, P. H; Wolbarsht, James L

    2006-01-01

    .... Additionally, recommend that DoN invest in software engineering, particularly as it complements commercial industry developments and promotes the application of systems engineering methodology...

  10. Contractor Software Charges

    National Research Council Canada - National Science Library

    Granetto, Paul

    1994-01-01

    .... Examples of computer software costs that contractors charge through indirect rates are material management systems, security systems, labor accounting systems, and computer-aided design and manufacturing...

  11. Decentralized Software Architecture

    National Research Council Canada - National Science Library

    Khare, Rohit

    2002-01-01

    .... While the term "decentralization" is familiar from political and economic contexts, it has been applied extensively, if indiscriminately, to describe recent trends in software architecture towards...

  12. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  13. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    Full Text Available In this paper, the definition and the key features of Software Radar, which is a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar and Intelligent Radar, and the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open architecture that decouples application software from low-level hardware can readily adopt a "user requirements-oriented" development methodology instead of the traditional "specific function-oriented" one. Compared with traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, which has been developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are given in the conclusion.
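    As an illustration of the decoupling the abstract advocates, the following is a minimal Python sketch of application-level processing written against an abstract signal source instead of a concrete receiver; the class and method names are assumptions and do not reflect the RadarLab 2.0 API.

```python
# Minimal sketch of decoupling application software from low-level
# hardware behind a common interface, in the spirit of the Software
# Radar concept; names and the processing chain are illustrative only.
from abc import ABC, abstractmethod
from typing import List

class SignalSource(ABC):
    """Abstract front end: application code depends only on this."""
    @abstractmethod
    def acquire(self, n_samples: int) -> List[float]: ...

class SimulatedReceiver(SignalSource):
    """Stand-in for a real digital receiver module."""
    def acquire(self, n_samples: int) -> List[float]:
        return [0.0] * n_samples

def detect_targets(source: SignalSource) -> int:
    samples = source.acquire(1024)
    # Placeholder processing; a real chain would filter,
    # pulse-compress and threshold here.
    return sum(1 for s in samples if abs(s) > 0.5)

if __name__ == "__main__":
    print(detect_targets(SimulatedReceiver()))
```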

  14. Systems and software variability management concepts, tools and experiences

    CERN Document Server

    Capilla, Rafael; Kang, Kyo-Chul

    2013-01-01

    The success of product line engineering techniques in the last 15 years has popularized the use of software variability as a key modeling approach for describing the commonality and variability of systems at all stages of the software lifecycle. Software product lines enable a family of products to share a common core platform, while allowing product-specific functionality to be built on top of the platform. Many companies have exploited the concept of software product lines to increase the resources that focus on highly differentiating functionality and thus improve their competitiveness

  15. Design and implementation of Skype USB user gateway software

    Science.gov (United States)

    Qi, Yang

    2017-08-01

    With the widespread application of VoIP, clients with private protocols have become more and more popular. Skype is one of the representatives. How to connect Skype to the PSTN using only the Skype client has gradually become a hot topic. This paper designs and implements software based on a kind of USB user gateway. With the software, a Skype user can communicate freely with a PSTN phone. An FSM is designed as the core of the software, and Skype control is separated from the USB gateway control. In this way, the communication becomes more flexible and efficient. In actual user testing, the software obtained good results.
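    Because the paper names an FSM as the core of the software, a minimal Python sketch of a call-handling state machine is given below; the states and events are assumed for illustration and do not reproduce the authors' implementation.

```python
# Minimal call-handling state machine in the spirit of the paper's
# FSM-centred design; states and events are illustrative assumptions.
TRANSITIONS = {
    ("idle",    "offhook"):       "dialing",
    ("dialing", "skype_connect"): "in_call",
    ("dialing", "onhook"):        "idle",
    ("in_call", "onhook"):        "idle",
}

class CallFSM:
    def __init__(self):
        self.state = "idle"

    def handle(self, event: str) -> str:
        # Unknown (state, event) pairs leave the state unchanged.
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

if __name__ == "__main__":
    fsm = CallFSM()
    for ev in ("offhook", "skype_connect", "onhook"):
        print(ev, "->", fsm.handle(ev))
```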

  16. Social software in global software development

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2010-01-01

    variety of tools such as: instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, virtual worlds. Though normally rather belonging to the private realm, the use of social software in corporate context has been reported, e.g. as a way...

  17. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960's there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependant on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  18. Systems, methods and apparatus for developing and maintaining evolving systems with software product lines

    Science.gov (United States)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Pena, Joaquin (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which an evolutionary system is managed and viewed as a software product line. In some embodiments, the core architecture is a relatively unchanging part of the system, and each version of the system is viewed as a product from the product line. Each software product is generated from the core architecture with some agent-based additions. The result may be a multi-agent system software product line.

  19. Multicore Considerations for Legacy Flight Software Migration

    Science.gov (United States)

    Vines, Kenneth; Day, Len

    2013-01-01

    In this paper we will discuss potential benefits and pitfalls when considering a migration from an existing single core code base to a multicore processor implementation. The results of this study present options that should be considered before migrating fault managers, device handlers and tasks with time-constrained requirements to a multicore flight software environment. Possible future multicore test bed demonstrations are also discussed.

  20. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    Full Text Available within the Namibian context. An implication for software project managers and software developers is that customer interaction should be properly managed to ensure that the software methodologies for improving software development processes...

  1. Marketing Mix del Software.

    Directory of Open Access Journals (Sweden)

    Yudith del Carmen Rodríguez Pérez

    2006-03-01

    For this reason, this paper defines the concept of a software product, characterizes it, and sets out its quality attributes. In addition, it addresses the marketing mix that software requires, which differs from that of other products, for software to succeed in the market.

  2. Sustainability in Software Engineering

    NARCIS (Netherlands)

    Wolfram, N.J.E.; Lago, P.; Osborne, Francesco

    2017-01-01

    The intersection between software engineering research and issues related to sustainability and green IT has been the subject of increasing attention. In spite of that, we observe that sustainability is still not clearly defined, or understood, in the field of software engineering. This lack of

  3. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.

    1992-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be

  4. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.; Heemstra, F.J.

    1993-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be

  5. Software engineering ethics

    Science.gov (United States)

    Bown, Rodney L.

    1991-01-01

    Software engineering ethics is reviewed. The following subject areas are covered: lack of a system viewpoint; arrogance of PC DOS software vendors; violation of upward compatibility; internet worm; internet worm revisited; student cheating and company hiring interviews; computing practitioners and the commodity market; new projects and old programming languages; schedule and budget; and recent public domain comments.

  6. Computer Software Reviews.

    Science.gov (United States)

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  7. Software product family evaluation

    NARCIS (Netherlands)

    van der Linden, F; Bosch, J; Kamsties, E; Kansala, K; Krzanik, L; Obbink, H; VanDerLinden, F

    2004-01-01

    This paper proposes a 4-dimensional software product family engineering evaluation model. The 4 dimensions relate to the software engineering concerns of business, architecture, organisation and process. The evaluation model is meant to be used within organisations to determine the status of their

  8. Selecting the Right Software.

    Science.gov (United States)

    Shearn, Joseph

    1987-01-01

    Selection of administrative software requires analyzing present needs and, to meet future needs, choosing software that will function with a more powerful computer system. Other important factors to include are a professional system demonstration, maintenance and training, and financial considerations that allow leasing or renting alternatives.…

  9. Trends in software testing

    CERN Document Server

    Mohanty, J; Balakrishnan, Arunkumar

    2017-01-01

    This book is focused on the advancements in the field of software testing and the innovative practices that the industry is adopting. Considering the widely varied nature of software testing, the book addresses contemporary aspects that are important for both academia and industry. There are dedicated chapters on seamless high-efficiency frameworks, automation on regression testing, software by search, and system evolution management. There are a host of mathematical models that are promising for software quality improvement by model-based testing. There are three chapters addressing this concern. Students and researchers in particular will find these chapters useful for their mathematical strength and rigor. Other topics covered include uncertainty in testing, software security testing, testing as a service, test technical debt (or test debt), disruption caused by digital advancement (social media, cloud computing, mobile application and data analytics), and challenges and benefits of outsourcing. The book w...

  10. Revisiting software ecosystems research

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2016-01-01

    ‘Software ecosystems’ is argued to have first appeared as a concept more than 10 years ago, and software ecosystem research started to take off in 2010. We conduct a systematic literature study, based on the most extensive literature review in the field to date, with two primary aims: (a) to provide...... an updated overview of the field and (b) to document evolution in the field. In total, we analyze 231 papers from 2007 until 2014 and provide an overview of the research in software ecosystems. Our analysis reveals a field that is rapidly growing both in volume and empirical focus while becoming more mature...... from evolving. We propose means for future research and the community to address them. Finally, our analysis shapes the view of the field having evolved outside the existing definitions of software ecosystems, and we thus propose updating the definition of software ecosystems....

  11. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  12. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what...... makes these countries an attractive host for software companies. Often, high-quality engineering and excellent quality of products, e.g., machinery and equipment, are mentioned. Yet, the question is: Can such arguments be also found for the software industry? We aim at investigating the degree...... of professionalism and systematization of software development to draw a map of strengths and weaknesses. To this end, we conducted as a first step an exploratory survey in Germany, presented in this paper. In this survey, we focused on the perceived importance of the two general software engineering process areas...

  13. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution....... The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss...... one particular framework named Tran SAT, which addresses the above problems of software architecture evolution. Tran SAT provides a new element in the software architecture descriptions language, called an architectural aspect, for describing new concerns and their integration into an existing...

  14. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD) the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  15. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Full Text Available Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is through the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third part discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  16. Software licenses: Stay honest!

    CERN Multimedia

    Computer Security Team

    2012-01-01

    Do you recall our article about copyright violation in the last issue of the CERN Bulletin, “Music, videos and the risk for CERN”? Now let’s be more precise. “Violating copyright” not only means the illegal download of music and videos, it also applies to software packages and applications.   Users must respect proprietary rights in compliance with the CERN Computing Rules (OC5). Not having legitimately obtained a program or the required licenses to run that software is not a minor offense. It violates CERN rules and puts the Organization at risk! Vendors deserve credit and compensation. Therefore, make sure that you have the right to use their software. In other words, you have bought the software via legitimate channels and use a valid and honestly obtained license. This also applies to “Shareware” and software under open licenses, which might also come with a cost. Usually, only “Freeware” is complete...

  17. Designing Scientific Software for Heterogeneous Computing

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig

    , algorithms and data structures must be designed to utilize the underlying parallel architecture. The architectural changes in hardware design within the last decade, from single to multi and many-core architectures, require software developers to identify and properly implement methods that both exploit...... makes parallel software design applicable, but also a challenge for scientific software developers at all levels. We have developed a generic C++ library for fast prototyping of large-scale PDEs solvers based on flexible-order finite difference approximations on structured regular grids. The library...... is designed with a high abstraction interface to improve developer productivity. The library is based on modern template-based design concepts as described in Glimberg, Engsig-Karup, Nielsen & Dammann (2013). The library utilizes heterogeneous CPU/GPU environments in order to maximize computational throughput...
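    As a small illustration of the stencil operations such finite difference libraries are built on, the following NumPy sketch applies a second-order central difference on a regular grid; it is only a toy stand-in for the C++/GPU library described in the thesis.

```python
# Second-order central-difference approximation of u'' on a regular grid,
# the kind of stencil such PDE libraries are built around; this NumPy
# sketch ignores the heterogeneous CPU/GPU aspects of the actual library.
import numpy as np

def second_derivative(u: np.ndarray, dx: float) -> np.ndarray:
    d2u = np.zeros_like(u)
    d2u[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    return d2u  # boundary points left at zero for simplicity

if __name__ == "__main__":
    x = np.linspace(0.0, np.pi, 201)
    u = np.sin(x)
    # For u = sin(x), u'' = -sin(x), so the interior residual should be small.
    err = np.max(np.abs(second_derivative(u, x[1] - x[0])[1:-1] + u[1:-1]))
    print(f"max interior error: {err:.2e}")  # roughly O(dx^2)
```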

  18. Is Scrum fit for global software engineering?

    DEFF Research Database (Denmark)

    Lous, Pernille; Kuhrmann, Marco; Tell, Paolo

    2017-01-01

    Distributed software engineering and agility are strongly pushing on today's software industry. Due to inherent incompatibilities, for years, studying Scrum and its application in distributed setups has been subject to theoretical and applied research, and an increasing body of knowledge reports...... insights into this combination. Through a systematic literature review, this paper contributes a collection of experiences on the application of Scrum to global software engineering (GSE). In total, we identified 40 challenges in 19 categories practitioners face when using Scrum in GSE. Among...... the challenges, scaling Scrum to GSE and adopting practices accordingly are the most frequently named. Our findings also show that most solution proposals aim at modifying elements of the Scrum core processes. We thus conclude that, even though Scrum allows for extensive modification, Scrum itself represents...

  19. Core Hunter 3: flexible core subset selection.

    Science.gov (United States)

    De Beukelaer, Herman; Davenport, Guy F; Fack, Veerle

    2018-05-31

    Core collections provide genebank curators and plant breeders a way to reduce size of their collections and populations, while minimizing impact on genetic diversity and allele frequency. Many methods have been proposed to generate core collections, often using distance metrics to quantify the similarity of two accessions, based on genetic marker data or phenotypic traits. Core Hunter is a multi-purpose core subset selection tool that uses local search algorithms to generate subsets relying on one or more metrics, including several distance metrics and allelic richness. In version 3 of Core Hunter (CH3) we have incorporated two new, improved methods for summarizing distances to quantify diversity or representativeness of the core collection. A comparison of CH3 and Core Hunter 2 (CH2) showed that these new metrics can be effectively optimized with less complex algorithms, as compared to those used in CH2. CH3 is more effective at maximizing the improved diversity metric than CH2, still ensures a high average and minimum distance, and is faster for large datasets. Using CH3, a simple stochastic hill-climber is able to find highly diverse core collections, and the more advanced parallel tempering algorithm further increases the quality of the core and further reduces variability across independent samples. We also evaluate the ability of CH3 to simultaneously maximize diversity, and either representativeness or allelic richness, and compare the results with those of the GDOpt and SimEli methods. CH3 can sample equally representative cores as GDOpt, which was specifically designed for this purpose, and is able to construct cores that are simultaneously more diverse, and either are more representative or have higher allelic richness, than those obtained by SimEli. In version 3, Core Hunter has been updated to include two new core subset selection metrics that construct cores for representativeness or diversity, with improved performance. It combines and outperforms the
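    As an illustration of the stochastic hill-climbing idea mentioned above, the following Python sketch swaps accessions in and out of a candidate core and keeps swaps that improve average pairwise distance; the metric and data are toy stand-ins, not Core Hunter's actual objectives or code.

```python
# Illustrative stochastic hill-climber for core subset selection:
# swap one accession in and one out, keep swaps that improve average
# pairwise distance. A toy stand-in for Core Hunter, not its code.
import itertools
import random

def avg_pairwise_distance(core, dist):
    pairs = list(itertools.combinations(core, 2))
    return sum(dist[i][j] for i, j in pairs) / len(pairs)

def hill_climb(dist, core_size, steps=2000, seed=0):
    rng = random.Random(seed)
    n = len(dist)
    core = set(rng.sample(range(n), core_size))
    best = avg_pairwise_distance(core, dist)
    for _ in range(steps):
        out_id = rng.choice(sorted(core))
        in_id = rng.choice([i for i in range(n) if i not in core])
        candidate = (core - {out_id}) | {in_id}
        score = avg_pairwise_distance(candidate, dist)
        if score > best:          # accept only improving swaps
            core, best = candidate, score
    return sorted(core), best

if __name__ == "__main__":
    # Random symmetric distance matrix standing in for marker-based distances.
    rng = random.Random(1)
    n = 30
    dist = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            dist[i][j] = dist[j][i] = rng.random()
    print(hill_climb(dist, core_size=8))
```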

  20. Software for physical start-up console

    International Nuclear Information System (INIS)

    Arbet, L.; Suchy, R.

    1991-01-01

    The physical start-up console comprises a PC AT-based control unit equipped with an 80386 processor, and information input/output units. The basic functions to be fulfilled by the control unit software include data acquisition related to the following parameters: neutron physics properties of the reactor core (neutron fluxes recorded by ionization chambers and reactivity recorded by a digital reactimeter), positions of the reactor core control elements (by the digital position meter) and reactor core control measurements, and technological quantities requisite for evaluating physical start-up tests. The measured and calculated data are shown on the control unit display. The setup of the data acquisition system and of user programs is dealt with, and characteristics of the user processes are briefly described. (Z.S.)

  1. k-core covers and the core

    NARCIS (Netherlands)

    Sanchez-Rodriguez, E.; Borm, Peter; Estevez-Fernandez, A.; Fiestras-Janeiro, G.; Mosquera, M.A.

    This paper extends the notion of individual minimal rights for a transferable utility game (TU-game) to coalitional minimal rights using minimal balanced families of a specific type, thus defining a corresponding minimal rights game. It is shown that the core of a TU-game coincides with the core of
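    For reference, the core of a TU-game (N, v) referred to in this record is conventionally the set of efficient payoff vectors that no coalition can improve upon; the formula below is the standard textbook definition, not one taken from the cited paper.

```latex
% Standard definition of the core of a TU-game (N, v).
\[
  C(v) \;=\; \Bigl\{\, x \in \mathbb{R}^{N} \;\Bigm|\;
     \sum_{i \in N} x_i = v(N), \quad
     \sum_{i \in S} x_i \ge v(S) \ \text{for all } S \subseteq N
  \,\Bigr\}
\]
```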

  2. Towards an Ontology of Software

    OpenAIRE

    Wang, Xiaowei

    2016-01-01

    Software is permeating every aspect of our personal and social life. And yet, the cluster of concepts around the notion of software, such as the notions of a software product, software requirements, and software specifications, is still poorly understood, with no consensus on the horizon. For many, software is just code, something intangible best defined in contrast with hardware, but this is not particularly illuminating. This erroneous notion, that software is just code, presents both in the ontology ...

  3. Microprocessor-based integrated LMFBR core surveillance

    International Nuclear Information System (INIS)

    Gmeiner, L.

    1984-06-01

    This report results from a joint study of KfK and INTERATOM. The aim of this study is to explore the advantages of microprocessors and microelectronics for a more sophisticated core surveillance, which is based on the integration of separate surveillance techniques. Due to new developments in microelectronics and related software an approach to LMFBR core surveillance can be conceived that combines a number of measurements into a more intelligent decision-making data processing system. The following techniques are considered to contribute essentially to an integrated core surveillance system: - subassembly state and thermal hydraulics performance monitoring, - temperature noise analysis, - acoustic core surveillance, - failure characterization and failure prediction based on DND- and cover gas signals, and - flux tilting techniques. Starting from a description of these techniques it is shown that by combination and correlation of these individual techniques a higher degree of cost-effectiveness, reliability and accuracy can be achieved. (orig./GL) [de
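    As a toy illustration of how several such channels might be fused into one decision, the following Python sketch combines per-technique anomaly scores with fixed weights; the channel names follow the list above, but the weights and threshold are purely hypothetical and not taken from the report.

```python
# Toy illustration of combining independent surveillance channels
# (thermal-hydraulic, temperature noise, acoustic, DND/cover gas) into
# one decision; weights and threshold are hypothetical, not from the report.
from typing import Dict

SURVEILLANCE_WEIGHTS = {
    "thermal_hydraulics": 0.4,
    "temperature_noise": 0.2,
    "acoustic": 0.2,
    "dnd_cover_gas": 0.2,
}

def combined_anomaly_score(channel_scores: Dict[str, float]) -> float:
    """Weighted fusion of per-channel anomaly scores in [0, 1]."""
    return sum(SURVEILLANCE_WEIGHTS[name] * score
               for name, score in channel_scores.items())

def alarm(channel_scores: Dict[str, float], threshold: float = 0.6) -> bool:
    return combined_anomaly_score(channel_scores) >= threshold

if __name__ == "__main__":
    print(alarm({"thermal_hydraulics": 0.9, "temperature_noise": 0.3,
                 "acoustic": 0.5, "dnd_cover_gas": 0.4}))
```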

  4. LDUA software custodian's notebook

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements

  5. Software Formal Inspections Guidebook

    Science.gov (United States)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  6. Evidence synthesis software.

    Science.gov (United States)

    Park, Sophie Elizabeth; Thomas, James

    2018-06-07

    It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways; here, a range of review conditions and software solutions is considered, for example facilitating contemporaneous collaboration across time and geographical space, in-built bias assessment tools, and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer has text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released. The EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence, using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews.

  7. Factors negatively influencing knowledge sharing in software development

    Directory of Open Access Journals (Sweden)

    Lucas T. Khoza

    2017-07-01

    Objective: This study seeks to identify factors that negatively influence knowledge sharing in software development in the developing country context. Method: Expert sampling as a subcategory of purposive sampling was employed to extract information, views and opinions from experts in the field of information and communication technology, more specifically from those who are involved in software development projects. Four Johannesburg-based software developing organisations listed on the Johannesburg Stock Exchange (JSE), South Africa, participated in this research study. Quantitative data were collected using an online questionnaire with closed-ended questions. Results: Findings of this research reveal that job security, motivation, time constraints, physiological factors, communication, resistance to change and rewards are core factors negatively influencing knowledge sharing in software developing organisations. Conclusions: Improved understanding of factors negatively influencing knowledge sharing is expected to assist software developing organisations in closing the gap for software development projects failing to meet the triple constraint of time, cost and scope.

  8. COSINE software development based on code generation technology

    International Nuclear Information System (INIS)

    Ren Hao; Mo Wentao; Liu Shuo; Zhao Guang

    2013-01-01

    Code generation technology can significantly improve the quality and productivity of software development and reduce software development risk. At present, code generators are usually based on UML model-driven technology, which cannot satisfy the development demands of nuclear power calculation software. In this paper, the features of scientific computing programs were analyzed and a FORTRAN code generator (FCG) based on C# was developed. FCG can automatically generate module variable definition FORTRAN code according to input metadata. FCG can also generate memory allocation interfaces for dynamic variables as well as data access interfaces. FCG was applied to the core and system integrated engine for design and analysis (COSINE) software development. The result shows that FCG can greatly improve the development efficiency of nuclear power calculation software and reduce the defect rate of software development. (authors)
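    To illustrate the metadata-driven generation idea, the following minimal Python sketch emits FORTRAN module variable declarations from a small metadata list; the real FCG is written in C# and its metadata format is not described here, so the layout below is an assumption.

```python
# Minimal sketch of metadata-driven FORTRAN declaration generation,
# the idea behind FCG; the metadata layout and names are assumptions.
VARIABLES = [
    {"name": "power", "type": "real(8)", "dims": "(:)", "allocatable": True},
    {"name": "n_nodes", "type": "integer", "dims": "", "allocatable": False},
]

def generate_module(name, variables):
    """Emit a FORTRAN module with variable declarations from metadata."""
    lines = [f"module {name}", "  implicit none"]
    for v in variables:
        attr = ", allocatable" if v["allocatable"] else ""
        lines.append(f"  {v['type']}{attr} :: {v['name']}{v['dims']}")
    lines.append(f"end module {name}")
    return "\n".join(lines)

if __name__ == "__main__":
    print(generate_module("core_state", VARIABLES))
```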

  9. Data Acquisition Backbone Core DABC

    International Nuclear Information System (INIS)

    Adamczewski, J; Essel, H G; Kurz, N; Linev, S

    2008-01-01

    For the new experiments at FAIR, new concepts of data acquisition systems have to be developed, such as the distribution of self-triggered, time-stamped data streams over high-performance networks for event building. The Data Acquisition Backbone Core (DABC) is a software package currently under development for FAIR detector tests, readout component tests, and data flow investigations. All kinds of data channels (front-end systems) are connected by program plug-ins to functional components of DABC such as data input, combiner, scheduler, event builder, analysis and storage components. After detailed simulations, real tests of event building over a switched network (InfiniBand clusters with up to 110 nodes) have been performed. With the DABC software more than 900 MByte/s input and output per node can be achieved, meeting the most demanding requirements. The software is ready for the implementation of various test beds needed for the final design of data acquisition systems at FAIR. The development of key components is supported by the FutureDAQ project of the European Union (FP6 I3HP JRA1)
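    As a rough illustration of the plug-in composition described above, the following Python sketch registers processing components by stage name and chains them into a pipeline; the stage names follow the abstract, but the registration API is invented here and is not DABC's.

```python
# Illustrative plug-in registry in the spirit of DABC's component model
# (data input, event builder, ...); the registration API is invented.
from typing import Callable, Dict, List

COMPONENTS: Dict[str, Callable[[List[int]], List[int]]] = {}

def register(stage: str):
    """Register a processing plug-in under a pipeline stage name."""
    def decorator(func):
        COMPONENTS[stage] = func
        return func
    return decorator

@register("input")
def read_frontend(_data):
    return [1, 2, 3, 4]            # stand-in for front-end readout

@register("event_builder")
def build_events(data):
    return [sum(data)]             # stand-in for event building

def run_pipeline(stages):
    data: List[int] = []
    for stage in stages:
        data = COMPONENTS[stage](data)
    return data

if __name__ == "__main__":
    print(run_pipeline(["input", "event_builder"]))
```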

  10. Beginning software engineering

    CERN Document Server

    Stephens, Rod

    2015-01-01

    Beginning Software Engineering demystifies the software engineering methodologies and techniques that professional developers use to design and build robust, efficient, and consistently reliable software. Free of jargon and assuming no previous programming, development, or management experience, this accessible guide explains important concepts and techniques that can be applied to any programming language. Each chapter ends with exercises that let you test your understanding and help you elaborate on the chapter's main concepts. Everything you need to understand waterfall, Sashimi, agile, RAD, Scrum, Kanban, Extreme Programming, and many other development models is inside!

  11. Software industrial flexible

    OpenAIRE

    Díaz Araya, Daniel; Muñoz, Leandro; Sirerol, Daniel; Oviedo, Sandra; Ibáñez, Francisco S.

    2012-01-01

    This work aims to investigate and propose techniques, methods and technologies that enable the development of flexible software in industrial environments. The objective is to generate methods and techniques that facilitate the development of flexible software in industrial environments. The research areas are production scheduling systems, the generation of software for open hardware platforms, and innovation.

  12. Thyroid uptake software

    International Nuclear Information System (INIS)

    Alonso, Dolores; Arista, Eduardo

    2003-01-01

    The DETEC-PC software was developed as a complement to a measurement system (hardware) able to perform iodine thyroid uptake studies. The software was designed according to the principles of object-oriented programming using the C++ language. The software automatically fixes spectrometric measurement parameters and, besides patient measurement, also performs statistical analysis of a batch of samples. It includes a PARADOX database with all information on measured patients and a help system covering the system options and the medical concepts related to the thyroid uptake study

  13. Criteria for software modularization

    Science.gov (United States)

    Card, David N.; Page, Gerald T.; Mcgarry, Frank E.

    1985-01-01

    A central issue in programming practice involves determining the appropriate size and information content of a software module. This study attempted to determine the effectiveness of two widely used criteria for software modularization, strength and size, in reducing fault rate and development cost. Data from 453 FORTRAN modules developed by professional programmers were analyzed. The results indicated that module strength is a good criterion with respect to fault rate, whereas arbitrary module size limitations inhibit programmer productivity. This analysis is a first step toward defining empirically based standards for software modularization.

  14. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  15. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    SOFTWARE, LIKE ALL industry products, is the result of complex multinational supply chains with many partners from concept to development to production and maintenance. Global software engineering (GSE), IT outsourcing, and business process outsourcing during the past decade have showed growth...... rates of 10 to 20 percent per year. This instalment of Practitioner’s Digest summarizes experiences and guidance from industry to facilitate knowledge and technology transfer for GSE. It’s based on industry feedback from the annual IEEE International Conference on Global Software Engineering, which had...

  16. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-01-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. In the previous paper the author has already discussed the basics of microprogramming and has studied in some detail two types of new microcircuits. In this paper, methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogrammed circuit itself. (Auth.)

  17. Guide to software export

    CERN Document Server

    Philips, Roger A

    2014-01-01

    An ideal reference source for CEOs, marketing and sales managers, sales consultants, and students of international marketing, Guide to Software Export provides a step-by-step approach to initiating or expanding international software sales. It teaches you how to examine critically your candidate product for exportability; how to find distributors, agents, and resellers abroad; how to identify the best distribution structure for export; and much, much more! Not content with providing just the guidelines for setting up, expanding, and managing your international sales channels, Guide to Software

  18. Software takes command

    CERN Document Server

    Manovich, Lev

    2013-01-01

    Software has replaced a diverse array of physical, mechanical, and electronic technologies used before the 21st century to create, store, distribute and interact with cultural artifacts. It has become our interface to the world, to others, to our memory and our imagination - a universal language through which the world speaks, and a universal engine on which the world runs. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. Offering the first theoretical and historical account of software for media authoring and its effects on the prac

  19. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  20. Sobre software libre

    OpenAIRE

    Matellán Olivera, Vicente; González Barahona, Jesús; Heras Quirós, Pedro de las; Robles Martínez, Gregorio

    2004-01-01

    220 p. "Sobre software libre" brings together almost thirty essays on highly topical issues related to free software (of which Linux is the best-known exponent). The essays the reader will find are divided into thematic blocks ranging from intellectual property and the economic and social questions of this model to its use in education and public administration, including one that reviews the history of free software in...

  1. Software platform virtualization in chemistry research and university teaching.

    Science.gov (United States)

    Kind, Tobias; Leamy, Tim; Leary, Julie A; Fiehn, Oliver

    2009-11-16

    Modern chemistry laboratories operate with a wide range of software applications under different operating systems, such as Windows, LINUX or Mac OS X. Instead of installing software on different computers it is possible to install those applications on a single computer using Virtual Machine software. Software platform virtualization allows a single guest operating system to execute multiple other operating systems on the same computer. We apply and discuss the use of virtual machines in chemistry research and teaching laboratories. Virtual machines are commonly used for cheminformatics software development and testing. Benchmarking multiple chemistry software packages we have confirmed that the computational speed penalty for using virtual machines is low and around 5% to 10%. Software virtualization in a teaching environment allows faster deployment and easy use of commercial and open source software in hands-on computer teaching labs. Software virtualization in chemistry, mass spectrometry and cheminformatics is needed for software testing and development of software for different operating systems. In order to obtain maximum performance the virtualization software should be multi-core enabled and allow the use of multiprocessor configurations in the virtual machine environment. Server consolidation, by running multiple tasks and operating systems on a single physical machine, can lead to lower maintenance and hardware costs especially in small research labs. The use of virtual machines can prevent software virus infections and security breaches when used as a sandbox system for internet access and software testing. Complex software setups can be created with virtual machines and are easily deployed later to multiple computers for hands-on teaching classes. We discuss the popularity of bioinformatics compared to cheminformatics as well as the missing cheminformatics education at universities worldwide.

  2. Intellectual Property Protection of Software – At the Crossroads of Software Patents and Open Source Software

    OpenAIRE

    Tantarimäki, Maria

    2018-01-01

    The thesis considers the intellectual property protection of software in Europe and in the US, which is an increasingly important subject as the world is globalizing and digitalizing. The special nature of software challenges intellectual property rights. The current protection of software is based on copyright protection, but in this thesis two other options are considered: software patents and open source software. Software patents provide strong protection for software whereas the pur...

  3. Center for Adaptive Optics | Software

    Science.gov (United States)

    The Center for Adaptive Optics acts as a clearing house for distributing software to institutes; it gives specialists in adaptive optics a place to distribute their software. All software is shared on an "as-is" basis and users should consult the software authors with any

  4. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    Seo, Yong Seok; Seong, Seung Hwan; Park, Keun Ok; Hur, Sub; Kim, Dong Hoon

    2001-01-01

    In order to achieve high quality in SMART MMIS software, a well-constructed software testing concept is required. This paper establishes the software testing concept to be applied to SMART MMIS software in terms of software testing organization, documentation, procedure, and methods. The software testing methods are classified into source code static analysis and dynamic testing. The dynamic testing methods are discussed from two aspects: white-box and black-box testing. As the software testing concept introduced in this paper is applied to the SMART MMIS software, high-quality software will be produced. In the future, software failure data will be collected through the construction of a SMART MMIS prototyping facility to which the software testing concept of this paper is applied

  5. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with color graphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  6. Petroleum software profiles

    International Nuclear Information System (INIS)

    Anon.

    1996-01-01

    A profile of twenty-two software packages designed for petroleum exploration and production was provided. Some focussed on the oil and gas engineering industry, and others on mapping systems containing well history files and well data summaries. Still other programs provided accounting systems designed to address the complexities of the oil and gas industry. The software packages reviewed were developed by some of the best-known groups involved in software development for the oil and gas industry, including among others, Geoquest, the Can Tek Group, Applied Terravision Systems Inc., Neotechnology Consultants Ltd., OGCI Software Inc., Oracle Energy, Production Revenue Information Systems Management, Virtual Computing Services Ltd., and geoLogic Systems Ltd

  7. Next Generation Software Development

    National Research Council Canada - National Science Library

    Manna, Zohar

    2005-01-01

    Under this grant we have studied the development of a scientifically sound basis for software development that builds on widely used pragmatic methods but is firmly grounded in well-established formal...

  8. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management...... of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but it also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand...... the management of distributed software projects, but it also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques into a framework for managing risks in distributed contexts. This framework was developed iteratively...

  9. eSoftwareList

    Data.gov (United States)

    US Agency for International Development — USAID Software Database reporting tool created in Oracle Application Express (APEX). This version provides read only access to a database view of the JIRA SAR...

  10. Software didattico: integrazione scolastica

    Directory of Open Access Journals (Sweden)

    Lucia Ferlino

    1996-01-01

    Full Text Available A discussion of the use of educational software for school integration, which requires awareness of its potential effectiveness and recognition that effectiveness also depends on the choice of functional products.

  11. Tier2 Submit Software

    Science.gov (United States)

    Download this tool for Windows or Mac, which helps facilities prepare a Tier II electronic chemical inventory report. The data can also be exported into the CAMEOfm (Computer-Aided Management of Emergency Operations) emergency planning software.

  12. SEER Data & Software

    Science.gov (United States)

    Options for accessing datasets for incidence, mortality, county populations, standard populations, expected survival, and SEER-linked and specialized data. Plus variable definitions, documentation for reporting and using datasets, statistical software (SEER*Stat), and observational research resources.

  13. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines...... the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides...... essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  14. Software for nuclear spectrometry

    International Nuclear Information System (INIS)

    1998-10-01

    The Advisory Group Meeting (AGM) on Software for Nuclear Spectrometry was dedicated to reviewing the present status of software for nuclear spectrometry and to advising on future activities in this field. Because similar AGMs and consultants' meetings had been held in the past, and in an attempt to be more streamlined, this AGM was devoted to the specific field of software for gamma-ray spectrometry. Nevertheless, many of the issues discussed and the recommendations made are of general concern for any software on nuclear spectrometry. The report is organized in sections. The 'Summary' gives the conclusions and recommendations adopted at the AGM. These conclusions and recommendations resulted from the discussions held during and after the presentations of the scientific and technical papers. The papers are reproduced in full in the following sections

  15. Software for radiation protection

    International Nuclear Information System (INIS)

    Graffunder, H.

    2002-01-01

    The software products presented are universally usable programs for radiation protection. The systems were designed to establish a comprehensive database specific to radiation protection and, on this basis, to model radiation protection subjects in programs. Development initially focused on the creation of the database. Each software product was to access the same nuclide-specific data; input errors and differences in spelling were to be excluded from the outset. This makes the products more compatible with one another and able to exchange data with one another. The software products are modular in design. Functions recurring in radiation protection are always treated the same way in different programs, and are also represented the same way on the program interface. The recognition effect makes it easy for users to become familiar with the products quickly. All software products are written in German and are tailored to the administrative needs and the codes and regulations in Germany and Switzerland. (orig.) [de

  16. ITSY Handheld Software Radio

    National Research Council Canada - National Science Library

    Bose, Vanu

    2001-01-01

    .... A handheld software radio platform would enable the construction of devices that could inter-operate with multiple legacy systems, download new waveforms and be used to construct adhoc networks...

  17. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suite for automatically developing ultra-reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production-quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer-aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  18. Conceptual Models Core to Good Design

    CERN Document Server

    Johnson, Jeff

    2011-01-01

    People make use of software applications in their activities, applying them as tools in carrying out tasks. That this use should be good for people--easy, effective, efficient, and enjoyable--is a principal goal of design. In this book, we present the notion of Conceptual Models, and argue that Conceptual Models are core to achieving good design. From years of helping companies create software applications, we have come to believe that building applications without Conceptual Models is just asking for designs that will be confusing and difficult to learn, remember, and use. We show how Concept

  19. MARS software package status

    International Nuclear Information System (INIS)

    Azhgirej, I.L.; Talanov, V.V.

    2000-01-01

    The MARS software package is intended for simulating nuclear-electromagnetic cascades and the transport of secondary neutrons and muons in heterogeneous media of arbitrary complexity in the presence of magnetic fields. The package takes an inclusive approach to describing particle production in nuclear and electromagnetic interactions and in the decay of unstable particles. The MARS software package has been actively applied to solving various radiation physics problems [ru

  20. MAGIC user's group software

    International Nuclear Information System (INIS)

    Warren, G.; Ludeking, L.; McDonald, J.; Nguyen, K.; Goplen, B.

    1990-01-01

    The MAGIC User's Group has been established to facilitate the use of electromagnetic particle-in-cell software by universities, government agencies, and industrial firms. The software consists of a series of independent executables that are capable of inter-communication. MAGIC, SOS, and μSOS are used to perform electromagnetic simulations, while POSTER is used to provide post-processing capabilities. Each is described in the paper. Use of the codes for Klystrode simulation is discussed

  1. Global software development

    DEFF Research Database (Denmark)

    Matthiesen, Stina

    2016-01-01

    This overview presents the mid stages of my doctoral research, based on ethnographic work conducted in IT companies in India and in Denmark, on collaborative work within global software development (GSD). In the following I briefly introduce how this research seeks to spark a debate in CSCW...... by challenging contemporary ideals about software development outsourcing through the exploration of the multiplicities and asymmetric dynamics inherent in the collaborative work of GSD....

  2. Principles of Antifragile Software

    OpenAIRE

    Monperrus, Martin

    2014-01-01

    The goal of this paper is to study and define the concept of "antifragile software". For this, I start from Taleb's statement that antifragile systems love errors, and discuss whether traditional software dependability fits into this class. The answer is somewhat negative, although adaptive fault tolerance is antifragile: the system learns something when an error happens, and always improves. Automatic runtime bug fixing is changing the code in response to errors, fault injection in productio...

  3. Software product quality measurement

    OpenAIRE

    Godliauskas, Eimantas

    2016-01-01

    This paper analyses Ruby product quality measures, suggesting three new measures for the Ruby product quality measurement tool Rubocop to measure the Ruby product quality characteristics defined in the ISO 2502n standard series. The paper consists of four main chapters. The first chapter gives a brief view of software product quality and software product quality measurement. The second chapter analyses object-oriented quality measures. The third chapter gives a brief view of the most popular Ruby qualit...

  4. Reactor core fuel management

    International Nuclear Information System (INIS)

    Silvennoinen, P.

    1976-01-01

    The subject is covered in chapters, entitled: concepts of reactor physics; neutron diffusion; core heat transfer; reactivity; reactor operation; variables of core management; computer code modules; alternative reactor concepts; methods of optimization; general system aspects. (U.K.)

  5. Nuclear reactor core catcher

    International Nuclear Information System (INIS)

    1977-01-01

    A nuclear reactor core catcher is described for containing debris resulting from an accident causing core meltdown and which incorporates a method of cooling the debris by the circulation of a liquid coolant. (U.K.)

  6. The online simulation of core physics in nuclear power plant

    International Nuclear Information System (INIS)

    Zhao Qiang

    2005-01-01

    The three-dimensional power distribution in the core is one of the most important status variables of a nuclear reactor. In order to monitor the 3-D in-core power distribution in a timely and accurate manner, an online simulation system for core physics was designed in this paper. The system combines core physics simulation with data from the plant and reactor instrumentation. The design of the system consists of a hardware part and a software part. The online simulation system consists of a main simulation computer and a simulation operation station. The online simulation system software comprises the real-time simulation support software, the system communication software, the simulation program and the simulation interface software. A two-group, three-dimensional neutron kinetics model with six groups of delayed neutrons was used in the real-time simulation of the nuclear reactor core physics. According to the characteristics of the nuclear reactor, the core was divided into many nodes. The method of separation of variables was used to solve the neutron equations. The input data from the plant and reactor instrumentation system consist of core thermal power, loop temperatures and pressure, control rod positions, boron concentration, core exit thermocouple data, ex-core detector signals, and in-core flux detector signals. The data serve two purposes: one is to ensure that the model stays as close as possible to the current actual reactor condition, and the other is to calibrate the calculated power distribution. In this paper, the scheme of the online simulation system is introduced. Under the real-time simulation support system, the simulation program is being compiled. Compared with the actual operational data, the preliminary simulation results were reasonable and correct. (author)
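
    For reference, a textbook form of the model class named in the abstract (two-group, time-dependent diffusion with six delayed-neutron precursor groups) can be written as below. This is a generic statement, assuming all prompt and delayed neutrons are born in the fast group; it is not the plant-specific implementation described in the paper.

```latex
% Two-group time-dependent neutron diffusion with six delayed-neutron groups.
% Notation: \phi_1 fast flux, \phi_2 thermal flux, C_i precursor concentrations,
% \beta = \sum_i \beta_i; prompt and delayed neutrons assumed born in group 1.
\begin{align}
\frac{1}{v_1}\frac{\partial \phi_1}{\partial t} &=
  \nabla\!\cdot\! D_1 \nabla \phi_1
  - \left(\Sigma_{a1} + \Sigma_{1\to 2}\right)\phi_1
  + (1-\beta)\left(\nu\Sigma_{f1}\phi_1 + \nu\Sigma_{f2}\phi_2\right)
  + \sum_{i=1}^{6}\lambda_i C_i \\
\frac{1}{v_2}\frac{\partial \phi_2}{\partial t} &=
  \nabla\!\cdot\! D_2 \nabla \phi_2
  - \Sigma_{a2}\,\phi_2
  + \Sigma_{1\to 2}\,\phi_1 \\
\frac{\partial C_i}{\partial t} &=
  \beta_i\left(\nu\Sigma_{f1}\phi_1 + \nu\Sigma_{f2}\phi_2\right)
  - \lambda_i C_i , \qquad i = 1,\dots,6
\end{align}
```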

  7. Global Software Development with Cloud Platforms

    Science.gov (United States)

    Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya

    Offshore and outsourced distributed software development models and processes are facing challenges, previously unknown, with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design and our first implementation results for three cloud forms - a compute cloud, a storage cloud and a cloud-based software service - in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers storage (block or file-based) services with an on-line virtual storage service, whereas the on-line virtual labs represent a useful cloud service. We note some of the use cases for clouds in GSD, the lessons learned with our prototypes and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means of supporting an ecosystem of clients, developers and other key stakeholders.

  8. A Core Language for Separate Variability Modeling

    DEFF Research Database (Denmark)

    Iosif-Lazăr, Alexandru Florin; Wasowski, Andrzej; Schaefer, Ina

    2014-01-01

    Separate variability modeling adds variability to a modeling language without requiring modifications of the language or the supporting tools. We define a core language for separate variability modeling using a single kind of variation point to define transformations of software artifacts in object...... hierarchical dependencies between variation points via copying and flattening. Thus, we reduce a model with intricate dependencies to a flat executable model transformation consisting of simple unconditional local variation points. The core semantics is extremely concise: it boils down to two operational rules...
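
    As a rough illustration of the idea of a single kind of variation point that conditionally transforms an artifact, and of flattening nested points into unconditional local edits, consider the sketch below. The class and function names are hypothetical stand-ins and are not the core language defined in the paper.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class VariationPoint:
    """A single kind of variation point: if `condition` holds for the chosen
    configuration, apply `transform` to the artifact (here, a plain dict)."""
    condition: Callable[[dict], bool]        # predicate over the feature configuration
    transform: Callable[[dict], dict]        # local edit of the software artifact
    children: Optional[List["VariationPoint"]] = None  # nested (hierarchical) points

def flatten(point: VariationPoint, config: dict) -> List[Callable[[dict], dict]]:
    """Resolve hierarchical dependencies by copying enabled transforms into a
    flat list of unconditional local edits (children count only if the parent holds)."""
    if not point.condition(config):
        return []
    edits = [point.transform]
    for child in (point.children or []):
        edits.extend(flatten(child, config))
    return edits

def derive_product(artifact: dict, points: List[VariationPoint], config: dict) -> dict:
    """Apply the flattened, unconditional edits in order to derive one variant."""
    for point in points:
        for edit in flatten(point, config):
            artifact = edit(artifact)
    return artifact

# Example: add a logging component only when the 'logging' feature is selected.
logging_vp = VariationPoint(
    condition=lambda cfg: cfg.get("logging", False),
    transform=lambda art: {**art, "components": art["components"] + ["Logger"]},
)
base = {"components": ["Core"]}
print(derive_product(base, [logging_vp], {"logging": True}))  # {'components': ['Core', 'Logger']}
```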

  9. Seismic core shroud

    International Nuclear Information System (INIS)

    Puri, A.; Mullooly, J.F.

    1981-01-01

    A core shroud is provided, comprising: a coolant boundary, following the shape of the core boundary, for channeling the coolant through the fuel assemblies; a cylindrical band positioned inside the core barrel and surrounding the coolant boundary; and support members extending from the coolant boundary to the band, for transferring load from the coolant boundary to the band. The shroud may be assembled in parts using automated welding techniques, and it may be adjusted to fit the reactor core easily

  10. Examining software complexity and quality for scientific software

    International Nuclear Information System (INIS)

    Kelly, D.; Shepard, T.

    2005-01-01

    Research has not found a simple relationship between software complexity and software quality, and in particular no relationship between commonly used software complexity metrics and the occurrence of software faults. A study with an example of scientific software from the nuclear power industry illustrates the importance of addressing cognitive complexity, the complexity related to understanding the intellectual content of the software. Simple practices such as aptly-named variables contribute more to high quality software than limiting code sizes. This paper examines the research into complexity and quality and reports on a longitudinal study using the example of nuclear software. (author)
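
    The point about aptly-named variables contributing more to quality than small code size can be illustrated with a trivial pair of equivalent functions; the example below is a generic illustration, not drawn from the nuclear code examined in the study.

```python
# Terse but cognitively opaque: short, yet the reader must reverse-engineer the intent.
def f(a, b, c):
    return a * b * c * 0.5

# Slightly longer but self-describing: the intellectual content is visible at a glance.
def wedge_volume(base_width_m, base_length_m, height_m):
    """Volume of a triangular wedge (half of the enclosing box)."""
    return base_width_m * base_length_m * height_m * 0.5
```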

  11. Core Values | NREL

    Science.gov (United States)

    NREL's core values guide our everyday actions and efforts: a safe and supportive work environment; respect for rights and for the physical and social environment; and integrity, maintaining the highest standard of ethics, honesty, and integrity.

  12. Sidewall coring shell

    Energy Technology Data Exchange (ETDEWEB)

    Edelman, Ya A; Konstantinov, L P; Martyshin, A N

    1966-12-12

    A sidewall coring shell consists of a housing and a detachable core catcher. The core lifter is provided with projections, the ends of which are situated in another plane, along the longitudinal axis of the lifter. The chamber has corresponding projections.

  13. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK require that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine

  14. Benchmarking State-of-the-Art Deep Learning Software Tools

    OpenAIRE

    Shi, Shaohuai; Wang, Qiang; Xu, Pengfei; Chu, Xiaowen

    2016-01-01

    Deep learning has been shown as a successful machine learning method for a variety of tasks, and its popularity results in numerous open-source deep learning software tools. Training a deep network is usually a very time-consuming process. To address the computational challenge in deep learning, many tools exploit hardware features such as multi-core CPUs and many-core GPUs to shorten the training time. However, different tools exhibit different features and running performance when training ...
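
    A toy version of such a benchmark can be set up by timing a fixed numerical workload while controlling the number of CPU threads available to the library. The snippet below uses NumPy matrix multiplication as a stand-in for a training step and is only a sketch, assuming an OpenMP-based BLAS backend; it is not the benchmark suite from the paper.

```python
import os
# Thread counts must be set before NumPy (and its BLAS backend) is imported.
os.environ.setdefault("OMP_NUM_THREADS", "4")  # assumption: OpenMP-based BLAS

import time
import numpy as np

def time_matmul(n=2048, repeats=3):
    """Best-of-N wall-clock time for an n x n single-precision matrix multiplication."""
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        np.dot(a, b)
        best = min(best, time.perf_counter() - start)
    return best

if __name__ == "__main__":
    n = 2048
    seconds = time_matmul(n)
    gflops = 2 * n**3 / seconds / 1e9  # a matmul needs ~2*n^3 floating-point operations
    print(f"matmul: {seconds:.3f} s  (~{gflops:.1f} GFLOPS)")
```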

  15. Software Engineering Reviews and Audits

    CERN Document Server

    Summers, Boyd L

    2011-01-01

    Accurate software engineering reviews and audits have become essential to the success of software companies and military and aerospace programs. These reviews and audits define the framework and specific requirements for verifying software development efforts. Authored by an industry professional with three decades of experience, Software Engineering Reviews and Audits offers authoritative guidance for conducting and performing software first article inspections, and functional and physical configuration software audits. It prepares readers to answer common questions for conducting and perform

  16. Business Management Software Axolon ERP

    OpenAIRE

    Axolon ERP Solution

    2018-01-01

    Axolon ERP a Business Management Software www.axolonerp.com by Micromind is a comprehensive business management software solution for businesses. We deliver Business Management Software Dubai in UAE, GCC Countries and products also include ERP Software Dubai. HR & Payroll, Inventory Software, Project Management, Software Development, Solutions and Services in Dubai, UAE for small and medium sized Enterprises (SME) in the middle east with a easy-to-use, secure and efficient business management...

  17. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited for applying to software architecture. • A template for failure modes on a specific software language is established. • A detailed-level software FMEA analysis on nuclear safety software is presented. - Abstract: A method of software safety analysis for safety-related application software is described in this paper. The target software system is the software code installed at an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, first an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effects Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA analysis is carried out based on the so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) for the ATIP software. The software safety analysis by the software FMEA analysis, applied to the ATIP software code, which has been integrated and passed through a very rigorous system test procedure, is proven to be able to provide very valuable results (i.e., software defects) that could not be identified during various system tests
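
    One way to picture the "failure-mode template extracted from the function blocks" is as a small record per block type that enumerates candidate failure modes and their effects. The sketch below is a hypothetical illustration of that idea; the block type and mode wording are assumptions, not the template used for the ATIP code.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FailureMode:
    mode: str           # how the block output can be wrong
    local_effect: str   # effect at the block output
    system_effect: str  # effect on the safety function (e.g., reactor trip)

@dataclass
class FunctionBlockTemplate:
    block_type: str
    failure_modes: List[FailureMode] = field(default_factory=list)

# Hypothetical template for a comparator block in a function block diagram (FBD).
comparator = FunctionBlockTemplate(
    block_type="GreaterThanComparator",
    failure_modes=[
        FailureMode("output stuck TRUE", "trip demand always asserted",
                    "spurious actuation of the trip function"),
        FailureMode("output stuck FALSE", "trip demand never asserted",
                    "failure to actuate on a valid demand"),
        FailureMode("wrong setpoint input", "comparison against an incorrect limit",
                    "late or missed actuation"),
    ],
)

for fm in comparator.failure_modes:
    print(f"{comparator.block_type}: {fm.mode} -> {fm.system_effect}")
```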

  18. Rotary core drills

    Energy Technology Data Exchange (ETDEWEB)

    1967-11-30

    The design of a rotary core drill is described. Primary consideration is given to the following component parts of the drill: the inner and outer tube, the core bit, an adapter, and the core lifter. The adapter has the form of a downward-converging sleeve and is mounted to the lower end of the inner tube. The lifter, extending from the adapter, is split along each side so that it can be held open to permit movement of a core. It is possible to grip a core by allowing the lifter to assume a closed position.

  19. Software reliability assessment

    International Nuclear Information System (INIS)

    Barnes, M.; Bradley, P.A.; Brewer, M.A.

    1994-01-01

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom has spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  20. Software challenges in extreme scale systems

    International Nuclear Information System (INIS)

    Sarkar, Vivek; Harrod, William; Snavely, Allan E

    2009-01-01

    Computer systems anticipated in the 2015-2020 timeframe are referred to as Extreme Scale because they will be built using massive multi-core processors with 100s of cores per chip. The largest capability Extreme Scale system is expected to deliver Exascale performance of the order of 10^18 operations per second. These systems pose new critical challenges for software in the areas of concurrency, energy efficiency and resiliency. In this paper, we discuss the implications of the concurrency and energy efficiency challenges on future software for Extreme Scale Systems. From an application viewpoint, the concurrency and energy challenges boil down to the ability to express and manage parallelism and locality by exploring a range of strong scaling and new-era weak scaling techniques. For expressing parallelism and locality, the key challenges are the ability to expose all of the intrinsic parallelism and locality in a programming model, while ensuring that this expression of parallelism and locality is portable across a range of systems. For managing parallelism and locality, the OS-related challenges include parallel scalability, spatial partitioning of OS and application functionality, direct hardware access for inter-processor communication, and asynchronous rather than interrupt-driven events, which are accompanied by runtime system challenges for scheduling, synchronization, memory management, communication, performance monitoring, and power management. We conclude by discussing the importance of software-hardware co-design in addressing the fundamental challenges for application enablement on Extreme Scale systems.

  1. HYDRATE CORE DRILLING TESTS

    Energy Technology Data Exchange (ETDEWEB)

    John H. Cohen; Thomas E. Williams; Ali G. Kadaster; Bill V. Liddell

    2002-11-01

    The "Methane Hydrate Production from Alaskan Permafrost" project is a three-year endeavor being conducted by Maurer Technology Inc. (MTI), Noble, and Anadarko Petroleum, in partnership with the U.S. DOE National Energy Technology Laboratory (NETL). The project's goal is to build on previous and ongoing R&D in the area of onshore hydrate deposition. The project team plans to design and implement a program to safely and economically drill, core and produce gas from arctic hydrates. The current work scope includes drilling and coring one well on Anadarko leases in FY 2003 during the winter drilling season. A specially built on-site core analysis laboratory will be used to determine some of the physical characteristics of the hydrates and surrounding rock. Prior to going to the field, the project team designed and conducted a controlled series of coring tests for simulating coring of hydrate formations. A variety of equipment and procedures were tested and modified to develop a practical solution for this special application. This Topical Report summarizes these coring tests. A special facility was designed and installed at MTI's Drilling Research Center (DRC) in Houston and used to conduct coring tests. Equipment and procedures were tested by cutting cores from frozen mixtures of sand and water supported by casing and designed to simulate hydrate formations. Tests were conducted with chilled drilling fluids. Tests showed that frozen core can be washed out and reduced in size by the action of the drilling fluid. Washing of the core by the drilling fluid caused a reduction in core diameter, making core recovery very difficult (if not impossible). One successful solution was to drill the last 6 inches of core dry (without fluid circulation). These tests demonstrated that it will be difficult to capture core when drilling in permafrost or hydrates without implementing certain safeguards. Among the coring tests was a simulated hydrate

  2. Preliminaries on core image analysis using fault drilling samples; Core image kaiseki kotohajime (danso kussaku core kaisekirei)

    Energy Technology Data Exchange (ETDEWEB)

    Miyazaki, T; Ito, H [Geological Survey of Japan, Tsukuba (Japan)

    1996-05-01

    This paper introduces examples of image data analysis on fault drilling samples. The paper describes the following matters: the core samples used in the analysis are those obtained from wells drilled piercing the Nojima fault, which moved in the Hyogoken-Nanbu Earthquake; the CORESCAN system made by DMT Corporation, Germany, used in acquiring the image data, consists of a CCD camera, a light source and core rotation mechanism, and a personal computer, its resolution being about 5 pixels/mm in both axial and circumferential directions, and 24-bit full color; with respect to the opening fractures in core samples collected by using constant-azimuth coring, it was possible to derive values of the opening width, inclination angle, and travel from the image data by using a commercially available software for the personal computer; and comparison of this core image with the BHTV record and the hydrophone VSP record (travel and inclination obtained from the BHTV record agree well with those obtained from the core image). 4 refs., 4 figs.
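
    Deriving a fracture's inclination from an unrolled 360° core image typically relies on the fact that a planar fracture traces a sinusoid whose peak-to-peak height scales with the core diameter. The snippet below is a hedged sketch of that standard geometric relation, with the example numbers chosen for illustration; it is not the algorithm of the commercial software mentioned in the paper.

```python
import math

def fracture_dip_deg(peak_to_peak_mm: float, core_diameter_mm: float) -> float:
    """Dip (inclination) of a planar fracture from its sinusoidal trace on an
    unrolled core image: tan(dip) = peak-to-peak trace height / core diameter."""
    return math.degrees(math.atan2(peak_to_peak_mm, core_diameter_mm))

def pixels_to_mm(pixels: float, resolution_px_per_mm: float = 5.0) -> float:
    """Convert an image measurement to millimetres (CORESCAN resolution ~5 px/mm)."""
    return pixels / resolution_px_per_mm

# Example: a fracture trace whose crest-to-trough height is 300 px on a 100 mm core.
height_mm = pixels_to_mm(300)
print(f"estimated dip: {fracture_dip_deg(height_mm, 100.0):.1f} degrees")
```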

  3. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    Science.gov (United States)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  4. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    Science.gov (United States)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  5. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  6. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  7. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  8. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  9. Belle II Software

    International Nuclear Information System (INIS)

    Kuhr, T; Ritter, M

    2016-01-01

    Belle II is a next generation B factory experiment that will collect 50 times more data than its predecessor, Belle. The higher luminosity at the SuperKEKB accelerator leads to higher background levels and requires a major upgrade of the detector. As a consequence, the simulation, reconstruction, and analysis software must also be upgraded substantially. Most of the software has been redesigned from scratch, taking into account the experience from Belle and other experiments and utilizing new technologies. The large amount of experimental and simulated data requires a high level of reliability and reproducibility, even in parallel environments. Several technologies, tools, and organizational measures are employed to evaluate and monitor the performance of the software during development. (paper)

  10. New Media as Software

    Directory of Open Access Journals (Sweden)

    Manuel Portela

    2014-03-01

    Full Text Available Review of Lev Manovich, Software Takes Command: Extending the Language of New Media. London: Bloomsbury, 2013, 358 pp. ISBN 978-1-6235-6817-7. In Lev Manovich’s most recent book, this programmatic interrogation of our medial condition leads to the following question: do media still exist after software? This is the question that triggers Manovich’s dialogue both with computing history and with theories of digital media of recent decades, including the extension of his own previous formulations in The Language of New Media, published in 2001, and which became a major reference work in the field. The subtitle of the new book points precisely to this critical revisiting of his earlier work in the context of ubiquitous computing and accelerated transcoding of social, cultural and artistic practices by software.

  11. LHCb software strategy

    CERN Document Server

    Van Herwijnen, Eric

    1998-01-01

    This document describes the software strategy of the LHCb experiment. The main objective is to reuse designs and code wherever possible; we will implement an architecturally driven design process; this architectural process will be implemented using Object Technology; we aim for platform independence; we try to take advantage of distributed computing and will use industry standards, commercial software and profit from HEP developments; we will implement a common software process and development environment. One of the major problems that we are immediately faced with is the conversion of our current code from Fortran into an Object Oriented language and the conversion of our current developers to Object technology. Some technical terms related to OO programming are defined in Annex A.1

  12. Test af Software

    DEFF Research Database (Denmark)

    This document constitutes the final report of the network collaboration "Testnet", which ran from 1 April 2006 to 31 December 2008. The network deals primarily with topics within testing of embedded and technical software, but a number of examples of problems and solutions connected with testing of...... administrative software are also included. The report is divided into the following 3 parts: Overview. Here we summarise the network's purpose, activities and results. The state of the art of software testing is outlined. We mention that CISS and the network are taking new initiatives. The Network. Purpose, participants and topics addressed...

  13. ORNL's DCAL software package

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    2007-01-01

    Oak Ridge National Laboratory has released its Dose and Risk Calculation software, DCAL. DCAL, developed with the support of the U.S. Environmental Protection Agency, consists of a series of computational modules, driven in either an interactive or a batch mode for computation of dose and risk coefficients from intakes of radionuclides or exposure to radionuclides in environmental media. The software package includes extensive libraries of biokinetic and dosimetric data that represent the current state of the art. The software has unique capability for addressing intakes of radionuclides by non-adults. DCAL runs as 32-bit extended DOS and console applications under Windows 98/NT/2000/XP. It is intended for users familiar with the basic elements of computational radiation dosimetry. Components of DCAL have been used to prepare U.S. Environmental Protection Agency's Federal Guidance Reports 12 and 13 and several publications of the International Commission on Radiological Protection. (author)

  14. Aircraft Design Software

    Science.gov (United States)

    1997-01-01

    Successful commercialization of the AirCraft SYNThesis (ACSYNT) tool has resulted in the creation of Phoenix Integration, Inc. ACSYNT has been exclusively licensed to the company, an outcome of a seven year, $3 million effort to provide unique software technology to a focused design engineering market. Ames Research Center formulated ACSYNT and in working with the Virginia Polytechnic Institute CAD Laboratory, began to design and code a computer-aided design for ACSYNT. Using a Joint Sponsored Research Agreement, Ames formed an industry-government-university alliance to improve and foster research and development for the software. As a result of the ACSYNT Institute, the software is becoming a predominant tool for aircraft conceptual design. ACSYNT has been successfully applied to high- speed civil transport configuration, subsonic transports, and supersonic fighters.

  15. Lecture 2: Software Security

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development, testing and deployment. Sebastian Lopienski is CERN’s deputy Computer Security Officer. He works on security strategy and policies; offers internal consultancy and audit services; develops and ...

  16. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check the software for compliance with the stated requirements, such as correctness, system security, system adaptability to small changes in the environment, portability and compatibility, etc. These methods differ both in their operating process and in the way the result is achieved. The article describes the static and dynamic methods of software verification and pays attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described. The relevant issue of the pros and cons of each particular method is emphasized. The article considers a classification of test techniques for each method. In this paper we present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements in the code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and also gives an overview and analysis of the inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed. Some kinds of tools are also considered which can be applied to the software when using the methods of dynamic analysis. Based on this work a conclusion is drawn, which describes the most relevant problems of analysis techniques, methods of their solutions and
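
    As a minimal illustration of what symbolic execution contributes, consider a function with three feasible paths: the technique explores each path under a symbolic input and records the path condition that must hold for the path to be taken. The example below enumerates those conditions by hand and checks them with concrete witnesses; it is a didactic sketch, not one of the tools reviewed in the article.

```python
def classify(x: int) -> str:
    """Toy program under analysis."""
    if x > 10:
        if x % 2 == 0:
            return "large even"
        return "large odd"
    return "small"

# Path conditions a symbolic executor would derive for a symbolic input X:
#   path 1: X > 10 and X % 2 == 0   -> "large even"
#   path 2: X > 10 and X % 2 != 0   -> "large odd"
#   path 3: not (X > 10)            -> "small"
witnesses = {12: "large even", 13: "large odd", 3: "small"}

# Each witness satisfies exactly one path condition and exercises that path.
for value, expected in witnesses.items():
    assert classify(value) == expected
print("all three paths covered")
```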

  17. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    Full Text Available A method for estimating software reliability for nuclear safety software is proposed in this paper. This method is based on the software reliability growth model (SRGM, where the behavior of software failure is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects based on very rare software failure data. The Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was identified that these models are capable of reasonably estimating the remaining number of software defects which directly affects the reactor trip functions. The software reliability might be estimated from these modeling equations, and one approach of obtaining software reliability value is proposed in this paper.
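
    For orientation, the best-known NHPP-based SRGM is the Goel-Okumoto model, in which the expected cumulative number of failures and the conditional reliability take the closed forms below. This is a standard textbook instance of the model class and not necessarily the specific scheme estimated in the paper.

```latex
% Goel-Okumoto NHPP model: a = expected total number of defects,
% b = per-defect detection rate, m(t) = mean value function.
\begin{align}
m(t) &= a\left(1 - e^{-bt}\right) \\
\lambda(t) &= \frac{\mathrm{d}m(t)}{\mathrm{d}t} = a\,b\,e^{-bt} \\
R(x \mid t) &= \exp\left[-\left(m(t+x) - m(t)\right)\right]
\end{align}
% R(x|t) is the probability of no failure in the interval (t, t+x].
```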

  18. Agile software development

    CERN Document Server

    Stober, Thomas

    2009-01-01

    Software development is moving towards a more agile and more flexible approach. It turns out that the traditional 'waterfall' model is not supportive in an environment where technical, financial and strategic constraints are changing almost every day. But what is agility? What are today's major approaches? And especially: what is the impact of agile development principles on the development teams, on project management and on software architects? How can large enterprises become more agile and improve their business processes, which have existed for many, many years? What are the limit

  19. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-06-01

    Full Text Available The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, books have been widely published whose titles refer to it as an art, role or process. But because software complexity is increasing every year, this paper proposes a new approach, conceiving of testing as a science. This is because the processes by which tests are applied follow the steps of the scientific method: inputs, processes, outputs. This paper examines the similarities and the characteristics of testing as a science.

  20. Provider software buyer's guide.

    Science.gov (United States)

    1994-03-01

    To help long term care providers find new ways to improve quality of care and efficiency, Provider magazine presents the fourth annual listing of software firms marketing computer programs for all areas of nursing facility operations. On the following five pages, more than 80 software firms display their wares, with programs such as minimum data set and care planning, dietary, accounting and financials, case mix, and medication administration records. The guide also charts compatible hardware, integration ability, telephone numbers, company contacts, and easy-to-use reader service numbers.

  1. Model of software quality

    OpenAIRE

    Valencia Ayala, Luz Estela; Villa Sánchez, Paula Andréa; Ocampo S., Carlos Alberto

    2009-01-01

    In a globalized market where companies must innovate and improve continuously in order to grow and become more competitive, it is necessary to have access to international quality certifications that give them backing and allow them to remain in this market. Quality certifications in the software industry help companies become more productive by reducing the cost and time of their development work. Most of the software development companies in our country are micro and small...

  2. Security System Software

    Science.gov (United States)

    1993-01-01

    C Language Integrated Production System (CLIPS), a NASA-developed expert systems program, has enabled a security systems manufacturer to design a new generation of hardware. C.CURESystem 1 Plus, manufactured by Software House, is a software based system that is used with a variety of access control hardware at installations around the world. Users can manage large amounts of information, solve unique security problems and control entry and time scheduling. CLIPS acts as an information management tool when accessed by C.CURESystem 1 Plus. It asks questions about the hardware and when given the answer, recommends possible quick solutions by non-expert persons.

  3. Software product quality control

    CERN Document Server

    Wagner, Stefan

    2013-01-01

    Quality is not a fixed or universal property of software; it depends on the context and goals of its stakeholders. Hence, when you want to develop a high-quality software system, the first step must be a clear and precise specification of quality. Yet even if you get it right and complete, you can be sure that it will become invalid over time. So the only solution is continuous quality control: the steady and explicit evaluation of a product's properties with respect to its updated quality goals.This book guides you in setting up and running continuous quality control in your environment. Star

  4. Software Safety and Security

    CERN Document Server

    Nipkow, T; Hauptmann, B

    2012-01-01

    Recent decades have seen major advances in methods and tools for checking the safety and security of software systems. Automatic tools can now detect security flaws not only in programs of the order of a million lines of code, but also in high-level protocol descriptions. There has also been something of a breakthrough in the area of operating system verification. This book presents the lectures from the NATO Advanced Study Institute on Tools for Analysis and Verification of Software Safety and Security; a summer school held at Bayrischzell, Germany, in 2011. This Advanced Study Institute was

  5. Maintenance simulation: Software issues

    Energy Technology Data Exchange (ETDEWEB)

    Luk, C.H.; Jette, M.A.

    1995-07-01

    The maintenance of a distributed software system in a production environment involves: (1) maintaining software integrity, (2) maintaining database integrity, (3) adding new features, and (4) adding new systems. These issues will be discussed in general: what they are and how they are handled. This paper will present our experience with a distributed resource management system that accounts for resources consumed, in real time, on a network of heterogeneous computers. The simulated environments used to maintain this system will be presented in relation to the four maintenance areas.

  6. Processeringsoptimering med Canons software

    DEFF Research Database (Denmark)

    Precht, Helle

    2009-01-01

    Possibilities for software optimization were studied in relation to optimal image quality and control exposures, in order to investigate whether it was possible to accept diagnostic image quality and thereby take ALARA as the starting point. Method and materials: a quantitative experimental study based on trials with a technical and...... a human phantom. A CD Rad phantom was used as the technical phantom; the images were analysed with the CD Rad software, and the result was an objective IQF value. The human phantom was a lamb pelvis with femur, which according to the NRPB is comparable in absorption to a five-year-old child. The human trial images were...

  7. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality...... requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...

  8. Six Sigma software development

    CERN Document Server

    Tayntor, Christine B

    2002-01-01

    Since Six Sigma has had marked success in improving quality in other settings, and since the quality of software remains poor, it seems a natural evolution to apply the concepts and tools of Six Sigma to system development and the IT department. Until now however, there were no books available that applied these concepts to the system development process. Six Sigma Software Development fills this void and illustrates how Six Sigma concepts can be applied to all aspects of the evolving system development process. It includes the traditional waterfall model and in the support of legacy systems,

  9. Inventory of safeguards software

    International Nuclear Information System (INIS)

    Suzuki, Mitsutoshi; Horino, Koichi

    2009-03-01

    The purpose of this survey activity is to serve as a basis for determining what needs may exist in this arena for the development of next-generation safeguards systems and approaches. 23 software tools were surveyed by JAEA and NMCC. Exchanging information regarding existing software tools for safeguards and discussing a next R and D program for developing a general-purpose safeguards tool should be beneficial to safeguards system design and indispensable for evaluating a safeguards system for future nuclear fuel facilities. (author)

  10. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer-Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for the control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  11. Green in software engineering

    CERN Document Server

    Calero Munoz, Coral

    2015-01-01

    This is the first book that presents a comprehensive overview of sustainability aspects in software engineering. Its format follows the structure of the SWEBOK and covers the key areas involved in the incorporation of green aspects in software engineering, encompassing topics from requirement elicitation to quality assurance and maintenance, while also considering professional practices and economic aspects. The book consists of thirteen chapters, which are structured in five parts. First the "Introduction" gives an overview of the primary general concepts related to Green IT, discussing wha

  12. Model-driven software engineering

    NARCIS (Netherlands)

    Amstel, van M.F.; Brand, van den M.G.J.; Protic, Z.; Verhoeff, T.; Hamberg, R.; Verriet, J.

    2014-01-01

    Software plays an important role in designing and operating warehouses. However, traditional software engineering methods for designing warehouse software are not able to cope with the complexity, size, and increase of automation in modern warehouses. This chapter describes Model-Driven Software

  13. Package-based software development

    NARCIS (Netherlands)

    Jonge, de M.; Chroust, G.; Hofer, C.

    2003-01-01

    The main goal of component-based software engineering is to decrease development time and development costs of software systems, by reusing prefabricated building blocks. Here we focus on software reuse within the implementation of such component-based applications, and on the corresponding software

  14. The fallacy of Software Patents

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Software patents are usually presented as an argument for innovation, but do they really promote innovation? Who really benefits from software patents? This talk attempts to show the problems with software patents and how they can actually harm innovation, having little value for software users and our society in general.

  15. A methodology for software documentation

    OpenAIRE

    Torres Júnior, Roberto Dias; Ahlert, Hubert

    2000-01-01

    With the growing complexity of window-based software and the use of object orientation, the development of software is getting more complex than ever. Based on that, this article intends to present a methodology for software documentation and to analyze our experience and how this methodology can aid software maintenance

  16. The NOvA software testing framework

    International Nuclear Information System (INIS)

    Tamsett, M; Group, C

    2015-01-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. NOvA has already produced more than one million Monte Carlo and detector generated files amounting to more than 1 PB in size. This data is divided between a number of parallel streams such as far and near detector beam spills, cosmic ray backgrounds, a number of data-driven triggers and over 20 different Monte Carlo configurations. Each of these data streams must be processed through the appropriate steps of the rapidly evolving, multi-tiered, interdependent NOvA software framework. In total there are greater than 12 individual software tiers, each of which performs a different function and can be configured differently depending on the input stream. In order to regularly test and validate that all of these software stages are working correctly NOvA has designed a powerful, modular testing framework that enables detailed validation and benchmarking to be performed in a fast, efficient and accessible way with minimal expert knowledge. The core of this system is a novel series of python modules which wrap, monitor and handle the underlying C++ software framework and then report the results to a slick front-end web-based interface. This interface utilises modern, cross-platform, visualisation libraries to render the test results in a meaningful way. They are fast and flexible, allowing for the easy addition of new tests and datasets. In total upwards of 14 individual streams are regularly tested amounting to over 70 individual software processes, producing over 25 GB of output files. The rigour enforced through this flexible testing framework enables NOvA to rapidly verify configurations, results and software and thus ensure that data is available for physics analysis in a timely and robust manner. (paper)
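
    The general pattern of thin Python modules that wrap an underlying C++ processing tier, monitor it, and report a machine-readable result can be sketched in a few lines. The module layout, command names, and report fields below are hypothetical stand-ins, not the actual NOvA framework code.

```python
import json
import subprocess
import time

def run_tier(name: str, command: list, timeout_s: int = 3600) -> dict:
    """Run one processing tier as a subprocess, monitor it, and summarise the outcome."""
    start = time.perf_counter()
    try:
        proc = subprocess.run(command, capture_output=True, text=True, timeout=timeout_s)
        status = "ok" if proc.returncode == 0 else "failed"
    except subprocess.TimeoutExpired:
        status = "timeout"
    return {
        "tier": name,
        "status": status,
        "wall_time_s": round(time.perf_counter() - start, 1),
    }

if __name__ == "__main__":
    # Hypothetical two-tier chain; each entry would normally be configured per data stream.
    tiers = [
        ("reconstruction", ["echo", "running reco"]),
        ("calibration", ["echo", "running calib"]),
    ]
    results = [run_tier(name, cmd) for name, cmd in tiers]
    # A web front-end could ingest this JSON to render a validation dashboard.
    print(json.dumps(results, indent=2))
```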

  17. Core monitoring at the WNP-2 reactor

    International Nuclear Information System (INIS)

    Skeen, D.R.; Torres, R.H.; Burke, W.J.; Jenkins, I.; Jones, S.W.

    1992-01-01

    The WNP-2 reactor is a 3,323-MW(thermal) boiling water reactor (BWR) that is operated by the Washington Public Power Supply System. The WNP-2 reactor began commercial operation in 1984 and is currently in its eighth cycle. The core monitoring system used for the first cycle of operation was supplied by the reactor vendor. Cycles 2 through 6 were monitored with the POWERPLEX Core Monitoring Software System (CMSS) using the XTGBWR simulation code. In 1991, the supply system upgraded the core monitoring system by installing the POWERPLEX 2 CMSS prior to the seventh cycle of operation for WNP-2. The POWERPLEX 2 CMSS was developed by Siemens Power Corporation (SPC) and is based on SPC's advanced state-of-the-art reactor simulator code MICROBURN-B. The improvements in the POWERPLEX 2 system are possible as a result of advances in minicomputer hardware

  18. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

    Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test management. This book examines the principles, concepts, and processes that are fundamental to the software testing function. This book is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c

  19. Patterns for Parallel Software Design

    CERN Document Server

    Ortega-Arjona, Jorge Luis

    2010-01-01

    Essential reading to understand patterns for parallel programming. Software patterns have revolutionized the way we think about how software is designed, built, and documented, and the design of parallel software requires you to consider other particular design aspects and special skills. From clusters to supercomputers, success heavily depends on the design skills of software developers. Patterns for Parallel Software Design presents a pattern-oriented software architecture approach to parallel software design. This approach is not a design method in the classic sense, but a new way of managin

  20. The Art of Software Testing

    CERN Document Server

    Myers, Glenford J; Badgett, Tom

    2011-01-01

    The classic, landmark work on software testing. The hardware and software of computing have changed markedly in the three decades since the first edition of The Art of Software Testing, but this book's powerful underlying analysis has stood the test of time. Whereas most books on software testing target particular development techniques, languages, or testing methods, The Art of Software Testing, Third Edition provides a brief but powerful and comprehensive presentation of time-proven software testing approaches. If your software development project is mission critical, this book is an investme

  1. What Counts in Software Process?

    DEFF Research Database (Denmark)

    Cohn, Marisa

    2009-01-01

    and conversations in negotiating between prescriptions from a model and the contingencies that arise in an enactment. A qualitative field study at two Agile software development companies was conducted to investigate the role of artifacts in the software development work and the relationship between these artifacts...... and the Software Process. Documentation of software requirements is a major concern among software developers and software researchers. Agile software development denotes a different relationship to documentation, one that warrants investigation. Empirical findings are presented which suggest a new understanding...

  2. Software for noise measurements

    International Nuclear Information System (INIS)

    Zyryanov, V.A.

    1987-01-01

    The CURS program library, comprising 38 Fortran programs designed for processing discrete experimental data in the form of random or determined periodic processes, is described. The library is based on the modular construction principle, which allows one to create from it any set of programs to solve tasks related to NPP operation, and to develop special software

  3. Software complex "remember me"

    OpenAIRE

    Kosheutova, N. V.; Osina, P. M.

    2016-01-01

    The article describes the importance of time management and effective planning in modern society and is devoted to an Android OS application development. It points out the main features of a mobile application such as cross-platform capability and synchronization. Much attention is given to the software architecture as well as user data protection via password hashing methods.

  4. Software management issues

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1990-06-01

    The difficulty of managing the software in large HEP collaborations appears to be becoming progressively worse with each new generation of detector. If one were to extrapolate to the SSC, it would become a major problem. This paper explores the possible causes of the difficulty and makes suggestions on what corrective actions should be taken

  5. Application software profiles 2010

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2010-04-15

    This article presented information on new software applications designed to facilitate petroleum exploration, drilling and production activities. Computer modelling and analysis enables oil and gas producers to characterize reservoirs, estimate reserves, forecast production, plan operations and manage assets. Seven Calgary-based organizations were highlighted along with their sophisticated software tools, the applications and the new features available in each product. The geoSCOUT version 7.7 by GeoLOGIC Systems Ltd. integrates public and proprietary data on wells, well logs, reserves, pipelines, production, ownership and seismic location data. The Value Navigator and AFE Navigator by Energy Navigator provide control over reserves, production and cash flow forecasting. FAST Harmony, FAST Evolution, FAST CBM, FAST FieldNotes, Fast Piper, FAST RTA, FAST VirtuWell and FAST WellTest by Fekete Associates Inc. provide reserve evaluations for reservoir engineering projects and production data analysis. The esi.manage software program by 3esi improves business results for upstream oil and gas companies through enhanced decision making and workforce effectiveness. WELLFLO, PIPEFLO, FORGAS, OLGA, Drillbench, and MEPO wellbore solutions by Neotec provide unique platforms for flow simulation to optimize oil and gas production systems. Petrel, ECLIPSE, Avocet, PipeSim and Merak software tools by Schlumberger Information Solutions are petroleum systems modelling tools for geologic mapping, visualization modelling and reservoir engineering. StudioSL by Streamsim Technologies Inc. is a modelling tool for optimizing flood management. figs.

  6. Software Geometry in Simulations

    Science.gov (United States)

    Alion, Tyler; Viren, Brett; Junk, Tom

    2015-04-01

    The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other Liquid Argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it be a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way which allows for multiple authors to easily collaborate. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett is the author of the framework discussed here, the General Geometry Description (GGD).

  7. Software configuration management

    International Nuclear Information System (INIS)

    Arribas Peces, E.; Martin Faraldo, P.

    1993-01-01

    Software Configuration Management is directed towards identifying system configuration at specific points of its life cycle, so as to control changes to the configuration and to maintain the integrity and traceability of the configuration throughout its life. SCM functions and tasks are presented in the paper

  8. Patterns in Software Development

    DEFF Research Database (Denmark)

    Corry, Aino Vonge

    the university and I entered a project to industry within Center for Object Technology (COT). I focused on promoting the pattern concept to the Danish software industry in order to help them take advantage of the benefits of applying patterns in system development. In the obligatory stay abroad, I chose to visit...

  9. Open Source Software Acquisition

    DEFF Research Database (Denmark)

    Holck, Jesper; Kühn Pedersen, Mogens; Holm Larsen, Michael

    2005-01-01

    Lately we have seen a growing interest from both public and private organisations to adopt Open Source Software (OSS), not only for a few, specific applications but also on a more general level throughout the organisation. As a consequence, the organisations' decisions on adoption of OSS are becoming...

  10. SEER*Stat Software

    Science.gov (United States)

    If you have access to SEER Research Data, use SEER*Stat to analyze SEER and other cancer-related databases. View individual records and produce statistics including incidence, mortality, survival, prevalence, and multiple primary. Tutorials and related analytic software tools are available.

  11. Improving Agile Software Practice

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte

    2006-01-01

    Software process improvement in small and agile organizations is often problematic, but achieving good SPI assessments can still be necessary to stay in the market or to meet demands of multinational owners. The traditional norm-driven, centralized and control-centered improvement approaches has...

  12. Software Defined Coded Networking

    DEFF Research Database (Denmark)

    Di Paola, Carla; Roetter, Daniel Enrique Lucani; Palazzo, Sergio

    2017-01-01

    the quality of each link and even across neighbouring links and using simulations to show that an additional reduction of packet transmission in the order of 40% is possible. Second, to advocate for the use of network coding (NC) jointly with software defined networking (SDN) providing an implementation...

  13. MOCASSIN-prot software

    Science.gov (United States)

    MOCASSIN-prot is a software tool, implemented in Perl and Matlab, for constructing protein similarity networks to classify proteins. Both domain composition and quantitative sequence similarity information are utilized in constructing the directed protein similarity networks. For each reference protein i...

  14. Writing testable software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Knirk, D. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This tutorial identifies common problems in analyzing problem requirements and in constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.

  15. Green Software Products

    NARCIS (Netherlands)

    Jagroep, E.A.

    2017-01-01

    The rising energy consumption of the ICT industry has triggered a quest for more green, energy efficient ICT solutions. The role of software as the true consumer of power and its potential contribution to reach sustainability goals has increasingly been acknowledged. At the same time, it is shown to

  16. Iterative software kernels

    Energy Technology Data Exchange (ETDEWEB)

    Duff, I.

    1994-12-31

    This workshop focuses on kernels for iterative software packages. Specifically, the three speakers discuss various aspects of sparse BLAS kernels. Their topics are: 'Current status of user level sparse BLAS'; 'Current status of the sparse BLAS toolkit'; and 'Adding matrix-matrix and matrix-matrix-matrix multiply to the sparse BLAS toolkit'.

  17. Software for airborne radiation monitoring system

    International Nuclear Information System (INIS)

    Sheinfeld, M.; Kadmon, Y.; Tirosh, D.; Elhanany, I.; Gabovitch, A.; Barak, D.

    1997-01-01

    The Airborne Radiation Monitoring System monitors radioactive contamination in the air or on the ground. The contamination source can be a radioactive plume or an area contaminated with radionuclides. This system is composed of two major parts: Airborne Unit carried by a helicopter, and Ground Station carried by a truck. The Airborne software is intended to be the core of a computerized airborne station. The software is written in C++ under MS-Windows with object-oriented methodology. It has been designed to be user-friendly: function keys and other accelerators are used for vital operations, a help file and help subjects are available, the Human-Machine-Interface is plain and obvious. (authors)

  18. Software development: do good manners matter?

    Directory of Open Access Journals (Sweden)

    Giuseppe Destefanis

    2016-07-01

    Full Text Available A successful software project is the result of a complex process involving, above all, people. Developers are the key factors for the success of a software development process, not merely as executors of tasks, but as protagonists and core of the whole development process. This paper investigates social aspects among developers working on software projects developed with the support of Agile tools. We studied 22 open-source software projects developed using the Agile board of the JIRA repository. All comments committed by developers involved in the projects were analyzed and we explored whether the politeness of comments affected the number of developers involved and the time required to fix any given issue. Our results showed that the level of politeness in the communication process among developers does have an effect on the time required to fix issues and, in the majority of the analysed projects, it had a positive correlation with attractiveness of the project to both active and potential developers. The more polite developers were, the less time it took to fix an issue.

  19. The core paradox.

    Science.gov (United States)

    Kennedy, G. C.; Higgins, G. H.

    1973-01-01

    Rebuttal of suggestions from various critics attempting to provide an escape from the seeming paradox arising from Higgins and Kennedy's (1971) proposal that the liquid in the outer core was thermally stably stratified and that this stratification might prove a powerful inhibitor to circulation of the outer core fluid of the kind postulated for the generation of the earth's magnetic field. These suggestions are examined and shown to provide no reasonable escape from the core paradox.

  20. Nuclear reactor core flow baffling

    International Nuclear Information System (INIS)

    Berringer, R.T.

    1979-01-01

    A flow baffling arrangement is disclosed for the core of a nuclear reactor. A plurality of core formers are aligned with the grids of the core fuel assemblies such that the high pressure drop areas in the core are at the same elevations as the high pressure drop areas about the core periphery. The arrangement minimizes core bypass flow, maintains cooling of the structure surrounding the core, and allows the utilization of alternative beneficial components such as neutron reflectors positioned near the core

  1. Flight Software Math Library

    Science.gov (United States)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter orders, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance navigation and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance Navigation and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the libraries to be maintained with the same strategy used in its initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as other environments like the GN&C analyst's simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.

  2. Sediment Core Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Provides instrumentation and expertise for physical and geoacoustic characterization of marine sediments. DESCRIPTION: The multisensor core logger measures...

  3. TWRS engineering bibliography software listing

    International Nuclear Information System (INIS)

    Husa, E.I.

    1995-01-01

    This document contains the computer software listing for the Engineering Bibliography software, developed by E. Ivar Husa. This software is in the working prototype stage of development. The code has not been tested to requirements. TWRS Engineering created this software for engineers to share bibliographic references across the Hanford site network (HLAN). This software is intended to store several hundred to several thousand references (a compendium with limited range). Code changes are needed to support the larger number of references

  4. Interface-based software testing

    OpenAIRE

    Aziz Ahmad Rais

    2016-01-01

    Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of softwar...

  5. Self-assembling software generator

    Science.gov (United States)

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
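    The three inspection passes described in this record (what to generate, how to link, what logic to execute) can be illustrated with a small sketch. The specification format and entity classes below are invented for illustration; this is not the patented implementation.

```python
# Hypothetical task specification: which entities to create, how to link them,
# and what logic each entity should carry.
TASK_SPEC = {
    "entities": ["reader", "filter", "writer"],
    "links": [("reader", "filter"), ("filter", "writer")],
    "logic": {
        "reader": lambda _: list(range(5)),
        "filter": lambda xs: [x for x in xs if x % 2 == 0],
        "writer": lambda xs: print("result:", xs),
    },
}


class Entity:
    """A generated software entity that forwards its output to the next entity."""

    def __init__(self, name, logic):
        self.name, self.logic, self.next = name, logic, None

    def run(self, data=None):
        out = self.logic(data)
        return self.next.run(out) if self.next else out


def generate_task(spec):
    # Pass 1: inspect the specification to decide which entities to generate.
    entities = {name: Entity(name, spec["logic"][name]) for name in spec["entities"]}
    # Pass 2: inspect the specification to decide how the entities are linked.
    for src, dst in spec["links"]:
        entities[src].next = entities[dst]
    # Pass 3: the per-entity logic was attached at construction; return the head
    # of the chain as the executable task.
    return entities[spec["entities"][0]]


generate_task(TASK_SPEC).run()
```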

  6. Metabolic interrelationships software application: Interactive learning tool for intermediary metabolism

    NARCIS (Netherlands)

    A.J.M. Verhoeven (Adrie); M. Doets (Mathijs); J.M.J. Lamers (Jos); J.F. Koster (Johan)

    2005-01-01

    We developed and implemented the software application titled Metabolic Interrelationships as a self-learning and -teaching tool for intermediary metabolism. It is used by undergraduate medical students in an integrated organ systems-based and disease-oriented core curriculum, which

  7. Product-oriented Software Certification Process for Software Synthesis

    Science.gov (United States)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  8. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design software quality through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing to properly understand or design the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software's overall testability.
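    As a sketch of the interface-based idea in the record above, the test below exercises a contract through an abstract interface rather than a GUI. The PaymentGateway interface and its fake implementation are invented for illustration; they are not taken from the cited article.

```python
import unittest
from abc import ABC, abstractmethod


class PaymentGateway(ABC):
    """The interface under test: quality is asserted against this contract, not a GUI."""

    @abstractmethod
    def charge(self, amount_cents: int) -> bool: ...


class FakeGateway(PaymentGateway):
    """A simple implementation used to exercise the interface contract."""

    def charge(self, amount_cents: int) -> bool:
        if amount_cents <= 0:
            raise ValueError("amount must be positive")
        return True


class PaymentGatewayContractTest(unittest.TestCase):
    """Tests written against the interface run unchanged for any implementation."""

    def setUp(self):
        self.gateway: PaymentGateway = FakeGateway()

    def test_positive_amount_is_charged(self):
        self.assertTrue(self.gateway.charge(500))

    def test_non_positive_amount_is_rejected(self):
        with self.assertRaises(ValueError):
            self.gateway.charge(0)


if __name__ == "__main__":
    unittest.main()
```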

  9. Can Psychiatric Rehabilitation Be Core to CORE?

    Science.gov (United States)

    Olney, Marjorie F.; Gill, Kenneth J.

    2016-01-01

    Purpose: In this article, we seek to determine whether psychiatric rehabilitation principles and practices have been more fully incorporated into the Council on Rehabilitation Education (CORE) standards, the extent to which they are covered in four rehabilitation counseling "foundations" textbooks, and how they are reflected in the…

  10. PWR core design calculations

    International Nuclear Information System (INIS)

    Trkov, A.; Ravnik, M.; Zeleznik, N.

    1992-01-01

    Functional description of the programme package Cord-2 for PWR core design calculations is presented. Programme package is briefly described. Use of the package and calculational procedures for typical core design problems are treated. Comparison of main results with experimental values is presented as part of the verification process. (author) [sl

  11. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature, describing a viable, high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements and software design, and verifying the design against the requirements and the code against the design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

  12. A study of software safety analysis system for safety-critical software

    International Nuclear Information System (INIS)

    Chang, H. S.; Shin, H. K.; Chang, Y. W.; Jung, J. C.; Kim, J. H.; Han, H. H.; Son, H. S.

    2004-01-01

    The core factors and requirements for the safety-critical software are traced, and the methodology adopted in each stage of the software life cycle is presented. In the concept phase, a Failure Modes and Effects Analysis (FMEA) of the system has been performed. The feasibility evaluation of the selected safety parameter was performed, and a Preliminary Hazards Analysis list was prepared using the HAZOP (Hazard and Operability) technique. A check list for management control has also been produced via a walk-through technique. Based on the evaluation of the check list, activities to be performed in the requirement phase have been determined. In the design phase, hazard analysis has been performed to check the safety capability of the system with regard to the safety software algorithm, using Fault Tree Analysis (FTA). In the test phase, the test items based on the FMEA have been checked for fitness, guided by an accident scenario. The pressurizer low-pressure trip algorithm has been selected as a sample for applying the FTA method to software safety analysis. By applying a CASE tool, the requirements traceability of the safety-critical system has been enhanced during all software life cycle phases
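    To illustrate the FTA step mentioned in this record, the sketch below evaluates the top-event probability of a tiny fault tree with independent basic events. The events and probabilities are invented; they are not those of the pressurizer low-pressure trip algorithm.

```python
# Minimal fault-tree evaluation assuming independent basic events.
def or_gate(*probs):
    """P(at least one input event occurs)."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none


def and_gate(*probs):
    """P(all input events occur)."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all


# Hypothetical basic-event probabilities (per demand).
sensor_a_fails = 1e-3
sensor_b_fails = 1e-3
trip_logic_fails = 1e-4

# Top event: the trip signal is missed if both sensors fail OR the trip logic fails.
p_top = or_gate(and_gate(sensor_a_fails, sensor_b_fails), trip_logic_fails)
print(f"P(top event) = {p_top:.2e}")
```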

  13. Development of In-Core Protection System

    International Nuclear Information System (INIS)

    Cho, J. H; Kim, C. H.; Kim, J. H.; Jeong, S. H.; Sohn, S. D.; BaeK, S. M.; YOON, J. H.

    2016-01-01

    In-core Protection System (ICOPS) is an on-line digital computer system which continuously calculates Departure from Nucleate Boiling Ratio (DNBR) and Local Power Density (LPD) based on plant parameters to make trip decisions based on the computations. The function of the system is the same as that of Core Protection Calculator System (CPCS) and Reactor Core Protection System (RCOPS) which are applied to Optimized Power Reactor 1000 (OPR1000) and Advanced Power Reactor 1400 (APR1400). The ICOPS has been developed to overcome the algorithm related obstacles in overseas project. To achieve this goal, several algorithms were newly developed and hardware and software design was updated. The functional design requirements document was developed by KEPCO-NF and the component design was conducted by Doosan. System design and software implementation were performed by KEPCO-E and C, and software Verification and Validation (V and V) was performed by KEPCO-E and C and Sure Softtech. The ICOPS has been developed to overcome the algorithm related obstacles in overseas project. The function of I/O simulator was improved even though the hardware platform is the same as that of RCOPS for Shin-Hanul 1 and 2. SCADE was applied to the implementation of ICOPS software, and the V and V system for ICOPS which satisfies international standards was developed. Although several further detailed design works remain, the function of ICOPS has been confirmed. The ICOPS will be applied to APR+ project, and the further works will be performed in following project

  14. Development of In-Core Protection System

    Energy Technology Data Exchange (ETDEWEB)

    Cho, J. H; Kim, C. H.; Kim, J. H.; Jeong, S. H.; Sohn, S. D.; BaeK, S. M.; YOON, J. H. [KEPCO Engineering and Construction Co., Deajeon (Korea, Republic of)

    2016-10-15

    In-core Protection System (ICOPS) is an on-line digital computer system which continuously calculates Departure from Nucleate Boiling Ratio (DNBR) and Local Power Density (LPD) based on plant parameters to make trip decisions based on the computations. The function of the system is the same as that of Core Protection Calculator System (CPCS) and Reactor Core Protection System (RCOPS) which are applied to Optimized Power Reactor 1000 (OPR1000) and Advanced Power Reactor 1400 (APR1400). The ICOPS has been developed to overcome the algorithm related obstacles in overseas project. To achieve this goal, several algorithms were newly developed and hardware and software design was updated. The functional design requirements document was developed by KEPCO-NF and the component design was conducted by Doosan. System design and software implementation were performed by KEPCO-E and C, and software Verification and Validation (V and V) was performed by KEPCO-E and C and Sure Softtech. The ICOPS has been developed to overcome the algorithm related obstacles in overseas project. The function of I/O simulator was improved even though the hardware platform is the same as that of RCOPS for Shin-Hanul 1 and 2. SCADE was applied to the implementation of ICOPS software, and the V and V system for ICOPS which satisfies international standards was developed. Although several further detailed design works remain, the function of ICOPS has been confirmed. The ICOPS will be applied to APR+ project, and the further works will be performed in following project.

  15. TOUGH2 software qualification

    International Nuclear Information System (INIS)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2

  16. Software Configurable Multichannel Transceiver

    Science.gov (United States)

    Freudinger, Lawrence C.; Cornelius, Harold; Hickling, Ron; Brooks, Walter

    2009-01-01

    Emerging test instrumentation and test scenarios increasingly require network communication to manage complexity. Adapting wireless communication infrastructure to accommodate challenging testing needs can benefit from reconfigurable radio technology. A fundamental requirement for a software-definable radio system is independence from carrier frequencies, one of the radio components that to date has seen only limited progress toward programmability. This paper overviews an ongoing project to validate the viability of a promising chipset that performs conversion of radio frequency (RF) signals directly into digital data for the wireless receiver and, for the transmitter, converts digital data into RF signals. The Software Configurable Multichannel Transceiver (SCMT) enables four transmitters and four receivers in a single unit the size of a commodity disk drive, programmable for any frequency band between 1 MHz and 6 GHz.

  17. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: Choosing the right architecture for the market – laboratory, military, or commercial Hardware platforms – FPGAs, GPPs, specialized and hybrid devices Standardization efforts to ens...

  18. TOUGH2 software qualification

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  19. Scaling NS-3 DCE Experiments on Multi-Core Servers

    Science.gov (United States)

    2016-06-15

    MPTCP) using the same software in DCE. In the experiment, only two wireless links (LTE and Wi-Fi) are set up to examine MPTCP, resulting in limited... performance drop on the blade server. Our investigation then turned to other straightforward measures including the following: • We reduced the amount... simulation with varying numbers of cores and measured the run time. To pin the simulation to a specific set of cores, we switched from using...
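    One common way to pin a simulation to a specific set of cores, as the snippet above alludes to, is the Linux CPU-affinity interface. The sketch below uses Python's os.sched_setaffinity as an assumed stand-in; it is not the tooling used in the cited work.

```python
import os
import subprocess

# Restrict this process (and any children it spawns) to cores 0-3 before
# launching the simulation. os.sched_setaffinity is Linux-specific, and the
# chosen core IDs assume the machine actually has at least four cores.
os.sched_setaffinity(0, {0, 1, 2, 3})
print("running on cores:", sorted(os.sched_getaffinity(0)))

# Hypothetical simulation command; child processes inherit the affinity mask.
subprocess.run(["echo", "simulation placeholder"])
```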

  20. Replaceable LMFBR core components

    International Nuclear Information System (INIS)

    Evans, E.A.; Cunningham, G.W.

    1976-01-01

    Much progress has been made in understanding material and component performance in the high temperature, fast neutron environment of the LMFBR. Current data have provided strong assurance that the initial core component lifetime objectives of FFTF and CRBR can be met. At the same time, this knowledge translates directly into the need for improved core designs that utilize improved materials and advanced fuels required to meet objectives of low doubling times and extended core component lifetimes. An industrial base for the manufacture of quality core components has been developed in the US, and all procurements for the first two core equivalents for FFTF will be completed this year. However, the problem of fabricating recycled plutonium while dramatically reducing fabrication costs, minimizing personnel exposure, and protecting public health and safety must be addressed

  1. Lunar Core and Tides

    Science.gov (United States)

    Williams, J. G.; Boggs, D. H.; Ratcliff, J. T.

    2004-01-01

    Variations in rotation and orientation of the Moon are sensitive to solid-body tidal dissipation, dissipation due to relative motion at the fluid-core/solid-mantle boundary, and tidal Love number k2 [1,2]. There is weaker sensitivity to flattening of the core-mantle boundary (CMB) [2,3,4] and fluid core moment of inertia [1]. Accurate Lunar Laser Ranging (LLR) measurements of the distance from observatories on the Earth to four retroreflector arrays on the Moon are sensitive to lunar rotation and orientation variations and tidal displacements. Past solutions using the LLR data have given results for dissipation due to solid-body tides and fluid core [1] plus Love number [1-5]. Detection of CMB flattening, which in the past has been marginal but improving [3,4,5], now seems significant. Direct detection of the core moment has not yet been achieved.

  2. Internal core tightener

    International Nuclear Information System (INIS)

    Brynsvold, G.V.; Snyder, H.J. Jr.

    1976-01-01

    An internal core tightener is disclosed which is a linear actuated (vertical actuation motion) expanding device utilizing a minimum of moving parts to perform the lateral tightening function. The key features are: (1) large contact areas to transmit loads during reactor operation; (2) actuation cam surfaces loaded only during clamping and unclamping operation; (3) separation of the parts and internal operation involved in the holding function from those involved in the actuation function; and (4) preloaded pads with compliant travel at each face of the hexagonal assembly at the two clamping planes to accommodate thermal expansion and irradiation induced swelling. The latter feature enables use of a "fixed" outer core boundary, and thus eliminates the uncertainty in gross core dimensions, and potential for rapid core reactivity changes as a result of core dimensional change. 5 claims, 12 drawing figures

  3. Guidance and Control Software,

    Science.gov (United States)

    1980-05-01

    user, by forcing him subconsciously to make faster decisions than necessary and giving him fewer choices than possible. It may be compared to the... reprogramming, and two real-time references. Interfaced to the main computer but still within the same physical case are a 12-bit HUD processor, a HDD... redesigned and reprogrammed many areas of the UPDATE I Mission Software to rectify this problem. The lesson learned was that the in-house staff must devise

  4. ThermalTracker Software

    Energy Technology Data Exchange (ETDEWEB)

    2016-08-10

    The software processes recorded thermal video and detects the flight tracks of birds and bats that passed through the camera's field of view. The output is a set of images that show complete flight tracks for any detections, with the direction of travel indicated and the thermal image of the animal delineated. A report of the descriptive features of each detected track is also output in the form of a comma-separated value text file.

  5. Antenna Controller Replacement Software

    Science.gov (United States)

    Chao, Roger Y.; Morgan, Scott C.; Strain, Martha M.; Rockwell, Stephen T.; Shimizu, Kenneth J.; Tehrani, Barzia J.; Kwok, Jaclyn H.; Tuazon-Wong, Michelle; Valtier, Henry; Nalbandi, Reza; hide

    2010-01-01

    The Antenna Controller Replacement (ACR) software accurately points and monitors the Deep Space Network (DSN) 70-m and 34-m high-efficiency (HEF) ground-based antennas that are used to track primarily spacecraft and, periodically, celestial targets. To track a spacecraft, or other targets, the antenna must be accurately pointed at the spacecraft, which can be very far away with very weak signals. ACR's conical scanning capability collects the signal in a circular pattern around the target, calculates the location of the strongest signal, and adjusts the antenna pointing to point directly at the spacecraft. A real-time, closed-loop servo control algorithm performed every 0.02 second allows accurate positioning of the antenna in order to track these distant spacecraft. Additionally, this advanced servo control algorithm provides better antenna pointing performance in windy conditions. The ACR software provides high-level commands that provide a very easy user interface for the DSN operator. The operator only needs to enter two commands to start the antenna and subreflector, and Master Equatorial tracking. The most accurate antenna pointing is accomplished by aligning the antenna to the Master Equatorial, which, because of its small size and sheltered location, has the most stable pointing. The antenna has hundreds of digital and analog monitor points. The ACR software provides compact displays to summarize the status of the antenna, subreflector, and the Master Equatorial. The ACR software has two major functions. First, it performs all of the steps required to accurately point the antenna (and subreflector and Master Equatorial) at the spacecraft (or celestial target). This involves controlling the antenna/subreflector/Master-Equatorial hardware, initiating and monitoring the correct sequence of operations, calculating the position of the spacecraft relative to the antenna, executing the real-time servo control algorithm to maintain the correct position, and
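    The conical-scan step described above can be illustrated with a small calculation: sample the received power around a circle of pointing offsets and estimate the direction of the strongest signal as the power-weighted mean of the offset vectors. This is a simplified illustration only, not the ACR servo algorithm.

```python
import math


def conical_scan_correction(samples):
    """samples: list of (angle_rad, power). Return a (dx, dy) pointing-correction estimate."""
    total = sum(p for _, p in samples)
    dx = sum(p * math.cos(a) for a, p in samples) / total
    dy = sum(p * math.sin(a) for a, p in samples) / total
    return dx, dy


# Synthetic scan: received power peaks near 45 degrees, so the estimated
# correction should point toward that azimuth.
scan = [(math.radians(a), 1.0 + 0.3 * math.cos(math.radians(a - 45))) for a in range(0, 360, 30)]
dx, dy = conical_scan_correction(scan)
print(f"apply offset toward {math.degrees(math.atan2(dy, dx)):.1f} degrees")
```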

  6. Unified Engineering Software System

    Science.gov (United States)

    Purves, L. R.; Gordon, S.; Peltzman, A.; Dube, M.

    1989-01-01

    Collection of computer programs performs diverse functions in prototype engineering. NEXUS, NASA Engineering Extendible Unified Software system, is research set of computer programs designed to support full sequence of activities encountered in NASA engineering projects. Sequence spans preliminary design, design analysis, detailed design, manufacturing, assembly, and testing. Primarily addresses process of prototype engineering, task of getting single or small number of copies of product to work. Written in FORTRAN 77 and PROLOG.

  7. Software trace cache

    OpenAIRE

    Ramírez Bellido, Alejandro; Larriba Pey, Josep; Valero Cortés, Mateo

    2005-01-01

    We explore the use of compiler optimizations, which optimize the layout of instructions in memory. The target is to enable the code to make better use of the underlying hardware resources regardless of the specific details of the processor/architecture in order to increase fetch performance. The Software Trace Cache (STC) is a code layout algorithm with a broader target than previous layout optimizations. We target not only an improvement in the instruction cache hit rate, but also an increas...
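    The layout idea sketched in this record can be illustrated with a greedy chain-building pass over profile data: repeatedly place the hottest unplaced successor next in memory so that frequently taken paths become sequential. The block names and edge counts below are invented, and this is a generic profile-guided layout sketch rather than the STC algorithm itself.

```python
from collections import defaultdict

# Hypothetical profile: (source block, target block) -> taken count.
edge_counts = {
    ("A", "B"): 900, ("A", "C"): 100,
    ("B", "D"): 850, ("C", "D"): 90,
    ("D", "A"): 700,
}


def greedy_layout(edges):
    """Order basic blocks by repeatedly following the hottest unplaced successor."""
    succ = defaultdict(list)
    for (src, dst), count in sorted(edges.items(), key=lambda kv: -kv[1]):
        succ[src].append(dst)

    blocks = {s for s, _ in edges} | {d for _, d in edges}
    placed, layout = set(), []
    current = max(edges, key=edges.get)[0]  # start at the source of the hottest edge
    while current is not None:
        layout.append(current)
        placed.add(current)
        unplaced = [d for d in succ[current] if d not in placed]
        current = unplaced[0] if unplaced else None

    # Append any blocks not reached along hot chains.
    layout += [b for b in blocks if b not in placed]
    return layout


print(greedy_layout(edge_counts))  # e.g. ['A', 'B', 'D', 'C']
```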

  8. Software for Avionics.

    Science.gov (United States)

    1983-01-01

    The general functions and the utilities provided, in particular through UNIX, are integrated according to various points of view: by their access through the...

  9. Real World Software Engineering

    Science.gov (United States)

    1994-07-15

    You put the new kid there and their first promotion is out of maintenance. Maintenance is not sufficiently emphasized as an important criterion for... the successful material from Koffman's CS1 pedagogy with a software-engineering-oriented Ada presentation order. Packages are introduced early and... Shumate, K. Understanding Ada. 2nd edition, John Wiley & Sons. This would make a CS1 book if it included more overall pedagogy, independent of language

  10. Hardening Software Defined Networks

    Science.gov (United States)

    2014-07-01

    ...balancers, traffic-shapers, and so on. SDN brings software and processing power to bear on all this complexity. While a large Data Center may be

  11. Addressing Software Security

    Science.gov (United States)

    Bailey, Brandon

    2015-01-01

    Historically, security within organizations was thought of as an IT function (web sites/servers, email, workstation patching, etc.). The threat landscape has evolved (Script Kiddies, Hackers, Advanced Persistent Threat (APT), Nation States, etc.) and the attack surface has expanded - networks are interconnected. Some security posture factors: Network Layer (Routers, Firewalls, etc.), Computer Network Defense (IPS/IDS, Sensors, Continuous Monitoring, etc.), Industrial Control Systems (ICS), Software Security (COTS, FOSS, Custom, etc.)

  12. Office software Individual coaching

    CERN Multimedia

    HR Department

    2010-01-01

    If one or several particular topics cause you sleepless nights, you can get the help of our trainer who will come to your workplace for a multiple of 1-hour slots. All fields in which our trainer can help are detailed in the course description in our training catalogue (Microsoft Office software, Adobe applications, i-applications, etc.). Please discover these new courses in our catalogue! Tel. 74924

  13. Software Project Management

    Science.gov (United States)

    1989-07-01

    incorporated into the system. Several interesting concepts are presented, but the bulk of the... the development organization. In some environments, a software product is developed on speculation that there is a market for... development houses find it necessary to know what the potential market for the product... b. Types of plans: There are a number of plans developed in

  14. Bicriterial Optimization of Software

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2006-01-01

    Full Text Available Two optimization criteria are defined for software analysis. For each criterion, solutions are defined in order to reach a minimum level. The effects of pursuing one objective over the other are analyzed. An aggregate function is developed, for which the composed level of the two criteria is determined. Based on this value, the optimum solution is selected.
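    A minimal sketch of the aggregate-function idea: two criteria (here runtime and memory, both lower-is-better) are normalized, combined with weights into a single composed level, and the candidate with the lowest level is selected. The criteria, weights, and candidate values are invented for illustration and are not taken from the article.

```python
# Candidate software variants scored on two criteria (lower is better for both).
candidates = {
    "variant_a": {"runtime_ms": 120, "memory_mb": 300},
    "variant_b": {"runtime_ms": 200, "memory_mb": 150},
    "variant_c": {"runtime_ms": 150, "memory_mb": 220},
}


def composed_levels(scores, w_runtime=0.6, w_memory=0.4):
    """Aggregate the two criteria into a single level after min-normalization."""
    min_rt = min(s["runtime_ms"] for s in scores.values())
    min_mem = min(s["memory_mb"] for s in scores.values())
    return {
        name: w_runtime * s["runtime_ms"] / min_rt + w_memory * s["memory_mb"] / min_mem
        for name, s in scores.items()
    }


levels = composed_levels(candidates)
best = min(levels, key=levels.get)
print(levels, "->", best)
```

    Changing the weights shifts which variant wins, which is exactly the trade-off analyzed when one objective is pursued over the other.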

  15. Standard software for CAMAC

    International Nuclear Information System (INIS)

    Lenkszus, F.R.

    1978-01-01

    The NIM Committee (National Instrumentation Methods Committee) of the U.S. Department of Energy and the ESONE Committee of European Laboratories have jointly specified standard software for use with CAMAC. Three general approaches were followed: the definition of a language called IML for use in CAMAC systems, the definition of a standard set of subroutine calls, and real-time extensions to the BASIC language. This paper summarizes the results of these efforts. 1 table

  16. ALMA software architecture

    Science.gov (United States)

    Schwarz, Joseph; Raffi, Gianni

    2002-12-01

    The Atacama Large Millimeter Array (ALMA) is a joint project involving astronomical organizations in Europe and North America. ALMA will consist of at least 64 12-meter antennas operating in the millimeter and sub-millimeter range. It will be located at an altitude of about 5000m in the Chilean Atacama desert. The primary challenge to the development of the software architecture is the fact that both its development and runtime environments will be distributed. Groups at different institutes will develop the key elements such as Proposal Preparation tools, Instrument operation, On-line calibration and reduction, and Archiving. The Proposal Preparation software will be used primarily at scientists' home institutions (or on their laptops), while Instrument Operations will execute on a set of networked computers at the ALMA Operations Support Facility. The ALMA Science Archive, itself to be replicated at several sites, will serve astronomers worldwide. Building upon the existing ALMA Common Software (ACS), the system architects will prepare a robust framework that will use XML-encoded entity objects to provide an effective solution to the persistence needs of this system, while remaining largely independent of any underlying DBMS technology. Independence of distributed subsystems will be facilitated by an XML- and CORBA-based pass-by-value mechanism for exchange of objects. Proof of concept (as well as a guide to subsystem developers) will come from a prototype whose details will be presented.

  17. Terra Harvest software architecture

    Science.gov (United States)

    Humeniuk, Dave; Klawon, Kevin

    2012-06-01

    Under the Terra Harvest Program, the DIA has the objective of developing a universal Controller for the Unattended Ground Sensor (UGS) community. The mission is to define, implement, and thoroughly document an open architecture that universally supports UGS missions, integrating disparate systems, peripherals, etc. The Controller's inherent interoperability with numerous systems enables the integration of both legacy and future UGS System (UGSS) components, while the design's open architecture supports rapid third-party development to ensure operational readiness. The successful accomplishment of these objectives by the program's Phase 3b contractors is demonstrated via integration of the companies' respective plug-'n'-play contributions that include controllers, various peripherals, such as sensors, cameras, etc., and their associated software drivers. In order to independently validate the Terra Harvest architecture, L-3 Nova Engineering, along with its partner, the University of Dayton Research Institute, is developing the Terra Harvest Open Source Environment (THOSE), a Java Virtual Machine (JVM) running on an embedded Linux Operating System. The Use Cases on which the software is developed support the full range of UGS operational scenarios such as remote sensor triggering, image capture, and data exfiltration. The Team is additionally developing an ARM microprocessor-based evaluation platform that is both energy-efficient and operationally flexible. The paper describes the overall THOSE architecture, as well as the design decisions for some of the key software components. Development process for THOSE is discussed as well.

  18. FPGAs for software programmers

    CERN Document Server

    Hannig, Frank; Ziener, Daniel

    2016-01-01

    This book makes powerful Field Programmable Gate Array (FPGA) and reconfigurable technology accessible to software engineers by covering different state-of-the-art high-level synthesis approaches (e.g., OpenCL and several C-to-gates compilers). It introduces FPGA technology, its programming model, and how various applications can be implemented on FPGAs without going through low-level hardware design phases. Readers will get a realistic sense for problems that are suited for FPGAs and how to implement them from a software designer’s point of view. The authors demonstrate that FPGAs and their programming model reflect the needs of stream processing problems much better than traditional CPU or GPU architectures, making them well-suited for a wide variety of systems, from embedded systems performing sensor processing to large setups for Big Data number crunching. This book serves as an invaluable tool for software designers and FPGA design engineers who are interested in high design productivity through behavi...

  19. Modular Software Performance Monitoring

    CERN Document Server

    Kruse, D F

    2011-01-01

    CPU clock frequency is not likely to be increased significantly in the coming years, and data analysis speed can be improved by using more processors or buying new machines, only if one is willing to change the paradigm to a parallel one. Therefore, performance monitoring procedures and tools are needed to help programmers to optimize existing software running on current and future hardware. Low level information from hardware performance counters is vital to spot specific performance problems slowing program execution. HEP software is often huge and complex, and existing tools are unable to give results with the required granularity. We will report on the approach we have chosen to solve this problem, which involves decomposing the application into parts and monitoring each of them separately. Both counting and sampling methods are used to allow an analysis with the required custom granularity: from global level, up to the function level. A set of tools (based on perfmon2 – a software interface to hardware co...

  20. Software reliability studies

    Science.gov (United States)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  1. Evidence of Absence software

    Science.gov (United States)

    Dalthorp, Daniel; Huso, Manuela M. P.; Dail, David; Kenyon, Jessica

    2014-01-01

    Evidence of Absence software (EoA) is a user-friendly application used for estimating bird and bat fatalities at wind farms and designing search protocols. The software is particularly useful in addressing whether the number of fatalities has exceeded a given threshold and what search parameters are needed to give assurance that thresholds were not exceeded. The software is applicable even when zero carcasses have been found in searches. Depending on the effectiveness of the searches, such an absence of evidence of mortality may or may not be strong evidence that few fatalities occurred. Under a search protocol in which carcasses are detected with nearly 100 percent certainty, finding zero carcasses would be convincing evidence that overall mortality rate was near zero. By contrast, with a less effective search protocol with low probability of detecting a carcass, finding zero carcasses does not rule out the possibility that large numbers of animals were killed but not detected in the searches. EoA uses information about the search process and scavenging rates to estimate detection probabilities to determine a maximum credible number of fatalities, even when zero or few carcasses are observed.
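    As a simplified illustration of the reasoning described above (not the Bayesian model implemented in EoA), the sketch below asks: given a per-carcass detection probability g and zero carcasses found, what is the largest fatality count still consistent with that observation at a chosen credibility level?

```python
def max_credible_fatalities(g, alpha=0.05, m_max=10_000):
    """Largest M such that P(detect none of M carcasses) = (1 - g)**M >= alpha.

    g is the probability that any single carcass is detected by the search
    protocol; a small g (ineffective searches) leaves a large M consistent
    with finding zero carcasses.
    """
    m = 0
    while m < m_max and (1.0 - g) ** (m + 1) >= alpha:
        m += 1
    return m


for g in (0.05, 0.25, 0.75):
    print(f"g = {g:.2f}: up to {max_credible_fatalities(g)} fatalities consistent with zero carcasses")
```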

  2. Addressing Software Engineering Issues in Real-Time Software ...

    African Journals Online (AJOL)

    Addressing Software Engineering Issues in Real-Time Software ... systems, manufacturing process, process control, military, space exploration, and ... but also physical properties such as timeliness, Quality of Service and reliability.

  3. Software maintenance and evolution and automated software engineering

    NARCIS (Netherlands)

    Carver, Jeffrey C.; Serebrenik, Alexander

    2018-01-01

    This issue's column reports on the 33rd International Conference on Software Maintenance and Evolution and 32nd International Conference on Automated Software Engineering. Topics include flaky tests, technical debt, QA bots, and regular expressions.

  4. Computer games and software engineering

    CERN Document Server

    Cooper, Kendra M L

    2015-01-01

    Computer games represent a significant software application domain for innovative research in software engineering techniques and technologies. Game developers, whether focusing on entertainment-market opportunities or game-based applications in non-entertainment domains, thus share a common interest with software engineers and developers on how to best engineer game software. Featuring contributions from leading experts in software engineering, the book provides a comprehensive introduction to computer game software development that includes its history as well as emerging research on the inte

  5. Terminological recommendations for software localization

    Directory of Open Access Journals (Sweden)

    Klaus-Dirk Schmitz

    2012-08-01

    Full Text Available After an explosive growth of data processing and software starting at the beginning of the 1980s, the software industry shifted toward a strong orientation in non-US markets at the beginning of the 1990s. Today we see the global marketing of software in almost all regions of the world. Since software is no longer used by IT experts only, and since European and national regulations require user interfaces, manuals and documentation to be provided in the language of the customer, the market for software translation, i.e. for software localization, is the fastest growing market in the translation business.

  6. Terminological recommendations for software localization

    Directory of Open Access Journals (Sweden)

    Klaus-Dirk Schmitz

    2009-03-01

    Full Text Available After an explosive growth of data processing and software starting at the beginning of the 1980s, the software industry shifted toward a strong orientation in non-US markets at the beginning of the 1990s. Today we see the global marketing of software in almost all regions of the world. Since software is no longer used by IT experts only, and since European and national regulations require user interfaces, manuals and documentation to be provided in the language of the customer, the market for software translation, i.e. for software localization, is the fastest growing market in the translation business.

  7. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by the adaptation of standard software packages for manufacturing control. After investigation and test of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC-programming. Experiences gained from process control are planned to be investigated for discrete parts manufacturing....

  8. Imprinting Community College Computer Science Education with Software Engineering Principles

    Science.gov (United States)

    Hundley, Jacqueline Holliday

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and maintenance. We proposed that some software engineering principles can be incorporated into the introductory-level of the computer science curriculum. Our vision is to give community college students a broader exposure to the software development lifecycle. For those students who plan to transfer to a baccalaureate program subsequent to their community college education, our vision is to prepare them sufficiently to move seamlessly into mainstream computer science and software engineering degrees. For those students who plan to move from the community college to a programming career, our vision is to equip them with the foundational knowledge and skills required by the software industry. To accomplish our goals, we developed curriculum modules for teaching seven of the software engineering knowledge areas within current computer science introductory-level courses. Each module was designed to be self-supported with suggested learning objectives, teaching outline, software tool support, teaching activities, and other material to assist the instructor in using it.

  9. Presenting an Evaluation Model for the Cancer Registry Software.

    Science.gov (United States)

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer incidence continues to grow, cancer registries are of great importance as the core of cancer control programs, and many different software packages have been designed for this purpose. Establishing a comprehensive evaluation model is therefore essential to evaluate and compare such a wide range of software. In this study, the criteria for cancer registry software were determined by studying the relevant documents and two functional software packages in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the validation results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, once the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises both a tool and a method of evaluation. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. Based on the findings, a criteria-based evaluation method was chosen. The model encompasses the various dimensions of cancer registry software and a proper method for evaluating them. The strong point of this evaluation model is the separation between general and specific criteria while striving for comprehensive coverage of the criteria. Since this model has been validated, it can be used as a standard for evaluating cancer registry software.
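
    The 75% agreement threshold can be read as simple percent agreement per checklist criterion. The sketch below is only an illustration of that reading, with hypothetical criteria and votes; it is not the study's actual checklist or analysis.

```python
# Hypothetical expert votes on checklist criteria (1 = approve, 0 = reject).
responses = {
    "patient demographics recorded":     [1, 1, 1, 1, 1, 1, 0, 1],
    "tumour morphology coded":           [1, 1, 0, 1, 1, 1, 1, 1],
    "duplicate-record detection":        [1, 0, 0, 1, 0, 1, 1, 0],
}

THRESHOLD = 0.75  # criteria below this agreement level are revised

for criterion, votes in responses.items():
    agreement = sum(votes) / len(votes)
    verdict = "keep" if agreement >= THRESHOLD else "revise"
    print(f"{criterion:35s} agreement = {agreement:.2f} -> {verdict}")
```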

  10. ATLAS software configuration and build tool optimisation

    Science.gov (United States)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    multi-core computing resources utilisation, and considerably improved software developer and user experience.

  11. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  12. Software To Go: A Catalog of Software Available for Loan.

    Science.gov (United States)

    Kurlychek, Ken, Comp.

    This catalog lists the holdings of the Software To Go software lending library and clearinghouse for programs and agencies serving students or clients who are deaf or hard of hearing. An introduction describes the clearinghouse and its collection of software, much of it commercial and copyrighted material, for Apple, Macintosh, and IBM (MS-DOS)…

  13. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  14. The Software Invention Cube: A classification scheme for software inventions

    NARCIS (Netherlands)

    Bergstra, J.A.; Klint, P.

    2008-01-01

    The patent system protects inventions. The requirement that a software invention should make ‘a technical contribution’ turns out to be untenable in practice and this raises the question, what constitutes an invention in the realm of software. The authors developed the Software Invention Cube

  15. Earth's inner core: Innermost inner core or hemispherical variations?

    NARCIS (Netherlands)

    Lythgoe, K. H.; Deuss, A.|info:eu-repo/dai/nl/412396610; Rudge, J. F.; Neufeld, J. A.

    2014-01-01

    The structure of Earth's deep inner core has important implications for core evolution, since it is thought to be related to the early stages of core formation. Previous studies have suggested that there exists an innermost inner core with distinct anisotropy relative to the rest of the inner core.

  16. Dependence of Core and Extended Flux on Core Dominance ...

    Indian Academy of Sciences (India)

    Abstract. Based on two extragalactic radio source samples, the core dominance parameter is calculated, and the correlations between the core/extended flux density and core dominance parameter are investigated. When the core dominance parameter is lower than unity, it is linearly correlated with the core flux density, ...

  17. Korrelasjon mellom core styrke, core stabilitet og utholdende styrke i core

    OpenAIRE

    Berg-Olsen, Andrea Marie; Fugelsøy, Eivor; Maurstad, Ann-Louise

    2010-01-01

    The purpose of the study was to examine the correlation between core strength, core stability and core endurance strength. The testing consisted of three main parts, in which we tested core strength, core stability and core endurance strength. For core strength and core endurance strength, three different tests were performed; for the measurement of core stability, only one test was carried out. For core strength, isometric abdominal flexion, isometric back extension and isometric lateral flexion were tested. Sit-ups p...

  18. Design of software platform based on linux operating system for γ-spectrometry instrument

    International Nuclear Information System (INIS)

    Hong Tianqi; Zhou Chen; Zhang Yongjin

    2008-01-01

    This paper describes the design of a γ-spectrometry instrument software platform based on the S3C2410A processor with an ARM920T core; emphasis is placed on analyzing the integrated application of the embedded Linux operating system, the YAFFS file system and the Qt/Embedded GUI development library. It presents a new software platform for a portable γ-measurement instrument. (authors)

  19. Windscale pile core surveys

    International Nuclear Information System (INIS)

    Curtis, R.F.; Mathews, R.F.

    1996-01-01

    The two Windscale Piles were closed down, defueled as far as possible and mothballed for thirty years following a fire in the core of Pile 1 in 1957 resulting from the spontaneous release of stored Wigner energy in the graphite moderator. Decommissioning of the reactors commenced in 1987 and has reached the stage where the condition of both cores needs to be determined. To this end, non-intrusive and intrusive surveys and sampling of the cores have been planned and partly implemented. The objectives for each Pile differ slightly. The location and quantity of fuel remaining in the damaged core of Pile 1 needed to be established, whereas the removal of all fuel from Pile 2 needed to be confirmed. In Pile 1, the possible existence of a void in the core is to be explored and in Pile 2, the level of Wigner energy remaining required to be quantified. Levels of radioactivity in both cores needed to be measured. The planning of the surveys is described including strategy, design, safety case preparation and the remote handling and viewing equipment required to carry out the inspection, sampling and monitoring work. The results from the completed non-intrusive survey of Pile 2 are summarised. They confirm that the core is empty and the graphite is in good condition. The survey of Pile 1 has just started. (UK)

  20. Delivering LHC software to HPC compute elements

    CERN Document Server

    Blomer, Jakob; Hardi, Nikola; Popescu, Radu

    2017-01-01

    In recent years, there has been growing interest in improving the utilization of supercomputers by running applications of experiments at the Large Hadron Collider (LHC) at CERN when idle cores cannot be assigned to traditional HPC jobs. At the same time, the upcoming LHC machine and detector upgrades will produce some 60 times higher data rates and challenge LHC experiments to use so far untapped compute resources. LHC experiment applications are tailored to run on high-throughput computing resources and they have a different anatomy from HPC applications. LHC applications comprise a core framework that allows hundreds of researchers to plug in their specific algorithms. The software stacks easily accumulate to many gigabytes for a single release. New releases are often produced on a daily basis. To facilitate the distribution of these software stacks to world-wide distributed computing resources, LHC experiments use a purpose-built, global, POSIX file system, the CernVM File System. CernVM-FS pre-processes dat...

  1. Software Engineering for Human Spaceflight

    Science.gov (United States)

    Fredrickson, Steven E.

    2014-01-01

    The Spacecraft Software Engineering Branch of NASA Johnson Space Center (JSC) provides world-class products, leadership, and technical expertise in software engineering, processes, technology, and systems management for human spaceflight. The branch contributes to major NASA programs (e.g. ISS, MPCV/Orion) with in-house software development and prime contractor oversight, and maintains the JSC Engineering Directorate CMMI rating for flight software development. Software engineering teams work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements. They seek to infuse automation and autonomy into missions, and apply new technologies to flight processor and computational architectures. This presentation will provide an overview of key software-related projects, software methodologies and tools, and technology pursuits of interest to the JSC Spacecraft Software Engineering Branch.

  2. Software quality concepts and practice

    CERN Document Server

    Galin, Daniel

    2018-01-01

    The book presents a comprehensive discussion of software quality issues and software quality assurance (SQA) principles and practices, and lays special emphasis on implementing and managing SQA. Primarily designed to serve three audiences (university and college students, vocational training participants, and software engineers and software development managers), the book may be applicable to all personnel engaged in software projects. Features: * A broad view of SQA. The book delves into SQA issues, going beyond the classic boundaries of custom-made software development to also cover in-house software development, subcontractors, and ready-made software. * Up-to-date, wide-ranging coverage of SQA and SQA-related topics, including topics hardly explored until now in SQA texts. * A systematic presentation of the SQA function and its tasks: establishing the SQA processes, planning, coordinating, follow-up, review and evaluation of SQA processes. * Fo...

  3. Engineering high quality medical software

    CERN Document Server

    Coronato, Antonio

    2018-01-01

    This book focuses on high-confidence medical software in the growing field of e-health, telecare services and health technology. It covers the development of methodologies and engineering tasks together with standards and regulations for medical software.

  4. Software Quality Assurance Audits Guidebooks

    Science.gov (United States)

    1990-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes that are used in software development. The Software Assurance Guidebook, NASA-GB-A201, issued in September, 1989, provides an overall picture of the NASA concepts and practices in software assurance. Second level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the second level Software Quality Assurance Audits Guidebook that describes software quality assurance audits in a way that is compatible with practices at NASA Centers.

  5. Selecting and Buying Educational Software.

    Science.gov (United States)

    Ahl, David H.

    1983-01-01

    Guidelines for selecting/buying educational software are discussed under the following headings: educational soundness; appropriateness; challenge and progress; motivation and reward; correctness; compatibility with systems; instructions and handling. Includes several sources of software reviews. (JN)

  6. Portable Medical Laboratory Applications Software

    OpenAIRE

    Silbert, Jerome A.

    1983-01-01

    Portability implies that a program can be run on a variety of computers with minimal software revision. The advantages of portability are outlined and design considerations for portable laboratory software are discussed. Specific approaches for achieving this goal are presented.

  7. Perspectives on Open Source Software

    National Research Council Canada - National Science Library

    Hissam, Scott

    2001-01-01

    Open source software (OSS) is emerging as the software community's next "silver bullet" and appears to be playing a significant role in the acquisition and development plans of the Department of Defense (DoD) and industry...

  8. Design Principles for Interactive Software

    DEFF Research Database (Denmark)

    The book addresses the crucial intersection of human-computer interaction (HCI) and software engineering by asking both what users require from interactive systems and what developers need to produce well-engineered software. Needs are expressed as...

  9. Core shroud corner joints

    Science.gov (United States)

    Gilmore, Charles B.; Forsyth, David R.

    2013-09-10

    A core shroud is provided, which includes a number of planar members, a number of unitary corners, and a number of subassemblies each comprising a combination of the planar members and the unitary corners. Each unitary corner comprises a unitary extrusion including a first planar portion and a second planar portion disposed perpendicularly with respect to the first planar portion. At least one of the subassemblies comprises a plurality of the unitary corners disposed side-by-side in an alternating opposing relationship. A plurality of the subassemblies can be combined to form a quarter perimeter segment of the core shroud. Four quarter perimeter segments join together to form the core shroud.

  10. IGCSE core mathematics

    CERN Document Server

    Wall, Terry

    2013-01-01

    Give your core level students the support and framework they require to get their best grades with this book dedicated to the core level content of the revised syllabus and written specifically to ensure a more appropriate pace. This title has been written for Core content of the revised Cambridge IGCSE Mathematics (0580) syllabus for first teaching from 2013. * Gives students the practice they require to deepen their understanding through plenty of practice questions. * Consolidates learning with unique digital resources on the CD, included free with every book. We are working with Cambridge

  11. IT & C Projects Duration Assessment Based on Audit and Software Reengineering

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Full Text Available This paper analyses the effect of applying the core elements of software engineering and reengineering, probabilistic simulations and system development auditing to software development projects. Our main focus is reducing software development project duration. Due to the fast changing economy, the need for efficiency and productivity is greater than ever. Optimal allocation of resources has proved to be the main element contributing to an increase in efficiency.

  12. Extending Software Transactional Memory in Clojure with Side-Effects and Transaction Control

    DEFF Research Database (Denmark)

    Jensen, Søren Kejser; Thomsen, Lone Leth

    2016-01-01

    In conjunction with the increase in multi-core processors, the use of functional programming languages has increased in recent years. The functional language Clojure has concurrency as a core feature, and provides Software Transactional Memory (STM) as a substitute for locks. Transactions in Cloju...

  13. Modern software cybernetics: new trends

    OpenAIRE

    Yang, H; Chen, F; Aliyu, S

    2017-01-01

    Software cybernetics research applies a variety of techniques from cybernetics to software engineering research. For more than fifteen years, since 2001, there has been a dramatic increase in work relating to software cybernetics. From a cybernetics viewpoint, the work is mainly on the first-order level, namely, the software under obs...

  14. Gammasphere software development. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information.

  15. GNSS Software Receiver for UAVs

    DEFF Research Database (Denmark)

    Olesen, Daniel Madelung; Jakobsen, Jakob; von Benzon, Hans-Henrik

    2016-01-01

    This paper describes the current activities in GPS/GNSS software receiver development at DTU Space. GNSS software receivers have received a great deal of attention in the last two decades and numerous implementations have already been presented. DTU Space has just recently started development of our own GNSS software receiver targeted at mini-UAV applications, and we will in this paper present our current progress and briefly discuss the benefits of software receivers in relation to our research interests....

  16. Open source software and libraries

    OpenAIRE

    Randhawa, Sukhwinder

    2008-01-01

    Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not entail the initial cost of commercial software and enables libraries to have greater control over their working environment. Library professionals should be aware of the advantages of open source software and should be involved in its development. They should have basic knowledge about the selection, installation and main...

  17. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of reading, reviewing and analysing books, magazines and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue for and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  18. The Software Management Environment (SME)

    Science.gov (United States)

    Valett, Jon D.; Decker, William; Buell, John

    1988-01-01

    The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.

  19. Next generation software process improvement

    OpenAIRE

    Turnas, Daniel

    2003-01-01

    Approved for public release; distribution is unlimited. Software is often developed under a process that can at best be described as ad hoc. While it is possible to develop quality software under an ad hoc process, formal processes can be developed to help increase the overall quality of the software under development. The application of these processes allows an organization to mature. The software maturity level, and process improvement, of an organization can be measured with the Cap...

  20. Software engineering a practitioner's approach

    CERN Document Server

    Pressman, Roger S

    1997-01-01

    This indispensable guide to software engineering exploration enables practitioners to navigate the ins and outs of this rapidly changing field. Pressman's fully revised and updated Fourth Edition provides in-depth coverage of every important management and technical topic in software engineering. Moreover, readers will find the inclusion of the hottest developments in the field such as: formal methods and cleanroom software engineering, business process reengineering, and software reengineering.

  1. Calculation Software versus Illustration Software for Teaching Statistics

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl; Boyle, Robin G.

    1999-01-01

    As personal computers have become more and more powerful, so have the software packages available to us for teaching statistics. This paper investigates what software packages are currently being used by progressive statistics instructors at university level, examines some of the deficiencies of such software, and indicates features that statistics instructors wish to have incorporated in software in the future. The basis of the paper is a survey of participants at ICOTS-5 (the Fifth International Conference on Teaching Statistics). These survey results, combined with the software-based papers...

  2. Assessing Core Competencies

    Science.gov (United States)

    Narayanan, M.

    2004-12-01

    Catherine Palomba and Trudy Banta offer the following definition of assessment, adapted from one provided by Marchese in 1987: "Assessment is the systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development" (Palomba and Banta 1999). It is widely recognized that sophisticated computing technologies are becoming a key element in today's classroom instructional techniques. Regardless, the professor must be held responsible for creating an instructional environment in which the technology actually supplements the students' learning outcomes. Almost all academic disciplines have found a niche for computer-based instruction in their respective professional domains. In many cases, it is viewed as an essential and integral part of the educational process. Educational institutions are committing substantial resources to the establishment of dedicated technology-based laboratories, so that they will be able to accommodate and fulfill students' desire to master certain of these specific skills. This type of technology-based instruction may raise some fundamental questions about the core competencies of the student learner. Some of the most important questions are: 1. Is the utilization of these fast, high-powered computers and user-friendly software programs creating a totally non-challenging instructional environment for the student learner? 2. Can technology itself all too easily overshadow the intended learning outcomes? 3. Are educational institutions simply training students how to use technology rather than educating them in the appropriate field? 4. Are we still teaching content-driven courses and analysis-oriented subject matter? 5. Are these sophisticated modern-era technologies contributing to a decline in the critical thinking capabilities of 21st-century technology-savvy students? The author tries to focus on technology as a tool and not on the technology

  3. Automated Software Vulnerability Analysis

    Science.gov (United States)

    Sezer, Emre C.; Kil, Chongkyung; Ning, Peng

    Despite decades of research, software continues to have vulnerabilities. Successful exploitations of these vulnerabilities by attackers cost millions of dollars to businesses and individuals. Unfortunately, most effective defensive measures, such as patching and intrusion prevention systems, require an intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed. However, the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effort involved and the slow analysis process are unfavorable for timely defensive measures to be deployed. The problem is exacerbated by zero-day attacks.

  4. Software del sistema osteomioarticular

    Directory of Open Access Journals (Sweden)

    Dianelys León Medina

    2015-06-01

    Full Text Available Introduction: the application of Information and Communication Technologies in teaching is one of the lines laid out by the Cuban health system for medical science students. Anatomy is one of the sciences that make up the discipline of Morphophysiology, and to facilitate its comprehension it is necessary to use resources and strategies, among which educational software can figure. Objective: to design a software application on the anatomy of the osteomyoarticular system of the head and neck for first-year students of the Stomatology (Dentistry) programme in Pinar del Río. Material and method: a qualitative study was carried out. The dialectical materialist method was used to obtain the results. For the dialectics of the development of the studied process, theoretical and empirical methods were used, along with an in-depth study of the topic, the type of platform to be used, and the criteria of specialists. Non-parametric descriptive and inferential statistical techniques were used for processing. Results: given the need to strengthen students' skills in the use of information and communication technologies, the software on the anatomy of the osteomyoarticular system of the head and neck, "Aprendiendo anatomía", was developed using images, videos and texts. Conclusions: the software constitutes a contribution to the teaching-learning process, facilitating independent work and self-preparation through interactivity with the content, feedback and evaluation using modern teaching aids, which contributes to the professional performance of the future graduate.

  5. Office software Individual coaching

    CERN Multimedia

    HR Department

    2010-01-01

    If one or several particular topics cause you sleepless nights, you can get help from our trainer, who will come to your workplace for one or more 1-hour slots. All fields in which our trainer can help are detailed in the course descriptions in our training catalogue (Microsoft Office software, Adobe applications, i-applications, etc.). Discover these new courses in our catalogue! http://cta.cern.ch/cta2/f?p=110:9 Technical Training Service Technical.Training@cern.ch Tel 74924

  6. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    Software process improvement (SPI) is around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...... to new specialized frameworks. New and specialized frameworks account for the majority of the contributions found (approx. 38%). Furthermore, we find a growing interest in success factors (approx. 16%) to aid companies in conducting SPI and in adapting agile principles and practices for SPI (approx. 10...

  7. FASTBUS software workshop

    International Nuclear Information System (INIS)

    1985-01-01

    FASTBUS is a standard for modular high-speed data acquisition, data-processing and control, developed for use in high-energy physics experiments incorporating different types of computers and microprocessors. This Workshop brought together users from different laboratories for a review of current software activities, using the standard both in experiments and for test equipment. There are also papers on interfacing and the present state of systems being developed for use in future LEP experiments. Also included is a discussion on the proposed revision of FASTBUS Standard Routines. (orig.)

  8. Software Startups - A Research Agenda

    Directory of Open Access Journals (Sweden)

    Michael Unterkalmsteiner

    2016-10-01

    Full Text Available Software startup companies develop innovative, software-intensive products within limited time frames and with few resources, searching for sustainable and scalable business models. Software startups are quite distinct from traditional mature software companies, but also from micro-, small-, and medium-sized enterprises, introducing new challenges relevant for software engineering research. This paper's research agenda focuses on software engineering in startups, identifying, in particular, 70+ research questions in the areas of supporting startup engineering activities, startup evolution models and patterns, ecosystems and innovation hubs, human aspects in software startups, applying startup concepts in non-startup environments, and methodologies and theories for startup research. We connect and motivate this research agenda with past studies in software startup research, while pointing out possible future directions. While all authors of this research agenda have their main background in Software Engineering or Computer Science, their interest in software startups broadens the perspective to the challenges, but also to the opportunities that emerge from multi-disciplinary research. Our audience is therefore primarily software engineering researchers, even though we aim at stimulating collaborations and research that crosses disciplinary boundaries. We believe that with this research agenda we cover a wide spectrum of the software startup industry current needs.

  9. Desiderata for Linguistic Software Design

    Science.gov (United States)

    Garretson, Gregory

    2008-01-01

    This article presents a series of guidelines both for researchers in search of software to be used in linguistic analysis and for programmers designing such software. A description of the intended audience and the types of software under consideration and a review of some relevant literature are followed by a discussion of several important…

  10. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  11. Software that meets its Intent

    NARCIS (Netherlands)

    Huisman, Marieke; Bos, Herbert; Brinkkemper, Sjaak; van Deursen, Arie; Groote, Jan Friso; Lago, Patricia; van de Pol, Jaco; Visser, Eelco; Margaria, Tiziana; Steffen, Bernhard

    2016-01-01

    Software is widely used, and society increasingly depends on its reliability. However, software has become so complex and it evolves so quickly that we fail to keep it under control. Therefore, we propose intents: fundamental laws that capture a software system's intended behavior (resilient,

  12. Free Software and Free Textbooks

    Science.gov (United States)

    Takhteyev, Yuri

    2012-01-01

    Some of the world's best and most sophisticated software is distributed today under "free" or "open source" licenses, which allow the recipients of such software to use, modify, and share it without paying royalties or asking for permissions. If this works for software, could it also work for educational resources, such as books? The economics of…

  13. Software Products - Naval Oceanography Portal

    Science.gov (United States)

    Software Products from the USNO Astronomical Applications Department include MICA - Multiyear Interactive Computer Almanac. MICA is an

  14. NCEP BUFRLIB Software User Guide

    Science.gov (United States)

    BUFRLIB Software User Guide: This document set describes how to use the NCEP BUFRLIB software to encode or decode BUFR messages. It is not intended to be a primer on the basic concepts of BUFR and will focus solely on how to use the BUFRLIB software

  15. SEI Software Engineering Education Directory.

    Science.gov (United States)

    1987-02-01

    Fragmentary excerpt from the directory listing: course titles (e.g., Introduction to Software Engineering; Software Engineering Management, EENG543), associated textbooks (e.g., by Kotler, Cravens, and Kroncke), and instructor contact details for participating institutions.

  16. Entropy based software processes improvement

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Kriek, D.; Siemons, P.

    2009-01-01

    Actual results of software process improvement projects show different levels of success. Although many software development organisations have adopted improvement models such as CMMI, it appears to be difficult to improve software development processes in the right way, e.g. tuned to the actual

  17. Reflections on Software Engineering Education

    NARCIS (Netherlands)

    van Vliet, H.

    2006-01-01

    In recent years, the software engineering community has focused on organizing its existing knowledge and finding opportunities to transform that knowledge into a university curriculum. SWEBOK (the Guide to the Software Engineering Body of Knowledge) and Software Engineering 2004 are two initiatives

  18. Nuclear medicine software: safety aspects

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    A brief editorial discusses the safety aspects of nuclear medicine software. Topics covered include some specific features which should be incorporated into a well-written piece of software, some specific points regarding software testing and legal liability if inappropriate medical treatment was initiated as a result of information derived from a piece of clinical apparatus incorporating a malfunctioning computer program. (U.K.)

  19. Calidad del software: camino hacia una verdadera industria del software

    Directory of Open Access Journals (Sweden)

    Saulo Ernesto Rojas Salamanca

    1999-07-01

    Full Text Available Software is perhaps one of the engineering products that has evolved the most in a very short time, moving from empirical or artisanal software to software developed under the principles and tools of software engineering. Within these changes, however, the people in charge of building software have faced very common problems: some due to the ever-greater demands on what software must deliver, driven by constantly changing conditions that increase its complexity and obsolescence; and others due to the lack of adequate tools and organisational standards aimed at improving software development processes. This article is oriented towards the search for mechanisms to solve the latter problems...

  20. Heterogeneous gas core reactor

    International Nuclear Information System (INIS)

    Diaz, N.J.; Dugan, E.T.

    1983-01-01

    A heterogeneous gas core nuclear reactor is disclosed comprising a core barrel provided interiorly with an array of moderator-containing tubes and being otherwise filled with a fissile and/or fertile gaseous fuel medium. The fuel medium may be flowed through the chamber and through an external circuit in which heat is extracted. The moderator may be a fluid which is flowed through the tubes and through an external circuit in which heat is extracted. The moderator may be a solid which may be cooled by a fluid flowing within the tubes and through an external heat extraction circuit. The core barrel is surrounded by moderator/coolant material. Fissionable blanket material may be disposed inwardly or outwardly of the core barrel

  1. iPSC Core

    Data.gov (United States)

    Federal Laboratory Consortium — The induced Pluripotent Stem Cells (iPSC) Core was created in 2011 to accelerate stem cell research in the NHLBI by providing investigators consultation, technical...

  2. PWR degraded core analysis

    International Nuclear Information System (INIS)

    Gittus, J.H.

    1982-04-01

    A review is presented of the various phenomena involved in degraded core accidents and the ensuing transport of fission products from the fuel to the primary circuit and the containment. The dominant accident sequences found in the PWR risk studies published to date are briefly described. Then chapters deal with the following topics: the condition and behaviour of water reactor fuel during normal operation and at the commencement of degraded core accidents; the generation of hydrogen from the Zircaloy-steam and the steel-steam reactions; the way in which the core deforms and finally melts following loss of coolant; debris relocation analysis; containment integrity; fission product behaviour during a degraded core accident. (U.K.)

  3. Restraint system for core elements of a reactor core

    International Nuclear Information System (INIS)

    Class, G.

    1975-01-01

    In a nuclear reactor, a core element bundle formed of a plurality of side-by-side arranged core elements is surrounded by restraining elements that exert a radially inwardly directed restraining force generating friction forces between the core elements in a restraining plane that is transverse to the core element axes. The adjoining core elements are in rolling contact with one another in the restraining plane by virtue of rolling-type bearing elements supported in the core elements. (Official Gazette)

  4. Heterogeneous gas core reactor

    International Nuclear Information System (INIS)

    Han, K.I.

    1977-01-01

    Preliminary investigations of a heterogeneous gas core reactor (HGCR) concept suggest that this potential power reactor offers distinct advantages over other existing or conceptual reactor power plants. One of the most favorable features of the HGCR is the flexibility of the power producing system which allows it to be efficiently designed to conform to a desired optimum condition without major conceptual changes. The arrangement of bundles of moderator/coolant channels in a fissionable gas or mixture of gases makes a truly heterogeneous nuclear reactor core. It is this full heterogeneity for a gas-fueled reactor core which accounts for the novelty of the heterogeneous gas core reactor concept and leads to noted significant advantages over previous gas core systems with respect to neutron and fuel economy, power density, and heat transfer characteristics. The purpose of this work is to provide an insight into the design, operating characteristics, and safety of a heterogeneous gas core reactor system. The studies consist mainly of neutronic, energetic and kinetic analyses of the power producing and conversion systems as a preliminary assessment of the heterogeneous gas core reactor concept and basic design. The results of the conducted research indicate a high potential for the heterogeneous gas core reactor system as an electrical power generating unit (either large or small), with an overall efficiency as high as 40 to 45%. The HGCR system is found to be stable and safe, under the conditions imposed upon the analyses conducted in this work, due to the inherent safety of an expanding gaseous fuel and the intrinsic feedback effects of the gas and water coolant

  5. ATLAS software stack on ARM64

    CERN Document Server

    Smith, Joshua Wyatt; The ATLAS collaboration

    2016-01-01

    The ATLAS experiment explores new hardware and software platforms that, in the future, may be more suited to its data intensive workloads. One such alternative hardware platform is the ARM architecture, which is designed to be extremely power efficient and is found in most smartphones and tablets. CERN openlab recently installed a small cluster of ARM 64-bit evaluation prototype servers. Each server is based on a single-socket ARM 64-bit system on a chip, with 32 Cortex-A57 cores. In total, each server has 128 GB RAM connected with four fast memory channels. This paper reports on the port of the ATLAS software stack onto these new prototype ARM64 servers. This included building the "external" packages that the ATLAS software relies on. Patches were needed to introduce this new architecture into the build as well as patches that correct for platform specific code that caused failures on non-x86 architectures. These patches were applied such that porting to further platforms will need no or only very little adj...

  6. Software for precise tracking of cell proliferation

    International Nuclear Information System (INIS)

    Kurokawa, Hiroshi; Noda, Hisayori; Sugiyama, Mayu; Sakaue-Sawano, Asako; Fukami, Kiyoko; Miyawaki, Atsushi

    2012-01-01

    Highlights: ► We developed software for analyzing cultured cells that divide as well as migrate. ► The active contour model (Snakes) was used as the core algorithm. ► The time backward analysis was also used for efficient detection of cell division. ► With user-interactive correction functions, the software enables precise tracking. ► The software was successfully applied to cells with fluorescently-labeled nuclei. -- Abstract: We have developed a multi-target cell tracking program TADOR, which we applied to a series of fluorescence images. TADOR is based on an active contour model that is modified in order to be free of the problem of locally optimal solutions, and thus is resistant to signal fluctuation and morphological changes. Due to adoption of backward tracing and addition of user-interactive correction functions, TADOR is used in an off-line and semi-automated mode, but enables precise tracking of cell division. By applying TADOR to the analysis of cultured cells whose nuclei had been fluorescently labeled, we tracked cell division and cell-cycle progression on coverslips over an extended period of time.
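
    TADOR itself is not distributed with this record, but the core step it repeats, fitting an active contour (Snakes) to a fluorescently labeled nucleus, can be sketched with scikit-image's active_contour on synthetic data. The parameter values below are illustrative only; a tracker would seed each frame's contour with the result from the previous frame and add the backward-tracing and user-interactive correction layers described above.

```python
import numpy as np
from skimage.draw import disk
from skimage.filters import gaussian
from skimage.segmentation import active_contour

# Synthetic frame: one bright "nucleus" on a dark background.
frame = np.zeros((120, 120))
rr, cc = disk((60, 60), 18)
frame[rr, cc] = 1.0

# Initial contour: a circle placed loosely around the nucleus
# (a tracker would reuse the previous frame's contour here).
theta = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([60 + 30 * np.sin(theta), 60 + 30 * np.cos(theta)])

# Fit the contour to the nucleus boundary on a smoothed image.
snake = active_contour(gaussian(frame, sigma=3), init,
                       alpha=0.015, beta=10, gamma=0.001)
print("fitted contour points:", snake.shape)
```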

  7. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Liddle, P.

    2006-01-01

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM R XS (TXS) system is classified 1E, as defined in the Inst. of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  8. Application Service Providers (ASP Adoption in Core and Non-Core Functions

    Directory of Open Access Journals (Sweden)

    Aman Y.M. Chan

    2009-10-01

    Full Text Available With the further improvement in internet bandwidth, connection stability and data transmission security, a new wave of Application Service Providers (ASP) is on its way. The recent boom in models such as Software as a Service (SaaS) and On-Demand in 2008 has led to the emergence of the ASP model in core business functions. Traditional IS outsourcing covers the non-core business functions that are not critical to business performance and competitive advantage. Compared with traditional IS outsourcing, ASP is a new phenomenon that can be considered an emerging innovation, as it covers both core and non-core business functions. Most executives do not comprehend the differences and similarities between traditional IS outsourcing and the ASP model. Hence, we propose to conduct research to identify the determinants (cost benefit, the gap in IS capability complementing the company's strategic goal, and trust in the ASP's service and security level) and moderating factors (management's attitude to ownership and control, and company aggressiveness) of the ASP adoption decision in both core and non-core business functions.

  9. FBR type reactor core

    International Nuclear Information System (INIS)

    Tamiya, Tadashi; Kawashima, Katsuyuki; Fujimura, Koji; Murakami, Tomoko.

    1995-01-01

    Neutron reflectors are disposed at the periphery of a reactor core fuel region and a blanket region, and a neutron shielding region is disposed at the periphery of them. The neutron reflector has a hollow duct structure having a sealed upper portion, a lower portion opened to cooling water, in which a gas and coolants separately sealed in the inside thereof. A driving pressure of a primary recycling pump is lowered upon reduction of coolant flow rate, then the liquid level of coolants in the neutron reflector is lowered due to imbalance between the driving pressure and a gas pressure, so that coolants having an effect as a reflector are eliminated from the outer circumference of the reactor core. Therefore, the amount of neutrons leaking from the reactor core is increased, and negative reactivity is charged to the reactor core. The negative reactivity of the neutron reflector is made greater than a power compensation reactivity. Since this enables reactor scram by using an inherent performance of the reactor core, the reactor core safety of an LMFBR-type reactor can be improved. (I.N.)

  10. The earths innermost core

    International Nuclear Information System (INIS)

    Nanda, J.N.

    1989-01-01

    A new earth model is advanced with a solid innermost core at the centre of the Earth where elements heavier than iron, over and above what can be retained in solution in the iron core, are collected. The innermost core is separated from the solid iron-nickel core by a shell of liquid copper. The innermost core has a natural vibration measured on the earth's surface as the long period 26 seconds microseisms. The earth was formed initially as a liquid sphere with a relatively thin solid crust above the Byerly discontinuity. The trace elements that entered the innermost core amounted to only 0.925 ppm of the molten mass. Gravitational differentiation must have led to the separation of an explosive thickness of pure 235 U causing a fission explosion that could expel beyond the Roche limit a crustal scab which would form the centre piece of the moon. A reservoir of helium floats on the liquid copper. A small proportion of helium-3, a relic of the ancient fission explosion present there will spell the exciting magnetic field. The field is stable for thousands of years because of the presence of large quantity of helium-4 which accounts for most of the gaseous collisions that will not disturb the atomic spin of helium-3 atoms. This field is prone to sudden reversals after long periods of stability. (author). 14 refs

  11. Software Defined Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Ian; Blaiszik, Ben; Chard, Kyle; Chard, Ryan

    2017-07-17

    Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policies by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches. We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.
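
    The if-trigger-then-action idea can be sketched as rules bound to storage events. The sketch below is a hypothetical minimal reading, not the authors' IFTA notation; the event fields, paths, and actions are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    """One if-trigger-then-action rule attached to a storage location."""
    trigger: Callable[[Dict], bool]
    action: Callable[[Dict], None]

def dispatch(event: Dict, rules: List[Rule]) -> None:
    """Run every rule whose trigger matches the storage event."""
    for rule in rules:
        if rule.trigger(event):
            rule.action(event)

# Hypothetical rules: index newly created HDF5 files, publish data tagged 'final'.
rules = [
    Rule(trigger=lambda e: e["type"] == "create" and e["path"].endswith(".h5"),
         action=lambda e: print(f"extract metadata and index {e['path']}")),
    Rule(trigger=lambda e: e.get("tag") == "final",
         action=lambda e: print(f"copy {e['path']} to the publication endpoint")),
]

dispatch({"type": "create", "path": "/data/run042/scan.h5"}, rules)
dispatch({"type": "modify", "path": "/data/run042/summary.txt", "tag": "final"}, rules)
```

    Deploying such rules next to each storage system, rather than in a central workflow engine, is the design choice the abstract argues for.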

  12. Software and Computing News

    CERN Multimedia

    Barberis, D

    The last several months have been very busy ones for the ATLAS software developers, who have been trying to cope with the competing demands of multiple software stress tests and testbeds. These include Data Challenge Two (DC2), the Combined Testbeam (CTB), preparations for the Physics Workshop to be held in Rome in June 2005, and other testbeds, primarily one for the High-Level Trigger. Data Challenge 2 (DC2): The primary goal of DC2 was to validate the computing model and to provide a test of simulating a day's worth of ATLAS data (10 million events) and of fully processing it and making it available to the physicists within 10 days (i.e. a 10% scale test). DC2 consists of three parts: the generation, simulation, and mixing of a representative sample of physics events with background events; the reconstruction of the mixed samples with initial classification into the different physics signatures; and the distribution of the data to multiple remote sites (Tier-1 centers) for analysis by physicists. Figu...

  13. Computer software review procedures

    International Nuclear Information System (INIS)

    Mauck, J.L.

    1993-01-01

    This article reviews the procedures which are used to review software written for computer-based instrumentation and control functions in nuclear facilities. The utilization of computer-based control systems is becoming much more prevalent in such installations, in addition to being retrofit into existing systems. Currently, the Nuclear Regulatory Commission uses Regulatory Guide 1.152, "Criteria for Programmable Digital Computer System Software in Safety-Related Systems of Nuclear Power Plants", and ANSI/IEEE-ANS-7-4.3.2-1982, "Application Criteria for Programmable Digital Computer Systems in Safety Systems of Nuclear Power Generating Stations", for guidance when performing reviews of digital systems. There is great concern about the process of verification and validation of these codes, so when inspections of such systems are done, inspectors examine very closely the processes which were followed in developing the codes, the errors which were detected, how they were found, and the analysis which went into tracing down the causes behind the errors to ensure such errors are not propagated again in the future

  14. Evolution of the ATLAS Software Framework towards Concurrency

    CERN Document Server

    Jones, Roger; The ATLAS collaboration; Leggett, Charles; Wynne, Benjamin

    2015-01-01

    The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from early 2000, and the framework and the physics code have been written using a single-threaded, serial design. This programming model has increasing difficulty in exploiting the potential of current CPUs, which offer their best performance only through taking full advantage of multiple cores and wide vector registers. Future CPU evolution will intensify this trend, with core counts increasing and memory per core falling. Maximising performance per watt will be a key metric, so all of these cores must be used as efficiently as possible. In order to address the deficiencies of the current framework, ATLAS has embarked upon two projects: first, a practical demonstration of the use of multi-threading in our reconstruction software, using the GaudiHive framework; second, an exercise to gather r...
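
    As a rough illustration of the inter-event parallelism the abstract refers to, the sketch below fans independent events out to a thread pool. It is a minimal sketch only; it is not GaudiHive or any actual ATLAS interface.

```python
# Minimal sketch of inter-event parallelism with a thread pool. It illustrates
# the general idea of keeping many cores busy with independent events; it is
# not the GaudiHive scheduler or any actual ATLAS interface.
from concurrent.futures import ThreadPoolExecutor

def reconstruct(event_id: int) -> dict:
    """Stand-in for a per-event reconstruction step, assumed independent."""
    return {"event": event_id, "n_tracks": event_id % 7}

events = range(1000)
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(reconstruct, events))

print(f"processed {len(results)} events")
```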

  15. Secure software development training course

    Directory of Open Access Journals (Sweden)

    Victor S. Gorbatov

    2017-06-01

    Full Text Available Information security is one of the most important criteria for the quality of developed software. To obtain a sufficient level of application security, companies integrate a security process into the software development life cycle. At this stage, software companies face a shortage of employees who are able to solve problems of software design, implementation and application security. This article provides a description of a secure software development training course. The application security training course is designed for the joint education of students of different IT specializations.

  16. Software methodologies for the SSC

    International Nuclear Information System (INIS)

    Loken, S.C.

    1990-01-01

    This report describes some of the considerations that will determine how software is developed for the SSC. The author begins with a review of the general computing problem for SSC experiments and recent experiences in software engineering for the present generation of experiments. This leads to a discussion of the software technologies that will be critical for the SSC experiments. He describes the emerging software standards and commercial products that may be useful in addressing the SSC needs, and concludes with some comments on how collaborations and the SSC Lab should approach the software development issue

  17. The Ragnarok Software Development Environment

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1999-01-01

    Ragnarok is an experimental software development environment that focuses on enhanced support for managerial activities in large scale software development taking the daily work of the software developer as its point of departure. The main emphasis is support in three areas: management, navigation, and collaboration. The leitmotif is the software architecture, which is extended to handle managerial data in addition to source code; this extended software architecture is put under tight version- and configuration management control and furthermore used as basis for visualisation. Preliminary results of using...

  18. Characteristics for Software Optimization Projects

    Directory of Open Access Journals (Sweden)

    Iulian NITESCU

    2008-01-01

    Full Text Available The increasing complexity of software systems requires the identification and implementation of methods and techniques to manage it. The software optimization project is one way in which software complexity is controlled. The software optimization project must also meet the organization's need to earn a profit. The software optimization project is an integrated part of the application life cycle because it shares the same resources, depends on other stages, and influences subsequent phases. The optimization project has some particularities because it works on a finished product, focusing on its quality. The process is quality- and performance-oriented, and it assumes that the product life cycle is almost finished.

  19. Research and practice of application software verification and validation for nuclear safety digital I and C system

    International Nuclear Information System (INIS)

    Dong Yaxin; Xu Xianzhu; Bai Xiangji

    2014-01-01

    Application software V and V activities determine whether the output results are consistent with the task requirements throughout each stage of the application software development process, and confirm whether the ultimately generated application software conforms to its intended use and other related requirements. A set of reasonable and feasible workflows for application software V and V activities was proposed in this paper based on the characteristics of the application software, and was demonstrated with the application software V and V activities in the transformation project of the in-core instrumentation system (RIC) in a nuclear power plant. (authors)

  20. The core protection computer system fitted in Grafenrheinfeld NPP

    International Nuclear Information System (INIS)

    Rietzsch, L.

    1986-01-01

    This paper gives an overview of a four-train core protection computer system for KWU pressurized water reactors. Attention is focused on the methods used to ensure correct computer operation and correct results. Experience gained in trial operation is dealt with. Results of safety analysis of the hardware and the software verification work performed are discussed. (author)

  1. Factors That Affect Software Testability

    Science.gov (United States)

    Voas, Jeffrey M.

    1991-01-01

    Software faults that infrequently affect software's output are dangerous. When a software fault causes frequent software failures, testing is likely to reveal the fault before the software is released; when the fault remains undetected during testing, it can cause disaster after the software is installed. A technique for predicting whether a particular piece of software is likely to reveal faults within itself during testing is found in [Voas91b]. A piece of software that is likely to reveal faults within itself during testing is said to have high testability. A piece of software that is not likely to reveal faults within itself during testing is said to have low testability. It is preferable to design software with higher testabilities from the outset, i.e., create software with as high a degree of testability as possible to avoid the problems of undetected faults that are associated with low testability. Information loss is a phenomenon that occurs during program execution that increases the likelihood that a fault will remain undetected. In this paper, I identify two broad classes of information loss, define them, and suggest ways of predicting the potential for information loss to occur. I do this in order to decrease the likelihood that faults will remain undetected during testing.
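
    A toy example of low testability: the seeded fault below collapses a narrow input range onto an output value that correct behaviour also produces, so random testing rarely reveals it. The function and the fault are invented for illustration.

```python
# Toy illustration of low testability: a seeded fault that rarely propagates
# to the output. The function and the fault are invented for illustration.
import random

def sign(x: float) -> int:
    """Return -1, 0, or 1; a faulty branch misclassifies a narrow input range."""
    if 0 < x < 1e-6:        # seeded fault: tiny positive values wrongly map to 0
        return 0
    return 1 if x > 0 else (-1 if x < 0 else 0)

# Random testing over [-1, 1] almost never samples (0, 1e-6), so the fault
# causes very few failures and is likely to survive testing undetected.
trials = 100_000
failures = sum(1 for _ in range(trials)
               if (x := random.uniform(-1.0, 1.0)) > 0 and sign(x) != 1)
print(f"{failures} failures in {trials} random tests")
```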

  2. Software for safety critical applications

    International Nuclear Information System (INIS)

    Kropik, M.; Matejka, K.; Jurickova, M.; Chudy, R.

    2001-01-01

    The contribution gives an overview of a project on software development for safety-critical applications. This project has been carried out since 1997. The principal goal of the project was to establish a research laboratory for the development of software with the highest requirements for quality and reliability. This laboratory was established at the department and equipped with proper hardware and software to support software development. A research team of predominantly young researchers for software development was created. The activities of the research team started with studying and proposing a software development methodology. This methodology was then applied to real software development. The verification and validation process followed the software development. The validation system for the integrated hardware and software tests was brought into being and its control software was developed. The quality of the software tools was also observed, and the SOSAT tool was used during these activities. National and international contacts were established and maintained during the project solution. (author)

  3. Software process in Geant4

    International Nuclear Information System (INIS)

    Cosmo, G.

    2001-01-01

    Since its earliest years of R and D, the GEANT4 simulation toolkit has been developed following software process standards which dictated the overall evolution of the project. The complexity of the software involved, the wide areas of application of the software product, the huge amount of code and Category complexity, and the size and distributed nature of the Collaboration itself are all ingredients which involve and correlate a wide variety of software processes. Although in 'production' and available to the public since December 1998, the GEANT4 software product includes Category Domains which are still under active development. Therefore they require different treatment also in terms of improvement of the development cycle, system testing and user support. This paper is meant to describe some of the software processes as they are applied in GEANT4 for the development, testing and maintenance of the software

  4. New ATLAS Software & Computing Organization

    CERN Multimedia

    Barberis, D

    Following the election by the ATLAS Collaboration Board of Dario Barberis (Genoa University/INFN) as Computing Coordinator and David Quarrie (LBNL) as Software Project Leader, it was considered necessary to modify the organization of the ATLAS Software & Computing ("S&C") project. The new organization is based upon the following principles: separation of the responsibilities for computing management from those of software development, with the appointment of a Computing Coordinator and a Software Project Leader who are both members of the Executive Board; hierarchical structure of responsibilities and reporting lines; coordination at all levels between TDAQ, S&C and Physics working groups; integration of the subdetector software development groups with the central S&C organization. A schematic diagram of the new organization can be seen in Fig.1. Figure 1: new ATLAS Software & Computing organization. Two Management Boards will help the Computing Coordinator and the Software Project...

  5. Modernization of software quality assurance

    Science.gov (United States)

    Bhaumik, Gokul

    1988-01-01

    Customer satisfaction depends not only on functional performance but also on the quality characteristics of the software products. An examination of this quality aspect of software products will provide a clear, well-defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following aspects which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition for a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree of an identifiable set of quality attributes present in all software systems and their products. This paper is an explanation of how this definition was developed and how it is used.

  6. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    Science.gov (United States)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al, 2014), which was released as a Free Software Open-source project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. This weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as its starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al, 2016, WRR), provides the baseline code and a certain number of referenced results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of every single change, the package is published on its own github repository geotopmodel.github.io/geotop/ under the GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the github repository on the master and main development branches. The usage of CMake configuration tool
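
    A regression-style benchmark check of the kind described (comparing new output against the published 2.0 baseline within a tolerance) might look like the sketch below. The file names, CSV layout, and tolerance are assumptions for illustration and are not part of the GEOtop test suite.

```python
# Minimal sketch of a regression-style benchmark check: compare a fresh
# simulation output against a published baseline within a tolerance. File
# names, CSV layout, and the tolerance are illustrative assumptions, not part
# of the GEOtop test suite.
import numpy as np

def matches_baseline(new_csv: str, baseline_csv: str, rtol: float = 1e-5) -> bool:
    """Return True if the new output agrees with the baseline, value by value."""
    new = np.loadtxt(new_csv, delimiter=",", skiprows=1)
    ref = np.loadtxt(baseline_csv, delimiter=",", skiprows=1)
    return new.shape == ref.shape and np.allclose(new, ref, rtol=rtol)

if __name__ == "__main__":
    ok = matches_baseline("soil_temperature_new.csv", "soil_temperature_v2.0.csv")
    print("benchmark passed" if ok else "benchmark FAILED")
```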

  7. Software and the Scientist: Coding and Citation Practices in Geodynamics

    Science.gov (United States)

    Hwang, Lorraine; Fish, Allison; Soito, Laura; Smith, MacKenzie; Kellogg, Louise H.

    2017-11-01

    In geodynamics as in other scientific areas, computation has become a core component of research, complementing field observation, laboratory analysis, experiment, and theory. Computational tools for data analysis, mapping, visualization, modeling, and simulation are essential for all aspects of the scientific workflow. Specialized scientific software is often developed by geodynamicists for their own use, and this effort represents a distinctive intellectual contribution. Drawing on a geodynamics community that focuses on developing and disseminating scientific software, we assess the current practices of software development and attribution, as well as attitudes about the need and best practices for software citation. We analyzed publications by participants in the Computational Infrastructure for Geodynamics and conducted mixed-method surveys of the solid earth geophysics community. From this we learned that coding skills are typically learned informally. Participants considered good code to be trusted, reusable, readable, and not overly complex, and considered a good coder to be one who participates in the community in an open and reasonable manner, contributing to both long- and short-term community projects. Participants strongly supported citing software, reflected in the high rate at which software packages were named in the literature and the high rate of citations in the references. However, clear instructions from developers on how to cite, and education of users on what to cite, are lacking. In addition, citations did not always lead to discoverability of the resource. A unique identifier for the software package itself, community education, and citation tools would contribute to better attribution practices.

  8. KVANE - a Kvanefjeld drill core database

    International Nuclear Information System (INIS)

    Lund Clausen, F.

    1980-01-01

    A database, KVANE, containing all drill core information from the drilling programme carried out in 1958, 1962, 1969 and 1977 at the uranium deposit in Kvanefjeld, Southwest Greenland, has been made. The application software "Statistical Analysis System (SAS)" was used as the programming tool. It is shown how this software, usually used for other purposes, satisfies the demand for easy storage of large amounts of data. The paper describes how KVANE was made and organized and how data can be picked out of the database. A short introduction to the SAS system is also given. The database has been implemented at the Northern European University Computing Center (NEUCC) at the Technical University of Denmark. (author)

  9. Future Scenarios for Software-Defined Metro and Access Networks and Software-Defined Photonics

    Directory of Open Access Journals (Sweden)

    Tommaso Muciaccia

    2017-01-01

    Full Text Available In recent years, architectures, devices, and components in telecommunication networks have been challenged by evolutionary and revolutionary factors which are drastically changing the traffic features. Most of these changes imply the need for major re-configurability and programmability not only in data centers and core networks, but also in the metro-access segment. In a wide variety of contexts, this necessity has been addressed by the proposed introduction of the innovative paradigm of software-defined networks (SDNs). Several solutions inspired by the SDN model have recently been proposed also for metro and access networks, where the adoption of a new generation of software-defined reconfigurable integrated photonic devices is highly desirable. In this paper, we review the possible future application scenarios for software-defined metro and access networks and software-defined photonics (SDP), on the basis of analytics, statistics, and surveys. This work describes the reasons underpinning the presented radical change of paradigm and summarizes the most significant solutions proposed in the literature, with a specific emphasis on physical-layer reconfigurable networks and a focus on both architectures and devices.

  10. caCORE: a common infrastructure for cancer informatics.

    Science.gov (United States)

    Covitz, Peter A; Hartel, Frank; Schaefer, Carl; De Coronado, Sherri; Fragoso, Gilberto; Sahni, Himanso; Gustafson, Scott; Buetow, Kenneth H

    2003-12-12

    Sites with substantive bioinformatics operations are challenged to build data processing and delivery infrastructure that provides reliable access and enables data integration. Locally generated data must be processed and stored such that relationships to external data sources can be presented. Consistency and comparability across data sets requires annotation with controlled vocabularies and, further, metadata standards for data representation. Programmatic access to the processed data should be supported to ensure the maximum possible value is extracted. Confronted with these challenges at the National Cancer Institute Center for Bioinformatics, we decided to develop a robust infrastructure for data management and integration that supports advanced biomedical applications. We have developed an interconnected set of software and services called caCORE. Enterprise Vocabulary Services (EVS) provide controlled vocabulary, dictionary and thesaurus services. The Cancer Data Standards Repository (caDSR) provides a metadata registry for common data elements. Cancer Bioinformatics Infrastructure Objects (caBIO) implements an object-oriented model of the biomedical domain and provides Java, Simple Object Access Protocol and HTTP-XML application programming interfaces. caCORE has been used to develop scientific applications that bring together data from distinct genomic and clinical science sources. caCORE downloads and web interfaces can be accessed from links on the caCORE web site (http://ncicb.nci.nih.gov/core). caBIO software is distributed under an open source license that permits unrestricted academic and commercial use. Vocabulary and metadata content in the EVS and caDSR, respectively, is similarly unrestricted, and is available through web applications and FTP downloads. http://ncicb.nci.nih.gov/core/publications contains links to the caBIO 1.0 class diagram and the caCORE 1.0 Technical Guide, which provide detailed information on the present caCORE architecture

  11. lessons and challenges from software quality assessment

    African Journals Online (AJOL)

    DJFLEX

    www.globaljournalseries.com, Email: info@globaljournalseries.com ... ASSESSMENT: THE CASE OF SPACE SYSTEMS SOFTWARE. ... KEYWORDS: Software, Software Quality, Quality Standard, Characteristics, ... and communication, etc.

  12. Three-dimensional discrete element method simulation of core disking

    Science.gov (United States)

    Wu, Shunchuan; Wu, Haoyan; Kemeny, John

    2018-04-01

    The phenomenon of core disking is commonly seen in deep drilling of highly stressed regions in the Earth's crust. Given its close relationship with the in situ stress state, the presence and features of core disking can be used to interpret the stresses when traditional in situ stress measuring techniques are not available. The core disking process was simulated in this paper using the three-dimensional discrete element method software PFC3D (particle flow code). In particular, PFC3D is used to examine the evolution of fracture initiation, propagation and coalescence associated with core disking under various stress states. In this paper, four unresolved problems concerning core disking are investigated with a series of numerical simulations. These simulations also provide some verification of existing results by other researchers: (1) Core disking occurs when the maximum principal stress is about 6.5 times the tensile strength. (2) For most stress situations, core disking occurs from the outer surface, except for the thrust faulting stress regime, where the fractures were found to initiate from the inner part. (3) The anisotropy of the two horizontal principal stresses has an effect on the core disking morphology. (4) The thickness of the core disks has a positive relationship with the radial stress and a negative relationship with the axial stresses.
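
    Finding (1) above lends itself to a one-line numeric check; the helper below simply encodes the quoted 6.5x tensile-strength threshold and is an illustrative convenience, not part of PFC3D.

```python
# Numeric illustration of finding (1) above: core disking is expected when the
# maximum principal stress reaches roughly 6.5 times the tensile strength. The
# threshold is quoted from the abstract; the helper function is an illustrative
# convenience, not part of PFC3D.
def core_disking_expected(sigma_max_mpa: float, tensile_strength_mpa: float,
                          ratio_threshold: float = 6.5) -> bool:
    """Return True if the quoted stress-ratio criterion is met."""
    return sigma_max_mpa >= ratio_threshold * tensile_strength_mpa

# Example: a rock with 10 MPa tensile strength under 70 MPa maximum stress.
print(core_disking_expected(70.0, 10.0))   # True, since 70 >= 6.5 * 10
```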

  13. Neutron beam tomography software

    International Nuclear Information System (INIS)

    Newbery, A.C.R.

    1988-05-01

    When a sample is traversed by a neutron beam, inhomogeneities in the sample will cause deflections, and the deflections will permit conclusions to be drawn concerning the location and size of the inhomogeneities. The associated computation is similar to problems in tomography, analogous to X-ray tomography though significantly different in detail. We do not have any point-sample information, but only mean values over short line segments. Since each mean value is derived from a separate neutron counter, the quantity of available data has to be modest; also, since each datum is an integral, its geometric precision is inferior to that of X-ray data. Our software is designed to cope with these difficulties. (orig.) [de
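
    The reconstruction problem sketched above (recovering inhomogeneities from a modest number of mean values over short line segments) can be illustrated with a small least-squares example. The grid size, segment geometry, and solver below are assumptions for illustration, not the cited software; with so few measurements the system is underdetermined, mirroring the data-scarcity issue the abstract describes.

```python
# Schematic reconstruction from mean values over line segments by least
# squares. The grid size, segment geometry, and solver are illustrative
# assumptions, not the cited tomography software.
import numpy as np

n = 8                                              # n x n pixel grid
truth = np.zeros((n, n))
truth[3:5, 3:5] = 1.0                              # a small inhomogeneity

rows, b = [], []
for i in range(n):                                 # one horizontal segment per row
    w = np.zeros((n, n)); w[i, :] = 1.0 / n        # mean value over that row
    rows.append(w.ravel()); b.append(float((truth * w).sum()))
for j in range(n):                                 # one vertical segment per column
    w = np.zeros((n, n)); w[:, j] = 1.0 / n
    rows.append(w.ravel()); b.append(float((truth * w).sum()))

A = np.array(rows)                                 # 2n equations, n*n unknowns
x, *_ = np.linalg.lstsq(A, np.array(b), rcond=None)
print(x.reshape(n, n).round(2))                    # coarse, minimum-norm estimate
```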

  14. Software and Network Engineering

    CERN Document Server

    2012-01-01

    The series "Studies in Computational Intelligence" (SCI) publishes new developments and advances in the various areas of computational intelligence – quickly and with a high quality. The intent is to cover the theory, applications, and design methods of computational intelligence, as embedded in the fields of engineering, computer science, physics and life science, as well as the methodologies behind them. The series contains monographs, lecture notes and edited volumes in computational intelligence spanning the areas of neural networks, connectionist systems, genetic algorithms, evolutionary computation, artificial intelligence, cellular automata, self-organizing systems, soft computing, fuzzy systems, and hybrid intelligent systems. Critical to both contributors and readers are the short publication time and world-wide distribution - this permits a rapid and broad dissemination of research results.   The purpose of the first ACIS International Symposium on Software and Network Engineering held on Decembe...

  15. Software development without languages

    Science.gov (United States)

    Osborne, Haywood S.

    1988-01-01

    Automatic programming generally involves the construction of a formal specification; i.e., one which allows unambiguous interpretation by tools for the subsequent production of the corresponding software. Previous practical efforts in this direction have focused on the serious problems of: (1) designing the optimum specification language; and (2) mapping (translating or compiling) from this specification language to the program itself. The approach proposed bypasses the above problems. It postulates that the specification proper should be an intermediate form, with the sole function of containing information sufficient to facilitate construction of programs and also of matching documentation. Thus, the means of forming the intermediary becomes a human factors task rather than a linguistic one; human users will read documents generated from the specification, rather than the specification itself.

  16. WISDAAM software programmer's manual

    International Nuclear Information System (INIS)

    Ball, J.R.

    1992-10-01

    The WISDAAM system was developed to provide quality control over test data associated with in situ testing at the Waste Isolation Pilot Plant (WIPP). Assurance of data quality is of critical importance as these tests supply the information which will be used for development and verification of the technology required for repository implementation. The amount of data collected from the tests, which are some of the largest ever fielded in an underground facility, prompted the undertaking of a major project task to address data processing. The goal was to create a conceptual umbrella under which all of the activities associated with processing WIPP data (i.e., data reduction, archiving, retrieval, etc.) could be grouped. The WISDAAM system was the product of this task. The overall system covers electronic as well as manual data processing; however, this document deals primarily with those operations implemented by software running on a VAX computer

  17. Software Defined Networking

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius

    Network Service Providers (NSP) often choose to overprovision their networks instead of deploying proper Quality of Service (QoS) mechanisms that allow for traffic differentiation and predictable quality. This tendency of overprovisioning is not sustainable for the simple reason that network resources are limited. Hence, to counteract this trend, current QoS mechanisms must become simpler to deploy and operate, in order to motivate NSPs to employ QoS techniques instead of overprovisioning. Software Defined Networking (SDN) represents a paradigm shift in the way telecommunication and data... generic perspective (e.g. service provisioning speed, resources availability). As a result, new mechanisms for providing QoS are proposed, solutions for SDN-specific QoS challenges are designed and tested, and new network management concepts are prototyped, all aiming to improve QoS for network services...

  18. Energy Tracking Software Platform

    Energy Technology Data Exchange (ETDEWEB)

    Ryan Davis; Nathan Bird; Rebecca Birx; Hal Knowles

    2011-04-04

    Acceleration has created an interactive energy tracking and visualization platform that supports decreasing electric, water, and gas usage. Homeowners have access to tools that allow them to gauge their use and track progress toward a smaller energy footprint. Real estate agents have access to consumption data, allowing for sharing a comparison with potential home buyers. Home builders have the opportunity to compare their neighborhood's energy efficiency with competitors. Home energy raters have a tool for gauging the progress of their clients after efficiency changes. And, social groups are able to help encourage members to reduce their energy bills and help their environment. EnergyIT.com is the business umbrella for all energy tracking solutions and is designed to provide information about our energy tracking software and promote sales. CompareAndConserve.com (Gainesville-Green.com) helps homeowners conserve energy through education and competition. ToolsForTenants.com helps renters factor energy usage into their housing decisions.

  19. BNL multiparticle spectrometer software

    International Nuclear Information System (INIS)

    Saulys, A.C.

    1984-01-01

    This paper discusses some solutions to problems common to the design, management and maintenance of a large high energy physics spectrometer software system. The experience of dealing with a large, complex program and the necessity of having the program controlled by various people at different levels of computer experience has led us to design a program control structure of mnemonic and self-explanatory nature. The use of this control language in both on-line and off-line operation of the program will be discussed. The solution of structuring a large program for modularity so that substantial changes to the program can be made easily for a wide variety of high energy physics experiments is discussed. Specialized tools for this type of large program management are also discussed
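
    A mnemonic, self-explanatory control interface of the kind described can be sketched as a command registry with built-in help text. The command names and handlers below are invented for illustration; they are not the BNL spectrometer control language.

```python
# Hedged sketch of a mnemonic, self-explanatory control interface of the kind
# described above. The command names and handlers are invented for
# illustration; they are not the BNL spectrometer control language.
COMMANDS = {}

def command(name: str, help_text: str):
    """Register a handler under a mnemonic command name."""
    def register(fn):
        COMMANDS[name] = (fn, help_text)
        return fn
    return register

@command("BEGIN_RUN", "start data taking with the current configuration")
def begin_run():
    print("run started")

@command("END_RUN", "stop data taking and flush buffers")
def end_run():
    print("run ended")

def execute(line: str) -> None:
    """Look up a mnemonic command and run it, listing known commands on error."""
    name = line.strip().upper()
    entry = COMMANDS.get(name)
    if entry is None:
        print(f"unknown command {name!r}; known: {', '.join(COMMANDS)}")
    else:
        entry[0]()

execute("BEGIN_RUN")
execute("END_RUN")
```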

  20. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter

    2016-01-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new emerging approaches? What are open issues? Still, we struggle to answer the question of what is the current state of SPI and related research. We present initial results from a systematic mapping study to shed light on the field of SPI and to draw conclusions for future research directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories. In particular, standard SPI models are analyzed and evaluated for applicability...