WorldWideScience

Sample records for level software layers

  1. Enhancing the low-level tape layer of CERN Tape Archive software

    CERN Document Server

    Ruse, Laura Cristina; Vasilescu, Laura

    CERN manages the largest scientific data archive in the HEP domain. The archive currently holds over 180 Petabytes, with forecasts of up to 100 PB of new data added per year. Considering these numbers, the most cost-effective storage solution in terms of capacity and maintenance is magnetic tape. The drawback of this solution is the access time, which can rise to several minutes for a series of files. In an environment where very large volumes of physics data are being moved from tape to disk and vice versa, this issue becomes a serious performance bottleneck. This thesis introduces two low-level tape access optimizations: first, adding support for the Recommended Access Order (RAO), a mechanism offered by the tape hardware infrastructure to compute the file order corresponding to the minimum access time. We present our solution for including RAO in the file retrieval operations of the CERN Tape Archive software in order to benefit from the reduced reading time. The second optimization ...
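
    The RAO computation itself happens in the tape drive, but the scheduling idea behind the first optimization can be illustrated with a toy nearest-neighbour recall ordering; the cost model and file positions below are invented for illustration and are not taken from the CTA code.

        # Hedged sketch: approximating a Recommended Access Order (RAO) style
        # recall schedule in software. Real RAO comes from the tape hardware;
        # here files are ordered greedily by an invented locate-cost model.

        def locate_cost(a, b):
            # a, b: (wrap, longitudinal_position); crossing wraps is assumed
            # more expensive than moving along the same wrap.
            wrap_penalty = 100.0
            return abs(a[1] - b[1]) + wrap_penalty * abs(a[0] - b[0])

        def greedy_recall_order(files, start=(0, 0)):
            """files: dict name -> (wrap, position). Returns a recall order."""
            remaining = dict(files)
            head, order = start, []
            while remaining:
                name = min(remaining, key=lambda n: locate_cost(head, remaining[n]))
                order.append(name)
                head = remaining.pop(name)
            return order

        print(greedy_recall_order({"f1": (0, 500), "f2": (1, 100), "f3": (0, 80)}))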

  2. Reliable software for unreliable hardware a cross layer perspective

    CERN Document Server

    Rehman, Semeen; Henkel, Jörg

    2016-01-01

    This book describes novel software concepts to increase reliability under user-defined constraints. The authors' approach bridges, for the first time, the reliability gap between hardware and software. Readers will learn how to achieve increased soft error resilience on unreliable hardware, while exploiting the inherent error-masking characteristics and the error-mitigation potential (against soft errors, aging, and process variations) at different software layers. · Provides a comprehensive overview of reliability modeling and optimization techniques at different hardware and software levels; · Describes novel optimization techniques for software cross-layer reliability, targeting unreliable hardware.

  3. The JET level-1 software

    International Nuclear Information System (INIS)

    McCullen, P.A.; Farthing, J.W.

    1998-01-01

    The complex nature of the JET machine requires a large amount of control parameter preparation, selection and validation before a pulse may be started. Level-1 is defined as the centralized, cross-subsystem control of JET. Before it was introduced over 10 years ago, the Session Leader (SL), who is responsible for specifying the parameter settings for a JET pulse, had virtually no software available to help him except for a simple editor used for the creation of control waveforms. Most of the required parameter settings were calculated by hand and then passed on either verbally or via hand-written forms. These parameters were then set by a large number of people - Local Unit Responsible Officers (LUROs) and CODAS Duty Officers (CDOs) - using a wide selection of dedicated software. At this time the Engineer in Charge (EiC) would largely depend on the LUROs to inform him that conditions were ready. He never set control parameters personally and had little or no software available to him to see what many of the settings were. The first implementation of Level-1 software went some way towards improving the task of pulse schedule preparation in that the SL could specify his requirements via a computer interface and store them in a database for later use. At that time the maximum number of parameters that could be handled was 500. (author)

  4. Software Design Level Security Vulnerabilities

    OpenAIRE

    S. Rehman; K. Mustafa

    2011-01-01

    Several thousand software design vulnerabilities have been reported through established databases. But they need to be structured and classified to be optimally usable in the pursuit of minimal and effective mitigation mechanisms. To this end, we developed a criterion set for a communicative description of the same, to serve as a taxonomic description of security vulnerabilities arising in the design phase of the software development lifecycle. This description is part of an effort to id...

  5. Software refactoring at the package level using clustering techniques

    KAUST Repository

    Alkhalid, A.; Alshayeb, M.; Mahmoud, S. A.

    2011-01-01

    Enhancing, modifying or adapting the software to new requirements increases the internal software complexity. Software with a high level of internal complexity is difficult to maintain. Software refactoring reduces software complexity and hence

  6. Multi-Level Formation of Complex Software Systems

    Directory of Open Access Journals (Sweden)

    Hui Li

    2016-05-01

    We present a multi-level formation model for complex software systems. Previous works abstract software systems into software networks for further study, but usually investigate the software networks at the class level. In contrast to these works, our treatment of software systems as multi-level networks is more realistic. In particular, the software networks are organized by three levels of granularity, which represents the modularity and hierarchy in the formation process of real-world software systems. More importantly, simulations based on this model have generated more realistic structural properties of software networks, such as power-law distributions, clustering and modularization. On the basis of this model, how the structure of software systems affects software design principles is then explored, and this could be helpful for understanding software evolution and software engineering practices.

  7. Non-intrusive Instance Level Software Composition

    NARCIS (Netherlands)

    Hatun, Kardelen

    2014-01-01

    A software system is composed of parts, which interact through shared interfaces. Certain qualities of integration, such as loose coupling, requiring minimal changes to the software and fine-grained localisation of dependencies, have an impact on the overall software quality. Current general-purpose

  8. Elementary study on γ analysis software for low level measurement

    International Nuclear Information System (INIS)

    Ruan Guanglin; Huang Xianguo; Xing Shixiong

    2001-01-01

    The difficulty of using popular γ analysis software for low-level measurement is discussed. The ROI report file of the ORTEC operating system has been chosen as the interface file for writing γ analysis software for low-level measurement. The author gives a software flowchart and an application example, and discusses the remaining problems

  9. Upper Secondary and Vocational Level Teachers at Social Software

    Science.gov (United States)

    Valtonen, Teemu; Kontkanen, Sini; Dillon, Patrick; Kukkonen, Jari; Väisänen, Pertti

    2014-01-01

    This study focuses on upper secondary and vocational level teachers as users of social software, i.e. what software they use during their leisure and work, and for what purposes they use software in teaching. The study is theorised within a technological pedagogical content knowledge framework; the emphasis is especially on technological knowledge…

  10. Software refactoring at the package level using clustering techniques

    KAUST Repository

    Alkhalid, A.

    2011-01-01

    Enhancing, modifying or adapting the software to new requirements increases the internal software complexity. Software with a high level of internal complexity is difficult to maintain. Software refactoring reduces software complexity and hence decreases the maintenance effort. However, software refactoring becomes quite a challenging task as the software evolves. The authors use clustering as a pattern recognition technique to assist in software refactoring activities at the package level. The approach provides computer-aided support for identifying ill-structured packages and gives the software designer suggestions for balancing intra-package cohesion against inter-package coupling. A comparative study is conducted applying three different clustering techniques on different software systems. In addition, the application of refactoring at the package level using an adaptive k-nearest neighbour (A-KNN) algorithm is introduced. The authors compared the A-KNN technique with the other clustering techniques (viz. single linkage algorithm, complete linkage algorithm and weighted pair-group method using arithmetic averages). The new technique shows competitive performance with lower computational complexity. © 2011 The Institution of Engineering and Technology.
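
    A minimal sketch of the cohesion and coupling measures that such package-level refactoring balances; the dependency representation is hypothetical, and the actual clustering (A-KNN or the linkage methods compared in the paper) would operate on top of scores like these.

        # Hedged sketch of intra-package cohesion and inter-package coupling.
        # deps is an invented set of directed class dependencies.

        from itertools import combinations

        def cohesion(package, deps):
            # fraction of possible intra-package class pairs that depend on each other
            pairs = list(combinations(sorted(package), 2))
            if not pairs:
                return 1.0
            hits = sum(1 for a, b in pairs if (a, b) in deps or (b, a) in deps)
            return hits / len(pairs)

        def coupling(p1, p2, deps):
            # number of dependencies crossing the two packages
            return sum(1 for a in p1 for b in p2 if (a, b) in deps or (b, a) in deps)

        deps = {("A", "B"), ("B", "C"), ("C", "D")}
        print(cohesion({"A", "B", "C"}, deps), coupling({"A", "B", "C"}, {"D"}, deps))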

  11. The ATLAS online High Level Trigger framework experience reusing offline software components in the ATLAS trigger

    CERN Document Server

    Wiedenmann, W

    2009-01-01

    Event selection in the ATLAS High Level Trigger is accomplished to a large extent by reusing software components and event selection algorithms developed and tested in an offline environment. Many of these offline software modules are not specifically designed to run in a heavily multi-threaded online data flow environment. The ATLAS High Level Trigger (HLT) framework, based on the Gaudi and ATLAS Athena frameworks, forms the interface layer which allows the execution of the HLT selection and monitoring code within the online run control and data flow software. While such an approach provides a unified environment for trigger event selection across all of ATLAS, it also poses strict requirements on the reused software components in terms of performance, memory usage and stability. Experience of running the HLT selection software in the different environments, and especially on large multi-node trigger farms, has been gained in several commissioning periods using preloaded Monte Carlo events, in data taking peri...

  12. Recent Developments in Low-Level Software Security

    OpenAIRE

    Agten, Pieter; Nikiforakis, Nick; Strackx, Raoul; Groef, Willem; Piessens, Frank

    2012-01-01

    An important objective for low-level software security research is to develop techniques that make it harder to launch attacks that exploit implementation details of the system under attack. Baltopoulos and Gordon have summarized this as the principle of source-based reasoning for security: security properties of a software system should follow from review of the source code and its source-level semantics, and should not depend on details of the compi...

  13. Cross-layer restoration with software defined networking based on IP over optical transport networks

    Science.gov (United States)

    Yang, Hui; Cheng, Lei; Deng, Junni; Zhao, Yongli; Zhang, Jie; Lee, Young

    2015-10-01

    The IP over optical transport network is a very promising networking architecture for the interconnection of geographically distributed data centers, owing to its guarantees of low delay, huge bandwidth and high reliability at low cost. It can enable efficient resource utilization and support heterogeneous bandwidth demands in a highly-available, cost-effective and energy-effective manner. Ensuring a high-level quality of service (QoS) for user requests after a cross-layer link failure has become a research focus. In this paper, we propose a novel cross-layer restoration scheme for data center services with software defined networking based on IP over optical networks. The cross-layer restoration scheme enables joint optimization of IP network and optical network resources, and enhances the responsiveness of data center service restoration to dynamic end-to-end service demands. We quantitatively evaluate the feasibility and performance through simulation under a heavy traffic load scenario, in terms of path blocking probability and path restoration latency. Numeric results show that the cross-layer restoration scheme improves the recovery success rate and minimizes the overall recovery time.

  14. EOS MLS Level 2 Data Processing Software Version 3

    Science.gov (United States)

    Livesey, Nathaniel J.; VanSnyder, Livesey W.; Read, William G.; Schwartz, Michael J.; Lambert, Alyn; Santee, Michelle L.; Nguyen, Honghanh T.; Froidevaux, Lucien; Wang, Shuhui; Manney, Gloria L.

    2011-01-01

    This software accepts the EOS MLS calibrated microwave radiance products and operational meteorological data, and produces a set of estimates of atmospheric temperature and composition. This version has been designed to be as flexible as possible. The software is controlled by a Level 2 Configuration File that controls all aspects of the software: defining the contents of state and measurement vectors, defining the configurations of the various forward models available, reading appropriate a priori, spectroscopic and calibration data, performing retrievals, post-processing results, computing diagnostics, and outputting results in appropriate files. In production mode, the software operates in a parallel form, with one instance of the program acting as a master, coordinating the work of multiple slave instances on a cluster of computers, each computing the results for individual chunks of data. In addition to performing conventional retrieval calculations and producing geophysical products, the Level 2 Configuration File can instruct the software to produce files of simulated radiances based on a state vector formed from a set of geophysical product files taken as input. Combining both the retrieval and simulation tasks in a single piece of software makes it far easier to ensure that identical forward model algorithms and parameters are used in both tasks. This also dramatically reduces the complexity of the code maintenance effort.

  15. The ATLAS online High Level Trigger framework: Experience reusing offline software components in the ATLAS trigger

    International Nuclear Information System (INIS)

    Wiedenmann, Werner

    2010-01-01

    Event selection in the ATLAS High Level Trigger is accomplished to a large extent by reusing software components and event selection algorithms developed and tested in an offline environment. Many of these offline software modules are not specifically designed to run in a heavily multi-threaded online data flow environment. The ATLAS High Level Trigger (HLT) framework, based on the GAUDI and ATLAS ATHENA frameworks, forms the interface layer which allows the execution of the HLT selection and monitoring code within the online run control and data flow software. While such an approach provides a unified environment for trigger event selection across all of ATLAS, it also poses strict requirements on the reused software components in terms of performance, memory usage and stability. Experience of running the HLT selection software in the different environments, and especially on large multi-node trigger farms, has been gained in several commissioning periods using preloaded Monte Carlo events, in data taking periods with cosmic events and in a short period with proton beams from the LHC. The contribution discusses the architectural aspects of the HLT framework, its performance and its software environment within the ATLAS computing, trigger and data flow projects. Emphasis is also put on the architectural implications for the software arising from the use of multi-core processors in the computing farms, and on the experience gained with multi-threading and multi-process technologies.

  16. A Parallel Controls Software Approach for PEP II: AIDA and Matlab Middle Layer

    International Nuclear Information System (INIS)

    Wittmer, W.; Colocho, W.; White, G.

    2007-01-01

    The controls software in use at PEP II (Stanford Control Program - SCP) had originally been developed in the eighties. It is very successful in routine operation, but due to its internal structure it is difficult and time-consuming to extend its functionality. This is problematic during machine development and when solving operational issues. Routinely, data has to be exported from the system, analyzed offline, and the calculated settings have to be re-imported. Since this is a manual process, it is time-consuming and error-prone. Setting up automated processes, as is done for MIA (Model Independent Analysis), is also time-consuming and specific to each application. Recently, there has been a trend at light sources to use MATLAB as the platform to control accelerators, using a 'MATLAB Middle Layer' (MML) and so-called channel access (CA) programs to communicate with the low-level control system (LLCS). This has proven very successful, especially during machine development time and troubleshooting. A special CA code, named AIDA (Accelerator Independent Data Access), was developed to handle the communication between MATLAB, modern software frameworks, and the SCP. The MML had to be adapted for implementation at PEP II. Colliders differ significantly in their design from light sources, which poses a challenge. PEP II is the first collider at which this implementation is being done. We report on this effort, which is still ongoing.

  17. High-Level Synthesis: Productivity, Performance, and Software Constraints

    Directory of Open Access Journals (Sweden)

    Yun Liang

    2012-01-01

    FPGAs are an attractive platform for applications with high computation demand and low energy consumption requirements. However, design effort for FPGA implementations remains high, often an order of magnitude larger than design effort using high-level languages. Instead of this time-consuming process, high-level synthesis (HLS) tools generate hardware implementations from algorithm descriptions in languages such as C/C++ and SystemC. Such tools reduce design effort: high-level descriptions are more compact and less error-prone. HLS tools promise hardware development abstracted from the software designer's knowledge of the implementation platform. In this paper, we present an unbiased study of the performance, usability and productivity of HLS using AutoPilot (a state-of-the-art HLS tool). In particular, we first evaluate AutoPilot using popular embedded benchmark kernels. Then, to evaluate the suitability of HLS for real-world applications, we perform a case study of stereo matching, an active area of computer vision research that uses techniques also common to image denoising, image retrieval, feature matching, and face recognition. Based on our study, we provide insights into current limitations of mapping general-purpose software to hardware using HLS and some future directions for HLS tool development. We also offer several guidelines for hardware-friendly software design. For popular embedded benchmark kernels, the designs produced by HLS achieve 4X to 126X speedup over the software version. The stereo matching algorithms achieve between 3.5X and 67.9X speedup over software (but still less than manual RTL design), with a fivefold reduction in design effort versus manual RTL design.

  18. EOS MLS Level 1B Data Processing Software. Version 3

    Science.gov (United States)

    Perun, Vincent S.; Jarnot, Robert F.; Wagner, Paul A.; Cofield, Richard E., IV; Nguyen, Honghanh T.; Vuu, Christina

    2011-01-01

    This software is an improvement on Version 2, which was described in "EOS MLS Level 1B Data Processing, Version 2.2," NASA Tech Briefs, Vol. 33, No. 5 (May 2009), p. 34. It accepts the EOS MLS Level 0 science/engineering data and the EOS Aura spacecraft ephemeris/attitude data, and produces calibrated instrument radiances and associated engineering and diagnostic data. This version makes the code more robust, improves calibration, provides more diagnostic outputs, defines the Galactic core more finely, and fixes the equator crossing. The Level 1 processing software manages several different tasks. It qualifies each data quantity using instrument configuration and checksum data, as well as data transmission quality flags. Statistical tests are applied for data quality and reasonableness. The instrument engineering data (e.g., voltages, currents, temperatures, and encoder angles) are calibrated by the software, and the filter channel space reference measurements are interpolated onto the times of each limb measurement, with the interpolated values being differenced from the measurements. Filter channel calibration target measurements are interpolated onto the times of each limb measurement, and are used to compute radiometric gain. The total signal power is determined and analyzed by each digital autocorrelator spectrometer (DACS) during each data integration. The software converts each DACS data integration from an autocorrelation measurement in the time domain into a spectral measurement in the frequency domain, and estimates separately the spectrally smoothly-varying and spectrally-averaged components of the limb port signal arising from antenna emission and scattering effects. Limb radiances are also calibrated.
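
    The DACS time-to-frequency conversion mentioned above is, in essence, a Wiener-Khinchin transform. The sketch below shows just that core step under simplifying assumptions; the windowing, calibration and gain handling of the real Level 1B code are omitted.

        # Hedged sketch: converting an autocorrelation measurement into a power
        # spectrum via the Wiener-Khinchin relation, the kind of transform a
        # digital autocorrelator spectrometer (DACS) pipeline performs.

        import numpy as np

        def autocorr_to_spectrum(acf):
            # acf: one-sided autocorrelation function, lags 0..N-1
            symmetric = np.concatenate([acf, acf[-2:0:-1]])  # enforce even symmetry
            return np.fft.rfft(symmetric).real               # real power spectrum

        acf = np.exp(-0.1 * np.arange(64))                   # toy decaying ACF
        print(autocorr_to_spectrum(acf)[:5])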

  19. Integrated software system for low level waste management

    International Nuclear Information System (INIS)

    Worku, G.

    1995-01-01

    In the continually changing and uncertain world of low-level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. Given the wide use of computers and the high level of computer literacy today, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications.

  20. Inferring Parametric Energy Consumption Functions at Different Software Levels

    DEFF Research Database (Denmark)

    Liqat, Umer; Georgiou, Kyriakos; Kerrison, Steve

    2016-01-01

    The static estimation of the energy consumed by program executions is an important challenge, which has applications in program optimization and verification, and is instrumental in energy-aware software development. Our objective is to estimate such energy consumption in the form of functions on the input data sizes of programs. We have developed a tool for experimentation with static analysis which infers such energy functions at two levels, the instruction set architecture (ISA) and the intermediate code (LLVM IR) levels, and reflects them upwards to the higher source code level. This required the development of a translation from LLVM IR to an intermediate representation and its integration with existing components, a translation from ISA to the same representation, a resource analyzer, an ISA-level energy model, and a mapping from this model to LLVM IR. The approach has been applied to programs...
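
    A toy illustration of what a parametric energy function of input data size looks like once instruction-level costs are composed; the per-instruction energies below are invented numbers, not the paper's ISA model.

        # Hedged sketch: composing a parametric energy function E(n) = a + b*n
        # from a toy instruction-level energy model. Costs are illustrative only.

        ENERGY_NJ = {"add": 0.5, "mul": 1.2, "load": 2.0, "store": 2.1, "branch": 0.9}

        def loop_energy(body_instrs, setup_instrs):
            """Return E(n) for a loop whose body executes n times."""
            e_setup = sum(ENERGY_NJ[i] for i in setup_instrs)
            e_body = sum(ENERGY_NJ[i] for i in body_instrs)
            return lambda n: e_setup + n * e_body

        dot_product = loop_energy(["load", "load", "mul", "add", "branch"], ["add"])
        print(dot_product(1000))  # energy estimate in nJ for input size 1000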

  1. A Layered Software Architecture for the Management of a Manufacturing Company

    Directory of Open Access Journals (Sweden)

    Domenico CONSOLI

    2011-01-01

    In this paper we describe a layered software architecture for the management of a manufacturing company that intensively uses computer technology. Application tools, new and legacy, after updating, operate in the context of an open web-oriented architecture. The software architecture enables integration and interoperability among all tools that support business processes. Manufacturing Execution System and Text Mining tools are excellent interfaces, the former for internal production and management processes and the latter for external processes coming from the market. In this way it is possible to implement a computer-integrated factory, flexible and agile, that responds immediately to customer requirements.

  2. Phosphorus and phytase levels for layer hens

    OpenAIRE

    Juliana Cristina Ramos Rezende; Antonio Carlos de Laurentiz; Rosemeire da Silva Filardi; Vitor Barbosa Fascina; Daniella Aparecida Berto; Sérgio Turra Sobrane Filho

    2013-01-01

    The objective of this research was to evaluate the performance and bone quality of laying hens fed, after peak production, diets containing different phosphorus levels and phytase. An experiment was conducted with 384 Hy-Line hens distributed in a completely randomized design in a 4 x 3 factorial arrangement, with 4 levels of available phosphorus and 3 levels of phytase. The experimental period was divided into four periods of 28 days; at the end of each cycle the following were determined: feed intake, egg production, egg weight,...

  3. Phosphorus and phytase levels for layer hens

    Directory of Open Access Journals (Sweden)

    Juliana Cristina Ramos Rezende

    2013-02-01

    The objective of this research was to evaluate the performance and bone quality of laying hens fed, after peak production, diets containing different phosphorus levels and phytase. An experiment was conducted with 384 Hy-Line hens distributed in a completely randomized design in a 4 x 3 factorial arrangement, with 4 levels of available phosphorus and 3 levels of phytase. The experimental period was divided into four periods of 28 days; at the end of each cycle, feed intake, egg production, egg weight, feed conversion and mortality were determined, along with average egg weight, shell thickness, Haugh units and specific gravity. At the end of the experimental period, the amounts of calcium and phosphorus excreted were determined by the method of total excreta collection, and one fowl per experimental unit was sacrificed for collection of bones and evaluation of the width, length and robustness of the femur and tibia. There was an interaction between phosphorus and phytase levels on feed intake, feed conversion and laying percentage. For all inclusion levels of phytase, the egg quality variables showed no significant differences. The treatments did not affect the bone characteristics of laying hens.

  4. Cross-layer shared protection strategy towards data plane in software defined optical networks

    Science.gov (United States)

    Xiong, Yu; Li, Zhiqiang; Zhou, Bin; Dong, Xiancun

    2018-04-01

    In order to ensure reliable data transmission on the data plane and minimize resource consumption, a novel protection strategy for the data plane is proposed for software defined optical networks (SDON). Firstly, we establish a SDON architecture with a hierarchical data plane, which is divided into four layers to obtain fine-grained bandwidth resources. Then, we design cross-layer routing and resource allocation based on this network architecture. By jointly considering the bandwidth resources on all layers, the SDN controller can allocate bandwidth to working and backup paths in an economical manner. Next, we construct auxiliary graphs and transform the shared protection problem into a graph vertex coloring problem, so that the resource consumption on backup paths can be further reduced. The simulation results demonstrate that the proposed protection strategy achieves lower protection overhead and a higher resource utilization ratio.
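
    The reduction to vertex coloring can be illustrated with a greedy coloring of a small, hypothetical conflict graph of backup paths; the actual strategy in the paper builds such auxiliary graphs from the network state.

        # Hedged sketch: shared protection as vertex coloring. Vertices are
        # backup paths; an edge joins two backup paths that may not share
        # resources (their working paths can fail together). Each color then
        # stands for one shareable unit of backup bandwidth. Greedy only.

        def greedy_coloring(conflicts):
            """conflicts: dict vertex -> set of conflicting vertices."""
            color = {}
            for v in sorted(conflicts):
                used = {color[u] for u in conflicts[v] if u in color}
                color[v] = next(c for c in range(len(conflicts)) if c not in used)
            return color

        conflicts = {"b1": {"b2"}, "b2": {"b1", "b3"}, "b3": {"b2"}}
        print(greedy_coloring(conflicts))  # b1 and b3 can share one backup unit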

  5. CGNS Mid-Level Software Library and Users Guide

    Science.gov (United States)

    Poirier, Diane; Smith, Charles A. (Technical Monitor)

    1998-01-01

    The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: - The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its I/O software, which handles the actual reading and writing of data from and to external storage media; - The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; - The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and - The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The CGNS Mid-level Library was designed to ease the implementation of CGNS by providing developers with a collection of handy I/O functions. Since knowledge of the ADF core is not required to use this library, it will greatly facilitate the task of interfacing with CGNS. There are currently 48 user callable functions that comprise the Mid-level library and are described in the Users Guide. The library is written in

  6. Los Alamos MAWST software layered on Westinghouse Savannah River Company's nuclear materials accountability system

    International Nuclear Information System (INIS)

    Whitty, W.J.; Smith, J.E.; Davis, J.M. Jr.

    1995-01-01

    The Los Alamos Safeguards Systems Group's Materials Accounting With Sequential Testing (MAWST) computer program was developed to fulfill DOE Order 5633.3B, which requires that inventory-difference control limits be based on variance propagation or another statistically valid technique. Westinghouse Savannah River Company (WSRC) developed a generic computerized accountability system, NucMAS, to satisfy accounting and reporting requirements for material balance areas. NucMAS maintains the calculation methods and the measurement information required to compute nuclear material transactions in elemental and isotopic masses by material type code. The Safeguards Systems Group designed and implemented, to WSRC's specifications, a software interface application called NucMASloe. It is a layered product for NucMAS that automatically converts a NucMAS data set to a format compatible with MAWST and runs MAWST. This paper traces the development of NucMASloe from the software requirements through the testing and demonstration stages. The general design constraints are described, as well as the difficulties encountered in interfacing an external software product (MAWST) with an existing classical accounting structure (NucMAS). The lessons learned from this effort, the design, and some of the software are directly applicable to the Local Area Network Material Accountability System (LANMAS) being sponsored by DOE

  7. Reference Architecture for Multi-Layer Software Defined Optical Data Center Networks

    Directory of Open Access Journals (Sweden)

    Casimer DeCusatis

    2015-09-01

    As cloud computing data centers grow larger and networking devices proliferate, many complex issues arise in the network management architecture. We propose a framework for multi-layer, multi-vendor optical network management using open standards-based software defined networking (SDN). Experimental results are demonstrated in a test bed consisting of three data centers interconnected by a 125 km metropolitan area network, running OpenStack with KVM and VMware components. Use cases include inter-data center connectivity via a packet-optical metropolitan area network, intra-data center connectivity using an optical mesh network, and SDN coordination of networking equipment within and between multiple data centers. We create and demonstrate original software to implement virtual network slicing and affinity policy-as-a-service offerings. Enhancements to synchronous storage backup, cloud exchanges, and Fibre Channel over Ethernet topologies are also discussed.

  8. The Software Architecture of the LHCb High Level Trigger

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The LHCb experiment is a spectrometer dedicated to the study of heavy flavor at the LHC. The rate of proton-proton collisions at the LHC is 15 MHz, but disk space limitations mean that only 3 kHz can be written to tape for offline processing. For this reason the LHCb data acquisition system -- the trigger -- plays a key role in selecting signal events and rejecting background. In contrast to previous experiments at hadron colliders, such as CDF or D0, the bulk of the LHCb trigger is implemented in software and deployed on a farm of 20k parallel processing nodes. This system, called the High Level Trigger (HLT), is responsible for reducing the rate from the maximum at which the detector can be read out, 1.1 MHz, to the 3 kHz which can be processed offline, and has 20 ms in which to process and accept/reject each event. In order to minimize systematic uncertainties, the HLT was designed from the outset to reuse the offline reconstruction and selection code, and is based around multiple independent and redunda...

  9. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis of core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board; the product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  10. Stakeholder co-development of farm level nutrient management software

    Science.gov (United States)

    Buckley, Cathal; Mechan, Sarah; Macken-Walsh, Aine; Heanue, Kevin

    2013-04-01

    Over the last few decades, intensification in the use of nitrogen (N) and phosphorus (P) in agricultural production has led to excessive accumulations of these nutrients in soils, groundwaters and surface water bodies (Sutton et al., 2011). According to the European Environment Agency (2012), despite some progress, diffuse pollution from agriculture is still significant in more than 40% of Europe's water bodies in rivers and coastal waters, and in one third of the water bodies in lakes and transitional waters. Recently it was estimated that approximately 29% of monitored river channel length is polluted to some degree across the Republic of Ireland, with agricultural sources suspected in 47 per cent of cases (EPA, 2012). Farm level management practices to reduce nutrient transfers from agricultural land to watercourses can be divided into source reduction and source interception approaches (Ribaudo et al., 2001). Source interception approaches involve capturing nutrients post-mobilisation through policy instruments such as riparian buffer zones or wetlands. Conversely, the source reduction approach is preventative in nature and promotes strict management of nutrients at farm and field level to reduce the risk of mobilisation in the first instance. This has the potential to deliver a double dividend: reduced nutrient loss to the wider ecosystem while maximising the economic return of agricultural production at field and farm level. Adoption and use of nutrient management plans among farmers is far from the norm. This research engages key farmer and extension stakeholders to explore how current nutrient management planning software and its outputs should be developed to make them more user-friendly and usable in a practical way. An open innovation technology co-development approach was adopted to investigate what is demanded by the end users - farm advisors and farmers. Open innovation is a knowledge management strategy that uses the input of stakeholders to improve

  11. A Real-Time GPP Software-Defined Radio Testbed for the Physical Layer of Wireless Standards

    NARCIS (Netherlands)

    Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.

    2005-01-01

    We present our contribution to the general-purpose-processor (GPP) based radio. We describe a baseband software-defined radio testbed for the physical layer of wireless LAN standards. All physical layer functions have been successfully mapped onto a Pentium 4 processor that performs these functions in

  12. Digital Level Layers for Digital Curve Decomposition and Vectorization

    Directory of Open Access Journals (Sweden)

    Laurent Provot

    2014-07-01

    The purpose of this paper is to present Digital Level Layers and show the motivations for working with such analytical primitives in the framework of Digital Geometry. We first compare their properties to morphological and topological counterparts, and then we explain how to recognize them and use them to decompose or vectorize digital curves and contours.
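
    A Digital Level Layer (DLL) is the set of integer points satisfying a double inequality h1 <= f(x, y) <= h2 for an analytical function f. A minimal membership test, with an invented quadratic f used purely for illustration, might look as follows.

        # Hedged sketch: a Digital Level Layer as an analytical primitive.
        # The function f and the bounds are illustrative, not from the paper.

        def in_dll(x, y, f, h1, h2):
            return h1 <= f(x, y) <= h2

        f = lambda x, y: x * x + y * y          # circles: f(x, y) = x^2 + y^2
        annulus = [(x, y) for x in range(-5, 6) for y in range(-5, 6)
                   if in_dll(x, y, f, h1=9, h2=16)]
        print(annulus)  # digital points between the circles of radius 3 and 4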

  13. Unified Multi-Layer among Software Defined Multi-Domain Optical Networks (Invited

    Directory of Open Access Journals (Sweden)

    Hui Yang

    2015-06-01

    Software defined networking (SDN), enabled by the OpenFlow protocol, has gained popularity; it can make the network programmable and accommodate both fixed and flexible bandwidth services. In this paper, we present a unified multi-layer (UML) architecture with multiple controllers and a dynamic orchestra plane (DOP) for software defined multi-domain optical networks. The proposed architecture can shield the differences among various optical devices from multiple vendors and the details of connecting heterogeneous networks. Cross-domain services with on-demand bandwidth can be deployed via unified interfaces provided by the dynamic orchestra plane. Additionally, a globalization strategy and practical capture of signal processing are presented based on the architecture. The overall feasibility and efficiency of the proposed architecture is experimentally verified on the control plane of our OpenFlow-based testbed. The performance of the globalization strategy under a heavy traffic load scenario is also quantitatively evaluated based on the UML architecture and compared with other strategies in terms of blocking probability, average hops, and average resource consumption.

  14. Physical layer impairments tolerance based lightpath provision in software defined optical network

    Institute of Scientific and Technical Information of China (English)

    Zhao Xianlong; Xu Xianze; Bai Huifeng

    2017-01-01

    As all-optical networks grow to ever increasing ultra-high speeds, communication quality suffers seriously from physical layer impairments (PLIs). The same problem exists in software defined optical networks (SDON) controlled by OpenFlow. Aiming to solve this problem, a PLIs-tolerance based lightpath provision scheme is proposed for OpenFlow controlled optical networks. The proposed approach not only takes the OSNR model to represent the linear PLIs factors, but also introduces nonlinear factors into the OSNR model. Thus, the proposed scheme is able to cover most PLIs factors of each optical link and to conduct optical lightpath provision with better communication quality. Moreover, a PLIs tolerance model is also set up and considered in this work, with some necessary extensions to the OpenFlow protocol to achieve better compatibility between physical layer impairment factors and various service connections. Simulation results show that the proposed scheme achieves better performance in terms of packet loss rate and connection setup time.

  15. Demonstration of a Concurrently Programmed Tactical Level Control Software for Autonomous Vehicles and the Interface to the Execution Level Code

    National Research Council Canada - National Science Library

    Carroll, William

    2000-01-01

    One of the greatest challenges to the successful development of truly autonomous vehicles is the ability to link logically based high-level mission planning with low-level vehicle control software...

  16. Sound pressure level tools design used in occupational health by means of Labview software

    Directory of Open Access Journals (Sweden)

    Farhad Forouharmajd

    2015-01-01

    Conclusion: LabVIEW's programming capabilities in the field of sound include sound measurement, frequency analysis, and sound control; the software in effect acts as a sound level meter and sound analyzer. Given these features, the software can be used to analyze and process sound and vibration as a monitoring system.
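
    The core computation of such a sound level meter, shown here in Python rather than LabVIEW for brevity: an RMS sound pressure converted to dB SPL against the 20 µPa reference. Frequency weighting and calibration are omitted.

        # Hedged sketch of a sound level meter's core step: RMS pressure -> dB SPL.

        import math

        P_REF = 20e-6  # reference sound pressure in pascals

        def spl_db(samples):
            """samples: sound pressure samples in pascals."""
            rms = math.sqrt(sum(s * s for s in samples) / len(samples))
            return 20.0 * math.log10(rms / P_REF)

        # 1 kHz tone with 0.2 Pa amplitude, sampled at 48 kHz for one second
        tone = [0.2 * math.sin(2 * math.pi * 1000 * t / 48000) for t in range(48000)]
        print(round(spl_db(tone), 1))  # about 77 dB SPL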

  17. Unibert - PC software for radiometric level gauging - the LB440 measuring system

    International Nuclear Information System (INIS)

    Mann, H.; Bickert, M.

    2001-01-01

    In almost all industrial branches, radiometric measuring systems are used today for many different tasks. The most common application is level gauging by means of gamma radiation, i.e. level detection as well as continuous level measurement over ranges of up to several meters. For our level gauging system LB440 we developed a clearly arranged PC software which allows start-up, measurement and servicing of the level gauge. Via the RS232 interface, the industrial computer can be connected to a laptop or PC. The software supplements, or can even substitute for, operation via the front panel. The measuring system can be completely controlled by the Unibert PC software, implemented in LabVIEW 5.1.1, which offers an interactive graphical user interface. The same functionality as in the embedded software is available, completed with some additional functions. (orig.)

  18. Managing Risk Areas in Software Development Offshoring: A CMMI Level 5 Case

    DEFF Research Database (Denmark)

    Persson, John Stouby; Schlichter, Bjarne Rerup

    2015-01-01

    Software companies are increasingly offshoring development to countries with high expertise at lower cost. Offshoring involves particular risk areas that, if ignored, increase the likelihood of failure. However, the offshoring client's maturity level may influence the management of these risk areas. Against this backdrop, we present an interpretive case study of how managers perceive and mitigate the risk areas in software development offshoring with a mature CMMI level 5 (Capability Maturity Model, Integrated) software company as the client. We find that managers perceive and mitigate most...

  19. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    Science.gov (United States)

    2008-01-01

    The benefits of automatic application code generation are widely accepted within the software engineering community. These benefits include a raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is
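
    A toy sketch of the spec-to-code generation idea: tabular rows are translated into controller statements. The spec rows and the IEC-style structured text emitted below are invented; the actual LCS prototype targets ladder logic for PLCs.

        # Hedged sketch: generating controller code from a tabular specification.
        # All signal names and conditions are hypothetical.

        SPEC = [
            # (output coil, condition expression)
            ("OpenVentValve", "TankPressure > 95.0 AND NOT MaintenanceMode"),
            ("AlarmHorn",     "TankPressure > 110.0"),
        ]

        def generate_structured_text(spec):
            lines = ["(* auto-generated from tabular spec; do not edit by hand *)"]
            for coil, cond in spec:
                lines.append(f"{coil} := ({cond});")
            return "\n".join(lines)

        print(generate_structured_text(SPEC))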

  20. Nuclear and Particle Physics Simulations: The Consortium of Upper-Level Physics Software

    Science.gov (United States)

    Bigelow, Roberta; Moloney, Michael J.; Philpott, John; Rothberg, Joseph

    1995-06-01

    The Consortium for Upper Level Physics Software (CUPS) has developed a comprehensive series of nine book/software packages that Wiley will publish in FY '95 and '96. CUPS is an international group of 27 physicists, all with extensive backgrounds in the research, teaching, and development of instructional software. The project is being supported by the National Science Foundation (PHY-9014548), and it has received other support from the IBM Corp., Apple Computer Corp., and George Mason University. The simulations being developed are: Astrophysics, Classical Mechanics, Electricity & Magnetism, Modern Physics, Nuclear and Particle Physics, Quantum Mechanics, Solid State, Thermal and Statistical, and Waves and Optics.

  1. Software Process Improvement through the Removal of Project-Level Knowledge Flow Obstacles: The Perceptions of Software Engineers

    Science.gov (United States)

    Mitchell, Susan Marie

    2012-01-01

    Uncontrollable costs, schedule overruns, and poor end product quality continue to plague the software engineering field. Innovations formulated with the expectation to minimize or eliminate cost, schedule, and quality problems have generally fallen into one of three categories: programming paradigms, software tools, and software process…

  2. Incorporating Level-2 PSA Feature of CONPAS into AIMS-PSA Software

    International Nuclear Information System (INIS)

    Han, Sang Hoon; Lim, Hogon; Ahn, Kwang Il

    2014-01-01

    CONPAS (CONtainment Performance Analysis System) utilizes a methodology to treat containment phenomena in detail, like APET, but in a simple way. In the mid-2000s, KAERI developed the very fast cut set generator FTREX, and the PC operating system changed to Windows 95. Thus, KAERI developed new Level-1 PSA software, called AIMS-PSA (Advanced Information Management System for PSA), to replace KIRAP. Recently, KAERI has been developing an integrated PSA platform, called OCEANS (On-line Consolidator and Evaluator of All mode risk for Nuclear System), for the risk assessment of all power modes and all hazards. CONPAS for Level-2 PSA was developed in the 1990s using the Visual Basic 6.0 compiler, which is no longer supported; it needs to be updated for the integrated PSA software framework. This paper describes a study to incorporate the features of CONPAS into AIMS-PSA. The basic idea is to follow the approach of CONPAS, but in an integrated way. Various approaches for Level-2 PSA have been used since WASH-1400. The APET approach of the NUREG-1150 study is probably the most comprehensive and complex methodology for containment event tree analysis. CONPAS is Level-2 PSA software that treats containment phenomena in detail, like APET, but in a simple way. However, new Level-2 PSA software is required for a more integrated PSA framework. A modified approach of CONPAS is developed and incorporated into AIMS-PSA so that Level-1 and Level-2 PSA can be handled in an integrated way (from the viewpoint of event trees and fault trees). AIMS-PSA combines the whole Level-2 PSA model to produce a one-top fault tree and to generate cut sets in the same way as Level-1 PSA. Quantification results of Level-2 PSA, such as the frequency of each STC, can be calculated from the minimal cut sets

  3. Embedding Knowledge Processes to Maintain Service Levels and Efficiency in a Growing Software Service Firm

    NARCIS (Netherlands)

    Oostdam, M.; Verburg, R.M.; Lobbezoo, M.

    2013-01-01

    Software service firms are challenged to maintain high service levels and to innovate at the same time. Valuable human resources therefore often need to be balanced between innovation and operations related activities. In this paper we describe how such a firm deals with these issues by

  4. High level issues in reliability quantification of safety-critical software

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2012-01-01

    For the purpose of developing a consensus method for the reliability assessment of safety-critical digital instrumentation and control systems in nuclear power plants, several high-level issues in the reliability assessment of safety-critical software based on Bayesian belief network modeling and statistical testing are discussed. Related to Bayesian belief network modeling, the relation between the assessment approach and the sources of evidence, the relation between qualitative and quantitative evidence, how to consider qualitative evidence, and the cause-consequence relation are discussed. Related to statistical testing, the need to consider context-specific software failure probabilities and the inability to perform a huge number of tests in the real world are discussed. The discussions in this paper are expected to provide a common basis for future discussions on the reliability assessment of safety-critical software. (author)

  5. Tear film lipid layer: A molecular level view

    Czech Academy of Sciences Publication Activity Database

    Cwiklik, Lukasz

    2016-01-01

    Vol. 1858, No. 10 (2016), pp. 2421-2430. ISSN 0005-2736. R&D Projects: GA ČR GA15-14292S. Institutional support: RVO:61388955. Keywords: tear film * tear film lipid layer * molecular dynamics simulations. Subject RIV: CF - Physical; Theoretical Chemistry. Impact factor: 3.498, year: 2016

  6. Design of a Multi-layer Lane-Level Map for Vehicle Route Planning

    Directory of Open Access Journals (Sweden)

    Liu Chaoran

    2017-01-01

    With the development of intelligent transportation systems there is further demand for high-precision localization and route planning, which the traditional road-level map fails to meet; this is the motivation for this paper. In this paper, a three-layer lane-level map architecture for vehicle path guidance is established, and the mathematical models of the road-level layer, intermediate layer and lane-level layer are designed considering efficiency and precision. The geometric model of the lane-level layer of the map is characterized by cubic Hermite splines for continuity. A method of generating the lane geometry with fixed and variable control points is proposed, which can effectively ensure accuracy with a limited number of control points. In the experimental part, a multi-layer map of an intersection is built to validate the map model, and an example of a local map is generated with the lane-level geometry.
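
    The lane-level geometric primitive is a cubic Hermite segment; a minimal evaluator over the standard Hermite basis, with invented control points and tangents, is sketched below.

        # Hedged sketch: evaluating one cubic Hermite segment, the primitive
        # the lane-level layer uses for continuous lane geometry. p0, p1 are
        # endpoints, m0, m1 their tangents; all values are illustrative.

        def hermite(p0, p1, m0, m1, t):
            h00 = 2 * t**3 - 3 * t**2 + 1
            h10 = t**3 - 2 * t**2 + t
            h01 = -2 * t**3 + 3 * t**2
            h11 = t**3 - t**2
            return tuple(h00 * a + h10 * c + h01 * b + h11 * d
                         for a, b, c, d in zip(p0, p1, m0, m1))

        # sample the segment at 5 parameter values
        pts = [hermite((0, 0), (10, 2), (5, 0), (5, 0), t / 4) for t in range(5)]
        print(pts)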

  7. Open high-level data formats and software for gamma-ray astronomy

    Science.gov (United States)

    Deil, Christoph; Boisson, Catherine; Kosack, Karl; Perkins, Jeremy; King, Johannes; Eger, Peter; Mayer, Michael; Wood, Matthew; Zabalza, Victor; Knödlseder, Jürgen; Hassan, Tarek; Mohrmann, Lars; Ziegler, Alexander; Khelifi, Bruno; Dorner, Daniela; Maier, Gernot; Pedaletti, Giovanna; Rosado, Jaime; Contreras, José Luis; Lefaucheur, Julien; Brügge, Kai; Servillat, Mathieu; Terrier, Régis; Walter, Roland; Lombardi, Saverio

    2017-01-01

    In gamma-ray astronomy, a variety of data formats and proprietary software have been traditionally used, often developed for one specific mission or experiment. Especially for ground-based imaging atmospheric Cherenkov telescopes (IACTs), data and software are mostly private to the collaborations operating the telescopes. However, there is a general movement in science towards the use of open data and software. In addition, the next-generation IACT instrument, the Cherenkov Telescope Array (CTA), will be operated as an open observatory. We have created a Github organisation at https://github.com/open-gamma-ray-astro where we are developing high-level data format specifications. A public mailing list was set up at https://lists.nasa.gov/mailman/listinfo/open-gamma-ray-astro and a first face-to-face meeting on the IACT high-level data model and formats took place in April 2016 in Meudon (France). This open multi-mission effort will help to accelerate the development of open data formats and open-source software for gamma-ray astronomy, leading to synergies in the development of analysis codes and eventually better scientific results (reproducible, multi-mission). This write-up presents this effort for the first time, explaining the motivation and context, the available resources and process we use, as well as the status and planned next steps for the data format specifications. We hope that it will stimulate feedback and future contributions from the gamma-ray astronomy community.

  8. Conception and validation software tools for the level 0 muon trigger of LHCb

    International Nuclear Information System (INIS)

    Aslanides, E.; Cachemiche, J. P.; Cogan, J.; Duval, P. Y.; Le Gac, R.; Hachon, F.; Leroy, O.; Liotard, P. L.; Marin, F.; Tsaregorodtsev, A.

    2009-01-01

    The Level-0 muon trigger processor of the LHCb experiment looks for straight particle tracks crossing the muon detector and measures their transverse momentum. It processes 40×10^6 proton-proton collisions per second. The tracking uses a road algorithm relying on the projectivity of the muon detector (the logical layout of the 5 muon stations is projective in y to the interaction point, and it is also projective in x when the bending in the horizontal direction introduced by the magnetic field is ignored). The architecture of the Level-0 muon trigger is complex, with a dense network of data interconnections. The design and validation of such an intricate system has only been possible with intense use of software tools for the detector simulation, the modelling of the hardware components' behaviour, and the validation. A database describing the data-flow is the cornerstone between the software and hardware components. (authors)

  9. Flexible event reconstruction software chains with the ALICE High-Level Trigger

    International Nuclear Information System (INIS)

    Ram, D; Breitner, T; Szostak, A

    2012-01-01

    The ALICE High-Level Trigger (HLT) has a large high-performance computing cluster at CERN whose main objective is to perform real-time analysis on the data generated by the ALICE experiment and scale it down to at most 4 GB/s, the current maximum mass-storage bandwidth available. Data flow in this cluster is controlled by a custom-designed software framework. It consists of a set of components which can communicate with each other via a common control interface. The software framework also supports the creation of different configurations based on the detectors participating in the HLT. These configurations define a logical data-processing “chain” of detector data-analysis components. Data flow through this software chain in a pipelined fashion, so that several events can be processed at the same time. An instance of such a chain can run and manage a few thousand physics analysis and data-flow components. The HLT software and the configuration scheme used in the 2011 heavy-ion runs of ALICE are discussed in this contribution.

  10. 3D Voronoi grid dedicated software for modeling gas migration in deep layered sedimentary formations with TOUGH2-TMGAS

    Science.gov (United States)

    Bonduà, Stefano; Battistelli, Alfredo; Berry, Paolo; Bortolotti, Villiam; Consonni, Alberto; Cormio, Carlo; Geloni, Claudio; Vasini, Ester Maria

    2017-11-01

    A full three-dimensional (3D) unstructured grid permits a great degree of flexibility when performing accurate numerical reservoir simulations. However, when the Integral Finite Difference Method (IFDM) is used for spatial discretization, constraints (arising from the required orthogonality between the segment connecting block nodes and the interface area between blocks) pose difficulties in the creation of grids with irregularly shaped blocks. The full 3D Voronoi approach guarantees respect of the IFDM constraints and allows the generation of grids that conform to geological formations and structural objects while providing higher grid resolution in volumes of interest. In this work, we present dedicated pre- and post-processing gridding software tools for the TOUGH family of numerical reservoir simulators, developed by the Geothermal Research Group of the DICAM Department, University of Bologna. VORO2MESH is a new software tool coded in C++, based on the voro++ library, allowing computation of the 3D Voronoi tessellation for a given domain and the creation of a ready-to-use TOUGH2 MESH file. If a set of geological surfaces is available, the software can directly generate the set of Voronoi seed points used for tessellation. In order to reduce the number of connections and so decrease computation time, VORO2MESH can produce a mixed grid with regular blocks (orthogonal prisms) and irregular blocks (polyhedral Voronoi blocks) at the points of contact between different geological formations. In order to visualize 3D Voronoi grids together with the results of numerical simulations, the functionality of the TOUGH2Viewer post-processor has been extended. We describe an application of VORO2MESH and TOUGH2Viewer to validate the two tools. The case study deals with the simulation of the migration of gases in deep layered sedimentary formations at basin scale using TOUGH2-TMGAS. A comparison between the simulation performances of unstructured and structured
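
    The IFDM orthogonality property that motivates the Voronoi approach is easy to see in a toy setting: by construction, the segment joining two Voronoi seeds is perpendicular to their shared cell face. A 2D sketch using scipy.spatial.Voronoi (the seed points are hypothetical; VORO2MESH itself works in 3D on the voro++ library):

        import numpy as np
        from scipy.spatial import Voronoi

        # Hypothetical seed points; VORO2MESH would derive these from
        # geological surfaces before running the 3D tessellation.
        rng = np.random.default_rng(0)
        seeds = rng.uniform(0.0, 100.0, size=(200, 2))

        vor = Voronoi(seeds)

        # Each ridge separates two seed blocks; the seed-to-seed segment is
        # orthogonal to the shared face, satisfying the IFDM constraint.
        connections = [(int(i), int(j)) for i, j in vor.ridge_points]
        print(len(connections), "block-to-block connections")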

  11. SWATCH Common software for controlling and monitoring the upgraded CMS Level-1 trigger

    CERN Document Server

    Lazaridis, Christos; Bunkowski, Karol; Codispoti, Giuseppe; Dirkx, Glenn; Ghabrous Larrea, Carlos; Lingemann, Joschka; Kreczko, Lukasz; Thea, Alessandro; Williams, Tom

    2017-01-01

    The Large Hadron Collider at CERN restarted in 2015 with a higher centre-of-mass energy of 13 TeV. The instantaneous luminosity is expected to increase significantly in the coming years. An upgraded Level-1 trigger system is being deployed in the CMS experiment in order to maintain the same efficiencies for searches and precision measurements as those achieved in the previous run. This system must be controlled and monitored coherently through software, with high operational efficiency. The legacy system is composed of approximately 4000 data processor boards of several custom application-specific designs. These boards are organised into several subsystems; each subsystem receives data from different detector systems (calorimeters, barrel/endcap muon detectors), or with differing granularity. These boards have been controlled and monitored by a medium-sized distributed system of over 40 computers and 200 processes. Only a small fraction of the control and monitoring software was common between the different s...

  12. Studies for a common selection software environment in ATLAS from the Level-2 Trigger to the offline reconstruction

    CERN Document Server

    Wiedenmann, W; Baines, J T M; Bee, C P; Biglietti, M; Bogaerts, A; Boisvert, V; Bosman, M; Brandt, S; Caron, B; Casado, M P; Cataldi, G; Cavalli, D; Cervetto, M; Comune, G; Corso-Radu, A; Di Mattia, A; Díaz-Gómez, M; Dos Anjos, A; Drohan, J; Ellis, Nick; Elsing, M; Epp, B; Etienne, F; Falciano, S; Farilla, A; George, S; Ghete, V M; González, S; Grothe, M; Kaczmarska, A; Karr, K M; Khomich, A; Konstantinidis, N P; Krasny, W; Li, W; Lowe, A; Luminari, L; Meessen, C; Mello, A G; Merino, G; Morettini, P; Moyse, E; Nairz, A; Negri, A; Nikitin, N V; Nisati, A; Padilla, C; Parodi, F; Pérez-Réale, V; Pinfold, J L; Pinto, P; Polesello, G; Qian, Z; Resconi, S; Rosati, S; Scannicchio, D A; Schiavi, C; Schörner-Sadenius, T; Segura, E; De Seixas, J M; Shears, T G; Sivoklokov, S Yu; Smizanska, M; Soluk, R A; Stanescu, C; Tapprogge, Stefan; Touchard, F; Vercesi, V; Watson, A T; Wengler, T; Werner, P; Wheeler, S; Wickens, F J; Wielers, M; Zobernig, G; NSS-MIC 2003 - IEEE Nuclear Science Symposium and Medical Imaging Conference, Part 1

    2004-01-01

    The ATLAS High Level Trigger's primary function of event selection will be accomplished with a Level-2 trigger farm and an Event Filter farm, both running software components developed in the ATLAS offline reconstruction framework. While this approach provides a unified software framework for event selection, it poses strict requirements on offline components critical for the Level-2 trigger. A Level-2 decision in ATLAS must typically be accomplished within 10 ms, with multiple events processed in concurrent threads. In order to address these constraints, prototypes have been developed that incorporate elements of the ATLAS Data Flow, High Level Trigger, and offline framework software. To realize a homogeneous software environment for offline components in the High Level Trigger, the Level-2 Steering Controller was developed. With electron/gamma- and muon-selection slices it has been shown that the required performance can be reached, if the offline components used are carefully designed and optimized ...

  13. A client-server software for the identification of groundwater vulnerability to pesticides at regional level.

    Science.gov (United States)

    Di Guardo, Andrea; Finizio, Antonio

    2015-10-15

    The groundwater VULnerability to PESticide software system (VULPES) is a user-friendly, GIS-based, client-server software system developed to identify areas vulnerable to pesticides at regional level making use of pesticide fate models. It is a Decision Support System aimed at assisting public policy makers in investigating areas sensitive to specific substances and in proposing limitations of use or mitigation measures. VULPES identifies so-called Uniform Geographical Units (UGUs), which are areas characterised by the same agro-environmental conditions. In each UGU it applies the PELMO model, obtaining the 80th percentile of the substance concentration at 1 m depth; VULPES then creates a vulnerability map in shapefile format which classifies the outputs by comparing them with a lower threshold set to the legal limit concentration in groundwater (0.1 μg/l). This paper describes the software structure in detail and presents a case study applying the herbicide terbuthylazine over the territory of the Lombardy region. Three zones with different degrees of vulnerability have been identified and described.
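
    The classification step described above reduces, per UGU, to comparing the 80th percentile of the simulated concentration with the 0.1 μg/l legal limit. A minimal sketch of that logic (the PELMO output values are invented for illustration):

        import numpy as np

        # Hypothetical PELMO results for one Uniform Geographical Unit:
        # yearly substance concentrations (ug/l) at 1 m depth.
        concentrations = np.array([0.02, 0.05, 0.11, 0.08, 0.30, 0.01, 0.15])

        p80 = np.percentile(concentrations, 80)   # VULPES-style metric

        LEGAL_LIMIT = 0.1  # legal limit concentration in groundwater, ug/l
        status = "vulnerable" if p80 >= LEGAL_LIMIT else "not vulnerable"
        print(f"80th percentile = {p80:.3f} ug/l -> {status}")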

  14. Common software for controlling and monitoring the upgraded CMS Level-1 trigger

    CERN Document Server

    Codispoti, Giuseppe

    2017-01-01

    The Large Hadron Collider restarted in 2015 with a higher centre-of-mass energy of 13 TeV. The instantaneous luminosity is expected to increase significantly in the coming years. An upgraded Level-1 trigger system was deployed in the CMS experiment in order to maintain the same efficiencies for searches and precision measurements as those achieved in 2012. This system must be controlled and monitored coherently through software, with high operational efficiency. The legacy system was composed of a large number of custom data processor boards; correspondingly, only a small fraction of the software was common between the different subsystems. The upgraded system is composed of a set of general-purpose boards that follow the MicroTCA specification and transmit data over optical links, resulting in a more homogeneous system. The associated software is based on generic components corresponding to the firmware blocks that are shared across different cards, regardless of the role that the card plays in the system. ...

  15. Clinical evaluation of monitor unit software and the application of action levels

    International Nuclear Information System (INIS)

    Georg, Dietmar; Nyholm, Tufve; Olofsson, Joergen; Kjaer-Kristoffersen, Flemming; Schnekenburger, Bruno; Winkler, Peter; Nystroem, Hakan; Ahnesjoe, Anders; Karlsson, Mikael

    2007-01-01

    Purpose: The aim of this study was the clinical evaluation of independent dose and monitor unit verification (MUV) software based on sophisticated semi-analytical modelling. The software was developed within the framework of an ESTRO project. Finally, consistent handling of dose calculation deviations by applying individual action levels is discussed. Materials and methods: A Matlab-based software tool ('MUV') was distributed to five well-established treatment centres in Europe (Vienna, Graz, Basel, Copenhagen, and Umea) and evaluated as a quality assurance (QA) tool in clinical routine. Results were acquired for 226 individual treatment plans including a total of 815 radiation fields. About 150 beam verification measurements were performed for a portion of the individual treatment plans, mainly those with time-variable fluence patterns. The deviations between dose calculations performed with a treatment planning system (TPS) and the MUV software were scored with respect to treatment area, treatment technique, geometrical depth, radiological depth, etc. Results: In general, good agreement was found between calculations performed with the different TPSs and MUV, with a mean deviation per field of 0.2 ± 3.5% (1 SD) and a mean deviation of 0.2 ± 2.2% for composite treatment plans. For pelvic treatments, fewer than 10% of all fields showed deviations larger than 3%. In general, using the radiological depth for the verification calculations improved the results and reduced their spread significantly, especially for head-and-neck and thorax treatments. For IMRT head-and-neck beams, mean deviations between MUV and the local TPS were -1.0 ± 7.3% for dynamic and -1.3 ± 3.2% for step-and-shoot IMRT delivery. For dynamic IMRT beams in the pelvis, good agreement was obtained between MUV and the local TPS (mean: -1.6 ± 1.5%). Treatment-site and treatment-technique dependent action levels between ±3% and ±5% seem to be clinically realistic if a radiological depth
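
    The action-level handling discussed above amounts to flagging any field whose TPS-vs-MUV deviation exceeds a site- and technique-dependent tolerance. A minimal sketch of such a check (the action-level table and dose values are hypothetical, chosen within the ±3% to ±5% range the study suggests):

        # Assumed action levels (%) per treatment site.
        ACTION_LEVELS = {"pelvis": 3.0, "head-and-neck": 5.0, "thorax": 5.0}

        def field_passes(site: str, dose_tps: float, dose_muv: float) -> bool:
            """True if |percentage deviation| is within the action level."""
            deviation = 100.0 * (dose_muv - dose_tps) / dose_tps
            return abs(deviation) <= ACTION_LEVELS[site]

        print(field_passes("pelvis", dose_tps=2.00, dose_muv=1.97))  # True, -1.5%
        print(field_passes("thorax", dose_tps=2.00, dose_muv=2.14))  # False, +7.0%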

  16. Control, Test and Monitoring Software Framework for the ATLAS Level-1 Calorimeter Trigger

    CERN Document Server

    Achenbach, R; Aharrouche, M; Andrei, V; Åsman, B; Barnett, B M; Bauss, B; Bendel, M; Bohm, C; Booth, J R A; Bracinik, J; Brawn, I P; Charlton, D G; Childers, J T; Collins, N J; Curtis, C J; Davis, A O; Eckweiler, S; Eisenhandler, E F; Faulkner, P J W; Fleckner, J; Föhlisch, F; Gee, C N P; Gillman, A R; Goringer, C; Groll, M; Hadley, D R; Hanke, P; Hellman, S; Hidvegi, A; Hillier, S J; Johansen, M; Kluge, E E; Kühl, T; Landon, M; Lendermann, V; Lilley, J N; Mahboubi, K; Mahout, G; Meier, K; Middleton, R P; Moa, T; Morris, J D; Müller, F; Neusiedl, A; Ohm, C; Oltmann, B; Perera, V J O; Prieur, D P F; Qian, W; Rieke, S; Rühr, F; Sankey, D P C; Schäfer, U; Schmitt, K; Schultz-Coulon, H C; Silverstein, S; Sjölin, J; Staley, R J; Stamen, R; Stockton, M C; Tan, C L A; Tapprogge, S; Thomas, J P; Thompson, P D; Watkins, P M; Watson, A; Weber, P; Wessels, M; Wildt, M

    2008-01-01

    The ATLAS first-level calorimeter trigger is a hardware-based system designed to identify high-pT jets, electron/photon and tau candidates and to measure total and missing ET in the ATLAS calorimeters. The complete trigger system consists of over 300 custom-designed VME modules of varying complexity. These modules are based around FPGAs or ASICs with many configurable parameters, both to initialize the system with correct calibrations and timings and to allow flexibility in the trigger algorithms. The control, testing and monitoring of these modules requires a comprehensive, but well-designed and modular, software framework, which we describe in this paper.

  17. ATLAS High Level Calorimeter Trigger Software Performance for Cosmic Ray Events

    CERN Document Server

    Oliveira Damazio, Denis; The ATLAS collaboration

    2009-01-01

    The ATLAS detector is undergoing an intense commissioning effort with cosmic rays in preparation for the first LHC collisions next spring. Combined runs with all of the ATLAS subsystems are being taken in order to evaluate the detector performance. This is also a unique opportunity for the trigger system to be studied under different detector operation modes, such as different event rates and detector configurations. The ATLAS trigger starts with a hardware-based system which tries to identify detector regions where interesting physics objects may be found (e.g. large energy depositions in the calorimeter system). An accepted event will be further processed by more complex software algorithms at the second level, where detailed features are extracted (full-granularity data for small portions of the detector are available). Events accepted at this level will be further processed at the so-called event filter level. Full detector data at full granularity are available for offline-like processing with complete calib...

  18. Jansen-MIDAS: A multi-level photomicrograph segmentation software based on isotropic undecimated wavelets.

    Science.gov (United States)

    de Siqueira, Alexandre Fioravante; Cabrera, Flávio Camargo; Nakasuga, Wagner Massayuki; Pagamisse, Aylton; Job, Aldo Eloizo

    2018-01-01

    Image segmentation, the process of separating the elements within a picture, is frequently used for obtaining information from photomicrographs. Segmentation methods should be used with care, since incorrect results can mislead the interpretation of regions of interest (ROI) and decrease the success rate of subsequent procedures. Multi-Level Starlet Segmentation (MLSS) and Multi-Level Starlet Optimal Segmentation (MLSOS) were developed as an alternative to general segmentation tools. These methods gave rise to Jansen-MIDAS, an open-source software package. A scientist can use it to obtain several segmentations of his or her photomicrographs. It is a reliable alternative for processing different types of photomicrographs: previous versions of Jansen-MIDAS were used to segment ROIs in photomicrographs of two different materials with an accuracy above 89%.

  19. Algorithms and programs for processing of satellite data on ozone layer and UV radiation levels

    International Nuclear Information System (INIS)

    Borkovskij, N.B.; Ivanyukovich, V.A.

    2012-01-01

    Algorithms and programs for automatically retrieving and processing satellite data on the ozone layer are discussed. These techniques are used for reliable short-term forecasting of UV-radiation levels. (authors)

  20. Virtual Systems Pharmacology (ViSP) software for mechanistic system-level model simulations

    Directory of Open Access Journals (Sweden)

    Sergey Ermakov

    2014-10-01

    Multiple software programs are available for designing and running the large-scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which can increase model development time and IT costs. It is therefore desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time, the full model specification is preserved by exposing all model parameters as input parameters of the executable. The platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database stores and manages all aspects of the system, such as models, virtual patients, user interface settings, and results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.

  1. The High-Level Interface Definitions in the ASTRI/CTA Mini Array Software System (MASS)

    Science.gov (United States)

    Conforti, V.; Tosti, G.; Schwarz, J.; Bruno, P.; Cefalà, M.; Paola, A. D.; Gianotti, F.; Grillo, A.; Russo, F.; Tanci, C.; Testa, V.; Antonelli, L. A.; Canestrari, R.; Catalano, O.; Fiorini, M.; Gallozzi, S.; Giro, E.; Palombara, N. L.; Leto, G.; Maccarone, M. C.; Pareschi, G.; Stringhetti, L.; Trifoglio, M.; Vercellone, S.; Astri Collaboration; Cta Consortium

    2015-09-01

    ASTRI (Astrofisica con Specchi a Tecnologia Replicante Italiana) is a Flagship Project funded by the Italian Ministry of Education, University and Research, and led by INAF, the Italian National Institute of Astrophysics. Within this framework, INAF is currently developing an end-to-end prototype, named ASTRI SST-2M, of a Small Size Dual-Mirror Telescope for the Cherenkov Telescope Array, CTA. A second goal of the project is the realization of the ASTRI/CTA mini-array, which will be composed of seven SST-2M telescopes placed at the CTA Southern Site. The ASTRI Mini Array Software System (MASS) is designed to support the ASTRI/CTA mini-array operations. MASS is being built on top of the ALMA Common Software (ACS) framework, which provides support for the implementation of distributed data acquisition and control systems, together with functionality for log and alarm management, message-driven communication and hardware device management. The first version of the MASS system, which will comply with the CTA requirements and guidelines, will be tested on the ASTRI SST-2M prototype. In this contribution we present the interface definitions of the MASS high-level components in charge of the ASTRI SST-2M observation scheduling, telescope control and monitoring, and data taking. Particular emphasis is given to their potential reuse for the ASTRI/CTA mini-array.

  2. Relationship-Oriented Software Defined AS-Level Fast Rerouting for Multiple Link Failures

    Directory of Open Access Journals (Sweden)

    Chunxiu Li

    2015-01-01

    Large-scale deployments of mission-critical services have placed stringent demands on Internet routing, but frequently occurring network failures can dramatically degrade network performance, and the Border Gateway Protocol (BGP) cannot react quickly enough to recover from them. Although extensive research has been conducted on this problem, multiple-failure scenarios have never been properly addressed, owing to the limits of the distributed control plane. In this paper, we propose a local fast-reroute approach to recover effectively from multiple link failures within one administrative domain. The principle of Software Defined Networking (SDN) is used to achieve software-defined AS-level fast rerouting. Taking AS relationships into account, efficient algorithms are proposed to automatically and dynamically find protection paths for multiple link failures; OpenFlow forwarding rules are then installed on routers to provide data-forwarding continuity. The approach applies to ASes with flexibility and adapts to multiple link failures, contributing toward improved network performance. Through experimental results, we show that our proposal provides effective failure recovery and does not introduce significant control overhead to the network.
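
    At its core, the rerouting step described above recomputes a path on the AS graph with the failed links removed and then pushes the result to routers as OpenFlow rules. A toy sketch of the path-recomputation part using networkx (the topology, weights, and function names are hypothetical; a real controller would also encode AS-relationship policy):

        import networkx as nx

        # Toy AS-level topology; weights stand in for relationship-derived
        # preferences (customer/peer/provider) in the real algorithm.
        g = nx.Graph()
        g.add_weighted_edges_from([
            ("AS1", "AS2", 1), ("AS2", "AS5", 1), ("AS1", "AS3", 2),
            ("AS3", "AS4", 2), ("AS4", "AS5", 2), ("AS2", "AS4", 3),
        ])

        def protection_path(graph, src, dst, failed_links):
            """Recompute a route avoiding the failed links, as the SDN
            controller would before installing OpenFlow forwarding rules."""
            h = graph.copy()
            h.remove_edges_from(failed_links)
            return nx.shortest_path(h, src, dst, weight="weight")

        # Two simultaneous link failures:
        print(protection_path(g, "AS1", "AS5", [("AS1", "AS2"), ("AS2", "AS4")]))
        # ['AS1', 'AS3', 'AS4', 'AS5']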

  3. Service Level Agreements as Vehicles for Managing Acquisition of Software-Intensive Systems

    National Research Council Canada - National Science Library

    Gaines, Leonard T; Michael, James B

    2005-01-01

    ... to support quality and process control throughout the entire lifecycle of a software-intensive system. This article defines SLAs, discusses software quality, and describes how SLAs can be utilized to incorporate requirements pertaining to product, process, project, and deployment quality throughout the software lifecycle.

  4. Fostering Multirepresentational Levels of Chemical Concepts: A Framework to Develop Educational Software

    Science.gov (United States)

    Marson, Guilherme A.; Torres, Bayardo B.

    2011-01-01

    This work presents a convenient framework for developing interactive chemical education software to facilitate the integration of macroscopic, microscopic, and symbolic dimensions of chemical concepts--specifically, via the development of software for gel permeation chromatography. The instructional role of the software was evaluated in a study…

  5. Estimation of combined sewer overflow discharge: a software sensor approach based on local water level measurements.

    Science.gov (United States)

    Ahm, Malte; Thorndahl, Søren; Nielsen, Jesper E; Rasmussen, Michael R

    2016-12-01

    Combined sewer overflow (CSO) structures are constructed to effectively discharge excess water during heavy rainfall, protecting the urban drainage system from hydraulic overload. Consequently, most CSO structures are not constructed according to the basic hydraulic principles of ideal measurement weirs, and it can therefore be a challenge to quantify their discharges. Quantification of CSO discharges is important in relation to the increased environmental awareness concerning the receiving water bodies. Furthermore, CSO discharge quantification is essential for closing the rainfall-runoff mass balance in combined sewer catchments. A closed mass balance is an advantage for the calibration of all urban drainage models based on mass-balance principles. This study presents three different software sensor concepts based on local water level sensors, which can be used to estimate CSO discharge volumes from hydraulically complex CSO structures. The three concepts were tested and verified under real practical conditions. All three concepts were accurate when compared to electromagnetic flow measurements.
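
    The simplest software sensor of this kind maps a measured water level to a discharge through a weir equation and integrates over the event; because real CSO structures deviate from ideal weirs, the paper's concepts calibrate and verify such relations against reference flow measurements. A sketch using the ideal rectangular sharp-crested weir formula (the crest width, discharge coefficient, and level samples are hypothetical):

        import math

        def weir_discharge(h, b, cd=0.62):
            """Idealised rectangular weir: Q = Cd*(2/3)*sqrt(2g)*b*h^1.5,
            with h the level above the crest (m), b the crest width (m),
            and Q in m^3/s."""
            g = 9.81
            return cd * (2.0/3.0) * math.sqrt(2.0*g) * b * max(h, 0.0) ** 1.5

        # Hypothetical overflow event sampled at 1-minute intervals.
        levels = [0.00, 0.04, 0.09, 0.12, 0.08, 0.02]  # m above crest
        volume = sum(weir_discharge(h, b=2.5) * 60.0 for h in levels)
        print(f"estimated overflow volume: {volume:.1f} m^3")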

  6. High-Level software requirements specification for the TWRS controlled baseline database system

    International Nuclear Information System (INIS)

    Spencer, S.G.

    1998-01-01

    This Software Requirements Specification (SRS) is an as-built document that presents the Tank Waste Remediation System (TWRS) Controlled Baseline Database (TCBD) in its current state. It was originally known as the Performance Measurement Control System (PMCS). Conversion to the new system name has not occurred within the current production system; therefore, for simplicity, all references to TCBD are equivalent to PMCS references, and this SRS references the PMCS designator from this point forward to capture the as-built SRS. This SRS is written at a high level and is intended to provide the design basis for the PMCS. The PMCS was first released as the electronic data repository for cost, schedule, and technical administrative baseline information for the TWRS Program. During its initial development, the PMCS was accepted by the customer, TWRS Business Management, with no formal documentation to capture the initial requirements

  7. Precision calibration of the silicon doping level in gallium arsenide epitaxial layers

    Science.gov (United States)

    Mokhov, D. V.; Berezovskaya, T. N.; Kuzmenkov, A. G.; Maleev, N. A.; Timoshnev, S. N.; Ustinov, V. M.

    2017-10-01

    An approach to precision calibration of the silicon doping level in gallium arsenide epitaxial layers is discussed, based on studying the dependence of the carrier density in the test GaAs layer on the silicon-source temperature using the Hall-effect and CV-profiling techniques. The parameters are measured by standard or certified measuring techniques and approved measuring instruments. It is demonstrated that the use of CV profiling to control the carrier density in the test GaAs layer, given thorough optimization of the measuring procedure, ensures the highest accuracy and reliability of doping-level calibration in the epitaxial layers, with a relative error no larger than 2.5%.

  8. Performance of Layers Fed Graded Levels of Blood-Rumen ...

    African Journals Online (AJOL)

    A total of 240 laying hens were fed graded levels of blood-rumen content mixture (BRCM) for a period of eight weeks. The study was designed to determine the level of BRCM that layers can tolerate in their diet. Feed intake by birds fed the control and 4% BRCM diets was comparable, but significantly higher (P<0.05) than those ...

  9. The use of generalised audit software by internal audit functions in a developing country: A maturity level assessment

    OpenAIRE

    D.P. van der Nest; Louis Smidt; Dave Lubbe

    2017-01-01

    This article explores the existing practices of internal audit functions in the locally controlled South African banking industry regarding the use of Generalised Audit Software (GAS). These practices are measured against a benchmark developed from recognised data-analytic maturity models, in order to assess the current maturity levels of the locally controlled South African banks in the use of this software for tests of controls. The literature review indicates that the use of GAS by internal audit functions is still at...

  10. Level-1 probability safety assessment of the Iranian heavy water reactor using SAPHIRE software

    Energy Technology Data Exchange (ETDEWEB)

    Faghihi, F. [Department of Nuclear Engineering, School of Engineering, Shiraz University, 71348-51153 Shiraz (Iran, Islamic Republic of); Research Center for Radiation Protection, Shiraz University, Shiraz (Iran, Islamic Republic of); Nuclear Safety Research Center, Shiraz University, Shiraz (Iran, Islamic Republic of)], E-mail: faghihif@shirazu.ac.ir; Ramezani, E. [Department of Nuclear Engineering, School of Engineering, Shiraz University, 71348-51153 Shiraz (Iran, Islamic Republic of); Yousefpour, F. [Atomic Energy Organization of Iran (AEOI), Tehran (Iran, Islamic Republic of); Mirvakili, S.M. [Department of Nuclear Engineering, School of Engineering, Shiraz University, 71348-51153 Shiraz (Iran, Islamic Republic of)

    2008-10-15

    The main goal of this review paper is to analyze the total core-damage frequency of the Iranian Heavy Water Research Reactor (IHWRR) against standard criteria and to determine the strengths and weaknesses of the reactor safety systems, towards improving its design and operation. The PSA considers the full-power state of the reactor, and this article presents a level-1 PSA analysis using the System Analysis Programs for Hands-On Integrated Reliability Evaluations (SAPHIRE) software, which is specifically designed to permit a listing of the potential accident sequences, compute their frequencies of occurrence and assign each sequence to a consequence. The method used for modeling the systems and accident sequences is the Large Fault Tree/Small Event Tree method. This level-1 PSA for the IHWRR indicates that, based on conservative assumptions, the total frequency of accidents that would lead to core damage from internal initiating events is 4.44E-05 per year of reactor operation.
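
    The Large Fault Tree/Small Event Tree idea can be illustrated in miniature: fault-tree gates combine basic-event probabilities into a system failure probability, which an event-tree sequence multiplies by an initiating-event frequency. The gates, redundancy, and numbers below are hypothetical, not the IHWRR model, and independence of basic events is assumed (no common-cause treatment):

        from math import prod

        def and_gate(probs):
            """Independent basic events under an AND gate."""
            return prod(probs)

        def or_gate(probs):
            """Independent basic events under an OR gate."""
            return 1.0 - prod(1.0 - p for p in probs)

        # Hypothetical mitigating system: two redundant trains, each failing
        # if its pump fails to start OR its valve fails to open (per demand).
        train = or_gate([3e-3, 1e-3])
        system_failure = and_gate([train, train])

        # Accident sequence: initiating-event frequency (per year) times the
        # conditional failure probability of the mitigating system.
        initiating_event_frequency = 1e-2
        core_damage_frequency = initiating_event_frequency * system_failure
        print(f"sequence CDF ~ {core_damage_frequency:.2e} per year")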

  11. Level-1 probability safety assessment of the Iranian heavy water reactor using SAPHIRE software

    International Nuclear Information System (INIS)

    Faghihi, F.; Ramezani, E.; Yousefpour, F.; Mirvakili, S.M.

    2008-01-01

    The main goal of this review paper is to analyze the total core-damage frequency of the Iranian Heavy Water Research Reactor (IHWRR) against standard criteria and to determine the strengths and weaknesses of the reactor safety systems, towards improving its design and operation. The PSA considers the full-power state of the reactor, and this article presents a level-1 PSA analysis using the System Analysis Programs for Hands-On Integrated Reliability Evaluations (SAPHIRE) software, which is specifically designed to permit a listing of the potential accident sequences, compute their frequencies of occurrence and assign each sequence to a consequence. The method used for modeling the systems and accident sequences is the Large Fault Tree/Small Event Tree method. This level-1 PSA for the IHWRR indicates that, based on conservative assumptions, the total frequency of accidents that would lead to core damage from internal initiating events is 4.44E-05 per year of reactor operation.

  12. Automated concept-level information extraction to reduce the need for custom software and rules development.

    Science.gov (United States)

    D'Avolio, Leonard W; Nguyen, Thien M; Goryachev, Sergey; Fiore, Louis D

    2011-01-01

    Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval. A 'learn by example' approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance. Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments, while precision was near or above 0.90 in all tasks. With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach to more specific clinical information extraction tasks. There was not one best configuration, supporting an iterative approach to model creation. Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation are available for download.
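
    The 'learn by example' loop described above, stripped to its essence, iterates over candidate feature/classifier configurations and keeps the top performer by F-measure. A toy stand-in using scikit-learn on synthetic data (the configurations and data are hypothetical; the actual system works on NLP-pipeline features):

        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.svm import LinearSVC
        from sklearn.model_selection import cross_val_score

        X, y = make_classification(n_samples=300, n_features=20, random_state=0)

        # Candidate configurations to evaluate automatically and iteratively.
        configs = {
            ("logreg", c): LogisticRegression(max_iter=1000, C=c)
            for c in (0.1, 1.0, 10.0)
        }
        configs.update({("linsvc", c): LinearSVC(C=c) for c in (0.1, 1.0)})

        scores = {key: cross_val_score(clf, X, y, scoring="f1").mean()
                  for key, clf in configs.items()}
        best = max(scores, key=scores.get)
        print(best, round(scores[best], 3))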

  13. Testing on a Large Scale Running the ATLAS Data Acquisition and High Level Trigger Software on 700 PC Nodes

    CERN Document Server

    Burckhart-Chromek, Doris; Adragna, P; Alexandrov, L; Amorim, A; Armstrong, S; Badescu, E; Baines, J T M; Barros, N; Beck, H P; Bee, C; Blair, R; Bogaerts, J A C; Bold, T; Bosman, M; Caprini, M; Caramarcu, C; Ciobotaru, M; Comune, G; Corso-Radu, A; Cranfield, R; Crone, G; Dawson, J; Della Pietra, M; Di Mattia, A; Dobinson, Robert W; Dobson, M; Dos Anjos, A; Dotti, A; Drake, G; Ellis, Nick; Ermoline, Y; Ertorer, E; Falciano, S; Ferrari, R; Ferrer, M L; Francis, D; Gadomski, S; Gameiro, S; Garitaonandia, H; Gaudio, G; George, S; Gesualdi-Mello, A; Gorini, B; Green, B; Haas, S; Haberichter, W N; Hadavand, H; Haeberli, C; Haller, J; Hansen, J; Hauser, R; Hillier, S J; Höcker, A; Hughes-Jones, R E; Joos, M; Kazarov, A; Kieft, G; Klous, S; Kohno, T; Kolos, S; Korcyl, K; Kordas, K; Kotov, V; Kugel, A; Landon, M; Lankford, A; Leahu, L; Leahu, M; Lehmann-Miotto, G; Le Vine, M J; Liu, W; Maeno, T; Männer, R; Mapelli, L; Martin, B; Masik, J; McLaren, R; Meessen, C; Meirosu, C; Mineev, M; Misiejuk, A; Morettini, P; Mornacchi, G; Müller, M; Garcia-Murillo, R; Nagasaka, Y; Negri, A; Padilla, C; Pasqualucci, E; Pauly, T; Perera, V; Petersen, J; Pope, B; Albuquerque-Portes, M; Pretzl, K; Prigent, D; Roda, C; Ryabov, Yu; Salvatore, D; Schiavi, C; Schlereth, J L; Scholtes, I; Sole-Segura, E; Seixas, M; Sloper, J; Soloviev, I; Spiwoks, R; Stamen, R; Stancu, S; Strong, S; Sushkov, S; Szymocha, T; Tapprogge, S; Teixeira-Dias, P; Torres, R; Touchard, F; Tremblet, L; Ünel, G; Van Wasen, J; Vandelli, W; Vaz-Gil-Lopes, L; Vermeulen, J C; von der Schmitt, H; Wengler, T; Werner, P; Wheeler, S; Wickens, F; Wiedenmann, W; Wiesmann, M; Wu, X; Yasu, Y; Yu, M; Zema, F; Zobernig, H; Computing In High Energy and Nuclear Physics

    2006-01-01

    The ATLAS Data Acquisition (DAQ) and High Level Trigger (HLT) software system will initially comprise 2000 PC nodes which take part in the control, event readout, second-level trigger and event filter operations. This large number of PCs will only be purchased shortly before data taking in 2007. The large CERN IT LXBATCH facility provided the opportunity to run online functionality tests in July 2005 over a period of five weeks on a farm whose size was increased stepwise from 100 up to 700 dual-processor PC nodes. The interplay of the control and monitoring software with the event readout, event building and the trigger software was exercised for the first time as an integrated system on this large scale. Running algorithms for the trigger selection and the event-filter processing tasks in the online environment at this scale was also new. A mechanism has been developed to package the offline software together with the DAQ/HLT software and to distribute it efficiently to this large PC cluster via peer-to-peer software. T...

  14. Testing on a Large Scale running the ATLAS Data Acquisition and High Level Trigger Software on 700 PC Nodes

    CERN Document Server

    Burckhart-Chromek, Doris; Adragna, P; Albuquerque-Portes, M; Alexandrov, L; Amorim, A; Armstrong, S; Badescu, E; Baines, J T M; Barros, N; Beck, H P; Bee, C; Blair, R; Bogaerts, J A C; Bold, T; Bosman, M; Caprini, M; Caramarcu, C; Ciobotaru, M; Comune, G; Corso-Radu, A; Cranfield, R; Crone, G; Dawson, J; Della Pietra, M; Di Mattia, A; Dobinson, Robert W; Dobson, M; Dos Anjos, A; Dotti, A; Drake, G; Ellis, Nick; Ermoline, Y; Ertorer, E; Falciano, S; Ferrari, R; Ferrer, M L; Francis, D; Gadomski, S; Gameiro, S; Garcia-Murillo, R; Garitaonandia, H; Gaudio, G; George, S; Gesualdi-Mello, A; Gorini, B; Green, B; Haas, S; Haberichter, W N; Hadavand, H; Haeberli, C; Haller, J; Hansen, J; Hauser, R; Hillier, S J; Hughes-Jones, R E; Höcker, A; Joos, M; Kazarov, A; Kieft, G; Klous, S; Kohno, T; Kolos, S; Korcyl, K; Kordas, K; Kotov, V; Kugel, A; Landon, M; Lankford, A; Le Vine, M J; Leahu, L; Leahu, M; Lehmann-Miotto, G; Liu, W; Maeno, T; Mapelli, L; Martin, B; Masik, J; McLaren, R; Meessen, C; Meirosu, C; Mineev, M; Misiejuk, A; Morettini, P; Mornacchi, G; Männer, R; Müller, M; Nagasaka, Y; Negri, A; Padilla, C; Pasqualucci, E; Pauly, T; Perera, V; Petersen, J; Pope, B; Pretzl, K; Prigent, D; Roda, C; Ryabov, Yu; Salvatore, D; Schiavi, C; Schlereth, J L; Scholtes, I; Seixas, M; Sloper, J; Sole-Segura, E; Soloviev, I; Spiwoks, R; Stamen, R; Stancu, S; Strong, S; Sushkov, S; Szymocha, T; Tapprogge, S; Teixeira-Dias, P; Torres, R; Touchard, F; Tremblet, L; Van Wasen, J; Vandelli, W; Vaz-Gil-Lopes, L; Vermeulen, J C; Wengler, T; Werner, P; Wheeler, S; Wickens, F; Wiedenmann, W; Wiesmann, M; Wu, X; Yasu, Y; Yu, M; Zema, F; Zobernig, H; von der Schmitt, H; Ünel, G; Computing In High Energy and Nuclear Physics

    2006-01-01

    The ATLAS Data Acquisition (DAQ) and High Level Trigger (HLT) software system will initially comprise 2000 PC nodes which take part in the control, event readout, second-level trigger and event filter operations. This large number of PCs will only be purchased shortly before data taking in 2007. The large CERN IT LXBATCH facility provided the opportunity to run online functionality tests in July 2005 over a period of five weeks on a farm whose size was increased stepwise from 100 up to 700 dual-processor PC nodes. The interplay of the control and monitoring software with the event readout, event building and the trigger software was exercised for the first time as an integrated system on this large scale. Running algorithms for the trigger selection and the event-filter processing tasks in the online environment at this scale was also new. A mechanism has been developed to package the offline software together with the DAQ/HLT software and to distribute it efficiently to this large PC cluster via peer-to-peer software. T...

  15. Optimization of intrinsic layer thickness, dopant layer thickness and concentration for a-SiC/a-SiGe multilayer solar cell efficiency performance using Silvaco software

    Directory of Open Access Journals (Sweden)

    Wei Yuan Wong

    2017-01-01

    Solar cells are expanding as a green, renewable alternative to conventional fossil-fuel electricity generation, but compared to other land-based electrical generators the technology is a relative newcomer. Solar cells cover many applications, from low-power mobile devices to terrestrial installations, satellites and more. To date, the highest-efficiency solar cell is the GaAs-based multilayer cell. However, this material is very expensive in fabrication and material costs compared to silicon, which is cheaper due to its abundance of supply. Thus, this research is devoted to developing a multilayer solar cell combining two different P-I-N structures, one of silicon carbide and one of silicon germanium. The research focuses on optimising the intrinsic layer thickness, the p-doped layer thickness and concentration, and the n-doped layer thickness and concentration to achieve the highest efficiency. As a result, both single-layer a-SiC and a-SiGe showed positive efficiency improvement, with records of 27.19% and 9.07% respectively via parametric optimization. The optimized parameters were then applied to both the SiC and SiGe P-I-N layers, resulting in a convincing efficiency of 33.80%.
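
    The parametric optimization reported above is, in workflow terms, a sweep over layer parameters with the simulator in the loop. A skeletal sketch (the grid values and the toy objective standing in for a Silvaco ATLAS run are hypothetical):

        from itertools import product

        def simulate_efficiency(i_thick, dope):
            """Stand-in for a Silvaco ATLAS run; a real workflow would write
            an input deck, execute the simulator, and parse the reported
            efficiency. A smooth toy function keeps the sketch runnable."""
            return (30.0 - 20.0 * (i_thick - 0.5) ** 2
                    - 2.0 * (abs(dope - 1e19) / 1e19) ** 2)

        # Hypothetical sweep: intrinsic-layer thickness (um), doping (cm^-3).
        i_thicknesses = [0.3, 0.4, 0.5, 0.6, 0.7]
        dopings = [1e18, 5e18, 1e19, 5e19]

        best = max(product(i_thicknesses, dopings),
                   key=lambda p: simulate_efficiency(*p))
        print("best (i_thick, dope):", best)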

  16. Optimization of intrinsic layer thickness, dopant layer thickness and concentration for a-SiC/a-SiGe multilayer solar cell efficiency performance using Silvaco software

    Science.gov (United States)

    Yuan, Wong Wei; Natashah Norizan, Mohd; Salwani Mohamad, Ili; Jamalullail, Nurnaeimah; Hidayah Saad, Nor

    2017-11-01

    Solar cells are expanding as a green, renewable alternative to conventional fossil-fuel electricity generation, but compared to other land-based electrical generators the technology is a relative newcomer. Solar cells cover many applications, from low-power mobile devices to terrestrial installations, satellites and more. To date, the highest-efficiency solar cell is the GaAs-based multilayer cell. However, this material is very expensive in fabrication and material costs compared to silicon, which is cheaper due to its abundance of supply. Thus, this research is devoted to developing a multilayer solar cell combining two different P-I-N structures, one of silicon carbide and one of silicon germanium. The research focuses on optimising the intrinsic layer thickness, the p-doped layer thickness and concentration, and the n-doped layer thickness and concentration to achieve the highest efficiency. As a result, both single-layer a-SiC and a-SiGe showed positive efficiency improvement, with records of 27.19% and 9.07% respectively via parametric optimization. The optimized parameters were then applied to both the SiC and SiGe P-I-N layers, resulting in a convincing efficiency of 33.80%.

  17. Effects of dietary clinoptilolite and calcium levels on the performance and egg quality of commercial layers

    Directory of Open Access Journals (Sweden)

    DA Berto

    2013-09-01

    Among the different feed additives studied in poultry production, clinoptilolite, an aluminosilicate capable of adsorbing harmful substances and of improving live performance and egg and meat quality, was evaluated. The objective of the present study was to evaluate the influence of dietary clinoptilolite and calcium levels on the performance and egg quality of layers. In total, 576 layers were distributed according to a completely randomized experimental design in a 3 x 4 factorial arrangement (three calcium levels - 2.5, 3.1, or 3.7% - and four clinoptilolite levels - 0.0, 0.15, 0.25, or 0.50%), with 12 treatments of six replicates of eight birds each. The experiment included four 28-d cycles. The experimental diets were based on corn and soybean meal. Results were submitted to analysis of variance and means were compared by Tukey's test at the 5% significance level using the SISVAR statistical package. There was a significant interaction between the evaluated factors for egg production and for feed conversion ratio per dozen eggs and per egg mass. The lowest calcium level resulted in worse performance and eggshell quality. Clinoptilolite levels affected albumen and yolk content. It was concluded that up to 0.50% inclusion of clinoptilolite in layer diets does not benefit layer performance or eggshell quality. Although the inclusion of only 2.5% calcium in layer diets is not recommended, it is possible to add 3.1%, as it promoted results similar to the recommended level of 3.7%.

  18. Improving Software Quality and Management Through Use of Service Level Agreements

    National Research Council Canada - National Science Library

    Gaines, Leonard T

    2005-01-01

    .... SLAs are typically used in outsourcing contracts for post-production support. We propose that SLAs be used in software acquisition to support quality and process control throughout the lifecycle...

  19. Improving Software Quality and Management Through Use of Service Level Agreements

    Science.gov (United States)

    2005-03-01

    many who believe that the quality of the development process is the best predictor of software product quality (Fenton). Repeatable software processes ... reduced errors per KLOC for small projects (Fenton), and the quality management metric (QMM) (Machniak, Osmundson). There are also numerous IEEE 14... attention to cosmetic user interface issues and any problems that may arise with the prototype (Sawyer). The validation process is also another check

  20. Code Description for Generation of Meteorological Height and Pressure Level and Layer Profiles

    Science.gov (United States)

    2016-06-01

    defined by user-input height or pressure levels. It can process input profiles from sensing systems such as radiosonde, lidar, or wind-profiling radar ... routine may be required for different input types and formats. Keywords: meteorological sounding interpolation, integrated mean layer values, US Army Research ... or other radiosonde soundings. There are two main versions or “methods” that produce output in height- or pressure-based profiles of interpolated level

  1. Software to compute elastostatic Green's functions for sources in 3D homogeneous elastic layers above a (visco)elastic halfspace

    Science.gov (United States)

    Bradley, A. M.; Segall, P.

    2012-12-01

    We describe software, in development, to calculate elastostatic displacement Green's functions and their derivatives for point and polygonal dislocations in three-dimensional homogeneous elastic layers above an elastic or a viscoelastic halfspace. The steps to calculate a Green's function for a point source at depth zs are as follows. 1. A grid in wavenumber space is chosen. 2. A six-element complex rotated stress-displacement vector x is obtained at each grid point by solving a two-point boundary value problem (2P-BVP). If the halfspace is viscoelastic, the solution is inverse Laplace transformed. 3. For each receiver, x is propagated to the receiver depth zr (often zr = 0) and then, in step 4, inverse Fourier transformed, with the Fourier component corresponding to the receiver's horizontal position. 5. The six elements are linearly combined into displacements and their derivatives. The dominant work is in step 2. The grid is chosen to represent the wavenumber-space solution with as few points as possible. First, the wavenumber space is transformed to increase sampling density near zero wavenumber. Second, a tensor-product grid of Chebyshev points of the first kind is constructed in each quadrant of the transformed wavenumber space. Moment-tensor-dependent symmetries further reduce the work. The numerical solution of the 2P-BVP in step 2 involves solving a linear equation A x = b. Half of the elements of x are of geophysical interest; the subset depends on whether zr ≤ zs. Denote these x̂. As the wavenumber k increases, x̂ can become inaccurate in finite-precision arithmetic for two reasons: 1. The condition number of A becomes too large. 2. The norm-wise relative error (NWRE) in x̂ is large even though it is small in x. To address this problem, a number of researchers have used determinants to obtain x. This may be the best approach for 6-dimensional or smaller 2P-BVPs, where the combinatorial increase in work is still moderate. But there is an alternative
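
    For reference, Chebyshev points of the first kind on an interval are given in closed form, so building the tensor-product grid mentioned above takes only a few lines (the interval and point counts below are hypothetical):

        import numpy as np

        def chebyshev_first_kind(a, b, n):
            """n Chebyshev points of the first kind mapped to [a, b]:
            x_k = cos((2k+1)*pi/(2n)), k = 0..n-1, rescaled from [-1, 1].
            Clustering near the interval ends lets a smooth wavenumber-space
            solution be represented with few samples."""
            k = np.arange(n)
            x = np.cos((2 * k + 1) * np.pi / (2 * n))
            return 0.5 * (a + b) + 0.5 * (b - a) * x

        # Hypothetical quadrant of the transformed wavenumber plane.
        kx = chebyshev_first_kind(0.0, 1.0, 32)
        ky = chebyshev_first_kind(0.0, 1.0, 32)
        KX, KY = np.meshgrid(kx, ky)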

  2. Establishment of institutional diagnostic reference level for computed tomography with automated dose-tracking software.

    Science.gov (United States)

    Liang, Chong R; Chen, Priscilla X H; Kapur, Jeevesh; Ong, Michael K L; Quek, Swee T; Kapur, Subhash C

    2017-06-01

    The aim of this study was to establish institutional diagnostic reference levels (DRLs) by summarising doses collected across the five computed tomography (CT) systems in our institution. CT dose data from 15940 patients were collected retrospectively from May 2015 to October 2015 on the five institutional scanners. The mean, 75th percentile and 90th percentile of the dose spread were calculated according to anatomic region. The common CT examinations, head, chest, combined abdomen/pelvis (A/P), and combined chest/abdomen/pelvis (C/A/P), were reviewed. Distributions of the CT dose index (CTDIvol), dose-length product (DLP) and effective dose (ED) were extracted from the data for single-phase and multiphase examinations. The institutional DRLs for our CT units were established as the median (50th percentile) of CTDIvol (mGy), DLP (mGy.cm) and ED (mSv) for single-phase and multiphase studies using the dose-tracking software. For single-phase examinations: head 49.0 mGy, 978.0 mGy.cm, 2.4 mSv; chest 6.0 mGy, 254.0 mGy.cm, 4.9 mSv; A/P 10.0 mGy, 514.0 mGy.cm, 8.9 mSv; C/A/P 10.0 mGy, 674.0 mGy.cm, 11.8 mSv. For multiphase studies: head 45.0 mGy, 1822.0 mGy.cm, 5.0 mSv; chest 8.0 mGy, 577.0 mGy.cm, 10.0 mSv; A/P 10.0 mGy, 1153.0 mGy.cm, 20.2 mSv; C/A/P 11.0 mGy, 1090.0 mGy.cm, 19.2 mSv. The reported metrics offer a variety of information that institutions can use for quality-improvement activities. The variations in dose between scanners suggest a large potential for optimisation of radiation dose.
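
    Deriving such DRL summaries from dose-tracking exports is essentially grouped percentile computation. A minimal sketch (the records and field layout are hypothetical; real exports carry many more fields):

        import numpy as np

        # Hypothetical export: (exam type, CTDIvol in mGy, DLP in mGy.cm).
        records = [
            ("head", 47.1, 930.0), ("head", 51.3, 1010.0), ("head", 48.8, 965.0),
            ("chest", 5.2, 240.0), ("chest", 6.9, 271.0), ("chest", 5.8, 248.0),
        ]

        def drl_summary(records, exam):
            ctdi = np.array([c for e, c, _ in records if e == exam])
            dlp = np.array([d for e, _, d in records if e == exam])
            return {
                "median CTDIvol": np.percentile(ctdi, 50),
                "75th pct CTDIvol": np.percentile(ctdi, 75),
                "90th pct DLP": np.percentile(dlp, 90),
            }

        print(drl_summary(records, "head"))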

  3. CPAs in Mississippi: Communication Skills and Software Needed by Entry-Level Accountants

    Science.gov (United States)

    Bunn, Phyllis C.; Barfit, Laurie A.; Cooper, Jan

    2005-01-01

    The purpose of this paper was to determine what communication skills are considered most important by employers in the accounting profession as well as to determine the general office, income tax, and bookkeeping software packages used by CPA firms in Mississippi. The data was collected by means of an electronic five-point Likert-type survey…

  4. Cyclic Voltammetry Simulations with DigiSim Software: An Upper-Level Undergraduate Experiment

    Science.gov (United States)

    Messersmith, Stephania J.

    2014-01-01

    An upper-division undergraduate chemistry experiment is described which utilizes DigiSim software to simulate cyclic voltammetry (CV). Four mechanisms were studied: a reversible electron transfer with no subsequent or preceding chemical reactions, a reversible electron transfer followed by a reversible chemical reaction, a reversible chemical…

  5. Improving Students’ Learning in Software Engineering Education through Multi-Level Assignments

    NARCIS (Netherlands)

    Dr. Leo Pruijt; Christian Köppe

    2014-01-01

    Author supplied: DOI: http://dx.doi.org/10.1145/2691352.2691357 Assignments and exercises are an essential part of software engineering education. A variety of these assignments is usually required to cover the desired wide range of educational objectives as defined in the revised Bloom's taxonomy.

  6. Performance assessment of the disposal of vitrified high-level waste in a clay layer

    International Nuclear Information System (INIS)

    Mallants, Dirk; Marivoet, Jan; Sillen, Xavier

    2001-01-01

    Deep disposal is considered a safe solution for the management of high-level radioactive waste. Its safety is usually demonstrated by means of a performance assessment. This paper discusses the methodological aspects and some of the results obtained in the performance assessment of the disposal of vitrified high-level waste in a clay layer in Belgium. The calculations consider radionuclide migration through the following multi-barrier components, all of which contribute to the overall safety: (1) the engineered barriers and the host clay layer, (2) the overlying aquifer, and (3) the biosphere. The interfaces between the aquifers and the biosphere are limited to the well and river pathways. Results of the performance assessment calculations are given in terms of the time evolution of the dose rates of the most important fission and activation products and actinides. The role of the glass matrix in the overall performance of the repository is also discussed.

  7. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property that indicates what standard a piece of software should meet. In a software project, quality is a key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good-quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  8. MoFi: A Software Tool for Annotating Glycoprotein Mass Spectra by Integrating Hybrid Data from the Intact Protein and Glycopeptide Level.

    Science.gov (United States)

    Skala, Wolfgang; Wohlschlager, Therese; Senn, Stefan; Huber, Gabriel E; Huber, Christian G

    2018-04-18

    Hybrid mass spectrometry (MS) is an emerging technique for characterizing glycoproteins, which typically display pronounced microheterogeneity. Since hybrid MS combines information from different experimental levels, it crucially depends on computational methods. Here, we describe a novel software tool, MoFi, which integrates hybrid MS data to assign glycans and other post-translational modifications (PTMs) in deconvoluted mass spectra of intact proteins. Its two-stage search algorithm first assigns monosaccharide/PTM compositions to each peak and then compiles a hierarchical list of glycan combinations compatible with these compositions. Importantly, the program only includes those combinations which are supported by a glycan library derived from glycopeptide or released-glycan analysis. By applying MoFi to mass spectra of rituximab, ado-trastuzumab emtansine, and recombinant human erythropoietin, we demonstrate how integration of bottom-up data may be used to refine information collected at the intact-protein level. Accordingly, our software reveals that a single mass can frequently be explained by a considerable number of glycoforms. Yet, it simultaneously ranks proteoforms according to their probability, based on a score calculated from relative glycan abundances. Notably, glycoforms that comprise identical glycans may nevertheless differ in score if those glycans occupy different sites. Hence, MoFi exposes the different layers of complexity present in the annotation of a glycoprotein mass spectrum.
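
    The first stage of such a search can be pictured as enumerating monosaccharide counts whose summed residue masses match an observed mass shift within a tolerance; the second stage then filters the hits against the glycan library. A brute-force sketch of stage one (residue masses are rounded averages, and the target mass is hypothetical):

        from itertools import product

        # Approximate average residue masses (Da).
        RESIDUES = {"Hex": 162.14, "HexNAc": 203.20, "Fuc": 146.14, "Neu5Ac": 291.26}

        def compositions(mass_delta, tol=2.0, max_count=8):
            """Enumerate monosaccharide compositions matching mass_delta."""
            names = list(RESIDUES)
            hits = []
            for counts in product(range(max_count + 1), repeat=len(names)):
                m = sum(n * RESIDUES[k] for n, k in zip(counts, names))
                if abs(m - mass_delta) <= tol:
                    hits.append({k: n for k, n in zip(names, counts) if n})
            return hits

        # Hypothetical shift between a proteoform peak and the bare backbone:
        for hit in compositions(1299.2):
            print(hit)
        # Several compositions typically match one mass, which is why the
        # second stage must filter against the experimentally derived library.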

  9. Mixed layer depths via Doppler lidar during low-level jet events

    Science.gov (United States)

    Carroll, Brian; Demoz, Belay; Bonin, Timothy; Delgado, Ruben

    2018-04-01

    A low-level jet (LLJ) is a prominent wind speed peak in the lower troposphere. Nocturnal LLJs have been shown to transport and mix atmospheric constituents from the residual layer down to the surface, breaching quiescent nocturnal conditions due to high wind shear. A new fuzzy logic algorithm combining turbulence and aerosol information from Doppler lidar scans can resolve the strength and depth of this mixing below the jet. Conclusions will be drawn about LLJ relations to turbulence and mixing.

  10. Normative data of outer photoreceptor layer thickness obtained by software image enhancing based on Stratus optical coherence tomography images

    DEFF Research Database (Denmark)

    Christensen, U.C.; Krøyer, K.; Thomadsen, Jakob

    2008-01-01

    backscattered light within the outer nuclear layer (ONL) in the fovea was registered and compared with backscattered light within the ONL in the peripheral part of the macula (I-ratio-ONL). Results: The mean RPE-OScomplex thickness in the foveal centre was 77.2 mu m (SD = 3.95). The RPE-OScomplex thickness...... in the superior macula 0.5-3 mm of the centre was significantly increased as compared with the corresponding inferior retina. In healthy subjects, the I-ratio-ONL was 1.06. Conclusions: Contrast-enhanced OCT images enable quantification of outer photoreceptor layer thickness, and normative values may help...

  11. Luminescence and deep-level transient spectroscopy of grown dislocation-rich Si layers

    Directory of Open Access Journals (Sweden)

    I. I. Kurkina

    2012-09-01

    Charge deep-level transient spectroscopy (Q-DLTS) is applied to the study of dislocation-rich Si layers grown on a surface composed of dense arrays of Ge islands prepared on the oxidized Si surface. Using stripe-shaped p-i-n diodes fabricated on the basis of these layers, three deep-level bands are revealed, located at E_V + 0.31 eV, E_C - 0.35 eV and E_C - 0.43 eV. The most interesting observation is the local-state recharging process, which proceeds with low activation energy (∼50 meV) or without activation. The recharging may occur by carrier tunneling within the deep-level bands owing to the high dislocation density of ∼10^11-10^12 cm^-2. This result favours the suggestion of carrier transport between the deep states, which was previously derived from the excitation dependence of the photoluminescence (PL) intensity. Electroluminescence (EL) spectra measured from the stripe edge of the same diodes contain two peaks centered near 1.32 and 1.55 μm. Comparison with PL spectra indicates that the EL peaks are generated from arsenic-contaminated and pure areas of the layers, respectively.

  12. Measurement and Management of the Level of Quality Control Process in SoC (System on Chip Embedded Software Development

    Directory of Open Access Journals (Sweden)

    Ki-Won Song

    2012-04-01

    This paper presents a process for measuring the level of the quality control process, so as to ensure the quality of the delivered software package during the development cycle. A successful project must balance three mutually constraining pre-requisites, of which quality is the most important: quality should not be sacrificed for the sake of meeting the cost budget or delivering within schedule. Moreover, the cost caused by any quality issue, such as defect resolution, increases exponentially once the product is out of the door. That said, the schedule constraint must also be respected, since the product has to reach the market earlier than competitors' products. The quality measurement and management concept is therefore introduced to suit the agile software development environment, in conjunction with performance strategies for execution within the organization. There are many key performance indexes derivable from the actual data associated with quality control activities, and it is desirable to create a quality process that integrally represents the overall level of quality control activities performed while developing the software deliverables. With the quality process, it is possible to evaluate whether enough quality control activities have been performed for the project and to secure the quality of the software deliverables before they are delivered to the customers.

  13. Design and implementation of a software defined HiperLAN/2 physical layer model for simulation purposes

    NARCIS (Netherlands)

    van Hoesel, L.F.W.

    2002-01-01

    In this Master of Science thesis a simulation model of the HiperLAN/2 physical layer is designed and implemented. The model should provide insight into the demodulation functions that are necessary in HiperLAN/2, and it should be useful for determining channel selection and computational requirements.

  14. Study on electrical defects level in single layer two-dimensional Ta2O5

    Science.gov (United States)

    Dahai, Li; Xiongfei, Song; Linfeng, Hu; Ziyi, Wang; Rongjun, Zhang; Liangyao, Chen; David, Wei Zhang; Peng, Zhou

    2016-04-01

    Two-dimensional atomic-layered materials are a recent research focus, and single-layer Ta2O5 used as a gate dielectric in field-effect transistors is obtained via assembly of Ta2O5 nanosheets. However, the electrical performance is seriously affected by electronic defects existing in Ta2O5. Therefore, spectroscopic ellipsometry is used to calculate the transition energies and corresponding probabilities for two differently charged oxygen vacancies, whose existence is revealed by X-ray photoelectron spectroscopy analysis. Spectroscopic ellipsometry fitting also yields the thickness of the single-layer Ta2O5, in good agreement with atomic force microscopy measurements. Nondestructive and noncontact spectroscopic ellipsometry is thus appropriate for detecting the electrical defect levels of single-layer Ta2O5. Project supported by the National Natural Science Foundation of China (Grant Nos. 11174058 and 61376093), the Fund from Shanghai Municipal Science and Technology Commission (Grant No. 13QA1400400), the National Science and Technology Major Project, China (Grant No. 2011ZX02707), and the Innovation Program of Shanghai Municipal Education Commission (Grant No. 12ZZ010).

  15. A Prediction Packetizing Scheme for Reducing Channel Traffic in Transaction-Level Hardware/Software Co-Emulation

    OpenAIRE

    Lee , Jae-Gon; Chung , Moo-Kyoung; Ahn , Ki-Yong; Lee , Sang-Heon; Kyung , Chong-Min

    2005-01-01

    This paper presents a scheme for efficient channel usage between simulator and accelerator, where the accelerator models some RTL sub-blocks in accelerator-based hardware/software co-simulation while the simulator runs a transaction-level model of the remaining part of the whole chip being verified. With a conventional simulation accelerator, evaluations of simulator and accelerator alternate at every valid simulation ...

  16. Interaction of Peat Soil and Sulphidic Material Substratum: Role of Peat Layer and Groundwater Level Fluctuations on Phosphorus Concentration

    Directory of Open Access Journals (Sweden)

    Benito Heru Purwanto

    2014-09-01

    Phosphorus (P) often becomes a limiting factor for plant growth. Phosphorus geochemistry in peatland soil is associated with the presence of a peat layer and groundwater level fluctuations. The research was conducted to study the role of the peat layer and groundwater level fluctuations on P concentration in peatland. The research was conducted on deep, moderate and shallow peat with sulphidic material as substratum, peaty acid sulphate soil, and potential acid sulphate soil, while P concentration was observed in the wet season, in the transition from wet to dry season, and in the dry season. Soil samples were collected using a peat borer according to interlayer and soil horizon. The results showed that the peat layer may act as the main source of P in peatland with sulphidic material substratum. The upper peat layer on sulphidic material, affected by groundwater level fluctuations, had no direct effect on P concentration in the peat layers. The increase in P concentration in the lowest sulphidic layer may relate to redox reactions of iron in the sulphidic layer and precipitation processes. Phosphorus concentration in peatland with sulphidic material as substratum was not influenced by peat thickness. However, depletion or disappearance of the peat layer decreased P concentration in soil solution. Disappearance of the peat layer means loss of a natural source of P for peatland with sulphidic material as substratum; therefore the peat layer must be kept in order to maintain peatlands.

  17. Applications of artificial intelligence to space station and automated software techniques: High level robot command language

    Science.gov (United States)

    Mckee, James W.

    1989-01-01

    The objective is to develop a system that will allow a person not necessarily skilled in the art of programming robots to quickly and naturally create the necessary data and commands to enable a robot to perform a desired task. The system will use a menu-driven graphical user interface. This interface will allow the user to input data to select objects to be moved. There will be an embedded expert system to process the knowledge about objects and the robot to determine how they are to be moved. There will be automatic path planning to avoid obstacles in the work space and to create a near-optimal path. The system will contain the software to generate the required robot instructions.

  18. CAPD Software Development for Automatic Piping System Design: Checking Piping Pocket, Checking Valve Level and Flexibility

    International Nuclear Information System (INIS)

    Ari Satmoko; Edi Karyanta; Dedy Haryanto; Abdul Hafid; Sudarno; Kussigit Santosa; Pinitoyo, A.; Demon Handoyo

    2003-01-01

    One of several steps in industrial plant construction is preparing the piping layout drawing. In this drawing, pipes and all other items such as instrumentation, equipment and structures should be modeled. A software package called CAPD was developed to replace, and to behave as, a piping drafter or designer. CAPD was successfully extended by adding the subprograms CHKUPIPE and CHKMANV. The first subprogram checks for, and gives a warning if there is, a piping pocket in the piping system. The second identifies the valve position and then checks whether the valve can be handled by an operator's hand. The main program CAPD was also successfully modified to be capable of limiting the maximum length of straight pipe. By limiting the length, piping flexibility can be increased. (author)

  19. Architecture-Level Exploration of Alternative Interconnection Schemes Targeting 3D FPGAs: A Software-Supported Methodology

    Directory of Open Access Journals (Sweden)

    Kostas Siozios

    2008-01-01

    In current reconfigurable architectures, the interconnection structures increasingly contribute more to the delay and power consumption. The demand for increased clock frequencies and logic density (smaller area footprint) makes the problem even more important. Three-dimensional (3D) architectures are able to alleviate this problem by accommodating a number of functional layers, each of which might be fabricated in a different technology. However, the benefits of such integration technology have not been sufficiently explored yet. In this paper, we propose a software-supported methodology for exploring and evaluating alternative interconnection schemes for 3D FPGAs. In order to support the proposed methodology, three new CAD tools were developed (part of the 3D MEANDER Design Framework). During our exploration, we study the impact of the vertical interconnection between functional layers on a number of design parameters. More specifically, the average gains in operation frequency, power consumption, and wirelength are 35%, 32%, and 13%, respectively, compared to existing 2D FPGAs with identical logic resources. Also, we achieve an 8% higher utilization ratio for the vertical interconnections compared to existing approaches for designing 3D FPGAs, leading to cheaper and more reliable devices.

  20. The effects of exercise reminder software program on office workers' perceived pain level, work performance and quality of life.

    Science.gov (United States)

    Irmak, A; Bumin, G; Irmak, R

    2012-01-01

    In direct proportion to current technological developments, computer usage in the workplace has increased, while the need for an office worker to leave the desk in order to photocopy a document or send or receive an e-mail has decreased. Therefore, office workers stay in the same postures accompanied by long periods of keyboard usage. In recent years, with the intent to reduce the incidence of work-related musculoskeletal disorders, several exercise reminder software programs have been developed. The purpose of this study is to evaluate the effectiveness of an exercise reminder software program on office workers' perceived pain level, work performance and quality of life. 39 healthy office workers agreed to attend the study. Participants were randomly split into two groups, a control group (n = 19) and an intervention group (n = 20). A Visual Analogue Scale (VAS) to evaluate perceived pain was administered to all participants at the beginning and at the end of the study. The intervention group used the program for 10 weeks. Findings showed that the control group VAS scores remained the same, but the intervention group VAS scores decreased in a statistically significant way. Exercise reminder software programs may help to reduce perceived pain among office workers. Further long-term studies with more subjects are needed to describe the effects of these programs and the mechanism underlying these effects.

  1. Top-Level Software for VVER-1000 In-core Monitoring System under Implementation of Expanded Nuclear Fuel Diversification Program in Ukraine

    International Nuclear Information System (INIS)

    Khalimonchuk, V.A.

    2015-01-01

    The paper considers the possibility and expediency of developing mathematical software for the VVER-1000 ICMS in Ukraine. This mathematical software is among the most important conditions for implementation of the expanded nuclear fuel diversification program. The top-level software is to be developed based on SSTC's own studies in the development of codes for power distribution recovery, which were successfully used previously for RBMK-1000 safety analysis

  2. Low-level RF LabVIEW reg-sign control software user's manual: Version 1.0

    International Nuclear Information System (INIS)

    1992-06-01

    This document details information on the low-level radio frequency (LLRF) software control package. The chapters in this manual cover the following topics: Chapter one describes the general operating principles of the LabVIEW software package, and also discusses the high-level menu panels which allow access to the individual control panels. Chapter two covers the control panels used for conditioning the cavity, and for controlling the accelerator under normal operating conditions. Chapter three provides information on the resonance detection and reflectometer calibration functions, including the setup and status panels for each. Chapter four contains instructions on the use of those panels dedicated to controlling the cavity RF field. Chapter five discusses the control panels that provide setup and status information on the diagnostic monitor subsystem. Chapter six outlines those panels used to control the timing functions provided by the LLRF system. Finally, Chapter seven describes the control panels used to monitor and adjust the alarm and limit functions of the system. Throughout the document, it is assumed that the reader has a general working knowledge of accelerators, high-power amplifier equipment, and low-level RF (LLRF) control systems. References are listed as footnotes as they occur in the text.

  3. Summertime observations of elevated levels of ultrafine particles in the high Arctic marine boundary layer

    Science.gov (United States)

    Burkart, Julia; Willis, Megan D.; Bozem, Heiko; Thomas, Jennie L.; Law, Kathy; Hoor, Peter; Aliabadi, Amir A.; Köllner, Franziska; Schneider, Johannes; Herber, Andreas; Abbatt, Jonathan P. D.; Leaitch, W. Richard

    2017-05-01

    Motivated by increasing levels of open ocean in the Arctic summer and the lack of prior altitude-resolved studies, extensive aerosol measurements were made during 11 flights of the NETCARE July 2014 airborne campaign from Resolute Bay, Nunavut. Flights included vertical profiles (60 to 3000 m above ground level) over open ocean, fast ice, and boundary layer clouds and fogs. A general conclusion, from observations of particle numbers between 5 and 20 nm in diameter (N5-20), is that ultrafine particle formation occurs readily in the Canadian high Arctic marine boundary layer, especially just above ocean and clouds, reaching values of a few thousand particles cm⁻³. By contrast, ultrafine particle concentrations are much lower in the free troposphere. Elevated levels of larger particles (for example, from 20 to 40 nm in size, N20-40) are sometimes associated with high N5-20, especially over low clouds, suggestive of aerosol growth. The number densities of particles greater than 40 nm in diameter (N>40) are relatively depleted at the lowest altitudes, indicative of depositional processes that lower the condensation sink and promote new particle formation. The numbers of cloud condensation nuclei (CCN; measured at 0.6% supersaturation) are positively correlated with the numbers of small particles (down to roughly 30 nm), indicating that some fraction of these newly formed particles is capable of being involved in cloud activation. Given that the summertime marine Arctic is a biologically active region, it is important to better establish the links between emissions from the ocean and the formation and growth of ultrafine particles within this rapidly changing environment.

  4. Alleviation of fermi-level pinning effect at metal/germanium interface by the insertion of graphene layers

    International Nuclear Information System (INIS)

    Baek, Seung-heon Chris; Seo, Yu-Jin; Oh, Joong Gun; Albert Park, Min Gyu; Bong, Jae Hoon; Yoon, Seong Jun; Lee, Seok-Hee; Seo, Minsu; Park, Seung-young; Park, Byong-Guk

    2014-01-01

    In this paper, we report the alleviation of Fermi-level pinning at the metal/n-germanium (Ge) contact by the insertion of multiple layers of single-layer graphene (SLG) at the metal/n-Ge interface. A decrease in the Schottky barrier height with an increase in the number of inserted SLG layers was observed, which supports the contention that Fermi-level pinning at the metal/n-Ge contact originates from the metal-induced gap states at the metal/n-Ge interface. The modulation of the Schottky barrier height by varying the number of inserted SLG layers (m) could enable the use of Ge as a next-generation complementary metal-oxide-semiconductor material. Furthermore, the inserted SLG layers can be used as a tunnel barrier for spin injection into the Ge substrate for spin-based transistors.

  5. Magneto-transport in the zero-energy Landau level of single-layer and bilayer graphene

    International Nuclear Information System (INIS)

    Zeitler, U; Giesbers, A J M; Elferen, H J van; Kurganova, E V; McCollam, A; Maan, J C

    2011-01-01

    We present recent low-temperature magnetotransport experiments on single-layer and bilayer graphene in high magnetic fields up to 33 T. In single-layer graphene the fourfold degeneracy of the zero-energy Landau level is lifted by a gap opening at filling factor ν = 0. In bilayer graphene, we observe a partial lifting of the degeneracy of the eightfold-degenerate zero-energy Landau level.

  6. Calculations of the electronic levels, spin-Hamiltonian parameters and vibrational spectra for the CrCl3 layered crystals

    Energy Technology Data Exchange (ETDEWEB)

    Avram, C.N. [Faculty of Physics, West University of Timisoara, Bd. V. Parvan No. 4, 300223 Timisoara (Romania); Gruia, A.S., E-mail: adigruia@yahoo.com [Faculty of Physics, West University of Timisoara, Bd. V. Parvan No. 4, 300223 Timisoara (Romania); Brik, M.G. [College of Sciences, Chongqing University of Posts and Telecommunications, Chongqing 400065 (China); Institute of Physics, University of Tartu, Ravila 14C, Tartu 50411 (Estonia); Institute of Physics, Jan Dlugosz University, Armii Krajowej 13/15, PL-42200 Czestochowa (Poland); Institute of Physics, Polish Academy of Sciences, Al. Lotników 32/46, 02-668 Warsaw (Poland); Barb, A.M. [Faculty of Physics, West University of Timisoara, Bd. V. Parvan No. 4, 300223 Timisoara (Romania)

    2015-12-01

    Calculations of the Cr3+ energy levels, spin-Hamiltonian parameters and vibrational spectra for layered CrCl3 crystals are reported for the first time. The crystal field parameters and the energy level scheme were calculated in the framework of the Exchange Charge Model of crystal field. The spin-Hamiltonian parameters (zero-field splitting parameter D and g-factors) for the Cr3+ ion in CrCl3 crystals were obtained using two independent techniques: (i) semi-empirical crystal field theory and (ii) a density functional theory (DFT)-based model. In the first approach, the spin-Hamiltonian parameters were calculated with the perturbation theory method and the complete diagonalization (of the energy matrix) method. The infrared (IR) and Raman frequencies were calculated for both the experimental and the fully optimized geometry of the crystal structure, using the CRYSTAL09 software. The obtained results are discussed and compared with the available experimental data.

  7. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    … COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS … [2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the …

  8. Modeling of DNA and Protein Organization Levels with Cn3D Software

    Science.gov (United States)

    Stasinakis, Panagiotis K.; Nicolaou, Despoina

    2017-01-01

    The molecular structure of living organisms and the complex interactions among their components are the basis for the diversity observed at the macroscopic level. Proteins and nucleic acids are some of the major molecular components and play a key role in several biological functions, such as those of development and evolution. This article…

  9. Analysis of a Floodplain I-Wall Embedded in Horizontally Stratified Soil Layers During Flood Events Using Corps I-Wall Software Version 1.0

    Science.gov (United States)

    2016-07-01

    100, 300, 500 and 1,000 simulations. [Figure A1: Cantilever retaining wall; (a) two-layered soil site; (b) …] … of flood elevation. In a safety or risk assessment of I-Walls, the rotational limit state or probability of rotational failure of the I-Wall about a … for the net loading is computed about the lower of the RHS or LHS ground surfaces for level ground, for a retaining wall design with differential …

  10. The use of generalised audit software by internal audit functions in a developing country: A maturity level assessment

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2017-12-01

    This article explores the existing practices of internal audit functions in the locally controlled South African banking industry regarding the use of Generalised Audit Software (GAS), against a benchmark developed from recognised data analytic maturity models, in order to assess the current maturity levels of the locally controlled South African banks in the use of this software for tests of controls. The literature review indicates that the use of GAS by internal audit functions is still at a relatively low level of maturity, despite the accelerating adoption of information technology and the generation of big data within organisations. The empirical results of this article also confirm that the maturity of the use of GAS by the internal auditors employed by locally controlled South African banks is still lower than expected, given that the world, especially from a business perspective, is now fully immersed in a technology-driven business environment. This study has since been extended to other industries in the following countries: Canada, Colombia, Portugal and Australia.

  11. Development of High Level Trigger Software for Belle II at SuperKEKB

    International Nuclear Information System (INIS)

    Lee, S; Itoh, R; Katayama, N; Mineo, S

    2011-01-01

    The Belle collaboration has been trying for 10 years to reveal the mystery of the current matter-dominated universe. However, much more statistics is required to search for New Physics through quantum loops in decays of B mesons. In order to increase the experimental sensitivity, the next-generation B-factory, SuperKEKB, is planned. The design luminosity of SuperKEKB is 8 × 10³⁵ cm⁻² s⁻¹, a factor of 40 above KEKB's peak luminosity. At this high luminosity, the level 1 trigger of the Belle II experiment will stream events of 300 kB size at a 30 kHz rate. To reduce the data flow to a manageable level, a high-level trigger (HLT) is needed, which will be implemented using the full offline reconstruction on a large-scale PC farm. There, physics-level event selection is performed, reducing the event rate by a factor of ∼10 to a few kHz. To execute the reconstruction, the HLT uses the offline event processing framework basf2, which has parallel processing capabilities used for multi-core processing and PC clusters. The event data handling in the HLT is fully object-oriented, utilizing ROOT I/O with a new method of object passing over UNIX socket connections. Also under consideration is the use of the HLT output to reduce the pixel detector event size by only saving hits associated with a track, resulting in an additional data reduction of ∼100 for the pixel detector. In this contribution, the design and implementation of the Belle II HLT are presented together with a report of preliminary testing results.

  12. Development of free statistical software enabling researchers to calculate confidence levels, clinical significance curves and risk-benefit contours

    International Nuclear Information System (INIS)

    Shakespeare, T.P.; Mukherjee, R.K.; Gebski, V.J.

    2003-01-01

    Confidence levels, clinical significance curves, and risk-benefit contours are tools that improve the analysis of clinical studies and minimize misinterpretation of published results; however, no software has been available for their calculation. The objective was to develop software to help clinicians utilize these tools. Excel 2000 spreadsheets were designed using only built-in functions, without macros. The workbook was protected and encrypted so that users can modify only input cells. The workbook has 4 spreadsheets for use in studies comparing two patient groups. Sheet 1 comprises instructions and graphic examples for use. Sheet 2 allows the user to input the main study results (e.g. survival rates) into a 2-by-2 table. Confidence intervals (95%), p-value and the confidence level for Treatment A being better than Treatment B are automatically generated. An additional input cell allows the user to determine the confidence associated with a specified level of benefit. For example, if the user wishes to know the confidence that Treatment A is at least 10% better than B, 10% is entered. Sheet 2 automatically displays clinical significance curves, graphically illustrating confidence levels for all possible benefits of one treatment over the other. Sheet 3 allows input of toxicity data, and calculates the confidence that one treatment is more toxic than the other. It also determines the confidence that the relative toxicity of the most effective arm does not exceed user-defined tolerability. Sheet 4 automatically calculates risk-benefit contours, displaying the confidence associated with a specified scenario of minimum benefit and maximum risk of one treatment arm over the other. The spreadsheet is freely downloadable at www.ontumor.com/professional/statistics.htm. A simple, self-explanatory, freely available spreadsheet calculator was developed using Excel 2000. The incorporated decision-making tools can be used for data analysis and improve the reporting of results of any …
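    The confidence-level calculation such sheets automate can be sketched as follows; this is a generic normal-approximation version under stated assumptions, not the workbook's exact formulas. Sweeping `margin` from 0 upward traces out a clinical significance curve.

      from math import sqrt, erf

      def confidence_a_better(success_a, n_a, success_b, n_b, margin=0.0):
          """Confidence (0-1) that rate A exceeds rate B by at least
          `margin`, via a normal approximation to the difference of two
          proportions."""
          pa, pb = success_a / n_a, success_b / n_b
          se = sqrt(pa * (1 - pa) / n_a + pb * (1 - pb) / n_b)
          z = (pa - pb - margin) / se
          return 0.5 * (1 + erf(z / sqrt(2)))   # standard normal CDF

      # Example: 60/100 vs 45/100 survival; confidence that Treatment A
      # is at least 10% better than Treatment B.
      print(round(confidence_a_better(60, 100, 45, 100, margin=0.10), 3))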

  13. Toward General Software Level Silent Data Corruption Detection for Parallel Applications

    Energy Technology Data Exchange (ETDEWEB)

    Berrocal, Eduardo; Bautista-Gomez, Leonardo; Di, Sheng; Lan, Zhiling; Cappello, Franck

    2017-12-01

    Silent data corruption (SDC) poses a great challenge for high-performance computing (HPC) applications as we move to extreme-scale systems. Mechanisms have been proposed that are able to detect SDC in HPC applications by using the peculiarities of the data (more specifically, its “smoothness” in time and space) to make predictions. However, these data-analytic solutions are still far from fully protecting applications to a level comparable with more expensive solutions such as full replication. In this work, we propose partial replication to overcome this limitation. More specifically, we have observed that not all processes of an MPI application experience the same level of data variability at exactly the same time. Thus, we can smartly choose and replicate only those processes for which the lightweight data-analytic detectors would perform poorly. In addition, we propose a new evaluation method based on the probability that a corruption will pass unnoticed by a particular detector (instead of just reporting overall single-bit precision and recall). In our experiments, we use four applications dealing with different explosions. Our results indicate that our new approach can protect the MPI applications analyzed with 7–70% less overhead (depending on the application) than that of full duplication with similar detection recall.
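    A minimal sketch of the two ingredients, assuming invented thresholds and interfaces: a smoothness-based point detector, and a selector that spends a replication budget on the most variable MPI ranks, where such detectors perform worst.

      import numpy as np

      def looks_corrupted(history, new_value, k=4.0):
          """Flag a value that breaks the series' recent smoothness: predict
          by linear extrapolation and test against k times the spread of
          recent one-step changes (k is an illustrative threshold)."""
          pred = 2.0 * history[-1] - history[-2]
          spread = np.abs(np.diff(history)).std() + 1e-12
          return abs(new_value - pred) > k * spread

      def ranks_to_replicate(variability, budget):
          """Duplicate only the `budget` ranks with the most variable data."""
          order = np.argsort(np.asarray(variability))[::-1]
          return set(order[:budget].tolist())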

  14. Accounting utility for determining individual usage of production level software systems

    Science.gov (United States)

    Garber, S. C.

    1984-01-01

    An accounting package was developed which determines the computer resources utilized by a user during the execution of a particular program and updates a file containing accumulated resource totals. The accounting package is divided into two separate programs. The first program determines the total amount of computer resources utilized by a user during the execution of a particular program. The second program uses these totals to update a file containing accumulated totals of computer resources utilized by a user for a particular program. This package is useful to those persons who have several other users continually accessing and running programs from their accounts. The package provides the ability to determine which users are accessing and running specified programs along with their total level of usage.
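    In outline, the second program is an accumulator keyed by user and program; the sketch below assumes a JSON totals file and Unix resource accounting, both hypothetical stand-ins for the original implementation.

      import json
      import resource            # Unix-only resource accounting
      from pathlib import Path

      TOTALS = Path("usage_totals.json")   # hypothetical accumulator file

      def record_usage(user, program):
          """Fold one run's CPU usage into the accumulated totals."""
          ru = resource.getrusage(resource.RUSAGE_CHILDREN)  # wrapper's children
          totals = json.loads(TOTALS.read_text()) if TOTALS.exists() else {}
          entry = totals.setdefault(f"{user}:{program}", {"runs": 0, "cpu_s": 0.0})
          entry["runs"] += 1
          entry["cpu_s"] += ru.ru_utime + ru.ru_stime
          TOTALS.write_text(json.dumps(totals, indent=2))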

  15. EUMENES, a computer software for managing the radiation safety program information at an institutional level

    International Nuclear Information System (INIS)

    Hernandez Saiz, Alejandro; Cornejo Diaz, Nestor; Valdes Ramos, Maryzury; Martinez Gonzalez, Alina; Gonzalez Rodriguez, Niurka; Vergara Gil, Alex

    2008-01-01

    The correct application of national and international regulations in the field of radiological safety requires the implementation of Radiation Safety Programs appropriate to the practice being developed. These programs demand the preparation and keeping of a large number of records and data, compliance with working schedules, systematic quality controls, audits, delivery of information to the Regulatory Authority, the execution of radiological assessments, etc. Therefore, the necessity and importance of having a computer tool to support the management of the information related to the Radiation Safety Program in any institution is unquestionable. The present work describes a computer program that allows the efficient management of these data. Its design was based on the IAEA International Basic Safety Standards recommendations and on the requirements of the Cuban national standards, with the objective of being flexible enough to be applied in most of the institutions using ionizing radiations. The most important records of Radiation Safety Programs were incorporated, and reports can be generated by the users. An additional tools module allows the user to access a radionuclide data library and to carry out different calculations of interest in radiological protection. The program has been developed in Borland Delphi and manages Microsoft Access databases. It is a user-friendly code that aims to support the optimization of Radiation Safety Programs. The program contributes to saving resources and time, as the generated information is kept and transmitted electronically. The code has different security access levels according to the user's responsibility at the institution, and also provides for the analysis of the introduced data in a quick and efficient way, as well as for flagging deadlines, exceeded reference levels and situations that require attention. (author)

  16. Teaching Joint-Level Robot Programming with a New Robotics Software Tool

    Directory of Open Access Journals (Sweden)

    Fernando Gonzalez

    2017-12-01

    With the rising popularity of robotics in our modern world there is an increase in the number of engineering programs that offer the basic Introduction to Robotics course. This common introductory robotics course generally covers the fundamental theory of robotics, including robot kinematics, dynamics, differential movements, trajectory planning and basic computer vision algorithms commonly used in the field of robotics. Joint programming, the task of writing a program that directly controls the robot's joint motors, is an activity that involves robot kinematics, dynamics and trajectory planning. In this paper, we introduce a new educational robotics tool developed for teaching joint programming. The tool allows the student to write a program in a modified C language that controls the movement of the arm by controlling the velocity of each joint motor. This is a very important activity in the robotics course and leads the student to gain knowledge of how to build a robotic arm controller. Sample assignments are presented for different levels of difficulty.
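    The flavor of such a joint-level assignment can be sketched as below (in Python rather than the tool's modified C; the motor interface functions are placeholders): command a joint velocity proportional to the remaining angle error until the target is reached.

      import time

      def move_joint(set_velocity, read_angle, target, v_max=0.5, kp=2.0, tol=0.01):
          """Drive one joint to `target` (rad) by commanding joint velocity.
          `set_velocity` and `read_angle` stand in for the motor interface."""
          while True:
              error = target - read_angle()
              if abs(error) < tol:
                  set_velocity(0.0)            # stop at the target angle
                  return
              v = max(-v_max, min(v_max, kp * error))  # clamped P-control
              set_velocity(v)
              time.sleep(0.01)                 # ~100 Hz control loop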

  17. Level-set dynamics and mixing efficiency of passive and active scalars in DNS and LES of turbulent mixing layers

    NARCIS (Netherlands)

    Geurts, Bernard J.; Vreman, Bert; Kuerten, Hans; Luo, Kai H.

    2001-01-01

    The mixing efficiency in a turbulent mixing layer is quantified by monitoring the surface-area of level-sets of scalar fields. The Laplace transform is applied to numerically calculate integrals over arbitrary level-sets. The analysis includes both direct and large-eddy simulation and is used to …

  18. Some problems of software development for the plant-level automated control system of NPPs with the RBMK reactors

    International Nuclear Information System (INIS)

    Gorbunov, V.P.; Egorov, A.K.; Isaev, N.V.; Saprykin, E.M.

    1987-01-01

    Problems in the development and operation of automated control system (ACS) software for NPPs with RBMK reactors are discussed. An ES computer with large on-line storage (not less than 1 Mbyte) and fast response (not less than 300,000 operations per second) should be part of the ACS. Several program complexes are used in the NPP ACS. The programs collected in the EhNERGIYa library are used to provide central control system operation. The information-retrieval system called the Fuel File is used to automate NPP fuel motion accounting, as well as to estimate the efficiency of fuel utilization and to calculate the fuel component of electric and heat energy production cost. The automated information system for unit operation efficiency analysis, which solves both plant- and unit-level problems, including engineering and economical factors and the compilation of an operating parameter bank, is under trial operation

  19. Effect of Software Designed by Computer Conceptual Map Method in Mobile Environment on Learning Level of Nursing Students

    Directory of Open Access Journals (Sweden)

    Salmani N

    2015-12-01

    Aims: In order to keep pace with progress, nursing training has to utilize new training methods, such that the teaching methods used by nursing instructors enhance significant learning by preventing superficial learning in the students. The conceptual map method is one of the new training strategies playing an important role in this field. The aim of this study was to investigate the effectiveness of software designed by the computer conceptual map method in a mobile environment on the learning level of nursing students. Materials & Methods: In this semi-experimental study with a pretest-posttest plan, 60 students in the 5th semester were studied during the 1st semester of 2015-16. The experimental group (n=30) from Meibod Nursing Faculty and the control group (n=30) from Yazd Shahid Sadoughi Nursing Faculty were trained during the first 4 weeks of the semester, using the computer conceptual map method and the computer conceptual map method in a mobile phone environment, respectively. Data were collected using a researcher-made academic progress test covering "knowledge" and "significant learning". Data were analyzed in SPSS 21 software using Independent T, Paired T, and Fisher tests. Findings: There were significant increases in the mean scores of knowledge and significant learning in both groups before and after the intervention (p<0.05). Nevertheless, the change in the scores of the significant learning level between the groups was statistically significant (p<0.05). Conclusion: Presenting the course content as a conceptual map in a mobile phone environment positively affects the significant learning of nursing students.

  20. Energy level and thickness control on PEDOT:PSS layer for efficient planar heterojunction perovskite cells

    Science.gov (United States)

    Wang, Chunhua; Zhang, Chujun; Tong, Sichao; Xia, Huayan; Wang, Lijuan; Xie, Haipeng; Gao, Yongli; Yang, Junliang

    2018-01-01

    Efficient planar heterojunction perovskite solar cells (PHJ-PSCs) with an architecture of ITO/PEDOT:PSS/CH3NH3PbI3/PCBM/Al were fabricated by controlling the energy level and thickness of the PEDOT:PSS layer, where the PEDOT:PSS precursor was diluted with deionized water (H2O) or isopropyl alcohol (IPA), i.e. W-PEDOT:PSS and I-PEDOT:PSS. The performance parameters of the PHJ-PSCs showed a marked enhancement after employing W-PEDOT:PSS or I-PEDOT:PSS instead of pristine PEDOT:PSS (P-PEDOT:PSS), increasing the power conversion efficiency (PCE) of W-PEDOT:PSS-based PHJ-PSCs to 15.60% from 11.95% for P-PEDOT:PSS-based PHJ-PSCs. The performance improvement results from two aspects. On the one hand, compared to P-PEDOT:PSS, the highest occupied molecular orbital (HOMO) energy level of the diluted PEDOT:PSS showed an impressive decrease and can well match the valence band of the CH3NH3PbI3 film, resulting in less energy loss and a significant improvement in the open-circuit voltage (Voc). On the other hand, the diluted PEDOT:PSS produced a thinner film than P-PEDOT:PSS, which also played an important role in the performance of the PHJ-PSCs. Furthermore, the electrochemical impedance spectroscopy (EIS) results indicated that the interface between perovskite and PEDOT:PSS was greatly improved by employing W-PEDOT:PSS or I-PEDOT:PSS, leading to an obvious decrease in the series resistance (Rs) and an increase in the recombination resistance (Rrec). The research demonstrated that diluting PEDOT:PSS with a common solvent, such as H2O or IPA, is a feasible low-temperature way of achieving efficient PHJ-PSCs.

  1. Centimeter-Level Positioning Using an Efficient New Baseband Mixing and Despreading Method for Software GNSS Receivers

    Directory of Open Access Journals (Sweden)

    G. Lachapelle

    2007-10-01

    This paper presents an efficient new method for performing the baseband mixing and despreading operations in a software-based GNSS receiver, and demonstrates that the method is capable of providing measurements for centimeter-level positioning accuracy. The method uses a single frequency carrier replica for the baseband mixing process, enabling all satellites to perform mixing simultaneously and yielding considerable computational savings. To compensate for the signal-to-noise ratio (SNR) losses caused by using a single frequency carrier replica, the integration interval after despreading is divided into subintervals, and the output from each subinterval is then compensated for the known frequency error. Using this approach, receiver processing times are shown to be reduced by approximately 21% relative to the next fastest method when tracking seven satellites. The paper shows the mathematical derivation of the new algorithm, discusses practical considerations, and demonstrates its performance using simulations and real data. Results show that the new method is able to generate pseudorange and carrier phase measurements with the same accuracy as traditional methods. Stand-alone positioning accuracy is at the meter level, while differential processing can produce fixed-ambiguity carrier phase positions accurate to the centimeter level.
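    The core trick can be sketched as follows: despread each subinterval separately, then rotate each partial sum by the phase the known residual frequency has accumulated at that subinterval's midpoint before summing. Parameter names and defaults are illustrative, not the paper's implementation.

      import numpy as np

      def despread_compensated(signal, code, f_resid, fs, n_sub=10):
          """Correlate `signal` (already mixed with one common carrier
          replica) against `code`, undoing the residual frequency `f_resid`
          (Hz) subinterval by subinterval to recover the SNR loss."""
          seg = len(signal) // n_sub
          total = 0j
          for m in range(n_sub):
              part = np.sum(signal[m * seg:(m + 1) * seg] *
                            code[m * seg:(m + 1) * seg])     # despread
              t_mid = (m + 0.5) * seg / fs                   # midpoint (s)
              total += part * np.exp(-2j * np.pi * f_resid * t_mid)
          return total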

  2. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of changes in requirements at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  3. Software requirements management based on use cases

    International Nuclear Information System (INIS)

    Xiao Jin

    2009-01-01

    In this paper, requirements management based on use cases is theoretically explored, and a multi-layer use-case model is introduced, which combines three levels of use cases with a single use-case refinement model. Through practice in a software project, the multi-layer use-case model provided a good solution for controlling the requirements scope and change, and balanced the work assignment between customer departments, information management departments and the software development outsourcing team. (authors)

  4. A software methodology for compiling quantum programs

    Science.gov (United States)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.
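    A toy pass pipeline illustrates the layering (the gate names and the Toffoli decomposition into the Clifford+T set are standard, but nothing here corresponds to the authors' actual code):

      # Each pass rewrites a circuit (a list of gate tuples) one level down.

      def decompose_toffoli(circuit):
          out = []
          for gate in circuit:
              if gate[0] == "TOFFOLI":         # high-level op -> Clifford+T
                  a, b, c = gate[1:]
                  out += [("H", c), ("CNOT", b, c), ("TDG", c), ("CNOT", a, c),
                          ("T", c), ("CNOT", b, c), ("TDG", c), ("CNOT", a, c),
                          ("T", b), ("T", c), ("H", c), ("CNOT", a, b),
                          ("T", a), ("TDG", b), ("CNOT", a, b)]
              else:
                  out.append(gate)
          return out

      def compile_circuit(circuit, passes):
          for p in passes:                     # run the layer stack in order
              circuit = p(circuit)
          return circuit

      print(len(compile_circuit([("TOFFOLI", 0, 1, 2)], [decompose_toffoli])))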

  5. Software Reviews.

    Science.gov (United States)

    Kinnaman, Daniel E.; And Others

    1988-01-01

    Reviews four educational software packages for Apple, IBM, and Tandy computers. Includes "How the West was One + Three x Four,""Mavis Beacon Teaches Typing,""Math and Me," and "Write On." Reviews list hardware requirements, emphasis, levels, publisher, purchase agreements, and price. Discusses the strengths…

  6. Improving Chemistry Education by Offering Salient Technology Training to Preservice Teachers: A Graduate-Level Course on Using Software to Teach Chemistry

    Science.gov (United States)

    Tofan, Daniel C.

    2009-01-01

    This paper describes an upper-level undergraduate and graduate-level course on computers in chemical education that was developed and offered for the first time in Fall 2007. The course provides future chemistry teachers with exposure to current software tools that can improve productivity in teaching, curriculum development, and education…

  7. Evaluation of a Game to Teach Requirements Collection and Analysis in Software Engineering at Tertiary Education Level

    Science.gov (United States)

    Hainey, Thomas; Connolly, Thomas M.; Stansfield, Mark; Boyle, Elizabeth A.

    2011-01-01

    A highly important part of software engineering education is requirements collection and analysis which is one of the initial stages of the Database Application Lifecycle and arguably the most important stage of the Software Development Lifecycle. No other conceptual work is as difficult to rectify at a later stage or as damaging to the overall…

  8. An Accelerator control middle layer using Matlab

    CERN Document Server

    Portmann, G J; Terebilo, Andrei

    2005-01-01

    Matlab is a matrix manipulation language originally developed to be a convenient language for using the LINPACK and EISPACK libraries. What makes Matlab so appealing for accelerator physics is the combination of a matrix-oriented programming language, an active workspace for system variables, powerful graphics capability, built-in math libraries, and platform independence. A number of software toolboxes for accelerators have been written in Matlab – the Accelerator Toolbox (AT) for machine simulations, LOCO for accelerator calibration, the Matlab Channel Access Toolbox (MCA) for EPICS connections, and the Middle Layer. This paper will describe the MiddleLayer software toolbox that resides between the high-level control applications and the low-level accelerator control system. This software was a collaborative effort between ALS and Spear but was written to port easily. Five accelerators presently use this software – Spear, ALS, CLS, and the X-ray and VUV rings at Brookhaven. The Middle Layer fu...

  9. Architecture design of the application software for the low-level RF control system of the free-electron laser at Hamburg

    International Nuclear Information System (INIS)

    Geng, Z.; Ayvazyan, V.; Simrock, S.

    2012-01-01

    The superconducting linear accelerator of the Free-Electron Laser at Hamburg (FLASH) provides high-performance electron beams to the lasing system to generate synchrotron radiation for various users. The Low-Level RF (LLRF) system is used to maintain beam stability by stabilizing the RF field in the superconducting cavities with feedback and feedforward algorithms. The LLRF applications are sets of software that perform RF system model identification, control parameter optimization, and exception detection and handling, so as to improve the precision, robustness and operability of the LLRF system. In order to implement the LLRF applications on hardware with multiple distributed processors, an optimized software architecture is required for good understandability, maintainability and extensibility. This paper presents the design of the LLRF application software architecture, based on a software engineering approach, for FLASH. (authors)

  10. The Deuterator: software for the determination of backbone amide deuterium levels from H/D exchange MS data

    Directory of Open Access Journals (Sweden)

    Tsinoremas NF

    2007-05-01

    Background: The combination of mass spectrometry and solution-phase amide hydrogen/deuterium exchange (H/D exchange) experiments is an effective method for characterizing protein dynamics, and protein-protein or protein-ligand interactions. Despite methodological advancements and improvements in instrumentation and automation, data analysis and display remains a tedious process. The factors that contribute to this bottleneck are the large number of data points produced in a typical experiment, each requiring manual curation and validation, and then calculation of the level of backbone amide exchange. Tools have become available that address some of these issues, but lack the integration, functionality, and accessibility required to address the needs of the H/D exchange community. To date there is no software for the analysis of H/D exchange data that comprehensively addresses these issues. Results: We have developed an integrated software system for the automated analysis and representation of H/D exchange data that has been titled "The Deuterator". Novel approaches have been implemented that enable high-throughput analysis, automated determination of deuterium incorporation, and deconvolution of overlapping peptides. This has been achieved by using methods involving iterative theoretical envelope fitting and consideration of peak data within expected m/z ranges. Existing common file formats have been leveraged to allow compatibility with the output from the myriad of MS instrument platforms and peptide sequence database search engines. A web-based interface is used to integrate the components of The Deuterator that are able to analyze and present mass spectral data from instruments with varying resolving powers. The results, if necessary, can then be confirmed, adjusted, re-calculated and saved. Additional tools synchronize the curated calculation parameters with replicate time points, increasing throughput. Saved results can then …
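    The central quantity — the deuterium level from the shift of an isotopic envelope — can be sketched as follows; this is a generic centroid method under stated assumptions, not The Deuterator's envelope-fitting code.

      import numpy as np

      def centroid(mz, intensity):
          return np.average(mz, weights=intensity)

      def deuterium_uptake(mz0, i0, mz_t, i_t, charge, n_amides):
          """Fractional backbone amide deuteration at exchange time t, from
          the centroid shift between the undeuterated envelope (mz0, i0)
          and the deuterated one (mz_t, i_t)."""
          shift_da = (centroid(mz_t, i_t) - centroid(mz0, i0)) * charge
          return shift_da / n_amides     # fraction of exchangeable amides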

  11. Phase time delay and Hartman effect in a one-dimensional photonic crystal with four-level atomic defect layer

    Science.gov (United States)

    Jamil, Rabia; Ali, Abu Bakar; Abbas, Muqaddar; Badshah, Fazal; Qamar, Sajid

    2017-08-01

    The Hartman effect is revisited using a Gaussian beam incident on a one-dimensional photonic crystal (1DPC) having a defect layer doped with four-level atoms. It is considered that each atom of the defect layer interacts with three driving fields, whereas a Gaussian beam of width w is used as a probe light to study Hartman effect. The atom-field interaction inside the defect layer exhibits electromagnetically induced transparency (EIT). The 1DPC acts as positive index material (PIM) and negative index material (NIM) corresponding to the normal and anomalous dispersion of the defect layer, respectively, via control of the phase associated with the driving fields and probe detuning. The positive and negative Hartman effects are noticed for PIM and NIM, respectively, via control of the relative phase corresponding to the driving fields and probe detuning. The advantage of using four-level EIT system is that a much smaller absorption of the transmitted beam occurs as compared to three-level EIT system corresponding to the anomalous dispersion, leading to negative Hartman effect.
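    For orientation, the tunneling (phase) time discussed here is the standard stationary-phase quantity; in generic notation (not the paper's own symbols),

      \tau_{\phi} = \frac{\partial \phi_{t}}{\partial \omega}, \qquad \lim_{L \to \infty} \frac{\partial \tau_{\phi}}{\partial L} = 0,

    where \phi_{t} is the phase of the transmitted field, \omega the angular frequency and L the 1DPC length; the saturation of \tau_{\phi} with increasing L is what is meant by the Hartman effect, and its sign follows the sign of the defect layer's dispersion.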

  12. Agile software assessment

    OpenAIRE

    Nierstrasz Oscar; Lungu Mircea

    2012-01-01

    Informed decision making is a critical activity in software development but it is poorly supported by common development environments which focus mainly on low level programming tasks. We posit the need for agile software assessment which aims to support decision making by enabling rapid and effective construction of software models and custom analyses. Agile software assessment entails gathering and exploiting the broader context of software information related to the system at hand as well ...

  13. Double-layered buffer to enhance the thermal performance in a high-level radioactive waste disposal system

    International Nuclear Information System (INIS)

    Choi, Heui-Joo; Choi, Jongwon

    2008-01-01

    Thermal performance is one of the most important factors in the design of a geological disposal system for high-level radioactive wastes. According to the conceptual design of the Korean Reference Disposal System, the maximum temperature of its buffer with a domestic Ca-bentonite is close to the thermal criterion of 100 °C. In order to improve the thermal conductivity of the buffer, several kinds of additives are compared. Among the additives, graphite shows the best result, in that the thermal conductivity of the bentonite block is more than 2.0 W/(m·°C). We introduced the concept of a double-layered buffer instead of a traditional bentonite block in order to use the applied additive more effectively. The thermal analysis, based upon the three-dimensional finite element method, shows that a double-layered buffer could reduce the maximum temperature on a canister's surface by 7 °C under identical conditions when compared with a single-layered buffer. An analytical solution was derived to efficiently analyze the effects of a double-layered buffer. The illustrative cases show that the temperature differences due to a double-layered buffer depend on the thickness of the buffer

  14. Examining a Paradigm Shift in Organic Depot-Level Software Maintenance for Army Communications and Electronics Equipment

    Science.gov (United States)

    2015-05-30

    … scalable application of cutting-edge technologies. 4. Responding to changing resources—with likely significant resource reductions, the depot … deal with underutilized organic capability while continuing to increase outsourcing of depot workload. In addition, the study states that a … the unique organic skills that TYAD could bring to the software sustainment mission could be valuable based on the specific type of software

  15. Parallel Fortran-MPI software for numerical inversion of the Laplace transform and its application to oscillatory water levels in groundwater environments

    Science.gov (United States)

    Zhan, X.

    2005-01-01

    A parallel Fortran-MPI (Message Passing Interface) software for numerical inversion of the Laplace transform based on a Fourier series method is developed to meet the need of solving computationally intensive problems involving oscillatory water levels' response to hydraulic tests in a groundwater environment. The software is a parallel version of ACM (Association for Computing Machinery) Transactions on Mathematical Software (TOMS) Algorithm 796. Running 38 test examples indicated that implementation of MPI techniques with a distributed memory architecture speeds up the processing and improves efficiency. Applications to oscillatory water levels in a well during aquifer tests are presented to illustrate how this package can be applied to solve complicated environmental problems involving differential and integral equations. The package is free and is easy to use for people with little or no previous experience in using MPI but who wish to get off to a quick start in parallel computing. © 2004 Elsevier Ltd. All rights reserved.
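    The underlying Fourier-series inversion (Dubner-Abate type) is easy to state; the sketch below is a serial toy version with illustrative parameter choices, not the parallel TOMS Algorithm 796 code. A natural parallelization, plausibly the one used, distributes the series terms across MPI ranks.

      import numpy as np

      def invert_laplace(F, t, a_decay=6.0, n_terms=200):
          """Fourier-series numerical inversion of a Laplace transform F(s)
          at time t > 0 (serial toy version)."""
          T = 2.0 * t                    # half-period of the Fourier series
          a = a_decay / (2.0 * T)        # shift controlling aliasing error
          k = np.arange(1, n_terms + 1)
          s_k = a + 1j * np.pi * k / T
          series = 0.5 * F(a).real + np.sum(
              (F(s_k) * np.exp(1j * np.pi * k * t / T)).real)
          return np.exp(a * t) / T * series

      # Check with a known pair: L{sin t} = 1/(s^2 + 1), so the value at
      # t = 1 should be close to sin(1) = 0.8415.
      print(invert_laplace(lambda s: 1.0 / (s**2 + 1.0), 1.0))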

  16. SOFTWARE REVIEW: Oxford Personal Revision Guides: A-level Physics 1999/2000 Syllabus GCSE Physics 1999/2000 Syllabus

    Science.gov (United States)

    Parker, Kerry

    2000-09-01

    Both CDs begin with an introductory section which guides the student into the Revision Plan Wizard. The authors have suggested how much time each section requires, so depending upon what topics the student needs to work at, and the date of their exam, they can design a revision timetable. The student is simply told how long they have to revise each day, and then in the main physics section they are told what they have to study each week. Both packages also feature an equation handler: `a piece of software that allows different manipulations on a predefined equation and is aimed at bettering one's arithmetical skills.' (I think the language gives away the fact that this software is not designed for lower ability GCSE candidates!) The GCSE physics content is divided into seven `chapters' - Making things happen, Heat, Forces at work, Waves, Electrical and magnetic phenomena, Properties of materials and The cosmic onion. There is also a comprehensive introduction, an equation handler, some exam board questions, tests and reports. The physics is well written and is taught in colourful images, many of which are animated and have a brief commentary. There are plenty of brief six-minute tests, interspersed with the revision materials, to keep the students on their toes, but I was disappointed with the interactivity in the physics content pages. To progress, the student only has to keep clicking `I've read this page'. The A-level material is subdivided into Foundations, Key topics, Further topics and Physical data. Foundations involves motion, work, electricity, magnetism and waves, while Key topics looks at dimensions, vectors, moments, circular motion and other material from the core syllabus. Further topics cover most of the material required by the options from different boards, like many revision books. The text is clearly written and the graphics are colourful, but most of the content is still a slightly animated electronic textbook. I was disappointed, for example, that

  17. Different defect level configurations between double layers of nanorods and film in ZnO grown on c-Al2O3 by MOCVD

    International Nuclear Information System (INIS)

    Wu, Bin; Zhang, Yuantao; Shi, Zhifeng; Li, Xiang; Cui, Xijun; Zhuang, Shiwei; Zhang, Baolin; Du, Guotong

    2014-01-01

    Epitaxial ZnO structures consisting of two inherent layers, a nanorod layer on a film layer, were fabricated on c-Al2O3 by metal-organic chemical vapor deposition (MOCVD) and studied by photoluminescence. In particular, photoluminescence spectra for the film layer were obtained by exciting from the substrate side. Different defect level configurations between nanorods and film were revealed. Zinc vacancies tend to form in the top nanorod layer, whereas abundant zinc-oxygen divacancies accumulate in the bottom film layer. An acceptor state with activation energy of ∼200 meV is exclusive to the film layer. The stacking-fault related acceptor and the Al-introduced donor are present in both layers. Besides, two other defect-related donors contained in the nanorod layer may also exist within the film layer. - Highlights: • Inherent double-layer ZnO of a nanorod layer on a film layer was studied by PL. • VZn tend to form in the nanorod layer, and VZnO accumulate in the film layer. • An acceptor with activation energy of ∼200 meV is exclusive to the film layer. • Pure NBE emission without DLE in the RT PL spectrum does not imply good crystallinity

  18. The influence of calcium and phosphorus levels on egg production, egg quality, tibia weight and 32P retention of layers

    International Nuclear Information System (INIS)

    Edwardly, Y.S.; Hendratno, C.; Yuyu Wahyu

    1979-01-01

    An experiment was conducted to study the influence of three levels of calcium and three levels of dietary phosphorus on egg production, egg quality, tibia weight and 32P retention of layers. Calcium levels of 3.0, 3.5 and 4.0 percent were used, and the phosphorus content of the diets fed was either 0.6, 0.9 or 1.2 percent. Egg production was highest with rations containing 3.5% calcium and 0.9% phosphorus. A significant increase in egg production was found at the 0.9% level of phosphorus compared to the levels of 0.6 and 1.2%. Egg quality was increased significantly (p < 0.05). 32P retention was low at the highest calcium level. (author)

  19. Pulse width and height modulation for multi-level resistance in bi-layer TaOx based RRAM

    Science.gov (United States)

    Alamgir, Zahiruddin; Beckmann, Karsten; Holt, Joshua; Cady, Nathaniel C.

    2017-08-01

    Multi-level switching in resistive memory devices enables a wide range of computational paradigms, including neuromorphic and cognitive computing. To this end, we have developed a bi-layer tantalum oxide based resistive random access memory device using Hf as the oxygen exchange layer. Multiple, discrete resistance levels, ranging from 2 kΩ to several MΩ, were achieved by modulating the RESET pulse width and height. For a fixed pulse height, the OFF-state resistance was found to increase gradually with increasing pulse width, whereas for a fixed pulse width, increasing the pulse height resulted in drastic changes in resistance. Based on I-V curve fitting and temperature-dependent current measurements, resistive switching in these devices transitioned from Schottky emission in the OFF state to tunneling-based conduction in the ON state. These devices also demonstrated endurance of more than 10^8 cycles with a satisfactory Roff/Ron ratio and retention greater than 10^4 s.

  20. Nutritional Quality of Eggs of Amberlink and Hyline Layers Fed on Different Levels of Provitamin A-Biofortified Maize

    Directory of Open Access Journals (Sweden)

    GW Zeina

    Full Text Available ABSTRACT The study was conducted to determine the interaction of type of maize (provitamin A-biofortified maize (PABM) versus white maize) and strain of laying bird (Amberlink and Hyline) on the nutritional quality of eggs. Twenty-one laying hens of each of the Amberlink and Hyline strains were fed on three diets for 30 days. Birds were distributed in a 3 × 2 factorial arrangement constituting three diets (0, 50 or 100% PABM) and two strains (Amberlink and Hyline). There was a diet × strain interaction on eggshell weight, average daily feed intake, egg production, egg weight and eggshell thickness. Eggs produced by layers under 100% PABM had lighter eggshell weight and lower eggshell percentage. In contrast, eggs produced by layers under the 50% PABM diet had significantly higher eggshell percentage, heavier eggshell weight and thicker eggshells. As the level of PABM increased, the yellow and red hues (Hunter a* and b* values) significantly increased while the lightness values (Hunter L* values) decreased. As the level of PABM in the ration increased, the vitamin A content of the egg yolk also significantly increased. Assimilation of vitamin A from feed to egg yolk in Amberlink and Hyline hens was similar. Hence, egg enrichment with vitamin A can be achieved by using PABM in layers' rations. The use of high levels of PABM had a negative effect on the eggshell quality traits.

  1. Optical properties and defect levels in a surface layer found on CuInSe2 thin films

    Energy Technology Data Exchange (ETDEWEB)

    Abulfotuh, F.; Wangensteen, T.; Ahrenkiel, R.; Kazmerski, L.L. [National Renewable Energy Lab., Golden, CO (United States)

    1996-05-01

    In this paper the authors have used photoluminescence (PL) and wavelength scanning ellipsometry (WSE) to clarify the relationship among the electro-optical properties of copper indium diselenide (CIS) thin films, the type and origin of dominant defect states, and device performance. The PL study has revealed several shallow acceptor and donor levels dominating the semiconductor. PL emission from points at different depths from the surface of the CIS sample has been obtained by changing the angle of incidence of the excitation laser beam. The resulting data were used to determine the dominant defect states as a function of composition gradient at the surface of the chalcopyrite compound. The significance of this type of measurement is that it allowed the detection of a very thin layer with a larger bandgap (1.15-1.26 eV) than the CIS present on the surface of the CIS thin films. The presence of this layer has been correlated by several groups to improvement of the CIS cell performance. An important need that results from detecting this layer on the surface of the CIS semiconductor is the determination of its thickness and optical constants (n, k) as a function of wavelength. The thickness of this surface layer is about 500 Å.

  2. Safety level of levofloxacin following repeated oral administration in White Leghorn layer birds

    Directory of Open Access Journals (Sweden)

    Jatin H. Patel

    2009-08-01

    Full Text Available Levofloxacin is a fluorinated quinolone which has broad-spectrum antibacterial activity at low plasma/tissue concentrations. The present study was designed to investigate the safety of levofloxacin (10 mg/kg) after repeated oral administration at 12-hour intervals for 14 days in layer birds (30-35 weeks old and weighing between 1.5 and 2.0 kg) and to determine the tissue concentration of the drug following oral administration (10 mg/kg) for 5 days. The drug concentration in tissue was determined using High Performance Liquid Chromatography (HPLC). Repeated oral administration of levofloxacin in layer birds was found safe based on evaluation of haematological (Hb, PCV, TLC and DLC) and blood biochemical (AST, ALT, AKP, ACP, LDH, BUN, serum total protein, serum albumin, serum creatinine, blood glucose and total bilirubin) parameters and the histopathology of liver, kidney and joint cartilage. Levofloxacin could not be detected in body tissues (liver and skeletal muscle) at 12 hours after the last administration. [Vet. World 2009; 2(4): 137-139]

  3. Solution-processed high-LUMO-level polymers in n-type organic field-effect transistors: a comparative study as a semiconducting layer, dielectric layer, or charge injection layer

    International Nuclear Information System (INIS)

    Liu, Chuan; Xu, Yong; Liu, Xuying; Minari, Takeo; Sirringhaus, Henning; Noh, Yong-Young

    2015-01-01

    In solution-processed organic field-effect transistors (OFETs), polymers with a high level of the lowest unoccupied molecular orbital (LUMO, > −3.5 eV) are especially susceptible to electron trapping, which causes low electron mobility and strong instability in successive operation. However, the role of high-LUMO-level polymers can differ depending on their location relative to the semiconductor/insulator interface, and they may even benefit the device in some cases. We constructed unconventional polymer heterojunction n-type OFETs to control the location of the same high-LUMO-level polymer, placing it in, under, or above the accumulation channel. We found that although the devices with the polymer in the channel suffer from dramatic instability, the same polymer causes much less instability when it acts as a dielectric modification layer or a charge injection layer; in the latter case it may even improve the device performance. This result helps to improve our understanding of electron trapping and to explore the value of these polymers in OFETs. (invited article)

  4. Software Engineering for Portability.

    Science.gov (United States)

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programing; and suggestions for further research and…

  5. Analysis of macular and nerve fiber layer thickness in multiple sclerosis patients according to severity level and optic neuritis episodes.

    Science.gov (United States)

    Soler García, A; Padilla Parrado, F; Figueroa-Ortiz, L C; González Gómez, A; García-Ben, A; García-Ben, E; García-Campos, J M

    2016-01-01

    Quantitative assessment of macular and nerve fibre layer thickness in multiple sclerosis patients with regard to the Expanded Disability Status Scale (EDSS) and the presence or absence of previous optic neuritis episodes. We recruited 62 patients with multiple sclerosis (53 relapsing-remitting and 9 secondary progressive) and 12 disease-free controls. All patients underwent an ophthalmological examination, including quantitative analysis of the nerve fibre layer and macular thickness using optical coherence tomography. Patients were classified according to EDSS as A (lower than 1.5), B (between 1.5 and 3.5), and C (above 3.5). Mean nerve fibre layer thickness in the control, A, B, and C groups was 103.35±12.62, 99.04±14.35, 93.59±15.41, and 87.36±18.75 μm respectively, with statistically significant differences (P<.05). In patients with no history of optic neuritis, a history of episodes in the last 3 to 6 months, or a history longer than 6 months, mean nerve fibre layer thickness was 99.25±13.71, 93.92±13.30 and 80.07±15.91 μm respectively; differences were significant (P<.05). Mean macular thickness in the control, A, B, and C groups was 220.01±12.07, 217.78±20.02, 217.68±20.77, and 219.04±24.26 μm respectively. Differences were not statistically significant. The mean retinal nerve fibre layer thickness in multiple sclerosis patients is related to the EDSS level. Patients with previous optic neuritis episodes have a thinner retinal nerve fibre layer than patients with no history of these episodes. Mean macular thickness is not correlated with EDSS level.

  6. Calcium levels and calcium:available phosphorus ratios in diets for white egg layers from 42 to 58 weeks of age

    Directory of Open Access Journals (Sweden)

    Silvana Marques Pastore

    2012-12-01

    Full Text Available The experiment was conducted to determine the nutritional requirement of calcium and the best calcium:available phosphorus ratio for commercial layers after the laying peak. A total of 324 Hy-Line W-36 laying hens were utilized in the period from 42 to 58 weeks of age, distributed in a completely randomized design in a 3 × 3 factorial arrangement, composed of three levels of calcium (39, 42 and 45 g/kg) and three calcium:phosphorus ratios (12.12:1, 10.53:1 and 9.30:1), totaling nine treatments with six replications and six birds per experimental unit. There was no significant effect of the calcium level × calcium:phosphorus ratio interaction for any of the variables studied. The calcium levels and the calcium:phosphorus ratios did not affect performance or egg and bone quality. In the evaluation of the calcium:phosphorus balance, as the dietary calcium level was raised, the intake of calcium and phosphorus and the contents of mineral matter and calcium in the excreta increased linearly, and the retention of calcium by the birds decreased linearly. With the reduction of the calcium:phosphorus ratios of the diet, the intake, retention and excretion of phosphorus by layers increased. Diets containing calcium at 39 g/kg and a calcium:phosphorus ratio of 12.12:1, corresponding to an intake of 3.51 g of calcium and 289 mg of available phosphorus per bird per day, meet the requirements of calcium and available phosphorus of white egg layers in the period from 42 to 58 weeks of age.

  7. Optimization of Antivirus Software

    OpenAIRE

    Catalin BOJA; Adrian VISOIU

    2007-01-01

    The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined to help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyzes...

  8. High-Level Design for Ultra-Fast Software Defined Radio Prototyping on Multi-Processors Heterogeneous Platforms

    OpenAIRE

    Moy , Christophe; Raulet , Mickaël

    2010-01-01

    The design of Software Defined Radio (SDR) equipment (terminals, base stations, etc.) is still very challenging. We propose here a design methodology for ultra-fast prototyping on heterogeneous platforms made of GPPs (General Purpose Processors), DSPs (Digital Signal Processors) and FPGAs (Field Programmable Gate Arrays). Relying on a component-based approach, the methodology mainly aims at automating as much as possible the design from an algorithmic validation to a mul...

  9. E-learning Materials Development: Applying and Implementing Software Reuse Principles and Granularity Levels in the Small

    OpenAIRE

    Nabil Arman

    2010-01-01

    E-learning materials development is typically acknowledged as an expensive, complicated, and lengthy process, often producing materials that are of low quality and difficult to adapt and maintain. It has always been a challenge to identify proper e-learning materials that can be reused at a reasonable cost and effort. In this paper, software engineering reuse principles are applied to the e-learning materials development process. These principles are then applied and implemented in a prototype that...

  10. Cell layer level generalized dynamic modeling of a PEMFC stack using VHDL-AMS language

    Energy Technology Data Exchange (ETDEWEB)

    Gao, Fei; Blunier, Benjamin; Miraoui, Abdellatif; El-Moudni, Abdellah [Transport and Systems Laboratory (SeT) - EA 3317/UTBM, University of Technology of Belfort-Montbeliard, Rue Thierry Mieg, 90000 Belfort (France)

    2009-07-15

    A generalized, cell-layer-scale proton exchange membrane fuel cell (PEMFC) stack dynamic model is presented using the VHDL-AMS (IEEE standard Very High Speed Integrated Circuit Hardware Description Language-Analog and Mixed-Signal Extensions) modeling language. A PEMFC stack system is a complex energy conversion system that covers three main energy domains: electrical, fluidic and thermal. The first part of this work shows the performance and the advantages of the VHDL-AMS language when modeling such a complex system. Then, using the VHDL-AMS modeling standards, an electrical domain model, a fluidic domain model and a thermal domain model of the PEMFC stack are coupled and presented together. Thus, a complete coupled multi-domain 1-D dynamic fuel cell stack model is given. The simulation results are then compared with a Ballard 1.2 kW NEXA fuel cell system, and show close agreement between simulation and experiment. This complex multi-domain VHDL-AMS stack model can be used for model-based control design or a Hardware-In-the-Loop application. (author)
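
    The multi-domain coupling idea can be illustrated outside VHDL-AMS. The following Python sketch is an illustration only, not the authors' model: an electrical submodel yields the cell voltage, the dissipated power drives a lumped thermal submodel, and the temperature feeds back into the electrical losses. Every parameter value and function name below is an assumption.

        import numpy as np

        N_CELLS = 47        # cells in the stack (hypothetical)
        E_NERNST = 1.23     # ideal cell voltage, V
        R_OHM = 1e-5        # ohmic area-specific resistance, ohm*m^2 (assumed)
        AREA = 0.01         # active area per cell, m^2 (assumed)
        C_TH = 5.0e3        # lumped thermal capacitance, J/K (assumed)
        H_LOSS = 20.0       # heat-loss coefficient to ambient, W/K (assumed)
        T_AMB = 298.0       # ambient temperature, K

        def cell_voltage(j, T):
            """Electrical domain: activation + ohmic losses; the activation
            overvoltage relaxes slightly as the stack warms up."""
            act = 0.06 * np.log1p(j / 10.0) * (353.0 / T)
            return E_NERNST - act - R_OHM * j

        def simulate(j=5000.0, dt=0.1, t_end=600.0):
            """Explicit-Euler integration of the thermal state, coupled at
            every step to the electrical model (j in A/m^2)."""
            T = T_AMB
            for _ in np.arange(0.0, t_end, dt):
                v = cell_voltage(j, T)
                p_heat = N_CELLS * (E_NERNST - v) * j * AREA   # dissipated power, W
                T += dt * (p_heat - H_LOSS * (T - T_AMB)) / C_TH
            return N_CELLS * v, T

        v_stack, T_stack = simulate()
        print(f"stack voltage ~ {v_stack:.1f} V, stack temperature ~ {T_stack:.0f} K")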

  11. Deep layer-resolved core-level shifts in the beryllium surface

    DEFF Research Database (Denmark)

    Aldén, Magnus; Skriver, Hans Lomholt; Johansson, Börje

    1993-01-01

    Core-level energy shifts for the beryllium surface region are calculated by means of a Green’s function technique within the tight-binding linear muffin-tin orbitals method. Both initial- and final-state effects in the core-ionization process are fully accounted for. Anomalously large energy shifts...

  12. Using modern software tools to design, simulate and test a Level 1 trigger sub-system for the D Zero Detector

    International Nuclear Information System (INIS)

    Angstadt, R.; Borcherding, F.; Johnson, M.E.; Moreira, L.

    1995-06-01

    This paper describes a system which uses a commercial spreadsheet program and commercial hardware on an IBM PC to develop and test a track-finding system for the D Zero Level 1 scintillating fiber trigger. The trigger system resides in a VME crate. The system allows the user to generate test input, write the pattern to the hardware, simulate the results in software, read the hardware results, compare the two, and inform the user of any differences

  13. Stress effects of the inter-level dielectric layer on the ferroelectric performance of integrated SrBi2Ta2O9 capacitors

    International Nuclear Information System (INIS)

    Hong, Suk-Kyoung; Yang, B.; Oh, Sang Hyun; Kang, Young Min; Kang, Nam Soo; Hwang, Cheol Seong; Kwon, Oh Seong

    2001-01-01

    The thermal stress effects of the inter-level dielectric (ILD) layer on the ferroelectric performance of integrated Pt/SrBi2Ta2O9 (SBT)/Pt capacitors were investigated. Two different thin film materials, pure SiO2 grown at 650 °C and B- and P-doped SiO2 grown at 400 °C by chemical vapor deposition techniques, were tested as an ILD layer. The ILD layer encapsulated the SBT capacitor array. During high temperature thermal cycling (up to 800 °C) after ILD deposition, which is used for both densifying the ILD and curing the various damage imposed on the SBT capacitors, a large thermal stress occurred in the bottom Pt layer due to the thermal expansion mismatch between the various layers. In particular, the pure SiO2 ILD layer between the capacitors did not allow thermal expansion of the Pt layers, which led to a large accumulation of compressive stress in the layer. This resulted in hillock formation in the bottom Pt layer and eventual capacitor failure. However, the B- and P-doped SiO2 ILD layer contracted during thermal cycling by removing residual impurities, which allowed greater expansion of the Pt layer. Therefore, compressive stress accumulation did not occur and excellent ferroelectric properties were thus obtained from the integrated capacitor array. © 2001 American Institute of Physics

  14. Stochastic modelling of intermittent fluctuations in the scrape-off layer: Correlations, distributions, level crossings, and moment estimation

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, O. E., E-mail: odd.erik.garcia@uit.no; Kube, R.; Theodorsen, A. [Department of Physics and Technology, UiT The Arctic University of Norway, N-9037 Tromsø (Norway); Pécseli, H. L. [Physics Department, University of Oslo, PO Box 1048 Blindern, N-0316 Oslo (Norway)

    2016-05-15

    A stochastic model is presented for intermittent fluctuations in the scrape-off layer of magnetically confined plasmas. The fluctuations in the plasma density are modeled by a superposition of uncorrelated pulses with fixed shape and duration, describing the radial motion of blob-like structures. In the case of an exponential pulse shape and exponentially distributed pulse amplitudes, predictions are given for the lowest-order moments, probability density function, auto-correlation function, level crossings, and average times for periods spent above and below a given threshold level. Also, the mean squared errors on estimators of the sample mean and variance for realizations of the process by finite time series are obtained. These results are discussed in the context of single-point measurements of fluctuations in the scrape-off layer, broad density profiles, and implications for plasma-wall interactions due to transient transport events in fusion-grade plasmas. The results may also have wide applications for modelling fluctuations in other magnetized plasmas such as basic laboratory experiments and ionospheric irregularities.
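
    A process of this class (a filtered Poisson, or shot-noise, process with one-sided exponential pulses and exponentially distributed amplitudes) is easy to simulate numerically. The Python sketch below is illustrative; its parameter values are not taken from the paper. For this process the mean is gamma times the mean amplitude, and the relative fluctuation level is 1/sqrt(gamma), where gamma is the pulse rate times the pulse duration.

        import numpy as np

        rng = np.random.default_rng(0)

        def shot_noise(t, n_pulses=2000, tau_d=1.0, mean_amp=1.0):
            """Superposition of pulses a_k * exp(-(t - t_k)/tau_d) for t >= t_k."""
            t_k = rng.uniform(t[0], t[-1], n_pulses)    # uniform arrivals ~ Poisson
            a_k = rng.exponential(mean_amp, n_pulses)   # exponential amplitudes
            x = np.zeros_like(t)
            for tk, ak in zip(t_k, a_k):
                m = t >= tk
                x[m] += ak * np.exp(-(t[m] - tk) / tau_d)
            return x

        t = np.linspace(0.0, 1000.0, 20_000)
        x = shot_noise(t)
        gamma = 2000 * 1.0 / 1000.0   # intermittency parameter: pulse rate * tau_d
        # Sample statistics approach the theory values up to edge effects near t = 0.
        print(f"mean {x.mean():.2f} (theory {gamma:.2f}), "
              f"std/mean {x.std() / x.mean():.2f} (theory {gamma ** -0.5:.2f})")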

  15. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  16. Performance Analysis of the Log Rule and Frame Level Schedule Scheduling Algorithms in a Multicell Scenario on the LTE MAC Layer

    Directory of Open Access Journals (Sweden)

    Ridwan

    2016-03-01

    Full Text Available Mobile telecommunications technology has gradually evolved to provide better voice, data, and video services to users of telecommunications services. LTE (Long Term Evolution) is a network based on the Internet Protocol (IP), standardized by the 3rd Generation Partnership Project (3GPP). To support these services, LTE requires a suitable mechanism, one of which is packet scheduling. Scheduling applies different treatment to incoming packets in accordance with the priorities set by the scheduling algorithm. This research analyzes the performance of LTE, with the parameters delay, packet loss ratio, throughput and fairness index, under the Frame Level Schedule (FLS) and Log Rule scheduling algorithms in an LTE simulator, with scenarios using VoIP, video and Best Effort (BE) traffic. The results show that the FLS algorithm outperforms Log Rule in terms of throughput, while Log Rule outperforms FLS in terms of delay as the number and speed of users increase. This indicates that both scheduling algorithms are suitable for LTE networks carrying real-time traffic, but not for non-real-time services such as BE.
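
    The abstract does not define its fairness index; comparisons of this kind commonly use Jain's index, so the short Python sketch below assumes Jain's definition, J = (sum x_i)^2 / (n * sum x_i^2), which ranges from 1/n (one user gets everything) to 1 (perfectly fair).

        def jain_fairness(throughputs):
            """Jain's fairness index over per-user throughputs."""
            n = len(throughputs)
            total = sum(throughputs)
            return total * total / (n * sum(x * x for x in throughputs))

        # Per-user throughputs (Mbps) under two hypothetical schedulers:
        print(jain_fairness([2.0, 2.1, 1.9, 2.0]))   # ~1.0, nearly perfectly fair
        print(jain_fairness([4.0, 0.5, 0.5, 0.5]))   # ~0.45, one user dominates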

  17. Behaviour of a clay layer subjected to bending: application to a landfill for storing very low level radioactive waste

    International Nuclear Information System (INIS)

    Camp Devernay, S.

    2008-12-01

    The sealing cover system of landfills for storing non-biodegradable and dangerous waste is most of the time made up of a layer of clay and/or a geomembrane. In the context of optimizing storage conditions for radioactive waste, surface storage is envisaged for very low level radioactive waste (VLLW) and for low- and intermediate-level short-lived radioactive waste. This study is applied to a VLLW disposal facility whose cover is made up of a clay layer over a geomembrane, but it can be transposed to landfills for dangerous waste. The cover clay barrier of a landfill must preserve its properties, in particular a permeability below 10^-9 m/s, during the life of the landfill, in spite of the various loadings which can generate cracking. Among these loadings, differential settlement of the underlying waste, which generates bending, is one of the most critical. The current regulations concerning the implementation of a clay layer as a cover have gaps, in particular with regard to the deformability of the clay. This study shows the value of coupling laboratory tests (four-point bending tests, splitting tests and punching tests) with full-scale field bending tests and with centrifuge tests modelling them. These tests were also modeled numerically by finite elements. Good agreement between the results, in particular with regard to the conditions for crack initiation under bending, is shown. Numerical modeling and centrifuge tests made it possible to extend the study to cases not tested in situ (settlement tests, reinforcement of the clay). (author)

  18. 2D Effective Electron Mass at the Fermi Level in Accumulation and Inversion Layers of MOSFET Nano Devices.

    Science.gov (United States)

    Singh, S L; Singh, S B; Ghatak, K P

    2018-04-01

    In this paper an attempt is made to study the 2D Fermi Level Mass (FLM) in accumulation and inversion layers of nano MOSFET devices made of nonlinear optical, III-V, ternary, quaternary, II-VI, IV-VI, Ge and stressed materials by formulating 2D carrier dispersion laws on the basis of the k·p formalism and considering the energy band constants of the particular material. Taking accumulation and inversion layers of Cd3As2, CdGeAs2, InSb, Hg1-xCdxTe, In1-xGaxAsyP1-y lattice-matched to InP, CdS, GaSb and Ge as examples, it is observed that the FLM depends on the sub-band index for nano MOSFET devices made of Cd3As2 and CdGeAs2, which is a characteristic feature of such 2D systems. Besides, the FLM depends on the scattering potential in all cases, and the mass changes with increasing surface electric field. The FLM exists in the band gap, which is impossible without heavy doping.

  19. Loosely coupled level sets for retinal layers and drusen segmentation in subjects with dry age-related macular degeneration

    Science.gov (United States)

    Novosel, Jelena; Wang, Ziyuan; de Jong, Henk; Vermeer, Koenraad A.; van Vliet, Lucas J.

    2016-03-01

    Optical coherence tomography (OCT) is used to produce high-resolution three-dimensional images of the retina, which permit the investigation of retinal irregularities. In dry age-related macular degeneration (AMD), a chronic eye disease that causes central vision loss, disruptions such as drusen and changes in retinal layer thicknesses occur, which could be used as biomarkers for disease monitoring and diagnosis. Due to the topology-disrupting pathology, existing segmentation methods often fail. Here, we present a solution for the segmentation of retinal layers in dry AMD subjects by extending our previously presented loosely coupled level sets framework, which operates on attenuation coefficients. In eyes affected by AMD, Bruch's membrane becomes visible only below the drusen, and our segmentation framework is adapted to delineate such a partially discernible interface. Furthermore, the initialization stage, which tentatively segments five interfaces, is modified to accommodate the appearance of drusen. This stage is based on Dijkstra's algorithm and combines prior knowledge on the shape of the interface, the gradient and the attenuation coefficient in the newly proposed cost function. This prior knowledge is incorporated by varying the weights for horizontal, diagonal and vertical edges. Finally, quantitative evaluation of the accuracy shows good agreement between manual and automated segmentation.
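
    The Dijkstra-based initialization idea can be sketched generically: find a left-to-right path through a 2-D cost image (rows = depth, columns = A-scans), with different weights for horizontal and diagonal steps. The Python sketch below is an illustration under assumed costs and weights, not the paper's actual cost function.

        import heapq
        import numpy as np

        def dijkstra_interface(cost, w_horizontal=1.0, w_diagonal=1.4):
            """Cheapest left-to-right path; returns one row index per column."""
            rows, cols = cost.shape
            dist = np.full((rows, cols), np.inf)
            prev = np.full((rows, cols), -1, dtype=int)  # previous row per node
            heap = [(cost[r, 0], r, 0) for r in range(rows)]
            dist[:, 0] = cost[:, 0]
            heapq.heapify(heap)
            while heap:
                d, r, c = heapq.heappop(heap)
                if d > dist[r, c] or c == cols - 1:      # stale entry or terminal
                    continue
                for dr, w in ((-1, w_diagonal), (0, w_horizontal), (1, w_diagonal)):
                    nr = r + dr
                    if 0 <= nr < rows:
                        nd = d + w * cost[nr, c + 1]
                        if nd < dist[nr, c + 1]:
                            dist[nr, c + 1] = nd
                            prev[nr, c + 1] = r
                            heapq.heappush(heap, (nd, nr, c + 1))
            # Backtrack the cheapest path from the last column.
            r = int(np.argmin(dist[:, -1]))
            path = [r]
            for c in range(cols - 1, 0, -1):
                r = prev[r, c]
                path.append(r)
            return path[::-1]

        cost = np.random.default_rng(1).random((50, 80))
        cost[25, :] = 0.01                # a cheap "interface" along row 25
        print(dijkstra_interface(cost)[:10])   # path hugs row 25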

  20. Achieving Agility and Stability in Large-Scale Software Development

    Science.gov (United States)

    2013-01-16

    A temporary team is assigned to prepare layers and frameworks (presentation layer, domain layer, data access layer) for future feature teams.

  1. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
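
    The scan-and-run structure described above is simple to sketch. The Python below is a minimal illustration of the idea, not dtest's actual code; the directory pattern and the per-directory "run.sh" entry point are hypothetical stand-ins for its configuration files.

        import fnmatch
        import os
        import subprocess
        from concurrent.futures import ProcessPoolExecutor

        def find_test_dirs(root, pattern="test_*"):
            """Yield every subdirectory whose name matches the pattern."""
            for dirpath, dirnames, _ in os.walk(root):
                for d in dirnames:
                    if fnmatch.fnmatch(d, pattern):
                        yield os.path.join(dirpath, d)

        def run_test(path):
            """Run one test directory's entry script and report its status."""
            script = os.path.join(path, "run.sh")    # assumed entry point
            proc = subprocess.run(["sh", script], capture_output=True, text=True)
            return path, proc.returncode

        if __name__ == "__main__":
            tests = list(find_test_dirs("."))
            with ProcessPoolExecutor() as pool:      # one test per worker process
                for path, rc in pool.map(run_test, tests):
                    print("PASS" if rc == 0 else "FAIL", path)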

  2. OPEN SOURCE SOFTWARE, FREE SOFTWARE?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has opened up a new alternative: the use of open source software. The use of open source software is spreading in line with current global issues in Information and Communication Technology (ICT). Some organizations and companies have begun to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software that is unlicensed. These notions about open source software are not entirely correct, so the concept needs to be properly introduced, covering its history, its licenses and how to choose among them, and the considerations involved in selecting the available open source software. Keywords: License, Open Source, HAKI

  3. Software Epistemology

    Science.gov (United States)

    2016-03-01

    in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment... Acronyms: LTS (Label Transition System); MUSE (Mining and Understanding Software Enclaves); RTEMS (Real-Time Executive for Multi-processor Systems); SaaS (Software as a Service); SSA (Static Single Assignment); SWE (Software Epistemology); UD/DU (Def-Use/Use-Def Chains, dataflow graph).

  4. Deep levels in metamorphic InAs/InGaAs quantum dot structures with different composition of the embedding layers

    Science.gov (United States)

    Golovynskyi, S.; Datsenko, O.; Seravalli, L.; Kozak, O.; Trevisi, G.; Frigeri, P.; Babichuk, I. S.; Golovynska, I.; Qu, Junle

    2017-12-01

    Deep levels in metamorphic InAs/InxGa1-xAs quantum dot (QD) structures are studied with deep-level thermally stimulated conductivity (TSC), photoconductivity (PC) and photoluminescence (PL) spectroscopy, and compared with data from pseudomorphic InGaAs/GaAs QDs investigated previously by the same techniques. We have found that for a low indium content (x = 0.15) the trap density in the plane of the self-assembled QDs is comparable to or less than that for InGaAs/GaAs QDs. However, the trap density increases with x, resulting in a rise of the defect photoresponse in the PC and TSC spectra as well as a reduction of the QD PL intensity. The activation energies of the deep levels indicate that some traps correspond to the known defect complexes EL2, EL6, EL7, EL9 and EL10 inherent in GaAs, and three traps are attributed to extended defects located in the InGaAs embedding layers. The rest of them have been found to be concentrated mainly close to the QDs, as their density in the deeper InGaAs buffers is much lower. This is an important result for the development of light-emitting and light-sensitive devices based on metamorphic InAs QDs, as it is a strong indication that the defect density is not higher than in pseudomorphic InAs QDs.

  5. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process

  6. Principle and geomorphological applicability of summit level and base level technique using Aster Gdem satellite-derived data and the original software Baz

    Directory of Open Access Journals (Sweden)

    Akihisa Motoki

    2015-05-01

    Full Text Available This article presents the principle and geomorphological applicability of the summit level technique using Aster Gdem satellite-derived topographic data. The summit level corresponds to the virtual topographic surface constituted by local highest points, such as peaks and plateau tops, and reconstitutes the palaeo-geomorphology before drainage erosion. The summit level map is efficient for the reconstitution of palaeo-surfaces and the detection of active tectonic movement. The base level is the virtual surface composed of local lowest points, such as valley bottoms. The difference between the summit level and the base level is called the relief amount. These virtual maps are constructed by the original software Baz. The macroconcavity index, MCI, is calculated from the summit level and relief amount maps. The volume-normalised three-dimensional concavity index, TCI, is calculated from the hypsometric diagram. Massifs with high erosive resistance tend to have a convex general form and low MCI and TCI. Those with low resistance have a concave form and high MCI and TCI. The diagram of TCI vs. MCI permits distinguishing the erosive characteristics of massifs according to their constituent rocks. The base level map for the ocean bottom detects basement tectonic uplift which occurred before the formation of the volcanic seamounts.
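
    A simplified numerical analogue of the summit-level / base-level idea can be built with moving-window extrema over a DEM grid. This Python sketch is a stand-in approximation (the Baz software's exact algorithm is not described here, and an envelope through local extrema would be a closer match): the summit level envelopes local highest points, the base level envelopes local lowest points, and their difference is the relief amount.

        import numpy as np
        from scipy.ndimage import maximum_filter, minimum_filter

        def summit_base_relief(dem, window=25):
            summit = maximum_filter(dem, size=window)   # envelope through peaks
            base = minimum_filter(dem, size=window)     # envelope through valleys
            return summit, base, summit - base          # relief amount map

        rng = np.random.default_rng(0)
        dem = rng.random((200, 200)).cumsum(axis=0)     # toy elevation grid
        summit, base, relief = summit_base_relief(dem)
        print("mean relief amount:", round(float(relief.mean()), 2))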

  7. Assuring Software Reliability

    Science.gov (United States)

    2014-08-01

    technologies and processes to achieve a required level of confidence that software systems and services function in the intended manner. 1.3 Security Example...that took three high-voltage lines out of service, and a software failure (a race condition) that disabled the computing service that notified the... service had failed. Instead of analyzing the details of the alarm server failure, the reviewers asked why the following software assurance claim had

  8. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  9. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  10. How Does Skype, as an Online Communication Software Tool, Contribute to K-12 Administrators' Level of Self-Efficacy?

    Science.gov (United States)

    Kiriakidis, Peter

    2012-01-01

    How does Skype, as an online communication tool, contribute to school and district administrators' reported level of self-efficacy? A sample of n = 39 participants of which 22 were school administrators and 17 were district administrators was purposefully selected to use Skype in their offices with a webcam and microphone to communicate with other…

  11. Low-level profiling and MARTE-compatible modeling of software components for real-time systems

    NARCIS (Netherlands)

    Triantafyllidis, K.; Bondarev, E.; With, de P.H.N.

    2012-01-01

    In this paper, we present a method for (a) profiling of individual components at a high level of accuracy, (b) modeling of the components with the accurate data obtained from profiling, and (c) model conversion to the MARTE profile. The resulting performance models of individual components are used at

  12. Nutrient balance of layers fed diets with different calcium levels and the inclusion of phytase and/or sodium butyrate

    Directory of Open Access Journals (Sweden)

    MM Vieira

    2011-06-01

    Full Text Available In this study, Hisex Brown layers in lay were evaluated between 40 and 44 weeks of age to assess the inclusion of bacterial phytase (Ph) and sodium butyrate (SB) in diets containing different calcium levels (CaL). Performance, average egg weight and eggshell percentage, in addition to nutrient metabolizability and Ca and P balance, were evaluated for 28 days. Birds were distributed according to a completely randomized experimental design with a 3x2x2 factorial arrangement, with three calcium levels (2.8, 3.3 and 3.8%), the addition or not of phytase (500 PhU/kg) and the addition or not of sodium butyrate (20 mEq/kg), composing 12 treatments with eight replicates of one bird each. There was no additive effect of phytase or SB on the evaluated responses. Feed intake and feed conversion ratio were influenced by CaL, with the best performance obtained with 3.3% dietary Ca. Ca balance was positively affected by dietary Ca, and P balance by the addition of phytase. The dietary Ca concentration estimated to achieve Ca body balance was 3.41%, corresponding to an apparent retention of 59.9% of Ca intake.

  13. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD) the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  14. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited to application software architecture. • A template for failure modes in a specific software language is established. • A detailed-level software FMEA analysis of nuclear safety software is presented. - Abstract: A method of software safety analysis is described in this paper for safety-related application software. The target software system is the software code installed on an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, first, an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effects Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA analysis is carried out based on a so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) of the ATIP software. The software safety analysis by software FMEA, applied to the ATIP software code, which has been integrated and passed through a very rigorous system test procedure, is shown to provide valuable results (i.e., software defects) that could not be identified during the various system tests
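
    The failure-mode-template idea lends itself to a small data-structure sketch. The Python below is illustrative only; the block types and failure modes are invented, not the paper's actual template. Each FBD function-block type carries a reusable list of failure modes, which is expanded over every block instance to produce the FMEA worksheet.

        from dataclasses import dataclass

        @dataclass
        class FailureMode:
            mode: str
            local_effect: str
            system_effect: str

        # Per-block-type template (hypothetical entries for illustration).
        TEMPLATE = {
            "AND": [
                FailureMode("output stuck high", "trip condition masked", "failure on demand"),
                FailureMode("output stuck low", "spurious trip signal", "spurious actuation"),
            ],
            "TON_TIMER": [
                FailureMode("timeout never expires", "test step hangs", "loss of test function"),
            ],
        }

        def fmea_worksheet(blocks):
            """Expand the per-type template over the block instances."""
            for name, block_type in blocks:
                for fm in TEMPLATE.get(block_type, []):
                    yield name, block_type, fm.mode, fm.system_effect

        for row in fmea_worksheet([("B1", "AND"), ("B2", "TON_TIMER")]):
            print(row)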

  15. A user's guide to the GoldSim/BLT-MS integrated software package:a low-level radioactive waste disposal performance assessment model

    International Nuclear Information System (INIS)

    Knowlton, Robert G.; Arnold, Bill Walter; Mattie, Patrick D.

    2007-01-01

    Sandia National Laboratories (Sandia), a U.S. Department of Energy national laboratory, has over 30 years of experience in the assessment of radioactive waste disposal and, at the time of this publication, is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. In countries with small radioactive waste programs, international technology transfer efforts are often hampered by small budgets, schedule constraints, and a lack of experienced personnel. In an effort to surmount these difficulties, Sandia has developed a system for probabilistic safety assessment modeling that utilizes a combination of commercially available software codes and existing legacy codes, which facilitates technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission (NRC), and codes developed and maintained by the United States Department of Energy, are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new ones. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software which meets the rigors of both domestic regulatory requirements and international peer review. Therefore, revitalization of deterministic legacy codes, as well as adaptation of contemporary deterministic codes, provides a credible and solid computational platform for constructing probabilistic safety assessment models. This document is a reference user's guide for the GoldSim/BLT-MS integrated modeling software package developed as part of a cooperative technology transfer project between Sandia National Laboratories and the Institute of Nuclear Energy Research (INER) in Taiwan for the preliminary assessment of several candidate low-level

  16. Optimization of Antivirus Software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined to help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyzes some of the optimization concepts applied to this category of applications

  17. Low-level software for the pentek 6510 digital receiver board applied to the new AD beam measurement system

    CERN Document Server

    Angoletta, Maria Elena

    2002-01-01

    The new beam measurement system for the CERN Antiproton Decelerator relies heavily on a Pentek 6510 Digital Receiver (DRX) board. The new system's goal is to extract beam parameters from pickup signals. Its digital implementation allows for higher precision, easier management of the hardware, and modification and improvement with no hardware change. In this scheme, this innovative VME DRX board is responsible for parallel data acquisition and for the independent digital down-conversion and processing of up to 4 digitised inputs. The in-house-developed low-level code (LLC), running on the board, takes care of several tasks, such as interfacing with the Real Time Task (RTT), data processing and board management. The RTT runs on a PowerPC VME board and controls the DRX board as a master. The LLC is a state machine developed in C and Assembler, which services several interrupts and performs the FFT of complex input data. The DRX low-level system developed is highly modular and easily adaptable to other processing scenari...
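
    The state-machine-plus-FFT structure can be sketched in a few lines. The Python below is an illustration only (the real LLC is C and Assembler running on the board; the names, buffer size and event model here are assumptions): a per-channel state machine buffers complex down-converted samples and, when the buffer fills (standing in for an interrupt), computes the FFT that would be handed to the real-time task.

        import numpy as np

        IDLE, ACQUIRING = range(2)

        class DrxChannel:
            def __init__(self, fft_size=1024):
                self.state = IDLE
                self.buf = np.zeros(fft_size, dtype=complex)
                self.n = 0

            def on_sample(self, iq):
                """Consume one complex sample; return a spectrum when ready."""
                self.state = ACQUIRING
                self.buf[self.n] = iq
                self.n += 1
                if self.n == self.buf.size:          # "buffer full" event
                    spectrum = np.fft.fft(self.buf)
                    self.n, self.state = 0, IDLE
                    return spectrum                  # delivered to the RTT in the real system
                return None

        chan = DrxChannel()
        t = np.arange(2048)
        for s in np.exp(2j * np.pi * 0.1 * t):       # complex tone at 0.1 * f_s
            out = chan.on_sample(s)
            if out is not None:
                print("peak FFT bin:", int(np.argmax(np.abs(out))))  # ~102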

  18. Thin-layer heap bioleaching of copper flotation tailings containing high levels of fine grains and microbial community succession analysis

    Science.gov (United States)

    Hao, Xiao-dong; Liang, Yi-li; Yin, Hua-qun; Liu, Hong-wei; Zeng, Wei-min; Liu, Xue-duan

    2017-04-01

    Thin-layer heap bioleaching of copper flotation tailings containing high levels of fine grains was carried out with mixed cultures on a small scale over a period of 210 d. Lump ores were loaded at the bottom of the heap as a framework. The overall copper leaching rates of the tailings and lump ores were 57.10 wt% and 65.52 wt%, respectively. The dynamic shifts in the community structures of the attached microorganisms were determined using the Illumina MiSeq sequencing platform based on a 16S rRNA amplification strategy. The results indicated that the chemolithotrophic genera Acidithiobacillus and Leptospirillum were always detected and dominated the microbial community in the initial and middle stages of the heap bioleaching process; both genera might be responsible for improving copper extraction. However, Thermogymnomonas and Ferroplasma increased gradually in the final stage. Moreover, the effects of various physicochemical parameters and microbial community shifts on the leaching efficiency were further investigated, and these associations provide important clues for facilitating the effective application of bioleaching.

  19. Rendering of surface-geometries at job-generation level for camouflaging the layered nature of Additively Manufactured parts

    DEFF Research Database (Denmark)

    Pedersen, D. B.; Hansen, H. N.; Nielsen, J. S.

    2014-01-01

    The layered nature of Additively Manufactured parts, specifically those produced by the Fused Deposition Modelling (FDM) process, exhibits a distinct surface definition. This originates from the 2.5D deposition scheme, which leaves the seam between the individual layers clearly visible.[1...

  20. Convergence of Health Level Seven Version 2 Messages to Semantic Web Technologies for Software-Intensive Systems in Telemedicine Trauma Care.

    Science.gov (United States)

    Menezes, Pedro Monteiro; Cook, Timothy Wayne; Cavalini, Luciana Tricai

    2016-01-01

    To present the technical background and the development of a procedure that enriches the semantics of Health Level Seven version 2 (HL7v2) messages for software-intensive systems in telemedicine trauma care. This study followed a multilevel model-driven approach for the development of semantically interoperable health information systems. The Pre-Hospital Trauma Life Support (PHTLS) ABCDE protocol was adopted as the use case. A prototype application embedded the semantics into an HL7v2 message as an eXtensible Markup Language (XML) file, which was validated against an XML schema that defines constraints on a common reference model. This message was exchanged with a second prototype application, developed on the Mirth middleware, which was also used to parse and validate both the original and the hybrid messages. Both versions of the data instance (one pure XML, one embedded in the HL7v2 message) were equally validated, and the RDF-based semantics were recovered by the receiving side of the prototype from the shared XML schema. This study demonstrated the semantic enrichment of HL7v2 messages for software-intensive telemedicine systems for trauma care by validating components of extracts generated in various computing environments. The adoption of the method proposed in this study preserves compliance with the HL7v2 standard while adopting Semantic Web technologies.
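
    The embedding idea can be sketched generically: serialize a semantically annotated XML fragment and carry it inside an HL7v2 segment, so a standard HL7 engine such as Mirth can still route and parse the message. The Python below is illustrative; the segment layout, field choices and identifiers are assumptions, not the authors' exact message profile.

        import xml.etree.ElementTree as ET

        # A semantically annotated XML fragment for the PHTLS ABCDE use case.
        vitals = ET.Element("assessment", protocol="PHTLS-ABCDE")
        ET.SubElement(vitals, "airway", status="patent")
        ET.SubElement(vitals, "breathing", rate="18")
        payload = ET.tostring(vitals, encoding="unicode")

        segments = [
            "MSH|^~\\&|TRAUMA_APP|FIELD|HOSPITAL|ED|202301011200||ORU^R01|00001|P|2.5",
            "PID|1||12345^^^HOSP^MR",
            f"OBX|1|ED|PHTLS^ABCDE-assessment||{payload}|||F",
        ]
        message = "\r".join(segments)    # HL7v2 separates segments with carriage returns

        # Receiving side: pull the XML back out of OBX-5 and recover the semantics.
        obx_value = message.split("\r")[2].split("|")[5]
        print(ET.fromstring(obx_value).attrib)       # {'protocol': 'PHTLS-ABCDE'}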

  1. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

    Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting-edge software application. Traditional forms of technical education pay little attention to creativity - often encouraging overly

  2. Effects of Boundary Layer Height on the Model of Ground-Level PM2.5 Concentrations from AOD: Comparison of Stable and Convective Boundary Layer Heights from Different Methods

    Directory of Open Access Journals (Sweden)

    Zengliang Zang

    2017-06-01

    Full Text Available The aerosol optical depth (AOD) from satellite or ground-based sun photometer spectral observations has been widely used to estimate ground-level PM2.5 concentrations by regression methods. The boundary layer height (BLH) is a popular factor in regression models of AOD and PM2.5, but its effect is often uncertain. This may result from the structural differences between stable and convective boundary layers and from the calculation methods of the BLH. In this study, the boundary layer is divided into two types, stable and convective, and the BLH is calculated using different methods from radiosonde data and National Centers for Environmental Prediction (NCEP) reanalysis data for a station in Beijing, China during 2014-2015. The BLH values from these methods show significant differences for both the stable and convective boundary layer. These BLHs were then introduced into the AOD-PM2.5 regression model to seek the optimal BLH for each of the two boundary layer types. It was found that the optimal BLH for the stable boundary layer is determined using the surface-based inversion method, and the optimal BLH for the convective layer is determined using the elevated inversion method. Finally, the optimal BLH and other meteorological parameters were combined to predict the PM2.5 concentrations using the stepwise regression method. The results indicate that for the stable boundary layer, the optimal stepwise regression model includes the factors of surface relative humidity, BLH, and surface temperature. These three factors significantly enhance the prediction accuracy of ground-level PM2.5 concentrations, with an increase of the determination coefficient from 0.50 to 0.68. For the convective boundary layer, however, the optimal stepwise regression model includes the factors of BLH and surface wind speed. These two factors improve the determination coefficient, with a relatively low increase from 0.65 to 0.70. It is found that the
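
    The kind of regression described above is straightforward to sketch on synthetic data. In the Python below, the generating coefficients are invented, not the paper's: PM2.5 is predicted from AOD plus the factors retained for the stable boundary layer (surface relative humidity, BLH and surface temperature), and the determination coefficient R^2 is computed from the fit.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 500
        aod = rng.uniform(0.1, 1.5, n)
        rh = rng.uniform(20.0, 90.0, n)        # surface relative humidity, %
        blh = rng.uniform(100.0, 1500.0, n)    # boundary layer height, m
        temp = rng.uniform(260.0, 300.0, n)    # surface temperature, K
        pm25 = 80 * aod + 0.5 * rh - 0.03 * blh + 0.2 * temp + rng.normal(0, 10, n)

        # Ordinary least squares with an intercept column.
        X = np.column_stack([np.ones(n), aod, rh, blh, temp])
        beta, *_ = np.linalg.lstsq(X, pm25, rcond=None)
        pred = X @ beta
        r2 = 1.0 - np.sum((pm25 - pred) ** 2) / np.sum((pm25 - pm25.mean()) ** 2)
        print("coefficients:", np.round(beta, 3), "R^2:", round(r2, 2))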

  3. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  4. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    Full Text Available In this paper, the definition and the key features of Software Radar, which is a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar and Intelligent Radar, and the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open architecture that decouples application software from the low-level hardware makes it easy to adopt a "user-requirements-oriented" development methodology instead of the traditional "specific-function-oriented" one. Compared with traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, which has been developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are given in the conclusion.

  5. MZDASoft: a software architecture that enables large-scale comparison of protein expression levels over multiple samples based on liquid chromatography/tandem mass spectrometry.

    Science.gov (United States)

    Ghanat Bari, Mehrab; Ramirez, Nelson; Wang, Zhiwei; Zhang, Jianqiu Michelle

    2015-10-15

    Without accurate peak linking/alignment, only the expression levels of a small percentage of proteins can be compared across multiple samples in Liquid Chromatography/Mass Spectrometry/Tandem Mass Spectrometry (LC/MS/MS) due to the selective nature of tandem MS peptide identification. This greatly hampers biomedical research that aims at finding biomarkers for disease diagnosis, treatment, and the understanding of disease mechanisms. A recent algorithm, PeakLink, has allowed the accurate linking of LC/MS peaks without tandem MS identifications to their corresponding ones with identifications across multiple samples collected from different instruments, tissues and labs, which greatly enhanced the ability of comparing proteins. However, PeakLink cannot be implemented practically for large numbers of samples based on existing software architectures, because it requires access to peak elution profiles from multiple LC/MS/MS samples simultaneously. We propose a new architecture based on parallel processing, which extracts LC/MS peak features, and saves them in database files to enable the implementation of PeakLink for multiple samples. The software has been deployed in High-Performance Computing (HPC) environments. The core part of the software, MZDASoft Parallel Peak Extractor (PPE), can be downloaded with a user and developer's guide, and it can be run on HPC centers directly. The quantification applications, MZDASoft TandemQuant and MZDASoft PeakLink, are written in Matlab, which are compiled with a Matlab runtime compiler. A sample script that incorporates all necessary processing steps of MZDASoft for LC/MS/MS quantification in a parallel processing environment is available. The project webpage is http://compgenomics.utsa.edu/zgroup/MZDASoft. The proposed architecture enables the implementation of PeakLink for multiple samples. Significantly more (100%-500%) proteins can be compared over multiple samples with better quantification accuracy in test cases. MZDASoft
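
    The per-run extraction architecture can be sketched compactly. The Python below is an architectural illustration only (file names, the table schema and the stub feature list are assumptions, not MZDASoft's actual format): peak features are extracted from each LC/MS run independently and persisted to a per-run database file, so a later linking step such as PeakLink can compare many runs without holding all raw elution profiles in memory at once.

        import sqlite3
        from concurrent.futures import ProcessPoolExecutor

        def extract_features(run_id):
            """Stand-in for real peak detection over one LC/MS run."""
            return [(run_id, 500.3, 12.1, 1.2e4), (run_id, 622.0, 30.5, 8.0e3)]

        def process_run(run_id):
            con = sqlite3.connect(f"features_{run_id}.db")   # one DB file per run
            con.execute("CREATE TABLE IF NOT EXISTS peaks"
                        "(run TEXT, mz REAL, rt REAL, intensity REAL)")
            con.executemany("INSERT INTO peaks VALUES (?,?,?,?)",
                            extract_features(run_id))
            con.commit()
            con.close()
            return run_id

        if __name__ == "__main__":
            with ProcessPoolExecutor() as pool:              # one run per worker
                for done in pool.map(process_run, ["sampleA", "sampleB"]):
                    print("extracted:", done)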

  6. An Accelerator Control Middle Layer Using MATLAB

    International Nuclear Information System (INIS)

    Portmann, Gregory J.; Corbett, Jeff; Terebilo, Andrei

    2005-01-01

    Matlab is a matrix manipulation language originally developed to be a convenient language for using the LINPACK and EISPACK libraries. What makes Matlab so appealing for accelerator physics is the combination of a matrix oriented programming language, an active workspace for system variables, powerful graphics capability, built-in math libraries, and platform independence. A number of software toolboxes for accelerators have been written in Matlab--the Accelerator Toolbox (AT) for machine simulations, LOCO for accelerator calibration, Matlab Channel Access Toolbox (MCA) for EPICS connections, and the Middle Layer. This paper will describe the "middle layer" software toolbox that resides between the high-level control applications and the low-level accelerator control system. This software was a collaborative effort between ALS (LBNL) and SPEAR3 (SSRL) but easily ports to other machines. Five accelerators presently use this software. The high-level Middle Layer functionality includes energy ramp, configuration control (save/restore), global orbit correction, local photon beam steering, insertion device compensation, beam-based alignment, tune correction, response matrix measurement, and script-based programs for machine physics studies
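
    The layering idea is the essential point. The Middle Layer itself is Matlab code; the Python pseudo-interface below only sketches the pattern with invented names: physics applications address magnets by family and device number, and the middle layer translates that to low-level control-system channel access.

        CHANNEL_MAP = {                    # (family, device) -> control channel
            ("HCM", 1): "SR01C:HCM1:CURRENT",
            ("HCM", 2): "SR01C:HCM2:CURRENT",
        }

        def caput(channel, value):
            """Stand-in for an EPICS channel-access put."""
            print(f"caput {channel} = {value}")

        def setpv(family, device, value):
            """High-level interface used by physics applications."""
            caput(CHANNEL_MAP[(family, device)], value)

        # An orbit-correction script works in physics terms, not channel names:
        setpv("HCM", 1, 0.35)
        setpv("HCM", 2, -0.12)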

  7. The Effects of Dynamic Geometry Software and Physical Manipulatives on Pre-Service Primary Teachers’ Van Hiele Levels and Spatial Abilities

    Directory of Open Access Journals (Sweden)

    Fatih Karakuş

    2015-12-01

Full Text Available The purpose of the study was to compare the influence of dynamic geometry software activities and the influence of physical manipulatives and drawing activities on the spatial ability and van Hiele levels of pre-service primary school teachers in a geometry course. A quasi-experimental design was used in the study. The participants were 61 pre-service primary teachers in the second year of their undergraduate program in the Department of Elementary Education at Afyon Kocatepe University. A total of 32 pre-service teachers (computer group) were trained in the dynamic geometry based activities and 29 pre-service teachers (physical-drawing group) were trained in the physical manipulative and drawing based activities. The van Hiele Geometry Test was used to determine the two groups' geometric thinking levels, and the Purdue Spatial Visualization Test to determine their spatial ability, both administered as pre-test and post-test. The results of the study showed that there was no difference between the two groups on the post-test with respect to van Hiele levels and spatial abilities. Moreover, both groups showed significantly higher achievement on the post-test compared to the pre-test.

  8. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text - now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  9. Hardening Software Defined Networks

    Science.gov (United States)

    2014-07-01

…load balancers, traffic-shapers, and so on. SDN brings software and processing power to bear on all this complexity. While a large Data Center may be

  10. Addressing Software Security

    Science.gov (United States)

    Bailey, Brandon

    2015-01-01

Historically, security within organizations was thought of as an IT function (web sites/servers, email, workstation patching, etc.). The threat landscape has evolved (Script Kiddies, Hackers, Advanced Persistent Threats (APT), Nation States, etc.), and the attack surface has expanded - networks are interconnected! Some security posture factors: Network Layer (Routers, Firewalls, etc.); Computer Network Defense (IPS/IDS, Sensors, Continuous Monitoring, etc.); Industrial Control Systems (ICS); Software Security (COTS, FOSS, Custom, etc.)

  11. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  12. Software Reviews.

    Science.gov (United States)

    Davis, Shelly J., Ed.; Knaupp, Jon, Ed.

    1984-01-01

    Reviewed is computer software on: (1) classification of living things, a tutorial program for grades 5-10; and (2) polynomial practice using tiles, a drill-and-practice program for algebra students. (MNS)

  13. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Three pieces of computer software are described and reviewed: HyperCard, to build and use varied applications; Iggy's Gnees, for problem solving with shapes in grades kindergarten-two; and Algebra Shop, for practicing skills and problem solving. (MNS)

  14. The calculation of rare-earth levels in layered cobaltates Rx/3CoO2 (x≤1)

    Czech Academy of Sciences Publication Activity Database

    Novák, Pavel; Knížek, Karel; Jirák, Zdeněk; Buršík, Josef

    2015-01-01

Roč. 381, May (2015), s. 145-150 ISSN 0304-8853 R&D Projects: GA ČR GA13-03708S Institutional support: RVO:68378271; RVO:61388980 Keywords: rare-earth electronic levels * crystal field splitting * layered cobaltates Subject RIV: BM - Solid Matter Physics; Magnetism Impact factor: 2.357, year: 2015

  15. Entropy based software processes improvement

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Kriek, D.; Siemons, P.

    2009-01-01

    Actual results of software process improvement projects show different levels of success. Although many software development organisations have adopted improvement models such as CMMI, it appears to be difficult to improve software development processes in the right way, e.g. tuned to the actual

  16. Software Quality Assurance Audits Guidebooks

    Science.gov (United States)

    1990-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes that are used in software development. The Software Assurance Guidebook, NASA-GB-A201, issued in September, 1989, provides an overall picture of the NASA concepts and practices in software assurance. Second level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the second level Software Quality Assurance Audits Guidebook that describes software quality assurance audits in a way that is compatible with practices at NASA Centers.

  17. High serotonin levels during brain development alter the structural input-output connectivity of neural networks in the rat somatosensory layer IV

    Directory of Open Access Journals (Sweden)

Stéphanie Miceli

    2013-06-01

Full Text Available Homeostatic regulation of serotonin (5-HT) concentration is critical for normal topographical organization and development of thalamocortical (TC) afferent circuits. Down-regulation of the serotonin transporter (SERT) and the consequent impaired reuptake of 5-HT at the synapse results in a reduced terminal branching of developing TC afferents within the primary somatosensory cortex (S1). Despite the presence of multiple genetic models, the effect of high extracellular 5-HT levels on the structure and function of developing intracortical neural networks is far from being understood. Here, using juvenile SERT knockout (SERT-/-) rats we investigated, in vitro, the effect of increased 5-HT levels on the structural organization of (i) the thalamocortical projections of the ventroposteromedial thalamic nucleus towards S1, (ii) the general barrel-field pattern and (iii) the electrophysiological and morphological properties of the excitatory cell population in layer IV of S1 (spiny stellate and pyramidal cells). Our results confirmed previous findings that high levels of 5-HT during development lead to a reduction of the topographical precision of TCA projections towards the barrel cortex. Also, the barrel pattern was altered but not abolished in SERT-/- rats. In layer IV, both excitatory spiny stellate and pyramidal cells showed a significantly reduced intracolumnar organization of their axonal projections. In addition, the layer IV spiny stellate cells gave rise to a prominent projection towards the infragranular layer Vb. Our findings point to a structural and functional reorganization of TCAs, as well as of early-stage intracortical microcircuitry, following the disruption of 5-HT reuptake during critical developmental periods. The increased projection pattern of the layer IV neurons suggests that the intracortical network changes are not limited to the main entry layer IV but may also affect the subsequent stages of the canonical circuits of the barrel

  18. Modern software cybernetics: new trends

    OpenAIRE

    Yang, H; Chen, F; Aliyu, S

    2017-01-01

Software cybernetics research applies a variety of techniques from cybernetics to software engineering research. For more than fifteen years since 2001, there has been a dramatic increase in work relating to software cybernetics. From the cybernetics viewpoint, the work is mainly on the first-order level, namely, the software under obs...

  19. Next generation software process improvement

    OpenAIRE

    Turnas, Daniel

    2003-01-01

    Approved for public release; distribution is unlimited Software is often developed under a process that can at best be described as ad hoc. While it is possible to develop quality software under an ad hoc process, formal processes can be developed to help increase the overall quality of the software under development. The application of these processes allows for an organization to mature. The software maturity level, and process improvement, of an organization can be measured with the Cap...

  20. Characterization of leached surface layers on simulated high-level waste glasses by sputter-induced optical emission

    International Nuclear Information System (INIS)

    Houser, C.; Tsong, I.S.T.; White, W.B.

    1979-01-01

    The leaching process in simulated waste encapsulant glasses was studied by measuring the compositional depth-profiles of H (from water), the glass framework formers Si and B, the alkalis Na and Cs, the alkaline earths Ca and Sr, the transition metals Mo and Fe, the rare-earths La, Ce, and Nd, using the technique of sputter-induced optical emission. The leaching process of these glasses is highly complex. In addition to alkali/hydrogen exchange, there is breakdown of the glass framework, build-up of barrier layers on the surface, and formation of layered reaction zones of distinctly different chemistry all within the outer micrometer of the glass

  1. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performance. This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. We discuss these issues along with use case scenarios, and aim to identify challenges...

  2. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  3. Evaluation of the effect of radiation levels and dose rates in irradiation of murine fibroblasts used as a feeder layer in the culture of human keratinocytes

    International Nuclear Information System (INIS)

    Yoshito, Daniele; Almeida, Tiago L.; Santin, Stefany Plumeri; Somessari, Elizabeth S.R.; Silveira, Carlos G. da; Mathor, Monica B.; Altran, Silvana C.; Isaac, Cesar

    2009-01-01

In 1975, Rheinwald and Green published an effective methodology for obtaining and cultivating human keratinocytes. This methodology consisted of seeding keratinocytes onto a feeder layer composed of lineage 3T3 murine fibroblasts, the proliferation rate of which is then controlled through the action of ionizing radiation. The presence of the feeder layer encourages the development of keratinocyte colonies and their propagation in similar cultures, making possible several clinical applications such as skin substitutes or wound dressings in situations such as post-burn extensive skin loss and other skin disorders. However, good development of these keratinocytes depends, among other factors, on a high quality feeder layer. In the present work, we evaluated the relationship between radiation levels and dose rates applied to fibroblasts used in construction of feeder layers and the radiation effect on keratinocyte colony-forming efficiency. Results indicate 3T3 lineage murine fibroblasts irradiated with doses varying between 60 and 100 Gy can be used as a feeder layer immediately after irradiation, or after storage of the irradiated cells in suspension at 4 °C for 24 hours, with similar results. The exception is when the irradiation dose rate is 2.75 Gy/h; in this case, results suggested that the fibroblasts should be used immediately after irradiation. (author)

  4. Software Authentication

    International Nuclear Information System (INIS)

    Wolford, J.K.; Geelhood, B.D.; Hamilton, V.A.; Ingraham, J.; MacArthur, D.W.; Mitchell, D.J.; Mullens, J.A.; Vanier, P. E.; White, G.K.; Whiteson, R.

    2001-01-01

    The effort to define guidance for authentication of software for arms control and nuclear material transparency measurements draws on a variety of disciplines and has involved synthesizing established criteria and practices with newer methods. Challenges include the need to protect classified information that the software manipulates as well as deal with the rapid pace of innovation in the technology of nuclear material monitoring. The resulting guidance will shape the design of future systems and inform the process of authentication of instruments now being developed. This paper explores the technical issues underlying the guidance and presents its major tenets

  5. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  6. The virtual double-slit experiment at high school level (Part I): classical behavior analysis (with bullets and waves) and development of computational software

    Directory of Open Access Journals (Sweden)

    Danilo Cardoso Ferreira

    2016-09-01

Full Text Available http://dx.doi.org/10.5007/2175-7941.2016v33n2p697 This paper analyses the virtual double-slit experiment and is composed of two parts: Part I covers the classical theory (with bullets and waves) and Part II covers the interference with electrons or photons. First, we analyze an experiment with a gun that shoots a stream of bullets. In front of the gun is a wall with two holes in it, each just big enough to let a bullet through. Beyond the wall is a backstop (say, a thick wall of wood) which absorbs the bullets when they hit it. In this case, the probabilities just add together: the effect with both holes open is the sum of the effects with each hole open alone. We present this analysis at the high school level. Next, we analyze the same experiment with water waves. The intensity observed when both holes are open is certainly not the sum of the intensity of the wave from hole 1 (found by measuring when hole 2 is blocked off) and the intensity of the wave from hole 2 (seen when hole 1 is blocked). Finally, we present software developed by students for the double-slit experiment with bullets.
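The contrast between the two classical cases is easy to verify numerically. A minimal sketch (geometry and units arbitrary, assuming point sources at the two slits):

```python
# Bullets: probabilities add. Waves: amplitudes add, so intensities do not.
import numpy as np

x = np.linspace(-5, 5, 1001)     # positions along the backstop
d, k, L = 1.0, 10.0, 3.0         # slit half-separation, wavenumber, distance

def amplitude(x, x_slit):
    """Cylindrical-wave amplitude at the backstop from one slit."""
    r = np.hypot(x - x_slit, L)
    return np.exp(1j * k * r) / np.sqrt(r)

h1, h2 = amplitude(x, -d), amplitude(x, +d)
I1, I2 = np.abs(h1) ** 2, np.abs(h2) ** 2

P_bullets = I1 + I2              # both holes open: effects simply add
I_waves = np.abs(h1 + h2) ** 2   # interference term makes this != I1 + I2

# At the centre the two paths are equal, so the waves give 4*I1, not 2*I1:
print(P_bullets[500], I_waves[500])
```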

  7. Calculation Software versus Illustration Software for Teaching Statistics

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl; Boyle, Robin G.

    1999-01-01

As personal computers have become more and more powerful, so have the software packages available to us for teaching statistics. This paper investigates what software packages are currently being used by progressive statistics instructors at university level, examines some of the deficiencies of such software, and indicates features that statistics instructors wish to have incorporated in software in the future. The basis of the paper is a survey of participants at ICOTS-5 (the Fifth International Conference on Teaching Statistics). These survey results, combined with the software based papers...

  8. Reviews, Software.

    Science.gov (United States)

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  9. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game…

  10. Software Reviews.

    Science.gov (United States)

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  11. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    is automatically generated. Furthermore, MIAWARE software is accompanied with an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through the specially developed ontology. As a result...

  12. Software quality engineering a practitioner's approach

    CERN Document Server

    Suryn, Witold

    2014-01-01

    Software quality stems from two distinctive, but associated, topics in software engineering: software functional quality and software structural quality. Software Quality Engineering studies the tenets of both of these notions, which focus on the efficiency and value of a design, respectively. The text addresses engineering quality on both the application and system levels with attention to Information Systems and Embedded Systems as well as recent developments. Targeted at graduate engineering students and software quality specialists, the book analyzes the relationship between functionality

  13. In vitro porcine blood-brain barrier model for permeability studies: pCEL-X software pKa(FLUX) method for aqueous boundary layer correction and detailed data analysis.

    Science.gov (United States)

    Yusof, Siti R; Avdeef, Alex; Abbott, N Joan

    2014-12-18

In vitro blood-brain barrier (BBB) models from primary brain endothelial cells can closely resemble the in vivo BBB, offering valuable models to assay BBB functions and to screen potential central nervous system drugs. We have recently developed an in vitro BBB model using primary porcine brain endothelial cells. The model shows expression of tight junction proteins and high transendothelial electrical resistance, evidence for a restrictive paracellular pathway. Validation studies using small drug-like compounds demonstrated functional uptake and efflux transporters, showing the suitability of the model to assay drug permeability. However, one limitation of in vitro model permeability measurement is the presence of the aqueous boundary layer (ABL) resulting from inefficient stirring during the permeability assay. The ABL can be a rate-limiting step in permeation, particularly for lipophilic compounds, causing underestimation of the permeability. If the ABL effect is ignored, the permeability measured in vitro will not reflect the permeability in vivo. To address the issue, we explored the combination of in vitro permeability measurement using our porcine model with the pKa(FLUX) method in pCEL-X software to correct for the ABL effect and allow a detailed analysis of in vitro (transendothelial) permeability data, Papp. Published Papp using porcine models generated by our group and other groups are also analyzed. From the Papp, intrinsic transcellular permeability (P0) is derived by simultaneous refinement using a weighted nonlinear regression, taking into account permeability through the ABL, paracellular permeability and filter restrictions on permeation. The in vitro P0 derived for 22 compounds (35 measurements) showed good correlation with P0 derived from in situ brain perfusion data (r² = 0.61). The analysis also gave evidence for carrier-mediated uptake of naloxone, propranolol and vinblastine. The combination of the in vitro porcine model and the software
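The series-resistance idea behind the ABL correction can be sketched as a small fit. The functional form below (a monoprotic weak base, with the ABL and transcellular pathways in series) and all numbers are illustrative; the actual pCEL-X refinement is weighted and also models filter and paracellular contributions:

```python
# Fit intrinsic permeability P0 by separating the pH-independent ABL
# resistance from the pH-dependent transcellular resistance (in series).
import numpy as np
from scipy.optimize import curve_fit

def papp_model(pH, log_P0, log_P_abl, pKa):
    P0 = 10.0 ** log_P0                            # neutral-form permeability
    P_abl = 10.0 ** log_P_abl                      # aqueous boundary layer
    f_neutral = 1.0 / (1.0 + 10.0 ** (pKa - pH))   # weak base: neutral fraction
    return 1.0 / (1.0 / P_abl + 1.0 / (P0 * f_neutral))

# Hypothetical Papp measurements (cm/s) across a pH gradient
pH = np.array([5.0, 5.5, 6.0, 6.5, 7.0, 7.4, 8.0])
papp = np.array([2.1e-6, 6.3e-6, 1.8e-5, 4.2e-5, 6.9e-5, 8.1e-5, 8.8e-5])

popt, _ = curve_fit(papp_model, pH, papp, p0=[-3.5, -4.0, 7.0])
log_P0, log_P_abl, pKa = popt
print(f"log P0 = {log_P0:.2f}, log P_abl = {log_P_abl:.2f}, pKa = {pKa:.2f}")
```

At high pH the measured Papp plateaus at the ABL limit; that plateau is exactly why ignoring the ABL underestimates P0 for lipophilic compounds.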

  14. UTM TCL2 Software Requirements

    Science.gov (United States)

    Smith, Irene S.; Rios, Joseph L.; McGuirk, Patrick O.; Mulfinger, Daniel G.; Venkatesan, Priya; Smith, David R.; Baskaran, Vijayakumar; Wang, Leo

    2017-01-01

    The Unmanned Aircraft Systems (UAS) Traffic Management (UTM) Technical Capability Level (TCL) 2 software implements the UTM TCL 2 software requirements described herein. These software requirements are linked to the higher level UTM TCL 2 System Requirements. Each successive TCL implements additional UTM functionality, enabling additional use cases. TCL 2 demonstrated how to enable expanded multiple operations by implementing automation for beyond visual line-of-sight, tracking operations, and operations flying over sparsely populated areas.

  15. Computer Program Re-layers Engineering Drawings

    Science.gov (United States)

    Crosby, Dewey C., III

    1990-01-01

    RULCHK computer program aids in structuring layers of information pertaining to part or assembly designed with software described in article "Software for Drawing Design Details Concurrently" (MFS-28444). Checks and optionally updates structure of layers for part. Enables designer to construct model and annotate its documentation without burden of manually layering part to conform to standards at design time.

  16. Secure software development training course

    Directory of Open Access Journals (Sweden)

    Victor S. Gorbatov

    2017-06-01

Full Text Available Information security is one of the most important criteria for the quality of developed software. To obtain a sufficient level of application security, companies integrate a security process into the software development life cycle. At this stage, software companies encounter a shortage of employees able to solve problems of software design, implementation and application security. This article provides a description of the secure software development training course. The training course on application security is designed for co-education of students of different IT specializations.

  17. MITRE sensor layer prototype

    Science.gov (United States)

    Duff, Francis; McGarry, Donald; Zasada, David; Foote, Scott

    2009-05-01

The MITRE Sensor Layer Prototype is an initial design effort to enable every sensor to help create new capabilities through collaborative data sharing. By making both upstream (raw) and downstream (processed) sensor data visible, users can access the specific level, type, and quantities of data needed to create new data products that were never anticipated by the original designers of the individual sensors. The major characteristic that sets sensor data services apart from typical enterprise services is the volume (on the order of multiple terabytes) of raw data that can be generated by most sensors. Traditional tightly coupled processing approaches extract pre-determined information from the incoming raw sensor data, format it, and send it to predetermined users. The community is rapidly reaching the conclusion that tightly coupled sensor processing loses too much potentially critical information [1]. Hence upstream (raw and partially processed) data must be extracted, rapidly archived, and advertised to the enterprise for unanticipated uses. The authors believe layered sensing net-centric integration can be achieved through a standardize-encapsulate-syndicate-aggregate-manipulate-process paradigm. The Sensor Layer Prototype's technical approach focuses on implementing this proof of concept framework to make sensor data visible, accessible and useful to the enterprise. To achieve this, a "raw" data tap between physical transducers associated with sensor arrays and the embedded sensor signal processing hardware and software has been exploited. Second, we encapsulate and expose both raw and partially processed data to the enterprise within the context of a service-oriented architecture. Third, we advertise the presence of multiple types, and multiple layers of data through geographic-enabled Really Simple Syndication (GeoRSS) services. These GeoRSS feeds are aggregated, manipulated, and filtered by a feed aggregator. After filtering these feeds to bring just the type
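The syndicate-aggregate-filter portion of the paradigm is compact enough to sketch. The feed URLs and bounding box below are placeholders (not real MITRE services), only GeoRSS-Simple points are handled, and real feeds would need error handling:

```python
# Pull GeoRSS-Simple feeds, merge their items, keep points inside a bounding box.
import urllib.request
import xml.etree.ElementTree as ET

GEORSS_POINT = "{http://www.georss.org/georss}point"
FEEDS = ["http://example.org/sensorA.rss", "http://example.org/sensorB.rss"]
BBOX = (30.0, -90.0, 35.0, -80.0)   # lat_min, lon_min, lat_max, lon_max

def in_bbox(lat, lon):
    return BBOX[0] <= lat <= BBOX[2] and BBOX[1] <= lon <= BBOX[3]

aggregated = []
for url in FEEDS:
    with urllib.request.urlopen(url) as feed:     # fetch one syndicated feed
        root = ET.parse(feed).getroot()
    for item in root.iter("item"):                # RSS 2.0 items
        point = item.findtext(GEORSS_POINT)       # "<lat> <lon>" when present
        title = item.findtext("title", default="")
        if point:
            lat, lon = map(float, point.split())
            if in_bbox(lat, lon):
                aggregated.append((title, lat, lon))
```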

  18. Software Process Improvement: Supporting the Linking of the Software and the Business Strategies

    Science.gov (United States)

    Albuquerque, Adriano Bessa; Rocha, Ana Regina; Lima, Andreia Cavalcanti

The market is becoming more and more competitive; many products and services depend on the software product, and software is one of the most important assets influencing organizations' businesses. Considering this context, we can observe that companies must deal with software, whether developing or acquiring it, carefully. One perspective that can help to take advantage of software, effectively supporting the business, is to invest in the organization's software processes. This paper presents an approach to evaluate and improve the process assets of software organizations, based on internationally well-known standards and process models. This approach is supported by automated tools from the TABA Workstation and is part of a wider improvement strategy constituted of three layers (organizational layer, process execution layer and external entity layer). Moreover, this paper presents the experience of its use and the results.

  19. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01

MANAGEMENT INFORMATION, COMMUNICATIONS, AND COMPUTER SCIENCES (AIRMICS). Software Tools for Software Maintenance (ASQBG-1-89-001), October 1988. The recoverable portion of the scanned report lists maintenance tools, including a Basic program analyser, C-Tractr for C, a Cobol structuring facility for VS Cobol II, F-Scan for Fortran, and a static code analyser for Fortran.

  20. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

The EPIQR method is supported by a multimedia computer program. Several modules help users of the method to process the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and prepare quality reports. This article presents the structure and the main features of the software. (au)

  1. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, communication switchboards). Extensive knowledge and practical experience of digital long-term preservation technologies have also been acquired. This wide spectrum of activities puts us in the position to discuss an often overlooked aspect of digital preservation - the preservation of software programs. There are many resources dedicated to the digital preservation of data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - dynamic response to inputs - make computer programs rich compared to documents or linear multimedia. The article opens the questions that stand at the beginning of the road to permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects are covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...), where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  2. Self-Contained Cross-Cutting Pipeline Software Architecture

    OpenAIRE

    Patwardhan, Amol; Patwardhan, Rahul; Vartak, Sumalini

    2016-01-01

Layered software architecture contains several intra-layer and inter-layer dependencies. Each layer depends on shared components, making it difficult to release a code change, bug fix or feature without exhaustive testing and without having to build the entire software code base. This paper proposes a self-contained, cross-cutting pipeline architecture (SCPA) that is independent of the existing layers. We chose 2 open source projects and 3 internal intern projects that used n-tier architecture and applied t...
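The key claim in the (truncated) abstract is that a feature-scoped pipeline owns its stages end to end instead of reaching into shared layers. A toy rendering of that idea, with invented names rather than the paper's code:

```python
# One feature = one self-contained pipeline: it owns all of its stages
# (validation, business rule, persistence) and references no shared layer,
# so it can be released without rebuilding the whole code base.
from typing import Callable, List

Stage = Callable[[dict], dict]

class Pipeline:
    def __init__(self, stages: List[Stage]):
        self.stages = stages

    def run(self, request: dict) -> dict:
        for stage in self.stages:     # each stage transforms the request
            request = stage(request)
        return request

signup = Pipeline([lambda r: {**r, "valid": True},    # validation stage
                   lambda r: {**r, "stored": True}])  # persistence stage
print(signup.run({"user": "ada"}))
```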

  3. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Liddle, P.

    2006-01-01

Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM XS (TXS) system is classified 1E, as defined in the Inst. of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  4. Belle II Software

    International Nuclear Information System (INIS)

    Kuhr, T; Ritter, M

    2016-01-01

    Belle II is a next generation B factory experiment that will collect 50 times more data than its predecessor, Belle. The higher luminosity at the SuperKEKB accelerator leads to higher background levels and requires a major upgrade of the detector. As a consequence, the simulation, reconstruction, and analysis software must also be upgraded substantially. Most of the software has been redesigned from scratch, taking into account the experience from Belle and other experiments and utilizing new technologies. The large amount of experimental and simulated data requires a high level of reliability and reproducibility, even in parallel environments. Several technologies, tools, and organizational measures are employed to evaluate and monitor the performance of the software during development. (paper)

  5. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  6. New ATLAS Software & Computing Organization

    CERN Multimedia

    Barberis, D

Following the election by the ATLAS Collaboration Board of Dario Barberis (Genoa University/INFN) as Computing Coordinator and David Quarrie (LBNL) as Software Project Leader, it was considered necessary to modify the organization of the ATLAS Software & Computing ("S&C") project. The new organization is based upon the following principles: separation of the responsibilities for computing management from those of software development, with the appointment of a Computing Coordinator and a Software Project Leader who are both members of the Executive Board; hierarchical structure of responsibilities and reporting lines; coordination at all levels between TDAQ, S&C and Physics working groups; integration of the subdetector software development groups with the central S&C organization. A schematic diagram of the new organization can be seen in Fig. 1 (Figure 1: new ATLAS Software & Computing organization). Two Management Boards will help the Computing Coordinator and the Software Project...

  7. Modernization of software quality assurance

    Science.gov (United States)

    Bhaumik, Gokul

    1988-01-01

The customer's satisfaction depends not only on functional performance; it also depends on the quality characteristics of the software products. An examination of this quality aspect of software products will provide a clear, well-defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following aspects, which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition for a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree of an identifiable set of quality attributes present in all software systems and their products. This paper is an explanation of how this definition was developed and how it is used.

  8. Software Prototyping

    Science.gov (United States)

    Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R.

    2016-01-01

Background: Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. Objective: To describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Methods: Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Results: Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included having 1) subject-matter experts heavily involved with the design; 2) flexibility to make rapid changes; 3) the ability to minimize software development efforts early in the design stage; 4) rapid finalization of requirements; 5) early visualization of designs; and 6) a powerful vehicle for communication of the design to the programmers. Challenges included 1) time and effort to develop the prototypes and case scenarios; 2) no simulation of system performance; 3) not having all proposed functionality available in the final product; and 4) missing needed data elements in the PCC information system. PMID:27081404

  9. Software Safety and Security

    CERN Document Server

    Nipkow, T; Hauptmann, B

    2012-01-01

    Recent decades have seen major advances in methods and tools for checking the safety and security of software systems. Automatic tools can now detect security flaws not only in programs of the order of a million lines of code, but also in high-level protocol descriptions. There has also been something of a breakthrough in the area of operating system verification. This book presents the lectures from the NATO Advanced Study Institute on Tools for Analysis and Verification of Software Safety and Security; a summer school held at Bayrischzell, Germany, in 2011. This Advanced Study Institute was

  10. Accelerating Software Development through Agile Practices--A Case Study of a Small-Scale, Time-Intensive Web Development Project at a College-Level IT Competition

    Science.gov (United States)

    Zhang, Xuesong; Dorn, Bradley

    2012-01-01

    Agile development has received increasing interest both in industry and academia due to its benefits in developing software quickly, meeting customer needs, and keeping pace with the rapidly changing requirements. However, agile practices and scrum in particular have been mainly tested in mid- to large-size projects. In this paper, we present…

  11. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  12. Bicriterial Optimization of Software

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2006-01-01

Full Text Available Two optimization criteria for software analysis are defined. For each criterion, solutions are defined in order to reach a minimum level. The effects of pursuing one objective over the other are analyzed. An aggregate function is developed, for which the composed level of the two criteria is determined. Based on this value, the optimum solution is selected.
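The aggregation step can be made concrete with a toy example; the criteria, weights and values below are invented, not taken from the paper:

```python
# Two normalized criteria (say, memory footprint c1 and response time c2,
# lower is better) are combined into one composed level per software variant.
variants = {"v1": (0.8, 0.3), "v2": (0.5, 0.6), "v3": (0.4, 0.9)}
w1, w2 = 0.6, 0.4                       # relative importance of the criteria

composed = {name: w1 * c1 + w2 * c2 for name, (c1, c2) in variants.items()}
best = min(composed, key=composed.get)  # optimum under the composed level
print(best, composed[best])             # pursuing c1 alone would pick v3 instead
```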

  13. Integrated modeling of software cost and quality

    International Nuclear Information System (INIS)

    Rone, K.Y.; Olson, K.M.

    1994-01-01

    In modeling the cost and quality of software systems, the relationship between cost and quality must be considered. This explicit relationship is dictated by the criticality of the software being developed. The balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and the developers with respect to the processes being employed
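A toy version of such a trade-off, purely illustrative and not the authors' model, assuming an exponential effect of IV&V effort on the delivered error rate:

```python
# Pick the IV&V labor fraction that minimizes total cost: more IV&V lowers
# the delivered error rate but adds direct cost; rework pays for the rest.
import math

def total_cost(ivv_fraction, size_ksloc=100, base_rate=8.0,
               dev_cost_per_ksloc=10.0, rework_per_error=0.5):
    errors = size_ksloc * base_rate * math.exp(-6.0 * ivv_fraction)
    dev = size_ksloc * dev_cost_per_ksloc * (1.0 + ivv_fraction)
    return dev + errors * rework_per_error

best = min((f / 100 for f in range(0, 41)), key=total_cost)
print(f"cost-minimizing IV&V fraction ≈ {best:.2f}")
```

Raising the assumed criticality (a higher rework cost per delivered error) shifts the optimum toward more IV&V labor, which is the cost-quality balance the abstract describes.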

  14. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  15. Atomic-Level Co3O4 Layer Stabilized by Metallic Cobalt Nanoparticles: A Highly Active and Stable Electrocatalyst for Oxygen Reduction.

    Science.gov (United States)

    Liu, Min; Liu, Jingjun; Li, Zhilin; Wang, Feng

    2018-02-28

Developing atomic-level transition-metal oxides may be one of the most promising ways for providing ultrahigh electrocatalytic performance for the oxygen reduction reaction (ORR), compared with their bulk counterparts. In this article, we developed a set of atomically thick Co3O4 layers covering Co nanoparticles through partial reduction of Co3O4 nanoparticles using melamine as a reductive additive at an elevated temperature. Compared with the original Co3O4 nanoparticles, the synthesized Co3O4 with a thickness of 1.1 nm exhibits remarkably enhanced ORR activity and durability, which are even higher than those obtained by a commercial Pt/C in an alkaline environment. The superior activity can be attributed to the unique physical and chemical structures of the atomic-level oxide featuring the narrowed band gap and decreased work function, caused by the escaped lattice oxygen and the enriched coordination-unsaturated Co2+ in this atomic layer. Besides, the outstanding durability of the catalyst can result from the chemically epitaxial deposition of the Co3O4 on the cobalt surface. Therefore, the proposed synthetic strategy may offer a smart way to develop other atomic-level transition-metal oxides with high electrocatalytic activity and stability for energy conversion and storage devices.

  16. Power level control of the TRIGA Mark-II research reactor using the multifeedback layer neural network and the particle swarm optimization

    International Nuclear Information System (INIS)

    Coban, Ramazan

    2014-01-01

    Highlights: • A multifeedback-layer neural network controller is presented for a research reactor. • Off-line learning of the MFLNN is accomplished by the PSO algorithm. • The results revealed that the MFLNN–PSO controller has a remarkable performance. - Abstract: In this paper, an artificial neural network controller is presented using the Multifeedback-Layer Neural Network (MFLNN), which is a recently proposed recurrent neural network, for neutronic power level control of a nuclear research reactor. Off-line learning of the MFLNN is accomplished by the Particle Swarm Optimization (PSO) algorithm. The MFLNN-PSO controller design is based on a nonlinear model of the TRIGA Mark-II research reactor. The learning and the test processes are implemented by means of a computer program at different power levels. The simulation results obtained reveal that the MFLNN-PSO controller has a remarkable performance on the neutronic power level control of the reactor for tracking the step reference power trajectories
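The off-line tuning loop is straightforward to sketch. Below, a standard global-best PSO tunes a two-gain surrogate controller on a toy first-order plant; the MFLNN and the reactor model themselves are not reproduced here:

```python
# PSO: each particle is a candidate controller parameter vector, scored by
# simulating step-reference tracking on a toy plant (stand-in for the reactor).
import numpy as np

rng = np.random.default_rng(0)

def tracking_cost(w):
    x, cost = 0.0, 0.0
    for _ in range(100):
        u = w[0] * (1.0 - x) + w[1]    # error-proportional control + bias
        x = 0.9 * x + 0.1 * u          # toy first-order plant dynamics
        cost += (1.0 - x) ** 2         # squared tracking error vs step ref.
    return cost

n_particles, dim = 20, 2
pos = rng.uniform(-2, 2, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([tracking_cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(50):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    cost = np.array([tracking_cost(p) for p in pos])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("tuned gains:", gbest)
```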

  17. NDAS Hardware Translation Layer Development

    Science.gov (United States)

    Nazaretian, Ryan N.; Holladay, Wendy T.

    2011-01-01

The NASA Data Acquisition System (NDAS) project aims to replace all DAS software for NASA's Rocket Testing Facilities. There must be a software-hardware translation layer so the software can properly talk to the hardware. Since the hardware at each test stand varies, drivers for each stand have to be made. These drivers act more like plugins for the software. If the software is being used at E3, then the software should point to the E3 driver package. If the software is being used at B2, then the software should point to the B2 driver package. The driver packages should also contain hardware drivers that are universal to the DAS system. For example, since A1, A2, and B2 all use the Preston 8300AU signal conditioners, the driver for those three stands should be the same and updated collectively.
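A minimal sketch of that plugin idea follows; the class names and registry are illustrative stand-ins, not the NDAS code:

```python
# The DAS core codes against one driver interface; each test stand's package
# supplies an implementation, and shared hardware shares one driver class.
from abc import ABC, abstractmethod

class SignalConditionerDriver(ABC):
    @abstractmethod
    def read_channel(self, channel: int) -> float: ...

class Preston8300AUDriver(SignalConditionerDriver):
    """Shared by the A1, A2, and B2 stands, which use the same hardware."""
    def read_channel(self, channel: int) -> float:
        return 0.0   # stub: a real driver would talk to the instrument

DRIVER_PACKAGES = {"A1": Preston8300AUDriver, "A2": Preston8300AUDriver,
                   "B2": Preston8300AUDriver}

def load_driver(stand: str) -> SignalConditionerDriver:
    return DRIVER_PACKAGES[stand]()   # the software "points to" one package

driver = load_driver("B2")
```

Updating the shared Preston8300AUDriver class then propagates to every stand that registers it, which is the collective-update property the abstract describes.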

  18. Self-aligned 0-level sealing of MEMS devices by a two layer thin film reflow process

    NARCIS (Netherlands)

    Rusu, C.R.; Jansen, Henricus V.; Gunn, R.; Witvrouw, A.

    2003-01-01

    Many micro electromechanical systems (MEMS) require a vacuum or controlled atmosphere encapsulation in order to ensure either a good performance or an acceptable lifetime of operation. Two approaches for wafer-scale zero-level packaging exist. The most popular approach is based on wafer bonding.

  19. Self-aligned 0-level sealing of MEMS devices by a two layer thin film reflow process

    NARCIS (Netherlands)

    Rusu, C.R.; Jansen, Henricus V.; Gunn, R.; Witvrouw, A.

    2004-01-01

Many micro electromechanical systems (MEMS) require a vacuum or controlled atmosphere encapsulation in order to ensure either a good performance or an acceptable lifetime of operation. Two approaches for wafer-scale zero-level packaging exist. The most popular approach is based on wafer bonding.

  20. Software reliability assessment

    International Nuclear Information System (INIS)

    Barnes, M.; Bradley, P.A.; Brewer, M.A.

    1994-01-01

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom has spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  1. Core-level spectra and binding energies of transition metal nitrides by non-destructive x-ray photoelectron spectroscopy through capping layers

    International Nuclear Information System (INIS)

    Greczynski, G.; Primetzhofer, D.; Lu, J.; Hultman, L.

    2017-01-01

Highlights: • First non-destructive measurements of XPS core level binding energies for group IVb-VIb transition metal nitrides are presented. • All films are grown under the same conditions and analyzed in the same instrument, providing a useful reference for future XPS studies. • Extracted core level BE values are more reliable than those obtained from sputter-cleaned N-deficient surfaces. • Comparison to Ar+-etched surfaces reveals that even mild etching conditions result in the formation of a nitrogen-deficient surface layer. • The N/metal concentration ratios from capped samples are found to be 25-90% higher than those from the corresponding ion-etched surfaces. - Abstract: We present the first measurements of x-ray photoelectron spectroscopy (XPS) core level binding energies (BEs) for the widely-applicable group IVb-VIb polycrystalline transition metal nitrides (TMNs) TiN, VN, CrN, ZrN, NbN, MoN, HfN, TaN, and WN as well as AlN and SiN, which are common components in the TMN-based alloy systems. Nitride thin film samples were grown at 400 °C by reactive dc magnetron sputtering from elemental targets in Ar/N2 atmosphere. For XPS measurements, layers are either (i) Ar+ ion-etched to remove surface oxides resulting from the air exposure during sample transfer from the growth chamber into the XPS system, or (ii) in situ capped with a few nm thick Cr or W overlayers in the deposition system prior to air-exposure and loading into the XPS instrument. Film elemental composition and phase content is thoroughly characterized with time-of-flight elastic recoil detection analysis (ToF-E ERDA), Rutherford backscattering spectrometry (RBS), and x-ray diffraction. High energy resolution core level XPS spectra acquired with monochromatic Al Kα radiation on the ISO-calibrated instrument reveal that even mild etching conditions result in the formation of a nitrogen-deficient surface layer that substantially affects the extracted binding energy values. These

  2. Core-level spectra and binding energies of transition metal nitrides by non-destructive x-ray photoelectron spectroscopy through capping layers

    Energy Technology Data Exchange (ETDEWEB)

    Greczynski, G., E-mail: grzgr@ifm.liu.se [Thin Film Physics Division, Department of Physics (IFM), Linköping University, SE-581 83 Linköping (Sweden); Primetzhofer, D. [Department of Physics and Astronomy, The Ångström Laboratory, Uppsala University, P.O. Box 516, SE-751 20 Uppsala (Sweden); Lu, J.; Hultman, L. [Thin Film Physics Division, Department of Physics (IFM), Linköping University, SE-581 83 Linköping (Sweden)

    2017-02-28

Highlights: • First non-destructive measurements of XPS core level binding energies for group IVb-VIb transition metal nitrides are presented. • All films are grown under the same conditions and analyzed in the same instrument, providing a useful reference for future XPS studies. • Extracted core level BE values are more reliable than those obtained from sputter-cleaned N-deficient surfaces. • Comparison to Ar+-etched surfaces reveals that even mild etching conditions result in the formation of a nitrogen-deficient surface layer. • The N/metal concentration ratios from capped samples are found to be 25-90% higher than those from the corresponding ion-etched surfaces. - Abstract: We present the first measurements of x-ray photoelectron spectroscopy (XPS) core level binding energies (BEs) for the widely-applicable group IVb-VIb polycrystalline transition metal nitrides (TMNs) TiN, VN, CrN, ZrN, NbN, MoN, HfN, TaN, and WN as well as AlN and SiN, which are common components in the TMN-based alloy systems. Nitride thin film samples were grown at 400 °C by reactive dc magnetron sputtering from elemental targets in Ar/N2 atmosphere. For XPS measurements, layers are either (i) Ar+ ion-etched to remove surface oxides resulting from the air exposure during sample transfer from the growth chamber into the XPS system, or (ii) in situ capped with a few nm thick Cr or W overlayers in the deposition system prior to air-exposure and loading into the XPS instrument. Film elemental composition and phase content is thoroughly characterized with time-of-flight elastic recoil detection analysis (ToF-E ERDA), Rutherford backscattering spectrometry (RBS), and x-ray diffraction. High energy resolution core level XPS spectra acquired with monochromatic Al Kα radiation on the ISO-calibrated instrument reveal that even mild etching conditions result in the formation of a nitrogen-deficient surface layer that substantially affects the extracted binding energy

  3. Effects of balanced dietary protein levels on egg production and egg quality parameters of individual commercial layers.

    Science.gov (United States)

    Shim, M Y; Song, E; Billard, L; Aggrey, S E; Pesti, G M; Sodsee, P

    2013-10-01

    The effects of a series of balanced dietary protein levels on egg production and egg quality parameters of laying hens from 18 through 74 wk of age were investigated. One hundred forty-four pullets (Bovans) were randomly assigned to individual cages with separate feeders including 3 different protein level series of isocaloric diets. Diets were separated into 4 phases of 18-22, 23-32, 33-44, and 45-74 wk of age. The high protein (H) series contained 21.62, 19.05, 16.32, and 16.05% CP, respectively. Medium protein (M) and low protein (L) series were 2 and 4% lower in balanced dietary protein. The results clearly demonstrated that the balanced dietary protein level was a limiting factor for BW, ADFI, egg weight, hen day egg production (HDEP), and feed per kilogram of eggs. Feeding with the L series resulted in lower ADFI and HDEP (90.33% peak production) and more feed per kilogram of eggs compared with the H or M series (HDEP; 93.23 and 95.68% peak production, monthly basis). Egg weight responded in a linear manner to balanced dietary protein level (58.78, 55.94, and 52.73 g for H, M, and L, respectively). Feed intake of all hens, but especially those in the L series, increased considerably after wk 54 when the temperature of the house decreased due to winter conditions. Thus, hens fed the L series seemed particularly dependent on house temperature to maintain BW, ADFI, and HDEP. For egg quality parameters, percent yolk, Haugh units, and egg specific gravity were similar regardless of diets. Haugh units were found to be greatly affected by the variation of housing temperature (P = 0.025). Maximum performance cannot always be expected to lead to maximum profits. Contrary to the idea of a daily amino acid requirement for maximum performance, these results may be used to determine profit-maximizing levels of balanced dietary protein based on the cost of protein and returns from different possible protein levels that may be fed.

  4. Software engineering : redundancy is key

    NARCIS (Netherlands)

    Brand, van den M.G.J.; Groote, J.F.

    2015-01-01

    Software engineers are humans and so they make lots of mistakes. Typically, 1 out of every 10 to 100 tasks goes wrong. The only way to avoid these mistakes is to introduce redundancy in the software engineering process. This article is a plea to consciously introduce several levels of redundancy for each

  5. Software engineering processes principles and applications

    CERN Document Server

    Wang, Yingxu

    2000-01-01

    Fundamentals of the Software Engineering Process; Introduction; A Unified Framework of the Software Engineering Process; Process Algebra; Process-Based Software Engineering; Software Engineering Process System Modeling; The CMM Model; The ISO 9001 Model; The BOOTSTRAP Model; The ISO/IEC 15504 (SPICE) Model; The Software Engineering Process Reference Model: SEPRM; Software Engineering Process System Analysis; Benchmarking the SEPRM Processes; Comparative Analysis of Current Process Models; Transformation of Capability Levels Between Current Process Models; Software Engineering Process Establishment; Software Process Establish

  6. The fundamentals behind solving for unknown molecular structures using computer-assisted structure elucidation: a free software package at the undergraduate and graduate levels.

    Science.gov (United States)

    Moser, Arvin; Pautler, Brent G

    2016-05-15

    The successful elucidation of an unknown compound's molecular structure often requires an analyst with profound knowledge and experience of advanced spectroscopic techniques, such as Nuclear Magnetic Resonance (NMR) spectroscopy and mass spectrometry. The implementation of Computer-Assisted Structure Elucidation (CASE) software in solving for unknown structures, such as isolated natural products and/or reaction impurities, can serve both as an elucidation and a teaching tool. As such, introducing CASE software with 112 exercises to train students, in conjunction with the traditional pen-and-paper approach, will strengthen their overall understanding of solving unknowns and let them quickly explore various structural end points to determine the validity of the results. Copyright © 2016 John Wiley & Sons, Ltd.

  7. PyBus -- A Python Software Bus

    OpenAIRE

    Lavrijsen, W

    2005-01-01

    A software bus, just like its hardware equivalent, allows for the discovery, installation, configuration, loading, unloading, and run-time replacement of software components, as well as channeling of inter-component communication. Python, a popular open-source programming language, encourages a modular design in software written in it, but it offers little or no component functionality. However, the language and its interpreter provide sufficient hooks to implement a thin, integral layer...

  8. Limpet Shells from the Aterian Level 8 of El Harhoura 2 Cave (Témara, Morocco): Preservation State of Crossed-Foliated Layers

    Science.gov (United States)

    Nouet, Julius; Chevallard, Corinne; Farre, Bastien; Nehrke, Gernot; Campmas, Emilie; Stoetzel, Emmanuelle; El Hajraoui, Mohamed Abdeljalil; Nespoulet, Roland

    2015-01-01

    The exploitation of mollusks by the first anatomically modern humans is a central question for archaeologists. This paper focuses on level 8 (dated around ∼ 100 ka BP) of El Harhoura 2 Cave, located along the coastline in the Rabat-Témara region (Morocco). The large quantity of Patella sp. shells found in this level highlights questions regarding their origin and preservation. This study presents an estimation of the preservation status of these shells. We focus here on the diagenetic evolution of both the microstructural patterns and organic components of crossed-foliated shell layers, in order to assess the viability of further investigations based on shell layer minor elements, isotopic or biochemical compositions. The results show that the shells seem to be well conserved, with microstructural patterns preserved down to sub-micrometric scales, and that some organic components are still present in situ. But faint taphonomic degradations affecting both mineral and organic components are nonetheless evidenced, such as the disappearance of organic envelopes surrounding crossed-foliated lamellae, combined with a partial recrystallization of the lamellae. Our results provide a solid case-study of the early stages of the diagenetic evolution of crossed-foliated shell layers. Moreover, they highlight the fact that extreme caution must be taken before using fossil shells for palaeoenvironmental or geochronological reconstructions. Without thorough investigation, the alteration patterns illustrated here would easily have gone unnoticed. However, these degradations are liable to bias any proxy based on the elemental, isotopic or biochemical composition of the shells. This study also provides significant data concerning human subsistence behavior: the presence of notches and the good preservation state of limpet shells (no dissolution/recrystallization, no bioerosion and no abrasion/fragmentation aspects) would attest that limpets were gathered alive with tools by

  9. Limpet Shells from the Aterian Level 8 of El Harhoura 2 Cave (Témara, Morocco): Preservation State of Crossed-Foliated Layers.

    Directory of Open Access Journals (Sweden)

    Julius Nouet

    Full Text Available The exploitation of mollusks by the first anatomically modern humans is a central question for archaeologists. This paper focuses on level 8 (dated around ∼ 100 ka BP) of El Harhoura 2 Cave, located along the coastline in the Rabat-Témara region (Morocco). The large quantity of Patella sp. shells found in this level highlights questions regarding their origin and preservation. This study presents an estimation of the preservation status of these shells. We focus here on the diagenetic evolution of both the microstructural patterns and organic components of crossed-foliated shell layers, in order to assess the viability of further investigations based on shell layer minor elements, isotopic or biochemical compositions. The results show that the shells seem to be well conserved, with microstructural patterns preserved down to sub-micrometric scales, and that some organic components are still present in situ. But faint taphonomic degradations affecting both mineral and organic components are nonetheless evidenced, such as the disappearance of organic envelopes surrounding crossed-foliated lamellae, combined with a partial recrystallization of the lamellae. Our results provide a solid case-study of the early stages of the diagenetic evolution of crossed-foliated shell layers. Moreover, they highlight the fact that extreme caution must be taken before using fossil shells for palaeoenvironmental or geochronological reconstructions. Without thorough investigation, the alteration patterns illustrated here would easily have gone unnoticed. However, these degradations are liable to bias any proxy based on the elemental, isotopic or biochemical composition of the shells. This study also provides significant data concerning human subsistence behavior: the presence of notches and the good preservation state of limpet shells (no dissolution/recrystallization, no bioerosion and no abrasion/fragmentation aspects) would attest that limpets were gathered alive

  10. High level heterologous protein production in Lactococcus and Lactobacillus using a new secretion system based on the Lactobacillus brevis S-layer signals.

    Science.gov (United States)

    Savijoki, K; Kahala, M; Palva, A

    1997-02-28

    A secretion cassette, based on the expression and secretion signals of an S-layer protein (SlpA) from Lactobacillus brevis, was constructed. E. coli beta-lactamase (Bla) was used as the reporter protein to determine the functionality of the S-layer signals for heterologous expression and secretion in Lactococcus lactis, Lactobacillus brevis, Lactobacillus plantarum, Lactobacillus gasseri and Lactobacillus casei using a low-copy-number plasmid derived from pGK12. In all hosts tested, the bla gene was expressed under the slpA signals and all Bla activity was secreted to the culture medium. The Lb. brevis S-layer promoters were very efficiently recognized in L. lactis, Lb. brevis and Lb. plantarum, whereas in Lb. gasseri the slpA promoter region appeared to be recognized at a lower level and in Lb. casei the level of transcripts was below the detection limit. The production of Bla was mainly restricted to the exponential phase of growth. The highest yield of Bla was obtained with L. lactis and Lb. brevis. Without pH control, substantial degradation of Bla occurred during prolonged cultivations with all lactic acid bacteria (LAB) tested. When growing L. lactis and Lb. brevis under pH control, the Bla activity could be stabilized also in the stationary phase. L. lactis produced up to 80 mg/l of Bla, which to our knowledge represents the highest amount of a heterologous protein secreted by LAB so far. The short production phase implied a very high rate of secretion, with a calculated value of 5 × 10^5 Bla molecules/cell per h. Such a high rate was also observed with Lb. plantarum, whereas in Lb. brevis the competition between the wild type slpA gene and the secretion construct probably lowered the rate of Bla production. The results obtained indicate wide applicability of the Lb. brevis slpA signals for efficient protein production and secretion in LAB.

  11. ACTS: from ATLAS software towards a common track reconstruction software

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00349786; The ATLAS collaboration; Salzburger, Andreas; Kiehn, Moritz; Hrdinka, Julia; Calace, Noemi

    2017-01-01

    Reconstruction of charged particles' trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is de...

  12. An Accelerator Control Middle Layer Using MATLAB

    International Nuclear Information System (INIS)

    Portmann, Gregory J.; Corbett, Jeff; Terebilo, Andrei

    2005-01-01

    Matlab is an interpretive programming language originally developed for convenient use with the LINPACK and EISPACK libraries. Matlab is appealing for accelerator physics because it is matrix-oriented, provides an active workspace for system variables, and offers powerful graphics capabilities, built-in math libraries, and platform independence. A number of accelerator software toolboxes have been written in Matlab -- the Accelerator Toolbox (AT) for model-based machine simulations, LOCO for on-line model calibration, and Matlab Channel Access (MCA) to connect with EPICS. The function of the MATLAB "MiddleLayer" is to provide a scripting language for machine simulations and on-line control, including non-EPICS based control systems. The MiddleLayer has simplified and streamlined development of high-level applications including configuration control, energy ramp, orbit correction, photon beam steering, ID compensation, beam-based alignment, tune correction and response matrix measurement. The database-driven Middle Layer software is largely machine-independent and easy to port. Six accelerators presently use the software package with more scheduled to come on line soon
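
    To make the database-driven middle-layer idea concrete, here is a minimal sketch in Python; the real Middle Layer is MATLAB code, and the family names, channel names, and getam/setsp-style accessors below are simplified stand-ins borrowed from the toolbox's conventions, not its actual API. High-level scripts address devices by family name, and a machine-description table maps families to control-system channels, which is what makes the layer largely machine-independent:

      # Hypothetical machine database: device family -> list of channel names.
      machine_db = {
          "BPMx": ["SR01:BPM1:X", "SR01:BPM2:X"],
          "HCM":  ["SR01:HCM1:SP", "SR01:HCM2:SP"],
      }

      # Stand-in for a live control-system binding such as EPICS Channel Access.
      _live_values = {ch: 0.0 for chans in machine_db.values() for ch in chans}

      def getam(family):
          """Read all 'monitor' values for a device family."""
          return [_live_values[ch] for ch in machine_db[family]]

      def setsp(family, values):
          """Write setpoints for a device family, one value per channel."""
          for ch, v in zip(machine_db[family], values):
              _live_values[ch] = v

      setsp("HCM", [0.1, -0.2])   # scripts never mention raw channel names
      print(getam("BPMx"))

    Porting such a layer to another machine amounts to replacing the database table, which is the design choice the record highlights.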

  13. Numerical Study on Open-Circuit Voltage of Single Layer Organic Solar Cells with Schottky Contacts: Effects of Molecular Energy Levels, Temperature and Thickness

    International Nuclear Information System (INIS)

    Rong-Hua, Li; Ying-Quan, Peng; Chao-Zhu, Ma; Run-Sheng, Wang; Hong-Wei, Xie; Ying, Wang; Wei-Min, Meng

    2010-01-01

    We numerically investigate the effects of the exciton generation rate G, the temperature T, the active layer thickness d, and the position of the LUMO level E_L relative to the cathode work function W_c at a given energy gap on the open-circuit voltage V_oc of single layer organic solar cells with a Schottky contact. It is demonstrated that the open-circuit voltage increases with decreasing cathode work function W_c for given anode work functions and exciton generation rates. For given cathode and anode work functions, the open-circuit voltage first increases with the exciton generation rate and then reaches a saturation value, which equals the built-in voltage. Additionally, it is worth noting that a significant improvement in V_oc can be made by selecting an organic material with a relatively high LUMO level (low |E_L| value). However, V_oc decreases as the temperature increases, and the rate of decrease diminishes as the exciton generation rate is enhanced. Our study also shows that increasing the device thickness does not improve the open-circuit voltage, because of enhanced charge recombination in thicker devices. (cross-disciplinary physics and related areas of science and technology)

  14. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  15. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    software license, software usage, ELA, Software as a Service (SaaS), Software Asset... PaaS - Platform as a Service; SaaS - Software as a Service; SAM - Software Asset Management; SMS - System Management Server; SEWP - Solutions for Enterprise Wide... ...delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service. Software

  16. Toward Intelligent Software Defect Detection

    Science.gov (United States)

    Benson, Markland J.

    2011-01-01

    Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.

  17. Recombination Suppression in PbS Quantum Dot Heterojunction Solar Cells by Energy-Level Alignment in the Quantum Dot Active Layers.

    Science.gov (United States)

    Ding, Chao; Zhang, Yaohong; Liu, Feng; Nakazawa, Naoki; Huang, Qingxun; Hayase, Shuzi; Ogomi, Yuhei; Toyoda, Taro; Wang, Ruixiang; Shen, Qing

    2017-09-22

    Using spatial energy-level gradient engineering with quantum dots (QDs) of different sizes to increase generated-carrier collection at the junction of a QD heterojunction solar cell (QDHSC) is a promising route for improving the energy-conversion efficiency. However, current related research has shown that a variable band-gap structure in a QDHSC creates an appreciable increase not in the illumination current density but rather in the fill factor. In addition, studies are lacking on the mechanism by which these graded structures affect the photovoltaic performance of QDHSCs. This study presents the development of air-atmosphere solution-processed TiO_2/PbS QDs/Au QDHSCs by engineering the energy-level alignment (ELA) of the active layer via a sorted order of differently sized QD layers (four QD sizes). In comparison to the ungraded device (without the ELA), the optimized graded architecture (containing the ELA) solar cells exhibited a large increase (21.4%) in short-circuit current density (J_sc). As a result, a J_sc value greater than 30 mA/cm^2 has been realized in planar, thinner absorption layer (∼300 nm) PbS QDHSCs, and the open-circuit voltage (V_oc) and power-conversion efficiency (PCE) were also improved. Through characterization by the light-intensity dependences of J_sc and V_oc and transient photovoltage decay, we find that (i) the ELA structure, serving as an electron-blocking layer, reduces interfacial recombination at the PbS/anode interface, and (ii) the ELA structure can drive more carriers toward the desired collection electrode, and the additional carriers can fill the trap states, reducing trap-assisted recombination in the PbS QDHSCs. This work has clearly elucidated the mechanism of recombination suppression in the graded QDHSCs and demonstrated the effects of the ELA structure on the improvement of J_sc. The charge recombination mechanisms characterized in this work would be

  18. Control software for the CBM readout chain

    Energy Technology Data Exchange (ETDEWEB)

    Loizeau, Pierre-Alain [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH (Germany)

    2016-07-01

    The Compressed Baryonic Matter (CBM) experiment, which will be built at FAIR, will use free-streaming readout electronics to acquire high-statistics data-sets of physics probes in fixed target heavy-ion collisions. Since no simple signatures suitable for a hardware trigger are available for most of them, reconstruction and selection of the interesting collisions will be done in software, in a computer farm called the First Level Event Selector (FLES). The raw data coming from the detectors are pre-processed, pre-calibrated and aggregated in an FPGA-based layer called the Data Preprocessing Boards (DPB). IPbus will be used to communicate with the DPBs and, through them, with the elements of the readout chain closer to the detectors. A slow-control environment based on this software is being developed by CBM to configure the DPBs as well as the front-end electronics in an efficient way and to monitor their performance. This contribution presents the layout planned for the slow control software, its first implementation and corresponding test results.
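
    As a rough illustration of the kind of register-level slow-control interaction described above, the sketch below uses the Python bindings of uHAL, the client library of the IPbus software suite; the connection file, device ID, and register names are invented for the example, and the actual CBM control software is not reproduced here:

      import uhal

      # The connection file lists IPbus endpoints and their address tables.
      manager = uhal.ConnectionManager("file://connections.xml")  # assumed file
      dpb = manager.getDevice("dpb0")  # hypothetical DPB device ID

      # Queue a configuration write and a status read-back; IPbus batches
      # transactions until dispatch() is called.
      dpb.getNode("fee.config").write(0x1)
      status = dpb.getNode("fee.status").read()
      dpb.dispatch()

      print(hex(status.value()))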

  19. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  20. Next Generation Software Process Improvement

    National Research Council Canada - National Science Library

    Turnas, Daniel

    2003-01-01

    .... The application of these processes allows an organization to mature. The software maturity level and process improvement of an organization can be measured with the Capability Maturity Model...

  1. Operating System Abstraction Layer (OSAL)

    Science.gov (United States)

    Yanchik, Nicholas J.

    2007-01-01

    This viewgraph presentation reviews the concept of the Operating System Abstraction Layer (OSAL) and its benefits. The OSAL is a small layer of software that allows programs to run on many different operating systems and hardware platforms; it runs independently of the underlying OS and hardware, and it is self-contained. The benefits of the OSAL are that it removes dependencies on any one operating system and promotes portable, reusable flight software. It allows core Flight Software (FSW) to be built for multiple processors and operating systems. The presentation discusses the functionality and the various OSAL releases, and describes the specifications.
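
    The abstraction-layer pattern the presentation describes can be sketched briefly. Note that the real OSAL is a C library for flight software; the Python below, with invented class and method names, only illustrates the idea of writing applications against a fixed API while per-platform backends hide the OS differences:

      import abc
      import queue
      import threading

      class OsApi(abc.ABC):
          """Fixed interface the application is written against."""

          @abc.abstractmethod
          def create_task(self, entry_point): ...

          @abc.abstractmethod
          def queue_create(self, depth): ...

      class PosixOsApi(OsApi):
          """One concrete backend; a VxWorks or RTEMS backend would subclass
          OsApi the same way, leaving application code unchanged."""

          def create_task(self, entry_point):
              task = threading.Thread(target=entry_point, daemon=True)
              task.start()
              return task

          def queue_create(self, depth):
              return queue.Queue(maxsize=depth)

      os_api = PosixOsApi()                 # the only platform-specific line
      inbox = os_api.queue_create(depth=8)
      os_api.create_task(lambda: inbox.put("hello"))
      print(inbox.get(timeout=1.0))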

  2. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book comprises 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  3. Modular Software Performance Monitoring

    CERN Document Server

    Kruse, D F

    2011-01-01

    CPU clock frequency is not likely to increase significantly in the coming years, and data analysis speed can be improved by using more processors or buying new machines only if one is willing to change the paradigm to a parallel one. Therefore, performance monitoring procedures and tools are needed to help programmers optimize existing software running on current and future hardware. Low-level information from hardware performance counters is vital for spotting the specific performance problems slowing program execution. HEP software is often huge and complex, and existing tools are unable to give results with the required granularity. We report on the approach we have chosen to solve this problem, which involves decomposing the application into parts and monitoring each of them separately. Both counting and sampling methods are used to allow analysis with the required custom granularity: from the global level down to the function level. A set of tools (based on perfmon2 – a software interface to hardware co...
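
    The counting-and-sampling idea in the record can be illustrated with a toy sampler. The real tools are built on perfmon2 and hardware counters, whereas this self-contained, CPython-specific sketch only mimics the sampling side at function-level granularity:

      import collections
      import sys
      import threading
      import time

      samples = collections.Counter()

      def sample_main_thread(interval=0.001, duration=1.0):
          """Periodically record which function the main thread is executing."""
          main_id = threading.main_thread().ident
          end = time.time() + duration
          while time.time() < end:
              frame = sys._current_frames().get(main_id)  # CPython-only hook
              if frame is not None:
                  samples[frame.f_code.co_name] += 1
              time.sleep(interval)

      def busy():
          total = 0
          for i in range(2_000_000):
              total += i * i
          return total

      monitor = threading.Thread(target=sample_main_thread, daemon=True)
      monitor.start()
      busy()
      monitor.join()
      print(samples.most_common(5))  # functions where most samples landed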

  4. Collaborative-Hybrid Multi-Layer Network Control for Emerging Cyber-Infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Lehman, Tom [USC; Ghani, Nasir [UNM; Boyd, Eric [UCAID

    2010-08-31

    At a high level, there were four basic task areas identified for the Hybrid-MLN project: (i) multi-layer, multi-domain control plane architecture and implementation, including OSCARS layer2 and InterDomain Adaptation, integration of LambdaStation and Terapaths with layer2 dynamic provisioning, control plane software release, scheduling, AAA, security architecture, network virtualization architecture, and multi-layer network architecture framework definition; (ii) heterogeneous data plane testing; (iii) simulation; and (iv) project publications, reports, and presentations.

  5. A deep-level transient spectroscopy study of gamma-ray irradiation on the passivation properties of silicon nitride layer on silicon

    Science.gov (United States)

    Dong, Peng; Yu, Xuegong; Ma, Yao; Xie, Meng; Li, Yun; Huang, Chunlai; Li, Mo; Dai, Gang; Zhang, Jian

    2017-08-01

    Plasma-enhanced chemical vapor deposited silicon nitride (SiNx) films are extensively used as a passivation material in the solar cell industry. Such SiNx passivation layers are the part of a solar cell most sensitive to gamma-ray irradiation. In this work, deep-level transient spectroscopy has been applied to analyse the influence of gamma-ray irradiation on the passivation properties of a SiNx layer on silicon. It is shown that the effective carrier lifetime decreases with the irradiation dose. At the same time, the interface state density is significantly increased after irradiation, and its energy distribution is broadened and shifts deeper with respect to the conduction band edge, which makes the interface states more efficient recombination centers for carriers. Besides, C-V characteristics show a progressive negative shift with increasing dose, indicating the generation of effective positive charges in the SiNx films. Such positive charges are beneficial for shielding holes from the n-type silicon substrate, i.e., field-effect passivation. However, given the reduced carrier lifetime after irradiation, it can be inferred that the irradiation-induced interface defects play a dominant role over the trapped positive charges and therefore lead to the degradation of the passivation properties of SiNx on silicon.

  6. A comparison study of single and double layer repositories for high level radioactive wastes within a saturated and discontinuous granitic rock mass

    International Nuclear Information System (INIS)

    Kim, Jhin Wung; Choi, Jong Won; Bae, Dae Suk

    2004-02-01

    The present study analyzes and compares the long-term thermo-hydro-mechanical interaction behavior of a single layer and a double layer repository for high level radioactive wastes within a saturated and discontinuous granitic rock mass, in order to contribute this understanding to the development of a Korean disposal concept. The model includes a saturated and discontinuous granitic rock mass, PWR spent nuclear fuel in a disposal canister surrounded by compacted bentonite inside a deposition hole, and mixed bentonite backfilled in the rest of the space within a repository cavern. It is assumed that two joint sets exist within the model. Joint set 1 includes joints of 56° dip angle, spaced at 20 m; joint set 2 is perpendicular to joint set 1, with joints also spaced at 20 m. The two-dimensional distinct element code UDEC is used for the analysis. To capture the joint behavior adjacent to the repository cavern, the Barton-Bandis joint model is used. The effect of the decay heat from PWR spent fuels on the repository model has been analyzed, and a steady state flow algorithm is used for the hydraulic analysis

  7. The Adobe Photoshop layers book

    CERN Document Server

    Lynch, Richard

    2011-01-01

    Layers are the building blocks for working in Photoshop. With the correct use of the Layers Tool, you can edit individual components of your images nondestructively to ensure that your end result is a combination of the best parts of your work. Despite how important it is for successful Photoshop work, the Layers Tool is one of the most often misused and misunderstood features within this powerful software program. This book will show you absolutely everything you need to know to work with layers, including how to use masks, blending, modes and layer management. You'll learn professional tech

  8. Phynx: an open source software solution supporting data management and web-based patient-level data review for drug safety studies in the general practice research database and other health care databases.

    Science.gov (United States)

    Egbring, Marco; Kullak-Ublick, Gerd A; Russmann, Stefan

    2010-01-01

    To develop a software solution that supports management and clinical review of patient data from electronic medical records databases or claims databases for pharmacoepidemiological drug safety studies. We used open source software to build a data management system and an internet application with a Flex client on a Java application server with a MySQL database backend. The application is hosted on Amazon Elastic Compute Cloud. This solution, named Phynx, supports data management, Web-based display of electronic patient information, and interactive review of patient-level information in the individual clinical context. This system was applied to a dataset from the UK General Practice Research Database (GPRD). Our solution can be set up and customized with limited programming resources, and there is almost no extra cost for software. Access times are short, the displayed information is structured in chronological order and visually attractive, and selected information such as drug exposure can be blinded. External experts can review patient profiles and save evaluations and comments via a common Web browser. Phynx provides a flexible and economical solution for patient-level review of electronic medical information from databases, taking the individual clinical context into account. It can therefore make an important contribution to efficient validation of outcome assessment in drug safety database studies.

  9. Pybus - A Python Software Bus

    International Nuclear Information System (INIS)

    Lavrijsen, Wim T.L.P.

    2004-01-01

    A software bus, just like its hardware equivalent, allows for the discovery, installation, configuration, loading, unloading, and run-time replacement of software components, as well as channeling of inter-component communication. Python, a popular open-source programming language, encourages a modular design in software written in it, but it offers little or no component functionality. However, the language and its interpreter provide sufficient hooks to implement a thin, integral layer of component support. This functionality can be presented to the developer in the form of a module, making it very easy to use. This paper describes a Python module, PyBus, with which the concept of a "software bus" can be realized in Python. It demonstrates, within the context of the ATLAS software framework Athena, how PyBus can be used for the installation and (run-time) configuration of software, not necessarily Python modules, from a Python application in a way that is transparent to the end-user
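
    As a hedged illustration of what such a thin component layer can look like — this is not the actual PyBus implementation, and the Bus class and its methods are invented for the example — a software bus in Python can be reduced to a registry built on importlib plus a publish/subscribe channel:

      import importlib

      class Bus:
          """Registry that loads, hot-swaps, and connects components."""

          def __init__(self):
              self._components = {}   # module name -> module object
              self._subscribers = {}  # channel -> list of callables

          def load(self, name):
              # Discovery/installation: import, or re-import for run-time
              # replacement of an already-loaded component.
              if name in self._components:
                  module = importlib.reload(self._components[name])
              else:
                  module = importlib.import_module(name)
              self._components[name] = module
              return module

          def unload(self, name):
              self._components.pop(name, None)

          def subscribe(self, channel, callback):
              self._subscribers.setdefault(channel, []).append(callback)

          def publish(self, channel, message):
              # Channeling of inter-component communication.
              for callback in self._subscribers.get(channel, []):
                  callback(message)

      bus = Bus()
      json_component = bus.load("json")  # any importable module will do
      bus.subscribe("events", lambda msg: print("got:", msg))
      bus.publish("events", json_component.dumps({"status": "ok"}))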

  10. Electrochemical sensor based on EDTA intercalated into layered double hydroxides of magnesium and aluminum for ultra trace level detection of lead (II)

    International Nuclear Information System (INIS)

    Dong, Junping; Fang, Qinghua; He, Haibo; Xu, Jiaqiang; Zhang, Yuan; Sun, Youbao

    2015-01-01

    The chelator ethylenediaminetetraacetate (EDTA) has been intercalated into layered double hydroxides by the anion exchange method. The resulting composites were characterized by powder X-ray diffraction, FTIR spectroscopy, thermogravimetry and X-ray photoelectron spectrometry. They were applied to modify a carbon paste electrode for the stripping voltammetric determination of lead (II) ions at ng/L levels. Stripping currents are linearly related to the logarithm of the Pb (II) concentration from 2 ng/L to 33 μg/L. The detection limit (3σ) is as low as 0.95 ng/L. The method was successfully applied to the determination of Pb (II) in spiked tap water without any pretreatment. (author)
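
    The calibration arithmetic implied by the abstract — stripping current linear in log10 of concentration, with the detection limit taken at 3σ of the blank — can be sketched as follows; the standard points and blank statistics below are invented placeholders, not data from the paper:

      import math

      # Hypothetical standards: (Pb concentration in ng/L, stripping current).
      standards = [(2, 1.1), (20, 2.0), (200, 3.1), (2000, 4.0), (33000, 5.2)]
      xs = [math.log10(c) for c, _ in standards]
      ys = [i for _, i in standards]
      n = len(xs)

      # Least-squares fit of i = slope * log10(c) + intercept.
      x_mean, y_mean = sum(xs) / n, sum(ys) / n
      slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) \
              / sum((x - x_mean) ** 2 for x in xs)
      intercept = y_mean - slope * x_mean

      def concentration(current):
          # Invert the calibration curve to read a concentration back out.
          return 10 ** ((current - intercept) / slope)

      blank_mean, blank_sigma = 1.0, 0.01        # assumed blank statistics
      lod = concentration(blank_mean + 3 * blank_sigma)  # 3-sigma limit
      print(f"LOD ~ {lod:.2f} ng/L")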

  11. Software development an open source approach

    CERN Document Server

    Tucker, Allen; de Silva, Chamindra

    2011-01-01

    Overview and Motivation; Software; Free and Open Source Software (FOSS); Two Case Studies; Working with a Project Team; Key FOSS Activities; Client-Oriented vs. Community-Oriented Projects; Working on a Client-Oriented Project; Joining a Community-Oriented Project; Using Project Tools; Collaboration Tools; Code Management Tools; Run-Time System Constraints; Software Architecture; Architectural Patterns; Layers, Cohesion, and Coupling; Security; Concurrency, Race Conditions, and Deadlocks; Working with Code; Bad Smells and Metrics; Refactoring; Testing; Debugging; Extending the Software for a New Project; Developing the D

  12. Process-based software project management

    CERN Document Server

    Goodman, F Alan

    2006-01-01

    Not connecting software project management (SPM) to actual, real-world development processes can lead to a complete divorce of SPM from software engineering, which can undermine any software project. By explaining how a layered process architectural model improves operational efficiency, Process-Based Software Project Management outlines a new method that is more effective than the traditional one when dealing with SPM. With a clear and easy-to-read approach, the book discusses the benefits of an integrated project management-process management connection. The described tight coup

  13. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component of software evolution. It requires a huge amount of time, manpower, and financial resources. The challenges are the size, seniority, and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method that leads to a synchronization between source code and design. As a result, the model layer will be the central part in further e

  14. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically-oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...
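
    A minimal example of the book's core idea — sophisticated measurements from an economical, calibrated recording chain plus software — is sketched below, assuming the recorder's full scale has been referenced against a calibrator tone of known level (the 94 dB figure is an assumed placeholder, not a value from the book):

      import math

      def leq_db_spl(samples, full_scale_db=94.0):
          """Equivalent level of normalized samples in the range -1.0..1.0.

          full_scale_db: the SPL a full-scale sine would produce, obtained by
          recording a calibrator tone of known level (assumed value here).
          """
          rms = math.sqrt(sum(s * s for s in samples) / len(samples))
          full_scale_rms = 1.0 / math.sqrt(2.0)  # RMS of a full-scale sine
          return full_scale_db + 20.0 * math.log10(rms / full_scale_rms)

      # A 1 kHz tone at half amplitude should read ~6 dB below full scale.
      tone = [0.5 * math.sin(2 * math.pi * 1000 * n / 48000)
              for n in range(48000)]
      print(round(leq_db_spl(tone), 1))  # ~88.0 dB SPL under the assumption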

  15. FPGAs for software programmers

    CERN Document Server

    Hannig, Frank; Ziener, Daniel

    2016-01-01

    This book makes powerful Field Programmable Gate Array (FPGA) and reconfigurable technology accessible to software engineers by covering different state-of-the-art high-level synthesis approaches (e.g., OpenCL and several C-to-gates compilers). It introduces FPGA technology, its programming model, and how various applications can be implemented on FPGAs without going through low-level hardware design phases. Readers will get a realistic sense for problems that are suited for FPGAs and how to implement them from a software designer’s point of view. The authors demonstrate that FPGAs and their programming model reflect the needs of stream processing problems much better than traditional CPU or GPU architectures, making them well-suited for a wide variety of systems, from embedded systems performing sensor processing to large setups for Big Data number crunching. This book serves as an invaluable tool for software designers and FPGA design engineers who are interested in high design productivity through behavi...

  16. Effects of dietary trace mineral sources and levels fed to layers in their second laying cycle on the quality of eggs stored at different temperatures and for different periods

    Directory of Open Access Journals (Sweden)

    ESPB Saldanha

    2010-12-01

    Full Text Available This study aimed at evaluating the effects of trace mineral levels and sources supplemented to diets fed to semi-heavy layers in their second laying cycle on the quality of eggs stored for 14 days at different temperatures. The experimental diets consisted of the inclusion of inorganic trace minerals (T1 - control: 100% ITM) and five supplementation levels of organic trace minerals (carboaminophospho chelates; 110, 100, 90, 80, and 70% OTM). Trace mineral inclusion levels (mg/kg feed) were: T1 - control, 100% ITM: Zn (54), Fe (54), Mn (72), Cu (10), I (0.61), Se (0.3); T2 - 110% OTM: Zn (59.4), Fe (59.4), Mn (79.2), Cu (11.88), I (1.21), Se (0.59); T3 - 100% OTM: Zn (54), Fe (54), Mn (72), Cu (10.8), I (1.10), Se (0.54); T4 - 90% OTM: Zn (48.6), Fe (48.6), Mn (64.8), Cu (9.72), I (0.99), Se (0.49); T5 - 80% OTM: Zn (43.2), Fe (43.2), Mn (57.6), Cu (8.64), I (0.88), Se (0.43); T6 - 70% OTM: Zn (37.8), Fe (37.8), Mn (50.4), Cu (7.56), I (0.77), Se (0.38). A completely randomized experimental design in a split-plot arrangement with 60 treatments of four replicates each was applied. The combination of six diets versus storage temperature (room or under refrigeration) was randomized in plots, whereas the sub-plots consisted of storage times (0, 3, 7, 10, and 14 days). Data were submitted to analysis of variance of a model in split-plots in time using the software package SAS (2000) at the 5% probability level. It was concluded that 70% OTM supplementation can be used with no damage to egg quality, independently of storage temperature or time. The quality of refrigerated eggs stored up to 14 days is better than that of eggs stored at room temperature.

  17. ESTSC - Software Best Practices

    Science.gov (United States)

    DOE Scientific and Technical Software Best Practices, December 2010. Table of Contents: 1.0 Introduction; 2.0 Responsibilities; 2.1 OSTI/ESTSC; 2.2 SIACs; 2.3 Software Submitting Sites/Creators; 2.4 Software Sensitivity Review; 3.0 Software Announcement and Submission; 3.1 STI Software Appropriate for Announcement; 3.2

  18. Software Past, Present, and Future: Views from Government, Industry and Academia

    Science.gov (United States)

    Holcomb, Lee; Page, Jerry; Evangelist, Michael

    2000-01-01

    Views from the NASA CIO NASA Software Engineering Workshop on software development from the past, present, and future are presented. The topics include: 1) Software Past; 2) Software Present; 3) NASA's Largest Software Challenges; 4) 8330 Software Projects in Industry Standish Groups 1994 Report; 5) Software Future; 6) Capability Maturity Model (CMM): Software Engineering Institute (SEI) levels; 7) System Engineering Quality Also Part of the Problem; 8) University Environment Trends Will Increase the Problem in Software Engineering; and 9) NASA Software Engineering Goals.

  19. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  20. Model-integrating software components engineering flexible software systems

    CERN Document Server

    Derakhshanmanesh, Mahdi

    2015-01-01

    In his study, Mahdi Derakhshanmanesh builds on the state of the art in modeling by proposing to integrate models into running software at the component level without translating them to code. Such so-called model-integrating software exploits all the advantages of models: models implicitly support a good separation of concerns, they are self-documenting and thus improve understandability and maintainability, and, in contrast to model-driven approaches, there is no longer a synchronization problem between the models and the code generated from them. Using model-integrating components, software will be

  1. Software quality metrics aggregation in industry

    NARCIS (Netherlands)

    Mordal, K.; Anquetil, N.; Laval, J.; Serebrenik, A.; Vasilescu, B.N.; Ducasse, S.

    2013-01-01

    With the growing need for quality assessment of entire software systems in the industry, new issues are emerging. First, because most software quality metrics are defined at the level of individual software components, there is a need for aggregation methods to summarize the results at the system
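
    One way to see why aggregation is a genuine problem, sketched below with invented data: a plain mean hides a single bad component, so aggregation schemes deliberately let low marks dominate the system-level score. The weighted soft-min style function here is purely illustrative, not the method of the paper:

      import math

      # Per-component quality marks on a 0..10 scale (hypothetical data).
      components = {"parser": 9.0, "core": 3.0, "ui": 8.5, "io": 9.5}

      def naive_mean(marks):
          return sum(marks) / len(marks)

      def hard_mean(marks, k=2.0):
          """Soft-min style aggregation: exponentially up-weight low marks so
          a single poor component visibly drags the system score down."""
          weights = [math.exp(-k * m / 10.0) for m in marks]
          return sum(w * m for w, m in zip(weights, marks)) / sum(weights)

      marks = list(components.values())
      print(round(naive_mean(marks), 2))  # 7.5 -- the weak 'core' barely shows
      print(round(hard_mean(marks), 2))   # ~5.8 -- weak component dominates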

  2. TOUGH2 software qualification

    International Nuclear Information System (INIS)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2

  3. TOUGH2 software qualification

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  4. Knowledge-Based Software Management

    International Nuclear Information System (INIS)

    Sally Schaffner; Matthew Bickley; Brian Bevins; Leon Clancy; Karen White

    2003-01-01

    Management of software in a dynamic environment such as is found at Jefferson Lab can be a daunting task. Software development tasks are distributed over a wide range of people with varying skill levels. The machine configuration is constantly changing requiring upgrades to software at both the hardware control level and the operator control level. In order to obtain high quality support from vendor service agreements, which is vital to maintaining 24/7 operations, hardware and software must be kept at industry's current levels. This means that periodic upgrades independent of machine configuration changes must take place. It is often difficult to identify and organize the information needed to guide the process of development, upgrades and enhancements. Dependencies between support software and applications need to be consistently identified to prevent introducing errors during upgrades and to allow adequate testing to be planned and performed. Developers also need access to information regarding compilers, make files and organized distribution directories. This paper describes a system under development at Jefferson Lab which will provide software developers and managers this type of information in a timely user-friendly fashion. The current status and future plans for the system will be detailed

  5. Service-oriented Software Defined Optical Networks for Cloud Computing

    Science.gov (United States)

    Liu, Yuze; Li, Hui; Ji, Yuefeng

    2017-10-01

    With the development of big data and cloud computing technology, the traditional software-defined network is facing new challenges (e.g., ubiquitous accessibility, higher bandwidth, more flexible management and greater security). This paper proposes a new service-oriented software defined optical network architecture, comprising a resource layer, a service abstraction layer, a control layer and an application layer. We then describe the corresponding service provisioning method, in which a distinct service ID identifies each service a device can offer. Finally, we experimentally demonstrate that the proposed service provisioning method can be applied to transmit different services based on the service ID in the service-oriented software defined optical network.
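
    The service-ID mechanism described in the abstract can be sketched as a dispatch table; the IDs, class name, and handlers below are invented for illustration and do not reproduce the paper's protocol:

      from typing import Callable, Dict

      class ServiceAbstractionLayer:
          """Maps each service ID to the handler that can provide it."""

          def __init__(self):
              self._handlers: Dict[int, Callable[[str], str]] = {}

          def register(self, service_id: int, handler: Callable[[str], str]):
              # A device advertises the service it can offer by its ID.
              self._handlers[service_id] = handler

          def provide(self, service_id: int, request: str) -> str:
              try:
                  return self._handlers[service_id](request)
              except KeyError:
                  raise LookupError(f"no device offers service {service_id}")

      sal = ServiceAbstractionLayer()
      sal.register(0x01, lambda req: f"bulk transfer path set up for {req}")
      sal.register(0x02, lambda req: f"low-latency path set up for {req}")
      print(sal.provide(0x01, "datacenter-A -> datacenter-B"))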

  6. Deep level defects in Ge-doped (010) β-Ga2O3 layers grown by plasma-assisted molecular beam epitaxy

    Science.gov (United States)

    Farzana, Esmat; Ahmadi, Elaheh; Speck, James S.; Arehart, Aaron R.; Ringel, Steven A.

    2018-04-01

    Deep level defects were characterized in Ge-doped (010) β-Ga2O3 layers grown by plasma-assisted molecular beam epitaxy (PAMBE) using deep level optical spectroscopy (DLOS) and deep level transient (thermal) spectroscopy (DLTS) applied to Ni/β-Ga2O3:Ge (010) Schottky diodes that displayed Schottky barrier heights of 1.50 eV. DLOS revealed states at E_C - 2.00 eV, E_C - 3.25 eV, and E_C - 4.37 eV with concentrations on the order of 10^16 cm^-3, and a lower concentration level at E_C - 1.27 eV. In contrast to these states within the middle and lower parts of the bandgap probed by DLOS, DLTS measurements revealed much lower concentrations of states within the upper bandgap region at E_C - 0.1 - 0.2 eV and E_C - 0.98 eV. There was no evidence of the commonly observed trap state at ~E_C - 0.82 eV that has been reported to dominate the DLTS spectrum in substrate materials synthesized by melt-based growth methods such as edge-defined film-fed growth (EFG) and Czochralski methods [Zhang et al., Appl. Phys. Lett. 108, 052105 (2016) and Irmscher et al., J. Appl. Phys. 110, 063720 (2011)]. This strong sensitivity of defect incorporation to crystal growth method and conditions is unsurprising, and for PAMBE-grown β-Ga2O3:Ge it manifests as a relatively "clean" upper part of the bandgap. However, the states at ~E_C - 0.98 eV, E_C - 2.00 eV, and E_C - 4.37 eV are reminiscent of similar findings from these earlier results on EFG-grown materials, suggesting that possible common sources might also be present irrespective of growth method.

  7. Software Architecture Reconstruction Method, a Survey

    OpenAIRE

    Zainab Nayyar; Nazish Rafique

    2014-01-01

    Architecture reconstruction belongs to the reverse engineering process, in which we move from code to the architecture level in order to reconstruct the architecture. Software architectures are the blueprints of projects, depicting the external overview of the software system. Maintenance and testing often cause the software to deviate from its original architecture, because sometimes, to enhance the functionality of a system, the software departs from its documented specifications; some new modules a...

  8. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; increasing reliability by means of software redundancy methods; maintenance of software for long-term operating behavior. (HP) [de]

  9. NASA Software Engineering Benchmarking Study

    Science.gov (United States)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    was its software assurance practices, which seemed to rate well in comparison to the other organizational groups and also seemed to include a larger scope of activities. An unexpected benefit of the software benchmarking study was the identification of many opportunities for collaboration in areas including metrics, training, sharing of CMMI experiences and resources such as instructors and CMMI Lead Appraisers, and even sharing of assets such as documented processes. A further unexpected benefit of the study was the feedback on NASA practices that was received from some of the organizations interviewed. From that feedback, other potential areas where NASA could improve were highlighted, such as accuracy of software cost estimation and budgetary practices. The detailed report contains discussion of the practices noted in each of the topic areas, as well as a summary of observations and recommendations from each of the topic areas. The resulting 24 recommendations from the topic areas were then consolidated to eliminate duplication and culled into a set of 14 suggested actionable recommendations. This final set of actionable recommendations, listed below, are items that can be implemented to improve NASA's software engineering practices and to help address many of the items that were listed in the NASA top software engineering issues. 1. Develop and implement standard contract language for software procurements. 2. Advance accurate and trusted software cost estimates for both procured and in-house software and improve the capture of actual cost data to facilitate further improvements. 3. Establish a consistent set of objectives and expectations, specifically types of metrics at the Agency level, so key trends and models can be identified and used to continuously improve software processes and each software development effort. 4. Maintain the CMMI Maturity Level requirement for critical NASA projects and use CMMI to measure organizations developing software for NASA. 5

  10. The Ragnarok Architectural Software Configuration Management Model

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1999-01-01

    The architecture is the fundamental framework for designing and implementing large scale software, and the ability to trace and control its evolution is essential. However, many traditional software configuration management tools view 'software' merely as a set of files, not as an architecture. This introduces an unfortunate impedance mismatch between the design domain (architecture level) and the configuration management domain (file level). This paper presents a software configuration management model that allows tight version control and configuration management of the architecture of a software system...
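
    To make the architecture-versus-file distinction concrete, here is a hedged sketch (invented names, not Ragnarok's actual model) in which a check-in versions an architectural component and pins the exact versions of its children, so a configuration is identified by one version of the root rather than by a set of file revisions:

      from dataclasses import dataclass
      from typing import Tuple

      @dataclass(frozen=True)
      class ComponentVersion:
          name: str
          version: int
          # Pinned versions of architectural children, not loose files.
          children: Tuple["ComponentVersion", ...] = ()

      def check_in(old: ComponentVersion,
                   new_children: Tuple[ComponentVersion, ...]) -> ComponentVersion:
          """Changing any child produces a new version of the parent, so the
          architecture's evolution is traceable from the root alone."""
          return ComponentVersion(old.name, old.version + 1, new_children)

      parser_v1 = ComponentVersion("parser", 1)
      core_v1 = ComponentVersion("core", 1, (parser_v1,))
      parser_v2 = ComponentVersion("parser", 2)
      core_v2 = check_in(core_v1, (parser_v2,))
      print(core_v2)  # core v2 pins parser v2; core v1 still pins parser v1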

  11. Persistent photoconductivity in AlGaN/GaN heterojunction channels caused by the ionization of deep levels in the AlGaN barrier layer

    International Nuclear Information System (INIS)

    Murayama, H.; Akiyama, Y.; Niwa, R.; Sakashita, H.; Sakaki, H.; Kachi, T.; Sugimoto, M.

    2013-01-01

    Time-dependent responses of the drain current (I_d) in an AlGaN/GaN HEMT under UV (3.3 eV) and red (2.0 eV) light illumination have been studied at 300 K and 250 K. UV illumination enhances I_d by about 10%, indicating that the density of two-dimensional electrons is raised by about 10^12 cm^-2. When the UV light is turned off at 300 K, part of the increased I_d decays quickly but the other part of the increment is persistent, showing a slow decay. At 250 K, the majority of the increment remains persistent. It is found that such a persistent increase of I_d at 250 K can be partially erased by illumination with red light. These photo-responses are explained by a simple band-bending model in which deep levels in the AlGaN barrier become positively charged by the UV light, resulting in a parabolic band bending in the AlGaN layer, while some portion of those deep levels are neutralized by the red light

  12. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  13. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  14. The Prototype Plume Busters Software: A New Tool for Exploring Issues Related to Environmental Policy in Undergraduate-level Earth and Environmental Science Courses

    Science.gov (United States)

    Macfarlane, P. A.

    2006-12-01

    Students seldom have an opportunity to explore the issues related to the environmental impact of contamination on water resources. With NSF support we have developed the prototype Plume Busters, in which students take on the role of an environmental consultant. The software consists of an interactive, Java application and accompanying HTML linked pages. Following a pipeline spill, the environmental consultant is hired by the pipeline owner to locate the resulting plume created by the spill and remediate the contaminated aquifer at minimum monetary and time cost. The contamination must be removed from the aquifer before it reaches the river and eventually a downstream public water supply. The application simulates movement of a plume from a pipeline break through a shallow alluvial aquifer towards the river upstream from a municipal water supply intake. To locate the plume, the student places observation wells on a gridded map of the study area and the simulation returns the contaminant concentrations at those locations on the appropriate sample dates. Once the plume is located, the student is able to site pumping and injection wells on the map for aquifer remediation using a simple pump-and-treat technique. The simulation then computes the movement of particles to the pumping wells and returns the cumulative mass removed by the production remediation well. Plume Busters also provides teachers with a means to initiate student exploration of a wide range of environmental issues, including (1) source-water assessment and ground-water and wellhead protection zones, (2) the impact of human activities and technology on the hydrosphere and the biosphere, (3) the role of technology in the resolution of environmental issues (4) legal, social, political, and economic implications of environmental issues, and (5) risk assessment resulting from human activities.

  15. Software Quality Assurance for Nuclear Safety Systems

    International Nuclear Information System (INIS)

    Sparkman, D R; Lagdon, R

    2004-01-01

    The US Department of Energy has undertaken an initiative to improve the quality of software used to design and operate their nuclear facilities across the United States. One aspect of this initiative is to revise or create new directives and guides associated with quality practices for the safety software in its nuclear facilities. Safety software includes the safety structures, systems, and components software and firmware, support software and design and analysis software used to ensure the safety of the facility. DOE nuclear facilities are unique when compared to commercial nuclear or other industrial activities in terms of the types and quantities of hazards that must be controlled to protect workers, public and the environment. Because of these differences, DOE must develop an approach to software quality assurance that ensures appropriate risk mitigation by developing a framework of requirements that accomplishes the following goals: • Ensures the software processes developed to address nuclear safety in design, operation, construction and maintenance of its facilities are safe • Considers the larger system that uses the software and its impacts • Ensures that the software failures do not create unsafe conditions. Software designers for nuclear systems and processes must reduce risks in software applications by incorporating processes that recognize, detect, and mitigate software failure in safety related systems. It must also ensure that fail safe modes and component testing are incorporated into software design. For nuclear facilities, the consideration of risk is not necessarily sufficient to ensure safety. Systematic evaluation, independent verification and system safety analysis must be considered for software design, implementation, and operation. The software industry primarily uses risk analysis to determine the appropriate level of rigor applied to software practices. This risk-based approach distinguishes safety

  16. A Smart Layer For Remote Laboratories

    Directory of Open Access Journals (Sweden)

    Ricardo J. Costa

    2007-08-01

    Full Text Available Commonly, when a weblab is developed to support remote experiments in sciences and engineering courses, a particular hardware/software architecture is implemented. However, the existence of several technological solutions to implement those architectures hinders the emergence of a standard, both at hardware and software levels. While particular solutions are adopted assuming that only qualified people may implement a weblab, the control of the physical space and the power consumption are often forgotten. Since controlling these two aspects may increase the quality of the weblab hosting the remote experiments, this paper proposes the use of a new layer implemented by a domotic system bus with several devices (e.g. lights, power sockets, temperature sensors, and others) able to be controlled through the Internet. We also provide a brief proof-of-concept in the form of a weblab equipped with a simple domotic system usually implemented in smart houses. The added value to the remote experiment hosted at the weblab is also identified in terms of power savings and environment conditions.
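
    As a rough illustration of the proposed smart layer, the sketch below puts a thin weblab-facing API in front of a domotic bus gateway. The class names, addresses, and frame format are invented for the example; a real installation would replace DomoticBus.send with the driver for the actual bus protocol:

        class DomoticBus:
            """Stand-in for a smart-house bus gateway (addresses are invented)."""
            def __init__(self):
                self.state = {}
            def send(self, address, value):
                self.state[address] = value  # pretend the frame reached the device
                print(f"bus frame -> {address} = {value}")
            def read(self, address):
                return self.state.get(address)

        class WeblabEnvironment:
            """The smart layer: power and environment control around an experiment."""
            def __init__(self, bus):
                self.bus = bus
            def power(self, socket_id, on):
                self.bus.send(f"socket/{socket_id}", "ON" if on else "OFF")
            def lights(self, on):
                self.bus.send("lights/main", "ON" if on else "OFF")
            def temperature(self):
                return self.bus.read("sensor/temperature")

        env = WeblabEnvironment(DomoticBus())
        env.power("rig-1", on=True)   # power the rig only while a session is booked
        env.lights(on=False)          # remote users do not need the room lit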

  17. Perceptions of community and family level IDU and HIV related stigma, disclosure decisions and experiences with layered stigma among HIV positive injection drug users in Vietnam

    OpenAIRE

    Rudolph, A.E.; Davis, W.W.; Quan, V.M.; Ha, T.V.; Minh, N.L.; Gregowski, A.; Salter, Megan; Celentano, D.D.; Go, V.

    2011-01-01

    This paper explores how perceived stigma and layered stigma related to injection drug use and being HIV positive influence the decision to disclose one’s HIV status to family and community and experiences with stigma following disclosure among a population of HIV positive male injection drug users (IDUs) in Thai Nguyen, Vietnam. In qualitative interviews conducted between 2007 and 2008, 25 HIV positive male IDUs described layered stigma in their community but an absence of layered stigma with...

  18. Antenna Controller Replacement Software

    Science.gov (United States)

    Chao, Roger Y.; Morgan, Scott C.; Strain, Martha M.; Rockwell, Stephen T.; Shimizu, Kenneth J.; Tehrani, Barzia J.; Kwok, Jaclyn H.; Tuazon-Wong, Michelle; Valtier, Henry; Nalbandi, Reza

    2010-01-01

    The Antenna Controller Replacement (ACR) software accurately points and monitors the Deep Space Network (DSN) 70-m and 34-m high-efficiency (HEF) ground-based antennas that are used primarily to track spacecraft and, periodically, celestial targets. To track a spacecraft, or other targets, the antenna must be accurately pointed at the spacecraft, which can be very far away with very weak signals. ACR's conical scanning capability collects the signal in a circular pattern around the target, calculates the location of the strongest signal, and adjusts the antenna pointing to point directly at the spacecraft. A real-time, closed-loop servo control algorithm executed every 0.02 second allows accurate positioning of the antenna in order to track these distant spacecraft. Additionally, this advanced servo control algorithm provides better antenna pointing performance in windy conditions. The ACR software offers high-level commands that give the DSN operator a very simple interface: the operator only needs to enter two commands to start antenna, subreflector, and Master Equatorial tracking. The most accurate antenna pointing is accomplished by aligning the antenna to the Master Equatorial, which, because of its small size and sheltered location, has the most stable pointing. The antenna has hundreds of digital and analog monitor points. The ACR software provides compact displays to summarize the status of the antenna, subreflector, and the Master Equatorial. The ACR software has two major functions. First, it performs all of the steps required to accurately point the antenna (and subreflector and Master Equatorial) at the spacecraft (or celestial target). This involves controlling the antenna/subreflector/Master-Equatorial hardware, initiating and monitoring the correct sequence of operations, calculating the position of the spacecraft relative to the antenna, executing the real-time servo control algorithm to maintain the correct position, and
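
    The conical-scan step amounts to estimating the first harmonic of the received power over one scan revolution. The sketch below illustrates only that idea, with an invented gain calibration and sample count; it is not the ACR implementation:

        import math

        def conscan_correction(angles, powers, gain=1.0):
            """Fit powers ~ p0 + A*cos(theta) + B*sin(theta); the first harmonic
            points toward the true target. `gain` (an assumed calibration)
            converts modulation amplitude into a pointing correction."""
            n = len(angles)
            a = 2.0 / n * sum(p * math.cos(t) for t, p in zip(angles, powers))
            b = 2.0 / n * sum(p * math.sin(t) for t, p in zip(angles, powers))
            return gain * a, gain * b

        # Simulated revolution: target actually offset by (0.3, -0.1) beam units.
        thetas = [2 * math.pi * i / 32 for i in range(32)]
        power = [10.0 + 0.3 * math.cos(t) - 0.1 * math.sin(t) for t in thetas]
        print(conscan_correction(thetas, power))  # approx (0.3, -0.1)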

  19. Software for Optimizing Quality Assurance of Other Software

    Science.gov (United States)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
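
    A minimal instance of this optimization is a greedy, knapsack-style selection of assurance activities by risk reduction per unit cost. The activities, costs, and scores below are invented, and the tool described here solves a much richer model, but the sketch shows the shape of the problem:

        def plan_assurance(activities, budget):
            """Greedy pick by risk reduction per unit cost until budget runs out."""
            chosen, spent, reduced = [], 0.0, 0.0
            ranked = sorted(activities, key=lambda a: a[2] / a[1], reverse=True)
            for name, cost, reduction in ranked:
                if spent + cost <= budget:
                    chosen.append(name)
                    spent += cost
                    reduced += reduction
            return chosen, spent, reduced

        activities = [  # (activity, cost in staff-days, risk-reduction score)
            ("code inspection", 10, 8.0),
            ("unit tests", 15, 9.0),
            ("design review", 5, 4.5),
            ("traceability matrix", 8, 3.0),
            ("performance analysis", 12, 5.0),
        ]
        print(plan_assurance(activities, budget=30))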

  20. Teaching Empirical Software Engineering Using Expert Teams

    DEFF Research Database (Denmark)

    Kuhrmann, Marco

    2017-01-01

    Empirical software engineering aims at making software engineering claims measurable, i.e., to analyze and understand phenomena in software engineering and to evaluate software engineering approaches and solutions. Due to the involvement of humans and the multitude of fields for which software...... is crucial, software engineering is considered hard to teach. Yet, empirical software engineering increases this difficulty by adding the scientific method as extra dimension. In this paper, we present a Master-level course on empirical software engineering in which different empirical instruments...... an extra specific expertise that they offer as service to other teams, thus, fostering cross-team collaboration. The paper outlines the general course setup, topics addressed, and it provides initial lessons learned....

  1. Application of Formal Methods in Software Engineering

    Directory of Open Access Journals (Sweden)

    Adriana Morales

    2011-12-01

    Full Text Available The purpose of this research work is to examine: (1) why formal methods are necessary for software systems today; (2) high-integrity systems through the C-by-C (Correctness-by-Construction) methodology; and (3) an affordable methodology to apply formal methods in software engineering. The research process included reviews of the literature through the Internet, in publications, and in presentations at events. Among the research results it was found that: (1) the dependence of nations, companies, and people on software systems is increasing; (2) there is growing demand for software engineering to increase social trust in software systems; (3) methodologies exist, such as C-by-C, that can provide that level of trust; (4) formal methods constitute a principle of computer science that can be applied in software engineering to perform reliable processes in software development; (5) software users have the responsibility to demand reliable software products; and (6) software engineers have the responsibility to develop reliable software products. Furthermore, it is concluded that: (1) more research is needed to identify and analyze other methodologies and tools that provide processes to apply formal software engineering methods; (2) formal methods provide an unprecedented ability to increase trust in the exactitude of software products; and (3) through the development of new methodologies and tools, costs are no longer a disadvantage for the application of formal methods.

  2. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    Full Text Available At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  3. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes that there is an enormous amount of software available for use by teachers of reading and literacy: whereas drill-and-practice software is the largest category of software available, large numbers of…

  4. Ensuring Software IP Cleanliness

    OpenAIRE

    Mahshad Koohgoli; Richard Mayer

    2007-01-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  5. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

    Reference-list fragments: Eckhardt, D.E., and L.D. Lee. 1985. A theoretical basis for the analysis of multiversion software subject to coincident errors. IEEE Trans. Software Eng. SE-11:1511-1517. Eckhardt, D.E., A.K. Caglayan, J.C. Knight, L.D. Lee, D.F… Knight, J.C., and N.G. Leveson. 1986. Experimental evaluation of the assumption of independence in multiversion software. IEEE Trans. Software

  6. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people-oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  7. Software - Naval Oceanography Portal

    Science.gov (United States)

    USNO Earth Orientation software page from the Naval Oceanography Portal, listing Auxiliary Software, Supporting Software, and an Earth Orientation Matrix Calculator alongside the Earth Orientation products (GPS-based and VLBI-based).

  8. Software Engineering Education Directory

    Science.gov (United States)

    1990-04-01

    …and Engineering (CMSC 735) Codes: GPEV2 * Textbooks: IEEE Tutorial on Models and Metrics for Software Management and Engineering by Basili, Victor R… Software Engineering (Comp 227) Codes: GPRY5 Textbooks: IEEE Tutorial on Software Design Techniques by Freeman, Peter and Wasserman, Anthony 1. Software

  9. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry's most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  10. Layered distributed simulation architecture to support the C2 enterprise

    CSIR Research Space (South Africa)

    Duvenhage, A

    2009-09-01

    Full Text Available between these systems and that a capability is required to demonstrate, support and evaluate interoperability. This paper discusses the layered software architecture of a C++ software application framework for developing applications that support...

  11. Software for the LHCb experiment

    CERN Document Server

    Corti, Gloria; Belyaev, Ivan; Cattaneo, Marco; Charpentier, Philippe; Frank, Markus; Koppenburg, Patrick; Mato-Vila, P; Ranjard, Florence; Roiser, Stefan

    2006-01-01

    LHCb is an experiment for precision measurements of CP-violation and rare decays in B mesons at the LHC collider at CERN. The LHCb software development strategy follows an architecture-centric approach as a way of creating a resilient software framework that can withstand changes in requirements and technology over the expected long lifetime of the experiment. The software architecture, called GAUDI, supports event data processing applications that run in different processing environments ranging from the real-time high-level triggers in the online system to the final physics analysis performed by more than one hundred physicists. The major architectural design choices and the arguments that lead to these choices will be outlined. Object oriented technologies have been used throughout. Initially developed for the LHCb experiment, GAUDI has been adopted and extended by other experiments. Several iterations of the GAUDI software framework have been released and are now being used routinely by the physicists of...

  12. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action.

    Science.gov (United States)

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels: from individual dynamics, to dyadic dynamics, up to global group-level dynamics for groups of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures.
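
    The central object in any RQA variant is the recurrence matrix of the (here multidimensional) signal. A minimal sketch follows, in Python rather than the paper's MATLAB, with a toy radius and toy group signals; the published implementation derives several further measures (determinism, laminarity, and so on) from this matrix:

        import numpy as np

        def recurrence_rate(signals, radius):
            """signals: array of shape (time, dimensions). Builds the joint-space
            recurrence matrix and returns its density (the recurrence rate)."""
            x = np.asarray(signals, dtype=float)
            d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=2)
            return (d < radius).mean()

        t = np.linspace(0, 8 * np.pi, 400)
        group = np.column_stack([np.sin(t), np.sin(t + 0.3), np.cos(t)])  # 3 "members"
        print(recurrence_rate(group, radius=0.5))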

  13. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

    The field of testability is an active, well-established part of engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or degrade the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the softwa...

  14. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  15. Software as a service approach to sensor simulation software deployment

    Science.gov (United States)

    Webster, Steven; Miller, Gordon; Mayott, Gregory

    2012-05-01

    Traditionally, military simulation has been problem-domain specific. Executing an exercise currently requires multiple simulation software providers to specialize, deploy, and configure their respective implementations, integrate the collection of software to achieve a specific system behavior, and then execute for the purpose at hand. This approach leads to rigid system integrations which require simulation expertise for each deployment due to changes in location, hardware, and software. Our alternative is Software as a Service (SaaS), predicated on the virtualization of Night Vision and Electronic Sensors Directorate (NVESD) sensor simulations as an exemplary case. Management middleware elements layer self-provisioning, configuration, and integration services onto the virtualized sensors to present a system of services at run time. Given an Infrastructure as a Service (IaaS) environment, the enabled and managed system of simulations yields durable SaaS delivery without requiring user simulation expertise. Persistent SaaS simulations would provide on-demand availability to connected users, decrease integration costs and timelines, and give the domain community the benefit of immediate deployment of lessons learned.

  16. Software engineering from a Langley perspective

    Science.gov (United States)

    Voigt, Susan

    1994-01-01

    A brief introduction to software engineering is presented. The talk is divided into four sections beginning with the question 'What is software engineering', followed by a brief history of the progression of software engineering at the Langley Research Center in the context of an expanding computing environment. Several basic concepts and terms are introduced, including software development life cycles and maturity levels. Finally, comments are offered on what software engineering means for the Langley Research Center and where to find more information on the subject.

  17. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  18. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  19. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  20. Improving network management with Software Defined Networking

    International Nuclear Information System (INIS)

    Dzhunev, Pavel

    2013-01-01

    Software-defined networking (SDN) was developed as an alternative to closed networks in data processing centers by providing a means to separate the control layer from the data layer in switches and routers. SDN introduces new possibilities for network management and configuration methods. In this article, we identify problems with the current state-of-the-art network configuration and management mechanisms and introduce mechanisms to improve various aspects of network management.
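
    The separation SDN introduces can be pictured as a match-action flow table that a remote controller populates. The toy model below is not OpenFlow, and the field names are invented; it only shows a switch forwarding on rule hits and deferring table misses to the control layer:

        from dataclasses import dataclass, field

        @dataclass
        class FlowRule:
            match: dict      # header fields to match, e.g. {"dst": "10.0.0.2"}
            action: str      # e.g. "forward:port2" or "drop"
            priority: int = 0

        @dataclass
        class Switch:
            rules: list = field(default_factory=list)
            def install(self, rule):  # southbound call from the controller
                self.rules.append(rule)
                self.rules.sort(key=lambda r: -r.priority)
            def handle(self, packet):
                for r in self.rules:
                    if all(packet.get(k) == v for k, v in r.match.items()):
                        return r.action
                return "send-to-controller"  # table miss: ask the control layer

        sw = Switch()
        sw.install(FlowRule({"dst": "10.0.0.2"}, "forward:port2", priority=10))
        print(sw.handle({"src": "10.0.0.1", "dst": "10.0.0.2"}))  # forward:port2
        print(sw.handle({"src": "10.0.0.1", "dst": "10.0.0.9"}))  # send-to-controller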

  1. The iCub Software Architecture: evolution and lessons learned

    Directory of Open Access Journals (Sweden)

    Lorenzo eNatale

    2016-04-01

    Full Text Available The complexity of humanoid robots is increasing with the availability of new sensors, embedded CPUs and actuators. This wealth of technologies allows researchers to investigate new problems like whole-body force control, multi-modal human-robot interaction and sensory fusion. Under the hood of these robots, the software architecture has an important role: it allows researchers to get access to the robot functionalities focusing primarily on their research problems, it supports code reuse to minimize development and debugging, especially when new hardware becomes available. But more importantly it allows increasing the complexity of the experiments that can be implemented before system integration becomes unmanageable and debugging draws more resources than research itself. In this paper we illustrate the software architecture of the iCub humanoid robot and the software engineering best practices that have emerged driven by the needs of our research community. We describe the latest developments at the level of the middleware supporting interface definition and automatic code generation, logging, ROS compatibility and channel prioritization. We show the robot abstraction layer and how it has been modified to better address the requirements of the users and to support new hardware as it became available. We also describe the testing framework we have recently adopted for developing code using a test driven methodology. We conclude the paper discussing the lessons we have learned during the past eleven years of software development on the iCub humanoid robot.

  2. Fighting Software Piracy: Some Global Conditional Policy Instruments

    OpenAIRE

    Asongu, Simplice A; Singh, Pritam; Le Roux, Sara

    2016-01-01

    This study examines the efficiency of tools for fighting software piracy in the conditional distributions of software piracy. Our paper examines software piracy in 99 countries for the period 1994-2010, using contemporary and non-contemporary quantile regressions. The intuition for modelling distributions contingent on existing levels of software piracy is that the effectiveness of tools against piracy may consistently decrease or increase simultaneously with increasing levels of software pir...

  3. Instrument control software development process for the multi-star AO system ARGOS

    Science.gov (United States)

    Kulas, M.; Barl, L.; Borelli, J. L.; Gässler, W.; Rabien, S.

    2012-09-01

    The ARGOS project (Advanced Rayleigh guided Ground layer adaptive Optics System) will upgrade the Large Binocular Telescope (LBT) with an AO System consisting of six Rayleigh laser guide stars. This adaptive optics system integrates several control loops and many different components like lasers, calibration swing arms and slope computers that are dispersed throughout the telescope. The purpose of the instrument control software (ICS) is running this AO system and providing convenient client interfaces to the instruments and the control loops. The challenges for the ARGOS ICS are the development of a distributed and safety-critical software system with no defects in a short time, the creation of huge and complex software programs with a maintainable code base, the delivery of software components with the desired functionality and the support of geographically distributed project partners. To tackle these difficult tasks, the ARGOS software engineers reuse existing software like the novel middleware from LINC-NIRVANA, an instrument for the LBT, provide many tests at different functional levels like unit tests and regression tests, agree about code and architecture style and deliver software incrementally while closely collaborating with the project partners. Many ARGOS ICS components are already successfully in use in the laboratories for testing ARGOS control loops.

  4. ACTS: from ATLAS software towards a common track reconstruction software

    Science.gov (United States)

    Gumpert, C.; Salzburger, A.; Kiehn, M.; Hrdinka, J.; Calace, N.; ATLAS Collaboration

    2017-10-01

    Reconstruction of charged particles’ trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is developed with special emphasis on thread-safety to support parallel execution of the code and data structures are optimised for vectorisation to speed up linear algebra operations. The implementation is agnostic to the details of the detection technologies and magnetic field configuration which makes it applicable to many different experiments.

  5. Quantification of retinal layer thickness changes in acute macular neuroretinopathy

    DEFF Research Database (Denmark)

    Munk, Marion R.; Beck, Marco; Kolb, Simone

    2017-01-01

    Purpose To quantitatively evaluate retinal layer thickness changes in acute macular neuroretinopathy (AMN). Methods AMN areas were identified using near-infrared reflectance (NIR) images. Intraretinal layer segmentation using Heidelberg software was performed. The inbuilt ETDRS grid was moved on

  6. Programming Language Software For Graphics Applications

    Science.gov (United States)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  7. Software and Computing News

    CERN Multimedia

    Barberis, D

    The last several months have been very busy ones for the ATLAS software developers. They've been trying to cope with the competing demands of multiple software stress tests and testbeds. These include Data Challenge Two (DC2), the Combined Testbeam (CTB), preparations for the Physics Workshop to be held in Rome in June 2005, and other testbeds, primarily one for the High-Level Trigger. Data Challenge 2 (DC2) The primary goal of this was to validate the computing model and to provide a test of simulating a day's worth of ATLAS data (10 million events) and of fully processing it and making it available to the physicists within 10 days (i.e. a 10% scale test). DC2 consists of three parts - the generation, simulation, and mixing of a representative sample of physics events with background events; the reconstruction of the mixed samples with initial classification into the different physics signatures; and the distribution of the data to multiple remote sites (Tier-1 centers) for analysis by physicists. Figu...

  8. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  9. Spotting software errors sooner

    International Nuclear Information System (INIS)

    Munro, D.

    1989-01-01

    Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)

  10. Avionics and Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the AES Avionics and Software (A&S) project is to develop a reference avionics and software architecture that is based on standards and that can be...

  11. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square-feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  12. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets......, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation......, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  13. Process mining software repositories

    NARCIS (Netherlands)

    Poncin, W.; Serebrenik, A.; Brand, van den M.G.J.

    2011-01-01

    Software developers' activities are in general recorded in software repositories such as version control systems, bug trackers and mail archives. While abundant information is usually present in such repositories, successful information extraction is often challenged by the necessity to

  14. Research on software behavior trust based on hierarchy evaluation

    Science.gov (United States)

    Long, Ke; Xu, Haishui

    2017-08-01

    In view of the correlation of software behaviors, we evaluate software behavior credibility at two levels: control flow and data flow. At the control flow level, a method for tracing software behavior based on support vector machines (SVM) is proposed. At the data flow level, a behavioral evidence evaluation based on the fuzzy decision analysis method is put forward.
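
    One plausible reading of the control-flow level is classifying short n-grams of a behavior trace with an SVM. The sketch below uses invented traces and labels and scikit-learn rather than the authors' implementation; it only illustrates the pattern:

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.svm import SVC

        traces = [  # hypothetical system-call traces
            "open read read write close",
            "open read write close",
            "open write write write socket send",
            "socket send send send close",
        ]
        labels = [0, 0, 1, 1]  # 0 = trusted behavior, 1 = suspicious

        # bigrams of consecutive calls approximate short control-flow patterns
        vec = CountVectorizer(ngram_range=(2, 2))
        X = vec.fit_transform(traces)
        clf = SVC(kernel="linear").fit(X, labels)
        print(clf.predict(vec.transform(["open read write close"])))  # expect [0]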

  15. Open Source Software Development

    Science.gov (United States)

    2011-01-01

    …appropriate to refer to FOSS or FLOSS (L for Libre, where the alternative term “libre software” has popularity in some parts of the world) in order… Applying Social Network Analysis to Community-Driven Libre Software Projects, Intern. J. Info. Tech. and Web Engineering, 2006, 1(3), 27-28. … Open Source Software Development, Walt Scacchi, Institute for Software Research, University of California, Irvine, Irvine, CA 92697-3455 USA. Abstract

  16. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  17. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concerns in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and qualit

  18. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic...... impact in software production. As is also apparent from the call for papers these perspectives focus on qualities such as reuse, adaptability, and maintainability....

  19. A Mathematics Software Database Update.

    Science.gov (United States)

    Cunningham, R. S.; Smith, David A.

    1987-01-01

    Contains an update of an earlier listing of software for mathematics instruction at the college level. Topics are: advanced mathematics, algebra, calculus, differential equations, discrete mathematics, equation solving, general mathematics, geometry, linear and matrix algebra, logic, statistics and probability, and trigonometry. (PK)

  20. Software for graphic display systems

    International Nuclear Information System (INIS)

    Karlov, A.A.

    1978-01-01

    In this paper some aspects of graphic display systems are discussed. The design of a display subroutine library is described, with an example, and graphic dialogue software is considered primarily from the point of view of the programmer who uses a high-level language. (Auth.)

  1. Intercomparison of alpha particle spectrometry software packages

    International Nuclear Information System (INIS)

    1999-08-01

    Software has reached an important level as the 'logical controller' at different levels, from a single instrument to an entire computer-controlled experiment. This is also the case for software packages in nuclear instruments and experiments. In particular, because of the range of applications of alpha-particle spectrometry, software packages in this field are often used. It is the aim of this intercomparison to test and describe the abilities of four such software packages. The main objectives of the intercomparison were the ability of the programs to determine the peak areas and the peak area uncertainties, and the statistical control and stability of reported results. In this report, the task, methods and results of the intercomparison are presented in order to assist the potential users of such software and to stimulate the development of even better alpha-particle spectrum analysis software
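
    The two quantities the intercomparison focuses on reduce to a net peak area and its counting uncertainty. A minimal sketch, assuming a linear continuum estimated from three-channel strips beside the region of interest and pure Poisson statistics (the intercompared packages use more refined peak fitting):

        import math

        def peak_area(counts, lo, hi):
            """Net area of channels [lo, hi] minus a linear background, with
            the propagated Poisson uncertainty. Strip width of 3 is assumed."""
            gross = sum(counts[lo:hi + 1])
            n = hi - lo + 1
            side = sum(counts[lo - 3:lo]) + sum(counts[hi + 1:hi + 4])
            background = n * side / 6.0          # mean of the two side strips
            var_bkg = (n / 6.0) ** 2 * side      # variance of that estimate
            return gross - background, math.sqrt(gross + var_bkg)

        spectrum = [5] * 20
        for i, extra in enumerate([2, 10, 30, 48, 30, 10, 2]):  # synthetic peak
            spectrum[7 + i] += extra
        print(peak_area(spectrum, 7, 13))  # approx (132.0, 14.4)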

  2. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  3. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  4. XES Software Communication Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  5. Neutron Scattering Software

    Science.gov (United States)

    A new portal for neutron scattering has just been established. Software listed includes KUPLOT (data plotting and fitting software) and ILL/TAS (Matlab programs for analyzing triple-axis data).

  6. XES Software Event Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  7. ARC Software and Models

    Science.gov (United States)

    Most research conducted at the ARC produces software code and methodologies that are transferred to TARDEC and industry partners. These

  8. XES Software Telemetry Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  9. Specifications in software prototyping

    OpenAIRE

    Luqi; Chang, Carl K.; Zhu, Hong

    1998-01-01

    We explore the use of software specifications for software prototyping. This paper describes a process model for software prototyping, and shows how specifications can be used to support such a process via a cellular mobile phone switch example.

  10. Software Acquisition and Software Engineering Best Practices

    National Research Council Canada - National Science Library

    Eslinger, S

    1999-01-01

    The purpose of this white paper is to address the issues raised in the recently published Senate Armed Services Committee Report 106-50 concerning Software Management Improvements for the Department of Defense (DoD...

  11. Strategies for successful software development risk management

    Directory of Open Access Journals (Sweden)

    Marija Boban

    2003-01-01

    Full Text Available Nowadays, software is becoming a major part of enterprise business. Software development is an activity connected with advanced technology and a high level of knowledge. Risks on software development projects must be successfully mitigated to produce successful software systems. Lack of a defined approach to risk management is one of the common causes of project failures. To improve a project's chances for success, this work investigates common risk impact areas in order to perceive a foundation that can be used to define a common approach to software risk management. Based on typical risk impact areas on software development projects, we propose three risk management strategies suitable for a broad area of enterprises and software development projects with different amounts of connected risks. The proposed strategies define activities that should be performed for successful risk management, ones that will enable software development projects to perceive risks as soon as possible and to solve problems connected with risk materialization. We also propose a risk-based approach to software development planning and risk management as attempts to address and retire the highest impact risks as early as possible in the development process. The proposed strategies should improve risk management on software development projects and help create a successful software solution.
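
    The quantitative core behind such strategies is usually risk exposure, probability times impact, used to decide which risks to retire first. A minimal sketch with invented register entries (not data from the paper):

        def prioritise(risks):
            """Sort a risk register by exposure = probability x impact."""
            return sorted(risks, key=lambda r: r["p"] * r["impact"], reverse=True)

        register = [
            {"name": "unstable requirements",  "p": 0.6, "impact": 8},
            {"name": "key developer leaves",   "p": 0.2, "impact": 9},
            {"name": "third-party API change", "p": 0.4, "impact": 5},
        ]
        for r in prioritise(register):
            print(f'{r["name"]:25s} exposure = {r["p"] * r["impact"]:.1f}')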

  12. BNL multiparticle spectrometer software

    International Nuclear Information System (INIS)

    Saulys, A.C.

    1984-01-01

    This paper discusses some solutions to problems common to the design, management and maintenance of a large high energy physics spectrometer software system. The experience of dealing with a large, complex program and the necessity of having the program controlled by various people at different levels of computer experience has led us to design a program control structure of mnemonic and self-explanatory nature. The use of this control language in both on-line and off-line operation of the program will be discussed. The solution of structuring a large program for modularity so that substantial changes to the program can be made easily for a wide variety of high energy physics experiments is discussed. Specialized tools for this type of large program management are also discussed

  13. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  14. Software Maintenance Management Evaluation and Continuous Improvement

    CERN Document Server

    April, Alain

    2008-01-01

    This book explores the domain of software maintenance management and provides road maps for improving software maintenance organizations. It describes full maintenance maturity models organized by levels 1, 2, and 3, which allow for benchmarking and continuous improvement paths. Goals for each key practice area are also provided, and the model presented is fully aligned with the architecture and framework of software development maturity models of CMMI and ISO 15504. It is complete with case studies, figures, tables, and graphs.

  15. Evolvable Neural Software System

    Science.gov (United States)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.

  16. Licensing safety critical software

    International Nuclear Information System (INIS)

    Archinoff, G.H.; Brown, R.A.

    1990-01-01

    Licensing difficulties with the shutdown system software at the Darlington Nuclear Generating Station contributed to delays in starting up the station. Even though the station has now been given approval by the Atomic Energy Control Board (AECB) to operate, the software issue has not disappeared - Ontario Hydro has been instructed by the AECB to redesign the software. This article attempts to explain why software based shutdown systems were chosen for Darlington, why there was so much difficulty licensing them, and what the implications are for other safety related software based applications

  17. TMT approach to observatory software development process

    Science.gov (United States)

    Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder

    2016-07-01

    effective communications; adopting an agile-based software development process across the observatory to enable frequent software releases to help mitigate subsystem interdependencies; defining concise scope and work packages for each of the OSW subsystems to facilitate effective outsourcing of software deliverables to the ITCC partner, and to enable performance monitoring and risk management. At this stage, the architecture and high-level design of the software system has been established and reviewed. During construction each subsystem will have a final design phase with reviews, followed by implementation and testing. The results of the TMT approach to the Observatory Software development process will only be preliminary at the time of the submittal of this paper, but it is anticipated that the early results will be a favorable indication of progress.

  18. Software Validation in ATLAS

    International Nuclear Information System (INIS)

    Hodgkinson, Mark; Seuster, Rolf; Simmons, Brinick; Sherwood, Peter; Rousseau, David

    2012-01-01

    The ATLAS collaboration operates an extensive set of protocols to validate the quality of the offline software in a timely manner. This is essential in order to process the large amounts of data being collected by the ATLAS detector in 2011 without complications on the offline software side. We will discuss a number of different strategies used to validate the ATLAS offline software; running the ATLAS framework software, Athena, in a variety of configurations daily on each nightly build via the ATLAS Nightly System (ATN) and Run Time Tester (RTT) systems; the monitoring of these tests and checking the compilation of the software via distributed teams of rotating shifters; monitoring of and follow up on bug reports by the shifter teams and periodic software cleaning weeks to improve the quality of the offline software further.

  19. Work plan for the identification of techniques for in-situ sensing of layering/interfaces of Hanford high level waste tank

    International Nuclear Information System (INIS)

    Vargo, G.F. Jr.

    1995-01-01

    The purpose of this work scope is to identify specific potential technologies/devices/instruments/ideas that would provide the tank waste data. A method is needed for identifying layering and physical state within the large waste tanks at the Hanford site in Washington State. These interfaces and state changes can adversely impact sampling and characterization activities.

  20. Software architecture for time-constrained machine vision applications

    Science.gov (United States)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2013-01-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility, because they are normally oriented toward particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty for reuse, and inefficient execution on multicore processors. We present a novel software architecture for time-constrained machine vision applications that addresses these issues. The architecture is divided into three layers. The platform abstraction layer provides a high-level application programming interface for the rest of the architecture. The messaging layer provides a message-passing interface based on a dynamic publish/subscribe pattern. A topic-based filtering in which messages are published to topics is used to route the messages from the publishers to the subscribers interested in a particular type of message. The application layer provides a repository for reusable application modules designed for machine vision applications. These modules, which include acquisition, visualization, communication, user interface, and data processing, take advantage of the power of well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, the proposed architecture is applied to a real machine vision application: a jam detector for steel pickling lines.
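
    The messaging layer's topic-based publish/subscribe pattern decouples modules that produce images from modules that consume them: both sides share only a topic name. A minimal sketch of the pattern (names invented, not the authors' code):

        from collections import defaultdict

        class MessageBus:
            """Topic-based publish/subscribe with synchronous dispatch."""
            def __init__(self):
                self.subscribers = defaultdict(list)
            def subscribe(self, topic, callback):
                self.subscribers[topic].append(callback)
            def publish(self, topic, message):
                for cb in self.subscribers[topic]:
                    cb(message)

        bus = MessageBus()
        # a processing module and a visualization module both want raw frames
        bus.subscribe("frames/raw", lambda m: print("processing got", m["id"]))
        bus.subscribe("frames/raw", lambda m: print("display    got", m["id"]))
        bus.publish("frames/raw", {"id": 1, "pixels": "..."})
        bus.publish("alarms/jam", {"id": 1})  # no subscriber: message is dropped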

  1. The Keck keyword layer

    Science.gov (United States)

    Conrad, A. R.; Lupton, W. F.

    1992-01-01

    Each Keck instrument presents a consistent software view to the user interface programmer. The view consists of a small library of functions, which are identical for all instruments, and a large set of keywords, that vary from instrument to instrument. All knowledge of the underlying task structure is hidden from the application programmer by the keyword layer. Image capture software uses the same function library to collect data for the image header. Because the image capture software and the instrument control software are built on top of the same keyword layer, a given observation can be 'replayed' by extracting keyword-value pairs from the image header and passing them back to the control system. The keyword layer features non-blocking as well as blocking I/O. A non-blocking keyword write operation (such as setting a filter position) specifies a callback to be invoked when the operation is complete. A non-blocking keyword read operation specifies a callback to be invoked whenever the keyword changes state. The keyword-callback style meshes well with the widget-callback style commonly used in X window programs. The first keyword library was built for the two Keck optical instruments. More recently, keyword libraries have been developed for the infrared instruments and for telescope control. Although the underlying mechanisms used for inter-process communication by each of these systems vary widely (Lick MUSIC, Sun RPC, and direct socket I/O, respectively), a basic user interface has been written that can be used with any of these systems. Since the keyword libraries are bound to user interface programs dynamically at run time, only a single set of user interface executables is needed. For example, the same program, 'xshow', can be used to display continuously the telescope's position, the time left in an instrument's exposure, or both values simultaneously. Less generic tools that operate on specific keywords, for example an X display that controls optical
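
    The blocking/non-blocking keyword pattern maps naturally onto callbacks. The toy model below only illustrates that pattern; the class, names, and semantics are invented, not the Keck keyword library's actual API:

        class Keyword:
            """One named control point with blocking reads, non-blocking writes
            with completion callbacks, and state-change notification."""
            def __init__(self, name, value=None):
                self.name, self.value, self.watchers = name, value, []
            def read(self):                     # blocking read
                return self.value
            def write(self, value, done=None):  # non-blocking write
                self.value = value              # real code would command hardware
                for cb in self.watchers:        # state-change callbacks fire
                    cb(self.name, value)
                if done:
                    done(self.name)             # completion callback
            def watch(self, callback):          # continuous read via callback
                self.watchers.append(callback)

        filt = Keyword("FILTER")
        filt.watch(lambda k, v: print(f"{k} changed to {v}"))  # e.g. GUI update
        filt.write("B-band", done=lambda k: print(f"{k} move complete"))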

  2. Expert software for accident identification

    International Nuclear Information System (INIS)

    Dobnikar, M.; Nemec, T.; Muehleisen, A.

    2003-01-01

    Each type of accident in a Nuclear Power Plant (NPP) causes, immediately after the start of the accident, variations of physical parameters that are typical for that type of accident, thus enabling its identification. Examples of these parameters are: decrease of reactor coolant system pressure, increase of radiation level in the containment, increase of pressure in the containment. An expert software tool enabling a fast preliminary identification of the type of accident in the Krsko NPP has been developed. Selected typical parameters from the Emergency Response Data System (ERDS) of the Krsko NPP are used as input data. Based on these parameters, the expert software identifies the type of the accident and also provides the user with appropriate references (past analyses and other documentation of such an accident). The expert software is to be used as a support tool by an expert team that forms in case of an emergency at the Slovenian Nuclear Safety Administration (SNSA) with the task of determining the cause of the accident, its most probable scenario, and the source term. The expert software should provide initial identification of the event, while the final one is still to be made after appropriate assessment of the event by the expert group, considering the possibility of non-typical events, multiple causes, initial conditions, influences of operators' actions, etc. The expert software can also be used as an educational/training tool and even as a simple database of available accident analyses. (author)
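
    A first-cut identification of this kind can be expressed as threshold rules over trends in the typical parameters. The sketch below invents the parameter names and thresholds purely for illustration; the real tool works against live ERDS data and the archived accident analyses:

        def identify_accident(params):
            """Toy rule base over ERDS-style parameters (all values invented)."""
            rcs_falling = params["rcs_pressure_trend"] < 0
            ctmt_rising = params["containment_pressure_trend"] > 0
            ctmt_rad_high = params["containment_radiation"] > 10.0  # x background
            sg_rad_high = params["secondary_radiation"] > 5.0

            if rcs_falling and ctmt_rising and ctmt_rad_high:
                return "candidate: LOCA inside containment"
            if rcs_falling and sg_rad_high and not ctmt_rising:
                return "candidate: steam generator tube rupture"
            return "no typical pattern: expert assessment required"

        print(identify_accident({
            "rcs_pressure_trend": -2.1,
            "containment_pressure_trend": 0.8,
            "containment_radiation": 40.0,
            "secondary_radiation": 0.2,
        }))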

  3. Fully Employing Software Inspections Data

    Science.gov (United States)

    Shull, Forrest; Feldmann, Raimund L.; Seaman, Carolyn; Regardie, Myrna; Godfrey, Sally

    2009-01-01

    Software inspections provide a proven approach to quality assurance for software products of all kinds, including requirements, design, code, and test plans, among others. Common to all inspections is the aim of finding and fixing defects as early as possible, thereby providing cost savings by minimizing the amount of rework necessary later in the lifecycle. Measurement data, such as the number and type of defects found and the effort spent by the inspection team, not only provide direct feedback about the software product to the project team but are also valuable for process improvement activities. In this paper, we discuss NASA's use of software inspections and the rich set of data that has resulted. In particular, we present results from analysis of inspection data that illustrate the benefits of fully utilizing that data for process improvement at several levels. Examining such data across multiple inspections or projects allows team members to monitor and trigger cross-project improvements. Such improvements may focus on the software development processes of the whole organization as well as improvements to the applied inspection process itself.
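    To illustrate how inspection measurement data can be "fully employed", here is a small sketch that aggregates defect counts and effort across inspections; the record fields and the review-rate guideline are assumptions for illustration, not NASA's actual metrics.

```python
# Sketch: aggregating inspection records to spot cross-project trends.
# Record fields (defects, pages, effort_hours) are illustrative.
inspections = [
    {"project": "A", "artifact": "requirements", "defects": 14, "pages": 40, "effort_hours": 12},
    {"project": "A", "artifact": "code",         "defects": 9,  "pages": 25, "effort_hours": 8},
    {"project": "B", "artifact": "requirements", "defects": 30, "pages": 45, "effort_hours": 6},
]

for rec in inspections:
    density = rec["defects"] / rec["pages"]        # defects found per page
    rate = rec["pages"] / rec["effort_hours"]      # review speed
    # A high defect density combined with a high review rate may signal a
    # rushed inspection rather than a defect-rich artifact (a process insight).
    flag = " <- review rate above guideline?" if rate > 5 else ""
    print(f'{rec["project"]}/{rec["artifact"]}: {density:.2f} defects/page, '
          f'{rate:.1f} pages/h{flag}')
```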

  4. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  5. Science and Software

    Science.gov (United States)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say "software is a tool, it's not science." Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far-reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  6. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    Software Maintenance and Evolution: The Implication for Software Development. ... Software maintenance is the process of modifying existing operational software by correcting errors, ...

  7. From napkin sketches to reliable software

    NARCIS (Netherlands)

    Engelen, L.J.P.

    2012-01-01

    In the past few years, model-driven software engineering (MDSE) and domain-specific modeling languages (DSMLs) have received a lot of attention from both research and industry. The main goal of MDSE is generating software from models that describe systems on a high level of abstraction. DSMLs are

  8. Software Testing An ISEB Intermediate Certificate

    CERN Document Server

    Hambling, Brian

    2009-01-01

    Covering testing fundamentals, reviews, testing and risk, test management and test analysis, this book helps newly qualified software testers to learn the skills and techniques to take them to the next level. Written by leading authors in the field, this is the only official textbook of the ISEB Intermediate Certificate in Software Testing.

  9. Hardware And Software Architectures For Reconfigurable Time-Critical Control Tasks

    Directory of Open Access Journals (Sweden)

    Adam Piłat

    2007-01-01

    The most popular configuration of controlled laboratory test-rigs is the personal computer (PC) equipped with an I/O board. The dedicated software components allow a wide range of user-defined tasks to be conducted. The typical configuration's functionality can be customized by PC hardware components and their programmable reconfiguration. The next step in automatic control system design is the embedded solution. Usually, the design process of an embedded control system is supported by high-level software. The dedicated programming tools support the multitasking property of the microcontroller by selection of different sampling frequencies for algorithm blocks. In this case, a multi-layer and multitasking control strategy can be realized on the chip. The proposed solutions implement a rapid prototyping approach. The available toolkits and device drivers integrate the system-level design environment and the real-time application software, transferring the functionality of MATLAB/Simulink programs to PC or microcontroller application environments.
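    A minimal sketch of the multirate idea described above (algorithm blocks executed at different sampling frequencies) follows; the block names, rates and pure-Python timer loop are illustrative stand-ins for what the cited toolkits would generate for a microcontroller.

```python
# Sketch of a multirate scheduler: algorithm blocks run at different
# sampling frequencies, as in the multitasking control strategy above.
# Block names and rates are illustrative.
import time

blocks = [
    {"name": "current_loop",  "period_s": 0.001, "next": 0.0},  # 1 kHz
    {"name": "position_loop", "period_s": 0.010, "next": 0.0},  # 100 Hz
    {"name": "supervisor",    "period_s": 0.100, "next": 0.0},  # 10 Hz
]

start = time.perf_counter()
while time.perf_counter() - start < 0.05:          # run the demo for 50 ms
    now = time.perf_counter() - start
    for block in blocks:
        if now >= block["next"]:
            block["next"] += block["period_s"]
            # a real block would read sensors and update actuators here
            print(f"{now * 1000:6.1f} ms: {block['name']}")
    time.sleep(0.0005)                             # yield between ticks
```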

  10. A software system for the simulation of chest lesions

    Science.gov (United States)

    Ryan, John T.; McEntee, Mark; Barrett, Saoirse; Evanoff, Michael; Manning, David; Brennan, Patrick

    2007-03-01

    We report on the development of a novel software tool for the simulation of chest lesions. This software tool was developed for use in our study to attain optimal ambient lighting conditions for chest radiology. This study involved 61 consultant radiologists from the American Board of Radiology. Because of its success, we intend to use the same tool for future studies. The software has two main functions: the simulation of lesions and retrieval of information for ROC (Receiver Operating Characteristic) and JAFROC (Jack-Knife Free Response ROC) analysis. The simulation layer operates by randomly selecting an image from a bank of reportedly normal chest x-rays. A random location is then generated for each lesion, which is checked against a reference lung-map. If the location is within the lung fields, as derived from the lung-map, a lesion is superimposed. Lesions are also randomly selected from a bank of manually created chest lesion images. A blending algorithm determines the intensity levels that allow the lesion to sit naturally within the chest x-ray. The same software was used to run a study for all 61 radiologists. A sequence of images is displayed in random order. Half of these images had simulated lesions, ranging from subtle to obvious, and half of the images were normal. The operator then selects locations where he/she thinks lesions exist and grades the lesion accordingly. We have found that this software was very effective in this study and intend to use the same principles for future studies.
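    The placement-and-blend step described above can be sketched as follows; the mask, lesion patch and the simple alpha-blending rule are illustrative assumptions, not the authors' actual blending algorithm.

```python
# Sketch of the lesion-placement idea: draw a random location, accept it only
# if it falls inside the lung mask, then alpha-blend the lesion patch.
# The blending rule is a generic alpha blend, not the authors' algorithm.
import numpy as np

rng = np.random.default_rng(0)
chest = rng.uniform(0.2, 0.8, size=(256, 256))   # stand-in for a normal x-ray
lung_mask = np.zeros((256, 256), dtype=bool)
lung_mask[60:200, 40:120] = True                 # stand-in for the lung-map
lesion = rng.uniform(0.6, 1.0, size=(15, 15))    # stand-in lesion patch

while True:
    y, x = rng.integers(0, 256 - 15, size=2)
    if lung_mask[y:y + 15, x:x + 15].all():      # location within lung fields?
        break

alpha = 0.3                                      # controls lesion subtlety
region = chest[y:y + 15, x:x + 15]
chest[y:y + 15, x:x + 15] = (1 - alpha) * region + alpha * lesion
print(f"lesion blended at ({y}, {x})")
```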

  11. Methods of Software Quality Assurance under a Nuclear Quality Assurance Program

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Lee, Young Jun; Cha, Kyung Ho; Cheon, Se Woo; Lee, Jang Soo; Kwon, Kee Choon

    2005-01-01

    This paper addresses a substantial implementation of a software quality assurance under a nuclear quality assurance program. The relationship of the responsibility between a top-level nuclear quality assurance program such as ASME/NQA-1 and its lower level software quality assurance is described. Software quality assurance activities and software quality assurance procedures during the software development life cycle are also described

  12. FY1995 study of very flexible software structures based on soft-software components; 1995 nendo yawarankana software buhin ni motozuku software no choju kozo ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The purpose of this study is to develop methods and tools for changing the software structure flexibly along with the continuous change of its environment and conditions of use. The goal is software of very high adaptability, achieved by using soft-software components and flexible assembly. The CASE tool platform Sapid, based on a fine-grained repository, was developed and applied for raising the abstraction level of program code and for mining potential flexible components. To reconstruct the software adaptable to a required environment, the SQM (Software Quark Model) was used in managing interconnectivity and other semantic relationships among components. On these two basic systems, we developed various methods and tools such as those for static and dynamic analysis of very flexible software structures, program transformation description, program pattern extraction and composition component optimization by partial evaluation, component extraction by function slicing, code encapsulation, and component navigation and application. (NEDO)

  13. Test process for the safety-critical embedded software

    International Nuclear Information System (INIS)

    Sung, Ahyoung; Choi, Byoungju; Lee, Jangsoo

    2004-01-01

    Digitalization of nuclear Instrumentation and Control (I and C) systems requires high reliability of not only hardware but also software. A Verification and Validation (V and V) process is recommended for software reliability, but a more quantitative method, such as software testing, is also necessary. Most of the software in nuclear I and C systems is safety-critical embedded software. Safety-critical embedded software is specified, verified and developed according to the V and V process. Hence, two types of software testing techniques are necessary for the developed code. First, code-based software testing is required to examine the developed code. Second, after code-based software testing, software testing affected by hardware is required to reveal interaction faults that may cause unexpected results. We call the testing of hardware's influence on software 'interaction testing'. In the case of safety-critical embedded software, it is also important to consider the interaction between hardware and software. Even if no faults are detected when testing either hardware or software alone, combining these components may lead to unexpected results due to the interaction. In this paper, we propose a software test process that embraces test levels, test techniques, required test tasks and documents for safety-critical embedded software. We apply the proposed test process to safety-critical embedded software as a case study, and show its effectiveness. (author)
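    A minimal sketch of the two test levels described above follows: a code-based test against nominal hardware, and an interaction test in which a simulated hardware fault is injected. All class and function names are hypothetical.

```python
# Sketch of an "interaction test": the same software logic is exercised while
# a simulated hardware layer is made to misbehave. Names are illustrative.
class Sensor:                          # stand-in for the hardware interface
    def read_pressure(self) -> float:
        return 15.2

class StuckSensor(Sensor):             # injected hardware fault: value frozen
    def read_pressure(self) -> float:
        return 0.0

def trip_logic(sensor: Sensor) -> bool:
    """Software under test: trip when pressure leaves the safe band."""
    p = sensor.read_pressure()
    return not (10.0 <= p <= 17.0)

assert trip_logic(Sensor()) is False       # code-based test: nominal hardware
assert trip_logic(StuckSensor()) is True   # interaction test: faulty hardware
print("both test levels passed")
```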

  14. Software engineering in industry

    Science.gov (United States)

    Story, C. M.

    1989-12-01

    Can software be "engineered"? Can a few people with limited resources and a negligible budget produce high-quality software solutions to complex software problems? Is it possible to resolve the conflict between research activities and the necessity to view software development as a means to an end rather than as an end in itself? The aim of this paper is to encourage further thought and discussion on various topics which, in the author's experience, are becoming increasingly critical in current large software production and development projects, inside and outside high energy physics (HEP). This is done by briefly exploring some of the software engineering ideas and technologies now used in the information industry, using, as a case study, a project with many similarities to those currently under way in HEP.

  15. A software product certification model

    NARCIS (Netherlands)

    Heck, P.M.; Klabbers, M.D.; van Eekelen, Marko

    2010-01-01

    Certification of software artifacts offers organizations more certainty and confidence about software. Certification of software helps software sales, acquisition, and can be used to certify legislative compliance or to achieve acceptable deliverables in outsourcing. In this article, we present a

  16. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    Why verification of software products throughout the software life cycle is necessary is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  17. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. It explains two maintenance standards (IEEE/EIA 1219 and ISO/IEC 14764), discusses several commercial reverse and domain engineering toolkits, and makes slides for instructors available online. Information is based on the IEEE SWEBOK (Software Engineering Body of Knowledge).

  18. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-10-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the functions that they perform. The basics of microprogramming and new microcircuits have already been discussed. In this course, the methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogrammed circuit itself. 15 figures, 2 tables

  19. Hospital Management Software Development

    OpenAIRE

    sobogunGod, olawale

    2012-01-01

    The purpose of this thesis was to implement hospital management software suitable for small private hospitals in Nigeria, especially for the ones that use a file-based system for storing information rather than having it stored in a more efficient and safer environment like a database or Excel spreadsheet software. The software developed within this thesis project was specifically designed for the Rainbow specialist hospital, which is based in Lagos, the commercial neurological cente...

  20. Computer software configuration management

    International Nuclear Information System (INIS)

    Pelletier, G.

    1987-08-01

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  1. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1993-01-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an object-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere)
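    As a rough illustration of a histogram object that also defines its own file format, consider the following sketch; the JSON layout and class interface are assumptions for illustration, not the Gammasphere or UPAK format.

```python
# Minimal sketch of a "universal histogram object" that also defines its own
# file format. The JSON layout is illustrative, not the Gammasphere format.
import json

class Histogram:
    def __init__(self, nbins: int, lo: float, hi: float):
        self.nbins, self.lo, self.hi = nbins, lo, hi
        self.counts = [0] * nbins

    def fill(self, value: float) -> None:
        """Increment the bin containing value (values outside range dropped)."""
        if self.lo <= value < self.hi:
            i = int((value - self.lo) / (self.hi - self.lo) * self.nbins)
            self.counts[i] += 1

    def save(self, path: str) -> None:
        """Write the histogram in its self-describing file format."""
        with open(path, "w") as f:
            json.dump(self.__dict__, f)

    @classmethod
    def load(cls, path: str) -> "Histogram":
        with open(path) as f:
            d = json.load(f)
        h = cls(d["nbins"], d["lo"], d["hi"])
        h.counts = d["counts"]
        return h

h = Histogram(8, 0.0, 4.0)
for energy in (0.5, 1.2, 1.3, 3.9):
    h.fill(energy)
h.save("spectrum.json")
print(Histogram.load("spectrum.json").counts)
```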

  2. Software quality management

    International Nuclear Information System (INIS)

    Bishop, D.C.; Pymm, P.

    1991-01-01

    As programmable electronic (software-based) systems are increasingly being proposed as design solutions for high integrity applications in nuclear power stations, the need to adopt suitable quality management arrangements is paramount. The authors describe Scottish Nuclear's strategy for software quality management and, using the main on-line monitoring system at Torness Power Station as an example, explain how this strategy is put into practice. Particular attention is given to the topics of software quality planning and change control. (author)

  3. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners....... It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice with a focus on the software process policymaking and process control aspects of improvement efforts...

  4. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu

    2004-01-01

    This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), which is an XML-based metaprogramming technique. As the software evolves, a large number of variants may arise, especially when such kinds of evolutions are related to multiple platforms as shown in our... case study. Handling variants and tracing the impact of variants across the development lifecycle is a challenge. This chapter shows how we can maintain different versions of software in a reuse-based way....

  5. FASTBUS software status

    International Nuclear Information System (INIS)

    Gustavson, D.B.

    1980-10-01

    Computer software will be needed in addition to the mechanical, electrical, protocol and timing specifications of the FASTBUS, in order to facilitate the use of this flexible new multiprocessor and multisegment data acquisition and processing system. Software considerations have been important in the FASTBUS design, but standard subroutines and recommended algorithms will be needed as the FASTBUS comes into use. This paper summarizes current FASTBUS software projects, goals and status

  6. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  7. Solar Asset Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Iverson, Aaron [Ra Power Management, Inc., Oakland, CA (United States); Zviagin, George [Ra Power Management, Inc., Oakland, CA (United States)

    2016-09-30

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  8. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  9. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  10. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines
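    Of the techniques surveyed above, software FTA lends itself to a compact illustration: a fault tree combines basic-event probabilities through AND/OR gates up to a top event. The sketch below assumes independent events and uses made-up probabilities.

```python
# Tiny fault-tree sketch: AND/OR gates over basic-event probabilities,
# assuming independent events. Probabilities are illustrative only.
def gate_and(*probs: float) -> float:
    """All inputs must fail together."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def gate_or(*probs: float) -> float:
    """Any single input failing is enough."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

p_sensor_fault = 1e-3
p_software_fault = 1e-4
p_power_loss = 1e-5

# Top event: the protection channel fails if power is lost OR both the
# sensor and the software backstop fail together.
p_top = gate_or(p_power_loss, gate_and(p_sensor_fault, p_software_fault))
print(f"P(top event) = {p_top:.2e}")
```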

  11. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model. Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System. Software Models: Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept

  12. Agile software development

    CERN Document Server

    Dingsoyr, Torgeir; Moe, Nils Brede

    2010-01-01

    Agile software development has become an umbrella term for a number of changes in how software developers plan and coordinate their work, how they communicate with customers and external stakeholders, and how software development is organized in small, medium, and large companies, from the telecom and healthcare sectors to games and interactive media. Still, after a decade of research, agile software development is the source of continued debate due to its multifaceted nature and insufficient synthesis of research results. Dingsoyr, Dyba, and Moe now present a comprehensive snapshot of the kno

  13. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. * Includes supplementary website with an instructor's guide and solutions * Applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI) * Illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors

  14. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  15. Software as quality product

    International Nuclear Information System (INIS)

    Enders, A.

    1975-01-01

    In many discussions on the reliability of computer systems, software is presented as the weak link in the chain. This contribution attempts to identify the reasons for this situation as seen from the software development side. The concepts of correctness and reliability of programs are explained as they are understood in the specialist discussion of today. Measures and methods are discussed which are particularly relevant for obtaining fault-free and reliable programs. Conclusions are drawn for the user of software so that he is in a position to judge for himself what can justly be expected from the product software compared to other products. (orig./LH) [de]

  16. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

      This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives – Product, Project, Process, and People – to identify an outlook for software innovation. The paper then describes a new facility – Software Innovation Research Lab (SIRL......) – and a new method concept for software innovation – Essence – based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research....

  17. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than...... and experience reported at the IEEE International Conference on Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage...

  18. Software Intensive Systems

    National Research Council Canada - National Science Library

    Horvitz, E; Katz, D. J; Rumpf, R. L; Shrobe, H; Smith, T. B; Webber, G. E; Williamson, W. E; Winston, P. H; Wolbarsht, James L

    2006-01-01

    .... Additionally, recommend that DoN invest in software engineering, particularly as it complements commercial industry developments and promotes the application of systems engineering methodology...

  19. Contractor Software Charges

    National Research Council Canada - National Science Library

    Granetto, Paul

    1994-01-01

    .... Examples of computer software costs that contractors charge through indirect rates are material management systems, security systems, labor accounting systems, and computer-aided design and manufacturing...

  20. Decentralized Software Architecture

    National Research Council Canada - National Science Library

    Khare, Rohit

    2002-01-01

    .... While the term "decentralization" is familiar from political and economic contexts, it has been applied extensively, if indiscriminately, to describe recent trends in software architecture towards...

  1. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  2. Social software in global software development

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2010-01-01

    variety of tools such as: instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, virtual worlds. Though normally belonging rather to the private realm, the use of social software in a corporate context has been reported, e.g. as a way...

  3. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety-critical systems, facilities and operations. Since the 1960s there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety-critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety-critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  4. Níveis de cálcio em dietas para poedeiras semipesadas após o pico de postura Levels of calcium in diets for brown layers post-peak production

    Directory of Open Access Journals (Sweden)

    Fernando Guilherme Perazzo Costa

    2008-04-01

    Two hundred and sixteen Lohmann Brown layers were used from 39 to 55 weeks of age to study the effects of dietary calcium levels on the performance and egg quality of brown commercial layers. A completely randomized design was used, with six calcium levels (3.0, 3.4, 3.8, 4.2, 4.6 and 5.0%) and six replicates of six hens each. The calcium levels did not affect the productive parameters, but significantly affected the percentages of albumen and eggshell. Increasing dietary calcium improved eggshell quality relative to the other egg components. A calcium level of 4.3% is recommended in diets for brown layers after the production peak.

  5. The Effects of Development Team Skill on Software Product Quality

    Science.gov (United States)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill and experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics

  6. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    ... within the Namibian context. An implication for software project managers and software developers is that customer interaction should be properly managed to ensure that the software methodologies for improving software development processes...

  7. The Systems Biology Research Tool: evolvable open-source software

    OpenAIRE

    Wright, J; Wagner, A

    2008-01-01

    Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results: We introduce a free, easy-to-use, open-source, integrated software platform calle...

  8. Open software architecture for east articulated maintenance arm

    International Nuclear Information System (INIS)

    Wu, Jing; Wu, Huapeng; Song, Yuntao; Li, Ming; Yang, Yang; Alcina, Daniel A.M.

    2016-01-01

    Highlights: • The software requirements of a serial-articulated robot for EAST assembly and maintenance are presented. • An open software architecture for the robot is developed. • A component-based model distribution system with real-time communication for the robot is constructed. - Abstract: For the inside inspection and maintenance of the vacuum vessel in the EAST, an articulated maintenance arm has been developed. In this article, an open software architecture developed for the EAST articulated maintenance arm (EAMA) is described, which offers robust performance and an accessible programming experience based on the standard open robotic platform OROCOS. The paper presents a component-based software architecture using a multi-layer structure: end layer, up layer, middle layer, and down layer. In the end layer the components are defined off-line in the task-planner manner. The components in the up layer implement trajectory planning. CORBA, as a communication framework, is adopted to exchange the data between the distributed components. The contributors use Real-Time Workshop from MATLAB/Simulink to generate the components in the middle layer. The Real-time Toolkit guarantees that control applications run in hard real-time mode. Ethernet and the CAN bus are used for data transfer in the down layer, where the components implement the hardware functions. The distributed architecture of the control system associates each processing node with each joint, which is mapped to a component with all functioning features of the framework.

  9. Open software architecture for east articulated maintenance arm

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Jing, E-mail: wujing@ipp.ac.cn [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Wu, Huapeng [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Song, Yuntao [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Li, Ming [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Yang, Yang [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Alcina, Daniel A.M. [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland)

    2016-11-01

    Highlights: • The software requirements of a serial-articulated robot for EAST assembly and maintenance are presented. • An open software architecture for the robot is developed. • A component-based model distribution system with real-time communication for the robot is constructed. - Abstract: For the inside inspection and maintenance of the vacuum vessel in the EAST, an articulated maintenance arm has been developed. In this article, an open software architecture developed for the EAST articulated maintenance arm (EAMA) is described, which offers robust performance and an accessible programming experience based on the standard open robotic platform OROCOS. The paper presents a component-based software architecture using a multi-layer structure: end layer, up layer, middle layer, and down layer. In the end layer the components are defined off-line in the task-planner manner. The components in the up layer implement trajectory planning. CORBA, as a communication framework, is adopted to exchange the data between the distributed components. The contributors use Real-Time Workshop from MATLAB/Simulink to generate the components in the middle layer. The Real-time Toolkit guarantees that control applications run in hard real-time mode. Ethernet and the CAN bus are used for data transfer in the down layer, where the components implement the hardware functions. The distributed architecture of the control system associates each processing node with each joint, which is mapped to a component with all functioning features of the framework.

  10. Software Component Certification: 10 Useful Distinctions

    National Research Council Canada - National Science Library

    Wallnau, Kurt C

    2004-01-01

    .... One persistent and largely unaddressed challenge is how the consumers of software components-that is, the developers of mission-critical systems-can obtain a meaningful level of trust in the runtime...

  11. High performance in software development

    CERN Multimedia

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever tried. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or with other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from the distributed storage and large-scale organization of computation and data down to the lowest level of processor and data bus behavior. Integrating performance behavior over these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures due to the deep memory hierarchies of modern general-purpose computers. As a r...
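    The memory-locality point can be demonstrated with a small experiment; this sketch assumes NumPy and a C-ordered array, where traversing the contiguous axis is markedly faster than striding across it.

```python
# Sketch of the memory-locality point: summing a large array along its
# contiguous axis is much faster than striding across rows.
import time
import numpy as np

a = np.random.rand(2000, 2000)      # C order: rows are contiguous in memory

t0 = time.perf_counter()
for row in range(a.shape[0]):       # walk memory sequentially
    a[row, :].sum()
t_rows = time.perf_counter() - t0

t0 = time.perf_counter()
for col in range(a.shape[1]):       # stride across rows: poor cache behavior
    a[:, col].sum()
t_cols = time.perf_counter() - t0

print(f"row-wise {t_rows:.3f}s vs column-wise {t_cols:.3f}s")
```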

  12. Marketing Mix del Software.

    Directory of Open Access Journals (Sweden)

    Yudith del Carmen Rodríguez Pérez

    2006-03-01

    Therefore, this paper defines the concept of a software product, characterizes it, and presents its quality attributes. It also addresses the marketing mix required for software, which differs from that of other products, so that the software can succeed in the market.

  13. Sustainability in Software Engineering

    NARCIS (Netherlands)

    Wolfram, N.J.E.; Lago, P.; Osborne, Francesco

    2017-01-01

    The intersection between software engineering research and issues related to sustainability and green IT has been the subject of increasing attention. In spite of that, we observe that sustainability is still not clearly defined, or understood, in the field of software engineering. This lack of

  14. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.

    1992-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be

  15. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.; Heemstra, F.J.

    1993-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be
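    One classic answer to the record's question of how development effort can be estimated is Boehm's basic COCOMO model, effort = a * KLOC^b in person-months. The sketch below uses the published basic-model constants; the abstract itself does not endorse any particular model, and real projects require calibrated estimates.

```python
# Basic COCOMO sketch: effort = a * KLOC**b (person-months). The constants
# are the published basic-model values; real projects need calibration.
COCOMO_BASIC = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def effort_person_months(kloc: float, mode: str = "organic") -> float:
    a, b = COCOMO_BASIC[mode]
    return a * kloc ** b

for mode in COCOMO_BASIC:
    pm = effort_person_months(32, mode)
    print(f"{mode:13s}: {pm:6.1f} person-months for 32 KLOC")
```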

  16. Software engineering ethics

    Science.gov (United States)

    Bown, Rodney L.

    1991-01-01

    Software engineering ethics is reviewed. The following subject areas are covered: lack of a system viewpoint; arrogance of PC DOS software vendors; violation of upward compatibility; internet worm; internet worm revisited; student cheating and company hiring interviews; computing practitioners and the commodity market; new projects and old programming languages; schedule and budget; and recent public domain comments.

  17. Computer Software Reviews.

    Science.gov (United States)

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  18. Software product family evaluation

    NARCIS (Netherlands)

    van der Linden, F; Bosch, J; Kamsties, E; Kansala, K; Krzanik, L; Obbink, H; VanDerLinden, F

    2004-01-01

    This paper proposes a 4-dimensional software product family engineering evaluation model. The 4 dimensions relate to the software engineering concerns of business, architecture, organisation and process. The evaluation model is meant to be used within organisations to determine the status of their

  19. Selecting the Right Software.

    Science.gov (United States)

    Shearn, Joseph

    1987-01-01

    Selection of administrative software requires analyzing present needs and, to meet future needs, choosing software that will function with a more powerful computer system. Other important factors to include are a professional system demonstration, maintenance and training, and financial considerations that allow leasing or renting alternatives.…

  20. Influence of a deep-level-defect band formed in a heavily Mg-doped GaN contact layer on the Ni/Au contact to p-GaN

    International Nuclear Information System (INIS)

    Li Xiao-Jing; Zhao De-Gang; Jiang De-Sheng; Chen Ping; Zhu Jian-Jun; Liu Zong-Shun; Yang Jing; He Xiao-Guang; Yang Hui; Zhang Li-Qun; Zhang Shu-Ming; Le Ling-Cong; Liu Jian-Ping

    2015-01-01

    The influence of a deep-level-defect (DLD) band formed in a heavily Mg-doped GaN contact layer on the performance of the Ni/Au contact to p-GaN is investigated. The thin, heavily Mg-doped GaN (p++-GaN) contact layer with a DLD band can effectively improve the performance of the Ni/Au ohmic contact to p-GaN. The temperature-dependent I–V measurement shows that variable-range hopping (VRH) transport through the DLD band plays a dominant role in the ohmic contact. The thickness and Mg/Ga flow ratio of the p++-GaN contact layer have a significant effect on the ohmic contact by controlling the Mg impurity doping and the formation of a proper DLD band. When the p++-GaN contact layer is 25 nm thick and the Mg/Ga flow rate ratio is 10.29%, an ohmic contact with a low specific contact resistivity of 6.97 × 10⁻⁴ Ω·cm² is achieved. (paper)
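    The record attributes the ohmic behavior to variable-range hopping but does not quote the expression; the standard Mott law for three-dimensional VRH, against which temperature-dependent conductivity fits of this kind are typically compared, is:

```latex
% Mott variable-range-hopping law (3D form); the record cites VRH transport
% but gives no formula, so this is the standard textbook expression. T_0
% depends on the density of localized states at the Fermi level and on the
% localization length.
\sigma(T) = \sigma_0 \exp\left[-\left(\frac{T_0}{T}\right)^{1/4}\right]
```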

  1. Trends in software testing

    CERN Document Server

    Mohanty, J; Balakrishnan, Arunkumar

    2017-01-01

    This book is focused on the advancements in the field of software testing and the innovative practices that the industry is adopting. Considering the widely varied nature of software testing, the book addresses contemporary aspects that are important for both academia and industry. There are dedicated chapters on seamless high-efficiency frameworks, automation on regression testing, software by search, and system evolution management. There are a host of mathematical models that are promising for software quality improvement by model-based testing. There are three chapters addressing this concern. Students and researchers in particular will find these chapters useful for their mathematical strength and rigor. Other topics covered include uncertainty in testing, software security testing, testing as a service, test technical debt (or test debt), disruption caused by digital advancement (social media, cloud computing, mobile application and data analytics), and challenges and benefits of outsourcing. The book w...

  2. Revisiting software ecosystems research

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2016-01-01

    ‘Software ecosystems’ is argued to have first appeared as a concept more than 10 years ago, and software ecosystem research started to take off in 2010. We conduct a systematic literature study, based on the most extensive literature review in the field to date, with two primary aims: (a) to provide an updated overview of the field and (b) to document evolution in the field. In total, we analyze 231 papers from 2007 until 2014 and provide an overview of the research in software ecosystems. Our analysis reveals a field that is rapidly growing both in volume and empirical focus while becoming more mature... from evolving. We propose means for future research and the community to address them. Finally, our analysis shapes the view of the field as having evolved outside the existing definitions of software ecosystems, and we thus propose an update of the definition of software ecosystems....

  3. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  4. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what...... makes these countries an attractive host for software companies. Often, high-quality engineering and excellent quality of products, e.g., machinery and equipment, are mentioned. Yet, the question is: Can such arguments be also found for the software industry? We aim at investigating the degree...... of professionalism and systematization of software development to draw a map of strengths and weaknesses. To this end, we conducted as a first step an exploratory survey in Germany, presented in this paper. In this survey, we focused on the perceived importance of the two general software engineering process areas...

  5. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution.... The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss... one particular framework named TranSAT, which addresses the above problems of software architecture evolution. TranSAT provides a new element in the software architecture description language, called an architectural aspect, for describing new concerns and their integration into an existing...

  6. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is through the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third part discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  7. Software licenses: Stay honest!

    CERN Multimedia

    Computer Security Team

    2012-01-01

    Do you recall our article about copyright violation in the last issue of the CERN Bulletin, “Music, videos and the risk for CERN”? Now let’s be more precise. “Violating copyright” not only means the illegal download of music and videos, it also applies to software packages and applications.   Users must respect proprietary rights in compliance with the CERN Computing Rules (OC5). Not having legitimately obtained a program or the required licenses to run that software is not a minor offense. It violates CERN rules and puts the Organization at risk! Vendors deserve credit and compensation. Therefore, make sure that you have the right to use their software. In other words, you have bought the software via legitimate channels and use a valid and honestly obtained license. This also applies to “Shareware” and software under open licenses, which might also come with a cost. Usually, only “Freeware” is complete...

  8. Understanding software faults and their role in software reliability modeling

    Science.gov (United States)

    Munson, John C.

    1994-01-01

    This study is a direct result of an on-going project to model the reliability of a large real-time control avionics system. In previous modeling efforts with this system, hardware reliability models were applied in modeling the reliability behavior of this system. In an attempt to enhance the performance of the adapted reliability models, certain software attributes were introduced in these models to control for differences between programs and also sequential executions of the same program. As the basic nature of the software attributes that affect software reliability becomes better understood in the modeling process, this information begins to have important implications for the software development process. A significant problem arises when raw attribute measures are to be used in statistical models as predictors, for example, of measures of software quality. This is because many of the metrics are highly correlated. Consider the two attributes: lines of code, LOC, and number of program statements, Stmts. In this case, it is quite obvious that a program with a high value of LOC probably will also have a relatively high value of Stmts. In the case of low-level languages, such as assembly language programs, there might be a one-to-one relationship between the statement count and the lines of code. When there is a complete absence of linear relationship among the metrics, they are said to be orthogonal or uncorrelated. Usually the lack of orthogonality is not serious enough to affect a statistical analysis. However, for the purposes of some statistical analyses, such as multiple regression, the software metrics are so strongly interrelated that the regression results may be ambiguous and possibly even misleading. Typically, it is difficult to estimate the unique effects of individual software metrics in the regression equation. The estimated values of the coefficients are very sensitive to slight changes in the data and to the addition or deletion of variables in the
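
    To make the metric correlation problem above concrete, the following sketch (Python; the module metrics are invented for illustration, and numpy is assumed to be available) computes the correlation between LOC and statement counts together with a variance inflation factor, a standard diagnostic for multicollinearity among regression predictors:

        # Illustrative only: the module metrics below are invented, not data from the study.
        import numpy as np

        loc   = np.array([120, 340, 85, 410, 260, 975, 150, 530], dtype=float)
        stmts = np.array([ 95, 280, 70, 350, 205, 810, 118, 440], dtype=float)

        # Pearson correlation between the two size metrics; a value near 1.0
        # means the predictors are far from orthogonal.
        r = np.corrcoef(loc, stmts)[0, 1]

        # Variance inflation factor for LOC given Stmts: VIF = 1 / (1 - R^2),
        # where R^2 comes from regressing one predictor on the other.
        slope, intercept = np.polyfit(stmts, loc, 1)
        residuals = loc - (slope * stmts + intercept)
        r2 = 1.0 - residuals.var() / loc.var()
        vif = 1.0 / (1.0 - r2)

        print(f"corr(LOC, Stmts) = {r:.3f}, VIF = {vif:.1f}")
        # A large VIF (often read as > 10) signals regression coefficients that
        # are unstable under small changes in the data, as the abstract warns.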

  9. The Legacy of Space Shuttle Flight Software

    Science.gov (United States)

    Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.

    2011-01-01

    The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.

  10. Advanced Modular Software Performance Monitoring

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The LHCb software is based on the Gaudi framework, on top of which several large and complex software applications are built. The LHCb experiment is now in the active phase of collecting and analyzing data, and significant performance problems arise in the Gaudi-based software, beginning from the High Level Trigger (HLT) programs and ending with the data analysis frameworks (DaVinci). It is not easy to find hot spots in the code; only special tools can help to understand where CPU or memory usage is not reasonable. There exist many performance analysis tools, but the main problem is that they show reports in terms of class and function names, and such information usually is not very useful: the majority of algorithm developers use the Gaudi framework abstractions and usually do not know about the functions which lie at the lower level. We will show a new approach which adds to performance reports a higher abstraction level based on knowledge of the framework architecture and run-time object properties. A set of profiling to...

  11. Configurable software for satellite graphics

    Energy Technology Data Exchange (ETDEWEB)

    Hartzman, P D

    1977-12-01

    An important goal in interactive computer graphics is to provide users with both quick system responses for basic graphics functions and enough computing power for complex calculations. One solution is to have a distributed graphics system in which a minicomputer and a powerful large computer share the work. The most versatile type of distributed system is an intelligent satellite system in which the minicomputer is programmable by the application user and can do most of the work while the large remote machine is used for difficult computations. At New York University, the hardware was configured from available equipment. The level of system intelligence resulted almost completely from software development. Unlike previous work with intelligent satellites, the resulting system had system control centered in the satellite. It also had the ability to reconfigure software during real-time operation. The design of the system was done at a very high level using set theoretic language. The specification clearly illustrated processor boundaries and interfaces. The high-level specification also produced a compact, machine-independent virtual graphics data structure for picture representation. The software was written in a systems implementation language; thus, only one set of programs was needed for both machines. A user can program both machines in a single language. Tests of the system with an application program indicate that it has very high potential. A major result of this work is the demonstration that a gigantic investment in new hardware is not necessary for computing facilities interested in graphics.

  12. Automating software design system DESTA

    Science.gov (United States)

    Lovitsky, Vladimir A.; Pearce, Patricia D.

    1992-01-01

    'DESTA' is the acronym for the Dialogue Evolutionary Synthesizer of Turnkey Algorithms by means of a natural language (Russian or English) functional specification of algorithms or software being developed. DESTA represents the computer-aided and/or automatic artificial intelligence 'forgiving' system which provides users with software tools support for algorithm and/or structured program development. The DESTA system is intended to provide support for the higher levels and earlier stages of engineering design of software in contrast to conventional Computer Aided Design (CAD) systems which provide low level tools for use at a stage when the major planning and structuring decisions have already been taken. DESTA is a knowledge-intensive system. The main features of the knowledge are procedures, functions, modules, operating system commands, batch files, their natural language specifications, and their interlinks. The specific domain for the DESTA system is a high level programming language like Turbo Pascal 6.0. The DESTA system is operational and runs on an IBM PC computer.

  13. Advanced modular software performance monitoring

    CERN Document Server

    Mazurov, A

    2012-01-01

    The LHCb software is based on the Gaudi framework, on top of which several large and complex software applications are built. As the LHCb experiment is now in the active phase of collecting and analyzing data, performance problems arise in various parts of the software, from the High Level Trigger (HLT) programs to the data analysis frameworks. It is not easy to find hotspots in the code; only specialized tools can help to understand where CPU or memory usage is not reasonable. There exist many performance analysis tools, but the main problem is that they show reports in terms of class and function names, and such information usually is not very useful: the majority of algorithm developers use the Gaudi framework abstractions and usually do not know about the functions which lie at the lower level. We will show a new approach which adds to performance reports a higher abstraction level based on knowledge of the framework architecture and run-time object properties. A set of profiling tools (based on Intel VTune Amplif...
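
    The approach described above, lifting flat profiler output to the framework's abstraction level, can be illustrated with a toy aggregator (Python; the component and symbol names are invented, and the actual LHCb tooling is built on Intel VTune and knowledge of the Gaudi architecture rather than on this sketch):

        # A toy aggregator that lifts flat profiler samples (symbol -> cost)
        # to a higher abstraction level using knowledge of which framework
        # component each function belongs to. All names below are invented.
        from collections import defaultdict

        # Mapping from low-level symbols to framework-level components,
        # e.g. derived from the algorithm ownership known to the framework.
        SYMBOL_TO_COMPONENT = {
            "TrackFitKalman::filter": "TrackFitterAlg",
            "TrackFitKalman::smooth": "TrackFitterAlg",
            "CaloClusterize::seed":   "CaloClusterAlg",
            "std::vector::resize":    None,  # generic symbol: attribute to caller
        }

        def aggregate(samples, caller_component):
            """samples: list of (symbol, cpu_seconds) pairs from a flat profile."""
            cost = defaultdict(float)
            for symbol, seconds in samples:
                component = SYMBOL_TO_COMPONENT.get(symbol) or caller_component(symbol)
                cost[component] += seconds
            return dict(cost)

        flat_profile = [("TrackFitKalman::filter", 4.2),
                        ("TrackFitKalman::smooth", 1.1),
                        ("CaloClusterize::seed",   2.7),
                        ("std::vector::resize",    0.9)]

        report = aggregate(flat_profile, caller_component=lambda s: "TrackFitterAlg")
        for component, seconds in sorted(report.items(), key=lambda kv: -kv[1]):
            print(f"{component:16s} {seconds:5.1f} s")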

  14. Requirements Engineering in Building Climate Science Software

    Science.gov (United States)

    Batcheller, Archer L.

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling Framework assists modeling applications, the Earth System Grid distributes data via a web portal, and the NCAR (National Center for Atmospheric Research) Command Language is used to convert, analyze and visualize data. Document analysis, observation, and interviews were used to investigate the requirements-related work. The first research question is about how and why stakeholders engage in a project, and what they do for the project. Two key findings arise. First, user counts are a vital measure of project success, which makes adoption important and makes counting tricky and political. Second, despite the importance of quantities of users, a few particular "power users" develop a relationship with the software developers and play a special role in providing feedback to the software team and integrating the system into user practice. The second research question focuses on how project objectives are articulated and how they are put into practice. The team seeks both to build a software system according to product requirements and to conduct its work according to process requirements such as user support. Support provides essential communication between users and developers that assists with refining and identifying requirements for the software. It also helps users to learn and apply the software to their real needs. User support is a vital activity for scientific software teams aspiring to create infrastructure. The third research question is about how change in scientific practice and knowledge leads to changes in the software, and vice versa. The "thickness" of a layer of software infrastructure impacts whether the

  15. Layered materials

    Science.gov (United States)

    Johnson, David; Clarke, Simon; Wiley, John; Koumoto, Kunihito

    2014-06-01

    Layered compounds, materials with a large anisotropy to their bonding, electrical and/or magnetic properties, have been important in the development of solid state chemistry, physics and engineering applications. Layered materials were the initial test bed where chemists developed intercalation chemistry, which evolved into the field of topochemical reactions, where researchers are able to perform sequential steps to arrive at kinetically stable products that cannot be directly prepared by other approaches. Physicists have used layered compounds to discover and understand novel phenomena made more apparent through reduced dimensionality. Charge and spin density waves and, more recently, the remarkable two-dimensional topological insulating state of condensed matter physics were discovered in two-dimensional materials. The understanding developed in two-dimensional materials enabled subsequent extension of these and other phenomena into three-dimensional materials. Layered compounds have also been used in many technologies as engineers and scientists used their unique properties to solve challenging technical problems (low temperature ion conduction for batteries, easy shear planes for lubrication in vacuum, edge decorated catalyst sites for catalytic removal of sulfur from oil, etc.). The articles that are published in this issue provide an excellent overview of the spectrum of activities that are being pursued, as well as an introduction to some of the most established achievements in the field. Clusters of papers discussing thermoelectric properties, electronic structure and transport properties, growth of single two-dimensional layers, intercalation and more extensive topochemical reactions and the interleaving of two structures to form new materials highlight the breadth of current research in this area. These papers will hopefully serve as a useful guideline for the interested reader to different important aspects in this field and

  16. Software reuse example and challenges at NSIDC

    Science.gov (United States)

    Billingsley, B. W.; Brodzik, M.; Collins, J. A.

    2009-12-01

    NSIDC has created a new data discovery and access system, Searchlight, to provide users with the data they want in the format they want. NSIDC Searchlight supports discovery and access to disparate data types with on-the-fly reprojection, regridding and reformatting. Architected both to reuse open source systems and to be reused itself, Searchlight reuses GDAL and Proj4 for manipulating data and format conversions, the netCDF Java library for creating netCDF output, MapServer and OpenLayers for defining spatial criteria and the JTS Topology Suite (JTS) in conjunction with Hibernate Spatial for database interaction and rich OGC-compliant spatial objects. The application reuses popular Java and JavaScript libraries including Struts 2, Spring, JPA (Hibernate), Sitemesh, JFreeChart, JQuery, DOJO and a PostGIS PostgreSQL database. Future reuse of Searchlight components is supported at varying architecture levels, ranging from the database and model components to web services. We present the tools, libraries and programs that Searchlight has reused. We describe the architecture of Searchlight and explain the strategies deployed for reusing existing software and how Searchlight is built for reuse. We will discuss NSIDC reuse of the Searchlight components to support rapid development of new data delivery systems.

  17. Towards an Ontology of Software

    OpenAIRE

    Wang, Xiaowei

    2016-01-01

    Software is permeating every aspect of our personal and social life. And yet, the cluster of concepts around the notion of software, such as the notions of a software product, software requirements, and software specifications, is still poorly understood, with no consensus on the horizon. For many, software is just code, something intangible best defined in contrast with hardware, but this is not particularly illuminating. This erroneous notion, software is just code, presents both in the ontology ...

  18. A study to investigate viscous coupling effects on the hydraulic conductance of fluid layers in two-phase flow at the pore level.

    Science.gov (United States)

    Shams, Mosayeb; Raeini, Ali Q; Blunt, Martin J; Bijeljic, Branko

    2018-07-15

    This paper examines the role of momentum transfer across fluid-fluid interfaces in two-phase flow. A volume-of-fluid finite-volume numerical method is used to solve the Navier-Stokes equations for two-phase flow at the micro-scale. The model is applied to investigate viscous coupling effects as a function of the viscosity ratio, the wetting phase saturation and the wettability, for different fluid configurations in simple pore geometries. It is shown that viscous coupling effects can be significant for certain pore geometries such as oil layers sandwiched between water in the corner of mixed wettability capillaries. A simple parametric model is then presented to estimate general mobility terms as a function of geometric properties and viscosity ratio. Finally, the model is validated by comparison with the mobilities computed using direct numerical simulation. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  19. LDUA software custodian's notebook

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements

  20. Software Formal Inspections Guidebook

    Science.gov (United States)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  1. The STARLINK software collection

    Science.gov (United States)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  2. Software breadboard study

    Science.gov (United States)

    Nuckolls, C.; Frank, Mark

    1990-01-01

    The overall goal of this study was to develop new concepts and technology for the Comet Rendezvous Asteroid Flyby (CRAF), Cassini, and other future deep space missions which maximally conform to the Functional Specification for the NASA X-Band Transponder (NXT), FM513778 (preliminary, revised July 26, 1988). The study is composed of two tasks. The first task was to investigate a new digital signal processing technique which involves the processing of 1-bit samples and has the potential for significant size, mass, power, and electrical performance improvements over conventional analog approaches. The entire X-band receiver tracking loop was simulated on a digital computer using a high-level programming language. Simulations on this 'software breadboard' showed the technique to be well-behaved and a good approximation to its analog predecessor from threshold to strong signal levels in terms of tracking-loop performance, command signal-to-noise ratio and ranging signal-to-noise ratio. The successful completion of this task paves the way for building a hardware breadboard, the recommended next step in confirming that this approach is ready for incorporation into flight hardware. The second task in this study was to investigate another technique which provides considerable simplification in the synthesis of the receiver first LO over conventional phase-locked multiplier schemes and, in this approach, provides down-conversion for an S-band emergency receive mode without the need for an additional LO. The objective of this study was to develop methodology and models to predict the conversion loss, input RF bandwidth, and output RF bandwidth of a series GaAs FET sampling mixer and to breadboard and test a circuit design suitable for the X and S-band down-conversion applications.

  3. High–Level Control System for Biomimetic Autonomous Under-water Vehicle

    Directory of Open Access Journals (Sweden)

    Praczyk Tomasz

    2017-01-01

    Full Text Available Usually, the rough software architecture designed for a robot can be presented briefly in the form of layers. The lowest layer is responsible for direct control of the hardware, i.e. engines, energy system, sensors, navigation devices, etc. The next layer is the low–level control, which knows how to use the hardware in order to achieve a desired state of the robot, e.g. to stay on a desired course. The last layer, the one nearest to the human operator, is the high–level control, which decides how to use the low–level control, and sometimes also individual pieces of the hardware, to achieve predefined objectives. The paper describes the architecture, tasks and operation of the high–level control system (HLCS) designed for the Biomimetic Autonomous Underwater Vehicle (BAUV).
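
    The three layers described above can be sketched as follows (Python; all class and method names are invented for illustration and are not taken from the BAUV software itself):

        # A minimal sketch of the layered control architecture described above.
        # Real vehicle software would talk to actual thrusters, sensors and
        # navigation devices instead of these stubs.

        class HardwareLayer:
            """Lowest layer: direct access to engines, sensors, navigation devices."""
            def set_thruster(self, name, power):
                print(f"thruster {name} -> {power:.2f}")
            def read_compass(self):
                return 87.0  # degrees, stubbed sensor reading

        class LowLevelControl:
            """Knows how to use the hardware to hold a desired state, e.g. a course."""
            def __init__(self, hw):
                self.hw = hw
            def hold_course(self, desired_deg, gain=0.05):
                error = desired_deg - self.hw.read_compass()
                self.hw.set_thruster("rudder", gain * error)  # proportional correction

        class HighLevelControl:
            """Nearest to the operator: decides how to use the lower layers."""
            def __init__(self, llc):
                self.llc = llc
            def execute_mission(self, waypoints):
                for bearing in waypoints:           # a mission reduced to bearings
                    self.llc.hold_course(bearing)   # delegate to the low-level layer

        HighLevelControl(LowLevelControl(HardwareLayer())).execute_mission([90.0, 135.0])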

  4. Evidence synthesis software.

    Science.gov (United States)

    Park, Sophie Elizabeth; Thomas, James

    2018-06-07

    It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways. Here, a range of review conditions and software solutions is considered: for example, facilitating contemporaneous collaboration across time and geographical space; in-built bias assessment tools; and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer has text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released. The EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence, using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise

  5. Effective software-oriented cryptosystem in complex PC security software

    Directory of Open Access Journals (Sweden)

    A. Moldovyan

    1995-02-01

    Full Text Available To ensure a high encryption rate and good data security, an organization of an encipherment program in the form of two modules is proposed. The first module is used for customizing the second one, the latter being the resident part of the program, which handles all application calls for encryption procedures. This approach is shown to be promising for the elaboration of cryptosystems with an indefinite cryptalgorithm. Several typical software-oriented cryptoschemes are considered. The developed cryptomodules have a high encipherment rate (2-10 Mbps on an Intel 386) and ensure a high level of information protection. The organization of a new computer security software complex, COBRA, is considered. A high enciphering rate and good data protection are provided by the resident cryptomodule, which uses less than 1 kbyte of main memory and works in dynamic encryption mode.
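
    The two-module organization described, a customizing module that configures a small resident module which then serves all encryption calls, might look roughly like this (Python; the keystream below is a deliberately trivial toy used only to show the structure, not a secure cipher and not the COBRA algorithm):

        # Structural sketch only: a real system would install a hardened cipher,
        # not this toy XOR keystream. All names are invented.
        import os

        class ResidentCryptoModule:
            """Resident part: stays in memory and serves all encryption calls."""
            def __init__(self):
                self._keystream = None          # installed by the customizing module

            def install(self, keystream_fn):
                self._keystream = keystream_fn

            def encrypt(self, data: bytes) -> bytes:
                return bytes(b ^ k for b, k in zip(data, self._keystream(len(data))))

            decrypt = encrypt                   # XOR with the same stream is symmetric

        def customize(module: ResidentCryptoModule, key: bytes) -> None:
            """Customizing module: derives a keystream generator from the key,
            installs it into the resident module, and is not needed afterwards."""
            def keystream(n: int) -> bytes:
                state, out = bytearray(key), bytearray()
                while len(out) < n:
                    state = bytearray((b * 131 + i + 7) % 256 for i, b in enumerate(state))
                    out.extend(state)
                return bytes(out[:n])
            module.install(keystream)

        resident = ResidentCryptoModule()
        customize(resident, os.urandom(16))
        ciphertext = resident.encrypt(b"example payload")
        assert resident.decrypt(ciphertext) == b"example payload"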

  6. Beginning software engineering

    CERN Document Server

    Stephens, Rod

    2015-01-01

    Beginning Software Engineering demystifies the software engineering methodologies and techniques that professional developers use to design and build robust, efficient, and consistently reliable software. Free of jargon and assuming no previous programming, development, or management experience, this accessible guide explains important concepts and techniques that can be applied to any programming language. Each chapter ends with exercises that let you test your understanding and help you elaborate on the chapter's main concepts. Everything you need to understand waterfall, Sashimi, agile, RAD, Scrum, Kanban, Extreme Programming, and many other development models is inside!

  7. Software industrial flexible

    OpenAIRE

    Díaz Araya, Daniel; Muñoz, Leandro; Sirerol, Daniel; Oviedo, Sandra; Ibáñez, Francisco S.

    2012-01-01

    This work aims to investigate and propose techniques, methods and technologies that enable the development of flexible software in industrial environments. The objective is to generate methods and techniques to facilitate the development of flexible software in industrial environments. The research areas are production scheduling systems, the generation of software for open hardware platforms, and innovation.

  8. Thyroid uptake software

    International Nuclear Information System (INIS)

    Alonso, Dolores; Arista, Eduardo

    2003-01-01

    The DETEC-PC software was developed as a complement to a measurement system (hardware) able to perform iodine thyroid uptake studies. The software was designed according to the principles of object-oriented programming, using the C++ language. The software automatically sets the spectrometric measurement parameters and, besides patient measurement, also performs statistical analysis of a batch of samples. It possesses a PARADOX database with all information on measured patients and a help system covering the system options and medical concepts related to the thyroid uptake study

  9. Criteria for software modularization

    Science.gov (United States)

    Card, David N.; Page, Gerald T.; Mcgarry, Frank E.

    1985-01-01

    A central issue in programming practice involves determining the appropriate size and information content of a software module. This study attempted to determine the effectiveness of two widely used criteria for software modularization, strength and size, in reducing fault rate and development cost. Data from 453 FORTRAN modules developed by professional programmers were analyzed. The results indicated that module strength is a good criterion with respect to fault rate, whereas arbitrary module size limitations inhibit programmer productivity. This analysis is a first step toward defining empirically based standards for software modularization.

  10. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  11. Upgrade Software and Computing

    CERN Document Server

    The LHCb Collaboration, CERN

    2018-01-01

    This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis.

  12. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    SOFTWARE, LIKE ALL industry products, is the result of complex multinational supply chains with many partners from concept to development to production and maintenance. Global software engineering (GSE), IT outsourcing, and business process outsourcing during the past decade have shown growth rates of 10 to 20 percent per year. This instalment of Practitioner’s Digest summarizes experiences and guidance from industry to facilitate knowledge and technology transfer for GSE. It’s based on industry feedback from the annual IEEE International Conference on Global Software Engineering, which had...

  13. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-01-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. In the previous paper the author has already discussed the basics of microprogramming and has studied in some detail two types of new microcircuits. In this paper, methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogrammed circuit itself. (Auth.)

  14. Guide to software export

    CERN Document Server

    Philips, Roger A

    2014-01-01

    An ideal reference source for CEOs, marketing and sales managers, sales consultants, and students of international marketing, Guide to Software Export provides a step-by-step approach to initiating or expanding international software sales. It teaches you how to examine critically your candidate product for exportability; how to find distributors, agents, and resellers abroad; how to identify the best distribution structure for export; and much, much more!Not content with providing just the guidelines for setting up, expanding, and managing your international sales channels, Guide to Software

  15. Software takes command

    CERN Document Server

    Manovich, Lev

    2013-01-01

    Software has replaced a diverse array of physical, mechanical, and electronic technologies used before the 21st century to create, store, distribute and interact with cultural artifacts. It has become our interface to the world, to others, to our memory and our imagination - a universal language through which the world speaks, and a universal engine on which the world runs. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. Offering the first theoretical and historical account of software for media authoring and its effects on the prac

  16. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  17. Sobre software libre

    OpenAIRE

    Matellán Olivera, Vicente; González Barahona, Jesús; Heras Quirós, Pedro de las; Robles Martínez, Gregorio

    2004-01-01

    220 p. "Sobre software libre" reune casi una treintena de ensayos sobre temas de candente actualidad relacionados con el software libre (del cual Linux es su ex- ponente más conocido). Los ensayos que el lector encontrará están divididos en bloques temáticos que van desde la propiedad intelectual o las cuestiones económicas y sociales de este modelo hasta su uso en la educación y las administraciones publicas, pasando por alguno que repasa la historia del software libre en l...

  18. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    Science.gov (United States)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  19. Strategies employed for LHC software performance studies

    CERN Document Server

    Nowak, A

    2010-01-01

    The objective of this work is to collect and assess the software performance related strategies employed by the major players in the LHC software arena: the four main experiments (ALICE, ATLAS, CMS and LHCb) and the two main software frameworks (Geant4 and ROOT). As the software used differs between the parties, so do the directions and methods in optimization, and their intensity. The common feeling shared by nearly all interviewed parties is that performance is not one of their top priorities and that maintaining it at a constant level is a satisfactory solution, given the resources at hand. In principle, despite some organized efforts, a less structured approach seems to be the dominant one, and opportunistic optimization prevails. Four out of six surveyed groups are investigating memory management related effects, deemed to be the primary cause of their performance issues. The most commonly used tools include Valgrind and homegrown software. All questioned groups expressed the desire for advanced tools, s...

  20. Real-time SHVC software decoding with multi-threaded parallel processing

    Science.gov (United States)

    Gudumasu, Srinivas; He, Yuwen; Ye, Yan; He, Yong; Ryu, Eun-Seok; Dong, Jie; Xiu, Xiaoyu

    2014-09-01

    This paper proposes a parallel decoding framework for scalable HEVC (SHVC). Various optimization technologies are implemented on the basis of the SHVC reference software SHM-2.0 to achieve real-time decoding speed for the two-layer spatial scalability configuration. SHVC decoder complexity is analyzed with profiling information. The decoding process at each layer and the up-sampling process are designed in parallel and scheduled by a high-level application task manager. Within each layer, multi-threaded decoding is applied to accelerate the layer decoding speed. Entropy decoding, reconstruction, and in-loop processing are pipeline designed with multiple threads based on groups of coding tree units (CTU). A group of CTUs is treated as a processing unit in each pipeline stage to achieve a better trade-off between parallelism and synchronization. Motion compensation, inverse quantization, and inverse transform modules are further optimized with SSE4 SIMD instructions. Simulations on a desktop with an Intel i7-2600 processor running at 3.4 GHz show that the parallel SHVC software decoder is able to decode 1080p spatial 2x at up to 60 fps (frames per second) and 1080p spatial 1.5x at up to 50 fps for those bitstreams generated with SHVC common test conditions in the JCT-VC standardization group. The decoding performance at various bitrates with different optimization technologies and different numbers of threads is compared in terms of decoding speed and resource usage, including processor and memory.
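
    The CTU-group pipeline described above, with entropy decoding, reconstruction and in-loop filtering running as parallel stages, can be sketched as follows (Python; the stage bodies are stubs and the names are invented rather than taken from SHM-2.0):

        # Minimal sketch of a three-stage decoding pipeline over CTU groups,
        # the processing unit chosen to balance parallelism and synchronization.
        # Stage bodies are stubs; a real decoder would run entropy decoding,
        # reconstruction and in-loop filtering here.
        import threading
        import queue

        STOP = object()                        # sentinel shutting each stage down

        def stage(work, inbox, outbox):
            while True:
                item = inbox.get()
                if item is STOP:
                    if outbox is not None:
                        outbox.put(STOP)       # propagate shutdown downstream
                    return
                work(item)
                if outbox is not None:
                    outbox.put(item)

        def entropy_decode(group): group["bits"] = "parsed"
        def reconstruct(group):    group["pixels"] = "reconstructed"
        def inloop_filter(group):  group["filtered"] = True

        q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
        threads = [threading.Thread(target=stage, args=(entropy_decode, q1, q2)),
                   threading.Thread(target=stage, args=(reconstruct,    q2, q3)),
                   threading.Thread(target=stage, args=(inloop_filter,  q3, None))]
        for t in threads:
            t.start()

        for i in range(8):                     # eight CTU groups of one frame
            q1.put({"group": i})
        q1.put(STOP)
        for t in threads:
            t.join()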

  1. Intellectual Property Protection of Software – At the Crossroads of Software Patents and Open Source Software

    OpenAIRE

    Tantarimäki, Maria

    2018-01-01

    The thesis considers the intellectual property protection of software in Europe and in the US, an increasingly important subject as the world globalizes and digitalizes. The special nature of software challenges intellectual property rights. The current protection of software is based on copyright, but in this thesis two other options are considered: software patents and open source software. Software patents provide strong protection for software, whereas the pur...

  2. Software for virtual accelerator designing

    International Nuclear Information System (INIS)

    Kulabukhova, N.; Ivanov, A.; Korkhov, V.; Lazarev, A.

    2012-01-01

    The article discusses appropriate technologies for software implementation of the Virtual Accelerator. The Virtual Accelerator is considered as a set of services and tools enabling transparent execution of computational software for modeling beam dynamics in accelerators on distributed computing resources. Distributed storage and information processing facilities utilized by the Virtual Accelerator make use of the Service-Oriented Architecture (SOA) according to a cloud computing paradigm. Control system tool-kits (such as EPICS and TANGO), computing modules (including high-performance computing), realization of the GUI with existing frameworks and visualization of the data are discussed in the paper. The presented research consists of a software analysis for the realization of interaction between all levels of the Virtual Accelerator and some samples of middleware implementation. A set of servers and clusters at St. Petersburg State University forms the infrastructure of the computing environment for Virtual Accelerator design. The use of component-oriented technology to realize the interaction between Virtual Accelerator levels is proposed. The article concludes with an overview and substantiation of the choice of technologies that will be used for the design and implementation of the Virtual Accelerator. (authors)

  3. KTM Tokamak operation scenarios software infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Pavlov, V.; Baystrukov, K.; Golobkov, YU.; Ovchinnikov, A.; Meaentsev, A.; Merkulov, S.; Lee, A. [National Research Tomsk Polytechnic University, Tomsk (Russian Federation); Tazhibayeva, I.; Shapovalov, G. [National Nuclear Center (NNC), Kurchatov (Kazakhstan)

    2014-10-15

    One of the largest problems for tokamak devices such as the Kazakhstan Tokamak for Material Testing (KTM) is the development and execution of operation scenarios. Operation scenarios may change often, so a convenient hardware and software solution is required for scenario management and execution. Dozens of diagnostic and control subsystems with numerous configuration settings may be used in an experiment, so it is necessary to automate the subsystem configuration process, to coordinate changes of related settings, and to prevent errors. Most of the diagnostic and control subsystem software at KTM was unified using an extra software layer describing the hardware abstraction interface. The experiment sequence was described using a command language. The whole infrastructure was brought together by a universal communication protocol supporting various media, including Ethernet and serial links. The operation sequence execution infrastructure was used at KTM to carry out plasma experiments.
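
    A hardware abstraction layer of the kind described, one uniform interface wrapped around heterogeneous subsystems plus a small command language for scenario steps, might be sketched as follows (Python; all subsystem names, settings and commands are hypothetical, and the real KTM infrastructure uses its own protocol and command language):

        # Sketch of a unifying hardware-abstraction layer plus a tiny scenario
        # interpreter. All subsystem names and commands are invented.
        from abc import ABC, abstractmethod

        class Subsystem(ABC):
            """Uniform interface each diagnostic/control subsystem must expose."""
            @abstractmethod
            def configure(self, settings: dict): ...
            @abstractmethod
            def execute(self, command: str): ...

        class GasPuffing(Subsystem):
            def configure(self, settings): print("gas configured:", settings)
            def execute(self, command):    print("gas command:", command)

        class Magnets(Subsystem):
            def configure(self, settings): print("magnets configured:", settings)
            def execute(self, command):    print("magnets command:", command)

        REGISTRY = {"gas": GasPuffing(), "magnets": Magnets()}

        def run_scenario(script: str):
            """Each scenario line: '<subsystem> set <key> <value>' or
            '<subsystem> <command>'; settings go through one coordinated path."""
            for line in script.strip().splitlines():
                name, verb, *args = line.split()
                sub = REGISTRY[name]
                if verb == "set":
                    key, value = args
                    sub.configure({key: float(value)})
                else:
                    sub.execute(verb)

        run_scenario("""
        magnets set toroidal_field 1.0
        gas set puff_rate 0.4
        gas open_valve
        """)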

  4. Software quality assurance plan for viscometer

    International Nuclear Information System (INIS)

    Gimera, M.

    1994-01-01

    The in situ viscometer is a portable instrument designed to raise and lower a sphere (rheometer ball) through layers of tank waste material while recording ball position, velocity, and cable tension. In the field, the viscometer attaches to a decontamination spool piece which in turn is designed to attach to any 4-inch, 150-pound flange (typical of many available tank risers). The motion of the ball and collection of data is controlled by instrumentation and control equipment housed in a separate remote control console. This document covers the product, Viscometer Data Acquisition Software. This document provides the software quality assurance plan, verification and validation plan, and configuration management plan for developing the software for the instrumentation that will be used to obtain rheology data from Tank SY-101

  5. Center for Adaptive Optics | Software

    Science.gov (United States)

    The Center for Adaptive Optics acts as a clearing house for distributing software to institutes and gives specialists in adaptive optics a place to distribute their software. All software is shared on an "as-is" basis and users should consult with the software authors with any questions.

  6. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    Seo, Yong Seok; Seong, Seung Hwan; Park, Keun Ok; Hur, Sub; Kim, Dong Hoon

    2001-01-01

    In order to achieve high quality of SMART MMIS software, a well-constructed software testing concept is required. This paper establishes a software testing concept to be applied to SMART MMIS software, in terms of software testing organization, documentation, procedures, and methods. The software testing methods are classified into source-code static analysis and dynamic testing. The software dynamic testing methods are discussed from two aspects: white-box and black-box testing. As the software testing concept introduced in this paper is applied to the SMART MMIS software, high quality software will be produced. In the future, software failure data will be collected through the construction of a SMART MMIS prototyping facility to which the software testing concept of this paper is applied
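
    The white-box/black-box distinction mentioned above can be made concrete with a small example (Python; the function under test and the tests are invented for illustration):

        # Black-box tests check observable input/output behavior against the
        # specification; white-box tests are designed from the code structure,
        # here to force both branches of the guard condition.
        import unittest

        def clamp_setpoint(value, low=0.0, high=100.0):
            """Function under test: keep an operator setpoint inside safe limits."""
            if value < low:
                return low
            if value > high:
                return high
            return value

        class BlackBoxTests(unittest.TestCase):
            def test_specified_behavior(self):
                self.assertEqual(clamp_setpoint(42.0), 42.0)   # nominal case from spec

        class WhiteBoxTests(unittest.TestCase):
            def test_low_branch(self):
                self.assertEqual(clamp_setpoint(-5.0), 0.0)    # exercises 'value < low'
            def test_high_branch(self):
                self.assertEqual(clamp_setpoint(250.0), 100.0) # exercises 'value > high'

        if __name__ == "__main__":
            unittest.main()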

  7. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with color graphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  8. Petroleum software profiles

    International Nuclear Information System (INIS)

    Anon.

    1996-01-01

    A profile of twenty-two software packages designed for petroleum exploration and production was provided. Some focussed on the oil and gas engineering industry, and others on mapping systems containing well history files and well data summaries. Still other programs provided accounting systems designed to address the complexities of the oil and gas industry. The software packages reviewed were developed by some of the best-known groups involved in software development for the oil and gas industry, including among others, Geoquest, the Can Tek Group, Applied Terravision Systems Inc., Neotechnology Consultants Ltd., OGCI Software Inc., Oracle Energy, Production Revenue Information Systems Management, Virtual Computing Services Ltd., and geoLogic Systems Ltd

  9. Next Generation Software Development

    National Research Council Canada - National Science Library

    Manna, Zohar

    2005-01-01

    Under this grant we have studied the development of a scientifically sound basis for software development that builds on widely used pragmatic methods but is firmly grounded in well-established formal...

  10. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but it also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand the management of distributed software projects, but it also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques into a framework for managing risks in distributed contexts. This framework was developed iteratively...

  11. eSoftwareList

    Data.gov (United States)

    US Agency for International Development — USAID Software Database reporting tool created in Oracle Application Express (APEX). This version provides read only access to a database view of the JIRA SAR...

  12. Software didattico: integrazione scolastica

    Directory of Open Access Journals (Sweden)

    Lucia Ferlino

    1996-01-01

    Full Text Available A discussion of the use of educational software for school integration: using it well requires being aware of its potential effectiveness and knowing that effectiveness also lies in the choice of functional products.

  13. Core Flight Software

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of the AES Core Flight Software (CFS) project is to analyze applicability, and evolve and extend the reusability, of the CFS system originally developed by...

  14. Tier2 Submit Software

    Science.gov (United States)

    Download this tool for Windows or Mac, which helps facilities prepare a Tier II electronic chemical inventory report. The data can also be exported into the CAMEOfm (Computer-Aided Management of Emergency Operations) emergency planning software.

  15. SEER Data & Software

    Science.gov (United States)

    Options for accessing datasets for incidence, mortality, county populations, standard populations, expected survival, and SEER-linked and specialized data. Plus variable definitions, documentation for reporting and using datasets, statistical software (SEER*Stat), and observational research resources.

  16. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  17. Software for nuclear spectrometry

    International Nuclear Information System (INIS)

    1998-10-01

    The Advisory Group Meeting (AGM) on Software for Nuclear Spectrometry was dedicated to reviewing the present status of software for nuclear spectrometry and to advising on future activities in this field. Because similar AGMs and consultants' meetings had been held in the past, and in an attempt to be more streamlined, this AGM was devoted to the specific field of software for gamma-ray spectrometry. Nevertheless, many of the issues discussed and the recommendations made are of general concern for any software on nuclear spectrometry. The report is organized in sections. The 'Summary' gives the conclusions and recommendations adopted at the AGM. These conclusions and recommendations resulted from the discussions held during and after presentations of the scientific and technical papers. These papers are reported here in their entirety in the following sections

  18. Software for radiation protection

    International Nuclear Information System (INIS)

    Graffunder, H.

    2002-01-01

    The software products presented are universally usable programs for radiation protection. The systems were designed in order to establish a comprehensive database specific to radiation protection and, on this basis, to model radiation protection topics in the programs. Development initially focused on the creation of the database. Each software product was to access the same nuclide-specific data; input errors and differences in spelling were to be excluded from the outset. This makes the products more compatible with each other and able to exchange data among each other. The software products are modular in design. Functions recurring in radiation protection are always treated the same way in different programs, and are also represented the same way on the program surface. The recognition effect makes it easy for users to become familiar with the products quickly. All software products are written in German and are tailored to the administrative needs and codes and regulations in Germany and in Switzerland. (orig.) [de]
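
    The shared-database design described, every program reading the same nuclide-specific data so that spelling differences and input errors are excluded from the outset, can be sketched as follows (Python; the file layout, field names and data values are placeholders invented for illustration, not reference data):

        # Sketch of one nuclide database shared by every radiation-protection
        # tool, so that all programs see identical nuclide-specific data.
        # All values below are placeholders, not reference data.
        import csv, io, math

        NUCLIDE_CSV = io.StringIO("""nuclide,half_life_s,dose_coeff_sv_per_bq
        Cs-137,9.5e8,1.3e-8
        I-131,6.9e5,2.2e-8
        """)

        DB = {row["nuclide"]: {k: float(v) for k, v in row.items() if k != "nuclide"}
              for row in csv.DictReader(NUCLIDE_CSV)}

        def remaining_activity(nuclide, bq0, t_s):
            """Tool 1: decay calculation, reading the shared database."""
            lam = math.log(2) / DB[nuclide]["half_life_s"]
            return bq0 * math.exp(-lam * t_s)

        def committed_dose_sv(nuclide, intake_bq):
            """Tool 2: dose estimate, reading the same shared database."""
            return intake_bq * DB[nuclide]["dose_coeff_sv_per_bq"]

        print(remaining_activity("Cs-137", 1e6, 3.15e7))  # activity after ~1 year
        print(committed_dose_sv("I-131", 5e3))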

  19. ITSY Handheld Software Radio

    National Research Council Canada - National Science Library

    Bose, Vanu

    2001-01-01

    .... A handheld software radio platform would enable the construction of devices that could interoperate with multiple legacy systems, download new waveforms and be used to construct ad hoc networks...

  20. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suited for automatically developing ultra reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  1. Lab Streaming Layer Enabled Myo Data Collection Software User Manual

    Science.gov (United States)

    2017-06-07

    LSL is an open source system for transmitting...

  2. Limpet Shells from the Aterian Level 8 of El Harhoura 2 Cave (Témara, Morocco): Preservation State of Crossed-Foliated Layers

    OpenAIRE

    Nouet, Julius; Chevallard, Corinne; Farre, Bastien; Nehrke, Gernot; Campmas, Emilie; Stoetzel, Emmanuelle; El Hajraoui, Mohamed Abdeljalil; Nespoulet, Roland

    2015-01-01

    International audience; The exploitation of mollusks by the first anatomically modern humans is a central question for archaeologists. This paper focuses on level 8 (dated around ~100 ka BP) of El Harhoura 2 Cave, located along the coastline in the Rabat-Témara region (Morocco). The large quantity of Patella sp. shells found in this level highlights questions regarding their origin and preservation. This study presents an estimation of the preservation status of these shells. We focus here ...

  3. Deep levels in a-plane, high Mg-content MgxZn1−xO epitaxial layers grown by molecular beam epitaxy

    International Nuclear Information System (INIS)

    Gür, Emre; Tabares, G.; Hierro, A.; Arehart, A.; Ringel, S. A.; Chauveau, J. M.

    2012-01-01

    Deep level defects in n-type unintentionally doped a-plane MgxZn1−xO, grown by molecular beam epitaxy on r-plane sapphire, were fully characterized using deep level optical spectroscopy (DLOS) and related methods. Four compositions of MgxZn1−xO were examined, with x = 0.31, 0.44, 0.52, and 0.56, together with a control ZnO sample. DLOS measurements revealed the presence of five deep levels in each Mg-containing sample, having energy levels of Ec − 1.4 eV, Ec − 2.1 eV, Ec − 2.6 eV, Ev + 0.3 eV, and Ev + 0.6 eV. For all Mg compositions, the activation energies of the first three states were constant with respect to the conduction band edge, whereas the latter two revealed constant activation energies with respect to the valence band edge. In contrast to the ternary materials, only three levels, at Ec − 2.1 eV, Ev + 0.3 eV, and Ev + 0.6 eV, were observed for the ZnO control sample in this systematically grown series of samples. Substantially higher concentrations of the deep levels at Ev + 0.3 eV and Ec − 2.1 eV were observed in ZnO compared to the Mg-alloyed samples. Moreover, the trap concentrations of the Ev + 0.3 eV and Ev + 0.6 eV levels are generally invariant with Mg content, while the Ec − 1.4 eV and Ec − 2.6 eV levels show at least an order of magnitude dependence on Mg content in the alloyed samples.

  4. Incorporating Code-Based Software in an Introductory Statistics Course

    Science.gov (United States)

    Doehler, Kirsten; Taylor, Laura

    2015-01-01

    This article is based on the experiences of two statistics professors who have taught students to write and effectively utilize code-based software in a college-level introductory statistics course. Advantages of using software and code-based software in this context are discussed. Suggestions are made on how to ease students into using code with…

  5. The 1988 Directory of Educational Software Publishing Companies.

    Science.gov (United States)

    Electronic Learning, 1988

    1988-01-01

    Based on questionnaires sent to educational software companies in January 1988, this directory lists 78 companies. Information given includes company address, curriculum subject areas for which the company publishes software, types of machines and operating systems on which the software operates, and grade level for which it is targeted. (LRW)

  6. MARS software package status

    International Nuclear Information System (INIS)

    Azhgirej, I.L.; Talanov, V.V.

    2000-01-01

    The MARS software package is intended for simulating nuclear-electromagnetic cascades and the transport of secondary neutrons and muons in heterogeneous media of arbitrary complexity in the presence of magnetic fields. The package realizes an inclusive approach to describing particle production in nuclear and electromagnetic interactions and in the decay of unstable particles. The MARS software package has been actively applied to solving various radiation physics problems.

  7. MAGIC user's group software

    International Nuclear Information System (INIS)

    Warren, G.; Ludeking, L.; McDonald, J.; Nguyen, K.; Goplen, B.

    1990-01-01

    The MAGIC User's Group has been established to facilitate the use of electromagnetic particle-in-cell software by universities, government agencies, and industrial firms. The software consists of a series of independent executables that are capable of inter-communication. MAGIC, SOS, and μSOS are used to perform electromagnetic simulations, while POSTER is used to provide post-processing capabilities. Each is described in the paper. Use of the codes for Klystrode simulation is discussed.

  8. Global software development

    DEFF Research Database (Denmark)

    Matthiesen, Stina

    2016-01-01

    This overview presents the mid stages of my doctoral research, based on ethnographic work conducted in IT companies in India and in Denmark, on collaborative work within global software development (GSD). In the following I briefly introduce how this research seeks to spark a debate in CSCW by challenging contemporary ideals about software development outsourcing through the exploration of the multiplicities and asymmetric dynamics inherent in the collaborative work of GSD.

  9. Principles of Antifragile Software

    OpenAIRE

    Monperrus, Martin

    2014-01-01

    The goal of this paper is to study and define the concept of "antifragile software". For this, I start from Taleb's statement that antifragile systems love errors, and discuss whether traditional software dependability fits into this class. The answer is somewhat negative, although adaptive fault tolerance is antifragile: the system learns something when an error happens, and always improves. Automatic runtime bug fixing is changing the code in response to errors, fault injection in productio...
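
    The abstract's claim that adaptive fault tolerance is antifragile, because the system retains something from each error, can be pictured as a guard that remembers failures. The sketch below is a hypothetical illustration, not code from the paper; all names and the threshold value are invented.

        # Hypothetical illustration: a guard that "learns" from errors, so each
        # failure leaves the system better informed about which path to trust.
        class AdaptiveGuard:
            def __init__(self, primary, fallback, threshold=3):
                self.primary = primary      # preferred but possibly faulty path
                self.fallback = fallback    # degraded but dependable path
                self.threshold = threshold
                self.failures = 0           # knowledge accumulated from errors

            def __call__(self, *args, **kwargs):
                if self.failures >= self.threshold:
                    # Past errors taught the guard to bypass the bad path entirely.
                    return self.fallback(*args, **kwargs)
                try:
                    result = self.primary(*args, **kwargs)
                    self.failures = 0       # a success restores trust
                    return result
                except Exception:
                    self.failures += 1      # the error improves the guard
                    return self.fallback(*args, **kwargs)

    Each failed call makes subsequent calls cheaper and safer, which is the "loves errors" property in miniature; a merely fault-tolerant system would retry without retaining anything.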

  10. Software product quality measurement

    OpenAIRE

    Godliauskas, Eimantas

    2016-01-01

    This paper analyses Ruby product quality measures, suggesting three new measures for the Ruby product quality measurement tool Rubocop to measure the Ruby product quality characteristics defined in the ISO 2502n standard series. The paper consists of four main chapters. The first chapter gives a brief view of software product quality and software product quality measurement. The second chapter analyses object-oriented quality measures. The third chapter gives a brief view of the most popular Ruby qualit...

  11. Managing MDO Software Development Projects

    Science.gov (United States)

    Townsend, J. C.; Salas, A. O.

    2002-01-01

    Over the past decade, the NASA Langley Research Center developed a series of 'grand challenge' applications demonstrating the use of parallel and distributed computation and multidisciplinary design optimization. All but the last of these applications were focused on the high-speed civil transport vehicle; the final application focused on reusable launch vehicles. Teams of discipline experts developed these multidisciplinary applications by integrating legacy engineering analysis codes. As teams became larger and the application development became more complex with increasing levels of fidelity and numbers of disciplines, the need for applying software engineering practices became evident. This paper briefly introduces the application projects and then describes the approaches taken in project management and software engineering for each project; lessons learned are highlighted.

  12. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short-term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long-term goal is to enable a smooth transition from high-level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle: the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  13. Examining software complexity and quality for scientific software

    International Nuclear Information System (INIS)

    Kelly, D.; Shepard, T.

    2005-01-01

    Research has not found a simple relationship between software complexity and software quality, and in particular no relationship between commonly used software complexity metrics and the occurrence of software faults. A study of an example of scientific software from the nuclear power industry illustrates the importance of addressing cognitive complexity, the complexity related to understanding the intellectual content of the software. Simple practices such as aptly-named variables contribute more to high-quality software than limiting code size. This paper examines the research into complexity and quality and reports on a longitudinal study using the example of nuclear software. (author)
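
    The point about cognitive complexity is easy to demonstrate. The two functions below, a hypothetical example not drawn from the study, are identical under size- and branch-based metrics (both have cyclomatic complexity 1, a single straight-line path), yet only one communicates its intellectual content.

        def f(a, b, c):
            return a * (1 + b) ** c

        def future_value(principal, annual_rate, years):
            # Compound interest: the value of `principal` after `years`
            # at a fixed `annual_rate`.
            return principal * (1 + annual_rate) ** years

    Any metric that counts lines or branches scores these identically; the quality difference lives entirely in the names and the comment, which is the cognitive complexity the study argues matters.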

  14. Usage of Modified Heuristic Model for Determination of Software Stability

    Directory of Open Access Journals (Sweden)

    Sergey Konstantinovich Marfenko

    2013-02-01

    The subject of this paper is an analysis method for determining the stability of software against attacks on its integrity. It is suggested to use a modified heuristic model of software reliability as the mathematical basis of this method. The model is based on the classic approach but takes into account the impact levels of different software errors on system integrity. It allows critical characteristics of the software to be defined: the percentage of time in stable operation and the probability of failure.
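
    The abstract gives no formulas, so the sketch below is only a speculative illustration of the general idea: weighting observed errors by their impact on integrity to estimate the two characteristics the paper names. The weight table and the Poisson failure model are assumptions, not the paper's actual model.

        # Speculative sketch: impact-weighted estimate of stability characteristics.
        import math

        IMPACT_WEIGHT = {"minor": 0.1, "major": 0.5, "critical": 1.0}  # assumed weights

        def stability_estimate(errors, observation_hours):
            """errors: list of (impact_class, downtime_hours) pairs."""
            downtime = sum(hours for _, hours in errors)
            stable_fraction = 1 - downtime / observation_hours  # share of time stable

            # Impact-weighted error intensity per hour, converted into the
            # probability of at least one integrity failure in the next hour.
            intensity = sum(IMPACT_WEIGHT[cls] for cls, _ in errors) / observation_hours
            p_failure = 1 - math.exp(-intensity)
            return stable_fraction, p_failure

        print(stability_estimate([("minor", 0.5), ("critical", 2.0)], 720))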

  15. Continuous software engineering – a microservices architecture perspective

    OpenAIRE

    O'Connor, Rory; Elger, Peter; Clarke, Paul

    2017-01-01

    From its earliest days, software development has been beset with challenges in relation to timely delivery, appropriateness of features and quality of deliverables. Many advances in software development processes have helped to address these concerns. For example, agile software development has helped to deliver working software more frequently and capability maturity frameworks have brought about improved consistency in quality levels. However, the age-old challenge of better, cheaper, faste...

  16. Co-sourcing in software development offshoring

    DEFF Research Database (Denmark)

    Schlichter, Bjarne Rerup; Persson, John Stouby

    2013-01-01

    Software development projects are increasingly geographically distributed through offshoring, which introduces complex risks that can lead to project failure. Co-sourcing is a highly integrative and cohesive approach to software development offshoring that has been seen as successful. However, research on how co-sourcing shapes the perception and alleviation of common offshoring risks is limited. We present a case study of how a certified CMMI-level 5 Danish software supplier approaches these risks in offshore co-sourcing. The paper explains how common offshoring risks are perceived and alleviated when adopting the co...

  17. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different from the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine...
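
    The shared likelihood that the compared packages maximize has a standard form: with daily survival rate s, an inter-visit interval of t days ending with the nest alive contributes s**t, and one ending in failure contributes 1 - s**t. The sketch below maximizes it in a few lines; the observation data are invented for illustration, and scipy is assumed to be available.

        # Hedged sketch of the nest-survival likelihood described above.
        import math
        from scipy.optimize import minimize_scalar

        # (interval_length_in_days, nest_survived_the_interval)
        intervals = [(3, True), (4, True), (3, True), (5, False), (4, True)]

        def neg_log_likelihood(s):
            ll = 0.0
            for t, survived in intervals:
                p = s ** t if survived else 1.0 - s ** t
                ll += math.log(p)
            return -ll

        res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 1 - 1e-6),
                              method="bounded")
        print(f"daily survival rate MLE: {res.x:.3f}")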

  18. Software Engineering Reviews and Audits

    CERN Document Server

    Summers, Boyd L

    2011-01-01

    Accurate software engineering reviews and audits have become essential to the success of software companies and military and aerospace programs. These reviews and audits define the framework and specific requirements for verifying software development efforts. Authored by an industry professional with three decades of experience, Software Engineering Reviews and Audits offers authoritative guidance for conducting and performing software first article inspections, and functional and physical configuration software audits. It prepares readers to answer common questions for conducting and perform...

  19. Business Management Software Axolon ERP

    OpenAIRE

    Axolon ERP Solution

    2018-01-01

    Axolon ERP (www.axolonerp.com) by Micromind is a comprehensive business management software solution for businesses. We deliver business management software in Dubai, the UAE, and the GCC countries; products also include ERP software, HR & payroll, inventory software, project management, and software development solutions and services in Dubai, UAE for small and medium-sized enterprises (SMEs) in the Middle East, with an easy-to-use, secure and efficient business management...

  20. Managing Cultural Variation in Software Process Improvement

    DEFF Research Database (Denmark)

    Müller, Sune Dueholm; Kræmmergaard, Pernille; Mathiassen, Lars

    The scale and complexity of change in software process improvement (SPI) are considerable, and managerial attention to organizational culture during SPI can therefore potentially contribute to successful outcomes. However, we know little about the impact of variations in organizational subculture... CMMI level 2 as planned, ASY struggled to implement even modest improvements. To explain these differences, we analyzed the underlying organizational culture within ISY and ASY using two different methods for subculture assessment. The study demonstrates how variations in culture across software