WorldWideScience

Sample records for catissue core software

  1. caTissue Suite 1.2 released —

    Science.gov (United States)

    caTissue Suite 1.2 is an open-source, web- and programmatically accessible tool for managing biospecimens collected in support of basic and clinical research. Building on the capabilities of previous releases, the application helps users manage biospecimen inventory, annotation, and sample tracking. It also supports clinical and pathology report annotation and provides query capabilities for researchers to identify and locate biospecimens for their research projects. In addition, it features "Dynamic Extensions", allowing biorepositories to extend the caTissue data model and develop annotations customized for their institution.

  2. Core Flight Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The mission of the CFS project is to provide reusable software in support of human space exploration programs. The top-level technical approach to...

  3. caTissue Suite to OpenSpecimen: Developing an extensible, open source, web-based biobanking management system.

    Science.gov (United States)

    McIntosh, Leslie D; Sharma, Mukesh K; Mulvihill, David; Gupta, Snehil; Juehne, Anthony; George, Bijoy; Khot, Suhas B; Kaushal, Atul; Watson, Mark A; Nagarajan, Rakesh

    2015-10-01

    The National Cancer Institute (NCI) Cancer Biomedical Informatics Grid® (caBIG®) program established standards and best practices for biorepository data management by creating an infrastructure to promote biospecimen resource sharing while maintaining data integrity and security. caTissue Suite, a biospecimen data management software tool, evolved from this effort. More recently, caTissue Suite has continued to evolve as an open-source initiative known as OpenSpecimen. The essential functionality of OpenSpecimen includes the capture and representation of highly granular, hierarchically structured data for biospecimen processing, quality assurance, tracking, and annotation. Ideal for multi-user and multi-site biorepository environments, OpenSpecimen permits role-based access to specific sets of data operations through a user interface designed to accommodate varying workflows and unique user needs. The software is interoperable, both syntactically and semantically, with an array of other bioinformatics tools given its integration of standard vocabularies, thus enabling research involving biospecimens. End-users are encouraged to share their day-to-day experiences in working with the application, providing the community board with insight into the needs and limitations that must be addressed. Users are also asked to review and validate new features through group testing environments and mock screens. Through this user interaction, application flexibility and interoperability have been recognized as necessary developmental focuses, essential for accommodating diverse adoption scenarios and biobanking workflows to catalyze advances in biomedical research and operations. Given the diversity of biobanking practices and workforce roles, consistent efforts have been made to maintain robust data granularity while aiding user accessibility, data discoverability, and security within and across applications by providing a lower learning curve in using Open...

  4. Core software security security at the source

    CERN Document Server

    Ransome, James

    2013-01-01

    First and foremost, Ransome and Misra have made an engaging book that will empower readers in both large and small software development and engineering organizations to build security into their products. This book clarifies to executives the decisions to be made on software security and then provides guidance to managers and developers on process and procedure. Readers are armed with firm solutions for the fight against cyber threats. - Dr. Dena Haritos Tsamitis, Carnegie Mellon University. In the wake of cloud computing and mobile apps, the issue of software security has never been more important.

  5. Core Flight Software (CFS) Maturation Towards Human Rating Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Core Flight Software (CFS) system developed by Goddard Space Flight Center, through experience on Morpheus, has proven to be a quality product and a viable...

  6. Improved parallelism and scheduling in multi-core software routers

    OpenAIRE

    Egi, Norbert; Iannaccone, Gianluca; Manesh, Maziar; Mathy, Laurent; Ratnasamy, Sylvia

    2011-01-01

    Recent technological advances in commodity server architectures, with multiple multi-core CPUs, integrated memory controllers, high-speed interconnects, and enhanced network interface cards, provide substantial computational capacity, and thus an attractive platform for packet forwarding. However, to exploit this available capacity, we need a suitable software platform that allows effective parallel packet processing and resource management. In this paper, we first introduce an ...

  7. The core trigger software framework of the ATLAS experiment

    CERN Document Server

    Bold, T; The ATLAS collaboration; Kama, S; Emeliyanov, D

    2013-01-01

    The high level trigger (HLT) of the ATLAS experiment at the LHC selects interesting proton-proton and heavy ion collision events for the wide-ranging ATLAS physics program. The HLT examines events selected by the level-1 hardware trigger using a combination of specially designed software algorithms and offline reconstruction algorithms. The flexible design of the entire trigger system was critical for the success of ATLAS data taking during the first run of the LHC. The flexibility of the HLT is due to its versatile core software, which includes a steering infrastructure, responsible for configuration and execution of hundreds of trigger algorithms, and a navigation infrastructure, responsible for storing trigger results for physics analysis and combining algorithms into multi-object triggers. The multi-object triggers are crucial for efficient selection of interesting physics events at high LHC luminosity while running within limited bandwidth budgets. Resource consumption by the software algorithms was min...

  8. DETERMINING THE CORE PART OF SOFTWARE DEVELOPMENT CURRICULUM APPLYING ASSOCIATION RULE MINING ON SOFTWARE JOB ADS IN TURKEY

    Directory of Open Access Journals (Sweden)

    Ilkay Yelmen

    2016-01-01

    Full Text Available Software technology is advancing rapidly. To adapt to this advancement, software development professionals must continually renew their skills. During this rapid change, it is vital to train software developers according to the criteria desired by the industry. Therefore, the curricula of university programs related to software development should be revised according to software industry requirements. In this study, the core part of a software development curriculum is determined by applying association rule mining to software job ads in Turkey. The courses in the core part are chosen with respect to the IEEE/ACM computer science curriculum. As a future study, it is also important to gather academic personnel and software company professionals to determine the compulsory and elective courses so that newly graduated software dev...
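The mining step the abstract describes can be sketched in a few lines. The skill sets below are hypothetical stand-ins for parsed job ads (the study's real dataset and thresholds are not reproduced here); the rule extraction follows the standard support/confidence scheme of Apriori-style association rule mining.

```python
from itertools import combinations
from collections import Counter

# Toy corpus of job-ad skill sets (hypothetical data, not the paper's).
ads = [
    {"java", "sql", "oop"},
    {"java", "sql", "spring"},
    {"python", "sql", "oop"},
    {"java", "oop", "spring"},
    {"java", "sql", "oop"},
]

MIN_SUPPORT = 0.4     # fraction of ads containing the itemset
MIN_CONFIDENCE = 0.7  # P(rhs | lhs)

n = len(ads)
pair_counts = Counter()
item_counts = Counter()
for ad in ads:
    item_counts.update(ad)
    pair_counts.update(combinations(sorted(ad), 2))

# Extract rules lhs -> rhs passing both thresholds.
rules = []
for (a, b), count in pair_counts.items():
    support = count / n
    if support < MIN_SUPPORT:
        continue
    for lhs, rhs in ((a, b), (b, a)):
        confidence = count / item_counts[lhs]
        if confidence >= MIN_CONFIDENCE:
            rules.append((lhs, rhs, round(support, 2), round(confidence, 2)))

for lhs, rhs, sup, conf in sorted(rules):
    print(f"{lhs} -> {rhs}  support={sup} confidence={conf}")
```

On this toy data, a rule such as spring -> java (confidence 1.0, support 0.4) is the kind of output that would feed decisions about core curriculum courses.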

  9. Cronos 2: a neutronic simulation software for reactor core calculations

    International Nuclear Information System (INIS)

    The CRONOS2 software is the part of the SAPHYR code system dedicated to neutronic core calculations. CRONOS2 is a powerful tool for reactor design, fuel management and safety studies. Its modular structure and great flexibility make CRONOS2 a unique simulation tool for research and development for a wide variety of reactor systems. CRONOS2 is a versatile tool that covers a large range of applications, from very fast calculations used in training simulators to time- and memory-consuming reference calculations needed to understand complex physical phenomena. CRONOS2 has a procedure library named CPROC that allows users to create their own application environment fitted to a specific industrial use. (authors)

  10. Experience with Intel's Many Integrated Core architecture in ATLAS software

    Science.gov (United States)

    Fleischmann, S.; Kama, S.; Lavrijsen, W.; Neumann, M.; Vitillo, R.; Atlas Collaboration

    2014-06-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks (TBB). This should make it possible to develop for both throughput and latency devices using a single code base. In ATLAS Software, track reconstruction has been shown to be a good candidate for throughput computing on GPGPU devices. In addition, the newly proposed offline parallel event-processing framework, GaudiHive, uses TBB for task scheduling. The MIC is thus, in principle, a good fit for this domain. In this paper, we report our experience porting ATLAS tracking algorithms to the MIC and optimizing them, comparing the programmability and relative cost/performance of the MIC against those of current GPGPUs and latency-optimized CPUs.

  11. Application of measurement and control data management software in a core neutron flux measurement system

    International Nuclear Information System (INIS)

    This paper describes the development of measurement and control data management software, based on the OPC protocol, for the core measurement system of the C2 project. The software's main functions are measuring the neutron flux distribution in the reactor core, monitoring core power distortion, accumulating fuel consumption data, and responding to abnormal conditions in a timely manner. The monitoring software running on the main cabinet computer provides integrated system monitoring, and the monitoring software running on the channel cabinet provides channel-level monitoring. The monitoring and control software plays an important role in ensuring the safety and economy of nuclear power plants. (authors)

  12. Core design methodology and software for Temelin NPP

    International Nuclear Information System (INIS)

    In the frame of the fuel vendor change at Temelin NPP in the Czech Republic, where, starting in 2010, TVEL TVSA-T fuel has been loaded instead of Westinghouse VVANTAGE-6 fuel, new methodologies for core design and core reload safety evaluation have been developed. These documents are based on the methodologies delivered by TVEL within the fuel contract, further adapted according to Temelin NPP operational needs and current practice at the NPP. Along with the methodology development, the 3D core analysis code ANDREA, licensed for core reload safety evaluation in 2010, has been upgraded in order to optimize the safety evaluation process. New sequences of calculations were implemented to simplify the evaluation of different limiting parameters, and output visualization tools were developed to make the verification process user friendly. Interfaces to the fuel performance code TRANSURANUS and the sub-channel analysis code SUBCAL were developed as well. (authors)

  13. Experience with Intel's Many Integrated Core Architecture in ATLAS Software

    CERN Document Server

    Fleischmann, S; The ATLAS collaboration; Lavrijsen, W; Neumann, M; Vitillo, R

    2013-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks. This should make it possible to develop for both throughput and latency devices using a single code base.

  14. Experience with Intel's Many Integrated Core Architecture in ATLAS Software

    CERN Document Server

    Fleischmann, S; The ATLAS collaboration; Lavrijsen, W; Neumann, M; Vitillo, R

    2014-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks. This should make it possible to develop for both throughput and latency devices using a single code base.

  15. PanPlot - software to visualize profiles and core logs

    OpenAIRE

    Sieger, Rainer; Grobe, Hannes

    2005-01-01

    The program PanPlot was developed as a visualization tool for the information system PANGAEA. It can be used as a stand-alone application to plot data versus depth or time, or in a ternary view. The data input format is tab-delimited ASCII (e.g., exported from MS-Excel or from PANGAEA). The default scales and graphic features can be individually modified. PanPlot graphs can be exported in platform-specific interchange formats (EMF, PICT), which can be imported by graphic software for further process...

  16. Core Network Design of Software Defined Radio Testbed

    OpenAIRE

    Maheshwari, Kapil

    2013-01-01

    The 4th generation of cellular systems (LTE) does not inherit the traditional circuit-switched voice capabilities of its predecessors. Instead it relies on its high-speed packet-switched core network with IMS (IP Multimedia Subsystem) for voice capabilities. Even though temporary solutions are available until LTE reaches full deployment and coverage, operators are looking for a long-term solution known as VoIMS, which uses VoIP with the SIP protocol for voice in the LTE network (VoLTE) t...

  17. Adaptive Multiclient Network-on-Chip Memory Core: Hardware Architecture, Software Abstraction Layer, and Application Exploration

    OpenAIRE

    Diana Göhringer; Lukas Meder; Stephan Werner; Oliver Oey; Jürgen Becker; Michael Hübner

    2012-01-01

    This paper presents the hardware architecture and the software abstraction layer of an adaptive multiclient Network-on-Chip (NoC) memory core. The memory core supports the flexibility of a heterogeneous FPGA-based runtime adaptive multiprocessor system called RAMPSoC. The processing elements, also called clients, can access the memory core via the Network-on-Chip (NoC). The memory core supports a dynamic mapping of an address space for the different clients as well as different data transfer ...

  18. The future of commodity computing and many-core versus the interests of HEP software

    International Nuclear Information System (INIS)

    As the mainstream computing world has shifted from multi-core to many-core platforms, the situation for software developers has changed as well. With the numerous hardware and software options available, choices balancing programmability and performance are becoming a significant challenge. The expanding multiplicative dimensions of performance offer a growing number of possibilities that need to be assessed and addressed on several levels of abstraction. This paper reviews the major trade-offs forced upon the software domain by the changing landscape of parallel technologies – hardware and software alike. Recent developments, paradigms and techniques are considered with respect to their impact on the rather traditional HEP programming models. Other considerations addressed include aspects of efficiency and reasonably achievable targets for the parallelization of large scale HEP workloads.

  19. Optimizing Project Administrative Workflow with Formstack, Sharepoint, and Vanderbilt CORES Software

    Science.gov (United States)

    Vinson, Paige; Wright, Lisa

    2013-01-01

    Tracking administrative workflow for Core projects is a difficult task. Cores are increasingly required to provide metrics demonstrating productivity, scope of projects, and success rates, yet scientific staff members do not have sufficient access or bandwidth to produce this type of broad-spectrum data easily. In an effort to reduce redundancy, automate recurrent tasks, and minimize staff labor, the Vanderbilt High Throughput Screening (HTS) Facility has combined readily available web-based software with institutional CORES software. The HTS Facility is striving toward a goal of having common sets of metrics available, as needed, to communicate the institutional impact of the Core to senior leadership and funding agencies. These administrative workflow improvements also increase effective and efficient communication in daily project administration and minimize required labor from scientific staff.

  20. Optimal hardware/software co-synthesis for core-based SoC designs

    Institute of Scientific and Technical Information of China (English)

    Zhan Jinyu; Xiong Guangze

    2006-01-01

    A hardware/software co-synthesis method is presented for SoC designs consisting of both hardware IP cores and software components, based on a graph-theoretic formulation. Given a SoC integrating a set of functions and a set of performance factors, a core for each function is selected from a set of alternative IP cores and software components, and an optimal partition is found that evenly balances the performance factors and ultimately reduces the overall cost, size, power consumption, and runtime of the core-based SoC. The algorithm formulates IP cores and components into corresponding mathematical models, presents a graph-theoretic model for finding the optimal partitions of a SoC design, and transforms the SoC hardware/software co-synthesis problem into finding optimal paths in a weighted, directed graph. Overcoming the three main deficiencies of traditional methods, this method can work automatically, simultaneously evaluate multiple performance factors, and address the particularities of SoC designs. Finally, the approach is shown to be practical and effective through the partitioning of a practical system.
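As a rough illustration of the graph-theoretic formulation, the sketch below models each function as a layer whose alternatives (an IP core or a software component) are weighted edges; when the layers are independent, the shortest source-to-sink path reduces to picking the minimum-weight alternative per layer. All names, costs, and weights are hypothetical, and the paper's actual model also balances factors across partitions, which this simplification omits.

```python
# Hypothetical alternatives per function: (name, cost, power, runtime).
alternatives = {
    "f1": [("hw_ip_a", 5.0, 2.0, 1.0), ("sw_a", 1.0, 1.0, 4.0)],
    "f2": [("hw_ip_b", 8.0, 3.0, 1.5), ("sw_b", 2.0, 1.5, 6.0)],
    "f3": [("hw_ip_c", 4.0, 2.5, 0.8), ("sw_c", 1.5, 1.0, 3.0)],
}
# Hypothetical weights trading off the performance factors.
weights = {"cost": 0.4, "power": 0.3, "runtime": 0.3}

def edge_weight(alt):
    # Weighted sum of performance factors = edge weight in the graph.
    _, cost, power, runtime = alt
    return (weights["cost"] * cost + weights["power"] * power
            + weights["runtime"] * runtime)

def co_synthesize(alts_by_func):
    # With independent layers, the shortest path through the layered
    # graph picks the minimum-weight alternative in each layer.
    selection, total = {}, 0.0
    for func, alts in alts_by_func.items():
        best = min(alts, key=edge_weight)
        selection[func] = best[0]
        total += edge_weight(best)
    return selection, round(total, 2)

selection, total = co_synthesize(alternatives)
print(selection, total)
```

With these toy numbers the software components win every layer; raising the runtime weight would start pulling functions onto hardware IP cores.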

  1. A Usability Driven Approach to Software Development for Core Facility Management

    OpenAIRE

    Marchand, Mathieu; Perret, Emmanuelle; Roux, Pascal; Shorte, Spencer

    2013-01-01

    "Core facilities" have become an integral part of modern biomedical research infrastructures and today require integrated management tools to help ensure their optimization for research. Web-based software presents great opportunities, but requires innovation inasmuch as the current generation of laboratory information management systems (LIMS) mostly consists of automatons overseeing predictable laboratory equipment processes. By contrast, core facility processes involve not just equipme...

  2. Core Community Specifications for Electron Microprobe Operating Systems: Software, Quality Control, and Data Management Issues

    Science.gov (United States)

    Fournelle, John; Carpenter, Paul

    2006-01-01

    Modern electron microprobe systems have become increasingly sophisticated. These systems utilize either UNIX or PC computer systems for measurement, automation, and data reduction. These systems have undergone major improvements in processing, storage, display, and communications, due to increased capabilities of hardware and software. Instrument specifications are typically utilized at the time of purchase and concentrate on hardware performance. The microanalysis community includes analysts, researchers, software developers, and manufacturers, who could benefit from exchange of ideas and the ultimate development of core community specifications (CCS) for hardware and software components of microprobe instrumentation and operating systems.

  3. A Core Plug and Play Architecture for Reusable Flight Software Systems

    Science.gov (United States)

    Wilmot, Jonathan

    2006-01-01

    The Flight Software Branch, at Goddard Space Flight Center (GSFC), has been working on a run-time approach to facilitate a formal software reuse process. The reuse process is designed to enable rapid development and integration of high-quality software systems and to more accurately predict development costs and schedule. Previous reuse practices have been somewhat successful when the same teams are moved from project to project. But this typically requires taking the software system in an all-or-nothing approach where useful components cannot be easily extracted from the whole. As a result, the system is less flexible and scalable with limited applicability to new projects. This paper will focus on the rationale behind, and implementation of the run-time executive. This executive is the core for the component-based flight software commonality and reuse process adopted at Goddard.

  4. Spent nuclear fuel application of CORE® systems engineering software

    International Nuclear Information System (INIS)

    The DOE has adopted a systems engineering approach for the successful completion of the Spent Nuclear Fuel (SNF) Program mission. The DOE has utilized systems engineering principles to develop the SNF Program guidance documents and has held several systems engineering workshops to develop the functional hierarchies of both the programmatic and technical sides of the SNF Program. The sheer size and complexity of the SNF Program have led to problems that the Westinghouse Savannah River Company (WSRC) is working to manage through the use of systems engineering software. WSRC began using CORE®, an off-the-shelf, PC-based software package, to assist the DOE in management of the SNF Program. This paper details the successful use of the CORE® systems engineering software to date and the proposed future activities.

  5. Spent nuclear fuel application of CORE® systems engineering software

    Energy Technology Data Exchange (ETDEWEB)

    Grimm, R.J. [Westinghouse Savannah River Company, Aiken, SC (United States)

    1996-12-01

    The Department of Energy (DOE) has adopted a systems engineering approach for the successful completion of the Spent Nuclear Fuel (SNF) Program mission. The DOE has utilized systems engineering principles to develop the SNF Program guidance documents and has held several systems engineering workshops to develop the functional hierarchies of both the programmatic and technical side of the SNF Program. The sheer size and complexity of the SNF Program, however, has led to problems that the Westinghouse Savannah River Company (WSRC) is working to manage through the use of systems engineering software. WSRC began using CORE®, an off-the-shelf PC-based software package, to assist the DOE in management of the SNF program. This paper details the successful use of the CORE® systems engineering software to date and the proposed future activities.

  6. Exploring the Impact of Socio-Technical Core-Periphery Structures in Open Source Software Development

    CERN Document Server

    Amrit, Chintan

    2010-01-01

    In this paper we apply the social network concept of core-periphery structure to the sociotechnical structure of a software development team. We propose a socio-technical pattern that can be used to locate emerging coordination problems in Open Source projects. With the help of our tool and method called TESNA, we demonstrate a method to monitor the socio-technical core-periphery movement in Open Source projects. We then study the impact of different core-periphery movements on Open Source projects. We conclude that a steady core-periphery shift towards the core is beneficial to the project, whereas shifts away from the core are clearly not good. Furthermore, oscillatory shifts towards and away from the core can be considered as an indication of the instability of the project. Such an analysis can provide developers with a good insight into the health of an Open Source project. Researchers can gain from the pattern theory, and from the method we use to study the core-periphery movements.
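A minimal sketch of tracking core-periphery movement between two snapshots of a developer network follows. TESNA's actual socio-technical analysis is far richer; here the "core" is crudely approximated by the k highest-degree developers, and all names and edges are invented.

```python
from collections import defaultdict

def degrees(edges):
    # Degree of each developer in an undirected collaboration graph.
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return deg

def core_set(edges, k=2):
    # Crude proxy for the core: the k highest-degree developers.
    deg = degrees(edges)
    return set(sorted(deg, key=deg.get, reverse=True)[:k])

def shift(dev, before, after, k=2):
    # +1: moved toward the core, -1: moved away, 0: no movement.
    return int(dev in core_set(after, k)) - int(dev in core_set(before, k))

# Two hypothetical snapshots of a developer collaboration network.
t0 = [("ann", "bob"), ("ann", "cat"), ("bob", "cat"), ("dan", "eve")]
t1 = [("ann", "bob"), ("ann", "cat"), ("bob", "cat"),
      ("dan", "ann"), ("dan", "bob"), ("dan", "cat"), ("dan", "eve")]

print(shift("dan", t0, t1))  # dan has moved into the core: prints 1
```

Aggregating such per-developer shifts over many snapshots is one simple way to observe the steady, away-from-core, or oscillatory movements the paper discusses.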

  7. Development and preliminary verification of the PWR on-line core monitoring software system. SOPHORA

    International Nuclear Information System (INIS)

    This paper presents an introduction to the development and preliminary verification of a new on-line core monitoring software system (CMSS), named SOPHORA, for the fixed in-core detector (FID) systems of PWRs. Developed at China General Nuclear Power Corporation (CGN), SOPHORA integrates CGN's advanced PWR core simulator COCO and the thermal-hydraulic sub-channel code LINDEN to manage real-time core calculation and analysis. Currents measured by the FIDs are re-evaluated and used as the basis to reconstruct the 3-D core power distribution. Key parameters such as the peak local power margin and the minimum DNBR margin are obtained by comparison with operation limits. Pseudo FID signals generated from movable in-core detector (MID) data are used to verify the SOPHORA system. Comparison between the predicted power peak and the corresponding MID in-core flux map results shows that the SOPHORA results are reasonable and satisfactory. Further verification and validation of SOPHORA is under way and will be reported later. (author)

  8. ESAIR: A Behavior-Based Robotic Software Architecture on Multi-Core Processor Platforms

    Directory of Open Access Journals (Sweden)

    Chin-Yuan Tseng

    2013-03-01

    Full Text Available This paper introduces an Embedded Software Architecture for Intelligent Robot systems (ESAIR) that addresses the issues of parallel thread execution on multi-core processor platforms. ESAIR provides a thread scheduling interface to improve the execution performance of a robot system by assigning a dedicated core to a running thread on the fly and dynamically rescheduling the priority of the thread. In the paper, we describe the object-oriented design and the control functions of ESAIR. The modular design of ESAIR helps improve software quality, reliability, and scalability in research and real practice. We demonstrate the improvement by realizing ESAIR on an autonomous robot, named AVATAR. AVATAR implements various human-robot interactions, including speech recognition, human following, face recognition, speaker identification, etc. With the support of ESAIR, AVATAR can integrate a comprehensive set of behaviors and peripherals with better resource utilization.

  9. The caCORE Software Development Kit: Streamlining construction of interoperable biomedical information services

    Directory of Open Access Journals (Sweden)

    Warzel Denise

    2006-01-01

    Full Text Available Abstract Background Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs. The National Cancer Institute (NCI developed the cancer common ontologic representation environment (caCORE to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results The caCORE SDK requires a Unified Modeling Language (UML tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG program, to create compatible data services. caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has...

  10. Improved Forwarding Architecture and Resource Management for Multi-Core Software Routers

    OpenAIRE

    Egi, Norbert; Greenhalgh, Adam; Handley, Mark; Iannaccone, Gianluca; Manesh, Maziar; Mathy, Laurent; Ratnasamy, Sylvia

    2009-01-01

    Recent technological advances in commodity server architectures, with multiple multi-core CPUs, integrated memory controllers, high-speed interconnects and enhanced network interface cards, provide substantial computational capacity and thus an attractive platform for packet forwarding. However, to exploit this available capacity, we need a suitable software platform that allows effective parallel packet processing and resource management. In this paper, we at first introduce an improved forw...
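One standard building block for such parallel packet processing is hashing each flow's 5-tuple to a fixed worker core, so per-flow packet order is preserved while distinct flows spread across cores. The sketch below illustrates the idea with hypothetical packet records; it is not the paper's forwarding architecture.

```python
import hashlib

NUM_CORES = 4

def flow_key(pkt):
    # The classic 5-tuple identifying a flow (field names are illustrative).
    return f"{pkt['src']}:{pkt['sport']}-{pkt['dst']}:{pkt['dport']}/{pkt['proto']}"

def core_for(pkt, num_cores=NUM_CORES):
    # A stable hash keeps every packet of a flow on the same core,
    # preserving per-flow order while spreading distinct flows.
    digest = hashlib.sha256(flow_key(pkt).encode()).digest()
    return digest[0] % num_cores

packets = [
    {"src": "10.0.0.1", "sport": 1234, "dst": "10.0.0.9", "dport": 80, "proto": "tcp"},
    {"src": "10.0.0.2", "sport": 5555, "dst": "10.0.0.9", "dport": 80, "proto": "tcp"},
    {"src": "10.0.0.1", "sport": 1234, "dst": "10.0.0.9", "dport": 80, "proto": "tcp"},
]

queues = {c: [] for c in range(NUM_CORES)}  # one queue per worker core
for pkt in packets:
    queues[core_for(pkt)].append(pkt)
```

Modern NICs implement exactly this dispatch in hardware (receive-side scaling); software routers use the same principle when distributing work across cores.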

  11. Geolocating thermal binoculars based on a software defined camera core incorporating HOT MCT grown by MOVPE

    Science.gov (United States)

    Pillans, Luke; Harmer, Jack; Edwards, Tim; Richardson, Lee

    2016-05-01

    Geolocation is the process of calculating a target position based on bearing and range relative to the known location of the observer. A high performance thermal imager with integrated geolocation functions is a powerful long range targeting device. Firefly is a software defined camera core incorporating a system-on-a-chip processor running the Android™ operating system. The processor has a range of industry standard serial interfaces which were used to interface to peripheral devices including a laser rangefinder and a digital magnetic compass. The core has a built-in Global Positioning System (GPS) which provides the third variable required for geolocation. The graphical capability of Firefly allowed flexibility in the design of the man-machine interface (MMI), so the finished system can give access to extensive functionality without appearing cumbersome or over-complicated to the user. This paper covers both the hardware and software design of the system, including how the camera core influenced the selection of peripheral hardware, and the MMI design process which incorporated user feedback at various stages.

  12. Dynamic optical resource allocation for mobile core networks with software defined elastic optical networking.

    Science.gov (United States)

    Zhao, Yongli; Chen, Zhendong; Zhang, Jie; Wang, Xinbo

    2016-07-25

    Driven by the advent of 5G mobile communications, the all-IP architecture of mobile core networks, i.e. the evolved packet core (EPC) proposed by 3GPP, has been greatly challenged by users' demands for higher data rates and more reliable end-to-end connections, as well as operators' demands for low operational cost. These challenges can potentially be met by software defined optical networking (SDON), which enables dynamic resource allocation according to users' requirements. In this article, a novel network architecture for mobile core networks is proposed based on SDON. A software defined network (SDN) controller is designed to realize coordinated control over different entities in EPC networks. We analyze the requirements of the EPC-lightpath (EPCL) in the data plane and propose an optical switch load balancing (OSLB) algorithm for resource allocation in the optical layer. The procedure for establishment and adjustment of EPCLs is demonstrated on a SDON-based EPC testbed with an extended OpenFlow protocol. We also evaluate the OSLB algorithm through simulation in terms of bandwidth blocking ratio, traffic load distribution, and resource utilization ratio, compared with link-based load balancing (LLB) and MinHops algorithms. PMID:27464120
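The load-balancing idea can be illustrated with a least-loaded assignment sketch: each new EPC-lightpath request is routed through the currently least-loaded optical switch. Switch names and demands here are hypothetical, and the actual OSLB algorithm, which accounts for the optical topology and blocking, is not reproduced.

```python
# Hypothetical switch loads in Gb/s (the real OSLB state is richer).
switch_load = {"sw1": 0.0, "sw2": 0.0, "sw3": 0.0}

def assign_epcl(demand_gbps, loads):
    # Route a new EPC-lightpath through the least-loaded optical switch,
    # then account for its bandwidth demand.
    target = min(loads, key=loads.get)
    loads[target] += demand_gbps
    return target

requests = [10, 40, 10, 25]
placements = [assign_epcl(d, switch_load) for d in requests]
print(placements)  # each request goes to whichever switch is lightest
```

Spreading demands this way keeps the per-switch load distribution flat, which is what drives down the bandwidth blocking ratio relative to topology-only strategies like MinHops.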

  13. Computer software development of the in-core monitoring system for the WWER type reactor

    International Nuclear Information System (INIS)

    All of this has increased the requirements on guaranteeing the design safety limits during operation of WWER reactor fuel loadings. It has also created the need to widen the functional capabilities and further develop the computer software of the in-core monitoring system for WWER reactors. The approaches adopted and the particulars of carrying out the corresponding work for the Kola NPP (WWER-440) and the Rostov NPP (WWER-1000) are briefly discussed in the report (Authors)

  14. Evaluating the scalability of HEP software and multi-core hardware

    International Nuclear Information System (INIS)

    As researchers have reached the practical limits of processor performance improvements by frequency scaling, it is clear that the future of computing lies in the effective utilization of parallel and multi-core architectures. Since this significant change in computing is well underway, it is vital for HEP programmers to understand the scalability of their software on modern hardware and the opportunities for potential improvements. This work aims to quantify the benefit of new mainstream architectures to the HEP community through practical benchmarking on recent hardware solutions, including the usage of parallelized HEP applications.

  15. Evaluating the scalability of HEP software and multi-core hardware

    CERN Document Server

    Jarp, S; Leduc, J; Nowak, A

    2011-01-01

    As researchers have reached the practical limits of processor performance improvements by frequency scaling, it is clear that the future of computing lies in the effective utilization of parallel and multi-core architectures. Since this significant change in computing is well underway, it is vital for HEP programmers to understand the scalability of their software on modern hardware and the opportunities for potential improvements. This work aims to quantify the benefit of new mainstream architectures to the HEP community through practical benchmarking on recent hardware solutions, including the usage of parallelized HEP applications.

  16. A Study of the Speedups and Competitiveness of FPGA Soft Processor Cores using Dynamic Hardware/Software Partitioning

    CERN Document Server

    Lysecky, Roman

    2011-01-01

    Field programmable gate arrays (FPGAs) provide designers with the ability to quickly create hardware circuits. Increases in FPGA configurable logic capacity and decreasing FPGA costs have enabled designers to more readily incorporate FPGAs in their designs. FPGA vendors have begun providing configurable soft processor cores that can be synthesized onto their FPGA products. While FPGAs with soft processor cores provide designers with increased flexibility, such processors typically have degraded performance and energy consumption compared to hard-core processors. Previously, we proposed warp processing, a technique capable of optimizing a software application by dynamically and transparently re-implementing critical software kernels as custom circuits in on-chip configurable logic. In this paper, we study the potential of a MicroBlaze soft-core based warp processing system to eliminate the performance and energy overhead of a soft-core processor compared to a hard-core processor. We demonstrate that the soft-c...

  17. Extension of the AMBER molecular dynamics software to Intel's Many Integrated Core (MIC) architecture

    Science.gov (United States)

    Needham, Perri J.; Bhuiyan, Ashraf; Walker, Ross C.

    2016-04-01

    We present an implementation of explicit solvent particle mesh Ewald (PME) classical molecular dynamics (MD) within the PMEMD molecular dynamics engine, part of the AMBER v14 MD software package, that makes use of Intel Xeon Phi coprocessors by offloading portions of the PME direct summation and neighbor list build to the coprocessor. We refer to this implementation as pmemd MIC offload and in this paper present the technical details of the algorithm, including basic models for MPI and OpenMP configuration, and analyze the resultant performance. The algorithm provides the best performance improvement for large systems (>400,000 atoms), achieving a ∼35% performance improvement for satellite tobacco mosaic virus (1,067,095 atoms) when 2 Intel E5-2697 v2 processors (2 × 12 cores, 30M cache, 2.7 GHz) are coupled to an Intel Xeon Phi coprocessor (Model 7120P - 1.238/1.333 GHz, 61 cores). The implementation utilizes a two-fold decomposition strategy: spatial decomposition using an MPI library and thread-based decomposition using OpenMP. We also present compiler optimization settings that improve the performance on Intel Xeon processors, while retaining simulation accuracy.

  18. Evaluating the core damage frequency of a TRIGA research reactor using risk assessment tool software

    Energy Technology Data Exchange (ETDEWEB)

    Kamyab, Shahabeddin [School of Engineering, Shiraz University, 71348-51154 Shiraz (Iran, Islamic Republic of); Nematollahi, Mohammadreza, E-mail: mrnema@yahoo.com [School of Engineering, Shiraz University, 71348-51154 Shiraz (Iran, Islamic Republic of); Safety Research Center of Shiraz University, 71348-51154 Shiraz (Iran, Islamic Republic of)

    2011-08-15

    Highlights: • In this study, a level-I PSA is performed to reveal and remedy the weak points threatening the safe operation of a typical TRIGA reactor. • After identifying the initiating events and developing the appropriate event trees and fault trees with the risk assessment tool interface, the core damage frequency has been estimated at 8.368E-6 per year of reactor operation, which meets the IAEA standards. • The results also indicate the significant effect of common cause failures. - Abstract: After all the preventive and mitigative measures considered in the design of a nuclear reactor, the installation still poses a residual risk to the outside world. Probabilistic safety assessment (PSA) is a powerful method for surveying the safety of nuclear reactors. In this study, the occurrence frequency of the different types of core damage states (CDS) which may potentially arise in the Tehran Research Reactor (TRR) is evaluated using the recently developed risk assessment tool (RAT) software, designed and presented at the Safety Research Center of Shiraz University. RAT uses event trees and fault trees to evaluate the total core damage frequency (CDF) by studying the frequencies of initiating events whose consequences result in one of the CDS. Per IAEA standards for research reactors, this frequency must be smaller than 1E-04. Results show that the total CDF for TRR is of the order of 10^-6, which meets the criterion for nuclear research reactors.

  19. User Friendly Processing of Sediment CT Data: Software and Application in High Resolution Non-Destructive Sediment Core Data Sets

    Science.gov (United States)

    Reilly, B. T.; Stoner, J. S.; Wiest, J.; Abbott, M. B.; Francus, P.; Lapointe, F.

    2015-12-01

    Computed Tomography (CT) of sediment cores allows for high resolution images, three dimensional volumes, and down core profiles, generated through the attenuation of X-rays as a function of density and atomic number. When using a medical CT scanner, these quantitative data are stored in pixels on the Hounsfield scale, which is defined relative to the attenuation of X-rays in water and air at standard temperature and pressure. Here we present MATLAB-based software specifically designed for sedimentary applications, with a user friendly graphical interface to process DICOM files and stitch overlapping CT scans. For visualization, the software allows easy generation of core slice images in grayscale or false color relative to a user defined Hounsfield number range. For comparison to other high resolution non-destructive methods, down core Hounsfield number profiles are extracted using a method robust to coring imperfections such as deformation, bowing, gaps, and gas expansion. We demonstrate the usefulness of this technique with lacustrine sediment cores from the Western United States and Canadian High Arctic, including Fish Lake, Oregon, and Sawtooth Lake, Ellesmere Island. These sites represent two different depositional environments and provide examples of a variety of common coring defects and lithologies. The Hounsfield profiles and images can be used in combination with other high resolution data sets, including sediment magnetic parameters, XRF core scans, and many other types of data, to provide unique insights into how lithology influences paleoenvironmental and paleomagnetic records and their interpretations.
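
    The published tool is MATLAB-based; the following Python/NumPy sketch (function name, window size, and HU thresholds are invented for illustration) only shows the robustness trick a down-core profile extraction of this kind relies on: per depth slice, take the median Hounsfield number of a central window, ignoring voxels outside a plausible sediment range (cracks, gas voids, core liner).

```python
import numpy as np

# Illustrative down-core Hounsfield profile extraction: median of a central
# window per depth slice, with out-of-range voxels (e.g. gas voids near
# -1000 HU) masked out so defects do not bias the profile.

def downcore_profile(volume, half_width=10, hu_min=-500.0, hu_max=3000.0):
    """volume: 3D array (depth, rows, cols) of Hounsfield units."""
    n_depth, n_rows, n_cols = volume.shape
    r0, c0 = n_rows // 2, n_cols // 2
    window = volume[:, r0 - half_width:r0 + half_width,
                    c0 - half_width:c0 + half_width]
    masked = np.where((window >= hu_min) & (window <= hu_max), window, np.nan)
    return np.nanmedian(masked.reshape(n_depth, -1), axis=1)

# Synthetic core: uniform sediment at 1200 HU with a small gas void that a
# plain mean would smear into the profile.
core = np.full((5, 64, 64), 1200.0)
core[2, 30:34, 30:34] = -1000.0
profile = downcore_profile(core)   # all five depths read 1200 HU
```

    A median over a masked window is one simple way to stay robust to deformation and gas expansion; the actual software may use a different statistic.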

  20. On the Design of Energy Efficient Optical Networks with Software Defined Networking Control Across Core and Access Networks

    DEFF Research Database (Denmark)

    Wang, Jiayuan; Yan, Ying; Dittmann, Lars

    2013-01-01

    This paper presents a Software Defined Networking (SDN) control plane based on an overlay GMPLS control model. The SDN control platform manages optical core networks (WDM/DWDM networks) and the associated access networks (GPON networks), which makes it possible to gather global information and...

  1. On the Design of Energy Efficient Optical Networks with Software Defined Networking Control Across Core and Access Networks

    DEFF Research Database (Denmark)

    Wang, Jiayuan; Yan, Ying; Dittmann, Lars

    This paper presents a Software Defined Networking (SDN) control plane based on an overlay GMPLS control model. The SDN control platform manages optical core networks (WDM/DWDM networks) and the associated access networks (GPON networks), which makes it possible to gather global information and...

  2. A Reusable and Adaptable Software Architecture for Embedded Space Flight System: The Core Flight Software System (CFS)

    Science.gov (United States)

    Wilmot, Jonathan

    2005-01-01

    The contents include the following: High availability. Hardware is in a harsh environment. Flight processors vary widely due to power and weight constraints. Software must be remotely modifiable and still operate while changes are being made. Many custom one-of-a-kind interfaces for one-of-a-kind missions. Sustaining engineering. The price of failure is high: tens to hundreds of millions of dollars.

  3. Experience with Intel's many integrated core architecture in ATLAS software

    International Nuclear Information System (INIS)

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks (TBB). This should make it possible to develop for both throughput and latency devices using a single code base. In ATLAS Software, track reconstruction has been shown to be a good candidate for throughput computing on GPGPU devices. In addition, the newly proposed offline parallel event-processing framework, GaudiHive, uses TBB for task scheduling. The MIC is thus, in principle, a good fit for this domain. In this paper, we report our experiences of porting to and optimizing ATLAS tracking algorithms for the MIC, comparing the programmability and relative cost/performance of the MIC against those of current GPGPUs and latency-optimized CPUs.

  4. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores

    Directory of Open Access Journals (Sweden)

    Wang Kai

    2011-05-01

    Background: Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) also have hundreds of cores and have recently been used to implement faster scientific software. However, there are currently no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis for binary traits. Findings: Here we present a novel software package, GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes: 1) the interaction of SNPs within it in parallel, and 2) the interaction between the SNPs of the current fragment and other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode run. Conclusions: GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
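
    The fragment scheme described in the abstract can be sketched in a few lines of Python. This is a toy illustration, not GENIE's code: the names and the placeholder pair statistic are invented, and a real tool would fit an interaction model per SNP pair. It shows how within-fragment and cross-fragment pair lists become independent parallel tasks that together cover every unordered SNP pair exactly once.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import combinations, product

# Split SNP indices into non-overlapping fragments, then build one task per
# within-fragment pair list and one per cross-fragment pair list.

def make_pair_tasks(n_snps, frag_size):
    frags = [list(range(i, min(i + frag_size, n_snps)))
             for i in range(0, n_snps, frag_size)]
    within = [list(combinations(f, 2)) for f in frags]
    across = [list(product(a, b)) for a, b in combinations(frags, 2)]
    return within + across

def score_pairs(pairs):
    # Placeholder statistic; a real tool would test each pair's interaction
    # term against the binary trait.
    return {p: abs(p[0] - p[1]) for p in pairs}

tasks = make_pair_tasks(n_snps=6, frag_size=3)
with ThreadPoolExecutor() as pool:
    results = {}
    for part in pool.map(score_pairs, tasks):
        results.update(part)
# 6 SNPs yield 6*5/2 = 15 unordered pairs, each scored exactly once.
```

    On a GPU, each task would instead map to a kernel launch over its pair list; the decomposition is the same.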

  5. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores

    Science.gov (United States)

    2011-01-01

    Background Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) also have hundreds of cores and have been recently used to implement faster scientific software. However, currently there are no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis for binary traits. Findings Here we present a novel software package GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes: 1) the interaction of SNPs within it in parallel, and 2) the interaction between the SNPs of the current fragment and other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode run. Conclusions GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/. PMID:21615923

  6. Coupling of the 3D neutron kinetic core model DYN3D with the CFD software ANSYS-CFX

    International Nuclear Information System (INIS)

    Highlights: • Improved thermal hydraulic description of nuclear reactor cores. • Possibility of three-dimensional flow phenomena in the core, such as cross flow, flow reversal, flow around obstacles. • Simulation at higher spatial resolution as compared to system codes. - Abstract: This article presents the implementation of a coupling between the 3D neutron kinetic core model DYN3D and the commercial, general purpose computational fluid dynamics (CFD) software ANSYS-CFX. In the coupling approach, parts of the thermal hydraulic calculation are transferred to CFX for its better ability to simulate the three-dimensional coolant redistribution in the reactor core region. The calculation of the heat transfer from the fuel into the coolant remains with DYN3D, which incorporates well tested and validated heat transfer models for rod-type fuel elements. On the CFX side, the core region is modeled based on the porous body approach. The implementation of the code coupling is verified by comparing test case results with reference solutions of the DYN3D standalone version. Test cases cover mini and full core geometries, control rod movement and partial overcooling transients
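
    As a generic illustration of the coupling pattern the abstract describes, the exchange between the two codes can be sketched as a fixed-point iteration. All coefficients and function names below are invented toy stand-ins, not the DYN3D/ANSYS-CFX interface: one side returns power given coolant temperature feedback, the other returns coolant temperature given power.

```python
# Toy fixed-point sketch of two-code coupling: iterate the exchanged fields
# (power and coolant temperature) until they stop changing.

def neutronics_power(t_coolant, p0=100.0, alpha=-0.05):
    # Negative feedback: power drops as coolant heats above a 300 K reference.
    return p0 + alpha * (t_coolant - 300.0)

def thermal_hydraulics_temp(power, t_inlet=290.0, k=0.2):
    # Coolant heats up in proportion to deposited power.
    return t_inlet + k * power

def couple(t_guess=300.0, tol=1e-8, max_iter=100):
    t = t_guess
    for _ in range(max_iter):
        p = neutronics_power(t)
        t_new = thermal_hydraulics_temp(p)
        if abs(t_new - t) < tol:
            return p, t_new
        t = t_new
    raise RuntimeError("coupling iteration did not converge")

power, temp = couple()   # converges because the feedback loop is contractive
```

    Real couplings exchange full 3D fields and may need relaxation to converge; the control flow, though, has this shape.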

  7. Rapid Development of Guidance, Navigation, and Control Core Flight System Software Applications Using Simulink Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We will demonstrate the usefulness of SIL for GSFC missions by attempting to compile the SIL source code with an autocoded sample GNC application flight software....

  8. An Assessment of the Role of Risk Management Practices in Core Banking Software Project Success: A Case of Commercial Banks in Kenya

    OpenAIRE

    John Paul Otieno

    2013-01-01

    A Core Banking Software Change project in a bank is very costly and very delicate, such that its success becomes the top priority for the organization undergoing the change. There is a need for business continuity, data integrity and customer service value addition from the product of the project. This study sought to establish the role of risk management practices in enhancing project success in commercial banks in Kenya during Core Banking Software Change. Give...

  9. Development of New European VLIW Space DSP ASICS, IP Cores and Related Software via ESA Contracts in 2015 and Beyond

    Science.gov (United States)

    Trautner, R.

    2015-09-01

    European space industry needs a new generation of payload data processors in order to cope with increasing payload data processing requirements. ESA has defined a roadmap for the development of future payload processor hardware which is being implemented. A key part of this roadmap addresses the development of VLIW Digital Signal Processor (DSP) ASICs, IP cores and associated software. In this paper, we first present an overview of the ESA roadmap and the key development routes. We recapitulate the activities that have created the technology base for the ongoing DSP development, and present the ASIC development and several accompanying activities that will lead to the availability of a new space qualified DSP - the Scalable Sensor Data Processor (SSDP) - in the near future. We then present the expected future evolution of this technology area, and summarize the corresponding ESA roadmap part on VLIW DSPs and related IP and software.

  10. GRAPES: a software for parallel searching on biological graphs targeting multi-core architectures.

    Directory of Open Access Journals (Sweden)

    Rosalba Giugno

    Biological applications, from genomics to ecology, deal with graphs that represent the structure of interactions. Analyzing such data requires searching for subgraphs in collections of graphs. This task is computationally expensive. Even though multicore architectures, from commodity computers to more advanced symmetric multiprocessing (SMP), offer scalable computing power, currently published software implementations for indexing and graph matching are fundamentally sequential. As a consequence, such software implementations (i) do not fully exploit available parallel computing power and (ii) do not scale with respect to the size of graphs in the database. We present GRAPES, software for parallel searching on databases of large biological graphs. GRAPES implements a parallel version of well-established graph searching algorithms, and introduces new strategies which naturally lead to a faster parallel searching system, especially for large graphs. GRAPES decomposes graphs into subcomponents that can be efficiently searched in parallel. We show the performance of GRAPES on representative biological datasets containing antiviral chemical compounds, DNA, RNA, proteins, protein contact maps and protein interaction networks.

  11. GRAPES: a software for parallel searching on biological graphs targeting multi-core architectures.

    Science.gov (United States)

    Giugno, Rosalba; Bonnici, Vincenzo; Bombieri, Nicola; Pulvirenti, Alfredo; Ferro, Alfredo; Shasha, Dennis

    2013-01-01

    Biological applications, from genomics to ecology, deal with graphs that represent the structure of interactions. Analyzing such data requires searching for subgraphs in collections of graphs. This task is computationally expensive. Even though multicore architectures, from commodity computers to more advanced symmetric multiprocessing (SMP), offer scalable computing power, currently published software implementations for indexing and graph matching are fundamentally sequential. As a consequence, such software implementations (i) do not fully exploit available parallel computing power and (ii) they do not scale with respect to the size of graphs in the database. We present GRAPES, software for parallel searching on databases of large biological graphs. GRAPES implements a parallel version of well-established graph searching algorithms, and introduces new strategies which naturally lead to a faster parallel searching system especially for large graphs. GRAPES decomposes graphs into subcomponents that can be efficiently searched in parallel. We show the performance of GRAPES on representative biological datasets containing antiviral chemical compounds, DNA, RNA, proteins, protein contact maps and protein interaction networks. PMID:24167551

  12. The astrometric core solution for the Gaia mission. Overview of models, algorithms and software implementation

    CERN Document Server

    Lindegren, Lennart; Hobbs, David; O'Mullane, William; Bastian, Ulrich; Hernández, José

    2011-01-01

    The Gaia satellite will observe about one billion stars and other point-like sources. The astrometric core solution will determine the astrometric parameters (position, parallax, and proper motion) for a subset of these sources, using a global solution approach which must also include a large number of parameters for the satellite attitude and optical instrument. The accurate and efficient implementation of this solution is an extremely demanding task, but crucial for the outcome of the mission. We provide a comprehensive overview of the mathematical and physical models applicable to this solution, as well as its numerical and algorithmic framework. The astrometric core solution is a simultaneous least-squares estimation of about half a billion parameters, including the astrometric parameters for some 100 million well-behaved so-called primary sources. The global nature of the solution requires an iterative approach, which can be broken down into a small number of distinct processing blocks (source, attitude,...

  13. BROCCOLI: Software for Fast fMRI Analysis on Many-Core CPUs and GPUs

    Directory of Open Access Journals (Sweden)

    Anders eEklund

    2014-03-01

    Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm³ brain template in 4-6 seconds, and run a second level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/).

  14. BROCCOLI: Software for fast fMRI analysis on many-core CPUs and GPUs.

    Science.gov (United States)

    Eklund, Anders; Dufort, Paul; Villani, Mattias; Laconte, Stephen

    2014-01-01

    Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU, and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm(3) brain template in 4-6 s, and run a second level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/). PMID:24672471
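
    To make the "second level permutation test" mentioned above concrete, here is a hedged NumPy sketch of a sign-flipping, max-statistic permutation test, the standard construction behind familywise-corrected group inference. This is not BROCCOLI's OpenCL code, and the function name and toy data are illustrative.

```python
import numpy as np

# Sign-flipping one-sample permutation test with a max-statistic null: each
# permutation flips the sign of whole subjects, and the maximum absolute
# voxel statistic per permutation builds the corrected null distribution.

def permutation_test(contrasts, n_perm=1000, seed=0):
    """contrasts: (n_subjects, n_voxels) first-level contrast values.
    Returns per-voxel means and familywise-corrected p-values."""
    rng = np.random.default_rng(seed)
    n_sub, _ = contrasts.shape
    observed = contrasts.mean(axis=0)
    null_max = np.empty(n_perm)
    for i in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=(n_sub, 1))  # flip whole subjects
        null_max[i] = np.abs((signs * contrasts).mean(axis=0)).max()
    # Corrected p-value: fraction of permutations whose maximum statistic
    # reaches the observed absolute mean at that voxel.
    p_corr = (null_max[None, :] >= np.abs(observed)[:, None]).mean(axis=1)
    return observed, p_corr

rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, size=(12, 50))
data[:, 0] += 5.0            # one strongly "activated" voxel
obs, p = permutation_test(data)
```

    The inner loop is embarrassingly parallel over permutations, which is exactly what makes a GPU implementation attractive.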

  15. Design and Implementation of an Efficient Software Communications Architecture Core Framework for a Digital Signal Processors Platform

    Directory of Open Access Journals (Sweden)

    Wael A. Murtada

    2011-01-01

    Problem statement: The Software Communications Architecture (SCA) was developed to improve software reuse and interoperability in Software Defined Radios (SDR). However, there have been performance concerns since its conception. Arguably, the majority of the problems and inefficiencies associated with the SCA can be attributed to the assumption of modular distributed platforms relying on General Purpose Processors (GPPs) to perform all signal processing. Approach: Significant improvements in cost and power consumption can be obtained by utilizing specialized and more efficient platforms. Digital Signal Processors (DSPs) present such a platform and have been widely used in the communications industry. Improvements in development tools and middleware technology opened the possibility of fully integrating DSPs into the SCA. This approach takes advantage of the exceptional power, cost and performance characteristics of DSPs, while still enjoying the flexibility and portability of the SCA. Results: This study presents the design and implementation of an SCA Core Framework (CF) for a TI TMS320C6416 DSP. The framework is deployed on a C6416 Device Cycle Accurate Simulator and a TI C6416 development board. The SCA CF is implemented by leveraging OSSIE, an open-source implementation of the SCA, to support the DSP platform. OIS's ORBExpress DSP and DSP/BIOS are used as the middleware and operating system, respectively. A sample waveform was developed to demonstrate the framework's functionality. Benchmark results for the framework and sample applications are provided. Conclusion: Benchmark results show that using OIS's ORBExpress DSP ORB middleware decreases the software memory footprint and increases system performance compared with PrismTech's e*ORB middleware.

  16. Harmonic Domain Modelling of Transformer Core Nonlinearities Using the DIgSILENT PowerFactory Software

    DEFF Research Database (Denmark)

    Bak, Claus Leth; Bak-Jensen, Birgitte; Wiechowski, Wojciech

    2008-01-01

    This paper demonstrates the results of the implementation and verification of an already existing algorithm that allows for calculating the saturation characteristics of single-phase power transformers. The algorithm was described for the first time in 1993. Now this algorithm has been implemented using the DIgSILENT Programming Language (DPL) as an external script in the harmonic domain calculations of the power system analysis tool PowerFactory [10]. The algorithm is verified by harmonic measurements on a single-phase power transformer. A theoretical analysis of the core nonlinearity phenomena in...

  17. Development and Evaluation of Vectorised and Multi-Core Event Reconstruction Algorithms within the CMS Software Framework

    CERN Document Server

    CERN. Geneva

    2012-01-01

    The processing of data acquired by the CMS detector at LHC is carried out with an object-oriented C++ software framework: CMSSW. With the increasing luminosity delivered by the LHC, the treatment of recorded data requires extraordinarily large computing resources, also in terms of CPU usage. A possible solution to cope with this task is the exploitation of the features offered by the latest microprocessor architectures. Modern CPUs present several vector units, the capacity of which is growing steadily with the introduction of new processor generations. Moreover, an increasing number of cores per die is offered by the main vendors, even on consumer hardware. Most recent C++ compilers provide facilities to take advantage of such innovations, either by explicit statements in the programs’ sources or automatically adapting the generated machine instructions to the available hardware, without the need of modifying the existing code base. Programming techniques to implement reconstruction algorithms and optimised ...

  18. CORE

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Hansen, Jonas; Hundebøll, Martin;

    2013-01-01

    different flows. Instead of maintaining these approaches separate, we propose a protocol (CORE) that brings together these coding mechanisms. Our protocol uses random linear network coding (RLNC) for intra-session coding but allows nodes in the network to set up inter-session coding regions where flows... increase the benefits of XORing by exploiting the underlying RLNC structure of individual flows. This goes beyond providing additional reliability to each individual session and beyond exploiting coding opportunistically. Our numerical results show that CORE outperforms both forwarding and COPE-like schemes in general. More importantly, we show gains of up to 4 fold over COPE-like schemes in terms of transmissions per packet in one of the investigated topologies.
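
    The XOR side of such inter-session coding can be illustrated with a toy COPE-style example; CORE's RLNC layer is omitted here, and the payloads and names are invented. A relay XORs one packet from each flow into a single transmission, and each destination recovers the other flow's packet using the packet it already holds.

```python
# Toy COPE-style XOR relaying: one broadcast replaces two unicasts, and each
# receiver decodes with the packet it already overheard.

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

pkt_a = b"flow-A-payload!!"
pkt_b = b"flow-B-payload!!"
coded = xor_bytes(pkt_a, pkt_b)        # the relay's single coded transmission

recovered_b = xor_bytes(coded, pkt_a)  # at the node that already has pkt_a
recovered_a = xor_bytes(coded, pkt_b)  # at the node that already has pkt_b
assert recovered_a == pkt_a and recovered_b == pkt_b
```

    RLNC generalizes this from XOR (GF(2)) to random coefficients over a larger field, which is what lets CORE code within a session as well as across sessions.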

  19. Development and Evaluation of Vectorised and Multi-Core Event Reconstruction Algorithms within the CMS Software Framework

    International Nuclear Information System (INIS)

The processing of data acquired by the CMS detector at LHC is carried out with an object-oriented C++ software framework: CMSSW. With the increasing luminosity delivered by the LHC, the treatment of recorded data requires extraordinarily large computing resources, also in terms of CPU usage. A possible solution to cope with this task is the exploitation of the features offered by the latest microprocessor architectures. Modern CPUs present several vector units, the capacity of which is growing steadily with the introduction of new processor generations. Moreover, an increasing number of cores per die is offered by the main vendors, even on consumer hardware. Most recent C++ compilers provide facilities to take advantage of such innovations, either by explicit statements in the programs' sources or by automatically adapting the generated machine instructions to the available hardware, without the need of modifying the existing code base. Programming techniques to implement reconstruction algorithms and optimised data structures are presented that aim at scalable vectorization and parallelization of the calculations. One of their features is the usage of new language features of the C++11 standard. Portions of the CMSSW framework are illustrated which have been found to be especially profitable for the application of vectorization and multi-threading techniques. Specific utility components have been developed to help vectorization and parallelization. They can easily become part of a larger common library. To conclude, careful measurements are described, which show the execution speedups achieved via vectorised and multi-threaded code in the context of CMSSW.
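
The abstract's central idea, expressing the same arithmetic once per element versus once over whole arrays, can be illustrated in any language. The sketch below uses Python/NumPy (CMSSW itself is C++), and the calibration formula is a made-up example, not taken from CMSSW; the whole-array form is what compilers and array runtimes can map onto SIMD vector units.

```python
import numpy as np

def calibrate_loop(raw, pedestal, gain):
    # Element-at-a-time reference implementation.
    out = np.empty_like(raw)
    for i in range(raw.size):
        out[i] = (raw[i] - pedestal[i]) * gain[i]
    return out

def calibrate_vectorised(raw, pedestal, gain):
    # Whole-array form: one statement over contiguous data, which SIMD
    # units (or NumPy's compiled loops) process several elements at a time.
    return (raw - pedestal) * gain
```

Both functions produce identical results; only the second exposes the data-parallel structure to the hardware.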

  20. Using CORE Model-Based Systems Engineering Software to Support Program Management in the U.S. Department of Energy Office of the Biomass Project: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Riley, C.; Sandor, D.; Simpkins, P.

    2006-11-01

This paper describes how the model-based systems engineering software CORE is helping the U.S. Department of Energy's Office of Biomass Program bring biomass-derived biofuels to the market. This software tool provides information to guide informed decision-making as biomass-to-biofuels systems are advanced from concept to commercial adoption. It facilitates management and communication of program status by automatically generating custom reports, Gantt charts, and tables using the widely available programs Microsoft Word, Project and Excel.

  1. Software Program: Software Management Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this NASA Software Management Guidebook is twofold. First, this document defines the core products and activities required of NASA software projects. It defines life-cycle models and activity-related methods but acknowledges that no single life-cycle model is appropriate for all NASA software projects. It also acknowledges that the appropriate method for accomplishing a required activity depends on characteristics of the software project. Second, this guidebook provides specific guidance to software project managers and team leaders in selecting appropriate life cycles and methods to develop a tailored plan for a software engineering project.

  2. Core component integration tests for the back-end software sub-system in the ATLAS data acquisition and event filter prototype -1 project

    International Nuclear Information System (INIS)

The ATLAS data acquisition (DAQ) and Event Filter (EF) prototype -1 project was intended to produce a prototype system for evaluating candidate technologies and architectures for the final ATLAS DAQ system on the LHC accelerator at CERN. Within the prototype project, the back-end sub-system encompasses the software for configuring, controlling and monitoring the DAQ. The back-end sub-system includes core components and detector integration components. The core components provide the basic functionality and had priority in terms of time-scale for development in order to have a baseline sub-system that can be used for integration with the data-flow sub-system and event filter. The following components are considered to be the core of the back-end sub-system: - Configuration databases: describe a large number of parameters of the DAQ system architecture, hardware and software components, running modes and status; - Message reporting system (MRS): allows all software components to report messages to other components in the distributed environment; - Information service (IS): allows information exchange between software components; - Process manager (PMG): performs basic job control of software components (start, stop, monitoring of status); - Run control (RC): controls the data taking activities by coordinating the operations of the DAQ sub-systems, back-end software and external systems. Performance and scalability tests have been made for individual components. The back-end sub-system integration tests bring together all the core components and several trigger/DAQ/detector integration components to simulate the control and configuration of data taking sessions. A test plan was provided for the back-end integration tests. The tests have been done using a shell script that goes through different phases as follows: - starting the back-end server processes to initialize communication services and PMG; - launching configuration-specific processes via DAQ supervisor as

  3. Cronos 2: a neutronic simulation software for reactor core calculations; Cronos 2: un logiciel de simulation neutronique des coeurs de reacteurs

    Energy Technology Data Exchange (ETDEWEB)

    Lautard, J.J.; Magnaud, C.; Moreau, F.; Baudron, A.M. [CEA Saclay, Dept. de Mecanique et de Technologie (DMT/SERMA), 91 - Gif-sur-Yvette (France)

    1999-07-01

The CRONOS2 software is the part of the SAPHYR code system dedicated to neutronic core calculations. CRONOS2 is a powerful tool for reactor design, fuel management and safety studies. Its modular structure and great flexibility make CRONOS2 a unique simulation tool for research and development for a wide variety of reactor systems. CRONOS2 is a versatile tool that covers a large range of applications, from very fast calculations used in training simulators to time- and memory-consuming reference calculations needed to understand complex physical phenomena. CRONOS2 has a procedure library named CPROC that allows users to create their own application environment fitted to a specific industrial use. (authors)

  4. The coupling of the Star-Cd software to a whole-core neutron transport code Decart for PWR applications

    International Nuclear Information System (INIS)

As part of a U.S.-Korea collaborative U.S. Department of Energy INERI project, a comprehensive high-fidelity reactor-core modeling capability is being developed for detailed analysis of existing and advanced PWR reactor designs. An essential element of the project has been the development of an interface between the computational fluid dynamics (CFD) module, STAR-CD, and the neutronics module, DeCART. Since the computational meshes for CFD and neutronics calculations are generally different, the capability to average and decompose data on these different meshes has been an important part of the code coupling activities. An averaging process has been developed to extract neutronics zone temperatures in the fuel and coolant and to generate appropriate multigroup cross sections and densities. Similar procedures have also been established to map the power distribution from the neutronics zones to the mesh structure used in the CFD module. Since MPI is used as the parallel model in STAR-CD and conflicts arise during initiation of a second level of MPI, the interface developed here is based on TCP/IP sockets to establish communication between the CFD and neutronics modules. Preliminary coupled calculations have been performed for PWR fuel assembly size problems and converged solutions have been achieved for a series of steady-state problems ranging from a single pin to a 1/8 model of a 17 x 17 PWR fuel assembly. (authors)
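
The averaging step described above is, at its core, a volume-weighted reduction from fine CFD cells to coarse neutronics zones. The following Python fragment is an illustrative sketch of that mapping, not code from the STAR-CD/DeCART interface; the cell-to-zone assignment is assumed given.

```python
def zone_average(values, volumes, zone_of_cell, n_zones):
    # Volume-weighted average of a fine-mesh CFD field (e.g. coolant
    # temperature) over the coarser neutronics zones.
    num = [0.0] * n_zones
    den = [0.0] * n_zones
    for v, vol, z in zip(values, volumes, zone_of_cell):
        num[z] += v * vol
        den[z] += vol
    return [n / d for n, d in zip(num, den)]
```

The reverse mapping of the power distribution from neutronics zones onto CFD cells is the analogous distribution step, weighting each cell by its share of the zone volume.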

  5. Oilpatch software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, Ricardo; Dutta, Ashok; Smith, Maurice; Chandler, Graham

    2011-07-15

In the oil and gas industry, new software is often developed to improve productivity or processes. Several software developments are presented: Adapx developed Capturx, which uses digital pens to digitize data collected on paper and send it to the office. For offshore purposes, Oceanic Consulting Corporation developed a package including a suite of marine simulation software. Beyond Compliance developed a compliance system that makes regulatory compliance easier by streamlining data collection, processing and management. Zantek Information Technology developed an accounting package that integrates all core business functions. Reality Mobile created a platform where employees can share live video over a secured network using devices they already have. DataShare developed OPERA, an online tool for preparing emergency response plans, and also does mapping work. All six of these software products help oil and gas companies meet regulatory compliance or facilitate communication.

  6. Validation of a new software version for monitoring of the core of the Unit 2 of the Laguna Verde power plant with ARTS; Validacion de una nueva version del software para monitoreo del nucleo de la Unidad 2 de la Central Laguna Verde con ARTS

    Energy Technology Data Exchange (ETDEWEB)

    Calleros, G.; Riestra, M.; Ibanez, C.; Lopez, X.; Vargas, A.; Mendez, A.; Gomez, R. [CFE, Central Nucleoelectrica de Laguna Verde, Alto Lucero, Veracruz (Mexico)]. e-mail: gcm9acpp@cfe.gob.mx

    2005-07-01

In this work a methodology is proposed to validate a new version of the software used for monitoring the reactor core, which requires evaluation of the thermal limits established in the Operation Technical Specifications, for Unit 2 of Laguna Verde with ARTS (improvements to the APRMs, Rod Block Monitor and Technical Specifications). Following the proposed methodology, the differences found between the thermal limits determined with the new and previous versions of the core monitoring software are shown. (Author)

  7. Software concepts for the build-up of complex systems - selection and realization taking as example a program system for calculation of hypothetical core meltdown accidents

    International Nuclear Information System (INIS)

Development and application of simulation systems for the analysis of complex processes require, on the one hand, detailed engineering knowledge of the plant and the processes to be simulated and, on the other hand, detailed knowledge of software engineering, numerics and data structures. The cooperation of specialists from both areas becomes easier if it is possible to reduce the complexity of the problems to be solved in a way that does not disturb the analyses and does not make communication between the disciplines unnecessarily complicated. One way to reduce the complexity is to treat computer science as an engineering discipline which mainly provides abstract elements, and to allow engineers to build application systems based on these abstract elements. The principle of abstraction leads, through modularisation and the solution of the interface problem, to an almost problem-independent system architecture in which the elements of the system (modules, model components and models) operate only on the data assigned to them. In addition, the development of abstract data types allows a formal description of the relations and interactions between system elements. This work describes how these ideas can be put into practice to build complex systems which allow reliable and effective problem solutions. The ideas were applied successfully during the design, realization and application of the code system KESS, which allows the analysis of core meltdown accidents in pressurized water reactors. (orig.)

  8. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for... a foundation for Essence – a software innovation methodology – where unknown options and needs emerge as part of the development process itself. The foundation is illustrated via a simple example....

  9. Nuclear application software package

    International Nuclear Information System (INIS)

    The Nuclear Application Software Package generates a full-core distribution and power peaking analysis every six minutes during reactor operation. Information for these calculations is provided by a set of fixed incore, self-powered rhodium detectors whose signals are monitored and averaged to obtain input for the software. Following the calculation of a power distribution and its normalization to a core heat balance, the maximum power peaks in the core and minimum DNBR are calculated. Additional routines are provided to calculate the core reactivity, future xenon concentrations, critical rod positions, and assembly isotopic concentrations
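
The normalization and peaking calculation described above can be sketched in a few lines. This is an illustrative calculation only, not the plant software: the detector signals are invented numbers, and the DNBR and reactivity routines are omitted.

```python
def normalise_and_peak(relative_powers, core_thermal_power):
    # Scale the relative distribution so it sums to the core thermal power
    # obtained from the heat balance, then report the peaking factor
    # (maximum node power over core-average node power).
    scale = core_thermal_power / sum(relative_powers)
    powers = [p * scale for p in relative_powers]
    mean = core_thermal_power / len(powers)
    return powers, max(powers) / mean
```

With a hypothetical four-node distribution [1, 2, 3, 2] and an 80 MW heat balance, the normalized node powers sum to 80 MW and the peaking factor is max/mean = 30/20 = 1.5.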

  10. Software Defined Radio with Parallelized Software Architecture

    Science.gov (United States)

    Heckler, Greg

    2013-01-01

This software implements software-defined radio processing over multi-core, multi-CPU systems in a way that maximizes the use of CPU resources in the system. The software treats each processing step in either a communications or navigation modulator or demodulator system as an independent, threaded block. Each threaded block is defined with a programmable number of input or output buffers; these buffers are implemented using POSIX pipes. In addition, each threaded block is assigned a unique thread upon block installation. A modulator or demodulator system is built by assembly of the threaded blocks into a flow graph, which assembles the processing blocks to accomplish the desired signal processing. This software architecture allows the software to scale effortlessly between single-CPU/single-core computers and multi-CPU/multi-core computers without recompilation. NASA spaceflight and ground communications systems currently rely exclusively on ASICs or FPGAs. This software allows low- and medium-bandwidth (100 bps to approx. 50 Mbps) software-defined radios to be designed and implemented solely in C/C++ software, while lowering development costs and facilitating reuse and extensibility.
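
The threaded-block flow graph described above can be sketched with threads and queues. This is an illustration of the architecture, not the NASA code: the original uses POSIX pipes as buffers and C/C++ blocks, whereas here `queue.Queue` stands in for the pipes and the two stages are toy transforms.

```python
import threading
import queue

def block(fn, inq, outq):
    # One processing step as an independent threaded block: read from the
    # input buffer, transform, write to the output buffer.
    def run():
        while True:
            item = inq.get()
            if item is None:       # sentinel: propagate shutdown downstream
                outq.put(None)
                return
            outq.put(fn(item))
    t = threading.Thread(target=run)
    t.start()
    return t

# Assemble a two-stage flow graph: source queue -> double -> increment -> sink.
q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
threads = [block(lambda x: x * 2, q1, q2),
           block(lambda x: x + 1, q2, q3)]
for sample in [1, 2, 3]:
    q1.put(sample)
q1.put(None)                       # end of stream
out = []
while (item := q3.get()) is not None:
    out.append(item)
for t in threads:
    t.join()
# out == [3, 5, 7]
```

Because each stage owns its thread and communicates only through buffers, the same graph runs unchanged on one core or many, which is the scaling property the abstract describes.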

  11. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.

  12. Software Process Models and Analysis on Failure of Software Development Projects

    OpenAIRE

    Kaur, Rupinder; Sengupta, Jyotsna

    2013-01-01

    The software process model consists of a set of activities undertaken to design, develop and maintain software systems. A variety of software process models have been designed to structure, describe and prescribe the software development process. The software process models play a very important role in software development, so it forms the core of the software product. Software project failure is often devastating to an organization. Schedule slips, buggy releases and missing features can me...

  13. Teaching Software Engineering through Robotics

    OpenAIRE

    Shin, Jiwon; Rusakov, Andrey; Meyer, Bertrand

    2014-01-01

    This paper presents a newly-developed robotics programming course and reports the initial results of software engineering education in robotics context. Robotics programming, as a multidisciplinary course, puts equal emphasis on software engineering and robotics. It teaches students proper software engineering -- in particular, modularity and documentation -- by having them implement four core robotics algorithms for an educational robot. To evaluate the effect of software engineering educati...

  14. Exploring Software Resilience

    OpenAIRE

    Ståhl, Björn

    2011-01-01

Software has, for better or worse, become a core component in the structured management and manipulation of vast quantities of information, and is therefore central to many crucial services and infrastructures. However, hidden among the various benefits that the inclusion of software may bring is the potential of unwanted and unforeseen interactions, ranging from mere annoyances all the way up to full-blown catastrophes. Overcoming adversities of this nature is a challenge shared with other ...

  15. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  16. UWB Tracking Software Development

    Science.gov (United States)

    Gross, Julia; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dusl, John; Ni, Jianjun; Rafford, Melinda

    2006-01-01

    An Ultra-Wideband (UWB) two-cluster Angle of Arrival (AOA) tracking prototype system is currently being developed and tested at NASA Johnson Space Center for space exploration applications. This talk discusses the software development efforts for this UWB two-cluster AOA tracking system. The role the software plays in this system is to take waveform data from two UWB radio receivers as an input, feed this input into an AOA tracking algorithm, and generate the target position as an output. The architecture of the software (Input/Output Interface and Algorithm Core) will be introduced in this talk. The development of this software has three phases. In Phase I, the software is mostly Matlab driven and calls C++ socket functions to provide the communication links to the radios. This is beneficial in the early stage when it is necessary to frequently test changes in the algorithm. Phase II of the development is to have the software mostly C++ driven and call a Matlab function for the AOA tracking algorithm. This is beneficial in order to send the tracking results to other systems and also to improve the tracking update rate of the system. The third phase is part of future work and is to have the software completely C++ driven with a graphics user interface. This software design enables the fine resolution tracking of the UWB two-cluster AOA tracking system.

  17. Core Recursive Hierarchical Image Segmentation

    Science.gov (United States)

    Tilton, James

    2011-01-01

The Recursive Hierarchical Image Segmentation (RHSEG) software has been repackaged to provide a version of the RHSEG software that is not subject to patent restrictions and that can be released to the general public through NASA GSFC's Open Source release process. Like the Core HSEG Software Package, this Core RHSEG Software Package also includes a visualization program called HSEGViewer along with a utility program, HSEGReader. It also includes an additional utility program called HSEGExtract. The unique feature of the Core RHSEG package is that it is a repackaging of the RHSEG technology designed specifically to avoid the inclusion of certain software technology. Unlike the Core HSEG package, it includes the recursive portions of the technology, but does not include the processing window artifact elimination technology.

  18. Reliable software

    OpenAIRE

    Arenas Solà, Concepción; Mestres i Naval, Francesc

    2015-01-01

In biomedicine, biodiversity and other fields of research, large databases are used. Assuming that a proper statistical procedure has been chosen, a crucial point is the selection of the right software to compute the data. The available software has to be sufficiently proven, with a guarantee that it is reliable. Currently, it is easy to obtain free software for most statistical procedures. We agree that free software is especially useful because a large number of researchers can ...

  19. The dynamic of modern software development project management and the software crisis of quality. An integrated system dynamics approach towards software quality improvement

    OpenAIRE

    Nasirikaljahi, Armindokht

    2012-01-01

The software industry is plagued by cost overruns, delays, poor customer satisfaction and quality issues that are costing clients and customers world-wide billions of dollars each year. The phenomenon is coined "The Software Crisis", and poses a huge challenge for software project management. This thesis addresses one of the core issues of the software crisis, namely software quality. The challenges of software quality are central for understanding the other symptoms of the software crisis. Th...

  20. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  1. Software Engineering for Tagging Software

    OpenAIRE

    Karan Gupta; Anita Goel

    2013-01-01

Tagging is integrated into web applications to ease maintenance of the large amount of information stored in a web application. With no mention of requirement specification or design document for tagging software, academically or otherwise, integrating tagging software in a web application is a tedious task. In this paper, a framework has been created for integration of tagging software in a web application. The framework follows the software development life cycle paradigms and is to be used during i...

  2. Software Complexity Methodologies & Software Security

    OpenAIRE

    Masoud Rafighi; Nasser Modiri

    2011-01-01

It is broadly clear that complexity is one of software's natural features. Software's natural complexity and software requirement functionality are two inseparable parts, each with its own range. In this paper, complexity measurement is explained using the McCabe and Halstead models, and software complexity is discussed with an example. The Henry and Kafura information flow metric, the Agresti-Card-Glass system complexity metric and item-level design metrics are compared and perused, then cate...

  3. Software Transparency

    OpenAIRE

    Leite, Julio Cesar Sampaio do Prado

    2009-01-01

    Software transparency is a new concern that software developers must deal with. As society moves towards the digitalization of day to day processes, the transparency of these digital processes becomes of fundamental importance if citizens would like to exercise their right to know. Informed discourse is only possible if processes that affect the public are open to evaluation. Achieving software transparency to this level of openness brings up several roadblocks. Thi...

  4. The Ettention software package.

    Science.gov (United States)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. PMID:26686659
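
The Kaczmarz-based iterative method mentioned above can be illustrated in miniature. The sketch below is the classical cyclic Kaczmarz iteration applied to a small dense linear system in Python/NumPy, not Ettention's GPU implementation; in tomography the rows of A correspond to projection rays and b to measured projections.

```python
import numpy as np

def kaczmarz(A, b, sweeps=100):
    # Cyclic Kaczmarz iteration: project the current estimate onto the
    # hyperplane defined by one row of the system at a time.
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            a = A[i]
            x = x + (b[i] - a @ x) / (a @ a) * a
    return x
```

For a consistent system the iterates converge to a solution of Ax = b; block-iterative variants like those in Ettention process groups of rows together for better parallelism.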

  5. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Reviews two software packages for the Macintosh series. "Course Builder 2.0," a courseware authoring system, allows the user to create programs which stand alone and may be used independently in the classroom. "World Builder," an artificial intelligence software package, allows creative thinking, problem-solving, and decision-making. (YP)

  6. Software RISCO

    OpenAIRE

    Silva, J.M.G.; F. S. Barbosa

    2008-01-01

To be effective, the processes used in the development of the RISCO software (software for embroidery machines) took into account the environmental factors (human and material) to be replicated in a digital application. To this end, observation of data and behaviours was essential, in order to deliver the positive changes that come from using the new work tool.

  7. AJL Software

    OpenAIRE

    2012-01-01

AJL Software is a company dedicated to custom software development. The segment the business targets is currently booming in Colombia, thanks to the advantages that computing tools offer through the modernization and automation technology brings to companies of all kinds.

  8. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

For courses in computer science and software engineering. The Fundamental Practice of Software Engineering. Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  9. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  10. Software testing

    Science.gov (United States)

    Price-Whelan, Adrian M.

    2016-01-01

    Now more than ever, scientific results are dependent on sophisticated software and analysis. Why should we trust code written by others? How do you ensure your own code produces sensible results? How do you make sure it continues to do so as you update, modify, and add functionality? Software testing is an integral part of code validation and writing tests should be a requirement for any software project. I will talk about Python-based tools that make managing and running tests much easier and explore some statistics for projects hosted on GitHub that contain tests.
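
The point above, that tests pin down a routine's behaviour as you modify it, can be made concrete with a small example. The function and tests are illustrative (plain asserts here; a runner such as pytest would discover the `test_`-prefixed function automatically).

```python
def moving_average(xs, window):
    # A small analysis routine whose behaviour the tests below pin down.
    if window < 1 or window > len(xs):
        raise ValueError("window must be in [1, len(xs)]")
    return [sum(xs[i:i + window]) / window for i in range(len(xs) - window + 1)]

def test_moving_average():
    # Happy path: known input, known output.
    assert moving_average([1, 2, 3, 4], 2) == [1.5, 2.5, 3.5]
    # Edge case: window spanning the whole series.
    assert moving_average([1, 2, 3], 3) == [2.0]
    # Error path: an invalid window is rejected, not silently accepted.
    try:
        moving_average([1, 2], 5)
        assert False, "expected ValueError"
    except ValueError:
        pass
```

Rerunning such tests after every change is what keeps refactoring and new functionality from silently breaking existing results.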

  11. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

  Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application. Traditional forms of technical education pay little attention to creativity - often encouraging overly rationalistic ways of thinking which stifle the ability to innovate. Professional software developers are often drowned in commercial drudgery and overwhelmed by work pressure and deadlines. The topic that will both ensure success in the market and revitalize their work lives is never addressed. This book sets...

  12. Software Patents

    OpenAIRE

    Hellstadius, Åsa

    2010-01-01

    The purpose of this chapter is to give an overview of the main issues in regard to software patenting in the 21st century. The focus is on the question of patentability of software, since this is the area which has caused the most problems for patent offices and courts. The main systems of concern are the European and U.S. patent systems. The chapter begins with a presentation of the concept of software in section 2, followed by section 3 with a presentation of IP and patents and the internat...

  13. Software Review

    OpenAIRE

    Tim Castellano

    2008-01-01

    Extend from Imagine That Inc. is simulation software which the company advertises as software for the next millennium. I had not seen this software before, and therefore, was not sure of what to expect from it. But I was pleasantly surprised with its abilities after working with it for a few days. Extend is supplied on a CD, accompanied by a Users Manual which covers various topics such as building a model, enhancing the model and running the model with the blocks provided with the model. It ...

  14. Proteomics Core

    Data.gov (United States)

    Federal Laboratory Consortium — Proteomics Core is the central resource for mass spectrometry based proteomics within the NHLBI. The Core staff help collaborators design proteomics experiments in...

  15. ANALYSIS OF SOFTWARE COST ESTIMATION MODELS

    OpenAIRE

    Tahir Abdullah; Rabia Saleem; Shahbaz Nazeer; Muhammad Usman

    2012-01-01

    Software Cost estimation is a process of forecasting the Cost of project in terms of budget, time, and other resources needed to complete a software system and it is a core issue in the software project management to estimate the cost of a project before initiating the Software Project. Different models have been developed to estimate the cost of software projects for the last several years. Most of these models rely on the Analysts’ experience, size of the software project and some other sof...
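One of the classic size-based models such surveys cover is Basic COCOMO (Boehm, 1981). A sketch of its effort and schedule equations, using the standard published coefficients (the 32 KLOC input is illustrative):

```python
# Basic COCOMO: effort (person-months) = a * KLOC**b,
# schedule (months) = c * effort**d, with mode-dependent coefficients.

COCOMO_COEFFS = {
    # mode: (a, b, c, d)
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode="organic"):
    """Return (effort in person-months, schedule in months) for a size in KLOC."""
    a, b, c, d = COCOMO_COEFFS[mode]
    effort = a * kloc ** b
    schedule = c * effort ** d
    return effort, schedule

effort, months = basic_cocomo(32, "organic")
```

More elaborate models (e.g. COCOMO II) adjust these estimates with cost drivers reflecting analyst experience and other project attributes, which is exactly the reliance on expert judgment the abstract notes.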

  16. Application Software

    OpenAIRE

    Yang, Seungwon

    2009-01-01

This module covers commonly used application software specifically designed for the creation and development of digital library (DL) systems and similar types of collections and services, such as open access archives.

  17. Software Reviews.

    Science.gov (United States)

    Davis, Shelly J., Ed.; Knaupp, Jon, Ed.

    1984-01-01

    Reviewed is computer software on: (1) classification of living things, a tutorial program for grades 5-10; and (2) polynomial practice using tiles, a drill-and-practice program for algebra students. (MNS)

  18. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  19. Exploiting Distributed Software Transactional Memory

    OpenAIRE

    Kotseldis, Christos-Efthymios

    2011-01-01

Over the past years research and development on computer architecture has shifted from uni-processor systems to multi-core architectures. This transition has created new incentives in software development because in order for the software to scale it has to be highly parallel. Traditional synchronization primitives based on mutual exclusion locking are challenging to use and therefore are only efficiently employed by a minority of expert programmers. Transactional Memory (TM) is a new alternative...
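A toy sketch of the optimistic update style that transactional memory automates: read a versioned cell, compute, and commit only if no one else committed first, retrying on conflict. This is illustrative only; a real STM tracks whole read/write sets, and the lock here merely stands in for a hardware compare-and-swap:

```python
# Optimistic concurrency sketch: versioned cell + retry loop.
import threading

class VersionedRef:
    """A shared cell updated via compare-and-set on a version number."""
    def __init__(self, value):
        self._lock = threading.Lock()   # stands in for an atomic CAS
        self.value = value
        self.version = 0

    def read(self):
        with self._lock:
            return self.value, self.version

    def compare_and_set(self, expected_version, new_value):
        with self._lock:
            if self.version != expected_version:
                return False            # another thread committed first
            self.value = new_value
            self.version += 1
            return True

def atomic_update(ref, fn):
    """Transaction-like retry loop: read, compute, commit, retry on conflict."""
    while True:
        value, version = ref.read()
        if ref.compare_and_set(version, fn(value)):
            return

ref = VersionedRef(0)
threads = [threading.Thread(
               target=lambda: [atomic_update(ref, lambda v: v + 1)
                               for _ in range(1000)])
           for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
# Every increment commits exactly once, so ref.value is 4 * 1000
```

The programmer writes only the pure update function; conflict detection and retry are handled by the machinery, which is the usability argument for TM over explicit locking.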

  20. Software preservation

    OpenAIRE

    Tadej Vodopivec

    2011-01-01

Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, embedded software (used e.g. in medical devices, auto parts, communication switchboards). Extensive knowledge and practical experience about digital long-term preservation technologies have also been acquired. This wide spectrum of activities puts us in the position to discuss the often overloo...

  1. Pattern-based software architecture for service-oriented software systems

    OpenAIRE

    Barrett Ronan; Pahl Claus

    2010-01-01

    Service-oriented architecture is a recent conceptual framework for service-oriented software platforms. Architectures are of great importance for the evolution of software systems. We present a modelling and transformation technique for service-centric distributed software systems. Architectural configurations, expressed through hierarchical architectural patterns, form the core of a specification and transformation technique. Patterns on different levels of abstraction form transformation...

  2. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

3. Practice and Exploration of Cultivating Core Competencies Embedded Software Testing Course in Higher Vocational Education%职业核心能力培养嵌入高职《软件测试》课程的实践与探索

    Institute of Scientific and Technical Information of China (English)

    吴伶琳

    2014-01-01

In view of problems in current higher vocational education, such as the lack of cultivation of professional core competencies, this paper examines the reasons these competencies are neglected in teaching, drawing on the teaching practice of the Software Testing course. To improve the professional core competencies of higher vocational students, it explores effective paths for combining the cultivation of professional core competencies with the teaching of the Software Testing course.

  4. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo;

    2008-01-01

This article presents MIAWARE, a software for Medical Image Analysis With Automated Reporting Engine, which was designed and developed for doctor/radiologist assistance. It allows analysis of an image stack from a computed axial tomography scan of the lungs (thorax) and, at the same time, marking of all pathologies on the images and reporting of their characteristics. The reporting process is normalized - radiologists cannot describe pathological changes in their own words, but can only use terms from a specific vocabulary set provided by the software. Consequently, a normalized radiological report is automatically generated. Furthermore, the MIAWARE software is accompanied by an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through a specially developed ontology. As a result, a...

  5. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  6. Ice cores

    DEFF Research Database (Denmark)

    Svensson, Anders

Ice cores from Antarctica, from Greenland, and from a number of smaller glaciers around the world yield a wealth of information on past climates and environments. Ice cores offer unique records on past temperatures, atmospheric composition (including greenhouse gases), volcanism, solar activity, dustiness, and biomass burning, among others. In Antarctica, ice cores extend back more than 800,000 years before present (Jouzel et al. 2007), whereas Greenland ice cores cover the last 130,000 years...

  7. Defect Management in Agile Software Development

    Directory of Open Access Journals (Sweden)

    Rida Noor

    2014-03-01

    Full Text Available Agile development reduces the risk of developing low quality software in the first place by minimizing defects. In agile software development formal defect management processes help to build quality software. The core purpose of defect management is to make the software more effective and efficient in order to increase its quality. There are several methods for handling defects like defect prevention, defect discovery and resolution which are used by software developers and testers. Refactoring keeps the system clean by identifying and removing quality defects. To gain the full confidence of the customer defect management should be involved at every stage of development. Agile methodologies focus on delivering the software in form of short iterations. Thus each iteration helps to overcome defects and leads better development and end user satisfaction. This study paints the picture of handling the software defects using agile based Software Development Process.

  8. Software Engineering to Professionalize Software Development

    Directory of Open Access Journals (Sweden)

    Juan Miguel Alonso

    2011-12-01

Full Text Available The increasingly important role that software plays in systems with widespread effects presents new challenges for the education of Software Engineers. Not only is social dependence on software increasing, but the character of software development is also changing, and with it the demand for certified software developers. This paper proposes some challenges and aspirations that guide the learning processes of Software Engineering and help to identify the need to train professionals in software development.

  9. Software Engineering Education: Some Important Dimensions

    Science.gov (United States)

    Mishra, Alok; Cagiltay, Nergiz Ercil; Kilic, Ozkan

    2007-01-01

    Software engineering education has been emerging as an independent and mature discipline. Accordingly, various studies are being done to provide guidelines for curriculum design. The main focus of these guidelines is around core and foundation courses. This paper summarizes the current problems of software engineering education programs. It also…

  10. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

The support of the EPIQR method is a multimedia computer program. Several modules help users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  11. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, embedded software (used e.g. in medical devices, auto parts, communication switchboards). Also the extensive knowledge and practical experience about digital long-term preservation technologies have been acquired. This wide spectrum of activities puts us in the position to discuss the often overlooked aspect of digital preservation - the preservation of software programs. There are many resources dedicated to digital preservation of digital data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - dynamic response to inputs - render computer programs rich compared to documents or linear multimedia. The article opens questions on the beginning of the way to permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and social context of the program - e.g. media response to it ..., where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  12. COMPARATIVE ANALYSIS OF SINGLE-CORE AND MULTI-CORE SYSTEMS

    OpenAIRE

    Ogundairo Johnson; Omosehinmi Dinyo

    2015-01-01

The overall performance of computer systems is better investigated and evaluated when their various components are considered: the hardware, software and firmware. The comparative analysis of single-core and multi-core systems was carried out using Intel Pentium G640T 2.4GHz dual-core, Intel Pentium IV 2.4GHz single-core and Intel Pentium IV 2.8GHz single-core systems. The approach was to use hi-tech benchmarking and stress-testing software to examine systems...
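A minimal sketch of the benchmarking approach described: time an identical CPU-bound workload serially and across worker processes on the same machine (the workload and sizes are invented for illustration; real benchmarking suites measure much more):

```python
# Time the same CPU-bound workload serially vs. across worker processes.
import time
from concurrent.futures import ProcessPoolExecutor

def burn(n):
    """CPU-bound workload: sum of squares 0..n-1."""
    return sum(i * i for i in range(n))

def bench_serial(n, tasks):
    start = time.perf_counter()
    results = [burn(n) for _ in range(tasks)]
    return results, time.perf_counter() - start

def bench_parallel(n, tasks, workers):
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(burn, [n] * tasks))
    return results, time.perf_counter() - start

if __name__ == "__main__":
    serial, t1 = bench_serial(200_000, 8)
    parallel, t2 = bench_parallel(200_000, 8, workers=4)
    assert serial == parallel          # same answers either way
    print(f"serial {t1:.2f}s vs 4 workers {t2:.2f}s")
```

On a multi-core system the parallel run should approach a speedup bounded by the worker count, which is the kind of difference such comparative studies quantify.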

  13. Reliability Testing Strategy - Reliability in Software Engineering

    OpenAIRE

    Taylor-Sakyi, Kevin

    2016-01-01

    This paper presents the core principles of reliability in software engineering - outlining why reliability testing is critical and specifying the process of measuring reliability. The paper provides insight for both novice and experts in the software engineering field for assessing failure intensity as well as predicting failure of software systems. Measurements are conducted by utilizing information from an operational profile to further enhance a test plan and test cases, all of which this ...
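A minimal sketch of the kind of reliability calculation the paper implies: estimate failure intensity from execution data gathered under an operational profile, then compute the probability of failure-free operation over a mission time, assuming a constant failure rate (the numbers are invented for illustration):

```python
# Failure intensity and reliability under a constant-rate assumption.
import math

def failure_intensity(failures, execution_hours):
    """Estimated failures per hour of execution during testing."""
    return failures / execution_hours

def reliability(lam, mission_hours):
    """R(t) = exp(-lambda * t): probability of no failure over the mission."""
    return math.exp(-lam * mission_hours)

lam = failure_intensity(failures=12, execution_hours=400)   # 0.03 failures/h
print(f"R(10 h) = {reliability(lam, 10):.3f}")              # ~0.741
```

Comparing the measured failure intensity against a target objective is what tells a tester whether the software is ready for release or needs further testing.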

  14. DEVELOPING SOFTWARE FOR CORPUS RESEARCH

    Directory of Open Access Journals (Sweden)

    Oliver Mason

    2008-06-01

    Full Text Available Despite the central role of the computer in corpus research, programming is generally not seen as a core skill within corpus linguistics. As a consequence, limitations in software for text and corpus analysis slow down the progress of research while analysts often have to rely on third party software or even manual data analysis if no suitable software is available. Apart from software itself, data formats are also of great importance for text processing. But again, many practitioners are not very aware of the options available to them, and thus idiosyncratic text formats often make sharing of resources difficult if not impossible. This article discusses some issues relating to both data and processing which should aid researchers to become more aware of the choices available to them when it comes to using computers in linguistic research. It also describes an easy way towards automating some common text processing tasks that can easily be acquired without knowledge of actual computer programming.
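A small example of the kind of common text-processing task the article wants researchers to automate: case-folded word-frequency counts over a text (the sample sentence is invented):

```python
# Word-frequency counting, a staple corpus-linguistics task, in a few lines.
import re
from collections import Counter

def word_frequencies(text, top=10):
    """Tokenize on letter runs, lowercase, and return the most common words."""
    tokens = re.findall(r"[a-zA-Z]+", text.lower())
    return Counter(tokens).most_common(top)

sample = "The cat sat on the mat. The mat was flat."
print(word_frequencies(sample, top=3))   # [('the', 3), ('mat', 2), ...]
```

Even this much requires choices about tokenization and text format, which is exactly the kind of awareness of options the article argues corpus researchers need.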

  15. SproutCore web application development

    CERN Document Server

    Keating, Tyler

    2013-01-01

    Written as a practical, step-by-step tutorial, Creating HTML5 Apps with SproutCore is full of engaging examples to help you learn in a practical context.This book is for any person looking to write software for the Web or already writing software for the Web. Whether your background is in web development or in software development, Creating HTML5 Apps with SproutCore will help you expand your skills so that you will be ready to apply the software development principles in the web development space.

  16. Software architecture

    CERN Document Server

    Vogel, Oliver; Chughtai, Arif

    2011-01-01

As a software architect you work in a wide-ranging and dynamic environment. You have to understand the needs of your customer, design architectures that satisfy both functional and non-functional requirements, and lead development teams in implementing the architecture. And it is an environment that is constantly changing: trends such as cloud computing, service orientation, and model-driven procedures open up new architectural possibilities. This book will help you to develop a holistic architectural awareness and knowledge base that extends beyond concrete methods, techniques, and technologies

  17. Software schematics

    OpenAIRE

    Colomb, Julien; Reiter, Lutz; Blaszkiewicz, Jedrzej; Wessnitzer, Jan; Brembs, Björn

    2012-01-01

    The experimenter enters information (in red) about the fly and the platform (semi-automatically) into the tracker application (BuriTrack). The tracker saves this information along with a time stamp in an XML file. Online analysis of the video leads to the extraction of the position of the fly over time, which is directly saved to the data file. The analysis software (CeTrAn) then reads a text file indicating the path to the XML file and the fly grouping information. It then automatically impo...

  18. Software Engineering to Professionalize Software Development

    OpenAIRE

    Juan Miguel Alonso; Fernando García

    2011-01-01

The increasingly important role that software plays in systems with widespread effects presents new challenges for the education of Software Engineers. Not only is social dependence on software increasing, but the character of software development is also changing, and with it the demand for certified software developers. This paper proposes some challenges and aspirations that guide the learning processes of Software Engineering and help to identify the need to...

  19. SOFTWARE METRICS VALIDATION METHODOLOGIES IN SOFTWARE ENGINEERING

    OpenAIRE

    K.P. Srinivasan; T. Devi

    2014-01-01

In software measurement, assessing the validity of software metrics in software engineering is a very difficult task due to the lack of theoretical and empirical methodologies [41, 44, 45]. During recent years, a number of researchers have addressed the issue of validating software metrics. At present, software metrics are validated theoretically using properties of measures. Further, software measurement plays an important role in understanding...

  20. Preparing HEP software for concurrency

    Science.gov (United States)

    Clemencic, M.; Hegner, B.; Mato, P.; Piparo, D.

    2014-06-01

    The necessity for thread-safe experiment software has recently become very evident, largely driven by the evolution of CPU architectures towards exploiting increasing levels of parallelism. For high-energy physics this represents a real paradigm shift, as concurrent programming was previously only limited to special, well-defined domains like control software or software framework internals. This paradigm shift, however, falls into the middle of the successful LHC programme and many million lines of code have already been written without the need for parallel execution in mind. In this paper we have a closer look at the offline processing applications of the LHC experiments and their readiness for the many-core era. We review how previous design choices impact the move to concurrent programming. We present our findings on transforming parts of the LHC experiment reconstruction software to thread-safe code, and the main design patterns that have emerged during the process. A plethora of parallel-programming patterns are well known outside the HEP community, but only a few have turned out to be straightforward enough to be suited for non-expert physics programmers. Finally, we propose a potential strategy for the migration of existing HEP experiment software to the many-core era.
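A minimal illustration of the thread-safety concern the paper addresses: an unsynchronized read-modify-write on shared state can lose updates under concurrency, while a lock-guarded version cannot (the Histogram class is invented for illustration; HEP frameworks face the same pattern at much larger scale):

```python
# Lock-guarded shared state: concurrent fills lose no updates.
import threading

class Histogram:
    """A shared counter map, as in event-processing frameworks (illustrative)."""
    def __init__(self):
        self._bins = {}
        self._lock = threading.Lock()

    def fill(self, bin_id):
        # The lock makes the read-modify-write atomic across threads.
        with self._lock:
            self._bins[bin_id] = self._bins.get(bin_id, 0) + 1

    def count(self, bin_id):
        with self._lock:
            return self._bins.get(bin_id, 0)

h = Histogram()
workers = [threading.Thread(
               target=lambda: [h.fill("pt") for _ in range(10_000)])
           for _ in range(4)]
for w in workers: w.start()
for w in workers: w.join()
assert h.count("pt") == 40_000   # every fill accounted for
```

Retrofitting guards like this onto millions of lines written without parallelism in mind, while keeping the patterns simple enough for non-expert physics programmers, is the migration problem the paper studies.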

  1. Preparing HEP software for concurrency

    International Nuclear Information System (INIS)

    The necessity for thread-safe experiment software has recently become very evident, largely driven by the evolution of CPU architectures towards exploiting increasing levels of parallelism. For high-energy physics this represents a real paradigm shift, as concurrent programming was previously only limited to special, well-defined domains like control software or software framework internals. This paradigm shift, however, falls into the middle of the successful LHC programme and many million lines of code have already been written without the need for parallel execution in mind. In this paper we have a closer look at the offline processing applications of the LHC experiments and their readiness for the many-core era. We review how previous design choices impact the move to concurrent programming. We present our findings on transforming parts of the LHC experiment reconstruction software to thread-safe code, and the main design patterns that have emerged during the process. A plethora of parallel-programming patterns are well known outside the HEP community, but only a few have turned out to be straightforward enough to be suited for non-expert physics programmers. Finally, we propose a potential strategy for the migration of existing HEP experiment software to the many-core era.

  2. SOFTWARE METRICS VALIDATION METHODOLOGIES IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    K.P. Srinivasan

    2014-12-01

Full Text Available In software measurement, assessing the validity of software metrics in software engineering is a very difficult task due to the lack of theoretical and empirical methodologies [41, 44, 45]. During recent years, a number of researchers have addressed the issue of validating software metrics. At present, software metrics are validated theoretically using properties of measures. Software measurement plays an important role in understanding and controlling software development practices and products. The major requirement in software measurement is that measures must accurately represent the attributes they purport to quantify, and validation is critical to the success of software measurement. Validation is a collection of analysis and testing activities across the full life cycle that complements the efforts of other quality engineering functions, and it is a critical task in any engineering project. Its objective is to discover defects in a system and to assess whether or not the system is useful and usable in an operational situation. In software engineering, validation is one of the disciplines that help build quality into software: the major objective of the software validation process is to determine that the software performs its intended functions correctly and to provide information about its quality and reliability. This paper discusses the validation methodology, techniques and different properties of measures that are used for software metrics validation. In most cases, theoretical and empirical validations are conducted for software metrics validation in software engineering [1-50].

  3. Core Technical Capability Laboratory Management System

    Science.gov (United States)

    Shaykhian, Linda; Dugger, Curtis; Griffin, Laurie

    2008-01-01

The Core Technical Capability Laboratory Management System (CTCLMS) consists of dynamically generated Web pages used to access a database containing detailed CTC lab data, with the software hosted on a server that allows users remote access.

  4. EPICS: porting iocCore to multiple operating systems.

    Energy Technology Data Exchange (ETDEWEB)

    Kraimer, M.

    1999-09-30

    An important component of EPICS (Experimental Physics and Industrial Control System) is iocCore, which is the core software in the IOC (input/output controller) front-end processors. Currently iocCore requires the vxWorks operating system. This paper describes the porting of iocCore to other operating systems.

  5. Exploring the Sources of Enterprise Agility in Software Organizations

    OpenAIRE

    Srinivasan, Jayakanth

    2009-01-01

Software is one of the core elements that drive the modern economy, with visible use in areas such as personal computing, telecommunications and banking, and background use in areas such as aircraft traffic management, nuclear power generation, and automotive control systems. Organizations that build software are unique in that they span industrial domains, and at the core of what they do is codifying human knowledge. When we talk about software organizations, we think of organizations that...

  6. Continuing Progress on a Lattice QCD Software Infrastructure

    OpenAIRE

Joo, Balint; for the USQCD Collaboration

    2008-01-01

    We report on the progress of the software effort in the QCD Application Area of SciDAC. In particular, we discuss how the software developed under SciDAC enabled the aggressive exploitation of leadership computers, and we report on progress in the area of QCD software for multi-core architectures.

  7. Continuing progress on a lattice QCD software infrastructure

    International Nuclear Information System (INIS)

    We report on the progress of the software effort in the QCD application area of SciDAC. In particular, we discuss how the software developed under SciDAC enabled the aggressive exploitation of leadership computers, and we report on progress in the area of QCD software for multi-core architectures

  8. Risk reduction using DDP (Defect Detection and Prevention): Software support and software applications

    Science.gov (United States)

    Feather, M. S.

    2001-01-01

    Risk assessment and mitigation is the focus of the Defect Detection and Prevention (DDP) process, which has been applied to spacecraft technology assessments and planning, both hardware and software. DDP's major elements and their relevance to core requirement engineering concerns are summarized. The accompanying research demonstration illustrates DDP's tool support, and further customizations for application to software.

  9. Office Computer Software: A Comprehensive Review of Software Programs.

    Science.gov (United States)

    Secretary, 1992

    1992-01-01

    Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)

  10. Developing CMS software documentation system

    CERN Document Server

    Stankevicius, Mantas

    2012-01-01

CMSSW (CMS SoftWare) is the overall collection of software and services needed by the simulation, calibration and alignment, and reconstruction modules that process data so that physicists can perform their analyses. It is a long-term project with a large amount of source code. In large-scale and complex projects it is important to have software documentation that is as up-to-date and automated as possible. The core of the documentation should be version-based and available online with the source code. CMS uses Doxygen and Twiki as the main tools to provide automated and non-automated documentation. Both of them are heavily cross-linked to prevent duplication of information. Doxygen is used to generate functional documentation and dependency graphs from the source code. Twiki is divided into two parts: WorkBook and Software Guide. WorkBook contains tutorial-type instructions on accessing computing resources and using the software to perform analysis within the CMS collaboration, and Software Guide gives further details....

  11. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skills...

  12. Analyzing software repositories to understand software evolution

    OpenAIRE

    D'Ambros, M; Gall, H.C.; Lanza, M; Pinzger, M.

    2008-01-01

    Software repositories such as versioning systems, defect tracking systems, and archived communication between project personnel are used to help manage the progress of software projects. Software practitioners and researchers increasingly recognize the potential benefit of mining this information to support the maintenance of software systems, improve software design or reuse, and empirically validate novel ideas and techniques. Research is now proceeding to uncover ways in which mining th...

  13. Variability in software engineering paradigms

    OpenAIRE

    Huysegoms, Tom; Snoeck, Monique

    2012-01-01

    The concept of variability is not new in software engineering, but current research mostly remains vague about the overall variability concept when it comes to giving a clear overview of the dimensions of variability. In this paper we evaluate the core variability concept by proposing an overview of the set of definitions concerning variability related concepts and by setting up dimensions of variability. These dimensions represent different possible views on variability for all types of ...

  14. Quality of freeware antivirus software

    OpenAIRE

    Rasool, Muhammad Ahsan; Jamal, Abdul

    2011-01-01

The war between malware and antimalware software started two decades back and has adopted modern techniques with the evolution of technological development in the field of information technology. This thesis was targeted to analyze the performance of freeware antivirus programs available in the market. Several tests were performed to analyze their performance with respect to the core responsibilities of these programs: to scan for and detect viruses, and to prevent and eradicate them. Al...

  15. Software Metrics to Estimate Software Quality using Software Component Reusability

    OpenAIRE

    Prakriti Trivedi; Rajeev Kumar

    2012-01-01

Today most applications are developed using existing libraries, code, open sources, etc. As code is accessed in a program, it is represented as a software component; Java beans and .NET ActiveX controls, for example, are software components. These components are ready-to-use programming code or controls that speed up code development. A component-based software system builds on the concept of software reusability. While using these components, the main question that arises is whether ...

  16. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software development

  17. Neutronics computational methods for cores

    International Nuclear Information System (INIS)

    This engineering-oriented publication contains a detailed presentation of neutronics computational methods for cores. More precisely, it presents neutronics equations: Boltzmann equation for neutron transport, resolution principles, use of high performance computing. The next parts present the problem setting (values to be computed, computation software and methods), nuclear data and their processing. Then the authors describe the application of the Monte Carlo method to reactor physics: resolution of the transport equation by the Monte Carlo method, convergence of a Monte Carlo calculation and notion of quality factor, and software. Deterministic methods are then addressed: discretization, processing of resonant absorption, network calculations, core calculation, deterministic software, fuel evolution, and kinetics. The next chapter addresses multi-physical aspects: necessity of a coupling, principles of neutronic/thermal hydraulic coupling, example of an accidental transient. The last part addresses the checking approach, and neutronics computational code validation
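    The convergence and "quality factor" ideas described for Monte Carlo calculations can be illustrated with a toy analog simulation: estimating uncollided transmission through a slab and computing the statistical relative error and figure of merit. The cross-section and thickness values below are arbitrary illustrative numbers, not taken from the publication:

    ```python
    import math
    import random
    import time

    def transmission_mc(sigma_t, thickness, n):
        """Analog Monte Carlo estimate of uncollided transmission through
        a slab: count histories whose first free flight exceeds the slab
        thickness. The exact answer is exp(-sigma_t * thickness)."""
        hits = 0
        for _ in range(n):
            # free-flight distance sampled from an exponential distribution
            d = -math.log(1.0 - random.random()) / sigma_t
            if d > thickness:
                hits += 1
        p = hits / n
        # relative statistical error of a binomial tally
        rel_err = math.sqrt((1 - p) / (p * n)) if p > 0 else float("inf")
        return p, rel_err

    random.seed(1)
    t0 = time.perf_counter()
    p, r = transmission_mc(sigma_t=1.0, thickness=2.0, n=100_000)
    elapsed = time.perf_counter() - t0
    # figure of merit ("quality factor"): 1 / (relative error^2 * time);
    # roughly constant as n grows, since error falls as 1/sqrt(n)
    fom = 1.0 / (r * r * elapsed)
    print(f"estimate={p:.4f}  exact={math.exp(-2.0):.4f}  rel_err={r:.4f}")
    ```

    Running more histories shrinks the relative error as 1/sqrt(n) while the figure of merit stays roughly constant, which is why it is used to compare variance-reduction schemes.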

  18. Ice Cores

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Records of past temperature, precipitation, atmospheric trace gases, and other aspects of climate and environment derived from ice cores drilled on glaciers and ice...

  19. Software Model Of Software-Development Process

    Science.gov (United States)

    Lin, Chi Y.; Synott, Debra J.; Levary, Reuven R.

    1990-01-01

    Collection of computer programs constitutes software tool for simulation of medium- to large-scale software-development projects. Necessary to include easily identifiable and more-readily quantifiable characteristics like costs, times, and numbers of errors. Mathematical model incorporating these and other factors of dynamics of software-development process implemented in the Software Life Cycle Simulator (SLICS) computer program. Simulates dynamics of software-development process. In combination with input and output expert software systems and knowledge-based management software system, develops information for use in managing large software-development project. Intended to aid managers in planning, managing, and controlling software-development processes by reducing uncertainties in budgets, required personnel, and schedules.

  20. Evolvability of Software Systems

    OpenAIRE

    Nasir, Muhammad-Iftikhar; Iqbal, Rizwan

    2008-01-01

    Software evolvability, meeting the future requirements of the customer, is one of the emerging challenges which the software industry is facing nowadays. Software evolvability is the ability of a software system to accommodate future requirements. Studies have shown that software evolvability has large economic benefits, but at the same time it is difficult to assess. Over time, many methods have been derived to assess software evolvability. Software evolvability depends upon various characteri...

  1. Software fault tolerance

    OpenAIRE

    Kazinov, Tofik Hasanaga; Mostafa, Jalilian Shahrukh

    2009-01-01

    Because of our present inability to produce error-free software, software fault tolerance is and will continue to be an important consideration in software systems. The root cause of software design errors is the complexity of the systems. This paper surveys various software fault tolerance techniques and methodologies. They fall into two groups: single-version and multi-version software fault tolerance techniques. It is expected that software fault tolerance research will benefit from this research...

  2. Software Innovation in a Mission Critical Environment

    Science.gov (United States)

    Fredrickson, Steven

    2015-01-01

    Operating in mission-critical environments requires trusted solutions, and the preference for "tried and true" approaches presents a potential barrier to infusing innovation into mission-critical systems. This presentation explores opportunities to overcome this barrier in the software domain. It outlines specific areas of innovation in software development achieved by the Johnson Space Center (JSC) Engineering Directorate in support of NASA's major human spaceflight programs, including International Space Station, Multi-Purpose Crew Vehicle (Orion), and Commercial Crew Programs. Software engineering teams at JSC work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements for genuinely mission critical applications. The innovations described, including the use of NASA Core Flight Software and its associated software tool chain, can lead to software that is more affordable, more reliable, better modelled, more flexible, more easily maintained, better tested, and enabling of automation.

  3. Criteria-Based Framework for Software Product management

    OpenAIRE

    Samer I. Mohamed; Islam A. M. ElMaddah; Ayman M. Wahba

    2010-01-01

    Value-Based Software Engineering (VBSE) has become one of the most promising approaches for software product management [10]. It focuses on the critical role by which stakeholders and business core values affect decision making, which in turn influences product success. This paper illustrates the criteria-based approach to software product management through a computer-based software framework. The framework can select the best candidate requirements for each release based on the stakeholders...

  4. A model-driven traceability framework for software product lines

    OpenAIRE

    Anquetil, Nicolas; Kulesza, Uirá; Mitschke, Ralf; Moreira, Ana; Royer, Jean-Claude; Rummler, Andreas; Sousa, André

    2010-01-01

    Software product line (SPL) engineering is a recent approach to software development where a set of software products is derived for a well-defined target application domain, from a common set of core assets, using analogous means of production (for instance, through Model Driven Engineering). Such families of products are thus built through reuse, instead of being developed individually from scratch. SPLs promise to lower the costs of development, increase the quality of software, give clients mor...

  5. A Practical Evaluation of Next Generation Sequencing & Molecular Cloning Software

    OpenAIRE

    Meintjes, Peter; Qaadri, Kashef; Olsen, Christian

    2013-01-01

    Laboratories using Next Generation Sequencing (NGS) technologies and/ or high-throughput molecular cloning experiments can spend a significant amount of their research budget on data analysis and data management. The decision to develop in-house software, to rely on combinations of free software packages, or to purchase commercial software can significantly affect productivity and ROI. In this talk, we will describe a practical software evaluation process that was developed to assist core fac...

  6. Economics of Software Engineering

    OpenAIRE

    Mohamed Mohamed Al Hady

    2007-01-01

    An article about the economic aspects of software engineering. It discusses many important issues in this field, such as knowledge economics, the history of software engineering, the prosperity of the software industry, and the economics of software; it then discusses the methods followed to improve software economics.

  7. Software Architecture Evolution and Software Evolvability

    OpenAIRE

    Pei Breivold, Hongyu

    2009-01-01

    Software is characterized by inevitable changes and increasing complexity, which in turn may lead to huge costs unless change accommodation is rigorously taken into account. This is in particular true for long-lived systems. For such systems, there is a need to address evolvability explicitly during the entire lifecycle, carry out software evolution efficiently and reliably, and prolong the productive lifetime of the software systems. In this thesis, we study evolution of software architecture...

  8. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  9. Core transfer

    Science.gov (United States)

    Good news for all petroleum geoscientists, mining and environmental scientists, university researchers, and the like: Shell Oil Company has deeded its Midland core and sample repository to the Bureau of Economic Geology (BEG) at the University of Texas at Austin. The Midland repository includes more than 1 million linear meters of slab, whole core, and prepared cuttings. Data comprising one of the largest U.S. core collections—the geologic samples from wells drilled in Texas and 39 other states—are now public data and will be incorporated into the existing BEG database. Both Shell and the University of Texas at Austin are affiliated with the American Geological Institute, which assisted in arranging the transfer as part of its goal to establish a National Geoscience Data Repository System at regional centers across the United States.

  10. Ontologies for software engineering and software technology

    CERN Document Server

    Calero, Coral; Piattini, Mario

    2006-01-01

    Covers two applications of ontologies in software engineering and software technology: sharing knowledge of the problem domain and using a common terminology among all stakeholders; and filtering the knowledge when defining models and metamodels. This book is of benefit to software engineering researchers in both academia and industry.

  11. Core strengthening.

    Science.gov (United States)

    Arendt, Elizabeth A

    2007-01-01

    Several recent studies have evaluated interventional techniques designed to reduce the risk of serious knee injuries, particularly noncontact anterior cruciate ligament injuries in female athletes. Maintenance of rotational control of the limb underneath the pelvis, especially in response to cutting and jumping activities, is a common goal in many training programs. Rotational control of the limb underneath the pelvis is mediated by a complex set of factors including the strength of the trunk muscles and the relationship between the core muscles. It is important to examine the interrelationship between lower extremity function and core stability. PMID:17472321

  12. Core BPEL

    DEFF Research Database (Denmark)

    Hallwyl, Tim; Højsgaard, Espen

    The Web Services Business Process Execution Language (WS-BPEL) is a language for expressing business process behaviour based on web services. The language is intentionally not minimal but provides a rich set of constructs, allows omission of constructs by relying on defaults, and supports language extensions. Combined with the fact that the language definition does not provide a formal semantics, it is an arduous task to work formally with the language (e.g. to give an implementation). In this paper we identify a core subset of the language, called Core BPEL, which has fewer and simpler constructs...

  13. Software Development for JSA Source Jerk Measurement

    Institute of Scientific and Technical Information of China (English)

    LUO Huang-da; ZHANG Tao

    2013-01-01

    We have developed a series of experimental measurement systems for the Jordan sub-critical assembly. The source jerk measurement system is used for measuring the reactivity of the sub-critical reactor. It mainly consists of a BF3 neutron detector around the reactor core, a main amplifier, and data acquisition and processing software. The software acquires neutron pulse data by controlling a DAQ card, and displaying

  14. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  15. Controlling Software Piracy.

    Science.gov (United States)

    King, Albert S.

    1992-01-01

    Explains what software manufacturers are doing to combat software piracy, recommends how managers should deal with this problem, and provides a role-playing exercise to help students understand the issues in software piracy. (SR)

  16. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  17. Software Development Practices in Global Software Work : Developing Quality Software

    OpenAIRE

    2005-01-01

    This thesis is about software development practices, including the project management aspects, in the context of global software outsourcing. It was focused on the issues of achieving quality product namely here: software. It is built on the premise that the global context, in which the stakeholders are geographically separated by national boundaries, poses unique and inherent challenges derived from separation of place, time and culture.

  18. Validation of reactor core protection system

    International Nuclear Information System (INIS)

    Reactor COre Protection System (RCOPS), an advanced core protection calculator system, is a digitized system that provides a core protection function based on two reactor core operation parameters: Departure from Nucleate Boiling Ratio (DNBR) and Local Power Density (LPD). It generates a reactor trip signal when the core condition exceeds the DNBR or LPD design limit. It consists of four independent channels adopting a two-out-of-four trip logic. The system configuration, hardware platform and an improved algorithm of the newly designed core protection calculator system are described in this paper. One channel of RCOPS was implemented as a single-channel facility for this R and D project, where we performed final integration software testing. To implement custom function blocks, pSET is used. Software testing is performed by two methods: the first is a 'Software Module Test' and the second is a 'Software Unit Test'. New features include improvement of the core thermal margin through a revised on-line DNBR algorithm, resolution of the latching problem of the control element assembly signal, and addition of pre-trip alarm generation. The change of the on-line DNBR calculation algorithm is considered to improve the DNBR net margin by 2.5%-3.3%. (author)
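    The two-out-of-four coincidence logic described above can be sketched as follows; the DNBR and LPD limit values are illustrative placeholders, not the actual RCOPS setpoints:

    ```python
    def channel_trip(dnbr, lpd, dnbr_limit=1.3, lpd_limit=21.0):
        """A single channel votes to trip when DNBR falls below its design
        limit or LPD exceeds its design limit. The limit values are
        hypothetical placeholders, not real plant setpoints."""
        return dnbr < dnbr_limit or lpd > lpd_limit

    def reactor_trip(channel_votes):
        """Two-out-of-four coincidence: generate a reactor trip signal when
        at least 2 of the 4 independent channels vote to trip."""
        return sum(channel_votes) >= 2

    # Three channels see normal conditions, one sees low DNBR: no trip,
    # because a single spurious channel cannot trip the reactor alone.
    votes = [channel_trip(1.8, 15.0), channel_trip(1.1, 15.0),
             channel_trip(1.9, 14.0), channel_trip(1.7, 16.0)]
    print(reactor_trip(votes))   # False: only one channel tripped
    ```

    The two-out-of-four arrangement tolerates both a single failed-safe channel (no spurious trip) and a single failed-dangerous channel (the remaining three can still trip).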

  19. Evaluating and Optimizing IP Lookup on Many core Processors

    OpenAIRE

    He, Peng; Guan, Hongtao; Xie, Gaogang; Salamatian, Kavé

    2012-01-01

    In recent years, there has been a growing interest in multi/many-core processors as a target architecture for high-performance software routers. Because of its key position in routers, hardware IP lookup implementation has been intensively studied with TCAM- and FPGA-based architectures. However, increasing interest in software implementation has also been observed. In this paper, we evaluate the performance of software-only IP lookup on a many-core chip, the TILEPro64 processor. For this purpos...
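    The IP lookup problem studied here, longest-prefix match against a forwarding table, can be illustrated with a minimal binary-trie sketch. This is a toy single-threaded illustration of the problem, not the TILEPro64 implementation evaluated in the paper:

    ```python
    class TrieNode:
        __slots__ = ("children", "next_hop")
        def __init__(self):
            self.children = [None, None]   # one child per bit value
            self.next_hop = None           # set if a prefix ends here

    class BinaryTrie:
        """Minimal bitwise trie for IPv4 longest-prefix-match lookup."""
        def __init__(self):
            self.root = TrieNode()

        def insert(self, prefix, length, next_hop):
            node = self.root
            for i in range(length):
                bit = (prefix >> (31 - i)) & 1
                if node.children[bit] is None:
                    node.children[bit] = TrieNode()
                node = node.children[bit]
            node.next_hop = next_hop

        def lookup(self, addr):
            """Walk the address bits, remembering the last prefix seen,
            so the longest (deepest) matching prefix wins."""
            node, best = self.root, None
            for i in range(32):
                if node.next_hop is not None:
                    best = node.next_hop
                node = node.children[(addr >> (31 - i)) & 1]
                if node is None:
                    break
            else:
                if node.next_hop is not None:
                    best = node.next_hop
            return best

    def ip(s):
        a, b, c, d = map(int, s.split("."))
        return (a << 24) | (b << 16) | (c << 8) | d

    fib = BinaryTrie()
    fib.insert(ip("10.0.0.0"), 8, "eth0")
    fib.insert(ip("10.1.0.0"), 16, "eth1")     # more specific prefix wins
    print(fib.lookup(ip("10.1.2.3")))          # eth1
    print(fib.lookup(ip("10.9.9.9")))          # eth0
    ```

    Production lookup structures (compressed tries, DIR-24-8 tables, TCAMs) trade memory for fewer memory accesses per packet, which is exactly the dimension such evaluations measure.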

  20. Software and systems traceability

    CERN Document Server

    Cleland-Huang, Jane; Zisman, Andrea

    2012-01-01

    ""Software and Systems Traceability"" provides a comprehensive description of the practices and theories of software traceability across all phases of the software development lifecycle. The term software traceability is derived from the concept of requirements traceability. Requirements traceability is the ability to track a requirement all the way from its origins to the downstream work products that implement that requirement in a software system. Software traceability is defined as the ability to relate the various types of software artefacts created during the development of software syst

  1. Developing LHCb Grid Software: Experiences and Advances

    CERN Document Server

    Stokes-Rees, I; Cioffi, C; Tsaregorodtsev, A; Garonne, V; Graciani, R; Sanchez, M; Frank, M; Closier, J; Kuznetsov, G

    2004-01-01

    The LHCb grid software has been used for two Physics Data Challenges, the most recent of which will have produced 90 TB of data and required over 400 processor-years of computing power. This paper discusses the group's experience with developing Grid Services, interfacing to the LCG, running LHCb experiment software on the grid, and the integration of a number of new technologies into the LHCb grid software. Our experience and utilisation of the following core technologies will be discussed: OGSI, XML-RPC, grid services, LCG middle-ware, and instant messaging.

  2. Software Testing Techniques and Strategies

    OpenAIRE

    Isha; Sunita Sangwan

    2014-01-01

    Software testing provides a means to reduce errors and cut maintenance and overall software costs. Numerous software development and testing methodologies, tools, and techniques have emerged over the last few decades promising to enhance software quality. This paper describes software testing, the need for software testing, and software testing goals and principles. Further, it describes different software testing techniques and different software testing strategies.

  3. EPIC 2011: Third Workshop on Leveraging Empirical Research Results for Software Business Success

    NARCIS (Netherlands)

    Daneva, Maya; Herrmann, Andrea; Regnell, Björn; van de Weerd, Inge; De Troyer, Olga

    2011-01-01

    For many companies, software development is their core business process. For this process to be economically viable, it is not enough that software companies deliver software products that satisfy customers' written specifications. Software businesses also deem other requirements important as to deli

  4. Evolvable Neural Software System

    Science.gov (United States)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.

  5. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki;

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact of the ...

  6. Software distribution using xnetlib

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science; Oak Ridge National Lab., TN (US)]; Rowan, T.H. [Oak Ridge National Lab., TN (US)]; Wade, R.C. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science]

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.
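    The socket-based request/response model that distinguishes xnetlib from the e-mail-driven netlib can be sketched as a toy client/server exchange. The routine name and stored text below are hypothetical, and this is a minimal illustration of the communication pattern, not xnetlib's actual protocol:

    ```python
    import socket
    import threading

    def serve_once(host="127.0.0.1", port=0):
        """Toy software-distribution server: the client sends a routine
        name, the server replies with its stored 'source' and exits."""
        library = {"dgesv": "subroutine dgesv(...)\n"}
        srv = socket.socket()
        srv.bind((host, port))          # port 0: let the OS pick a free port
        srv.listen(1)

        def handler():
            conn, _ = srv.accept()
            name = conn.recv(1024).decode().strip()
            conn.sendall(library.get(name, "not found").encode())
            conn.close()
            srv.close()

        threading.Thread(target=handler).start()
        return srv.getsockname()[1]     # the port actually bound

    def fetch(name, port):
        """Client side: connect, send the request, read the reply."""
        cli = socket.socket()
        cli.connect(("127.0.0.1", port))
        cli.sendall(name.encode())
        data = cli.recv(4096).decode()
        cli.close()
        return data

    port = serve_once()
    print(fetch("dgesv", port))
    ```

    The round trip completes in milliseconds, which is the point of the abstract's contrast with e-mail delivery: an interactive search-and-retrieve session becomes practical.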

  7. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  8. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  9. Software productivity improvement through software engineering technology

    Science.gov (United States)

    Mcgarry, F. E.

    1985-01-01

    It has been estimated that NASA expends anywhere from 6 to 10 percent of its annual budget on the acquisition, implementation and maintenance of computer software. Although researchers have produced numerous software engineering approaches over the past 5-10 years, each claiming to be more effective than the others, there is very limited quantitative information verifying the measurable impact that any of these technologies may have in a production environment. At NASA/GSFC, an extended research effort aimed at identifying and measuring software techniques that favorably impact the productivity of software development has been active over the past 8 years. Specific, measurable software development technologies have been applied and measured in a production environment. The resulting software development approaches have been shown to be effective in improving both quality and productivity in this one environment.

  10. Software Language Evolution

    OpenAIRE

    Vermolen, S.D.

    2012-01-01

    Software plays a critical role in our daily life. Vast amounts of money are spent on more and more complex systems. All software, regardless of whether it controls a plane or the game on your phone, is never finished. Software changes when it contains bugs or when new functionality is added. This process of change is called software evolution. Despite what the name suggests, this is in practice a rapid process. Software is described in a software language. Not only software can evolve, also the langua...

  11. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis, shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  12. Improving software quality with software error prediction

    OpenAIRE

    Taipale, T. (Taneli)

    2015-01-01

    Today's agile software development can be a complicated process, especially when dealing with a large-scale project with demands for tight communication. The tools used in software development, while aiding the process itself, can also offer meaningful statistics. With the aid of machine learning, these statistics can be used for predicting the behavior patterns of the development process. The starting point of this thesis is a software project developed to be a part of a large telecommun...
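    The kind of defect prediction described, learning behaviour patterns from development statistics, can be sketched with a from-scratch logistic regression over hypothetical per-module features. The feature names, data, and model choice here are illustrative assumptions, not the thesis's actual model or data:

    ```python
    import math

    def train_logistic(xs, ys, lr=0.1, epochs=3000):
        """Tiny batch-gradient-descent logistic regression: a toy
        stand-in for the machine-learning step described above."""
        w = [0.0] * (len(xs[0]) + 1)          # bias + one weight per feature
        for _ in range(epochs):
            grads = [0.0] * len(w)
            for x, y in zip(xs, ys):
                z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
                p = 1.0 / (1.0 + math.exp(-z))
                err = p - y                   # gradient of log-loss
                grads[0] += err
                for i, xi in enumerate(x):
                    grads[i + 1] += err * xi
            w = [wi - lr * g / len(xs) for wi, g in zip(w, grads)]
        return w

    def predict(w, x):
        z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
        return 1.0 / (1.0 + math.exp(-z))

    # Hypothetical normalized per-module statistics:
    # (code churn, cyclomatic complexity) -> was the module faulty?
    modules = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8),
               (0.8, 0.9), (0.3, 0.3), (0.7, 0.9)]
    faulty = [0, 0, 1, 1, 0, 1]
    w = train_logistic(modules, faulty)
    print(f"risk of a high-churn, complex module: {predict(w, (0.85, 0.85)):.2f}")
    ```

    In practice the features would come from the project's version-control and issue-tracking statistics, and the predicted risk would steer review and testing effort toward the most error-prone modules.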

  13. Software Engineering for Practiced Software Enhancement

    OpenAIRE

    Rashmi Yadav; Ravindra Patel; Abhay Kothari

    2011-01-01

    The software development scenario, particularly in IT industries, is very competitive and demands development with minimum resources. Software development started and prevailed up to an extent in industry without the use of software engineering practices, which were perceived as an overhead. This approach causes overuse of resources such as money, man-hours, and hardware components. This paper attempts to present the causes of these inefficiencies in an almost exhaustive way. Further, an attempt has bee...

  14. Software Engineering for Practiced Software Enhancement

    Directory of Open Access Journals (Sweden)

    Rashmi Yadav

    2011-03-01

    The software development scenario, particularly in IT industries, is very competitive and demands development with minimum resources. Software development started and prevailed up to an extent in industry without the use of software engineering practices, which were perceived as an overhead. This approach causes overuse of resources such as money, man-hours, and hardware components. This paper attempts to present the causes of these inefficiencies in an almost exhaustive way. Further, an attempt has been made to elaborate the software engineering methods as remedies for the listed causes of inefficiency in development.

  15. Software Cost Estimation Review

    OpenAIRE

    Ongere, Alphonce

    2013-01-01

    Software cost estimation is the process of predicting the effort, the time and the cost required to complete a software project successfully. It involves size measurement of the software project to be produced, estimating and allocating the effort, drawing the project schedules, and finally, estimating the overall cost of the project. Accurate estimation of software project cost is an important factor for business and the welfare of a software organization in general. If cost and effort estimat...
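    A classic instance of the size-to-effort-to-schedule pipeline described above is Boehm's Basic COCOMO model, where effort grows as a power of program size. The abstract does not say which models the thesis reviews, so this is only one illustrative example:

    ```python
    def basic_cocomo(kloc, mode="organic"):
        """Basic COCOMO (Boehm, 1981):
        effort = a * KLOC**b person-months, schedule = c * effort**d months."""
        coeffs = {
            "organic":       (2.4, 1.05, 2.5, 0.38),
            "semi-detached": (3.0, 1.12, 2.5, 0.35),
            "embedded":      (3.6, 1.20, 2.5, 0.32),
        }
        a, b, c, d = coeffs[mode]
        effort = a * kloc ** b          # person-months
        schedule = c * effort ** d      # calendar months
        return effort, schedule

    # A 32 KLOC organic-mode project: about 91 person-months over ~14 months.
    effort, months = basic_cocomo(32, "organic")
    print(f"{effort:.1f} person-months over {months:.1f} months")
    ```

    Dividing effort by schedule also gives a rough average staffing level, which is why such models feed directly into the allocation step the abstract mentions.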

  16. Software Radar Technology

    OpenAIRE

    Tang Jun; Wu Hong; Wei Kun-peng

    2015-01-01

    In this paper, the definition and the key features of Software Radar, which is a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software radar and Intelligent Radar, and the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conformed to certain software and hardware standards. Moreover, a software radar system w...

  17. PGP Encryption Software

    OpenAIRE

    Wang, Shuhan

    2014-01-01

    The PGP encryption software is considered the most powerful and effective software for protecting the confidentiality of email. The goal of this thesis was to give the reader an overview of what the PGP software is, what cryptography lies behind the PGP software, how the PGP software works, and how it can be installed. In the theoretical part, the basic cryptographic terminologies and concepts were explained to help the understanding of the PGP working principles. In the mid-dl...

  18. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  19. Software Process Improvement Framework

    OpenAIRE

    Nikitina, Natalja

    2014-01-01

    Many software development organizations today are keen on improving their software development processes in order to develop software products faster, cheaper or better. For that reason, Software Process Improvement (SPI) has received significant attention from the research community over the last few decades. Process maturity models have become widely known for benchmarking software processes against predefined practices and for identifying processes to be improved or implemented, whereas pr...

  20. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

    The field of testability is an active, well-established part of the engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or depreciate the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the softwa...

  1. Requirement emergence computation of networked software

    Institute of Scientific and Technical Information of China (English)

    HE Keqing; LIANG Peng; PENG Rong; LI Bing; LIU Jing

    2007-01-01

    Emergence computation has become a hot topic in the research of complex systems in recent years. With the substantial increase in the scale and complexity of network-based information systems, uncertain user requirements from the Internet and personalized application requirements result in frequent changes to software requirements. Meanwhile, software systems built on resources that are not self-possessed become more and more complex. Furthermore, the interaction and cooperation requirements between software units and the running environment in service computing increase the complexity of software systems. Software systems with complex-system characteristics are developing into "networked software" with the characteristics of change-on-demand and change-with-cooperation. The common-sense concepts of "programming", "compiling" and "running" of software are extended from the "desktop" to the "network". The core issue of software engineering is moving to requirements engineering, which is becoming the research focus of complex-system software engineering. In this paper, we present a software network view based on complex system theory, and the concepts of networked software and networked requirements. We propose the challenge problem in the research of emergence computation of networked software requirements. A hierarchical and cooperative unified requirement modeling framework, URF (Unified Requirement Framework), and related RGPS (Role, Goal, Process and Service) meta-models are proposed. Five scales and the evolutionary growth mechanism in requirement emergence computation of networked software are given, with a focus on user-dominant and domain-oriented requirements, and the rules and predictability in requirement emergence computation are analyzed. A case study in the application of networked e-Business with evolutionary growth based on the State design pattern is presented in the end.
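    The State design pattern mentioned in the closing case study can be sketched with a minimal order-lifecycle example. The e-Business states used here (Placed, Paid, Shipped) are hypothetical illustrations, not taken from the paper:

    ```python
    class OrderState:
        """Base state. Each concrete state decides the next state, so new
        behaviour can be added by plugging in new state classes without
        touching the Order context (change-on-demand)."""
        def next(self, order):
            raise NotImplementedError

    class Placed(OrderState):
        def next(self, order):
            order.state = Paid()

    class Paid(OrderState):
        def next(self, order):
            order.state = Shipped()

    class Shipped(OrderState):
        def next(self, order):
            pass  # terminal state: further advances are no-ops

    class Order:
        """Context object: delegates behaviour to its current state."""
        def __init__(self):
            self.state = Placed()
        def advance(self):
            self.state.next(self)

    order = Order()
    order.advance()                      # Placed -> Paid
    print(type(order.state).__name__)    # Paid
    ```

    Because transitions live in the state classes rather than in one central conditional, the process can grow by adding states (say, a Refunded branch), which is the evolutionary-growth property the case study illustrates.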

  2. Comparative Performance Evaluation Of Software Architectural Styles With UML

    Directory of Open Access Journals (Sweden)

    Kamna Gauri

    2012-01-01

This paper presents a performance evaluation of different software architectural styles. Many books and articles have discussed the characteristics of the various architectural styles of a system. A good architecture should be approachable and simple, with a clear separation of concerns, resilience, and a balanced distribution of responsibilities. Here we provide an introduction to the field of software architecture and to software architectural styles. Software architecture is also described as a strategic design, i.e. how a solution is implemented; many application product lines are built around a core architecture, with variants that satisfy particular customer requirements.

  3. Payload software technology: Software technology development plan

    Science.gov (United States)

    1977-01-01

    Programmatic requirements for the advancement of software technology are identified for meeting the space flight requirements in the 1980 to 1990 time period. The development items are described, and software technology item derivation worksheets are presented along with the cost/time/priority assessments.

  4. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  5. A lock circuit for a multi-core processor

    DEFF Research Database (Denmark)

    2015-01-01

An integrated circuit comprising multiple processor cores and a lock circuit that comprises a queue register with respective bits set or reset via respective connections dedicated to respective processor cores, whereby the queue register identifies those among the multiple processor cores that are enqueued in the queue register. Furthermore, the integrated circuit comprises a current register and a selector circuit configured to select a processor core and identify that processor core by a value in the current register. A selected processor core is a prioritized processor core among the cores that have a bit that is set in the queue register. The processor cores are connected to receive a signal from the current register. Correspondingly: a method of synchronizing access to software and/or hardware resources by a core of a multi-core processor by means of a lock circuit; a multi…
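The mechanism can be mirrored in software. A minimal sketch of the idea (our own simplification for illustration, not the patented hardware circuit): the queue register is a bitmask with one dedicated bit per core, and the selector grants the lock to the next enqueued core in round-robin order:

```python
class QueueLock:
    """Software model of a queue-register lock: one bit per core."""
    def __init__(self, n_cores):
        self.n = n_cores
        self.queue = 0        # queue register: bit i set => core i is enqueued
        self.current = None   # current register: core holding the lock
        self._last = n_cores - 1  # last granted core, for round-robin fairness

    def request(self, core):
        self.queue |= 1 << core          # core sets its dedicated bit
        if self.current is None:
            self._select()

    def release(self, core):
        assert core == self.current
        self.queue &= ~(1 << core)       # clear the bit on release
        self.current = None
        self._select()

    def _select(self):
        # Selector: round-robin scan for the next set bit in the queue register.
        for i in range(1, self.n + 1):
            core = (self._last + i) % self.n
            if self.queue & (1 << core):
                self.current = core
                self._last = core
                return

lock = QueueLock(4)
lock.request(2)
lock.request(0)
lock.request(3)
lock.release(2)
print(lock.current)  # -> 3 (next enqueued core after core 2)
```

The round-robin scan starting just past the last grantee is what makes the selection fair: no enqueued core can be starved indefinitely.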

  6. Imprinting Community College Computer Science Education with Software Engineering Principles

    Science.gov (United States)

    Hundley, Jacqueline Holliday

    2012-01-01

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and…

  7. A software quality model and metrics for risk assessment

    Science.gov (United States)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric, and their usability and applicability, are discussed.
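The abstract does not spell out the metric set, so as an illustration only (metric choices and thresholds below are our own, commonly used examples, not the paper's), such a model can flag risk when a measured metric crosses a project-chosen threshold:

```python
def defect_density(defects_found, ksloc):
    """Product metric: defects per thousand source lines of code."""
    return defects_found / ksloc

def requirements_volatility(added, deleted, modified, total):
    """Process metric: fraction of the requirements set that churned."""
    return (added + deleted + modified) / total

def risk_flags(metrics, thresholds):
    """Flag each metric that exceeds its risk threshold."""
    return {name: value > thresholds[name] for name, value in metrics.items()}

m = {"defect_density": defect_density(45, 30.0),            # 1.5 defects/KSLOC
     "volatility": requirements_volatility(8, 2, 10, 100)}  # 0.20
print(risk_flags(m, {"defect_density": 1.0, "volatility": 0.25}))
# -> {'defect_density': True, 'volatility': False}
```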

  8. Holistic Marketing of Software Products: The New Paradigm

    Directory of Open Access Journals (Sweden)

    Dr. Ashutosh Nigam

    2011-05-01

Software product firms need to be competent in offering services amid the ever-changing demands of a dynamic marketing environment. To overcome these barriers, firms should deploy holistic marketing strategies based on established niche markets for specialized software products. Holistic marketing embraces all aspects of a software firm's products and customized solutions. The concept stresses the interrelationship with stakeholders to achieve distinction, with a core focus on customer requirements.

  9. Development of software phantoms for software validation

    International Nuclear Information System (INIS)

    Nuclear medicine software is expected to meet certain criteria. The specifications are frequently not available to the user and, as a consequence, the performance of a particular software package may not meet the users' expectations. Under most circumstances this may be evident immediately, but frequently the user will assume certain specifications based upon the clinical procedure that is being performed, and assume that the software should function in a certain fashion to give the value of a desired parameter. To this end, it is useful to have a number of software phantoms which can act as standard data sets for validation of the software and ensure that the results obtained do meet expectations. A number of problems surround the development of a set of software phantoms that can be transported between different systems. One solution is the creation of mathematical phantoms, in which case algorithms or source code may be transportable. This paper describes four such mathematical phantoms that have been used to validate an ejection fraction and Fourier analysis package. This particular software package has been found lacking in several respects, none of which would have been evident from the documentation provided. (author). 12 refs, 4 figs
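The idea can be illustrated with a minimal sketch (our own example, not one of the four phantoms from the paper): a mathematical phantom for an ejection-fraction package is a ventricular volume curve constructed with a known EF, so the package's answer can be compared against the built-in truth:

```python
import math

def volume_phantom(edv=120.0, esv=48.0, n_frames=16):
    """Synthetic ventricular volume curve with a known ejection fraction.
    Volume oscillates between end-diastolic (EDV) and end-systolic (ESV)."""
    mid, amp = (edv + esv) / 2, (edv - esv) / 2
    return [mid + amp * math.cos(2 * math.pi * k / n_frames)
            for k in range(n_frames)]

def ejection_fraction(volumes):
    """The quantity the package under test should report: (EDV-ESV)/EDV."""
    edv, esv = max(volumes), min(volumes)
    return (edv - esv) / edv

curve = volume_phantom()
# Known truth for this phantom: EF = (120 - 48) / 120 = 0.60
assert abs(ejection_fraction(curve) - 0.60) < 1e-9
```

Because the phantom is generated from an algorithm rather than acquired data, the source code itself is what gets transported between systems, which is exactly the portability argument the abstract makes.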

  10. Guidelines for software inspections

    Science.gov (United States)

    1983-01-01

Quality control inspections are software problem-finding procedures which provide defect removal as well as improvements in software functionality, maintenance, quality, and development and testing methodology. The many side benefits include education, documentation, training, and scheduling.

  11. ATLAS software packaging

    CERN Document Server

    Rybkin, G

    2012-01-01

Software packaging is an indispensable part of the build process and a prerequisite for deployment. The full ATLAS software stack consists of TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present the tools, the package PackDist, developed and used to package all this software except for the TDAQ project. PackDist is based on and driven by CMT, the ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is the CMT project. Each CMT project is packaged as several packages: platform-dependent (one per platform available), source code excluding header files, other platform-independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, the distribution kit, also includes configuration packages and contains some 120 packages for one platform. Also packaged are physics analysis pro...
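The per-category split described above can be sketched generically (an illustrative sketch of the idea only; the category rules and file suffixes below are hypothetical, not the actual PackDist/CMT behaviour):

```python
# Hypothetical category rules in the spirit of the split described:
# platform-dependent binaries, sources excluding headers, docs, debug info.
CATEGORIES = {
    "bin":   lambda p: p.endswith((".so", ".exe")),
    "src":   lambda p: p.endswith((".cxx", ".py")),
    "doc":   lambda p: p.endswith((".md", ".txt")),
    "debug": lambda p: p.endswith(".dbg"),
}

def split_into_packages(project, files):
    """Assign each file of a project to at most one package per category."""
    packages = {cat: [] for cat in CATEGORIES}
    for path in files:
        for cat, matches in CATEGORIES.items():
            if matches(path):          # first matching category wins
                packages[cat].append(path)
                break
    return {f"{project}-{cat}": sorted(members)
            for cat, members in packages.items() if members}

print(split_into_packages("Offline", ["a.so", "b.cxx", "README.md", "b.dbg"]))
# -> {'Offline-bin': ['a.so'], 'Offline-src': ['b.cxx'],
#     'Offline-doc': ['README.md'], 'Offline-debug': ['b.dbg']}
```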

  12. Improving Software Reliability Forecasting

    NARCIS (Netherlands)

    Burtsy, Bernard; Albeanu, Grigore; Boros, Dragos N.; Popentiu, Florin; Nicola, Victor

    1997-01-01

This work investigates some methods for software reliability forecasting. A supermodel is presented as a tool well suited to predicting reliability in software project development. Also, time series forecasting for cumulative interfailure time is proposed and illustrated.
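A toy version of the second idea (our own minimal sketch, not the supermodel or the forecasting method from the paper): fit a least-squares trend to the cumulative interfailure times and extrapolate one step ahead:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def forecast_next(cumulative_times):
    """Predict the next cumulative interfailure time from the linear trend."""
    xs = list(range(1, len(cumulative_times) + 1))
    a, b = fit_line(xs, cumulative_times)
    return a + b * (len(cumulative_times) + 1)

# Cumulative times between failures (hours), growing as reliability improves.
history = [10.0, 22.0, 36.0, 52.0, 70.0]
print(forecast_next(history))  # -> 83.0
```

A linear trend is a deliberately crude stand-in; real software reliability growth models use nonlinear forms, but the fit-then-extrapolate structure is the same.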

  13. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  14. Gammasphere software development

    International Nuclear Information System (INIS)

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  15. Healthcare Software Assurance

    OpenAIRE

    Cooper, Jason G.; Pauley, Keith A.

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Dru...

  16. SOFAS: Software Analysis Services

    OpenAIRE

    Ghezzi, G

    2010-01-01

    We propose a distributed and collaborative software analysis platform to enable seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. In particular, we devise software analysis tools as services that can be accessed and composed over the Internet. These distributed services shall be widely accessible through a software analysis broker where organizations and research groups can register and share their tools. To enable (semi)-automat...

  17. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concerns in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and qualit

  18. Fault Tolerant Software Architectures

    OpenAIRE

    Saridakis, Titos; Issarny, Valérie

    1998-01-01

Coping explicitly with failures during the conception and design of software significantly complicates the designer's job. This design complexity leads to software descriptions that are difficult to understand and that must undergo many simplifications before a first functioning version. To support the systematic development of complex, fault-tolerant software, this paper proposes a layered framework for the analysis of fault tolerance software properties, where the top-most lay...

  19. Software Architecture Simulation

    OpenAIRE

    Mårtensson, Frans; Jönsson, Per

    2002-01-01

    A software architecture is one of the first steps towards a software system. A software architecture can be designed in different ways. During the design phase, it is important to select the most suitable design of the architecture, in order to create a good foundation for the system. The selection process is performed by evaluating architecture alternatives against each other. We investigate the use of continuous simulation of a software architecture as a support tool for architecture evalua...

  20. Software-Komponentenmodelle

    OpenAIRE

    Becker, Steffen; Happe, Jens; Koziolek, Heiko; Krogmann, Klaus; Kuperberg, Michael; Reussner, Ralf; Reichelt, Sebastian; Burger, Erik [Hrsg.; Goussev, Igor; Hodzhev, Dimitar; ben Nasr Omri, Fouad

    2007-01-01

In the world of component-based software development, component models are used, among other things, to build software systems with predictable properties. They range from research models to industrial models. Depending on the goals of the models, different aspects of software are mapped into a component model. This technical report gives an overview of the software component models available today...

  1. Tools for software visualization

    OpenAIRE

    Stojanova, Aleksandra; Stojkovic, Natasa; Bikov, Dusan

    2015-01-01

Software visualization is a kind of computer art and, at the same time, a science of generating visual representations of different software aspects and of the software development process. There are many tools that allow software visualization, but we focus on some of them. In this paper just four tools are examined in detail: Jeliot 3, SRec, jGrasp and DDD. The visualizations they produce are reviewed and analyzed, and possible places for their application are mentioned. A...

  2. Visualizing software structure understandability

    OpenAIRE

    Dugerdil, Philippe; Niculescu, Mihnea

    2014-01-01

Software architecture design is known to be driven by the quality attributes we want to satisfy. Among them, modifiability plays an important role, since software maintenance takes the lion's share of software development costs. However, to successfully maintain a legacy system, the latter must be sufficiently understood so that the maintenance team will not introduce new bugs when correcting others. We then present a software metric that we call the Autonomy Ratio (AR). We show this...

  3. Software Engineering Process Metamodels

    OpenAIRE

    Ragna Steenweg; Marco Kuhrmann; Daniel Méndez Fernández

    2013-01-01

Software processes help to structure and organize software projects. Since software projects are complex endeavors and continuously grow in terms of size, budget, and complexity, software processes are used to coordinate people and teams, to define interfaces in a multi-site project setting in globally distributed development, and to provide a shared terminology and knowledge base. Since much process knowledge is available, appropriate tools are required to structure knowledge and to make it ...

  4. Search Based Software Engineering

    OpenAIRE

    Jaspreet Bedi; Kuljit Kaur

    2014-01-01

Search Based Software Engineering (SBSE) consists of the application of search-based optimization to software engineering. This paper reviews SBSE research and identifies the major milestones in this direction; the approach has been the topic of several surveys and reviews. Using SBSE, a software engineering task is formulated as a search problem by defining a suitable candidate-solution representation and a fitness function to differentiate be...
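The formulation can be made concrete with a classic SBSE task (our own illustrative sketch; the coverage data is invented): test-suite minimisation, where a candidate solution is a subset of tests, the fitness rewards branch coverage and penalises suite size, and the search is simple hill climbing:

```python
# Hypothetical coverage data: test id -> set of covered branches.
COVERAGE = {
    "t1": {1, 2, 3}, "t2": {3, 4}, "t3": {5}, "t4": {1, 5}, "t5": {2, 4, 5},
}
ALL = set().union(*COVERAGE.values())

def fitness(suite):
    """Higher is better: coverage dominates, smaller suites break ties."""
    covered = set().union(*(COVERAGE[t] for t in suite)) if suite else set()
    return len(covered) * 10 - len(suite)

def hill_climb():
    current = set(COVERAGE)            # start from the full suite
    improved = True
    while improved:
        improved = False
        for t in sorted(current):      # neighbourhood: drop one test
            candidate = current - {t}
            if candidate and fitness(candidate) >= fitness(current):
                current = candidate    # dropping t kept coverage: accept
                improved = True
    return current

best = hill_climb()
covered = set().union(*(COVERAGE[t] for t in best))
print(sorted(best), covered == ALL)  # -> ['t2', 't4', 't5'] True
```

Because dropping a test can only raise the fitness when coverage is preserved, the climb terminates at a suite from which no single test is redundant.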

  5. Lean Software Development

    OpenAIRE

    Austad, Henriette

    2011-01-01

Agile is the name of the common denominator between several methodologies. Agile software development uses short iterations and independent, cross-functional teams to create software. Development is performed in tight cooperation with the customer. Lean software development (LSD) is the translation of Lean principles into the realm of software development. Every action that does not produce value for the customer is considered waste. The goal is to achieve a waste-free process, where each step ...

  6. Software engineering measurement

    CERN Document Server

    Munson, PhD, John C

    2003-01-01

    By demonstrating how to develop simple experiments for the empirical validation of theoretical research and showing how to convert measurement data into meaningful and valuable information, this text fosters more precise use of software measurement in the computer science and software engineering literature. Software Engineering Measurement shows you how to convert your measurement data to valuable information that can be used immediately for software process improvement.

  7. Java for flight software

    Science.gov (United States)

    Benowitz, E.; Niessner, A.

    2003-01-01

This work involves developing representative mission-critical spacecraft software using the Real-Time Specification for Java (RTSJ). It currently leverages actual flight software from NASA's Deep Space 1 (DS1) mission, which flew in 1998.

  8. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic i...
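A small example of what configured variability looks like in practice (the product variants and feature names below are hypothetical, our own sketch): a product built from a feature model behaves differently per configuration, so testing must range over the configurations, not just the code:

```python
# Hypothetical feature model: each product variant toggles features.
VARIANTS = {
    "community":  {"export_pdf": False, "sso": False},
    "enterprise": {"export_pdf": True,  "sso": True},
}

def build_menu(features):
    """Behaviour varies with the configuration the product was built for."""
    menu = ["open", "save"]
    if features["export_pdf"]:
        menu.append("export-pdf")
    if features["sso"]:
        menu.append("login-sso")
    return menu

# Testing must cover the variability: one run per configuration.
for name, features in VARIANTS.items():
    print(name, build_menu(features))
```

Each added boolean feature doubles the configuration space, which is why variability-aware test selection matters.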

  9. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  10. Evaluation Software in Counseling.

    Science.gov (United States)

    Sabella, Russell A.

Counselors today are presented with a number of different software applications. This article intends to advance the counselor's knowledge and consideration of the various aspects of application software. Included is a discussion of the software applications typically of help to counselors in (a) managing their work (computer-managed counseling);…

  11. Software engineering: a roadmap

    OpenAIRE

    Finkelstein, A.; Kramer, J.

    2000-01-01

    This paper provides a roadmap for software engineering. It identifies the principal research challenges being faced by the discipline and brings together the threads derived from the key research specialisations within software engineering. The paper draws heavily on the roadmaps covering specific areas of software engineering research collected in this volume.

  12. Computer software quality assurance

    International Nuclear Information System (INIS)

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  13. Software testing and software fault injection

    OpenAIRE

    Kooli, Maha; Bosio, Alberto; Benoit, Pascal; Torres, Lionel

    2015-01-01

    Reliability is one of the most important characteristics of the system quality. It is defined as the probability of failure-free operation of system for a specified period of time in a specified environment. For microprocessor based systems, reliability includes both software and hardware reliability. Many methods and techniques have been proposed in the literature so far to evaluate and test both software faults (e.g., Mutation Testing, Control Flow Testing, Data Flow Testing) and hardware f...

  14. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process

  15. Software Validation in ATLAS

    International Nuclear Information System (INIS)

The ATLAS collaboration operates an extensive set of protocols to validate the quality of the offline software in a timely manner. This is essential in order to process the large amounts of data being collected by the ATLAS detector in 2011 without complications on the offline software side. We discuss a number of different strategies used to validate the ATLAS offline software: running the ATLAS framework software, Athena, in a variety of configurations daily on each nightly build via the ATLAS Nightly System (ATN) and Run Time Tester (RTT) systems; monitoring these tests and checking the compilation of the software via distributed teams of rotating shifters; monitoring of and follow-up on bug reports by the shifter teams; and periodic software-cleaning weeks to further improve the quality of the offline software.

  16. Mathematical software production

    Energy Technology Data Exchange (ETDEWEB)

    Cowell, W. R.; Fosdick, L. D.

    1977-01-01

    Locally constructed collections of mathematical routines are gradually being replaced by mathematical software that has been produced for broad dissemination and use. The process of producing such software begins with algorithmic analysis, and proceeds through software construction and documentation to extensive testing and, finally, to distribution and support of the software products. These are demanding and costly activities which require such a range of skills that they are carried out in collaborative projects. The costs and effort are justified by the utility of high-quality software, the efficiency of producing it for general distribution, and the benefits of providing a conduit from research to applications. This paper first reviews certain of the early developments in the field of mathematical software. Then it examines the technical problems that distinguish software production as an intellectual activity, problems whose descriptions also serve to characterize ideal mathematical software. Next, three mathematical software projects are sketched with attention to their emphasis, accomplishments, organization, and costs. Finally, comments are offered on possible future directions for mathematical software production, as extrapolations of the present involvement of universities, government laboratories, and private industry. 48 references.

  17. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  18. On simultaneous s-cores/t-cores

    OpenAIRE

    Aukerman, D; Kane, B.; Sze, L

    2009-01-01

    In this paper, the authors investigate the question of when a partition of n∈N is an s-core and also a t-core when s and t are not relatively prime. A characterization of all such s/t-cores is given, as well as a generating function dependent upon the polynomial generating functions for s/t-cores when s and t are relatively prime. Furthermore, characterizations and generating functions are given for s/t-cores which are self-conjugate and also for (e,r)/(e′,r)-cores.
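For background (this is standard partition theory, not the paper's s/t-core results): a partition is a t-core exactly when none of its hook lengths is divisible by t, which is easy to check directly:

```python
def hook_lengths(partition):
    """Hook lengths of a partition given as a weakly decreasing list of parts."""
    conj = [sum(1 for part in partition if part > j)
            for j in range(partition[0])]            # conjugate partition
    return [partition[i] - (j + 1) + conj[j] - (i + 1) + 1
            for i in range(len(partition))
            for j in range(partition[i])]

def is_t_core(partition, t):
    """A partition is a t-core iff no hook length is divisible by t."""
    return all(h % t != 0 for h in hook_lengths(partition))

print(sorted(hook_lengths([4, 2, 1]), reverse=True))  # -> [6, 4, 3, 2, 1, 1, 1]
print(is_t_core([4, 2, 1], 5), is_t_core([4, 2, 1], 2))  # -> True False
```

A simultaneous s/t-core is then a partition for which `is_t_core` holds for both s and t; when gcd(s, t) = 1 there are finitely many, which is the setting the paper generalises away from.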

  19. A Prototype for the Support of Integrated Software Process Development and Improvement

    Science.gov (United States)

    Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian

An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process can result in efficiency. This paper hence proposes a software process maintenance framework which consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. Besides, a prototype tool to support the framework is introduced.

  20. Integrating Base Stations with a Software Defined Core Network

    OpenAIRE

    Hernandez Zamora, Bruno

    2016-01-01

    An unprecedented increase is expected in the demand for mobile data traffic, which requires significant changes in the access network. On the one hand, the penetration of new technologies and networks such as Internet of Things (IoT) introduces a large amount of additional traffic of different kinds. On the other hand, the cell size will be reduced, which results in a higher number of base stations that need to be deployed, requiring additional investments by operators. This thesis propos...

  1. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model. Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models; Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept...

  2. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performances. This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. These issues we discuss along with use case scenarios. Here in this paper we aim to identify challenges for...

  3. Systems and software variability management concepts, tools and experiences

    CERN Document Server

    Capilla, Rafael; Kang, Kyo-Chul

    2013-01-01

    The success of product line engineering techniques in the last 15 years has popularized the use of software variability as a key modeling approach for describing the commonality and variability of systems at all stages of the software lifecycle. Software product lines enable a family of products to share a common core platform, while allowing for product specific functionality being built on top of the platform. Many companies have exploited the concept of software product lines to increase the resources that focus on highly differentiating functionality and thus improve their competitiveness

  4. Integrating Behaviour in Software Models: An Event Coordination Notation

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    One of the main problems in model-based software engineering is modelling behaviour in such a way that the behaviour models can be easily integrated with each other, with the structural software models and with pre-existing software. In this paper, we propose an event coordination notation (ECNO) that deals with this problem. We present the main concepts and rationales behind this notation and discuss a prototype and run-time environment that executes these models, and provides an API so that other parts of the software can be easily integrated. The core concepts of the ECNO seem to be...

  5. Architecture of the ATLAS High Level Trigger Event Selection Software

    CERN Document Server

    Grothe, M; Baines, J T M; Bee, C P; Biglietti, M; Bogaerts, A; Boisvert, V; Bosman, M; Brandt, S; Caron, B; Casado, M P; Cataldi, G; Cavalli, D; Cervetto, M; Comune, G; Corso-Radu, A; Di Mattia, A; Díaz-Gómez, M; Dos Anjos, A; Drohan, J; Ellis, Nick; Elsing, M; Epp, B; Etienne, F; Falciano, S; Farilla, A; George, S; Ghete, V M; González, S; Kaczmarska, A; Karr, K M; Khomich, A; Konstantinidis, N P; Krasny, W; Li, W; Lowe, A; Luminari, L; Ma, H; Meessen, C; Mello, A G; Merino, G; Morettini, P; Moyse, E; Nairz, A; Negri, A; Nikitin, N V; Nisati, A; Padilla, C; Parodi, F; Pérez-Réale, V; Pinfold, J L; Pinto, P; Polesello, G; Qian, Z; Rajagopalan, S; Resconi, S; Rosati, S; Scannicchio, D A; Schiavi, C; Schörner-Sadenius, T; Segura, E; De Seixas, J M; Shears, T G; Sivoklokov, S Yu; Smizanska, M; Soluk, R A; Stanescu, C; Tapprogge, Stefan; Touchard, F; Vercesi, V; Watson, A; Wengler, T; Werner, P; Wheeler, S; Wickens, F J; Wiedenmann, W; Wielers, M; Zobernig, G; CHEP 2003 Computing in High Energy Physics; Grothe, Monika

    2004-01-01

    The ATLAS High Level Trigger (HLT) consists of two selection steps: the second level trigger and the event filter. Both will be implemented in software, running on mostly commodity hardware. Both levels have a coherent approach to event selection, so a common core software framework has been designed to maximize this coherency, while allowing sufficient flexibility to meet the different interfaces and requirements of the two different levels. The approach is extended further to allow the software to run in an off-line simulation and reconstruction environment for the purposes of development. This paper describes the architecture and high level design of the software.

  6. Trace Software Pipelining

    Institute of Scientific and Technical Information of China (English)

    王剑 (Wang Jian); Andreas Krall; et al.

    1995-01-01

    Global software pipelining is a complex but efficient compilation technique to exploit instruction-level parallelism for loops with branches. This paper presents a novel global software pipelining technique, called Trace Software Pipelining, targeted at instruction-level parallel processors such as Very Long Instruction Word (VLIW) and superscalar machines. Trace software pipelining applies a global code scheduling technique to compact the original loop body. The resulting loop is called a trace software pipelined (TSP) code. The trace software pipelined code can be directly executed with special architectural support or can be transformed into a globally software pipelined loop for current VLIW and superscalar processors. Thus, exploiting parallelism across all iterations of a loop can be completed by compacting the original loop body with any global code scheduling technique. This makes our new technique very promising for practical compilers. Finally, we also present preliminary experimental results to support our new approach.
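
    As background for the abstract above, the general idea of software pipelining can be sketched in a few lines. This toy model is not the paper's Trace Software Pipelining algorithm; the three-stage split and stage names are invented for illustration. It shows how stages of successive loop iterations overlap in the pipelined schedule:

    ```python
    # Toy illustration of software pipelining: a loop body split into three
    # stages; the pipelined kernel overlaps stage k of iteration i with
    # stage k-1 of iteration i+1, so distinct iterations execute in parallel.

    def pipelined_schedule(n_iterations, stages=("A", "B", "C")):
        """Return a list of time slots; each slot holds (stage, iteration)
        pairs that could issue together given enough functional units."""
        depth = len(stages)
        slots = []
        for t in range(n_iterations + depth - 1):
            slot = []
            for k, stage in enumerate(stages):
                i = t - k  # iteration whose stage k is due at time t
                if 0 <= i < n_iterations:
                    slot.append((stage, i))
            slots.append(slot)
        return slots

    # Prologue fills the pipeline, the steady-state kernel runs all stages
    # at once, and the epilogue drains it.
    for t, slot in enumerate(pipelined_schedule(4)):
        print(t, slot)
    ```

    At time slot 2 the kernel is full: stage C of iteration 0, B of iteration 1 and A of iteration 2 issue together.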

  7. COTS software selection process.

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, William M. (Strike Wire Technologies, Louisville, CO); Lin, Han Wei; McClelland, Kelly (U.S. Security Associates, Livermore, CA); Ullrich, Rebecca Ann; Khanjenoori, Soheil; Dalton, Karen; Lai, Anh Tri; Kuca, Michal; Pacheco, Sandra; Shaffer-Gant, Jessica

    2006-05-01

    Today's need for rapid software development has generated great interest in employing Commercial-Off-The-Shelf (COTS) software products as a way of managing cost, development time, and effort. With an abundance of COTS software packages to choose from, the problem now is how to systematically evaluate, rank, and select a COTS product that best meets the software project requirements and at the same time can leverage the current corporate information technology architectural environment. This paper describes a systematic process for decision support in evaluating and ranking COTS software. Performed right after requirements analysis, this process provides the evaluators with concise, structured, step-by-step activities for determining the best COTS software product with manageable risk. In addition, the process is presented in phases that are flexible to allow for customization or tailoring to meet various projects' requirements.
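
    A weighted-scoring decision matrix is one common way to realize the kind of structured evaluate-and-rank step the abstract describes. The criteria, weights, and product names below are invented for illustration and are not taken from the paper:

    ```python
    # Minimal weighted-scoring sketch for ranking COTS candidates.
    # Each product gets a 0-10 score per criterion; weights sum to 1.

    def rank_cots(candidates, weights):
        """candidates: {product: {criterion: score}}; returns products
        sorted by weighted total, best first."""
        totals = {
            product: sum(weights[c] * s for c, s in scores.items())
            for product, scores in candidates.items()
        }
        return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

    weights = {"functionality": 0.4, "cost": 0.2,
               "vendor_support": 0.2, "fit_with_architecture": 0.2}
    candidates = {
        "ProductA": {"functionality": 8, "cost": 6,
                     "vendor_support": 7, "fit_with_architecture": 9},
        "ProductB": {"functionality": 9, "cost": 4,
                     "vendor_support": 6, "fit_with_architecture": 5},
    }
    print(rank_cots(candidates, weights))
    ```

    Here ProductA wins (weighted total 7.6 vs 6.6) despite ProductB's higher functionality score, which is exactly the trade-off such a matrix is meant to surface.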

  8. A hardware/software co-optimization approach for embedded software of MP3 decoder

    Institute of Scientific and Technical Information of China (English)

    ZHANG Wei; LIU Peng; ZHAI Zhi-bo

    2007-01-01

    In order to improve the efficiency of embedded software running on a processor core, this paper proposes a hardware/software co-optimization approach for embedded software from the system point of view. The proposed stepwise methods aim at exploiting the structure and the resources of the processor as much as possible for software algorithm optimization. To achieve low memory usage and a low frequency requirement for the same performance, this co-optimization approach was used to optimize the embedded software of an MP3 decoder based on a 16-bit fixed-point DSP core. After the optimization, decoding 128 kbps, 44.1 kHz stereo MP3 on the DSP evaluation platform requires 45.9 MIPS and 20.4 kbytes of memory space. The optimization rate reaches 65.6% for memory and 49.6% for frequency, respectively, compared with the results produced by the compiler using floating-point computation. The experimental result indicates the viability of the hardware/software co-optimization approach, depending on the algorithm and architecture.
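
    The fixed-point arithmetic such a port substitutes for floating point can be illustrated with a minimal sketch. The Q15 format shown here is an assumption for illustration (a common choice on 16-bit DSPs); the paper's decoder may use different scalings:

    ```python
    # Q15 fixed-point sketch: values stored as 16-bit integers with 15
    # fractional bits. A multiply produces a 32-bit intermediate that is
    # shifted back down by 15 bits.

    Q = 15

    def to_q15(x):
        """Convert a float in [-1, 1) to a saturated Q15 integer."""
        return max(-32768, min(32767, int(round(x * (1 << Q)))))

    def q15_mul(a, b):
        """Multiply two Q15 values; the product is rescaled to Q15."""
        return (a * b) >> Q

    a, b = to_q15(0.5), to_q15(0.25)
    print(q15_mul(a, b) / (1 << Q))  # 0.125: exact for these operands
    ```

    Replacing floating-point multiplies with this integer pattern is what lets the decoder run on a fixed-point core at a fraction of the cycle cost.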

  9. Social software in global software development

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2010-01-01

    Social software (SoSo) is defined by Farkas as tools that (1) allow people to communicate, collaborate, and build community online, (2) can be syndicated, shared, reused or remixed and (3) let people learn easily from and capitalize on the behavior and knowledge of others [1]. SoSo includes a wide variety of tools such as: instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, virtual worlds. Though normally belonging rather to the private realm, the use of social software in corporate contexts has been reported, e.g. as a way...

  10. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  11. Gammasphere software development

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1993-01-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an object-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere).

  12. Software systems for astronomy

    CERN Document Server

    Conrad, Albert R

    2014-01-01

    This book covers the use and development of software for astronomy. It describes the control systems used to point the telescope and operate its cameras and spectrographs, as well as the web-based tools used to plan those observations. In addition, the book also covers the analysis and archiving of astronomical data once it has been acquired. Readers will learn about existing software tools and packages, develop their own software tools, and analyze real data sets.

  13. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners. It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice, with a focus on the software process policymaking and process control aspects of improvement efforts...

  14. Software Preservation Benefits Framework

    OpenAIRE

    Chue Hong, Neil; Crouch, Steve; Hettrick, Simon; Parkinson, Tim; Shreeve, Matt

    2010-01-01

    An investigation of software preservation has been carried out by Curtis+Cartwright Consulting Limited, in partnership with the Software Sustainability Institute (SSI), on behalf of the JISC. The aim of the study was to raise awareness and build capacity throughout the Further and Higher Education (FE/HE) sector to engage with preservation issues as part of the process of software development. Part of this involved examining the purpose and benefits of employing preservation measures in relat...

  15. Unit shutdown software

    International Nuclear Information System (INIS)

    The paper Unit Shutdown Software contains a description of software to be used by the WWER 440 type NPP unit manager during unit shutdown operations. The software displays measured or calculated values on the basis of real-time technological process data acquisition, provides advice during unit shutdown and, in cases of a faulty intervention or the inception of a dangerous situation during the unit shutdown process, produces warnings. (author). 4 figs

  16. Computer software configuration management

    International Nuclear Information System (INIS)

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  17. The Other Software

    OpenAIRE

    McWilliams, Chandler B.

    2009-01-01

    This paper considers the absence of the human actor, specifically the programmer, from Friedrich Kittler’s analysis of software in his essay There is no Software. By focusing too intently on the machine and its specific, material existence, Kittler removes the human user / operator / writer from his analysis of software. Thus, he has no choice but to interpret the layers of language, assembler, opcode and WordPerfect, DOS, BIOS—both chains ending in an essentializing reduction to voltages—as ...

  18. Software Radar signal processing

    OpenAIRE

    T. Grydeland; Lind, F. D.; Erickson, P J; J. M. Holt

    2005-01-01

    Software infrastructure is a growing part of modern radio science systems. As part of developing a generic infrastructure for implementing Software Radar systems, we have developed a set of reusable signal processing components. These components are generic software-based implementations for use on general purpose computing systems. The components allow for the implementation of signal processing chains for radio frequency signal reception, correlation-based data processing, and cross-correla...
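
    As an illustration of the correlation-based data processing such reusable components implement, here is a minimal matched-filter sketch. The signal values are invented, and real radar chains work on complex, noisy RF samples; this noise-free toy only shows how cross-correlation locates an echo delay:

    ```python
    # Cross-correlate received samples against the transmitted code; the
    # lag with the largest correlation estimates the echo delay.

    def cross_correlate(received, code):
        n = len(code)
        return [
            sum(received[lag + i] * code[i] for i in range(n))
            for lag in range(len(received) - n + 1)
        ]

    code = [1, 1, -1, 1]                  # transmitted pulse code
    received = [0, 0, 0] + code + [0, 0]  # echo delayed by 3 samples
    corr = cross_correlate(received, code)
    print(corr.index(max(corr)))  # 3: the delay is recovered
    ```

    In a real system this correlation runs continuously against streamed samples, which is why it is packaged as a reusable, performance-tuned component.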

  19. Managing Distributed Software Projects

    OpenAIRE

    Persson, John Stouby

    2010-01-01

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but also contri...

  20. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  1. Automated functional software testing

    OpenAIRE

    Jelnikar, Kristina

    2009-01-01

    The following work describes an approach to the automation of functional software testing. The introductory part presents the testing problems that development companies face. The second chapter describes some testing methods, the role testing plays in software development, several approaches to software development, and the meaning of the testing environment. Chapter 3 is all about test automation. After a brief historical presentation, we are demonstrating through s...

  2. Banking Software Applications Security

    OpenAIRE

    Ioan Alexandru Bubu

    2015-01-01

    Computer software products are among the most complex artifacts, if not the most complex artifacts, mankind has created. Securing those artifacts against intelligent attackers who try to exploit flaws in software design and construction is a great challenge too. The purpose of this paper is to introduce a secure alternative to banking software applications that are currently in use. This new application aims to cover most of the well-known vulnerabilities that plague the majority of current softwa...

  3. Software Dataplane Verification

    OpenAIRE

    Dobrescu, Mihai; Argyraki, Katerina

    2014-01-01

    Software dataplanes are emerging as an alternative to traditional hardware switches and routers, promising programmability and short time to market. These advantages are set against the risk of disrupting the network with bugs, unpredictable performance, or security vulnerabilities. We explore the feasibility of verifying software dataplanes to ensure smooth network operation. For general programs, verifiability and performance are competing goals; we argue that software dataplanes are differ...

  4. Lean software development

    OpenAIRE

    Hefnerová, Lucie

    2011-01-01

    The main goal of this bachelor thesis is to produce clear Czech-written material on the concept of Lean Software Development, which has recently been gaining significant attention in the field of software development. Another goal of this thesis is to summarize the possible approaches to categorizing the concept and to defining the relationship between Lean and Agile software development. The detailed categorization of the tools potentia...

  5. Generative Software Development

    OpenAIRE

    Rumpe, Bernhard; Schindler, Martin; Völkel, Steven; Weisemöller, Ingo

    2014-01-01

    Generation of software from modeling languages such as UML and domain specific languages (DSLs) has become an important paradigm in software engineering. In this contribution, we present some positions on software development in a model based, generative manner based on home grown DSLs as well as the UML. This includes development of DSLs as well as development of models in these languages in order to generate executable code, test cases or models in different languages. Development of formal...

  6. Software Upgrades under Monopoly

    OpenAIRE

    Jiri Strelicky; Kresimir Zigic

    2013-01-01

    We study price discrimination in a monopolistic software market. The monopolist charges different prices for the upgrade version and for the full version. Consumers are heterogeneous in taste for infinitely durable software and there is no resale. We show that price discrimination leads to a higher software quality but raises both absolute price and price per quality. This price discrimination does not increase sales and it decreases the total number of consumers compared to no discrimination...

  7. Software Architecture in Depth

    OpenAIRE

    Lars Heinemann; Christian Neumann; Birgit Penzenstadler; Wassiou Sitou

    2016-01-01

    The quality of software architecture is one of the crucial success factors for the development of large and/or complex systems. Therefore, a good software architect plays a key role in every demanding project: she or he has the overview of the overall system and sets the framework for the implementation. In order to be successful in this task, software architects need well-founded and encompassing knowledge about design, which exceeds pure programming and specific specialization a...

  8. Software Requirements Management

    OpenAIRE

    Ali Altalbe

    2015-01-01

    Requirements are defined as the desired set of characteristics of a product or a service. In the world of software development, it is estimated that more than half of the failures are attributed to poor requirements management. This means that although the software functions correctly, it is not what the client requested. Modern software requirements management methodologies are available to reduce the occurrence of such incidents. This paper performs a review of the available literatur...

  9. Reverse engineering software ecosystems

    OpenAIRE

    Lungu, Mircea F.; Lanza, Michele

    2009-01-01

    Reverse engineering is an active area of research concerned with the development of techniques and tools that support the understanding of software systems. All the techniques that were proposed until now study individual systems in isolation. However, software systems are seldom developed in isolation; instead, they are developed together with other projects in the wider context of an organization. We call the collection of projects that are developed in such a context a software ...

  10. DIVERSIFICATION IN SOFTWARE ENGINEERING

    OpenAIRE

    Er.Kirtesh Jailia,; Manisha Jailia; Er.Pramod Kumar,; Manisha Agarwal

    2010-01-01

    In this paper we examine the factors that have promoted the diversification of software process models. The intention is to understand more clearly the problem-solving process in software engineering and to try to find an efficient way to manage risk. A review of software process modeling is given first, followed by a discussion of process evaluation techniques. A taxonomy for categorizing process models, based on establishing decision criteria, is identified that can guide selecting the a...

  11. Knowledge Level Software Engineering

    OpenAIRE

    Giunchiglia, Fausto; PERINI, Anna; Sannicolo', Fabrizio

    2001-01-01

    We contend that, at least in the first stages of definition of the early and late requirements, the software development process should be articulated using knowledge level concepts. These concepts include actors, who can be (social, organizational, human or software) agents, positions or roles, goals, and social dependencies for defining the obligations of actors to other actors. The goal of this paper is to instantiate this claim by describing how Tropos, an agent-oriented software engineer...

  12. Software libre (fuentes abiertas)

    OpenAIRE

    Solís Portillo, Daniel

    2010-01-01

    This event seeks to answer researchers' questions and, with experts, to explore in depth the aspects related to software protection: forms of industrial and intellectual property protection; legal aspects to take into account; the patentability of software inventions and computer-implemented inventions; free software (open source); protection at the UC3M; experiences of our researchers.

  13. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. It explains two maintenance standards (IEEE/EIA 1219 and ISO/IEC 14764), discusses several commercial reverse and domain engineering toolkits, and makes slides for instructors available online. Information is based on the IEEE SWEBOK (Software Engineering Body of Knowledge).

  14. Systems, methods and apparatus for developing and maintaining evolving systems with software product lines

    Science.gov (United States)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Pena, Joaquin (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which an evolutionary system is managed and viewed as a software product line. In some embodiments, the core architecture is a relatively unchanging part of the system, and each version of the system is viewed as a product from the product line. Each software product is generated from the core architecture with some agent-based additions. The result may be a multi-agent system software product line.

  15. Dual-core antiresonant hollow core fibers.

    Science.gov (United States)

    Liu, Xuesong; Fan, Zhongwei; Shi, Zhaohui; Ma, Yunfeng; Yu, Jin; Zhang, Jing

    2016-07-25

    In this work, dual-core antiresonant hollow core fibers (AR-HCFs) are numerically demonstrated, to our knowledge, for the first time. Two fiber structures are proposed. One is a composite of two single-core nested nodeless AR-HCFs, exhibiting low confinement loss and a circular mode profile in each core. The other has a relatively simple structure, with a whole elliptical outer jacket, presenting a uniform and wide transmission band. The modal couplings of the dual-core AR-HCFs rely on a unique mechanism that transfers power through the air. The core separation and the gap between the two cores influence the modal coupling strength. With proper designs, both of the dual-core fibers can have low phase birefringence and short modal coupling lengths of several centimeters. PMID:27464191
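
    The centimeter-scale coupling lengths reported follow from standard coupled-mode theory for two identical, phase-matched cores: power oscillates between the cores as P2(z) = sin²(κz) and transfers completely at the coupling length Lc = π/(2κ). A minimal numeric sketch (the coupling coefficient below is invented, not taken from the paper):

    ```python
    # Coupled-mode power transfer between two identical fiber cores.
    import math

    def coupled_power(kappa, z):
        """Fraction of power in the second core after propagation z,
        for coupling coefficient kappa (rad/m), phase-matched cores."""
        return math.sin(kappa * z) ** 2

    kappa = math.pi / 0.04            # rad/m, chosen so Lc = 2 cm
    L_c = math.pi / (2 * kappa)
    print(L_c)                        # 0.02 m coupling length
    print(coupled_power(kappa, L_c))  # 1.0: complete transfer at z = Lc
    ```

    Any phase mismatch between the cores reduces the peak transferred power below 1, which is why the paper emphasizes designs with low birefringence between the two cores.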

  16. Agile software development

    CERN Document Server

    Dingsoyr, Torgeir; Moe, Nils Brede

    2010-01-01

    Agile software development has become an umbrella term for a number of changes in how software developers plan and coordinate their work, how they communicate with customers and external stakeholders, and how software development is organized in small, medium, and large companies, from the telecom and healthcare sectors to games and interactive media. Still, after a decade of research, agile software development is the source of continued debate due to its multifaceted nature and insufficient synthesis of research results. Dingsoyr, Dyba, and Moe now present a comprehensive snapshot of the kno

  17. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  18. Marketing Mix del Software.

    OpenAIRE

    Yudith del Carmen Rodríguez Pérez

    2006-01-01

    Software engineering and software quality models have consolidated their efforts in the software production process, yet they contribute little to the commercialization process. It is essential in computer science to develop a commercialization model for software-producing organizations in order to raise their productivity. First, however, it is necessary to understand the characteristics of the software product that differentiate it from ot...

  19. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    In this paper, the definition and the key features of Software Radar, which is a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar and Intelligent Radar, and the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open system architecture supporting the decoupling of application software from low-level hardware would find it easy to adopt a "user requirements-oriented" development methodology instead of the traditional "specific function-oriented" development methodology. Compared with traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, which has been developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are also given in the conclusion.

  20. Architecture for Verifiable Software

    Science.gov (United States)

    Reinholtz, William; Dvorak, Daniel

    2005-01-01

    Verifiable MDS Architecture (VMA) is a software architecture that facilitates the construction of highly verifiable flight software for NASA's Mission Data System (MDS), especially for smaller missions subject to cost constraints. More specifically, the purpose served by VMA is to facilitate aggressive verification and validation of flight software while imposing a minimum of constraints on overall functionality. VMA exploits the state-based architecture of the MDS and partitions verification issues into elements susceptible to independent verification and validation, in such a manner that scaling issues are minimized, so that relatively large software systems can be aggressively verified in a cost-effective manner.

  1. Software architecture 1

    CERN Document Server

    Oussalah, Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  2. MYOB software for dummies

    CERN Document Server

    Curtis, Veechi

    2012-01-01

    Your complete guide to MYOB® AccountRight software. Now in its seventh edition, MYOB® Software For Dummies walks you through everything you need to know, from starting your MYOB® file from scratch and recording payments and receipts, to tracking profit and analysing sales. This new edition includes all the information you need on the new generation of MYOB® AccountRight software, including the new cloud computing features. Set up MYOB® software - understand how to make it work the first time. Keep track of purchases and sales - monitor customer accounts and ensure you get pai

  3. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

    This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives - Product, Project, Process, and People - to identify an outlook for software innovation. The paper then describes a new facility, the Software Innovation Research Lab (SIRL), and a new method concept for software innovation, Essence, based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research.

  4. Decomposed software pipelining

    OpenAIRE

    Wang, J.; Eisenbeis, Christine

    1993-01-01

    This report presents a new view of software pipelining, in which we consider software pipelining as an instruction-level transformation from a vector of one dimension to a matrix of two dimensions. Thus, the software pipelining problem can be naturally decomposed into two subproblems: one to determine the row numbers of operations in the matrix and another to determine the column numbers. Using this viewpoint as a basis, we develop a new loop scheduling approach, called decomposed software pip...

  5. Armament Software Engineering Center

    Data.gov (United States)

    Federal Laboratory Consortium — The Fire Control Systems and Technology Directorate's 89,000 square-foot Armament Software...

  6. Software Metrics for Identifying Software Size in Software Development Projects

    OpenAIRE

    V.S.P Vidanapathirana; K.H.M.R Peiris

    2015-01-01

    Measurements are fundamental to any engineering discipline. They indicate the amount, extent, dimension or capacity of an attribute or a product in a quantitative manner. The analyzed results of the measured data give the basic idea of metrics. A metric is a quantitative representation of the measurements of the degree to which a system, component, or process possesses a given attribute. When it comes to software, metrics cover a wide scope of measurements of computer programming. The si...
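
    One of the simplest size measurements alluded to, physical source lines of code (SLOC), can be sketched as follows. This is a deliberately simplistic counter; real SLOC tools also handle block comments, string literals, and per-language grammar:

    ```python
    # Count physical SLOC: non-blank lines that are not line comments.

    def count_sloc(source, comment_prefix="#"):
        count = 0
        for line in source.splitlines():
            stripped = line.strip()
            if stripped and not stripped.startswith(comment_prefix):
                count += 1
        return count

    sample = """# a comment

    def f(x):
        return x + 1
    """
    print(count_sloc(sample))  # 2
    ```

    Even this crude measure illustrates why size metrics need a precise definition: whether blanks and comments count changes the number a project reports.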

  7. CoreDevRec:Automatic Core Member Recommendation for Contribution Evaluation

    Institute of Scientific and Technical Information of China (English)

    蒋竞; 贺佳欢; 陈学渊

    2015-01-01

    The pull-based software development model helps developers make contributions flexibly and efficiently. Core members evaluate code changes submitted by contributors, and decide whether to merge these code changes into repositories or not. Ideally, code changes are assigned to core members and evaluated within a short time after their submission. However, in reality, some popular projects receive many pull requests, and core members have difficulties in choosing the pull requests that are to be evaluated. Therefore, there is a growing need for automatic core member recommendation, which improves the evaluation process. In this paper, we investigate pull requests with manual assignment. Results show that 3.2%∼40.6% of pull requests are manually assigned to specific core members. To assist with the manual assignment, we propose CoreDevRec to recommend core members for contribution evaluation in GitHub. CoreDevRec uses support vector machines to analyze different kinds of features, including file paths of modified code, relationships between contributors and core members, and activeness of core members. We evaluate CoreDevRec on 18,651 pull requests of five popular projects in GitHub. Results show that CoreDevRec achieves accuracy from 72.9% to 93.5% for top-3 recommendation. In comparison with a baseline approach, CoreDevRec improves the accuracy from 18.7% to 81.3% for top-3 recommendation. Moreover, CoreDevRec even has higher accuracy than manual assignment in the project TrinityCore. We believe that CoreDevRec can improve the assignment of pull requests.
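
    The recommendation task can be illustrated with a toy scorer over two of the feature kinds the abstract names: file-path overlap and activeness. The paper itself trains support vector machines on richer features; all member names, weights, and profiles below are invented for illustration:

    ```python
    # Toy core-member recommender: score each member by overlap between the
    # pull request's files and files they previously reviewed, blended with
    # their recent activeness, then return the top-k members.

    def recommend(pr_files, members, top_k=3, w_path=0.7, w_active=0.3):
        scores = {}
        for name, profile in members.items():
            overlap = len(set(pr_files) & set(profile["reviewed_files"]))
            path_score = overlap / max(len(pr_files), 1)
            scores[name] = w_path * path_score + w_active * profile["activeness"]
        return sorted(scores, key=scores.get, reverse=True)[:top_k]

    members = {
        "alice": {"reviewed_files": ["src/db.c", "src/net.c"], "activeness": 0.9},
        "bob":   {"reviewed_files": ["src/ui.c"],              "activeness": 0.4},
    }
    print(recommend(["src/db.c", "src/log.c"], members, top_k=1))  # ['alice']
    ```

    An SVM, as used in the paper, learns the feature weights from labeled historical assignments instead of fixing them by hand as this sketch does.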

  8. Software Validation Infrastructure for the ATLAS Trigger

    CERN Document Server

    Adorisio, C; Beauchemin, P; Bell, P; Biglietti, M; Coccaro, A; Damazio, D; Ehrenfeld, W; Faulkner, P; George, S; Giagu, S; Goncalo, R; Hamilton, A; Jones, G; Kirk, J; Kwee, R; Lane, J; Enoque Ferreira de Lima, D; Masik, J; Mincer, A; Monticelli, F; Omachi, C; Oyarzun, A; Panikashvili, N; Potter, C; Quinonez, F; Reinsch, A; Robinson, M; Rodríguez, D; Sarkisyan-Grinbaum, E; Sidoti, A; Sinev, N; Strom, D; Sutton, M; Ventura, A; Winklmeier, F; Zhao, L

    2009-01-01

    The ATLAS trigger system is responsible for selecting the interesting collision events delivered by the Large Hadron Collider (LHC). The ATLAS trigger will need to achieve a ~10^-7 rejection factor against random proton-proton collisions while still efficiently selecting interesting events. After a first processing level based on hardware, the final event selection is based on custom software running on two CPU farms containing around two thousand multi-core machines. This is known as the high-level trigger (HLT). Running the trigger online for long periods demands very high quality software: it must be fast, performant, and essentially bug-free. With more than 100 contributors and around 250 different packages, thorough validation of the HLT software is essential. This relies on a variety of unit and integration tests as well as on software metrics, and uses both in-house and open source software. This contribution presents the existing infrastructure used for validating the high-level trigger softwar...

  9. Multicore Considerations for Legacy Flight Software Migration

    Science.gov (United States)

    Vines, Kenneth; Day, Len

    2013-01-01

    In this paper we will discuss potential benefits and pitfalls when considering a migration from an existing single core code base to a multicore processor implementation. The results of this study present options that should be considered before migrating fault managers, device handlers and tasks with time-constrained requirements to a multicore flight software environment. Possible future multicore test bed demonstrations are also discussed.

  10. Algorithm Parallelization using Software Design Patterns

    OpenAIRE

    Vincke, Robbie; De Witte, Nico; Van Landschoot, Sille; Steegmans, Eric; Boydens, Jeroen

    2013-01-01

    Multi-core systems are becoming mainstream. However, it is still a big challenge to develop concurrent software. Parallel design patterns can help in the migration process from legacy sequential to high-performing parallel code. Therefore we propose a layered model of parallel design patterns. When going through the layered model in a top-down approach, the developer is guided through the transition from sequential to parallel code. The value of the layered model is shown using a cycle/chain-d...
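
The kind of sequential-to-parallel migration the layered model guides can be illustrated with one common parallel design pattern, the data-parallel "map" pattern (a minimal sketch, not the paper's layered model itself):

```python
# Sequential-to-parallel migration using the data-parallel "map" pattern:
# a loop over independent computations is re-expressed as a parallel map.
from concurrent.futures import ThreadPoolExecutor

def work(x):
    return x * x  # stand-in for an expensive, independent computation

data = list(range(8))

# sequential version
seq = [work(x) for x in data]

# parallel version: the same loop expressed as the map pattern
with ThreadPoolExecutor(max_workers=4) as pool:
    par = list(pool.map(work, data))

print(seq == par)  # -> True
```

The pattern applies only because the iterations are independent; detecting and exposing such independence is precisely the step that pattern-guided migration supports.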

  11. LANMAS core: Update and current directions

    International Nuclear Information System (INIS)

    Local Area Network Material Accountability System (LANMAS) core software provides the framework of a material accountability system. It tracks the movement of material throughout a site and generates the required material accountability reports. LANMAS is a network-based nuclear material accountability system that runs in a client/server mode. The database of material type and location resides on the server, while the user interface runs on the client. The user interface accesses the data stored on the server via a network. The LANMAS core can be used as the foundation for building required materials control and accountability (MCA) functionality at any site requiring a new MCA system. An individual site will build on the LANMAS core by supplying site-specific software. This paper will provide an update on the current LANMAS development activities and discuss the current direction of the LANMAS project

  12. LANMAS core: Update and current directions

    International Nuclear Information System (INIS)

    Local Area Network Material Accountability System (LANMAS) core software will provide the framework of a material accountability system. LANMAS is a network-based nuclear material accountability system. It tracks the movement of material throughout a site and generates the required reports on material accountability. LANMAS will run in a client/server mode. The database of material type and location will reside on the server, while the user interface runs on the client. The user interface accesses the server via a network. The LANMAS core can be used as the foundation for building required Materials Control and Accountability (MC&A) functionality at any site requiring a new MC&A system. An individual site will build on the LANMAS core by supplying site-specific software. This paper will provide an update on the current LANMAS development activities and discuss the current direction of the LANMAS project

  13. ATLAS software packaging

    Science.gov (United States)

    Rybkin, Grigory

    2012-12-01

    Software packaging is an indispensable part of the build process and a prerequisite for deployment. The full ATLAS software stack consists of the TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present PackDist, a package of tools developed and used to package all this software except the TDAQ project. PackDist is based on and driven by CMT, the ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit is the CMT project. Each CMT project is packaged as several packages—platform-dependent (one per available platform), source code excluding header files, other platform-independent files, documentation, and debug information packages (the last two built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, the distribution kit, also includes configuration packages and contains some 120 packages per platform. Also packaged are physics analysis projects (currently 6) used by particular physics groups on top of the full release. The tools provide an installation test for the full distribution kit. Packaging is done in two formats, for use with the Pacman and RPM package managers. The tools are functional on the platforms supported by ATLAS—GNU/Linux and Mac OS X. The packaged software is used for software deployment on all ATLAS computing resources, from the detector and trigger computing farms, collaboration laboratories' computing centres, grid sites, and physicist laptops to CERN VMFS, and covers the use cases of running all applications as well as software development.
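
The recursive packaging of project dependencies described above can be sketched as a post-order traversal of the dependency graph (hypothetical project names and graph; PackDist's actual CMT-driven logic is not reproduced here):

```python
# Hypothetical sketch of recursive dependency packaging: each project is
# packaged only after all of its dependencies, and each only once.
def package_recursively(project, deps, packaged=None):
    """deps maps project -> list of projects it uses; returns packaging order."""
    if packaged is None:
        packaged = []
    for dep in deps.get(project, []):
        if dep not in packaged:
            package_recursively(dep, deps, packaged)
    if project not in packaged:
        packaged.append(project)
    return packaged

# illustrative dependency graph, not the real ATLAS project structure
deps = {"Offline": ["LCGCMT", "Gaudi"], "Gaudi": ["LCGCMT"]}
print(package_recursively("Offline", deps))  # -> ['LCGCMT', 'Gaudi', 'Offline']
```

Ordering dependencies before their dependents is what lets a distribution kit be built bottom-up in a single pass.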

  14. ATLAS software packaging

    International Nuclear Information System (INIS)

    Software packaging is an indispensable part of the build process and a prerequisite for deployment. The full ATLAS software stack consists of the TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present PackDist, a package of tools developed and used to package all this software except the TDAQ project. PackDist is based on and driven by CMT, the ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit is the CMT project. Each CMT project is packaged as several packages—platform-dependent (one per available platform), source code excluding header files, other platform-independent files, documentation, and debug information packages (the last two built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, the distribution kit, also includes configuration packages and contains some 120 packages per platform. Also packaged are physics analysis projects (currently 6) used by particular physics groups on top of the full release. The tools provide an installation test for the full distribution kit. Packaging is done in two formats, for use with the Pacman and RPM package managers. The tools are functional on the platforms supported by ATLAS—GNU/Linux and Mac OS X. The packaged software is used for software deployment on all ATLAS computing resources, from the detector and trigger computing farms, collaboration laboratories' computing centres, grid sites, and physicist laptops to CERN VMFS, and covers the use cases of running all applications as well as software development.

  15. SOFTWARE MEASUREMENTS AND METRICS: ROLE IN EFFECTIVE SOFTWARE TESTING

    OpenAIRE

    Sheikh Umar Farooq; S. M. K. Quadri,; Nesar Ahmad

    2011-01-01

    Measurement has always been fundamental to the progress of any engineering discipline, and software testing is no exception. Software metrics have been used in making quantitative/qualitative decisions as well as in risk assessment and reduction in software projects. In this paper we discuss software measurement and metrics and their fundamental role in the software development life cycle. Focusing on software test metrics, the paper discusses their key role in the software testing process and also cl...

  16. Software business models and contexts for software innovation: key areas software business research

    OpenAIRE

    Käkölä, Timo

    2003-01-01

    This paper examines business, design, and product development aspects of software business models. Contexts of small and large companies for creating software innovations are also analysed. Finally, software business research is called for and an agenda for software business research is presented to better understand the dynamics of the software industry and help create and manage successful software-intensive ventures.

  17. Designing Scientific Software for Heterogeneous Computing

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig

    , algorithms and data structures must be designed to utilize the underlying parallel architecture. The architectural changes in hardware design within the last decade, from single to multi and many-core architectures, require software developers to identify and properly implement methods that both exploit...... concurrency and maintain numerical efficiency. Graphical Processing Units (GPUs) have proven to be very effective units for computing the solution of scientific problems described by partial differential equations (PDEs). GPUs have today become standard devices in portable, desktop, and supercomputers, which...... makes parallel software design applicable, but also a challenge for scientific software developers at all levels. We have developed a generic C++ library for fast prototyping of large-scale PDEs solvers based on flexible-order finite difference approximations on structured regular grids. The library...
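
The finite difference approximations on regular grids mentioned above can be illustrated with a minimal second-order central-difference stencil (a plain CPU sketch in Python, not the thesis's GPU-accelerated C++ library):

```python
# Second-order central difference on a regular 1D grid -- a minimal CPU
# sketch of the stencil computation that GPU PDE libraries accelerate.
import math

def second_derivative(f, h):
    """Approximate f'' at interior points of a regular grid with spacing h."""
    return [(f[i - 1] - 2 * f[i] + f[i + 1]) / h**2
            for i in range(1, len(f) - 1)]

n = 100
h = 2 * math.pi / n
grid = [math.sin(i * h) for i in range(n + 1)]
d2 = second_derivative(grid, h)
# d2 should approximate -sin(x) at the interior points (error is O(h^2))
err = max(abs(d2[i - 1] + math.sin(i * h)) for i in range(1, n))
print(err < 1e-2)  # -> True
```

Because each grid point's update reads only its immediate neighbours, the same stencil maps naturally onto the massively parallel execution model of a GPU.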

  18. Software abstractions logic, language, and analysis

    CERN Document Server

    Jackson, Daniel

    2011-01-01

    In Software Abstractions Daniel Jackson introduces an approach to software design that draws on traditional formal methods but exploits automated tools to find flaws as early as possible. This approach--which Jackson calls "lightweight formal methods" or "agile modeling"--takes from formal specification the idea of a precise and expressive notation based on a tiny core of simple and robust concepts but replaces conventional analysis based on theorem proving with a fully automated analysis that gives designers immediate feedback. Jackson has developed Alloy, a language that captures the essence of software abstractions simply and succinctly, using a minimal toolkit of mathematical notions. This revised edition updates the text, examples, and appendixes to be fully compatible with the latest version of Alloy (Alloy 4). The designer can use automated analysis not only to correct errors but also to make models that are more precise and elegant. This approach, Jackson says, can rescue designers from "the tarpit of...

  19. Who Owns Computer Software?

    Science.gov (United States)

    Branscomb, Anne Wells

    1995-01-01

    Discusses the protection of intellectual property as it applies to computer software and its impact on private enterprise and the public good. Highlights include the role of patents, copyrights, and trade secrets; some court cases; and recommendations for alternatives to the existing legal framework for protecting computer software. (KRN)

  20. Software Marketing Considerations.

    Science.gov (United States)

    Fuchs, Victor E.

    Seven factors that currently affect the potential for marketing and publishing computer software for education are discussed: (1) computers as an in-place technology in education, (2) marketing and distribution patterns for software, (3) consumer demand, (4) quality, (5) timelessness, (6) basic skills, and (7) the future. The proliferation of…

  1. Software measurement guidebook

    Science.gov (United States)

    Bassman, Mitchell J.; Mcgarry, Frank; Pajerski, Rose

    1994-01-01

    This Software Measurement Guidebook presents information on the purpose and importance of measurement. It discusses the specific procedures and activities of a measurement program and the roles of the people involved. The guidebook also clarifies the roles that measurement can and must play in the goal of continual, sustained improvement for all software production and maintenance efforts.

  2. Measuring software technology

    Science.gov (United States)

    Agresti, W. W.; Card, D. N.; Church, V. E.; Page, G.; Mcgarry, F. E.

    1983-01-01

    Results are reported from a series of investigations into the effectiveness of various methods and tools used in a software production environment. The basis for the analysis is a project data base, built through extensive data collection and process instrumentation. The project profiles become an organizational memory, serving as a reference point for an active program of measurement and experimentation on software technology.

  3. Cactus: Software Priorities

    Science.gov (United States)

    Hyde, Hartley

    2009-01-01

    The early eighties saw a period of rapid change in computing and teachers lost control of how they used computers in their classrooms. Software companies produced computer tools that looked so good that teachers forgot about writing their own classroom materials and happily purchased software--that offered much more than teachers needed--from…

  4. Fastbus software progress

    International Nuclear Information System (INIS)

    The current status of the Fastbus software development program of the Fastbus Software Working Group is reported, and future plans are discussed. A package of Fastbus interface subroutines has been prepared as a proposed standard, language support for diagnostics and bench testing has been developed, and new documentation to help users find these resources and use them effectively is being written

  5. Measuring software design

    Science.gov (United States)

    1986-01-01

    An extensive series of studies of software design measures conducted by the Software Engineering Laboratory is described. Included are the objectives and results of the studies, the method used to perform the studies, and the problems encountered. The document should be useful to researchers planning similar studies as well as to managers and designers concerned with applying quantitative design measures.

  6. Technology 84: software

    Energy Technology Data Exchange (ETDEWEB)

    Wallich, P.

    1984-01-01

    Progress is reported with regard to knowledge systems-artificial intelligence software capable of giving expert advice or analyzing complex information-and their major tasks and applications. A standard military language, ADA, is also discussed along with efforts to standardize software environments.

  7. Cartographic applications software

    Science.gov (United States)

    U.S. Geological Survey

    1992-01-01

    The Office of the Assistant Division Chief for Research, National Mapping Division, develops computer software for the solution of geometronic problems in the fields of surveying, geodesy, remote sensing, and photogrammetry. Software that has been developed using public funds is available on request for a nominal charge to recover the cost of duplication.

  8. Marketing Mix del Software.

    Directory of Open Access Journals (Sweden)

    Yudith del Carmen Rodríguez Pérez

    2006-03-01

    Therefore, this paper defines the concept of a software product, characterizes it, and presents its quality attributes. It also addresses the marketing mix for software, which must differ from that of other products if software is to succeed in the market.

  9. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu;

    2004-01-01

    This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), an XML-based metaprogramming technique. As software evolves, a large number of variants may arise, especially when such evolution spans multiple platforms, as shown in our...

  10. Software-Wiederverwendung

    OpenAIRE

    Ludewig, Jochen

    1993-01-01

    Reused software is cheaper than new development and, as a rule, contains fewer defects. It also contributes to standardization. These and other advantages argue for software reuse, but its promotion also faces obstacles.

  11. Threats to Bitcoin Software

    OpenAIRE

    Kateraas, Christian H

    2014-01-01

    This thesis collects and analyses threat models for the Bitcoin ecosystem and its software, then creates misuse cases, attack trees, and sequence diagrams of the threats. A malicious client is built from the gathered threat models; once its development is complete, the client is tested and its performance evaluated. From this, the security of the Bitcoin software is assessed.

  12. Emergency core cooling device

    International Nuclear Information System (INIS)

    The present invention provides an emergency core cooling device that cools the reactor core of a BWR type reactor effectively and reliably by core flooding, without using a reactor core spray device. The emergency core cooling device comprises a high pressure core water injection system as an emergency core cooling system (ECCS) for cooling the reactor core upon a loss-of-coolant accident (LOCA). By means of the high pressure core water injection system, water is injected from a condensate storage vessel or a suppression pool into the reactor core shroud upon LOCA. Accordingly, the reactor core is cooled effectively by core flooding. With this device, cooling water can be injected into the reactor core shroud by the high pressure core injection system upon a LOCA in which coolant is discharged from outside the reactor core shroud. On the other hand, upon a LOCA in which coolant is discharged from inside the reactor core shroud, cooling water can be supplied to the reactor core by a reactor-isolation cooling system, which injects water outside the reactor core, or by a low pressure water injection system. (I.S.)

  13. NASA Software Safety Standard

    Science.gov (United States)

    Rosenberg, Linda

    1997-01-01

    If software is a critical element in a safety critical system, it is imperative to implement a systematic approach to software safety as an integral part of the overall system safety programs. The NASA-STD-8719.13A, "NASA Software Safety Standard", describes the activities necessary to ensure that safety is designed into software that is acquired or developed by NASA, and that safety is maintained throughout the software life cycle. A PDF version, is available on the WWW from Lewis. A Guidebook that will assist in the implementation of the requirements in the Safety Standard is under development at the Lewis Research Center (LeRC). After completion, it will also be available on the WWW from Lewis.

  14. Revisiting software ecosystems research

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2016-01-01

    The term ‘software ecosystems’ is argued to have first appeared as a concept more than 10 years ago, and software ecosystem research started to take off in 2010. We conduct a systematic literature study, based on the most extensive literature review in the field to date, with two primary aims: (a) to provide...... an updated overview of the field and (b) to document evolution in the field. In total, we analyze 231 papers from 2007 until 2014 and provide an overview of the research in software ecosystems. Our analysis reveals a field that is rapidly growing both in volume and empirical focus while becoming more...... from evolving. We propose means for future research and the community to address them. Finally, our analysis shapes the view of the field having evolved outside the existing definitions of software ecosystems and thus propose the update of the definition of software ecosystems....

  15. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is through the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third part discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  16. Software Requirements Management

    Directory of Open Access Journals (Sweden)

    Ali Altalbe

    2015-04-01

    Requirements are defined as the desired set of characteristics of a product or a service. In the world of software development, it is estimated that more than half of project failures are attributed to poor requirements management. This means that although the software functions correctly, it is not what the client requested. Modern software requirements management methodologies are available to reduce the occurrence of such incidents. This paper reviews the available literature in the area while tabulating possible methods of managing requirements. It also highlights the benefits of following a proper guideline for the requirements management task. With the introduction of specific software tools for the requirements management task, better software products are now being developed with fewer resources.

  17. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  18. DIVERSIFICATION IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    Er.Kirtesh Jailia,

    2010-06-01

    In this paper we examine the factors that have promoted the diversification of software process models. The intention is to understand more clearly the problem-solving process in software engineering and to find an efficient way to manage risk. A review of software process modeling is given first, followed by a discussion of process evaluation techniques. A taxonomy for categorizing process models, based on establishing decision criteria, is identified that can guide selection of the appropriate model from a set of alternatives on the basis of model characteristics and software project needs. We propose a model in this paper for dealing with diversification in software engineering.

  19. Software licenses: Stay honest!

    CERN Multimedia

    Computer Security Team

    2012-01-01

    Do you recall our article about copyright violation in the last issue of the CERN Bulletin, “Music, videos and the risk for CERN”? Now let’s be more precise. “Violating copyright” not only means the illegal download of music and videos, it also applies to software packages and applications.   Users must respect proprietary rights in compliance with the CERN Computing Rules (OC5). Not having legitimately obtained a program or the required licenses to run that software is not a minor offense. It violates CERN rules and puts the Organization at risk! Vendors deserve credit and compensation. Therefore, make sure that you have the right to use their software. In other words, you have bought the software via legitimate channels and use a valid and honestly obtained license. This also applies to “Shareware” and software under open licenses, which might also come with a cost. Usually, only “Freeware” is complete...

  20. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  1. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence;

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution....... The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we...... discuss one particular framework named Tran SAT, which addresses the above problems of software architecture evolution. Tran SAT provides a new element in the software architecture descriptions language, called an architectural aspect, for describing new concerns and their integration into an existing...

  2. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what...... of professionalism and systematization of software development to draw a map of strengths and weaknesses. To this end, we conducted as a first step an exploratory survey in Germany, presented in this paper. In this survey, we focused on the perceived importance of the two general software engineering process areas...... project- and quality management and their implementation in practice. So far, our results suggest that the necessity for a systematic software development is well recognized, while software development still follows an ad-hoc rather than a systematized style. Our results provide initial findings, which we...

  3. Many-core on the Grid: From exploration to production

    International Nuclear Information System (INIS)

    High Energy Physics experiments have successfully demonstrated that many-core devices such as GPUs can be used to accelerate critical algorithms in their software. There is now increasing community interest for many-core devices to be made available on the LHC Computing Grid infrastructure. Despite anticipated usage there is no standard method available to run many-core applications in distributed computing environments and before many-core resources are made available on the Grid a number of operational issues such as job scheduling and resource discovery will need to be addressed. The key challenges for Grid-enabling many-core devices will be discussed.

  4. Scientific Software Component Technology

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, S.; Dykman, N.; Kumfert, G.; Smolinski, B.

    2000-02-16

    We are developing new software component technology for high-performance parallel scientific computing to address issues of complexity, re-use, and interoperability for laboratory software. Component technology enables cross-project code re-use, reduces software development costs, and provides additional simulation capabilities for massively parallel laboratory application codes. The success of our approach will be measured by its impact on DOE mathematical and scientific software efforts. Thus, we are collaborating closely with library developers and application scientists in the Common Component Architecture forum, the Equation Solver Interface forum, and other DOE mathematical software groups to gather requirements, write and adopt a variety of design specifications, and develop demonstration projects to validate our approach. Numerical simulation is essential to the science mission at the laboratory. However, it is becoming increasingly difficult to manage the complexity of modern simulation software. Computational scientists develop complex, three-dimensional, massively parallel, full-physics simulations that require the integration of diverse software packages written by outside development teams. Currently, the integration of a new software package, such as a new linear solver library, can require several months of effort. Current industry component technologies such as CORBA, JavaBeans, and COM have all been used successfully in the business domain to reduce software development costs and increase software quality. However, these existing industry component infrastructures will not scale to support massively parallel applications in science and engineering. In particular, they do not address issues related to high-performance parallel computing on ASCI-class machines, such as fast in-process connections between components, language interoperability for scientific languages such as Fortran, parallel data redistribution between components, and massively

  5. Power laws in software systems

    OpenAIRE

    Tonelli, Roberto

    2012-01-01

    The main topic of my PhD has been the study of power laws in software systems from the perspective of describing software quality. My PhD research contributes to a recent stream of studies in software engineering, where the investigation of power laws in software systems has become widely popular, since they appear in an incredible variety of different software quantities and properties, like, for example, software metrics, software faults, refactoring, Java byte-code,...
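
    As a concrete illustration of the kind of analysis involved, the exponent of a power-law-distributed software metric can be estimated with the Hill maximum-likelihood estimator; the file sizes below are invented for demonstration.

```python
# Estimating the exponent of a power law p(x) ~ x^(-alpha), x >= xmin,
# for a software metric such as file size. Data are made up for demo.
import math

sizes = [10, 12, 15, 20, 25, 40, 60, 100, 250, 800]  # hypothetical LOC per file

def powerlaw_alpha(data, xmin):
    """Hill maximum-likelihood estimator for a continuous power law."""
    tail = [x for x in data if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

alpha = powerlaw_alpha(sizes, xmin=10)
print(round(alpha, 2))  # 1.65
```

    A heavy-tailed exponent like this (small alpha) is the typical signature reported for software metrics such as file and class sizes.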

  6. Software Developers’ Perceptions of Productivity

    OpenAIRE

    Meyer, André; Fritz, Thomas; Murphy, Gail C.; Zimmermann, Thomas

    2014-01-01

    The better the software development community becomes at creating software, the more software the world seems to demand. Although there is a large body of research about measuring and investigating productivity from an organizational point of view, there is a paucity of research about how software developers, those at the front-line of software construction, think about, assess and try to improve their productivity. To investigate software developers' perceptions of software development produ...

  7. Design and performance test of spacecraft test and operation software

    Science.gov (United States)

    Wang, Guohua; Cui, Yan; Wang, Shuo; Meng, Xiaofeng

    2011-06-01

    Main test processor (MTP) software is the key element of the Electrical Ground Support Equipment (EGSE) for spacecraft test and operation, used at the Chinese Academy of Space Technology (CAST) for years without innovation. With the increasing demand for more efficient and agile MTP software, a new MTP software was developed. It adopts a layered, plug-in-based software architecture whose core runtime server provides message queue management, shared memory management, and process management services, forming the framework for a configurable, open-architecture system. To investigate the MTP software's performance, test cases covering network response time, test sequence management capability, and data-processing capability are described in detail. Test results show that the new MTP software is general-purpose and outperforms the legacy system.
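
    The layered, plug-in-based core described above can be sketched as a message-queue dispatcher; all names here are illustrative, not the actual MTP interfaces.

```python
# Minimal sketch of a plug-in, message-queue-based core server in the
# spirit of the MTP architecture described above. Names are invented.
import queue

class CoreServer:
    """Routes messages from a central queue to registered plug-ins."""
    def __init__(self):
        self.queue = queue.Queue()
        self.plugins = {}          # topic -> handler callable

    def register(self, topic, handler):
        self.plugins[topic] = handler

    def post(self, topic, payload):
        self.queue.put((topic, payload))

    def run_once(self):
        topic, payload = self.queue.get_nowait()
        return self.plugins[topic](payload)

server = CoreServer()
server.register("telemetry", lambda pkt: f"decoded:{pkt}")
server.post("telemetry", "0xCAFE")
print(server.run_once())  # decoded:0xCAFE
```

    Because plug-ins register against topics rather than each other, new test equipment can be supported by adding a handler without touching the core.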

  8. Software platform virtualization in chemistry research and university teaching

    Directory of Open Access Journals (Sweden)

    Kind Tobias

    2009-11-01

    Abstract Background Modern chemistry laboratories operate with a wide range of software applications under different operating systems, such as Windows, Linux or Mac OS X. Instead of installing software on different computers, it is possible to install those applications on a single computer using virtual machine software. Software platform virtualization allows a single host operating system to execute multiple guest operating systems on the same computer. We apply and discuss the use of virtual machines in chemistry research and teaching laboratories. Results Virtual machines are commonly used for cheminformatics software development and testing. Benchmarking multiple chemistry software packages, we have confirmed that the computational speed penalty for using virtual machines is low, around 5% to 10%. Software virtualization in a teaching environment allows faster deployment and easy use of commercial and open source software in hands-on computer teaching labs. Conclusion Software virtualization in chemistry, mass spectrometry and cheminformatics is needed for software testing and for developing software for different operating systems. In order to obtain maximum performance, the virtualization software should be multi-core enabled and allow the use of multiprocessor configurations in the virtual machine environment. Server consolidation, by running multiple tasks and operating systems on a single physical machine, can lead to lower maintenance and hardware costs, especially in small research labs. The use of virtual machines can prevent software virus infections and security breaches when used as a sandbox system for internet access and software testing. Complex software setups can be created with virtual machines and are easily deployed later to multiple computers for hands-on teaching classes.
We discuss the popularity of bioinformatics compared to cheminformatics as well as the missing cheminformatics education at universities worldwide.
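
    The reported speed penalty is a simple relative overhead; a minimal helper, with hypothetical timings, makes the arithmetic explicit.

```python
# Expressing the virtualization penalty as relative overhead.
# The timings below are hypothetical, chosen only to illustrate the math.

def overhead_pct(native_seconds, vm_seconds):
    """Relative slowdown of a VM run versus a native run, in percent."""
    return (vm_seconds - native_seconds) / native_seconds * 100.0

# e.g. a benchmark that takes 120 s natively and 129 s inside the VM
print(round(overhead_pct(120.0, 129.0), 1))  # 7.5
```

    A 7.5% figure of this kind falls inside the 5-10% band the authors report across their chemistry software benchmarks.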

  9. Software Platform Architecture for Laboratory Workstation Software

    OpenAIRE

    Tuhkanen, Tomi

    2013-01-01

    The aim of the thesis was to design an architecture for a workstation software platform to control laboratory instruments and to decide on the technologies to be used with the platform. The platform needed to support multiple instruments with different functionalities, as well as multiple simultaneous instruments. The application based on the architecture needed to function as a stand-alone application and as an automation service. The platform was to be developed with Microsoft .NET. ...

  10. Software Formal Inspections Guidebook

    Science.gov (United States)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  11. LDUA software custodian's notebook

    International Nuclear Information System (INIS)

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements.

  12. Software packager user's guide

    Science.gov (United States)

    Callahan, John R.

    1995-01-01

    Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools are needed to integrate a set of components, such as remote procedure call stubs, their use is implied by the packager automatically and stub generation tools are invoked in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
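
    A toy version of the packager idea, turning a declarative component description into a makefile; the specification format below is invented and far simpler than the real package specification language.

```python
# Toy packager: generate a makefile from a declarative description of a
# system's components. The (target, sources, command) tuple format is an
# invented stand-in for the real package specification language.

def make_makefile(components):
    """components: list of (target, sources, command) tuples."""
    lines = ["all: " + " ".join(t for t, _, _ in components), ""]
    for target, sources, command in components:
        lines.append(f"{target}: {' '.join(sources)}")
        lines.append(f"\t{command}")
        lines.append("")
    return "\n".join(lines)

spec = [
    ("app", ["main.o", "solver.o"], "cc -o app main.o solver.o"),
    ("main.o", ["main.c"], "cc -c main.c"),
    ("solver.o", ["solver.f"], "f77 -c solver.f"),
]
print(make_makefile(spec))
```

    The point of the real tool is exactly this separation: the programmer writes the declarative spec, and platform-specific build details (compilers, stub generators) are implied into the generated makefile.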

  13. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  14. Advanced fingerprint verification software

    Science.gov (United States)

    Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.

    2016-05-01

    We have developed a fingerprint software package that can be used in a wide range of applications, from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door-locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine [1]. Solid and robust performance is achieved in the presence of misplaced and low-quality fingerprints.

  15. Guide to software export

    CERN Document Server

    Philips, Roger A

    2014-01-01

    An ideal reference source for CEOs, marketing and sales managers, sales consultants, and students of international marketing, Guide to Software Export provides a step-by-step approach to initiating or expanding international software sales. It teaches you how to examine critically your candidate product for exportability; how to find distributors, agents, and resellers abroad; how to identify the best distribution structure for export; and much, much more! Not content with providing just the guidelines for setting up, expanding, and managing your international sales channels, Guide to Software

  16. CNEOST Control Software System

    Science.gov (United States)

    Wang, Xin; Zhao, Hai-bin; Xia, Yan; Lu, Hao; Li, Bin

    2016-01-01

    In 2013, CNEOST (China Near Earth Object Survey Telescope) adapted its hardware system for a new CCD camera. Based on the new system architecture, the control software was re-designed and implemented. The software system adopts a messaging mechanism based on the WebSocket protocol and possesses good flexibility and extensibility. The user interface, based on responsive web design, enables remote observation from both desktop and mobile devices. The stable operation of the software system has greatly enhanced observation efficiency while reducing complexity, and has also served as a successful trial for the future design of telescope control and telescope-cloud systems.
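
    A WebSocket-based messaging mechanism of this kind typically exchanges JSON envelopes; the sketch below shows one plausible message format, with field names that are assumptions rather than the actual CNEOST protocol.

```python
# Sketch of a JSON message envelope for WebSocket-style telescope control.
# The "seq"/"target"/"action"/"params" field names are assumptions for
# illustration, not the actual CNEOST protocol.
import json

def make_command(seq, target, action, **params):
    return json.dumps({"seq": seq, "target": target,
                       "action": action, "params": params})

def parse_command(raw):
    msg = json.loads(raw)
    return msg["target"], msg["action"], msg["params"]

raw = make_command(1, "camera", "expose", seconds=30)
print(parse_command(raw))  # ('camera', 'expose', {'seconds': 30})
```

    Because the envelope is plain JSON over WebSocket, the same messages are equally easy to produce and consume from a desktop client or a mobile browser, which is what makes the responsive-web remote-observation interface practical.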

  17. Software cost estimation

    OpenAIRE

    Heemstra, F.J.

    1992-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be estimated? (4) What can software project management expect from SCE models, how accurate are estimations which are made using these kind of models, and what are the pros and cons of cost estimatio...

  18. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    SOFTWARE, LIKE ALL industry products, is the result of complex multinational supply chains with many partners from concept to development to production and maintenance. Global software engineering (GSE), IT outsourcing, and business process outsourcing during the past decade have shown growth...... rates of 10 to 20 percent per year. This instalment of Practitioner’s Digest summarizes experiences and guidance from industry to facilitate knowledge and technology transfer for GSE. It’s based on industry feedback from the annual IEEE International Conference on Global Software Engineering, which had...

  19. Software component quality (Calidad de componentes software)

    OpenAIRE

    Carvallo Vega, Juan Pablo; Franch Gutiérrez, Javier; Quer Bosor, Maria Carme

    2010-01-01

    In recent years, a growing tendency can be observed for organizations to develop their software systems by combining components, rather than developing those systems from scratch. This tendency is due to several factors, among which the following stand out: the need for organizations to reduce the cost and time devoted to systems development; the growth of the software component market; and the reduction of the distance bet...

  20. Pricing of Software Services.

    OpenAIRE

    R. Bala; Carr, S. C.

    2005-01-01

    We analyze and compare fixed-fee and usage-fee software pricing schemes - in fixed-fee pricing, all users pay the same price; in usage-fee pricing, the users’ fees depend on the amount that they use the software (e.g., the user of an online-database service might be charged for each data query). We employ a two-dimensional model of customer heterogeneity - specifically, we assume that customers vary in the amount that they will use the software (usage heterogeneity) and also in their per-use ...

  1. Orbit Software Suite

    Science.gov (United States)

    Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim

    2012-01-01

    Orbit Software Suite is used to support a variety of NASA/DM (Dependable Multiprocessor) mission planning and analysis activities on the IPS (Intrusion Prevention System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations, and is used to perform mission design and analysis tasks corresponding to trajectory/launch window, rendezvous, and proximity operations flight segments. A list of tools in Orbit Software Suite represents tool versions established during/after the Equipment Rehost-3 Project.

  2. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by adapting standard software packages for manufacturing control. After investigation and testing of two commercial software packages: "In...

  3. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    In order to achieve high quality of the SMART MMIS software, a well-constructed software testing concept is required. This paper establishes the software testing concept to be applied to the SMART MMIS software, in terms of testing organization, documentation, procedures, and methods. The software testing methods are classified into static source code analysis and dynamic testing. The dynamic testing methods are discussed from two aspects: white-box and black-box testing. As the software testing concept introduced in this paper is applied to the SMART MMIS software, high-quality software will be produced. In the future, software failure data will be collected through the construction of a SMART MMIS prototyping facility to which the software testing concept of this paper is applied.
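
    The white-box/black-box distinction can be illustrated with a toy function (not taken from the SMART MMIS software): black-box cases come from the specification alone, while a white-box case is added to cover an otherwise unexecuted branch.

```python
# Illustration of black-box vs. white-box testing on a toy function.
# This example is invented; it is not from the SMART MMIS software.

def clamp(value, low, high):
    if value < low:       # branch 1
        return low
    if value > high:      # branch 2
        return high
    return value          # branch 3

# Black-box tests: derived from the specification alone
assert clamp(5, 0, 10) == 5    # in-range value passes through
assert clamp(-1, 0, 10) == 0   # below-range value is raised to low

# White-box test: added after inspecting the code, specifically to
# cover branch 2, which the black-box cases above never execute
assert clamp(99, 0, 10) == 10
```

    Static source code analysis, the other method class named above, would instead examine the code without executing it at all, e.g. flagging unreachable branches or uninitialized variables.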

  4. Software Patent and its Impact on Software Innovation in Japan

    OpenAIRE

    Motohashi, Kazuyuki

    2009-01-01

    In Japan, the software patent system has been reformed and now software has become a patentable subject matter. In this paper, this pro-patent shift on software is surveyed and its impact on software innovation is analyzed. Before the 1990's, inventions related to software could not be patented by themselves, but they could be applied when combined with hardware related inventions. Therefore, integrated electronics firms used to be the major software patent applicants. However, during the per...

  5. Software Project Documentation - An Essence of Software Development

    OpenAIRE

    Vikas S. Chomal; Dr. Jatinderkumar R. Saini

    2015-01-01

    Software documentation is a critical attribute of both software projects and software engineering in general. Documentation serves as a medium of communication among the parties involved during software development, as well as with those who will use the software. It consists of written particulars concerning software specifications: what the software does, how it accomplishes the specified details, and even how to use it. In this paper, we tried to focus on the role of do...

  6. A Simple Complexity Measurement for Software Verification and Software Testing

    OpenAIRE

    Cheng, Zheng; Monahan, Rosemary; Power, James F.

    2012-01-01

    In this paper, we used a simple metric (i.e. Lines of Code) to measure the complexity involved in software verification and software testing. The goal, then, is to argue for software verification over software testing, and to motivate a discussion of how to reduce the complexity involved in software verification. We propose to reduce this complexity by translating the software to a simple intermediate representation which can be verified using an efficient verifier, such as Boog...
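
    A minimal Lines-of-Code counter in the spirit of the metric used above; whether blanks and comments count is a convention, and the one chosen here (skip both) is just one possibility.

```python
# Minimal Lines-of-Code counter. Skipping blank and comment-only lines
# is one common convention; the paper's exact counting rules may differ.

def loc(source):
    count = 0
    for line in source.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("#"):
            count += 1
    return count

sample = """# demo module
def add(a, b):

    return a + b
"""
print(loc(sample))  # 2
```

    Despite its simplicity, such a count gives a comparable size measure for the verification annotations versus the test suites that the paper weighs against each other.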

  7. SPC for Software Reliability-Imperfect Software Debugging Model

    OpenAIRE

    R Satya Prasad; Supriya, N.; G. Krishna Mohan

    2011-01-01

    The software reliability process can be monitored efficiently by using Statistical Process Control (SPC). It assists the software development team in identifying failures and the actions to be taken during the software failure process, and hence assures better software reliability. In this paper, we consider an NHPP (Non-Homogeneous Poisson Process) based software reliability growth model that incorporates the imperfect debugging problem. The proposed model utilizes the failure data collected from software de...
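
    The flavor of SPC monitoring can be sketched with classic 3-sigma control limits on inter-failure times; note that the paper derives its limits from the NHPP model, whereas this sketch uses plain mean-and-sigma limits on invented data.

```python
# Classic 3-sigma SPC control limits applied to inter-failure times.
# The data are invented; the paper's limits come from its NHPP model,
# not from this simple mean/sigma computation.

def control_limits(samples):
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    sigma = var ** 0.5
    return mean - 3 * sigma, mean, mean + 3 * sigma

times = [30, 45, 50, 40, 35, 60, 55, 45]  # hours between failures
lcl, center, ucl = control_limits(times)
out_of_control = [t for t in times if not (lcl <= t <= ucl)]
print(out_of_control)  # []
```

    A point falling outside the limits would signal the development team that the failure process has shifted and corrective action is needed, which is exactly the monitoring role SPC plays in the paper.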

  8. Review of the Unified Software Development Process (统一软件开发过程述评)

    Institute of Scientific and Technical Information of China (English)

    麻志毅

    2002-01-01

    The Unified Software Development Process (USDP), published by a few masters in the software engineering field and Rational Software Corporation, and supported by OMG, is attracting wide attention in the area of software engineering. After summarizing USDP, the paper introduces the phases and the core workflows of USDP in detail, then discusses the positive influence of USDP on the software development process, and points out some possible problems.

  9. Spreadsheet Auditing Software

    CERN Document Server

    Nixon, David

    2010-01-01

    It is now widely accepted that errors in spreadsheets are both common and potentially dangerous. Further research has taken place to investigate how frequently these errors occur, what impact they have, how the risk of spreadsheet errors can be reduced by following spreadsheet design guidelines and methodologies, and how effective auditing of a spreadsheet is in the detection of these errors. However, little research exists to establish the usefulness of software tools in the auditing of spreadsheets. This paper documents and tests office software tools designed to assist in the audit of spreadsheets. The test was designed to identify the success of software tools in detecting different types of errors, to identify how the software tools assist the auditor and to determine the usefulness of the tools.
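
    One common error class such audit tools detect is a formula referencing an empty cell; a toy audit over a dict-based sheet model (invented here for illustration) might look like this.

```python
# Toy spreadsheet audit: flag formula cells that reference empty cells,
# one of the error classes auditing tools look for. The dict-based sheet
# model and "=..." formula convention are invented for illustration.
import re

sheet = {
    "A1": 10,
    "A2": 20,
    "B1": "=A1+A2",
    "B2": "=A3*2",   # A3 is empty: a likely error
}

def audit(sheet):
    findings = []
    for cell, value in sheet.items():
        if isinstance(value, str) and value.startswith("="):
            for ref in re.findall(r"[A-Z]+[0-9]+", value):
                if ref not in sheet:
                    findings.append((cell, ref))
    return findings

print(audit(sheet))  # [('B2', 'A3')]
```

    Real auditing tools layer many such checks (inconsistent formulas in a range, hard-coded constants inside formulas, circular references) and present the findings visually to the auditor.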

  10. The Software Patent Debate

    OpenAIRE

    Guadamuz, Andres

    2006-01-01

    The paper discusses the proposed European Directive on the Patentability of Computer-Implemented Inventions and the subsequent debate that followed. Do software patents, as argued by policymakers, result in increased innovation?

  11. Software for nuclear spectrometry

    International Nuclear Information System (INIS)

    The Advisory Group Meeting (AGM) on Software for Nuclear Spectrometry was dedicated to reviewing the present status of software for nuclear spectrometry and advising on future activities in this field. Because similar AGMs and consultants' meetings had been held in the past, and in an effort to be more focused, this AGM was devoted to the specific field of software for gamma-ray spectrometry. Nevertheless, many of the issues discussed and the recommendations made are of general concern for any software for nuclear spectrometry. The report is organized in sections. The 'Summary' gives the conclusions and recommendations adopted at the AGM. These conclusions and recommendations resulted from the discussions held during and after presentations of the scientific and technical papers, which are reported here in their integral form in the following sections.

  12. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management of...... distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand...... the management of distributed software projects, but also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques, into a framework for managing risks in distributed contexts. This framework was developed...

  13. eSoftwareList

    Data.gov (United States)

    US Agency for International Development — USAID Software Database reporting tool created in Oracle Application Express (APEX). This version provides read only access to a database view of the JIRA SAR...

  14. Evaluation of Visualization Software

    Science.gov (United States)

    Globus, Al; Uselton, Sam

    1995-01-01

    Visualization software is widely used in scientific and engineering research. But computed visualizations can be very misleading, and the errors are easy to miss. We feel that the software producing the visualizations must be thoroughly evaluated and the evaluation process as well as the results must be made available. Testing and evaluation of visualization software is not a trivial problem. Several methods used in testing other software are helpful, but these methods are (apparently) often not used. When they are used, the description and results are generally not available to the end user. Additional evaluation methods specific to visualization must also be developed. We present several useful approaches to evaluation, ranging from numerical analysis of mathematical portions of algorithms to measurement of human performance while using visualization systems. Along with this brief survey, we present arguments for the importance of evaluations and discussions of appropriate use of some methods.

  15. Sharing control system software

    International Nuclear Information System (INIS)

    Building a custom accelerator control system requires effort in the range of 30-100 person-years. This represents a significant investment of time, effort, and risk, as well as challenges for management. Even when the system is successful, the software has not yet been applied to the particular project; no custom control algorithms, either engineering or physics-based, have been implemented; and the system has not been documented for long-term maintenance and use. This paper reviews the requirements for sharing software between accelerator control system projects. It also reviews the three mechanisms by which control system software has been shared in the past and is being shared now, as well as some of the experiences. After reviewing the mechanisms and experiences, one can conclude there is no one best solution. The right software sharing mechanism depends upon the needs of the client site, the client resources available, and the services the provider can give

  16. Advanced Software Protection Now

    CERN Document Server

    Bendersky, Diego; Notarfrancesco, Luciano; Sarraute, Carlos; Waissbein, Ariel

    2010-01-01

    Software digital rights management is a pressing need for the software development industry, yet no practical solutions have been acclaimed as successful by the industry. We introduce a novel software-protection method, fully implemented with today's technologies, that provides traitor tracing and license enforcement and requires no additional hardware or inter-connectivity. Our work benefits from the use of secure triggers, a cryptographic primitive that is secure assuming the existence of an IND-CPA secure block cipher. Using our framework, developers may insert license checks and fingerprints, and obfuscate the code using secure triggers. As a result, this raises the cost for software analysis tools to detect and modify protection mechanisms, thus raising the complexity of cracking the system.
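
    The secure-trigger idea, keeping a code path encrypted until the triggering input arrives, can be sketched as follows. Real schemes rely on an IND-CPA secure block cipher; the SHA-256-based XOR keystream below is a stand-in for illustration only and is not secure.

```python
# Sketch of a "secure trigger": a payload stays encrypted and is only
# recoverable when the triggering input arrives. WARNING: the XOR
# keystream below is an insecure stand-in for a real IND-CPA cipher.
import hashlib

def _keystream(secret, n):
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(secret + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def protect(payload, trigger_input):
    key = hashlib.sha256(trigger_input).digest()
    return bytes(a ^ b for a, b in zip(payload, _keystream(key, len(payload))))

def release(blob, candidate_input):
    return protect(blob, candidate_input)  # XOR is its own inverse

secret_code = b"grant_license()"
blob = protect(secret_code, b"valid-serial-123")
print(release(blob, b"valid-serial-123"))             # b'grant_license()'
print(release(blob, b"wrong-serial") == secret_code)  # False
```

    The protected binary never contains the key: only the triggering input (e.g. a valid license serial) reconstructs it, so static analysis of the binary alone cannot reveal the gated code path.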

  17. Software Design Analyzer System

    Science.gov (United States)

    Tausworthe, R. C.

    1985-01-01

    The CRISP80 software design analyzer system is a set of programs that supports top-down, hierarchic, modular, structured design and programming methodologies. CRISP80 allows a design to be expressed as a picture of the program.

  18. Project Portfolio Management Software

    OpenAIRE

    Paul POCATILU

    2006-01-01

    In order to design a methodology for the development of project portfolio management (PPM) applications, the existing applications have to be studied. This paper describes the main characteristics of the leading project portfolio management software applications.

  19. Testing software product lines

    OpenAIRE

    da Mota Silveira Neto, Paulo Anselmo; Runeson, Per; Machado, Ivan do Carmo; Almeida, Eduardo Santana de; de Lemos Meira, Silvio Romero; Engström, Emelie

    2011-01-01

    Two studies of testing practices for software product lines identify gaps between required techniques and existing approaches in the available literature. This Web extra offers extra details for the main article (specifically, the bibliography for the two studies described).

  20. Distribution engineering software

    Energy Technology Data Exchange (ETDEWEB)

    MacMillan, M.

    1996-11-01

    The role of engineering software in the electric utility industry was discussed. Ottawa Hydro explained how engineering software has become an integral part of its day-to-day operations. As a result of changes in meter technology, Ottawa Hydro's largest users are all digitally metered. This has created a demand for specialized software that can instantaneously interpret digital meter data. Multimaster, by Schlumberger Industries, is the most important package used in the company's meter section. Monarch, by Personics, is another software package, designed to organize files and data for use in spreadsheets. In the operations divisions, PC-based packages are used to plot electrical protection curves, draw relay time-current curves, and perform fault calculations. The most commonly used programs are V-PRO and V-NET, produced by Cooper Power Systems.

  1. Microprocessor-based integrated LMFBR core surveillance

    International Nuclear Information System (INIS)

    This report results from a joint study of KfK and INTERATOM. The aim of this study is to explore the advantages of microprocessors and microelectronics for a more sophisticated core surveillance, which is based on the integration of separate surveillance techniques. Due to new developments in microelectronics and related software an approach to LMFBR core surveillance can be conceived that combines a number of measurements into a more intelligent decision-making data processing system. The following techniques are considered to contribute essentially to an integrated core surveillance system: - subassembly state and thermal hydraulics performance monitoring, - temperature noise analysis, - acoustic core surveillance, - failure characterization and failure prediction based on DND- and cover gas signals, and - flux tilting techniques. Starting from a description of these techniques it is shown that by combination and correlation of these individual techniques a higher degree of cost-effectiveness, reliability and accuracy can be achieved. (orig./GL)

  2. Visualizing Object-oriented Software for Understanding and Documentation

    OpenAIRE

    Al-Msie'Deen, Ra'Fat

    2016-01-01

    Understanding or comprehending source code is one of the core activities of software engineering. Understanding object-oriented source code is essential and required when a programmer maintains, migrates, reuses, documents or enhances source code. Source code that is not comprehended cannot be changed. The comprehension of object-oriented source code is a difficult problem-solving process. In order to document an object-oriented software system, one needs to understand its source code. ...

  3. Software Architecture: Architecture Constraints

    OpenAIRE

    Tibermacine, Chouki

    2014-01-01

    In this chapter, we introduce an additional, yet essential, concept in describing software architectures: architecture constraints. We explain the precise role of these entities and their importance in object-oriented, component-based, or service-oriented software engineering. We then describe the way in which they are specified and interpreted. An architect can define architecture constraints and then associate them with architectural descriptions to limit their stru...

  4. Mining unstructured software data

    OpenAIRE

    Bacchelli, Alberto; Lanza, Michele

    2013-01-01

    Our thesis is that the analysis of unstructured data supports software understanding and evolution analysis, and complements the data mined from structured sources. To this aim, we implemented the necessary toolset and investigated methods for exploring, exposing, and exploiting unstructured data. To validate our thesis, we focused on development email data. We found two main challenges in using it to support program comprehension and software development: the disconnection between emai...

  5. Fostering software quality assessment

    OpenAIRE

    Brandtner, Martin

    2013-01-01

    Software quality assessment shall monitor and guide the evolution of a system based on quality measurements. This continuous process should ideally involve multiple stakeholders and provide adequate information for each of them to use. We want to support an effective selection of quality measurements based on the type of software and individual information needs of the involved stakeholders. We propose an approach that brings together quality measurements and individual information needs for ...

  6. Occupational radiation protection software

    International Nuclear Information System (INIS)

    This paper presents a reflection on the basic essentials of a Radiation Work Permit (RWP). Based on the latest WANO recommendations, this paper considers the RWP as a complete process rather than a simple administrative procedure. This process is implemented via software, which is also presented in this paper. The software has been designed to achieve the following objectives: - To configure the radiological map of the plant; to plan radiological surveillance, to input data, and to update radiological signposting and mandatory protective clothing in each area of the station. All this information can be checked from any personal computer connected to the network. - To collect radiological data by means of a palmtop (PDA) and to upload it to a personal computer, thereby speeding up the job and reducing human errors. - To implement the RWP by allowing on-line consultation of the permitted individual doses of the workers and the planned collective dose for each job. The software also supplies radiological information to the workers. - To collect and arrange pictures, maps and sketches of equipment placed in rooms or areas of the plant. - To allow the software to be used in real time from different workstations. - High reliability and speed of operation. - Flexible data enquiry. The software provides a number of standard data enquiries, such as the number of workers on each job and their individual doses received, etc. It also allows data to be exported to other well-known software applications such as Excel and Access for further data analysis. The software has been designed by radiation protection professionals and developed by computer programmers who were integrated into the radiological work environment. The software would fulfill Occupational Radiation Protection Department requirements. (author)

  7. Software Risk Management

    OpenAIRE

    Radhika Naik

    2013-01-01

    Software Risk Management is a critical area among the nine knowledge areas used in software project management. This paper describes different frameworks and paradigms used in risk management. A framework or model is chosen for managing risks in a project; this model acts as a tool for efficient risk management. It comprises the basic steps of risk identification, risk planning, risk assessment, risk mitigation, and risk monitoring and controlling.

  8. Personalised continuous software engineering

    OpenAIRE

    Papatheocharous, Efi; Belk, Marios; Nyfjord, Jaana; Germanakos, Panagiotis; Samaras, George

    2014-01-01

    This work describes how human factors can influence continuous software engineering. The reasoning begins from the Agile Manifesto promoting individuals and interactions over processes and tools. The organisational need to continuously develop, release and learn from software development in rapid cycles requires empowered and self-organised agile teams. However, these teams are formed without necessarily considering the members’ individual characteristics towards effective teamwork, from the ...

  9. Creative Software Engineering

    OpenAIRE

    Hooper, Clare J.; Millard, David E.

    2010-01-01

    Software engineering is traditionally seen as very structured and methodical. However, it often involves creative steps: consider requirements analysis, architecture engineering and GUI design. This poster describes three existing software engineering methods which include creative steps, alongside a method called 'experience deconstruction'. Deconstruction, which also includes a creative step, is used to help understand user experiences and re-provide these experiences in new contexts.

  10. Transformational Leadership in Software Projects

    OpenAIRE

    MOUSAVIKHAH, MARYAM

    2013-01-01

    Lack of management in software projects is among the most important reasons for the failure of such projects. Considering this fact, in addition to the high failure rate of IS (Information System) projects and the lack of leadership studies in the IS field, it is necessary to pay more attention to the concept of leadership in software projects. Transformational leadership, one of the most popular leadership theories, although it might bring specific advantages for this kind of project, has ...

  11. Gamification of Software Applications

    OpenAIRE

    Marovt, Jakob

    2012-01-01

    More and more software applications operate online, which means that we, as software architects, are able to track much more precisely how our users are using and engaging with the applications. Some of the recent results have been staggering: more than 26% of applications are not utilized and have poor retention and engagement rates. Behavior design and user experience researchers have been trying to solve these problems in many ways. Lately, one of the most popular methods has been the use of gamifi...

  12. Obsolete Software Requirements

    OpenAIRE

    Zahda, Showayb

    2011-01-01

    Context. Requirements changes are unavoidable in any software project. Requirements change over time as software projects progress and involved stakeholders (mainly customers) and developers gain a better understanding of the final product. Additionally, time and budget constraints prevent implementing all candidate requirements and force project management to select a subset of requirements that are prioritized as more important than the others to be implemented. As a result, some requirem...

  13. Engineering and Software Engineering

    Science.gov (United States)

    Jackson, Michael

    The phrase 'software engineering' has many meanings. One central meaning is the reliable development of dependable computer-based systems, especially those for critical applications. This is not a solved problem. Failures in software development have played a large part in many fatalities and in huge economic losses. While some of these failures may be attributable to programming errors in the narrowest sense—a program's failure to satisfy a given formal specification—there is good reason to think that most of them have other roots. These roots are located in the problem of software engineering rather than in the problem of program correctness. The famous 1968 conference was motivated by the belief that software development should be based on “the types of theoretical foundations and practical disciplines that are traditional in the established branches of engineering.” Yet after forty years of currency the phrase 'software engineering' still denotes no more than a vague and largely unfulfilled aspiration. Two major causes of this disappointment are immediately clear. First, too many areas of software development are inadequately specialised, and consequently have not developed the repertoires of normal designs that are the indispensable basis of reliable engineering success. Second, the relationship between structural design and formal analytical techniques for software has rarely been one of fruitful synergy: too often it has defined a boundary between competing dogmas, at which mutual distrust and incomprehension deprive both sides of advantages that should be within their grasp. This paper discusses these causes and their effects. Whether the common practice of software development will eventually satisfy the broad aspiration of 1968 is hard to predict; but an understanding of past failure is surely a prerequisite of future success.

  14. Automotive Software & Service Engineering

    OpenAIRE

    Zauner, Andreas; Hoffmann, Holger; Leimeister, Jan Marco; Krcmar, Helmut

    2014-01-01

    Automotive software and services (ASS) are central drivers of innovation in the automotive industry and are becoming increasingly important. In this working report, we describe the current challenges in developing ASS as well as development trends in the domain, from the perspective of the key industry stakeholders, based on 21 expert interviews. We also give an overview of existing automotive software and describe our methodological approach to the da...

  15. Software for Schenkerian Analysis

    OpenAIRE

    Marsden, Alan

    2011-01-01

    Software developed to automate the process of Schenkerian analysis is described. The current state of the art is that moderately good analyses of small extracts can be generated, but more information is required about the criteria by which analysts make decisions among alternative interpretations in the course of analysis. The software described here allows the procedure of reduction to be examined while in process, allowing decision points, and potentially criteria, to become clear.

  16. Architecting software concurrency

    OpenAIRE

    Dumitru Ciorba; Victor Besliu

    2011-01-01

    Nowadays, the majority of software systems are inherently concurrent. However, internal and external concurrent activities increase the complexity of systems' behavior. Adequate architecting can significantly decrease implementation errors. This work is motivated by the desire to understand how concurrency can constrain or influence software architecting. As a result, the paper describes a methodological architecting framework applied to systems with a "concurrency-intensive architecture"...

  17. SDN : Software defined networks

    OpenAIRE

    Wiklund, Petter

    2014-01-01

    This report is a specialization in software defined networking. SDN promises to revolutionize the industry and is under constant development. But is the technology ready to be launched into operation yet? The report initially covers a number of problems that today's network technology is facing. There then follows a deeper description of what this software-based networking technology really is and how it works. Further, the technique is tested in a lab assignment, using a prog...

  18. Predictive software design measures

    OpenAIRE

    Love, Randall James

    1994-01-01

    This research develops a set of predictive measures enabling software testers and designers to identify and target potential problem areas for additional and/or enhanced testing. Predictions are available as early in the design process as requirements allocation and as late as code walk-throughs. These predictions are based on characteristics of the design artifacts prior to coding. Prediction equations are formed at established points in the software development process...

  19. Software for tolerance design

    OpenAIRE

    Shilo, Galina; Kovalenko, Daria; Gaponenko, Mykola

    2012-01-01

    Software for tolerance assignment and element selection is presented in the paper. The methods of tolerance design apply mathematical models of tolerance regions shaped as hyperparallelepipeds and hyperellipsoids, which makes it possible to take into consideration the distribution laws of element parameters. The methods allow carrying out element selection and tolerance assignment while taking external influences into account. The specification of the software's functional characteristics and input data presentation were s...

  20. What Is Software Engineering?

    OpenAIRE

    Dzerzhinskiy, Fedor; Raykov, Leonid D.

    2015-01-01

    A later translation (2015) of the article in Russian published in 1990. The article proposes an approach to defining a set of basic notions for subject area of software engineering discipline. The set of notions is intended to serve as a basis for detection and correction of some widespread conceptual mistakes in the efforts aimed at improving the quality and work productivity in creation and operation of software.

  1. The LUCIFER control software

    Science.gov (United States)

    Jütte, Marcus; Knierim, Volker; Polsterer, Kai; Lehmitz, Michael; Storz, Clemens; Seifert, Walter; Ageorges, Nancy

    2010-07-01

    The successful roll-out of the control software for a complex NIR imager/spectrograph with MOS calls for flexible development strategies due to changing requirements during different phases of the project. A waterfall strategy used in the beginning has to change to a more iterative and agile process in the later stages. The choice of an appropriate program language as well as suitable software layout is crucial. For example the software has to accomplish multiple demands of different user groups, including a high level of flexibility for later changes and extensions. Different access levels to the instrument are mandatory to afford direct control mechanisms for lab operations and inspections of the instrument as well as tools to accomplish efficient science observations. Our hierarchical software structure with four layers of increasing abstract levels and the use of an object oriented language ideally supports these requirements. Here we describe our software architecture, the software development process, the different access levels and our commissioning experiences with LUCIFER 1.

  2. Software Process Assessment (SPA)

    Science.gov (United States)

    Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.

    1994-01-01

    NASA's environment mirrors the changes taking place in the nation at large, i.e. workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effects of this change are that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools, on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.

  3. Towards research on software cybernetics

    OpenAIRE

    Cai, KY; Chen, TY; Tse, TH

    2002-01-01

    Software cybernetics is a newly proposed area in software engineering. It makes better use of the interplay between control theory/engineering and software engineering. In this paper, we look into the research potentials of this emerging area.

  4. Conceptual Models Core to Good Design

    CERN Document Server

    Johnson, Jeff

    2011-01-01

    People make use of software applications in their activities, applying them as tools in carrying out tasks. That this use should be good for people--easy, effective, efficient, and enjoyable--is a principal goal of design. In this book, we present the notion of Conceptual Models, and argue that Conceptual Models are core to achieving good design. From years of helping companies create software applications, we have come to believe that building applications without Conceptual Models is just asking for designs that will be confusing and difficult to learn, remember, and use. We show how Concept

  5. Updated Core Libraries of the ALPS Project

    CERN Document Server

    Gaenko, A; Carcassi, G; Chen, T; Chen, X; Dong, Q; Gamper, L; Gukelberger, J; Igarashi, R; Iskakov, S; Könz, M; LeBlanc, J P F; Levy, R; Ma, P N; Paki, J E; Shinaoka, H; Todo, S; Troyer, M; Gull, E

    2016-01-01

    The open source ALPS (Algorithms and Libraries for Physics Simulations) project provides a collection of physics libraries and applications, with a focus on simulations of lattice models and strongly correlated systems. The libraries provide a convenient set of well-documented and reusable components for developing condensed matter physics simulation code, and the applications strive to make commonly used and proven computational algorithms available to a non-expert community. In this paper we present an updated and refactored version of the core ALPS libraries geared at the computational physics software development community, rewritten with focus on documentation, ease of installation, and software maintainability.

  6. Software bibliotecario abierto y gratuito

    OpenAIRE

    Lencinas, Verónica

    2001-01-01

    Free software, also known as Open Source software, has a number of advantages for implementation in libraries. It offers free and full source code that can be used to correct errors, modify the software and integrate it with other programs. Because of these advantages, free software offers better opportunities for libraries than closed software. Library management systems will soon be available and can be a real alternative to commercial software. The methodology used to develop the open sof...

  7. Management aspects of software maintenance

    OpenAIRE

    Henderson, Brian J.; Sullivan, Brenda J.

    1984-01-01

    Approved for public release; distribution is unlimited. The Federal government depends upon software systems to fulfill its missions. These software systems must be maintained and improved to continue to meet the growing demands placed on them. The process of software maintenance and improvement may be called "software evolution". The software manager must be educated in the complex nature of software maintenance to be able to properly evaluate and manage the software maintenance effort. ...

  8. Software Engineering Reviews and Audits

    CERN Document Server

    Summers, Boyd L

    2011-01-01

    Accurate software engineering reviews and audits have become essential to the success of software companies and military and aerospace programs. These reviews and audits define the framework and specific requirements for verifying software development efforts. Authored by an industry professional with three decades of experience, Software Engineering Reviews and Audits offers authoritative guidance for conducting and performing software first article inspections, and functional and physical configuration software audits. It prepares readers to answer common questions for conducting and perform

  9. Network externality and software piracy

    OpenAIRE

    Poddar, Sougata

    2002-01-01

    The pervasiveness of the illegal copying of software is a worldwide phenomenon. Software piracy implies a huge loss of potential customers for original software vendors, which directly translates into revenue losses for the software industry. Given this, conventional wisdom would suggest the need for legal software firms and governments to take a harsh approach to software piracy. Interestingly, there is a strand of literature which establishes that it is actually profitable for the orig...

  10. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Highlights: • We develop a modified FMEA analysis suited for applying to software architecture. • A template for failure modes on a specific software language is established. • A detailed-level software FMEA analysis on nuclear safety software is presented. - Abstract: A method of a software safety analysis is described in this paper for safety-related application software. The target software system is a software code installed at an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, at first, an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effect Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA analysis is carried out based on the so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) for the ATIP software. The software safety analysis by the software FMEA analysis, being applied to the ATIP software code, which has been integrated and passed through a very rigorous system test procedure, is proven to be able to provide very valuable results (i.e., software defects) that could not be identified during various system tests
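    The failure-mode template described in the abstract above can be pictured as a lookup from function-block type to candidate failure modes, expanded over the blocks of a diagram into FMEA worksheet rows. The sketch below is hypothetical: the block types, failure modes, and effects are invented for illustration and are not taken from the ATIP software or the paper.

    ```python
    # Hypothetical sketch of a failure-mode template keyed by FBD function-block
    # type; block names, modes, and effects are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        mode: str        # how the block can fail
        effect: str      # local effect on the block output
        severity: str    # qualitative severity class

    # Template: candidate failure modes per function-block type.
    TEMPLATE = {
        "AND": [FailureMode("stuck_true", "spurious trip signal", "high"),
                FailureMode("stuck_false", "missed trip signal", "high")],
        "CMP": [FailureMode("threshold_drift", "late trip", "medium")],
    }

    def fmea_worksheet(blocks):
        """Expand a list of (name, type) blocks into FMEA worksheet rows."""
        rows = []
        for name, block_type in blocks:
            for fm in TEMPLATE.get(block_type, []):
                rows.append((name, fm.mode, fm.effect, fm.severity))
        return rows

    rows = fmea_worksheet([("trip_vote", "AND"), ("temp_cmp", "CMP")])
    ```

    Because every block of the same type inherits the same candidate failure modes, the analysis stays uniform across the diagram, which is the efficiency argument the abstract makes for the template approach.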

  11. Academic Rigor: The Core of the Core

    Science.gov (United States)

    Brunner, Judy

    2013-01-01

    While some educators see the Common Core State Standards as a reason for stress, most recognize the positive possibilities associated with them and are willing to make the professional commitment to implementing them so that academic rigor for all students will increase. But business leaders, parents, and the authors of the Common Core are not the only…

  12. SECURED CLOUD SUPPORT FOR GLOBAL SOFTWARE REQUIREMENT RISK MANAGEMENT

    OpenAIRE

    Shruti Patil; Roshani Ade

    2014-01-01

    This paper presents a core solution to the problem of securing Global Software Development requirement information. Currently, the major issue concerns the hacking of sensitive client information, which may lead to major financial as well as social loss. To avoid this, the system provides cloud security by encrypting data; in addition, deployment of the tool over the cloud will provide significant security to the whole global content management system. The core findings are presented in terms of how hac...

  13. Advanced Core Monitoring Framework: An overview description

    International Nuclear Information System (INIS)

    One of the most significant developments in nuclear power plant operations in recent years is the application of digital computers to monitor and manage power plant processes. The introduction of this technology, moreover, is not without its problems. At present, each of the advanced core monitoring systems, such as GE's MONICORE, EXXON's POWERPLEX, EPRI's PSMS, etc., works only by itself, in an operating configuration which makes it difficult to compare, benchmark or replace with alternative core monitoring packages. The Advanced Core Monitoring Framework (ACMF) was conceived to provide one standard software framework, on a number of different virtual-memory mini-computers, within which modules from any of the core monitoring systems (both BWR and PWR) could be installed. The primary theme of ACMF is to build a framework that allows software plug-in compatibility for a variety of core monitoring functional packages by carefully controlling (standardizing) module interfaces to a well-defined database and requiring a common man-machine interface to be installed

  14. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  15. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  16. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    Science.gov (United States)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  17. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    Science.gov (United States)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  18. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  19. Software Activation Using Multithreading

    Directory of Open Access Journals (Sweden)

    Jianrui Zhang

    2012-11-01

    Software activation is an anti-piracy technology designed to verify that software products have been legitimately licensed. Activation should be quick and simple while simultaneously being secure and protecting customer privacy. The most common form of software activation is for the user to enter a legitimate product serial number. However, software activation based on serial numbers appears to be weak, since cracks for many programs are readily available on the Internet. Users can employ such cracks to bypass software activation. Serial number verification logic usually executes sequentially in a single thread. Such an approach is relatively easy to break since attackers can trace the code to understand how the logic works. In this paper, we develop a practical multi-threaded verification design. Our results show that by proper use of multi-threading, the amount of traceable code in a debugger can be reduced to a low percentage of the total, and the traceable code in each run can differ as well. This makes it significantly more difficult for an attacker to reverse engineer the code as a means of bypassing a security check. Finally, we attempt to quantify the increased effort needed to break our verification logic.
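    The core idea above, splitting serial-number verification into sub-checks that run on separate threads so no single sequential trace exposes the whole test, can be sketched as follows. This is a toy illustration, not the paper's design: the serial format, the check functions, and the checksum rule are all invented.

    ```python
    # Minimal sketch of multi-threaded serial-number verification (hypothetical
    # checks: length, prefix, and a toy digit-sum checksum rule).
    import threading

    def check_length(serial, results):
        results["length"] = len(serial) == 12

    def check_prefix(serial, results):
        results["prefix"] = serial.startswith("SN")

    def check_digit_sum(serial, results):
        digits = [int(c) for c in serial if c.isdigit()]
        results["sum"] = sum(digits) % 7 == 0  # invented checksum rule

    def activate(serial):
        # Each sub-check runs on its own thread and records its verdict
        # under a distinct key; activation succeeds only if all pass.
        results = {}
        threads = [threading.Thread(target=f, args=(serial, results))
                   for f in (check_length, check_prefix, check_digit_sum)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return all(results.values())
    ```

    An attacker single-stepping one thread sees only a fraction of the verification logic, which is the property the paper's measurements quantify.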

  20. Software reliability assessment

    International Nuclear Information System (INIS)

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom has spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  1. Control software for EUTERPE

    International Nuclear Information System (INIS)

    This paper describes the software design of the EUTERPE synchrotron radiation facility. Applications are developed as a set of separate programs. Services are exported from these programs and can be used by other programs. The programs are built from classes following the object oriented programming paradigm. Objects are created from these classes when the programs are distributed over a set of processors. The objects of the applications, which represent existing accelerator related objects, also profit from standard facilities provided by the control system software, like: adaptable acquisition and user dependent object views (e.g. B-field for physicist and power-supply for engineer). This approach makes the application software independent of the underlying control system structure. Applications do not see if the underlying structure is 1-, 2- or 3-layered. Accordingly, the mapping of the application software to the hardware can be postponed until the last moment. Once installed, the control system structure can be adapted to new performance and flexibility requirements without consequences for the application software

  2. SOFTWARE MEASUREMENTS AND METRICS: ROLE IN EFFECTIVE SOFTWARE TESTING

    Directory of Open Access Journals (Sweden)

    Sheikh Umar Farooq

    2011-01-01

    Measurement has always been fundamental to the progress of any engineering discipline, and software testing is no exception. Software metrics have been used in making quantitative/qualitative decisions as well as in risk assessment and reduction in software projects. In this paper we discuss software measurement and metrics and their fundamental role in the software development life cycle. Focusing on software test metrics, the paper discusses their key role in the software testing process and also classifies and systematically analyzes the various test metrics.

  3. Implications of Responsive Space on the Flight Software Architecture

    Science.gov (United States)

    Wilmot, Jonathan

    2006-01-01

    The Responsive Space initiative has several implications for flight software that need to be addressed not only within the run-time element, but the development infrastructure and software life-cycle process elements as well. The runtime element must at a minimum support Plug & Play, while the development and process elements need to incorporate methods to quickly generate the needed documentation, code, tests, and all of the artifacts required of flight quality software. Very rapid response times go even further, and imply little or no new software development, requiring instead, using only predeveloped and certified software modules that can be integrated and tested through automated methods. These elements have typically been addressed individually with significant benefits, but it is when they are combined that they can have the greatest impact to Responsive Space. The Flight Software Branch at NASA's Goddard Space Flight Center has been developing the runtime, infrastructure and process elements needed for rapid integration with the Core Flight software System (CFS) architecture. The CFS architecture consists of three main components: the core Flight Executive (cFE), the component catalog, and the Integrated Development Environment (IDE). This paper will discuss the design of the components, how they facilitate rapid integration, and lessons learned as the architecture is utilized for an upcoming spacecraft.

  4. The optimal community detection of software based on complex networks

    Science.gov (United States)

    Huang, Guoyan; Zhang, Peng; Zhang, Bing; Yin, Tengteng; Ren, Jiadong

    2016-02-01

    The community structure is important for software in terms of understanding the design patterns and controlling the development and maintenance process. In order to detect the optimal community structure in the software network, a method, Optimal Partition Software Network (OPSN), is proposed based on the dependency relationships among the software functions. First, by analyzing information from multiple execution traces of the software, we construct the Software Execution Dependency Network (SEDN). Second, based on the relationships among the function nodes in the network, we define Fault Accumulation (FA) to measure the importance of each function node and sort the nodes by this measure. Third, we select the top K (K=1,2,…) nodes as the cores of the primal communities (each containing only one core node). By comparing the dependency relationships between each remaining node and the K communities, we place the node into the existing community with which it has the closest relationship. Finally, we calculate the modularity for different initial K to obtain the optimal division. Experiments verify that OPSN efficiently detects the optimal community structure in various software systems.
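
The procedure described above can be sketched as follows. The paper's Fault Accumulation (FA) importance measure and modularity calculation are not reproduced here; undirected node degree stands in for FA, so treat this as an illustration of the control flow rather than the actual algorithm:

```python
from collections import defaultdict

# Sketch of the OPSN-style flow: seed k communities from the top-k
# "important" nodes, then attach each remaining node to the community
# it shares the most edges with. Degree is a stand-in for the paper's
# FA measure computed from execution traces.

def detect_communities(edges, k):
    neighbors = defaultdict(set)
    for u, v in edges:
        neighbors[u].add(v)
        neighbors[v].add(u)
    # importance stand-in: degree (the paper uses Fault Accumulation)
    ranked = sorted(neighbors, key=lambda n: len(neighbors[n]), reverse=True)
    community = {core: i for i, core in enumerate(ranked[:k])}
    for node in ranked[k:]:
        # "closest" community = the one with the most edges to this node
        scores = defaultdict(int)
        for nb in neighbors[node]:
            if nb in community:
                scores[community[nb]] += 1
        community[node] = max(scores, key=scores.get) if scores else 0
    return community

deps = [("a", "b"), ("a", "c"), ("b", "c"),
        ("d", "e"), ("d", "f"), ("e", "f"), ("c", "d")]
print(detect_communities(deps, k=2))
```

In the full method, this partition would be recomputed for K = 1, 2, … and the division with the highest modularity kept as the optimal one.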

  5. Software Communication Architecture Implementation and Its Waveform Application

    Institute of Scientific and Technical Information of China (English)

    SUN Pei-gang; ZHAO Hai; WANG Ting-chang; FAN Jian-hua

    2006-01-01

    This paper presents research on the development of software-defined radio (SDR) based on the software communication architecture (SCA). First, SCA is studied and a complete reference model of the SCA 3.0 core framework (CF) is realized. Second, an application-specific FM3TR waveform is implemented on a common software platform based on the reference model. Third, from the point of view of real-time performance and software reuse, the realized CF reference model and FM3TR waveform are tested and validated. The results show that the SCA-compliant SDR has favorable interoperability and software portability and can satisfy real-time performance requirements that are not overly rigorous.

  6. Overview of the LCG applications area software projects

    CERN Document Server

    Pfeiffer, Andreas

    2004-01-01

    The Applications Area of the LHC Computing Grid Project (LCG) develops part of the physics applications software and associated infrastructure that can be shared among the LHC experiments. The scope includes common applications software infrastructure, frameworks, libraries and tools, together with domain specific components such as simulation and analysis toolkits. The work of the applications area is conducted within a number of projects: software process and infrastructure (SPI), persistency framework (POOL), core libraries and services (SEAL), physicist interface (PI), and simulation (SIMU). ROOT, which is a general purpose object oriented framework that implements software for managing object persistency and for supporting interactive data analysis and visualization, is also used to implement vital parts of Applications Area software. The project has been in an active development phase for more than 2 years and is being integrated by the experiments in their frameworks on which they base the applications...

  7. Secure software practices among Malaysian software practitioners: An exploratory study

    Science.gov (United States)

    Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina

    2016-08-01

    Secure software practices are increasingly gaining importance among software practitioners and researchers due to the rise of computer crimes in the software industry, and have become one of the determinant factors for producing high-quality software. Even though their importance is recognized, current practice in the software industry is still scarce, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was used for data collection, while statistical methods such as frequency, mean, and cross tabulation were used for data analysis. The outcomes reveal that software practitioners are becoming increasingly aware of the importance of secure software practices but lack appropriate implementation, which could affect the quality of the software produced.

  8. A Software Reliability Estimation Method to Nuclear Safety Software

    Energy Technology Data Exchange (ETDEWEB)

    Park, Geeyong; Jang, Seung Cheol [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-02-15

    A method for estimating software reliability for nuclear safety software is proposed in this paper. This method is based on the software reliability growth model (SRGM), where the behavior of software failure is assumed to follow a nonhomogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects from very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. These models were found to be capable of reasonably estimating the remaining number of software defects that directly affect the reactor trip functions. The software reliability might be estimated from these modeling equations, and one approach to obtaining a software reliability value is proposed in this paper.
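
A minimal sketch of the underlying NHPP idea, assuming the common Goel-Okumoto mean value function m(t) = a(1 - e^(-bt)) and a coarse grid-search least-squares fit as a stand-in for the Bayesian inference with test-case covariates described in the abstract:

```python
import math

# Minimal NHPP/SRGM sketch using the Goel-Okumoto mean value function
#   m(t) = a * (1 - exp(-b * t)),
# fitted by coarse grid-search least squares. This is a stand-in for
# the paper's Bayesian scheme, not a reproduction of it.

def mean_failures(t, a, b):
    return a * (1.0 - math.exp(-b * t))

def fit_goel_okumoto(times, cum_failures):
    """Return (a_hat, b_hat) minimizing squared error over a coarse grid."""
    n_max = max(cum_failures)
    best = None
    for i in range(1, 41):                      # a from ~1.05x to 3x of n_max
        a = n_max * (1 + i / 20.0)
        for j in range(1, 201):                 # b from 0.005 to 1.0
            b = j / 200.0
            sse = sum((mean_failures(t, a, b) - y) ** 2
                      for t, y in zip(times, cum_failures))
            if best is None or sse < best[0]:
                best = (sse, a, b)
    return best[1], best[2]

# The remaining expected number of defects after test time T is then
# a_hat - mean_failures(T, a_hat, b_hat).
```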

  9. UAS-NAS Live Virtual Constructive Distributed Environment (LVC): LVC Gateway, Gateway Toolbox, Gateway Data Logger (GDL), SaaProc Software Design Description

    Science.gov (United States)

    Jovic, Srboljub

    2015-01-01

    This document provides the software design description for two core software components, the LVC Gateway and the LVC Gateway Toolbox, and two participants, the LVC Gateway Data Logger (GDL) and the SAA Processor (SaaProc).

  10. Astronomers as Software Developers

    Science.gov (United States)

    Pildis, Rachel A.

    2016-01-01

    Astronomers know that their research requires writing, adapting, and documenting computer software. Furthermore, they often have to learn new computer languages and figure out how existing programs work without much documentation or guidance, and under extreme time pressure. These are all skills that can lead to a software development job, but recruiters and employers probably won't know that. I will discuss the highly useful experience that astronomers may not realize they already have, and how to explain that knowledge to others when looking for non-academic software positions. I will also talk about some of the pitfalls I have run into while interviewing for jobs and working as a developer, and encourage you to embrace the curiosity employers might have about your non-standard background.

  11. CONRAD Software Architecture

    Science.gov (United States)

    Guzman, J. C.; Bennett, T.

    2008-08-01

    The Convergent Radio Astronomy Demonstrator (CONRAD) is a collaboration between the computing teams of two SKA pathfinder instruments, MeerKAT (South Africa) and ASKAP (Australia). Our goal is to produce the common software required to operate, process and store the data from the two instruments. Both instruments are synthesis arrays composed of a large number of antennas (40 - 100) operating at centimeter wavelengths with wide-field capabilities. Key challenges are the processing of high volumes of data in real time as well as the remote mode of operations. Here we present the software architecture for CONRAD. Our design approach is to maximize the use of open solutions and third-party software widely deployed in commercial applications, such as SNMP and LDAP, and to utilize modern web-based technologies for the user interfaces, such as AJAX.

  12. Testing of Software

    DEFF Research Database (Denmark)

    This document constitutes the final report of the network collaboration "Testnet", carried out in the period 1.4.2006 to 31.12.2008. The network deals primarily with topics within the testing of embedded and technical software, but a number of examples of problems and solutions associated with testing of...... administrative software are also included. The report is divided into the following 3 parts: Overview. Here we give a summary of the network's purpose, activities and results. The state of the art of software testing is outlined. We mention that CISS and the network are taking new initiatives. The Network. Purpose, participants and topics covered at ti...

  13. The CMS Reconstruction Software

    Science.gov (United States)

    Lange, David J.; CMS Collaboration

    2011-12-01

    We report on the status and plans for the event reconstruction software of the CMS experiment. The CMS reconstruction algorithms are the basis for a wide range of data analysis approaches currently under study by the CMS collaboration using the first high-energy run of the LHC. These algorithms have been primarily developed and validated using simulated data samples, and are now being commissioned with LHC proton-proton collision data samples. The CMS reconstruction is now operated routinely on all events triggered by the CMS detector, both in a close to real-time prompt reconstruction processing and in frequent passes over the full recorded CMS data set. We discuss the overall software design, development cycle, computational requirements and performance, recent operational performance, and planned improvements of the CMS reconstruction software.

  14. Lecture 2: Software Security

    CERN Document Server

    CERN. Geneva

    2013-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development, testing and deployment. Sebastian Lopienski is CERN’s deputy Computer Security Officer. He works on security strategy and policies; offers internal consultancy and audit services; develops and ...

  15. Software Risk Management Practice: Evidence From Thai Software Industry

    OpenAIRE

    Tharwon Arnuphaptrairong

    2014-01-01

    Software risk management has been around at least since it was introduced into the mainstream of software management processes in 1989 [1]-[3], but little has been reported about its industrial practice [4]-[6]. This paper reports the current software risk management practice in the Thai software industry. A questionnaire survey was designed to capture information about software project risk management practice. The questionnaire was sent to 141 companies and received a response rate of 28 percent. The...

  16. The software invention cube: A classification scheme for software inventions

    OpenAIRE

    Bergstra, J.A.; Klint, Paul

    2008-01-01

    The patent system protects inventions. The requirement that a software invention should make 'a technical contribution' turns out to be untenable in practice, and this raises the question of what constitutes an invention in the realm of software. The authors developed the Software Invention Cube (SWIC), a classification of software inventions, and used this classification to explore the meaning of the notions 'novelty', 'inventive step' and 'someone skilled in the art' for software inventions. Th...

  17. Reliable Software Development with Proposed Quality Oriented Software Testing Metrics

    OpenAIRE

    Latika Kharb; Dr. Vijay Singh Rathore

    2011-01-01

    For effective test measurement, a software tester requires testing metrics that can measure the quality and productivity of the software development process while increasing its reusability, correctness and maintainability. The understanding of how to measure software quality is not yet sophisticated enough and is still far from being standardized; in order to assess software quality, an appropriate set of software metrics needs to be identified that could express th...

  18. How can Software Packages Certification Improve Software Process

    OpenAIRE

    Pivka, Marjan; Potočan, Vojko

    1997-01-01

    Popular software assessment models such as CMM, BOOTSTRAP, SPICE or ISO 9000 ignore the impact of software product certification on software quality. The first standard for software product quality was the German DIN 66285. Based on this standard, ISO developed an international standard for quality requirements and testing procedures for software packages: ISO/IEC 12119. This paper presents our experience with classical testing models based on ISO/IEC 12119 and DIN 66285 and with our improved ...

  19. From Agile Software Product Line Engineering Towards Software Ecosystems

    OpenAIRE

    Hanssen, Geir Kjetil

    2010-01-01

    Development and evolution of software products is a challenging endeavor and a significant subfield of software engineering. One of the commonly applied approaches to control and manage this process is software product line engineering (SPLE). There exist a few process frameworks in which the development of lines of related software products is basically the sum of two processes: the development of reusable assets and the rapid construction of software applications using pre-developed assets. Agile...

  20. UNIFIED SOFTWARE DEVELOPMENT MODEL FOR FREE/OPEN SOURCE SOFTWARE

    OpenAIRE

    Md. Anawarul Kabir; Md. Salahuddin Pasha; Mohammad Abdur Razzak

    2011-01-01

    Most of the process models introduced so far in the domain of software engineering are meant for proprietary software. In the case of open source software development, the popular Bazaar model can be mentioned, though it has some limitations too. The present research attempts to formulate a process model for developing a software project exclusively for the open source paradigm. We have named it "Unified Software Development Model for Free/Open Source (USDM)". Not only have we intr...

  1. Software design: communication between human factors engineers and software developers

    OpenAIRE

    Bradley, Roxanne

    1991-01-01

    As computers pervade aspects of daily life, users demand software that is easy to use. It has been suggested that adding human factors engineers (HFEs) to software development teams would help software development companies meet these user demands. However, there are qualitative data which suggest that software developers (SDs) and HFEs do not communicate well with each other. It is believed that this lack of communication has inhibited the use of HFEs on software developmen...

  2. Green in software engineering

    CERN Document Server

    Calero Munoz, Coral

    2015-01-01

    This is the first book that presents a comprehensive overview of sustainability aspects in software engineering. Its format follows the structure of the SWEBOK and covers the key areas involved in the incorporation of green aspects in software engineering, encompassing topics from requirement elicitation to quality assurance and maintenance, while also considering professional practices and economic aspects. The book consists of thirteen chapters, which are structured in five parts. First the "Introduction" gives an overview of the primary general concepts related to Green IT, discussing wha

  3. Processing Optimization with Canon's Software

    DEFF Research Database (Denmark)

    Precht, Helle

    2009-01-01

    . Possibilities for software optimization were studied in relation to optimal image quality and control exposures, to investigate whether it was possible to accept diagnostic image quality and thereby take ALARA as the starting point. Method and materials: A quantitative experimental study based on experiments with technical and...... human phantoms. The CD Rad phantom was used as the technical phantom, where the images were analyzed with the CD Rad software, and the result was an objective IQF value. The human phantom was a lamb pelvis with femur, which according to the NRPB is comparable in absorption to a five-year-old child. The human experimental images were...

  4. Agile software development

    CERN Document Server

    Stober, Thomas

    2009-01-01

    Software development is moving towards a more agile and more flexible approach. It turns out that the traditional 'waterfall' model is not supportive in an environment where technical, financial and strategic constraints are changing almost every day. But what is agility? What are today's major approaches? And especially: what is the impact of agile development principles on the development teams, on project management and on software architects? How can large enterprises become more agile and improve their business processes, which have existed for many, many years? What are the limit

  5. Agile Software Development

    OpenAIRE

    Stewart, Rhonda

    2009-01-01

    One of the most noticeable changes to software process thinking in the last ten years has been the appearance of the word 'agile' (Fowler, 2005). In the Information Technology (IT) industry Agile Software Development, or simply Agile, is used to refer to a family of lightweight development approaches that share a common set of values and principles focused around adapting to change and putting people first (Fowler, 2005). Such Agile methods provide an alternative to the well-established Wate...

  6. Six Sigma software development

    CERN Document Server

    Tayntor, Christine B

    2002-01-01

    Since Six Sigma has had marked success in improving quality in other settings, and since the quality of software remains poor, it seems a natural evolution to apply the concepts and tools of Six Sigma to system development and the IT department. Until now however, there were no books available that applied these concepts to the system development process. Six Sigma Software Development fills this void and illustrates how Six Sigma concepts can be applied to all aspects of the evolving system development process. It includes the traditional waterfall model and in the support of legacy systems,

  7. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-06-01

    Full Text Available The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, books have been published whose titles refer to it as an art, role or process. But because software complexity is increasing every year, this paper proposes a new approach, conceiving testing as a science, since the processes by which it is applied are the steps of the scientific method: inputs, processes, outputs. This paper examines the similarities between testing and the characteristics of a science.

  8. MIDI Interpreter Software

    OpenAIRE

    Vahtera, Timo

    2009-01-01

    The MIDI interpreter was part of the HAMK Örch Orchestra project. The goal of the Örch Orchestra was to compete in the Artemis musical robot competition held in Athens on 3 June 2008. The MIDI interpreter is a standalone hardware and software solution that interprets MIDI messages for a piano-playing robot. This thesis covers everything from designing and creating the MIDI interpreter software, including relevant information about the hardware it was programmed for and about the Örch Orchestr...

  9. Software product quality control

    CERN Document Server

    Wagner, Stefan

    2013-01-01

    Quality is not a fixed or universal property of software; it depends on the context and goals of its stakeholders. Hence, when you want to develop a high-quality software system, the first step must be a clear and precise specification of quality. Yet even if you get it right and complete, you can be sure that it will become invalid over time. So the only solution is continuous quality control: the steady and explicit evaluation of a product's properties with respect to its updated quality goals.This book guides you in setting up and running continuous quality control in your environment. Star

  10. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  11. The PANIC software system

    Science.gov (United States)

    Ibáñez Mengual, José M.; Fernández, Matilde; Rodríguez Gómez, Julio F.; García Segura, Antonio J.; Storz, Clemens

    2010-07-01

    PANIC is the Panoramic Near Infrared Camera for the 2.2m and 3.5m telescopes at Calar Alto observatory. The aim of the project is to build a wide-field general purpose NIR camera. In this paper we describe the software system of the instrument, which comprises four main packages: GEIRS for the instrument control and the data acquisition; the Observation Tool (OT), the software used for detailed definition and pre-planning the observations, developed in Java; the Quick Look tool (PQL) for easy inspection of the data in real-time and a scientific pipeline (PAPI), both based on the Python programming language.

  12. Software Safety and Security

    CERN Document Server

    Nipkow, T; Hauptmann, B

    2012-01-01

    Recent decades have seen major advances in methods and tools for checking the safety and security of software systems. Automatic tools can now detect security flaws not only in programs of the order of a million lines of code, but also in high-level protocol descriptions. There has also been something of a breakthrough in the area of operating system verification. This book presents the lectures from the NATO Advanced Study Institute on Tools for Analysis and Verification of Software Safety and Security; a summer school held at Bayrischzell, Germany, in 2011. This Advanced Study Institute was

  13. Producing Secure Argentine Software

    OpenAIRE

    Romaniz, Susana Cristina; Arce, Iván; Gaspoz, Ivana; Castellaro, Marta

    2014-01-01

    A principal source of incidents that put at risk the security of information and the expected operation of systems based on information and communication technologies is the difficulty of producing secure software. Conceiving software security as an emergent attribute of its development process is a view that has begun to achieve consensus among the actors directly involved in its production. Having the necessary capabilities to address...

  14. On malfunctioning software

    OpenAIRE

    Floridi, Luciano; Fresco, Nir; Primiero, Giuseppe

    2015-01-01

    Artefacts do not always do what they are supposed to, due to a variety of reasons, including manufacturing problems, poor maintenance, and normal wear-and-tear. Since software is an artefact, it should be subject to malfunctioning in the same sense in which other artefacts can malfunction. Yet, whether software is on a par with other artefacts when it comes to malfunctioning crucially depends on the abstraction used in the analysis. We distinguish between “negative” and “positive” notions of ...

  15. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-07-01

    Full Text Available The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, books have been published whose titles refer to it as an art, role or process. But because software complexity is increasing every year, this paper proposes a new approach, conceiving testing as a science, since the processes by which it is applied are the steps of the scientific method: inputs, processes, outputs. This paper examines the similarities between testing and the characteristics of a science.

  16. Evaluating Commercial Game System Software

    OpenAIRE

    Quinn, Kelly; クイン, ケリー

    2009-01-01

    This paper describes the TOEIC test DS Training software published by Obunsha for the Nintendo DS game system. It describes the features of the software, its advantages and disadvantages, and the results of a survey of students' reactions to the software and to using the DS as a platform for studying English.

  17. The fallacy of Software Patents

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Software patents are usually presented as an argument for innovation, but do they really promote innovation? Who really benefits from software patents? This talk attempts to show the problems with software patents and how they can actually harm innovation, having little value for software users and our society in general.

  18. The NOvA software testing framework

    Science.gov (United States)

    Tamsett, M.; C Group

    2015-12-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. NOvA has already produced more than one million Monte Carlo and detector generated files amounting to more than 1 PB in size. This data is divided between a number of parallel streams such as far and near detector beam spills, cosmic ray backgrounds, a number of data-driven triggers and over 20 different Monte Carlo configurations. Each of these data streams must be processed through the appropriate steps of the rapidly evolving, multi-tiered, interdependent NOvA software framework. In total there are more than 12 individual software tiers, each of which performs a different function and can be configured differently depending on the input stream. In order to regularly test and validate that all of these software stages are working correctly, NOvA has designed a powerful, modular testing framework that enables detailed validation and benchmarking to be performed in a fast, efficient and accessible way with minimal expert knowledge. The core of this system is a novel series of python modules which wrap, monitor and handle the underlying C++ software framework and then report the results to a slick front-end web-based interface. This interface utilises modern, cross-platform visualisation libraries to render the test results in a meaningful way. They are fast and flexible, allowing for the easy addition of new tests and datasets. In total, upwards of 14 individual streams are regularly tested, amounting to over 70 individual software processes and producing over 25 GB of output files. The rigour enforced through this flexible testing framework enables NOvA to rapidly verify configurations, results and software and thus ensure that data is available for physics analysis in a timely and robust manner.
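
The wrap-monitor-report pattern described here can be sketched as below; the tier names and commands are hypothetical stand-ins, not NOvA's actual configuration:

```python
import subprocess
import sys
from dataclasses import dataclass

# Minimal sketch of the wrap-monitor-report pattern: python code wraps
# each underlying software tier as a subprocess, records the outcome,
# and produces a summary a front-end could render. Tier names and
# commands are invented for illustration.

@dataclass
class TierResult:
    tier: str
    returncode: int
    stdout: str

def run_tier(tier: str, cmd: list) -> TierResult:
    """Wrap one software tier: run it as a subprocess, capture its output."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return TierResult(tier, proc.returncode, proc.stdout)

def report(results: list) -> dict:
    """Summarize per-tier pass/fail for a front-end to render."""
    return {r.tier: ("PASS" if r.returncode == 0 else "FAIL")
            for r in results}

if __name__ == "__main__":
    results = [
        run_tier("calibration", [sys.executable, "-c", "print('ok')"]),
        run_tier("reconstruction", [sys.executable, "-c", "raise SystemExit(1)"]),
    ]
    print(report(results))  # {'calibration': 'PASS', 'reconstruction': 'FAIL'}
```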

  19. The Art of Software Testing

    CERN Document Server

    Myers, Glenford J; Badgett, Tom

    2011-01-01

    The classic, landmark work on software testing The hardware and software of computing have changed markedly in the three decades since the first edition of The Art of Software Testing, but this book's powerful underlying analysis has stood the test of time. Whereas most books on software testing target particular development techniques, languages, or testing methods, The Art of Software Testing, Third Edition provides a brief but powerful and comprehensive presentation of time-proven software testing approaches. If your software development project is mission critical, this book is an investme

  20. Patterns for Parallel Software Design

    CERN Document Server

    Ortega-Arjona, Jorge Luis

    2010-01-01

    Essential reading to understand patterns for parallel programming Software patterns have revolutionized the way we think about how software is designed, built, and documented, and the design of parallel software requires you to consider other particular design aspects and special skills. From clusters to supercomputers, success heavily depends on the design skills of software developers. Patterns for Parallel Software Design presents a pattern-oriented software architecture approach to parallel software design. This approach is not a design method in the classic sense, but a new way of managin

  1. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

    Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test management. This book examines the principles, concepts, and processes that are fundamental to the software testing function. This book is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c

  2. SAPHIRE models and software for ASP evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B.; Schroeder, J.A.; Russell, K.D. [Idaho National Engineering Lab., Idaho Falls, ID (United States)] [and others

    1995-04-01

    The Idaho National Engineering Laboratory (INEL) over the past year has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of conditional core damage probability (CCDP) evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both NRR and AEOD. This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events.
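
As a generic illustration of the cutset quantification that underlies such CCDP evaluations (this is not SAPHIRE's actual code; the basic-event names and probabilities are invented), a sequence probability can be computed from minimal cutsets with the minimal cut upper bound:

```python
# Generic minimal-cutset quantification, illustrating the kind of CCDP
# arithmetic PRA tools perform. Event names and probabilities below are
# hypothetical.

def cutset_prob(cutset, basic_event_probs):
    """A minimal cutset fails only if all of its basic events fail."""
    p = 1.0
    for event in cutset:
        p *= basic_event_probs[event]
    return p

def min_cut_upper_bound(cutsets, basic_event_probs):
    """P(top) <= 1 - prod_i(1 - P(cutset_i)); exact for independent cutsets."""
    q = 1.0
    for cs in cutsets:
        q *= 1.0 - cutset_prob(cs, basic_event_probs)
    return 1.0 - q

probs = {"DG-A": 1e-2, "DG-B": 1e-2, "HPI": 1e-3}  # hypothetical events
cutsets = [("DG-A", "DG-B"), ("HPI",)]
print(min_cut_upper_bound(cutsets, probs))  # ~1.1e-3
```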

  3. Software for airborne radiation monitoring system

    International Nuclear Information System (INIS)

    The Airborne Radiation Monitoring System monitors radioactive contamination in the air or on the ground. The contamination source can be a radioactive plume or an area contaminated with radionuclides. The system is composed of two major parts: an Airborne Unit carried by a helicopter and a Ground Station carried by a truck. The Airborne software is intended to be the core of a computerized airborne station. The software is written in C++ under MS-Windows with object-oriented methodology. It has been designed to be user-friendly: function keys and other accelerators are used for vital operations, a help file and help subjects are available, and the human-machine interface is plain and obvious. (authors)

  4. Software-Defined Cellular Mobile Network Solutions

    Institute of Scientific and Technical Information of China (English)

    Jiandong Li; Peng Liu; Hongyan Li

    2014-01-01

    The emergence of software-defined networking (SDN), especially in terms of the prototype associated with OpenFlow, provides new possibilities for innovating on network design. Researchers have started to extend SDN to cellular networks. Such a programmable architecture is beneficial to the evolution of mobile networks and allows operators to provide better services. The typical cellular network comprises a radio access network (RAN) and a core network (CN); hence, the technical roadmap diverges in two ways. In this paper, we investigate SoftRAN, the latest SDN solution for the RAN, and SoftCell and MobileFlow, the latest solutions for the CN. We also define a series of control functions for CROWD. Unlike other literature, we emphasize only software-defined cellular network solutions and specifications in order to provide possible research directions.

  5. SAPHIRE models and software for ASP evaluations

    International Nuclear Information System (INIS)

    The Idaho National Engineering Laboratory (INEL) over the past year has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of conditional core damage probability (CCDP) evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both NRR and AEOD. This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events.
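Key characteristic (3), the retention of all system and sequence cutsets, implies quantification by combining minimal-cutset probabilities. A minimal sketch of that idea (illustrative only, not SAPHIRE code; the event names and probabilities below are invented), using the min-cut upper bound:

```python
# Illustrative cutset quantification (not SAPHIRE code).
# A cutset is a list of basic-event names; probs maps events to probabilities.

def cutset_prob(cutset, probs):
    """Probability of one minimal cutset (independent basic events)."""
    p = 1.0
    for event in cutset:
        p *= probs[event]
    return p

def sequence_prob(cutsets, probs):
    """Min-cut upper bound on sequence probability: 1 - prod(1 - P_i)."""
    q = 1.0
    for cs in cutsets:
        q *= 1.0 - cutset_prob(cs, probs)
    return 1.0 - q

# Invented example: loss of both diesel generators, or failure of injection
probs = {"DG-A": 1e-2, "DG-B": 1e-2, "HPI": 1e-3}
cutsets = [["DG-A", "DG-B"], ["HPI"]]
p = sequence_prob(cutsets, probs)  # about 1.1e-3
```

A CCDP evaluation then compares such a conditional quantification (with precursor events set to probability 1) against the nominal result.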

  6. Models of scientific software development

    OpenAIRE

    Segal, Judith

    2008-01-01

    Over the past decade, I have performed several field studies with scientists developing software either on their own or together with software engineers. Based on these field study data, I identify a model of scientific software development as practiced in many scientific laboratories and communities. This model does not fit the standard software engineering models. For example, the tasks of requirement elicitation and software evaluation are not clearly delineated. Nevertheless, it appears t...

  7. Evaluation & Optimization of Software Engineering

    OpenAIRE

    Asaduzzaman Noman; Atik Ahmed Sourav; Shakh Md. Alimuzjaman Alim

    2016-01-01

    The term is made of two words, software and engineering. Software is more than just program code. A program is an executable code, which serves some computational purpose. Software is considered to be a collection of executable programming code, associated libraries, and documentation. Software made for a specific requirement is called a software product. Engineering, on the other hand, is all about developing products using well-defined scientific principles and methods. The outcome ...

  8. Facets of Software Component Repository

    OpenAIRE

    Vaneet Kaur; Shivani Goel

    2011-01-01

    The software repository is used for storing, managing, and retrieving large numbers of software components. Repositories should be designed to meet the growing and changing needs of software development organizations. Storage and representation of reusable software components in software repositories to assist retrieval is a key area of concern. In this paper we discuss various facets of the component repository, such as component searching mechanisms and classifications such as Free Text,...

  9. Self-assembling software generator

    Energy Technology Data Exchange (ETDEWEB)

    Bouchard, Ann M. (Albuquerque, NM); Osbourn, Gordon C. (Albuquerque, NM)

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
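As a toy illustration of the pattern the abstract describes (not the patented implementation; the entity names and pipeline-style linkage below are invented), a task specification can drive which software entities are generated, how they are linked, and what logic they execute:

```python
# Toy sketch of specification-driven generation (not the patented system).
# The spec names which entities to generate; linkage here is a fixed pipeline.

OPS = {"double": lambda v: 2 * v, "inc": lambda v: v + 1}

def generate_task(spec):
    """Inspect the task specification, then build an executable task."""
    entities = [OPS[name] for name in spec["entities"]]  # what to generate
    def task(value):
        for entity in entities:       # how the entities are linked
            value = entity(value)     # the logic each entity executes
        return value
    return task

task = generate_task({"entities": ["double", "inc"]})
result = task(10)  # 10 -> 20 -> 21
```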

  10. Multi-core System Architecture for Safety-critical Control Applications

    DEFF Research Database (Denmark)

    Li, Gang

    …Partitioning architecture is employed to provide sufficient temporal and spatial isolation between components with different SILs in the multi-core architecture, aiming to support modular certification. It prevents failure propagation between isolated components. The dissertation focuses… for the software executing on top. Software architecture design concentrates on software multi-core architectures, a multi-core real-time separation kernel and a paravirtualized hypervisor (trusted computing base). From a safety point of view, the multi-core system architecture shall be simple and certifiable…Visor, targeting minimized code size, overhead and complexity. It provides isolation between its hosting virtual machines (e.g., general-purpose operating systems, real-time kernels or bare-metal applications). The proposed multi-core hardware and software architectures are evaluated by the ARTEMIS project RECOMP…

  11. Preliminaries on core image analysis using fault drilling samples; Core image kaiseki kotohajime (danso kussaku core kaisekirei)

    Energy Technology Data Exchange (ETDEWEB)

    Miyazaki, T.; Ito, H. [Geological Survey of Japan, Tsukuba (Japan)

    1996-05-01

    This paper introduces examples of image data analysis on fault drilling samples. The paper describes the following matters: the core samples used in the analysis were obtained from wells drilled through the Nojima fault, which moved in the Hyogoken-Nanbu Earthquake; the CORESCAN system made by DMT Corporation, Germany, used to acquire the image data, consists of a CCD camera, a light source and core rotation mechanism, and a personal computer, with a resolution of about 5 pixels/mm in both the axial and circumferential directions and 24-bit full color; with respect to the opening fractures in core samples collected using constant-azimuth coring, it was possible to derive values of the opening width, inclination angle, and travel from the image data by using commercially available personal computer software; and comparison of this core image with the BHTV record and the hydrophone VSP record shows that travel and inclination obtained from the BHTV record agree well with those obtained from the core image. 4 refs., 4 figs.

  12. Software Geometry in Simulations

    Science.gov (United States)

    Alion, Tyler; Viren, Brett; Junk, Tom

    2015-04-01

    The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other Liquid Argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it be a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way which allows multiple authors to collaborate easily. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett Viren is the author of the framework discussed here, the General Geometry Description (GGD).

  13. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter;

    2016-01-01

    Software process improvement (SPI) is around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...

  14. Software configuration management

    International Nuclear Information System (INIS)

    Software Configuration Management is directed towards identifying system configuration at specific points of its life cycle, so as to control changes to the configuration and to maintain the integrity and traceability of the configuration throughout its life. SCM functions and tasks are presented in the paper

  15. Software system dependability

    International Nuclear Information System (INIS)

    Development of software with very high dependability constraints to drive or monitor critical systems must comply with stringent and complex requirements, to ensure the dependability level achieved and to give confidence to the involved people (development team, verification team, customers, control or certification bodies, users and other stakeholders). (authors)

  16. Iterative software kernels

    Energy Technology Data Exchange (ETDEWEB)

    Duff, I.

    1994-12-31

    This workshop focuses on kernels for iterative software packages. Specifically, the three speakers discuss various aspects of sparse BLAS kernels. Their topics are: "Current status of user-level sparse BLAS"; "Current status of the sparse BLAS toolkit"; and "Adding matrix-matrix and matrix-matrix-matrix multiply to the sparse BLAS toolkit".
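The user-level sparse BLAS kernels under discussion center on operations such as sparse matrix-vector multiply. A minimal sketch in compressed sparse row (CSR) form (illustrative only; not the toolkit's actual API):

```python
def csr_matvec(values, col_idx, row_ptr, x):
    """y = A @ x for a matrix stored in compressed sparse row (CSR) form."""
    y = [0.0] * (len(row_ptr) - 1)
    for i in range(len(y)):
        # Nonzeros of row i live in values[row_ptr[i]:row_ptr[i+1]]
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

# A = [[2, 0, 1],
#      [0, 3, 0]]
values  = [2.0, 1.0, 3.0]
col_idx = [0, 2, 1]
row_ptr = [0, 2, 3]
y = csr_matvec(values, col_idx, row_ptr, [1.0, 1.0, 1.0])  # [3.0, 3.0]
```

Production sparse BLAS implementations differ mainly in storage formats and blocking, not in this basic access pattern.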

  17. The FARE Software

    Science.gov (United States)

    Pitarello, Adriana

    2015-01-01

    This article highlights the importance of immediate corrective feedback in tutorial software for language teaching in an academic learning environment. We aim to demonstrate that, rather than simply reporting on the performance of the foreign language learner, this feedback can act as a mediator of students' cognitive and metacognitive activity.…

  18. Writing testable software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Knirk, D. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This tutorial identifies common problems in analyzing problem requirements and constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.

  19. Software management issues

    International Nuclear Information System (INIS)

    The difficulty of managing the software in large HEP collaborations appears to become progressively worse with each new generation of detector. If one were to extrapolate to the SSC, it would become a major problem. This paper explores the possible causes of the difficulty and makes suggestions on what corrective actions should be taken.

  20. Software Carpentry: lessons learned

    Science.gov (United States)

    Wilson, Greg

    2016-01-01

    Since its start in 1998, Software Carpentry has evolved from a week-long training course at the US national laboratories into a worldwide volunteer effort to improve researchers' computing skills. This paper explains what we have learned along the way, the challenges we now face, and our plans for the future. PMID:24715981

  1. Automatisches Software-Update

    OpenAIRE

    Clauß, Matthias; Fischer, Günther

    2003-01-01

    A new service is presented for self-managed software updates of PC systems running Red Hat Linux 7.3. The service is based on the YARU procedure (Yum based Automatic RPM Update), a component of the Linux administration technology deployed at the URZ.

  2. Software Carpentry: Lessons Learned

    OpenAIRE

    Wilson, Greg

    2013-01-01

    Over the last 15 years, Software Carpentry has evolved from a week-long training course at the US national laboratories into a worldwide volunteer effort to raise standards in scientific computing. This article explains what we have learned along the way, the challenges we now face, and our plans for the future.

  3. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development. PMID:10131419

  4. Software for noise measurements

    International Nuclear Information System (INIS)

    The CURS program library, comprising 38 Fortran programs designed for processing discrete experimental data in the form of random or deterministic periodic processes, is described. The library is based on the modular construction principle, which allows one to create from it any set of programs to solve tasks related to NPP operation, and to develop special software.

  5. Overview of NWIS software

    International Nuclear Information System (INIS)

    The Nuclear Weapons Identification System (NWIS) is a system that performs radiation signature measurements on objects such as nuclear weapons components. NWIS consists of a 252Cf fission source, radiation detectors and associated analog electronics, data acquisition boards, and a computer running Windows NT and the application software. NWIS uses signal processing techniques to produce a radiation signature from the radiation emitted by the object. The signature can be stored and later compared to another signature to determine whether two objects are similar. A library of such signatures can be used to identify objects in closed containers as well as determine attributes such as fissile mass and, in some cases, enrichment. There are three executables built from the software: (1) a Windows NT kernel-mode device driver; (2) a data acquisition application; and (3) a data analysis application. The device driver is the interface between the NWIS data acquisition boards and the remainder of the software. The data acquisition executable is the user's tool for making an NWIS measurement; it has limited data display abilities. The data analysis executable is the user's tool for displaying an NWIS measurement, including matching it to other NWIS measurements. A user's manual for the software is included.

  6. Banded transformer cores

    Science.gov (United States)

    Mclyman, C. W. T. (Inventor)

    1974-01-01

    A banded transformer core is formed by positioning a pair of mated, similar core halves on a supporting pedestal. The core halves are encircled with a strap, selectively applying tension whereby a compressive force is applied to the core edge for reducing the innate air gap. A dc magnetic field is employed in supporting the core halves during initial phases of the banding operation, while an ac magnetic field subsequently is employed for detecting dimension changes occurring in the air gaps as tension is applied to the strap.

  7. Are Academic Programs Adequate for the Software Profession?

    Science.gov (United States)

    Koster, Alexis

    2010-01-01

    According to the Bureau of Labor Statistics, close to 1.8 million people, or 77% of all computer professionals, were working in the design, development, deployment, maintenance, and management of software in 2006. The ACM [Association for Computing Machinery] model curriculum for the BS in computer science proposes that about 42% of the core body…

  8. Framework of Software Quality Management Using Object oriented Software Agent

    Directory of Open Access Journals (Sweden)

    Anand Pandey

    2013-01-01

    Development of software is a scientific and economic problem, particularly the design of complex systems, which requires evolving methods and approaches. Agent technology is currently one of the most active and vibrant areas of IT research and development. Object-oriented Software Engineering (OOSE) has become an active area of research in recent years. In this paper, we review a framework for software quality management that uses object-oriented methodology concepts for software agents. The software specification acts as a bridge between customers, architects, software developers, and testers. Using the object-oriented concept of a software agent and its standards may offer benefits even if the system is implemented without an object-based language or framework. We propose and discuss a software agent framework, specifically to support software quality management. Although still in its initial phases, research indicates some promise in enabling software developers to meet market expectations and produce projects timeously, within budget, and to users' satisfaction. However, the software quality management environment has also changed and is continuously evolving. Currently, software projects are developed and deployed in distributed, pervasive, and collaborative environments, and their quality should be managed by applying the best standards. From the point of view of software engineering, this framework and its standards apply to developing software projects. We discuss the standards and benefits that can be gained by using object-oriented concepts, and where the concepts require further development.

  9. Generic Kalman Filter Software

    Science.gov (United States)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from months to weeks. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains a code for a generic Kalman filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. 
The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on
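The predict/update cycle that such a library standardizes can be sketched as follows. This is a scalar illustration in Python for brevity, not the GKF's ANSI C interface; all names, parameters, and the example data are invented:

```python
# Scalar (1-D) linear Kalman filter sketch; identifiers are generic,
# not the GKF library's actual data structures or subfunctions.

def kf_predict(x, P, F, Q):
    """Propagate state estimate x and variance P one step."""
    return F * x, F * P * F + Q

def kf_update(x, P, z, H, R):
    """Fold in measurement z with noise variance R via the Kalman gain."""
    K = P * H / (H * P * H + R)       # gain
    x = x + K * (z - H * x)           # corrected state
    P = (1.0 - K * H) * P             # corrected variance
    return x, P

x, P = 0.0, 1.0                       # initial estimate and variance
for z in [1.1, 0.9, 1.05]:            # noisy measurements of a constant ~1.0
    x, P = kf_predict(x, P, F=1.0, Q=0.01)
    x, P = kf_update(x, P, z, H=1.0, R=0.1)
# x approaches ~1.0 while the variance P shrinks
```

In the library's design, the propagation and update steps are generic, while the equivalents of F, H, Q, and R come from the user-supplied application-specific subfunctions.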

  10. Software Polarization Spectrometer "PolariS"

    OpenAIRE

    Mizuno, Izumi; Kameno, Seiji; Kano, Amane; Kuroo, Makoto; Nakamura, Fumitaka; KAWAGUCHI, Noriyuki; Shibata, Katsunori M.; Kuji, Seisuke; Kuno, Nario

    2014-01-01

    We have developed a software-based polarization spectrometer, PolariS, to acquire full-Stokes spectra with a very high spectral resolution of 61 Hz. The primary aim of PolariS is to measure the magnetic fields in dense star-forming cores by detecting the Zeeman splitting of molecular emission lines. The spectrometer consists of a commercially available digital sampler and a Linux computer. The computer is equipped with a graphics processing unit (GPU) to process FFT and cross-correlation usin...
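The full-Stokes computation such a spectrometer performs can be sketched per channel from the two complex polarization spectra. This is an illustrative sketch assuming linear feeds, not PolariS code; sign conventions for U and V differ between backends:

```python
# Per-channel full-Stokes from complex voltage spectra of two feeds.
# Illustrative only: assumes linear feeds X and Y; sign conventions vary.

def stokes(X, Y):
    """Return (I, Q, U, V) spectra from complex spectra X, Y."""
    I = [abs(a) ** 2 + abs(b) ** 2 for a, b in zip(X, Y)]
    Q = [abs(a) ** 2 - abs(b) ** 2 for a, b in zip(X, Y)]
    U = [2 * (a * b.conjugate()).real for a, b in zip(X, Y)]
    V = [-2 * (a * b.conjugate()).imag for a, b in zip(X, Y)]
    return I, Q, U, V

# Two channels: feeds in phase, then Y lagging X by 90 degrees
X = [1 + 0j, 0 + 1j]
Y = [1 + 0j, 1 + 0j]
I, Q, U, V = stokes(X, Y)  # I=[2,2], Q=[0,0], U=[2,0], V=[0,-2]
```

In a real spectrometer these per-channel products are accumulated over many FFT segments before the Stokes combination, which is what the GPU cross-correlation stage provides.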

  11. Flight Software Math Library

    Science.gov (United States)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter orders, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation, and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance, Navigation, and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the libraries to be maintained with the same strategy used in their initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as other environments like the GN&C analysts' simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.

  12. PWR core monitoring system and benchmarking

    International Nuclear Information System (INIS)

    The PWR Power Shape Monitoring System (PSMS) provides site engineers with new capabilities for monitoring and predicting core power distributions. These capabilities can lead to increased plant output as a result of greater operating margins, better load maneuvering, earlier detection of anomalies, and improved fuel reliability. The heart of the PSMS is a nodal code (NODEP-2/THERM-P) that computes the 3-D core power distribution. This code is coupled to a simplified nodal version of the COBRA-IIIC/MIT-2 thermal-hydraulic model to determine the DNBR. These calculations can be completed in about 30 seconds on a PRIME-750 minicomputer. Activation of the calculations and review of the results is through user-friendly interactive software that can be tailored to the requirements and capabilities of different categories of users through table-driven menus. The PSMS provides unique advances over core power monitoring systems based purely on measurements. The PSMS approach permits the three-dimensional core simulation model to be routinely corrected with in-core/ex-core measurements while simultaneously identifying consistent instrument errors.

  13. Software Development Practices, Software Complexity, and Software Maintenance Performance: A Field Study

    OpenAIRE

    Banker, Rajiv D.; Davis, Gordon B.; Sandra A. Slaughter

    1998-01-01

    Software maintenance claims a large proportion of organizational resources. It is thought that many maintenance problems derive from inadequate software design and development practices. Poor design choices can result in complex software that is costly to support and difficult to change. However, it is difficult to assess the actual maintenance performance effects of software development practices because their impact is realized over the software life cycle. To estimate the impact of develop...

  14. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is an exposition of a viable, high-quality safety-critical software development process. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verification of code against software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design, using a modular design technique so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

  15. TOWARD SOFTWARE ENGINEERING PRINCIPLES BASED ON ISLAMIC ETHICAL VALUES

    Directory of Open Access Journals (Sweden)

    shihab A. Hameed

    2010-09-01

    Software is the core of computer-based applications, which have become an essential part of critical control systems, health and human life safeguard systems, financial and banking systems, educational and other systems. It requires software engineers who are qualified professionally and ethically. Literature review and survey results show that software engineering professionals face several ethics-related problems which are costly, harmful, and affect a high ratio of people. Professional organizations like ACM, IEEE, ABET and CSAC have established codes of ethics to help software engineering professionals understand and manage their ethical responsibilities. Islam considers ethics an essential factor in building individuals, communities and society. Islamic ethics are a set of moral principles and guidance that recognizes right behavior from wrong, which are comprehensive, stable, fair, and have historically proven successful in building an ethically great society. The 1.3 billion Muslims, with tens of thousands of software engineers among them, should have an effective role in software development, which requires them to understand and implement ethics, especially Islamic ethics, in their work. This paper is a framework for modeling software engineering principles. It focuses mainly on adopting a new version of software engineering principles based on Islamic ethical values.

  16. Computer games and software engineering

    CERN Document Server

    Cooper, Kendra M L

    2015-01-01

    Computer games represent a significant software application domain for innovative research in software engineering techniques and technologies. Game developers, whether focusing on entertainment-market opportunities or game-based applications in non-entertainment domains, thus share a common interest with software engineers and developers on how to best engineer game software. Featuring contributions from leading experts in software engineering, the book provides a comprehensive introduction to computer game software development that includes its history as well as emerging research on the inte

  17. Terminological recommendations for software localization

    Directory of Open Access Journals (Sweden)

    Klaus-Dirk Schmitz

    2012-08-01

    Full Text Available After an explosive growth of data processing and software starting at the beginning of the 1980s, the software industry shifted toward a strong orientation in non-US markets at the beginning of the 1990s. Today we see the global marketing of software in almost all regions of the world. Since software is no longer used by IT experts only, and since European and national regulations require user interfaces, manuals and documentation to be provided in the language of the customer, the market for software translation, i.e. for software localization, is the fastest growing market in the translation business.

  18. Software Architecture Design Reasoning

    Science.gov (United States)

    Tang, Antony; van Vliet, Hans

    Despite recent advancements in software architecture knowledge management and design rationale modeling, industrial practice is behind in adopting these methods. The lack of empirical proof and the lack of a practical process that can be easily incorporated by practitioners are some of the hindrances to adoption. In particular, a process to support systematic design reasoning is not available. To rectify this issue, we propose a design reasoning process to help architects cope with an architectural design environment where design concerns are cross-cutting and diversified. We use an industrial case study to validate that the design reasoning process can help improve the quality of software architecture design. The results indicate that associating design concerns and identifying design options are important steps in design reasoning.

  19. Chemical recognition software

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, J.S.; Trahan, M.W.; Nelson, W.E.; Hargis, P.J. Jr.; Tisone, G.C.

    1994-12-01

    We have developed a capability to make real time concentration measurements of individual chemicals in a complex mixture using a multispectral laser remote sensing system. Our chemical recognition and analysis software consists of three parts: (1) a rigorous multivariate analysis package for quantitative concentration and uncertainty estimates, (2) a genetic optimizer which customizes and tailors the multivariate algorithm for a particular application, and (3) an intelligent neural net chemical filter which pre-selects from the chemical database to find the appropriate candidate chemicals for quantitative analyses by the multivariate algorithms, as well as providing a quick-look concentration estimate and consistency check. Detailed simulations using both laboratory fluorescence data and computer synthesized spectra indicate that our software can make accurate concentration estimates from complex multicomponent mixtures, even when the mixture is noisy and contaminated with unknowns.
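The rigorous multivariate analysis step can be illustrated with classical least-squares unmixing, where the measured spectrum is modeled as a concentration-weighted sum of reference spectra. This is a toy two-component sketch under that assumption, not the package described; the spectra and values are invented:

```python
# Illustrative classical least-squares unmixing (not the cited package):
# measured spectrum = sum over chemicals of concentration * reference spectrum.

def unmix(refs, measured):
    """Solve min ||A c - m|| for two reference spectra via normal equations."""
    a, b = refs
    aa = sum(x * x for x in a)
    bb = sum(x * x for x in b)
    ab = sum(x * y for x, y in zip(a, b))
    am = sum(x * m for x, m in zip(a, measured))
    bm = sum(y * m for y, m in zip(b, measured))
    det = aa * bb - ab * ab            # Gram determinant; nonzero if refs independent
    return ((am * bb - bm * ab) / det, (bm * aa - am * ab) / det)

ref1 = [1.0, 0.0, 1.0]
ref2 = [0.0, 1.0, 1.0]
measured = [2.0, 3.0, 5.0]            # exactly 2*ref1 + 3*ref2
c1, c2 = unmix((ref1, ref2), measured)  # (2.0, 3.0)
```

A full implementation would also propagate measurement noise into concentration uncertainties, which is the role of the uncertainty estimates the abstract mentions.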

  20. Chemical recognition software

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, J.S.; Trahan, M.W.; Nelson, W.E.; Hargis, P.H. Jr.; Tisone, G.C.

    1994-06-01

    We have developed a capability to make real time concentration measurements of individual chemicals in a complex mixture using a multispectral laser remote sensing system. Our chemical recognition and analysis software consists of three parts: (1) a rigorous multivariate analysis package for quantitative concentration and uncertainty estimates, (2) a genetic optimizer which customizes and tailors the multivariate algorithm for a particular application, and (3) an intelligent neural net chemical filter which pre-selects from the chemical database to find the appropriate candidate chemicals for quantitative analyses by the multivariate algorithms, as well as providing a quick-look concentration estimate and consistency check. Detailed simulations using both laboratory fluorescence data and computer synthesized spectra indicate that our software can make accurate concentration estimates from complex multicomponent mixtures, even when the mixture is noisy and contaminated with unknowns.

  1. TOUGH2 software qualification

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  2. TOUGH2 software qualification

    International Nuclear Information System (INIS)

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.
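
    TOUGH2 itself solves coupled multiphase, multicomponent flow with an integral finite-difference method; the sketch below is only a much-reduced illustration of that class of computation, a single explicit finite-volume update for one-dimensional heat conduction with made-up parameters, not TOUGH2 code.

```python
def diffuse_step(T, alpha, dx, dt):
    """One explicit finite-volume update of 1D heat conduction,
    dT/dt = alpha * d2T/dx2, with fixed (Dirichlet) boundary cells.
    Illustrative only -- TOUGH2's integral finite differences handle
    multiphase flow on unstructured grids, far beyond this toy."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme stability limit"
    new = T[:]
    for i in range(1, len(T) - 1):
        new[i] = T[i] + r * (T[i-1] - 2*T[i] + T[i+1])
    return new

# Hot cell in the middle of a cold domain, stepped forward in time;
# heat spreads symmetrically toward the fixed cold boundaries.
T = [0.0]*10 + [100.0] + [0.0]*10
for _ in range(100):
    T = diffuse_step(T, alpha=1e-6, dx=0.01, dt=25.0)
```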

  3. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: Choosing the right architecture for the market – laboratory, military, or commercial Hardware platforms – FPGAs, GPPs, specialized and hybrid devices Standardization efforts to ens...

  4. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality...... requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...... in conjunction with informal roles and relationships such as clan-like control inherent in agile development. Overall, the study demonstrates that, if appropriately applied, communication technologies can significantly support distributed, agile practices by allowing concurrent enactment of both formal...

  5. IAU SOFA Software

    Science.gov (United States)

    Hohenkerk, Catherine

    2012-08-01

    SOFA (Standards Of Fundamental Astronomy) software is a resource for astronomers, provided via IAU Division 1. The library contains the latest (IAU approved) algorithms for Earth attitude - precession, nutation, Earth rotation angle, sidereal time. Does your software use time? Need to convert between, for example, UTC, UT1, or TT? Then SOFA has all you need. Using SOFA you can convert between FK5 and Hipparcos positions, between geodetic and geocentric coordinates, as well as between the BCRS (ICRS) or J2000.0 and both the celestial and terrestrial reference systems. All routines, Fortran or ANSI C, are available as source code or as part of a library. Visit our website at http://www.iausofa.org/ to find out more and download what you need.
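
    SOFA's Fortran and ANSI C routines handle these conversions rigorously; purely as a back-of-envelope illustration of the fixed offsets involved (and assuming the leap-second count of 37 in force since 2017, which SOFA itself looks up properly), UTC relates to TT as follows.

```python
# Illustrative time-scale arithmetic, NOT SOFA code: SOFA's routines
# track leap seconds and handle the UTC quasi-seconds correctly.
TT_MINUS_TAI = 32.184   # seconds, fixed by definition of TT
TAI_MINUS_UTC = 37.0    # leap-second count valid since 2017-01-01 (assumed)

def utc_to_tt_seconds(utc_seconds):
    """Convert a UTC instant (in seconds) to Terrestrial Time (TT):
    TT = TAI + 32.184 s, and TAI = UTC + (accumulated leap seconds)."""
    tai = utc_seconds + TAI_MINUS_UTC
    return tai + TT_MINUS_TAI

print(utc_to_tt_seconds(0.0))  # ≈ 69.184: TT runs 69.184 s ahead of UTC
```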

  6. Design of software platform based on linux operating system for γ-spectrometry instrument

    International Nuclear Information System (INIS)

    This paper describes the design of a γ-spectrometry instrument software platform based on the s3c2410a processor with an arm920t core, with emphasis on the integrated application of the embedded Linux operating system, the yaffs file system and the qt/embedded GUI development library. It presents a new software platform for portable γ-measurement instruments. (authors)

  7. Spreadsheet Auditing Software

    OpenAIRE

    Nixon, David; O'Hara, Mike

    2010-01-01

    It is now widely accepted that errors in spreadsheets are both common and potentially dangerous. Further research has taken place to investigate how frequently these errors occur, what impact they have, how the risk of spreadsheet errors can be reduced by following spreadsheet design guidelines and methodologies, and how effective auditing of a spreadsheet is in the detection of these errors. However, little research exists to establish the usefulness of software tools in the auditing of spre...

  8. Addressing Software Security

    Science.gov (United States)

    Bailey, Brandon

    2015-01-01

    Historically, security within organizations was thought of as an IT function (web sites/servers, email, workstation patching, etc.). The threat landscape has since evolved (script kiddies, hackers, Advanced Persistent Threats (APT), nation states, etc.), and the attack surface has expanded as networks became interconnected. Security posture factors include the network layer (routers, firewalls, etc.), computer network defense (IPS/IDS, sensors, continuous monitoring, etc.), industrial control systems (ICS), and software security (COTS, FOSS, custom, etc.).

  9. Image Processing Software

    Science.gov (United States)

    1992-01-01

    To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.

  10. Software Integration Final Report

    OpenAIRE

    Bégin, Marc-Elian; Merifield, Louise

    2012-01-01

    This document reports on the integration activities performed during the whole StratusLab project, focusing on lessons learned and identifying the areas of improvement, throughout the different releases of the StratusLab software, including the process used during the development, integration and test. The goal of this document is also to provide lessons that could be applied as the project transitions to an open source consortium, such that it continues to improve its performance.

  11. Improving Agile Software Practice

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte

    2006-01-01

    Software process improvement in small and agile organizations is often problematic, but achieving good SPI assessments can still be necessary to stay in the market or to meet the demands of multinational owners. The traditional norm-driven, centralized and control-centered improvement approaches have ...... analysis to support designing processes that balance the firm culture with that of the CMM model and achieving real change in practice by building commitment and engagement through participative SPI planning and design....

  12. Software fault tolerance

    OpenAIRE

    Strigini, Lorenzo

    1990-01-01

    Software design faults are a cause of major concern, and their relative importance is growing as techniques for tolerating hardware faults gain wider acceptance. The application of fault tolerance to design faults is both increasing, in particular in some life-critical applications, and controversial, due to the imperfect state of knowledge about it. This paper surveys the existing applications and research results, to help the reader form an initial picture of the existing possibilities, and...

  13. Office software Individual coaching

    CERN Document Server

    HR Department

    2010-01-01

    If one or several particular topics cause you sleepless nights, you can get the help of our trainer, who will come to your workplace for one or more one-hour slots. All fields in which our trainer can help are detailed in the course descriptions in our training catalogue (Microsoft Office software, Adobe applications, i-applications, etc.). Please discover these new courses in our catalogue! Tel. 74924

  14. ThermalTracker Software

    Energy Technology Data Exchange (ETDEWEB)

    2016-08-10

    The software processes recorded thermal video and detects the flight tracks of birds and bats that passed through the camera's field of view. The output is a set of images that show complete flight tracks for any detections, with the direction of travel indicated and the thermal image of the animal delineated. A report of the descriptive features of each detected track is also output in the form of a comma-separated value text file.
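
    As a toy sketch of this kind of processing (not the ThermalTracker implementation): threshold each frame to find warm pixels, then link detections across frames into a track by nearest neighbour; the track's first and last positions give the direction of travel.

```python
# Toy thermal-track sketch with a synthetic 3-frame clip; all frame data
# and thresholds here are illustrative, not from ThermalTracker.

def detect(frame, threshold):
    """Return (row, col) positions of pixels warmer than threshold."""
    return [(r, c) for r, row in enumerate(frame)
                   for c, v in enumerate(row) if v > threshold]

def link_track(frames, threshold):
    """Greedily link one warm target across frames by nearest neighbour."""
    track = []
    for frame in frames:
        hits = detect(frame, threshold)
        if not hits:
            continue
        if track:
            pr, pc = track[-1]
            hits.sort(key=lambda h: (h[0] - pr)**2 + (h[1] - pc)**2)
        track.append(hits[0])
    return track

# Synthetic clip: a single warm target moving left to right along row 2.
frames = []
for col in (1, 2, 3):
    f = [[0.0] * 5 for _ in range(5)]
    f[2][col] = 9.0
    frames.append(f)
track = link_track(frames, threshold=5.0)
print(track)  # [(2, 1), (2, 2), (2, 3)] -> travelling left to right
```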

  15. Short Software Descriptions

    OpenAIRE

    Makowski, M.; Rogowski, T.

    1991-01-01

    This paper briefly presents the software for interactive decision support that was developed in 1990-1991 within the Contracted Study Agreement between the System and Decision Sciences Program at IIASA and several Polish scientific institutions, namely: Institute of Automatic Control (Warsaw University of Technology); Institute of Computing Science (Technical University of Poznań); Institute of Informatics (Warsaw University); and Systems Research Institute of the Polish Academy of Sciences...

  16. Open Source Software Acquisition

    OpenAIRE

    Holck, Jesper; Pedersen, Mogens Kühn; Larsen, Michael Holm

    2005-01-01

    Lately we have seen a growing interest from both public and private organisations to adopt Open Source Software (OSS), not only for a few, specific applications but also on a more general level throughout the organisation. As a consequence, the organisations’ decisions on adoption of OSS are becoming increasingly more important and complex. We present three perspectives organisations can employ in their decisions: seeing OSS acquisition as a business case, as COTS acquisition, and as architec...

  17. Energy-aware Software

    OpenAIRE

    Ardito, Luca

    2014-01-01

    Luca Ardito has focused his PhD on studying how to identify and to reduce the energy consumption caused by software. The project concentrates on the application level, with an experimental approach to discover and modify characteristics that waste energy. We can define five research goals: RG1. Is it possible to measure the energy consumption of an application? Measuring the energy consumption of an electronic device (PC, mobile phone, etc.) is straightforward, but several applications coexis...

  18. Software Delivery Concept

    OpenAIRE

    Maula, Marjo

    2014-01-01

    The purpose of this study was to create a software delivery concept for ERP in an IT project, comparable metrics for project management, and an improvement list of future actions. Both qualitative and quantitative research methodologies were utilized in this study. The qualitative research data consist of six in-depth interviews. The interviews were done with project people and people who are working in global projects. Quantitative research data were gathered with a quest...

  19. Continuous software delivery

    OpenAIRE

    KRMAVNAR, NINA

    2015-01-01

    The main purpose of the thesis is to demonstrate one of the best possible approaches to an automated continuous delivery process as it relates to certain application types. The introductory part presents the main reason for choosing the subject, along with a few examples of why such an approach now seems necessary to keep pace with the competition. The following chapters discuss the basics of software delivery, starting with configuration and version control manage...

  20. Generalized Software Security Framework

    OpenAIRE

    Smriti Jain; Maya Ingle

    2011-01-01

    Security of information has become a major concern in today's digitized world. As a result, effective techniques to secure information are required. The most effective way is to incorporate security in the development process itself, thereby resulting in a secured product. In this paper, we propose a framework that enables security to be included in the software development process. The framework consists of three layers, namely the control layer, aspect layer and development layer. The control la...

  1. Mutate my software

    OpenAIRE

    Micallef, Mark; Colombo, Christian; Duca, Edward

    2015-01-01

    Computer systems run the world and are found in everything from fridges to hospitals. Every application needs testing, which is expensive and time-consuming. Dr Mark Micallef and Dr Christian Colombo from the PEST research group (Faculty of ICT, University of Malta) tell THINK about a new technique which could make testing easier and more consistent. Illustrations by NO MAD. http://www.um.edu.mt/think/mutate-my-software/

  2. Classifications of Software Transfers

    OpenAIRE

    Wohlin, Claes; Smite, Darja

    2012-01-01

    Many companies have development sites around the globe. This inevitably means that development work may be transferred between the sites. This paper defines a classification of software transfer types; it divides transfers into three main types: full, partial and gradual transfers to describe the context of a transfer. The differences between transfer types, and hence the need for a classification, are illustrated with staffing curves for two different transfer types. The staffing curves are ...

  3. Peppy: Proteogenomic Search Software

    OpenAIRE

    Risk, Brian A.; Spitzer, Wendy J; Giddings, Morgan C

    2013-01-01

    Proteogenomic searching is a useful method for identifying novel proteins, annotating genes and detecting peptides unique to an individual genome. The approach, however, can be laborious, as it often requires search segmentation and the use of several unintegrated tools. Furthermore, many proteogenomic efforts have been limited to small genomes, as large genomes can prove impractical due to the required amount of computer memory and computation time. We present Peppy, a software tool designed...

  4. JPI UML Software Modeling

    OpenAIRE

    Cristian Vidal Silva; Leopoldo López; Rodolfo Schmal; Rodolfo Villarroel; Miguel Bustamante; Víctor Rea Sanchez

    2015-01-01

    Aspect-Oriented Programming (AOP) extends object-oriented programming (OOP) with aspects to modularize crosscutting behavior on classes, by means of aspects that advise base code at the occurrence of join points according to pointcut rules. However, join points introduce dependencies between aspects and base code, a great issue for achieving an effective independent development of software modules. Join Point Interfaces (JPI) represent join points using interfaces between classes and aspect, t...

  5. Generative Software Engineering

    OpenAIRE

    Jézéquel, Jean-Marc

    2007-01-01

    Researching evermore abstract and powerful ways of composing programs is the meat of software engineering for half a century. Important early steps were subroutines (to encapsulate actions) and records (to encapsulate data). A large step forward came with the introduction of the object-oriented concepts (classes, subclasses and virtual methods) where classes can encapsulate both data and behaviors in a very powerful, but still flexible, way. For a long time, these concepts dominated the scene...

  6. Engineering Autonomous Driving Software

    OpenAIRE

    Berger, Christian; Rumpe, Bernhard

    2014-01-01

    A larger number of people with heterogeneous knowledge and skills running a project together needs an adaptable, target- and skill-specific engineering process. This especially holds for a project to develop a highly innovative, autonomously driving vehicle to participate in the 2007 DARPA Urban Challenge. In this contribution, we present essential elements of a software and systems engineering process to develop a so-called artificial intelligence capable of driving autonomously in complex u...

  7. Querying Versioned Software Repositories

    OpenAIRE

    Christopeit, Dietrich; Böhlen, Michael; Kanne, Carl-Christian; Mazeika, Arturas

    2011-01-01

    Large parts of today's data is stored in text documents that undergo a series of changes during their lifetime. For instance during the development of a software product the source code changes frequently. Currently, managing such data relies on version control systems (VCSs). Extracting information from large documents and their different versions is a manual and tedious process. We present Qvestor, a system that allows one to declaratively query docume...

  8. Querying versioned software repositories

    OpenAIRE

    Christopeit, Dietrich; Böhlen, Michael H.; Kanne, Carl-Christian; Mazeika, Arturas

    2011-01-01

    Large parts of today’s data is stored in text documents that undergo a series of changes during their lifetime. For instance during the development of a software product the source code changes frequently. Currently, managing such data relies on version control systems (VCSs). Extracting information from large documents and their different versions is a manual and tedious process. We present Qvestor, a system that allows one to declaratively query documents. It leverages information about the str...

  9. Software Defined Networking

    OpenAIRE

    Roncero Hervás, Óscar

    2014-01-01

    Software Defined Networks (SDN) is a paradigm in which routing decisions are taken by a control layer. In contrast to conventional network structures, the control plane and forwarding plane are separated and communicate through standard protocols like OpenFlow. Historically, network management was based on a layered approach, each one isolated from the others. SDN proposes a radically different approach by bringing together the management of all these layers into a single controller. It is th...

  10. Expertise in Software Design

    OpenAIRE

    Sonnentag, Sabine; Niessen, Cornelia; Volmer, Judith

    2006-01-01

    In this chapter, we review research evidence on expertise in software design, computer programming, and related tasks. Research in this domain is particularly interesting because it refers both to rather general features and processes associated with expertise (e.g., knowledge representation, problem-solving strategies) and to specific characteristics of high performers in an economically relevant real-world setting. Therefore, in this chapter we draw on literature from various fields, mainly ...

  11. Modeling software design diversity

    OpenAIRE

    Littlewood, B.; Popov, P. T.; Strigini, L.

    2001-01-01

    Design diversity has been used for many years now as a means of achieving a degree of fault tolerance in software-based systems. Whilst there is clear evidence that the approach can be expected to deliver some increase in reliability compared with a single version, there is not agreement about the extent of this. More importantly, it remains difficult to evaluate exactly how reliable a particular diverse fault-tolerant system is. This difficulty arises because assumptions of independence of f...
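
    One way to see why the independence assumption fails is the classic difficulty-function argument: if theta(x) is the probability that a randomly developed version fails on demand x, then the probability that two independently developed versions both fail is E[theta^2], which is at least (E[theta])^2 and strictly greater whenever difficulty varies across demands. A toy numerical check (the theta values below are illustrative only):

```python
# Illustrative check that diverse versions fail together more often than
# naive independence predicts, whenever some demands are harder than others.
thetas = [0.001, 0.001, 0.2]   # per-demand failure probability, 3 demand types

mean = sum(thetas) / len(thetas)           # E[theta]
both_fail = sum(t * t for t in thetas) / len(thetas)   # E[theta^2]

# Naive independence would predict (E[theta])^2 for joint failure,
# but the true joint failure probability is the larger E[theta^2].
print(mean**2, both_fail)   # ~0.00453 vs ~0.01333
```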

  12. Global Software Development

    DEFF Research Database (Denmark)

    Søderberg, Anne-Marie; Krishna, S.; Bjørn, Pernille

    2013-01-01

    accounts of close collaboration processes in two large and complex projects, where off-shoring of software development is moved to a strategic level, we found that the vendor was able to establish a strategic partnership through long-term engagement with the field of banking and insurance as well....... The article draws attention to the important collaborative work done by people who are able to span boundaries in the complex organizational set-up of global IT development projects....

  13. Standard software for CAMAC

    International Nuclear Information System (INIS)

    The NIM Committee (National Instrumentation Methods Committee) of the U.S. Department of Energy and the ESONE Committee of European Laboratories have jointly specified standard software for use with CAMAC. Three general approaches were followed: the definition of a language called IML for use in CAMAC systems, the definition of a standard set of subroutine calls, and real-time extensions to the BASIC language. This paper summarizes the results of these efforts. 1 table

  14. Software is a directed multigraph (and so is software process)

    OpenAIRE

    Dabrowski, Robert; Stencel, Krzysztof; Timoszuk, Grzegorz

    2011-01-01

    For a software system, its architecture is typically defined as the fundamental organization of the system incorporated by its components, their relationships to one another and their environment, and the principles governing their design. If contributed to by the artifacts corresponding to the engineering processes that govern the system's evolution, the definition gets naturally extended into the architecture of software and software process. Obviously, as long as there were no software systems, ...
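
    As a minimal sketch of the multigraph view described above (artifact names and edge labels are illustrative, not from the paper): vertices are artifacts, and the same pair of artifacts may be connected by several differently labelled edges.

```python
# Minimal directed multigraph of software artifacts: parallel edges with
# distinct labels may connect the same ordered pair of vertices.
from collections import defaultdict

class ArtifactGraph:
    def __init__(self):
        self.edges = defaultdict(list)   # (src, dst) -> list of edge labels

    def add(self, src, dst, label):
        self.edges[(src, dst)].append(label)

    def labels(self, src, dst):
        return self.edges[(src, dst)]

g = ArtifactGraph()
g.add("Parser.java", "Lexer.java", "imports")
g.add("Parser.java", "Lexer.java", "tested-with")   # parallel edge
print(g.labels("Parser.java", "Lexer.java"))  # ['imports', 'tested-with']
```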

  15. Harnessing software development contexts to inform software process selection decisions

    OpenAIRE

    Jeners, Simona; O'Connor, Rory V.; Clake, Paul; Lichter, Horst; Lepmets, Marion; Buglione, Luigi

    2013-01-01

    peer-reviewed Software development is a complex process for which numerous approaches have been suggested. However, no single approach to software development has been met with universal acceptance, which is not surprising, as there are many different software development concerns. In addition, there are a multitude of other contextual factors that influence the choice of software development process and process management decisions. The authors believe it is important to de...

  16. Bridging the Gap Between Software Process and Software Development

    OpenAIRE

    Rouillé, Emmanuelle; Combemale, Benoit; Barais, Olivier; David, Touzet; Jézéquel, Jean-Marc

    2011-01-01

    National audience Model Driven Engineering (MDE) benefits software development (a.k.a. Model Driven Software Development) as well as software processes (a.k.a. Software Process Modeling). Nevertheless, the gap between processes and development is still too great. Indeed, information from processes is not always used to improve development and vice versa. For instance, it is possible to define the development tools used in a process description without linking them to the real tools. This p...

  17. How can software SMEs become medical device software SMEs

    OpenAIRE

    Mc Caffery, Fergal; Casey, Valentine; Mc Hugh, Martin

    2011-01-01

    peer-reviewed Today the amount of software content within medical devices has grown considerably and will continue to do so as the level of complexity of medical devices continues to increase. This is driven by the fact that software is introduced to produce sophisticated medical devices that would not be possible using only hardware. This therefore presents opportunities for software development SMEs to become medical device software development organisations. However, some obstacles need...

  18. ATLAS software configuration and build tool optimisation

    International Nuclear Information System (INIS)

    multi-core computing resources utilisation, and considerably improved software developer and user experience.

  19. Evidence of Absence software

    Science.gov (United States)

    Dalthorp, Daniel; Huso, Manuela M. P.; Dail, David; Kenyon, Jessica

    2014-01-01

    Evidence of Absence software (EoA) is a user-friendly application used for estimating bird and bat fatalities at wind farms and designing search protocols. The software is particularly useful in addressing whether the number of fatalities has exceeded a given threshold and what search parameters are needed to give assurance that thresholds were not exceeded. The software is applicable even when zero carcasses have been found in searches. Depending on the effectiveness of the searches, such an absence of evidence of mortality may or may not be strong evidence that few fatalities occurred. Under a search protocol in which carcasses are detected with nearly 100 percent certainty, finding zero carcasses would be convincing evidence that overall mortality rate was near zero. By contrast, with a less effective search protocol with low probability of detecting a carcass, finding zero carcasses does not rule out the possibility that large numbers of animals were killed but not detected in the searches. EoA uses information about the search process and scavenging rates to estimate detection probabilities to determine a maximum credible number of fatalities, even when zero or few carcasses are observed.
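
    EoA implements a full statistical treatment of search effectiveness and scavenging; the sketch below illustrates only the core intuition under a strong simplification (each carcass detected independently with a fixed probability g, which is not the EoA estimator): the largest fatality count still credible after finding zero carcasses.

```python
import math

def max_credible_fatalities(g, alpha=0.05):
    """Largest M with P(zero carcasses found | M fatalities) >= alpha,
    assuming each carcass is detected independently with probability g.
    A simplification of the EoA idea, not its actual estimator:
    (1-g)^M >= alpha  =>  M <= log(alpha) / log(1-g)."""
    return math.floor(math.log(alpha) / math.log(1.0 - g))

print(max_credible_fatalities(0.9))    # 1: effective searches -> near-zero mortality
print(max_credible_fatalities(0.05))   # 58: weak searches rule out very little
```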

  20. SSAC software in Poland

    International Nuclear Information System (INIS)

    The presentation includes the following. 1. History of the SSAC system: a) manual work until 1986; b) use of the Safeguards Report Editor - SSAC01, proposed by the IAEA in the first half of 1987; c) the need for improved software initiates the state project: dBase SSAC application, 1986-1988; d) official use of the dBase 3+ application written in the Central Laboratory for Radiological Protection begins in December 1988, with the old Code 10 and fixed format; a possible option for the labeled format was tested in 1990 but not used. 2. Present status - no change: a) operating system DOS/WIN; b) the SSAC system is based on the commercial software dBase 3+; c) only a small upgrade for a laser printer and 3.5'' diskettes has been made to the above application; d) testing: not Y2K compliant. 3. Actions towards an upgrade: a) answer to the IAEA letter of 19 May 1998 (6-digit format); b) consultation in SGIT in December 1998 and answer to the letter (22.12.1998) from the director of SGIT; c) operating system Windows98/NT - Y2K compliant; d) the commercial software Access for Windows (is it Y2K compliant?) is planned to be used; e) a dedicated application for Access with MDB-format database files will be created. (author)