WorldWideScience

Sample records for catissue core software

  1. caTissue Suite 1.2 released —

    Science.gov (United States)

caTissue Suite 1.2 is an open-source, web- and programmatically accessible tool for managing biospecimens collected in support of basic and clinical research. Building on the capabilities of previous releases, the application helps users manage biospecimen inventory, annotation, and sample tracking. It also supports clinical and pathology report annotation and provides query capabilities for researchers to identify and locate biospecimens for their research projects. In addition, it features "Dynamic Extensions", allowing biorepositories to extend the caTissue data model and develop annotations customized for their institution.

  2. Core Flight Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The mission of the CFS project is to provide reusable software in support of human space exploration programs.   The top-level technical approach to...

  3. caTissue Suite to OpenSpecimen: Developing an extensible, open source, web-based biobanking management system.

    Science.gov (United States)

    McIntosh, Leslie D; Sharma, Mukesh K; Mulvihill, David; Gupta, Snehil; Juehne, Anthony; George, Bijoy; Khot, Suhas B; Kaushal, Atul; Watson, Mark A; Nagarajan, Rakesh

    2015-10-01

The National Cancer Institute (NCI) Cancer Biomedical Informatics Grid® (caBIG®) program established standards and best practices for biorepository data management by creating an infrastructure to propagate biospecimen resource sharing while maintaining data integrity and security. caTissue Suite, a biospecimen data management software tool, evolved from this effort. More recently, caTissue Suite has continued to evolve as an open source initiative known as OpenSpecimen. The essential functionality of OpenSpecimen includes the capture and representation of highly granular, hierarchically structured data for biospecimen processing, quality assurance, tracking, and annotation. Ideal for multi-user and multi-site biorepository environments, OpenSpecimen permits role-based access to specific sets of data operations through a user interface designed to accommodate varying workflows and unique user needs. The software is interoperable, both syntactically and semantically, with an array of other bioinformatics tools given its integration of standard vocabularies, thus enabling research involving biospecimens. End-users are encouraged to share their day-to-day experiences of working with the application, thus giving the community board insight into the needs and limitations that must be addressed. Users are also asked to review and validate new features through group testing environments and mock screens. Through this user interaction, application flexibility and interoperability have been recognized as necessary developmental focuses, essential for accommodating diverse adoption scenarios and biobanking workflows to catalyze advances in biomedical research and operations. Given the diversity of biobanking practices and workforce roles, consistent efforts have been made to maintain robust data granularity while aiding user accessibility, data discoverability, and security within and across applications by providing a lower learning curve in using Open

  4. Core Flight Software (CFS) Maturation Towards Human Rating Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Core Flight Software (CFS) system developed by Goddard Space Flight Center, through experience on Morpheus, has proven to be a quality product and a viable...

  5. DETERMINING THE CORE PART OF SOFTWARE DEVELOPMENT CURRICULUM APPLYING ASSOCIATION RULE MINING ON SOFTWARE JOB ADS IN TURKEY

    Directory of Open Access Journals (Sweden)

    Ilkay Yelmen

    2016-01-01

Full Text Available Software technology is advancing rapidly. In order to adapt to this advancement, software development professionals should continually renew their skills. During this rapid change, it is vital to train software developers according to the criteria desired by the industry. Therefore, the curricula of university programs related to software development should be revised to reflect software industry requirements. In this study, the core part of a software development curriculum is determined by applying association rule mining to software job ads in Turkey. The courses in the core part are chosen with respect to the IEEE/ACM computer science curriculum. As a future study, it is also important to gather academic personnel and software company professionals to determine the compulsory and elective courses so that newly graduated software dev
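
The abstract does not reproduce the mining procedure, but the core idea, finding skills that frequently co-occur in job ads and deriving rules like "ads requiring X also require Y", can be sketched in plain Python. The function and the sample ads below are illustrative only, not data or code from the study, and the sketch is limited to two-item rules:

```python
from itertools import combinations
from collections import Counter

def mine_rules(transactions, min_support=0.5, min_confidence=0.7):
    """Mine two-item association rules (A -> B) from skill lists.

    transactions: list of skill lists, one per job ad.
    Returns (antecedent, consequent, support, confidence) tuples.
    """
    n = len(transactions)
    item_counts = Counter()   # how many ads mention each skill
    pair_counts = Counter()   # how many ads mention each skill pair
    for t in transactions:
        items = set(t)
        for item in items:
            item_counts[item] += 1
        for a, b in combinations(sorted(items), 2):
            pair_counts[(a, b)] += 1
    rules = []
    for (a, b), count in pair_counts.items():
        support = count / n
        if support < min_support:
            continue
        # test the rule in both directions
        for ante, cons in ((a, b), (b, a)):
            confidence = count / item_counts[ante]
            if confidence >= min_confidence:
                rules.append((ante, cons, support, confidence))
    return rules

# Hypothetical job ads, each a list of required skills
ads = [["java", "sql"], ["java", "sql", "html"],
       ["java", "html"], ["python", "sql"]]
```

On these toy ads, only "html -> java" clears both thresholds (support 0.5, confidence 1.0), which is the kind of rule that would argue for a core course pairing.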

  6. Cronos 2: a neutronic simulation software for reactor core calculations

    International Nuclear Information System (INIS)

The CRONOS2 software is the part of the SAPHYR code system dedicated to neutronic core calculations. CRONOS2 is a powerful tool for reactor design, fuel management and safety studies. Its modular structure and great flexibility make CRONOS2 a unique simulation tool for research and development on a wide variety of reactor systems. CRONOS2 is a versatile tool that covers a large range of applications, from the very fast calculations used in training simulators to the time- and memory-consuming reference calculations needed to understand complex physical phenomena. CRONOS2 has a procedure library named CPROC that allows users to create their own application environment fitted to a specific industrial use. (authors)

  7. Experience with Intel's Many Integrated Core architecture in ATLAS software

    Science.gov (United States)

    Fleischmann, S.; Kama, S.; Lavrijsen, W.; Neumann, M.; Vitillo, R.; Atlas Collaboration

    2014-06-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks (TBB). This should make it possible to develop for both throughput and latency devices using a single code base. In ATLAS Software, track reconstruction has been shown to be a good candidate for throughput computing on GPGPU devices. In addition, the newly proposed offline parallel event-processing framework, GaudiHive, uses TBB for task scheduling. The MIC is thus, in principle, a good fit for this domain. In this paper, we report our experiences of porting to and optimizing ATLAS tracking algorithms for the MIC, comparing the programmability and relative cost/performance of the MIC against those of current GPGPUs and latency-optimized CPUs.

  8. VERTAF/Multi-Core: A SysML-Based Application Framework for Multi-Core Embedded Software Development

    Institute of Scientific and Technical Information of China (English)

    Chao-Sheng Lin; Chun-Hsien Lu; Shang-Wei Lin; Yean-Ru Chen; Pao-Ann Hsiung

    2011-01-01

Multi-core processors are rapidly becoming prevalent in personal computing and embedded systems. Nevertheless, the programming environment for multi-core processor-based systems is still quite immature and lacks efficient tools. In this work, we present a new VERTAF/Multi-Core framework and show how software code can be automatically generated from SysML models of multi-core embedded systems. We illustrate how model-driven design based on SysML can be seamlessly integrated with Intel's Threading Building Blocks (TBB) and the Quantum Framework (QF) middleware. We use a digital video recording system to illustrate the benefits of the framework. Our experiments show how SysML/QF/TBB help in making multi-core embedded system programming model-driven, easy, and efficient.

  9. Experience with Intel's Many Integrated Core Architecture in ATLAS Software

    CERN Document Server

    Fleischmann, S; The ATLAS collaboration; Lavrijsen, W; Neumann, M; Vitillo, R

    2013-01-01

Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks. This should make it possible to develop for both throughput and latency devices using a single code base.

  10. Experience with Intel's Many Integrated Core Architecture in ATLAS Software

    CERN Document Server

    Fleischmann, S; The ATLAS collaboration; Lavrijsen, W; Neumann, M; Vitillo, R

    2014-01-01

Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks. This should make it possible to develop for both throughput and latency devices using a single code base.

  11. PanPlot - software to visualize profiles and core logs

    OpenAIRE

    Sieger, Rainer; Grobe, Hannes

    2005-01-01

The program PanPlot was developed as a visualization tool for the information system PANGAEA. It can be used as a stand-alone application to plot data versus depth or time, or in a ternary view. The data input format is tab-delimited ASCII (e.g. exported from MS-Excel or from PANGAEA). The default scales and graphic features can be individually modified. PanPlot graphs can be exported in platform-specific interchange formats (EMF, PICT) which can be imported by graphics software for further process...

  12. Adaptive Multiclient Network-on-Chip Memory Core: Hardware Architecture, Software Abstraction Layer, and Application Exploration

    OpenAIRE

    Diana Göhringer; Lukas Meder; Stephan Werner; Oliver Oey; Jürgen Becker; Michael Hübner

    2012-01-01

    This paper presents the hardware architecture and the software abstraction layer of an adaptive multiclient Network-on-Chip (NoC) memory core. The memory core supports the flexibility of a heterogeneous FPGA-based runtime adaptive multiprocessor system called RAMPSoC. The processing elements, also called clients, can access the memory core via the Network-on-Chip (NoC). The memory core supports a dynamic mapping of an address space for the different clients as well as different data transfer ...

  13. The future of commodity computing and many-core versus the interests of HEP software

    International Nuclear Information System (INIS)

    As the mainstream computing world has shifted from multi-core to many-core platforms, the situation for software developers has changed as well. With the numerous hardware and software options available, choices balancing programmability and performance are becoming a significant challenge. The expanding multiplicative dimensions of performance offer a growing number of possibilities that need to be assessed and addressed on several levels of abstraction. This paper reviews the major trade-offs forced upon the software domain by the changing landscape of parallel technologies – hardware and software alike. Recent developments, paradigms and techniques are considered with respect to their impact on the rather traditional HEP programming models. Other considerations addressed include aspects of efficiency and reasonably achievable targets for the parallelization of large scale HEP workloads.

  14. Optimal hardware/software co-synthesis for core-based SoC designs

    Institute of Scientific and Technical Information of China (English)

    Zhan Jinyu; Xiong Guangze

    2006-01-01

A hardware/software co-synthesis method is presented for SoC designs consisting of both hardware IP cores and software components, based on a graph-theoretic formulation. Given a SoC integrating a set of functions and a set of performance factors, a core for each function is selected from a set of alternative IP cores and software components, and optimal partitions are found in a way that evenly balances the performance factors and ultimately reduces the overall cost, size, power consumption and runtime of the core-based SoC. The algorithm formulates IP cores and components into corresponding mathematical models, presents a graph-theoretic model for finding the optimal partitions of an SoC design, and transforms the SoC hardware/software co-synthesis problem into finding optimal paths in a weighted, directed graph. Overcoming three main deficiencies of traditional methods, this method works automatically, evaluates more performance factors at the same time, and accommodates the particularities of SoC designs. Finally, the approach is shown to be practical and effective by partitioning a practical system.
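
The shortest-path formulation described above, choosing one implementation (IP core or software component) per function so that the total cost along a layered, weighted, directed graph is minimal, can be sketched as a Dijkstra search. The cost model and data layout here are assumptions for illustration, not the paper's actual formulation:

```python
import heapq

def cheapest_partition(costs, transition):
    """Pick one alternative per function to minimize total cost.

    costs[i][j]: cost of implementing function i with alternative j
    transition(i, j, k): interface cost between alternative j of
        function i and alternative k of function i + 1
    Returns (total_cost, list of chosen alternative indices).
    """
    n = len(costs)
    # Layered DAG: one layer per function, one node per alternative.
    # Priority queue entries: (cost so far, layer, node, path taken).
    pq = [(costs[0][j], 0, j, (j,)) for j in range(len(costs[0]))]
    heapq.heapify(pq)
    settled = set()
    while pq:
        cost, i, j, path = heapq.heappop(pq)
        if (i, j) in settled:
            continue
        settled.add((i, j))
        if i == n - 1:              # reached the last function: done
            return cost, list(path)
        for k in range(len(costs[i + 1])):
            heapq.heappush(pq, (cost + transition(i, j, k) + costs[i + 1][k],
                                i + 1, k, path + (k,)))
```

With two functions, two alternatives each (`costs = [[1, 2], [5, 1]]`) and a unit penalty for switching between hardware and software partitions, the search picks the cheap alternative for each function despite the transition cost.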

  15. Core Community Specifications for Electron Microprobe Operating Systems: Software, Quality Control, and Data Management Issues

    Science.gov (United States)

    Fournelle, John; Carpenter, Paul

    2006-01-01

Modern electron microprobe systems have become increasingly sophisticated. These systems utilize either UNIX or PC computer systems for measurement, automation, and data reduction, and have undergone major improvements in processing, storage, display, and communications due to increased capabilities of hardware and software. Instrument specifications are typically written at the time of purchase and concentrate on hardware performance. The microanalysis community includes analysts, researchers, software developers, and manufacturers, all of whom could benefit from an exchange of ideas and the ultimate development of core community specifications (CCS) for the hardware and software components of microprobe instrumentation and operating systems.

  16. A Core Plug and Play Architecture for Reusable Flight Software Systems

    Science.gov (United States)

    Wilmot, Jonathan

    2006-01-01

The Flight Software Branch at Goddard Space Flight Center (GSFC) has been working on a run-time approach to facilitate a formal software reuse process. The reuse process is designed to enable rapid development and integration of high-quality software systems and to more accurately predict development costs and schedule. Previous reuse practices have been somewhat successful when the same teams are moved from project to project, but this typically requires taking the software system in an all-or-nothing approach, where useful components cannot be easily extracted from the whole. As a result, the system is less flexible and scalable, with limited applicability to new projects. This paper focuses on the rationale behind, and implementation of, the run-time executive. This executive is the core of the component-based flight software commonality and reuse process adopted at Goddard.

  17. Exploring the Impact of Socio-Technical Core-Periphery Structures in Open Source Software Development

    CERN Document Server

    Amrit, Chintan

    2010-01-01

In this paper we apply the social network concept of core-periphery structure to the socio-technical structure of a software development team. We propose a socio-technical pattern that can be used to locate emerging coordination problems in Open Source projects. With the help of our tool and method, called TESNA, we demonstrate how to monitor the socio-technical core-periphery movement in Open Source projects. We then study the impact of different core-periphery movements on Open Source projects. We conclude that a steady shift towards the core is beneficial to the project, whereas shifts away from the core are clearly not. Furthermore, oscillatory shifts towards and away from the core can be considered an indication of the instability of the project. Such an analysis can provide developers with good insight into the health of an Open Source project. Researchers can benefit from the pattern theory and from the method we use to study core-periphery movements.
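
The abstract does not describe how TESNA actually classifies core versus peripheral members; as a rough illustration of one common heuristic, ranking contributors by communication volume and tracking how core membership changes between time windows, here is a hedged sketch (the top-fraction-by-volume rule is an assumption, not TESNA's method):

```python
from collections import Counter

def core_membership(messages, core_fraction=0.2):
    """messages: iterable of (author, ...) tuples from a project's
    mailing list or commit log. The top core_fraction of authors by
    message volume form the core; everyone else is the periphery."""
    counts = Counter(author for author, *_ in messages)
    ranked = [author for author, _ in counts.most_common()]
    k = max(1, int(len(ranked) * core_fraction))
    return set(ranked[:k])

def core_shift(window_a, window_b, core_fraction=0.2):
    """Compare core membership across two time windows.
    Positive: net movement into the core; negative: net movement out."""
    core_a = core_membership(window_a, core_fraction)
    core_b = core_membership(window_b, core_fraction)
    return len(core_b - core_a) - len(core_a - core_b)
```

Plotting `core_shift` over successive windows would surface the steady, drifting, or oscillatory patterns the paper associates with project health.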

  18. Development and preliminary verification of the PWR on-line core monitoring software system. SOPHORA

    International Nuclear Information System (INIS)

This paper presents an introduction to the development and preliminary verification of a new on-line core monitoring software system (CMSS), named SOPHORA, for the fixed in-core detector (FID) systems of PWRs. Developed at China General Nuclear Power Corporation (CGN), SOPHORA integrates CGN's advanced PWR core simulator COCO and the thermal-hydraulic sub-channel code LINDEN to manage real-time core calculation and analysis. Currents measured by the FID are re-evaluated and used as the basis to reconstruct the 3-D core power distribution. Key parameters such as the peak local power margin and the minimum DNBR margin are obtained by comparing against operation limits. Pseudo FID signals generated from movable in-core detector (MID) data are used to verify the SOPHORA system. Comparison between the predicted power peak and the corresponding MID in-core flux map results shows that the SOPHORA results are reasonable and satisfactory. Further verification and validation of SOPHORA is ongoing and will be reported later. (author)

  19. ESAIR: A Behavior-Based Robotic Software Architecture on Multi-Core Processor Platforms

    Directory of Open Access Journals (Sweden)

    Chin-Yuan Tseng

    2013-03-01

Full Text Available This paper introduces an Embedded Software Architecture for Intelligent Robot systems (ESAIR) that addresses the issues of parallel thread execution on multi-core processor platforms. ESAIR provides a thread scheduling interface that improves the execution performance of a robot system by assigning a dedicated core to a running thread on the fly and dynamically rescheduling the thread's priority. In the paper, we describe the object-oriented design and the control functions of ESAIR. The modular design of ESAIR helps improve software quality, reliability, and scalability in research and real practice. We demonstrate the improvement by realizing ESAIR on an autonomous robot named AVATAR. AVATAR implements various human-robot interactions, including speech recognition, human following, face recognition, speaker identification, etc. With the support of ESAIR, AVATAR can integrate a comprehensive set of behaviors and peripherals with better resource utilization.

  20. The caCORE Software Development Kit: Streamlining construction of interoperable biomedical information services

    Directory of Open Access Journals (Sweden)

    Warzel Denise

    2006-01-01

    Full Text Available Abstract Background Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs. The National Cancer Institute (NCI developed the cancer common ontologic representation environment (caCORE to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results The caCORE SDK requires a Unified Modeling Language (UML tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG program, to create compatible data services. 
caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has

  1. Group Maintenance Behaviors of Core and Peripheral Members of Free/Libre Open Source Software Teams

    Science.gov (United States)

    Scialdone, Michael J.; Li, Na; Heckman, Robert; Crowston, Kevin

Group maintenance is pro-social, discretionary, relation-building behavior that occurs between members of groups in order to maintain reciprocal trust and cooperation. This paper considers how Free/Libre Open Source Software (FLOSS) teams demonstrate such behaviors within the context of e-mail, as this is the primary medium through which such teams communicate. We compare group maintenance behaviors between core and peripheral members of these groups, as well as between a group that is still producing software today and one that has since dissolved. Our findings indicate that negative politeness tactics (those which show respect for the autonomy of others) may be the most instrumental group maintenance behaviors contributing to a FLOSS group's ability to survive and continue software production.

  2. Geolocating thermal binoculars based on a software defined camera core incorporating HOT MCT grown by MOVPE

    Science.gov (United States)

    Pillans, Luke; Harmer, Jack; Edwards, Tim; Richardson, Lee

    2016-05-01

Geolocation is the process of calculating a target position based on bearing and range relative to the known location of the observer. A high performance thermal imager with integrated geolocation functions is a powerful long range targeting device. Firefly is a software defined camera core incorporating a system-on-a-chip processor running the Android™ operating system. The processor has a range of industry standard serial interfaces which were used to interface to peripheral devices, including a laser rangefinder and a digital magnetic compass. The core has built-in Global Positioning System (GPS) support, which provides the third variable required for geolocation. The graphical capability of Firefly allowed flexibility in the design of the man-machine interface (MMI), so the finished system can give access to extensive functionality without appearing cumbersome or over-complicated to the user. This paper covers both the hardware and software design of the system, including how the camera core influenced the selection of peripheral hardware, and the MMI design process, which incorporated user feedback at various stages.
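
The geolocation computation itself, deriving a target position from the observer's GPS fix, compass bearing, and rangefinder distance, is the standard great-circle forward problem. A minimal sketch follows (spherical-earth assumption; illustrative only, not Firefly's actual code):

```python
import math

def geolocate(lat, lon, bearing_deg, range_m, earth_radius=6371000.0):
    """Forward great-circle solution: target (lat, lon) in degrees from
    observer position, compass bearing, and laser-rangefinder distance."""
    phi1, lam1 = math.radians(lat), math.radians(lon)
    theta = math.radians(bearing_deg)
    delta = range_m / earth_radius  # angular distance along the sphere
    phi2 = math.asin(math.sin(phi1) * math.cos(delta) +
                     math.cos(phi1) * math.sin(delta) * math.cos(theta))
    lam2 = lam1 + math.atan2(math.sin(theta) * math.sin(delta) * math.cos(phi1),
                             math.cos(delta) - math.sin(phi1) * math.sin(phi2))
    return math.degrees(phi2), math.degrees(lam2)
```

At targeting ranges of a few kilometres the spherical model is accurate to well under the rangefinder's own error budget; an ellipsoidal model would only matter at much longer ranges.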

  3. Dynamic optical resource allocation for mobile core networks with software defined elastic optical networking.

    Science.gov (United States)

    Zhao, Yongli; Chen, Zhendong; Zhang, Jie; Wang, Xinbo

    2016-07-25

Driven by the advent of 5G mobile communications, the all-IP architecture of mobile core networks, i.e. the evolved packet core (EPC) proposed by 3GPP, has been greatly challenged by users' demands for higher data rates and more reliable end-to-end connections, as well as operators' demands for low operational cost. These challenges can potentially be met by software defined optical networking (SDON), which enables dynamic resource allocation according to users' requirements. In this article, a novel network architecture for the mobile core network is proposed based on SDON. A software defined network (SDN) controller is designed to realize coordinated control over the different entities in EPC networks. We analyze the requirements of the EPC-lightpath (EPCL) in the data plane and propose an optical switch load balancing (OSLB) algorithm for resource allocation in the optical layer. The procedure for establishment and adjustment of EPCLs is demonstrated on an SDON-based EPC testbed with an extended OpenFlow protocol. We also evaluate the OSLB algorithm through simulation in terms of bandwidth blocking ratio, traffic load distribution, and resource utilization ratio, compared with link-based load balancing (LLB) and MinHops algorithms. PMID:27464120
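
The OSLB algorithm is not detailed in the abstract; as an illustration of the general idea, spreading EPCL demands across optical switches by current load rather than hop count, here is a minimal greedy sketch (the request and switch representations are assumptions, not the paper's model):

```python
def oslb_assign(requests, switches):
    """Greedy switch-load balancing sketch.

    requests: list of (request_id, bandwidth) EPCL demands.
    switches: list of optical switch identifiers.
    Routes each request through the currently least-loaded switch,
    unlike a MinHops policy, which ignores load entirely.
    """
    load = {s: 0.0 for s in switches}
    assignment = {}
    for req_id, bandwidth in requests:
        target = min(load, key=load.get)  # least-loaded switch wins
        load[target] += bandwidth
        assignment[req_id] = target
    return assignment, load
```

A real controller would also check wavelength continuity and blocking along each candidate lightpath, which is where the bandwidth blocking ratio comparison in the paper comes from.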

  4. A Study of the Speedups and Competitiveness of FPGA Soft Processor Cores using Dynamic Hardware/Software Partitioning

    CERN Document Server

    Lysecky, Roman

    2011-01-01

    Field programmable gate arrays (FPGAs) provide designers with the ability to quickly create hardware circuits. Increases in FPGA configurable logic capacity and decreasing FPGA costs have enabled designers to more readily incorporate FPGAs in their designs. FPGA vendors have begun providing configurable soft processor cores that can be synthesized onto their FPGA products. While FPGAs with soft processor cores provide designers with increased flexibility, such processors typically have degraded performance and energy consumption compared to hard-core processors. Previously, we proposed warp processing, a technique capable of optimizing a software application by dynamically and transparently re-implementing critical software kernels as custom circuits in on-chip configurable logic. In this paper, we study the potential of a MicroBlaze soft-core based warp processing system to eliminate the performance and energy overhead of a soft-core processor compared to a hard-core processor. We demonstrate that the soft-c...

  5. Extension of the AMBER molecular dynamics software to Intel's Many Integrated Core (MIC) architecture

    Science.gov (United States)

    Needham, Perri J.; Bhuiyan, Ashraf; Walker, Ross C.

    2016-04-01

    We present an implementation of explicit solvent particle mesh Ewald (PME) classical molecular dynamics (MD) within the PMEMD molecular dynamics engine, that forms part of the AMBER v14 MD software package, that makes use of Intel Xeon Phi coprocessors by offloading portions of the PME direct summation and neighbor list build to the coprocessor. We refer to this implementation as pmemd MIC offload and in this paper present the technical details of the algorithm, including basic models for MPI and OpenMP configuration, and analyze the resultant performance. The algorithm provides the best performance improvement for large systems (>400,000 atoms), achieving a ∼35% performance improvement for satellite tobacco mosaic virus (1,067,095 atoms) when 2 Intel E5-2697 v2 processors (2 ×12 cores, 30M cache, 2.7 GHz) are coupled to an Intel Xeon Phi coprocessor (Model 7120P-1.238/1.333 GHz, 61 cores). The implementation utilizes a two-fold decomposition strategy: spatial decomposition using an MPI library and thread-based decomposition using OpenMP. We also present compiler optimization settings that improve the performance on Intel Xeon processors, while retaining simulation accuracy.

  6. User Friendly Processing of Sediment CT Data: Software and Application in High Resolution Non-Destructive Sediment Core Data Sets

    Science.gov (United States)

    Reilly, B. T.; Stoner, J. S.; Wiest, J.; Abbott, M. B.; Francus, P.; Lapointe, F.

    2015-12-01

Computed Tomography (CT) of sediment cores allows for high resolution images, three dimensional volumes, and down-core profiles, generated through the attenuation of X-rays as a function of density and atomic number. When using a medical CT scanner, these quantitative data are stored in pixels on the Hounsfield scale, which is relative to the attenuation of X-rays in water and air at standard temperature and pressure. Here we present MATLAB-based software specifically designed for sedimentary applications, with a user friendly graphical interface to process DICOM files and stitch overlapping CT scans. For visualization, the software allows easy generation of core slice images in grayscale or false color relative to a user defined Hounsfield number range. For comparison to other high resolution non-destructive methods, down-core Hounsfield number profiles are extracted using a method robust to coring imperfections such as deformation, bowing, gaps, and gas expansion. We demonstrate the usefulness of this technique with lacustrine sediment cores from the Western United States and Canadian High Arctic, including Fish Lake, Oregon, and Sawtooth Lake, Ellesmere Island. These sites represent two different depositional environments and provide examples of a variety of common coring defects and lithologies. The Hounsfield profiles and images can be used in combination with other high resolution data sets, including sediment magnetic parameters, XRF core scans, and many other types of data, to provide unique insights into how lithology influences paleoenvironmental and paleomagnetic records and their interpretations.
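
The "method robust to coring imperfections" is not spelled out in the abstract; one common approach is to reject near-air Hounsfield values (cracks, gaps, gas voids read close to -1000 HU) before aggregating each depth row. A minimal sketch under that assumption (the threshold and the use of a median are illustrative choices, not necessarily the authors'):

```python
from statistics import median

def downcore_profile(ct_slice, crack_threshold=-500):
    """Down-core Hounsfield profile from a 2-D CT slice.

    ct_slice: 2-D list of Hounsfield numbers, rows = depth steps,
    columns = positions across the core. For each depth, pixels at or
    below crack_threshold (voids, cracks, gas expansion; air is about
    -1000 HU) are discarded, and the median of the remaining sediment
    pixels is taken so outliers do not bias the profile.
    """
    profile = []
    for row in ct_slice:
        sediment = [hu for hu in row if hu > crack_threshold]
        profile.append(median(sediment) if sediment else None)
    return profile
```

Depths where the entire row is void (e.g. a section break) come back as `None`, leaving an honest gap in the profile instead of an air-biased value.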

  7. On the Design of Energy Efficient Optical Networks with Software Defined Networking Control Across Core and Access Networks

    DEFF Research Database (Denmark)

    Wang, Jiayuan; Yan, Ying; Dittmann, Lars

    2013-01-01

    This paper presents a Software Defined Networking (SDN) control plane based on an overlay GMPLS control model. The SDN control platform manages optical core networks (WDM/DWDM networks) and the associated access networks (GPON networks), which makes it possible to gather global information...

  8. A Reusable and Adaptable Software Architecture for Embedded Space Flight System: The Core Flight Software System (CFS)

    Science.gov (United States)

    Wilmot, Jonathan

    2005-01-01

The contents include the following: High availability. Hardware is in a harsh environment. Flight processors vary widely due to power and weight constraints. Software must be remotely modifiable and still operate while changes are being made. Many custom one-of-a-kind interfaces for one-of-a-kind missions. Sustaining engineering. The price of failure is high, tens to hundreds of millions of dollars.

  9. Experience with Intel's many integrated core architecture in ATLAS software

    International Nuclear Information System (INIS)

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks (TBB). This should make it possible to develop for both throughput and latency devices using a single code base. In ATLAS Software, track reconstruction has been shown to be a good candidate for throughput computing on GPGPU devices. In addition, the newly proposed offline parallel event-processing framework, GaudiHive, uses TBB for task scheduling. The MIC is thus, in principle, a good fit for this domain. In this paper, we report our experiences of porting to and optimizing ATLAS tracking algorithms for the MIC, comparing the programmability and relative cost/performance of the MIC against those of current GPGPUs and latency-optimized CPUs.

  10. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores

    Directory of Open Access Journals (Sweden)

    Wang Kai

    2011-05-01

Full Text Available Abstract Background Gene-gene interaction analysis in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) have hundreds of cores and have recently been used to implement faster scientific software. However, there are currently no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis of binary traits. Findings Here we present a novel software package, GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes (1) the interactions of SNPs within it in parallel, and (2) the interactions between the SNPs of the current fragment and those of other fragments in parallel. We tested GENIE on a large-scale candidate gene study of high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode. Conclusions GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is decreasing, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
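
The two-level decomposition described above, within-fragment SNP pairs plus between-fragment SNP pairs, can be sketched as task enumeration: each pairwise test is independent, so the tasks can be farmed out to GPU or CPU cores. The helper names below are illustrative, not GENIE's API:

```python
from itertools import combinations

def make_fragments(snps, fragment_size):
    """Partition the SNP list into non-overlapping fragments."""
    return [snps[i:i + fragment_size] for i in range(0, len(snps), fragment_size)]

def interaction_tasks(fragments):
    """Enumerate every SNP-pair interaction test exactly once:
    first pairs within each fragment, then pairs spanning two fragments.
    Each returned (snp_a, snp_b) task is independent, so the list can be
    dispatched across worker cores in parallel."""
    tasks = []
    for frag in fragments:
        tasks.extend(combinations(frag, 2))        # within-fragment pairs
    for fa, fb in combinations(fragments, 2):      # between-fragment pairs
        tasks.extend((a, b) for a in fa for b in fb)
    return tasks
```

For n SNPs the task list always contains n(n-1)/2 pairs regardless of fragment size; fragmenting only controls the memory footprint of each parallel batch.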

  11. FEM simulation of formation of metamorphic core complex with ANSYS software

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This study uses ANSYS to build a finite element model of a metamorphic core complex and applies coupled thermal-structural analysis to simulate the complex's temperature and stress fields. The formation mechanism of metamorphic core complexes is discussed. The simulation results show that the temperature is lowest at the earth's surface and highest in the nucleus of the metamorphic core complex, increasing with depth; in the stress field, the largest stress occurs in the nucleus, and the next largest at the top of the cover.

  12. Rapid Development of Guidance, Navigation, and Control Core Flight System Software Applications Using Simulink Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We will demonstrate the usefulness of SIL for GSFC missions by attempting to compile the SIL source code with an autocoded sample GNC application flight software....

  13. Development of New European VLIW Space DSP ASICS, IP Cores and Related Software via ESA Contracts in 2015 and Beyond

    Science.gov (United States)

    Trautner, R.

    2015-09-01

    European space industry needs a new generation of payload data processors in order to cope with increasing payload data processing requirements. ESA has defined a roadmap for the development of future payload processor hardware which is being implemented. A key part of this roadmap addresses the development of VLIW Digital Signal Processor (DSP) ASICs, IP cores and associated software. In this paper, we first present an overview of the ESA roadmap and the key development routes. We recapitulate the activities that have created the technology base for the ongoing DSP development, and present the ASIC development and several accompanying activities that will lead to the availability of a new space qualified DSP - the Scalable Sensor Data Processor (SSDP) - in the near future. We then present the expected future evolution of this technology area, and summarize the corresponding ESA roadmap part on VLIW DSPs and related IP and software.

  14. CorelDRAW Software Applications in Digitizing Textile and Garment Design (CorelDRAW软件在纺织服装设计数字化上的应用)

    Institute of Scientific and Technical Information of China (English)

    陈凤琴

    2014-01-01

    With social development and advances in science and technology, information technology is now used across many industries. This paper mainly discusses the application of CorelDRAW software in digitizing textile and garment design, and examines the applicability of the software.

  15. GRAPES: a software for parallel searching on biological graphs targeting multi-core architectures.

    Directory of Open Access Journals (Sweden)

    Rosalba Giugno

    Full Text Available Biological applications, from genomics to ecology, deal with graphs that represent the structure of interactions. Analyzing such data requires searching for subgraphs in collections of graphs. This task is computationally expensive. Even though multicore architectures, from commodity computers to more advanced symmetric multiprocessing (SMP), offer scalable computing power, currently published software implementations for indexing and graph matching are fundamentally sequential. As a consequence, such software implementations (i) do not fully exploit available parallel computing power and (ii) do not scale with respect to the size of graphs in the database. We present GRAPES, software for parallel searching on databases of large biological graphs. GRAPES implements a parallel version of well-established graph searching algorithms, and introduces new strategies which naturally lead to a faster parallel searching system, especially for large graphs. GRAPES decomposes graphs into subcomponents that can be efficiently searched in parallel. We show the performance of GRAPES on representative biological datasets containing antiviral chemical compounds, DNA, RNA, proteins, protein contact maps and protein interaction networks.
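As a rough illustration of the parallel database-search idea (not GRAPES's actual indexing or matching algorithm), the per-graph searches can be fanned out across a worker pool. Graphs here are adjacency dicts with node labels stored separately; the query is a single labeled edge pattern:

```python
from concurrent.futures import ThreadPoolExecutor

def find_edges(graph, labels, query):
    """Return node pairs (u, v) whose labels match (src_label, dst_label)."""
    src_l, dst_l = query
    return [(u, v) for u, nbrs in graph.items() for v in nbrs
            if labels[u] == src_l and labels[v] == dst_l]

def search_database(db, query, workers=4):
    """db is a list of (graph, labels) pairs; each graph is searched
    concurrently, mimicking one-worker-per-graph parallelism."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        hits = pool.map(lambda gl: find_edges(gl[0], gl[1], query), db)
    return list(hits)

g1 = {"a": {"b"}, "b": set()}
g2 = {"x": {"y"}, "y": {"x"}}
db = [(g1, {"a": "C", "b": "N"}), (g2, {"x": "C", "y": "C"})]
print(search_database(db, ("C", "N")))  # [[('a', 'b')], []]
```

A real subgraph-isomorphism search would of course match whole query graphs, and GRAPES additionally decomposes each large graph so that pieces of a single graph can be searched in parallel too.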

  16. The astrometric core solution for the Gaia mission. Overview of models, algorithms and software implementation

    CERN Document Server

    Lindegren, Lennart; Hobbs, David; O'Mullane, William; Bastian, Ulrich; Hernández, José

    2011-01-01

    The Gaia satellite will observe about one billion stars and other point-like sources. The astrometric core solution will determine the astrometric parameters (position, parallax, and proper motion) for a subset of these sources, using a global solution approach which must also include a large number of parameters for the satellite attitude and optical instrument. The accurate and efficient implementation of this solution is an extremely demanding task, but crucial for the outcome of the mission. We provide a comprehensive overview of the mathematical and physical models applicable to this solution, as well as its numerical and algorithmic framework. The astrometric core solution is a simultaneous least-squares estimation of about half a billion parameters, including the astrometric parameters for some 100 million well-behaved so-called primary sources. The global nature of the solution requires an iterative approach, which can be broken down into a small number of distinct processing blocks (source, attitude,...

  17. BROCCOLI: Software for Fast fMRI Analysis on Many-Core CPUs and GPUs

    Directory of Open Access Journals (Sweden)

    Anders eEklund

    2014-03-01

    Full Text Available Analysis of functional magnetic resonance imaging (fMRI data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU can perform non-linear spatial normalization to a 1 mm3 brain template in 4-6 seconds, and run a second level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/.
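The second-level permutation test mentioned above can be illustrated with a toy sign-flipping version in pure Python. The data and function names are made up for illustration; BROCCOLI runs this kind of test massively in parallel in OpenCL, per voxel:

```python
import random

def permutation_test(effects, n_perm=10000, seed=0):
    """P-value for mean(effects) > 0 under random sign flips of each
    subject's effect (valid when effects are symmetric under the null)."""
    rng = random.Random(seed)
    observed = sum(effects) / len(effects)
    exceed = 0
    for _ in range(n_perm):
        flipped = [e if rng.random() < 0.5 else -e for e in effects]
        if sum(flipped) / len(flipped) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)  # add-one correction

# Six subjects with consistently positive effects -> small p-value.
p = permutation_test([0.8, 1.1, 0.9, 1.3, 0.7, 1.0], n_perm=2000)
print(p < 0.05)
```

The empirical null distribution replaces any parametric assumption, which is why such tests stay robust when the data violate Gaussian assumptions, at the cost of thousands of repeated analyses.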

  18. Harmonic Domain Modelling of Transformer Core Nonlinearities Using the DIgSILENT PowerFactory Software

    DEFF Research Database (Denmark)

    Bak, Claus Leth; Bak-Jensen, Birgitte; Wiechowski, Wojciech

    2008-01-01

    This paper demonstrates the results of implementation and verification of an already existing algorithm that allows for calculating saturation characteristics of single-phase power transformers. The algorithm was described for the first time in 1993. It has now been implemented using the DIgSILENT Programming Language (DPL) as an external script in the harmonic domain calculations of the power system analysis tool PowerFactory [10]. The algorithm is verified by harmonic measurements on a single-phase power transformer. A theoretical analysis of the core nonlinearity phenomena in single- and three-phase transformers is also presented. This analysis leads to the conclusion that the method can be applied for modelling nonlinearities of three-phase autotransformers.

  19. AthenaMT: Upgrading the ATLAS Software Framework for the Many-Core World with Multi-Threading

    CERN Document Server

    Leggett, Charles; The ATLAS collaboration

    2016-01-01

    ATLAS's current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single threaded design has been recognised for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run2. After concluding a rigorous requirements phase, where many design components were examined in detail, ATLAS has begun the migration to a new data-flow driven, multi-threaded framework, which enables the simultaneous processing of singleton, thread unsafe legacy Algorithms, cloned Algorithms that execute concurrently in their own threads with different Event contexts, and fully re-entrant, thread safe Algorithms. In this paper we will report on the process of modifying the framework to safely process multiple concurrent events in different threads, which entails significant changes in the underlying...

  20. CORE

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Hansen, Jonas; Hundebøll, Martin;

    2013-01-01

    different flows. Instead of maintaining these approaches separately, we propose a protocol (CORE) that brings together these coding mechanisms. Our protocol uses random linear network coding (RLNC) for intra-session coding but allows nodes in the network to set up inter-session coding regions where flows intersect. Routes for unicast sessions are agnostic to other sessions and set up beforehand; CORE will then discover and exploit intersecting routes. Our approach allows the inter-session regions to leverage RLNC to compensate for losses or failures in the overhearing or transmitting process. Thus, we increase the benefits of XORing by exploiting the underlying RLNC structure of individual flows. This goes beyond providing additional reliability to each individual session and beyond exploiting coding opportunistically. Our numerical results show that CORE outperforms both forwarding and COPE...

  1. Development and Evaluation of Vectorised and Multi-Core Event Reconstruction Algorithms within the CMS Software Framework

    International Nuclear Information System (INIS)

    The processing of data acquired by the CMS detector at LHC is carried out with an object-oriented C++ software framework: CMSSW. With the increasing luminosity delivered by the LHC, the treatment of recorded data requires extraordinarily large computing resources, also in terms of CPU usage. A possible solution to cope with this task is the exploitation of the features offered by the latest microprocessor architectures. Modern CPUs present several vector units, the capacity of which is growing steadily with the introduction of new processor generations. Moreover, an increasing number of cores per die is offered by the main vendors, even on consumer hardware. Most recent C++ compilers provide facilities to take advantage of such innovations, either by explicit statements in the program sources or by automatically adapting the generated machine instructions to the available hardware, without the need of modifying the existing code base. Programming techniques to implement reconstruction algorithms and optimised data structures are presented, that aim at scalable vectorization and parallelization of the calculations. One of their features is the usage of new language features of the C++11 standard. Portions of the CMSSW framework are illustrated which have been found to be especially profitable for the application of vectorization and multi-threading techniques. Specific utility components have been developed to help vectorization and parallelization. They can easily become part of a larger common library. To conclude, careful measurements are described, which show the execution speedups achieved via vectorised and multi-threaded code in the context of CMSSW.

  2. The Implementation of the Core Control Software in a Paging System (无线寻呼系统中心控制软件的实现)

    Institute of Scientific and Technical Information of China (English)

    蒋励; 张新

    2001-01-01

    The core control software of a paging system is one of its key components. This paper presents the working methods of the paging system software and the system manager software, and discusses how a client/server-based paging system management platform was built with Visual C++, which has proven to be of practical value.

  3. Hardware/Software Co-design for Heterogeneous Multi-core Platforms: The hArtes Toolchain

    CERN Document Server

    2012-01-01

    This book describes the results and outcome of the FP6 project known as hArtes, which focuses on the development of an integrated tool chain targeting a heterogeneous multi-core platform comprising a general purpose processor (ARM or PowerPC), a DSP (the Diopsis) and an FPGA. The tool chain takes existing source code and proposes transformations and mappings such that legacy code can easily be ported to a modern, multi-core platform. Benefits of the hArtes approach, described in this book, include: Uses a familiar programming paradigm: hArtes proposes a familiar programming paradigm which is compatible with widely used programming practice, irrespective of the target platform. Enables users to view multiple cores as a single processor: the hArtes approach abstracts away the heterogeneity as well as the multi-core aspect of the underlying hardware, so the developer can view the platform as consisting of a single, general purpose processor. Facilitates easy porting of existing applications: hArtes provid...

  4. Cronos 2: a neutronic simulation software for reactor core calculations; Cronos 2: un logiciel de simulation neutronique des coeurs de reacteurs

    Energy Technology Data Exchange (ETDEWEB)

    Lautard, J.J.; Magnaud, C.; Moreau, F.; Baudron, A.M. [CEA Saclay, Dept. de Mecanique et de Technologie (DMT/SERMA), 91 - Gif-sur-Yvette (France)

    1999-07-01

    The CRONOS2 software is the part of the SAPHYR code system dedicated to neutronic core calculations. CRONOS2 is a powerful tool for reactor design, fuel management and safety studies. Its modular structure and great flexibility make CRONOS2 a unique simulation tool for research and development for a wide variety of reactor systems. CRONOS2 is a versatile tool that covers a large range of applications, from very fast calculations used in training simulators to time- and memory-consuming reference calculations needed to understand complex physical phenomena. CRONOS2 has a procedure library named CPROC that allows the user to create its own application environment fitted to a specific industrial use. (authors)

  5. The coupling of the Star-Cd software to a whole-core neutron transport code Decart for PWR applications

    International Nuclear Information System (INIS)

    As part of a U.S.-Korea collaborative U.S. Department of Energy INERI project, a comprehensive high-fidelity reactor-core modeling capability is being developed for detailed analysis of existing and advanced PWR reactor designs. An essential element of the project has been the development of an interface between the computational fluid dynamics (CFD) module, STAR-CD, and the neutronics module, DeCART. Since the computational meshes for CFD and neutronics calculations are generally different, the capability to average and decompose data on these different meshes has been an important part of code coupling activities. An averaging process has been developed to extract neutronics zone temperatures in the fuel and coolant and to generate appropriate multigroup cross sections and densities. Similar procedures have also been established to map the power distribution from the neutronics zones to the mesh structure used in the CFD module. Since MPI is used as the parallel model in STAR-CD and conflicts arise during initiation of a second level of MPI, the interface developed here is based on using TCP/IP protocol sockets to establish communication between the CFD and neutronics modules. Preliminary coupled calculations have been performed for problems of PWR fuel assembly size, and converged solutions have been achieved for a series of steady-state problems ranging from a single pin to a 1/8 model of a 17 x 17 PWR fuel assembly. (authors)
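The socket-based data exchange described above can be sketched in Python. The JSON framing, the port handling, and the stand-in heat balance are illustrative assumptions, not the actual STAR-CD/DeCART interface:

```python
import json
import socket
import threading

def cfd_server(sock):
    """Toy 'CFD' side: receive a power distribution, reply with zone
    temperatures (a linear stand-in for a real thermal-hydraulic solve)."""
    conn, _ = sock.accept()
    with conn:
        power = json.loads(conn.recv(4096).decode())
        temps = [300.0 + 0.5 * p for p in power]
        conn.sendall(json.dumps(temps).encode())

server = socket.socket()
server.bind(("127.0.0.1", 0))          # let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=cfd_server, args=(server,), daemon=True).start()

# Toy 'neutronics' side: send the power map, receive temperatures that
# would feed back into cross-section generation.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(json.dumps([100.0, 120.0, 90.0]).encode())
temps = json.loads(client.recv(4096).decode())
client.close()
server.close()
print(temps)  # [350.0, 360.0, 345.0]
```

Using plain TCP sockets keeps the two codes in separate processes, which is exactly what avoids the conflict of starting a second MPI environment inside STAR-CD.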

  6. Validation of a new software version for monitoring of the core of the Unit 2 of the Laguna Verde power plant with ARTS; Validacion de una nueva version del software para monitoreo del nucleo de la Unidad 2 de la Central Laguna Verde con ARTS

    Energy Technology Data Exchange (ETDEWEB)

    Calleros, G.; Riestra, M.; Ibanez, C.; Lopez, X.; Vargas, A.; Mendez, A.; Gomez, R. [CFE, Central Nucleoelectrica de Laguna Verde, Alto Lucero, Veracruz (Mexico)]. e-mail: gcm9acpp@cfe.gob.mx

    2005-07-01

    This work proposes a methodology to validate a new version of the software used for monitoring the reactor core, which requires the evaluation of the thermal limits established in the Operation Technical Specifications, for Unit 2 of Laguna Verde with ARTS (improvements to the APRMs, the Rod Block Monitor and the Technical Specifications). Following the proposed methodology, the differences found in the thermal limits determined with the new and previous versions of the core monitoring software are shown. (Author)

  7. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  8. Software Defined Radio with Parallelized Software Architecture

    Science.gov (United States)

    Heckler, Greg

    2013-01-01

    This software implements software-defined radio processing over multi-core, multi-CPU systems in a way that maximizes the use of CPU resources in the system. The software treats each processing step in either a communications or navigation modulator or demodulator system as an independent, threaded block. Each threaded block is defined with a programmable number of input or output buffers; these buffers are implemented using POSIX pipes. In addition, each threaded block is assigned a unique thread upon block installation. A modulator or demodulator system is built by assembling the threaded blocks into a flow graph, which arranges the processing blocks to accomplish the desired signal processing. This software architecture allows the software to scale effortlessly between single-CPU/single-core computers and multi-CPU/multi-core computers without recompilation. NASA spaceflight and ground communications systems currently rely exclusively on ASICs or FPGAs. This software allows low- and medium-bandwidth (100 bps to approximately 50 Mbps) software-defined radios to be designed and implemented solely in C/C++ software, while lowering development costs and facilitating reuse and extensibility.
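The threaded-block-plus-POSIX-pipe architecture can be sketched in Python (the original is C/C++, and the stage logic here is a byte-level stand-in for real DSP): each stage runs in its own thread, and a flow graph is just blocks wired pipe-to-pipe.

```python
import os
import threading

def source(fd_out, samples):
    """Produce one byte per sample, then close the write end (EOF signal)."""
    for s in samples:
        os.write(fd_out, bytes([s]))
    os.close(fd_out)

def gain_block(fd_in, fd_out, gain):
    """Toy processing stage: multiply each sample, saturating at 255."""
    while chunk := os.read(fd_in, 1):
        os.write(fd_out, bytes([min(255, chunk[0] * gain)]))
    os.close(fd_in)
    os.close(fd_out)

def sink(fd_in, out):
    """Collect the processed stream."""
    while chunk := os.read(fd_in, 1):
        out.append(chunk[0])
    os.close(fd_in)

r1, w1 = os.pipe()   # source -> gain
r2, w2 = os.pipe()   # gain -> sink
result = []
threads = [threading.Thread(target=source, args=(w1, [1, 2, 3])),
           threading.Thread(target=gain_block, args=(r1, w2, 10)),
           threading.Thread(target=sink, args=(r2, result))]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(result)  # [10, 20, 30]
```

Because each stage blocks on its input pipe, the kernel's pipe buffering provides flow control between threads for free, and the OS scheduler spreads the stages across available cores.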

  9. Computer Software.

    Science.gov (United States)

    Kay, Alan

    1984-01-01

    Discusses the nature and development of computer software. Programing, programing languages, types of software (including dynamic spreadsheets), and software of the future are among the topics considered. (JN)

  10. Evaluation of Chinese Software Enterprises' Core Competence Using Attribute Theory (中国软件企业核心竞争力的属性论方法评估)

    Institute of Scientific and Technical Information of China (English)

    许广林; 刘永昌; 冯嘉礼

    2006-01-01

    Core competence is one of the most critical capabilities an enterprise has; its strength determines the enterprise's success in fierce market competition and its future development. Accurate evaluation of core competence therefore plays an important role in an enterprise's self-understanding and in cultivating and improving that competence. This paper builds a core competence evaluation index system based on the characteristics of software enterprises, and applies an attribute coordinate analytic method to simulated data. The results computed by the software system are satisfactory.

  11. Software Process Models and Analysis on Failure of Software Development Projects

    OpenAIRE

    Kaur, Rupinder; Sengupta, Jyotsna

    2013-01-01

    The software process model consists of a set of activities undertaken to design, develop and maintain software systems. A variety of software process models have been designed to structure, describe and prescribe the software development process. The software process models play a very important role in software development, so it forms the core of the software product. Software project failure is often devastating to an organization. Schedule slips, buggy releases and missing features can me...

  12. Teaching Software Engineering through Robotics

    OpenAIRE

    Shin, Jiwon; Rusakov, Andrey; Meyer, Bertrand

    2014-01-01

    This paper presents a newly-developed robotics programming course and reports the initial results of software engineering education in robotics context. Robotics programming, as a multidisciplinary course, puts equal emphasis on software engineering and robotics. It teaches students proper software engineering -- in particular, modularity and documentation -- by having them implement four core robotics algorithms for an educational robot. To evaluate the effect of software engineering educati...

  13. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  14. Software Reviews.

    Science.gov (United States)

    Smith, Richard L., Ed.

    1985-01-01

    Reviews software packages by providing extensive descriptions and discussions of their strengths and weaknesses. Software reviewed include (1) "VISIFROG: Vertebrate Anatomy" (grade seven-adult); (2) "Fraction Bars Computer Program" (grades three to six) and (3) four telecommunications utilities. (JN)

  15. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  16. Core Recursive Hierarchical Image Segmentation

    Science.gov (United States)

    Tilton, James

    2011-01-01

    The Recursive Hierarchical Image Segmentation (RHSEG) software has been repackaged to provide a version of the RHSEG software that is not subject to patent restrictions and that can be released to the general public through NASA GSFC's Open Source release process. Like the Core HSEG Software Package, this Core RHSEG Software Package also includes a visualization program called HSEGViewer along with a utility program HSEGReader. It also includes an additional utility program called HSEGExtract. The unique feature of the Core RHSEG package is that it is a repackaging of the RHSEG technology designed specifically to avoid the inclusion of certain software technology. Unlike the Core HSEG package, it includes the recursive portions of the technology, but does not include the processing window artifact elimination technology.

  17. The dynamic of modern software development project management and the software crisis of quality. An integrated system dynamics approach towards software quality improvement

    OpenAIRE

    Nasirikaljahi, Armindokht

    2012-01-01

    The software industry is plagued by cost overruns, delays, poor customer satisfaction and quality issues that are costing clients and customers world-wide billions of dollars each year. The phenomenon has been coined "The Software Crisis", and poses a huge challenge for software project management. This thesis addresses one of the core issues of the software crisis, namely software quality. The challenges of software quality are central to understanding the other symptoms of the software crisis. Th...

  18. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  19. The Ettention software package.

    Science.gov (United States)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

    We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block-iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphics processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and the eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. PMID:26686659
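The Kaczmarz update at the heart of the block-iterative method can be shown in a few lines. This is a pure-Python sketch for a tiny dense system; Ettention's implementation is GPU-accelerated, block-iterative, and tomography-specific:

```python
def kaczmarz(A, b, sweeps=200):
    """Solve A x = b by cyclically projecting x onto the hyperplane
    defined by each row: x <- x + ((b_i - a_i.x) / ||a_i||^2) a_i."""
    n = len(A[0])
    x = [0.0] * n
    for _ in range(sweeps):
        for row, bi in zip(A, b):
            dot = sum(r * xi for r, xi in zip(row, x))
            norm2 = sum(r * r for r in row)
            step = (bi - dot) / norm2
            x = [xi + step * r for xi, r in zip(x, row)]
    return x

# 2x + y = 5 and x + 3y = 10 have the solution x = 1, y = 3.
A = [[2.0, 1.0], [1.0, 3.0]]
b = [5.0, 10.0]
x = kaczmarz(A, b)
print([round(v, 6) for v in x])  # [1.0, 3.0]
```

In tomography, each row of A encodes one ray through the volume, so the same projection update corrects the reconstruction ray by ray; block-iterative variants apply the correction for groups of rays at once, which maps well onto GPUs.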

  20. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text?now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  1. Software testing

    Science.gov (United States)

    Price-Whelan, Adrian M.

    2016-01-01

    Now more than ever, scientific results are dependent on sophisticated software and analysis. Why should we trust code written by others? How do you ensure your own code produces sensible results? How do you make sure it continues to do so as you update, modify, and add functionality? Software testing is an integral part of code validation and writing tests should be a requirement for any software project. I will talk about Python-based tools that make managing and running tests much easier and explore some statistics for projects hosted on GitHub that contain tests.

  2. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  3. Software Reviews.

    Science.gov (United States)

    Davis, Shelly J., Ed.; Knaupp, Jon, Ed.

    1984-01-01

    Reviewed is computer software on: (1) classification of living things, a tutorial program for grades 5-10; and (2) polynomial practice using tiles, a drill-and-practice program for algebra students. (MNS)

  4. Software Reviews.

    Science.gov (United States)

    Wulfson, Stephen, Ed.

    1987-01-01

    Reviews seven computer software programs that can be used in science education programs. Describes courseware which deals with muscles and bones, terminology, classifying animals without backbones, molecular structures, drugs, genetics, and shaping the earth's surface. (TW)

  5. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

      Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application. Traditional forms of technical education pay little attention to creativity - often encouraging overly rationalistic ways of thinking which stifle the ability to innovate. Professional software developers are often drowned in commercial drudgery and overwhelmed by work pressure and deadlines. The topic that will both ensure success in the market and revitalize their work lives is never addressed. This book sets out the new field of software innovation. It organizes the existing scientific research into eight simple heuristics - guiding principles for organizing a system developer's work-life so that it focuses on innovation.

  6. ANALYSIS OF SOFTWARE COST ESTIMATION MODELS

    OpenAIRE

    Tahir Abdullah; Rabia Saleem; Shahbaz Nazeer; Muhammad Usman

    2012-01-01

    Software Cost estimation is a process of forecasting the Cost of project in terms of budget, time, and other resources needed to complete a software system and it is a core issue in the software project management to estimate the cost of a project before initiating the Software Project. Different models have been developed to estimate the cost of software projects for the last several years. Most of these models rely on the Analysts’ experience, size of the software project and some other sof...
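
    Most of the size-based models the abstract alludes to share the same algebraic shape. As an illustration (not taken from the paper itself), the classic basic COCOMO model estimates effort as a power law of code size; the constants below are Boehm's published organic-mode values.

```python
# Basic COCOMO effort and schedule estimate: a hedged illustration of a
# classic size-based cost model (constants are Boehm's organic-mode values).

def cocomo_basic(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
    """Estimated effort in person-months for `kloc` thousand lines of code."""
    return a * kloc ** b

def schedule_months(effort_pm: float, c: float = 2.5, d: float = 0.38) -> float:
    """Estimated calendar schedule in months from effort in person-months."""
    return c * effort_pm ** d

if __name__ == "__main__":
    effort = cocomo_basic(32)  # a hypothetical 32 KLOC project
    print(f"effort   ~ {effort:.1f} person-months")
    print(f"schedule ~ {schedule_months(effort):.1f} months")
```

    The project size (32 KLOC) is invented for the example; analyst-experience models mentioned in the abstract adjust the exponent and multiplier instead of using fixed constants.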

  7. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta vision of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  8. Proteomics Core

    Data.gov (United States)

    Federal Laboratory Consortium — Proteomics Core is the central resource for mass spectrometry based proteomics within the NHLBI. The Core staff help collaborators design proteomics experiments in...

  9. Pattern-based software architecture for service-oriented software systems

    OpenAIRE

    Barrett Ronan; Pahl Claus

    2010-01-01

    Service-oriented architecture is a recent conceptual framework for service-oriented software platforms. Architectures are of great importance for the evolution of software systems. We present a modelling and transformation technique for service-centric distributed software systems. Architectural configurations, expressed through hierarchical architectural patterns, form the core of a specification and transformation technique. Patterns on different levels of abstraction form transformation...

  10. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo;

    2008-01-01

    This article presents MIAWARE, a software for Medical Image Analysis With Automated Reporting Engine, which was designed and developed for doctor/radiologist assistance. It allows to analyze an image stack from computed axial tomography scan of lungs (thorax) and, at the same time, to mark all pathologies on images and report their characteristics. The reporting process is normalized - radiologists cannot describe pathological changes with their own words, but can only use some terms from a specific vocabulary set provided by the software. Consequently, a normalized radiological report ..., a deductive report search was obtained, which may be helpful for doctors while diagnosing patients' cases. Finally, the MIAWARE software can be considered also as a teaching tool for future radiologists and physicians.

  11. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  12. Software Systems

    Institute of Scientific and Technical Information of China (English)

    崔涛; 周淼

    1996-01-01

    The information used with computers is known as software and includes programs and data. Programs are sets of instructions telling the computer what operations have to be carried out and in what order they should be done. Specialised programs which enable the computer to be used for particular purposes are called applications programs. A collection of these programs kept

  13. Software Reviews.

    Science.gov (United States)

    Science and Children, 1990

    1990-01-01

    Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor"; and "Geology Search." Cost, quality, hardware, and…

  14. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game management. (CW)

  15. Software Reviews.

    Science.gov (United States)

    Mathematics and Computer Education, 1987

    1987-01-01

    Presented are reviews of several microcomputer software programs. Included are reviews of: (1) Microstat (Zenith); (2) MathCAD (MathSoft); (3) Discrete Mathematics (True Basic); (4) CALCULUS (True Basic); (5) Linear-Kit (John Wiley); and (6) Geometry Sensei (Broderbund). (RH)

  16. Practice and Exploration of Cultivating Core Competencies Embedded in the Software Testing Course in Higher Vocational Education%职业核心能力培养嵌入高职《软件测试》课程的实践与探索

    Institute of Scientific and Technical Information of China (English)

    吴伶琳

    2014-01-01

    In view of the lack of cultivation of professional core competencies in current higher vocational education, this paper expounds the reasons for their absence in teaching, drawing on the teaching practice of the Software Testing course. In order to improve the professional core competencies of higher vocational students, it explores effective paths for integrating the cultivation of professional core competencies with the teaching of the Software Testing course.

  17. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, and communication switchboards). The company has also acquired extensive knowledge and practical experience in digital long-term preservation technologies. This wide spectrum of activities puts us in a position to discuss an often overlooked aspect of digital preservation - the preservation of software programs. There are many resources dedicated to the digital preservation of data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - the dynamic response to inputs - make computer programs rich compared to documents or linear multimedia. The article opens questions on the beginning of the way towards permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...), where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  18. Software Engineering to Professionalize Software Development

    Directory of Open Access Journals (Sweden)

    Juan Miguel Alonso

    2011-12-01

    Full Text Available The increasingly important role that software plays in systems with widespread effects presents new challenges for the education of software engineers. Not only is society's dependence on software increasing, but the character of software development is also changing, and with it the demand for certified software developers. This paper proposes some challenges and aspirations that guide the learning processes of Software Engineering and helps to identify the need to train professionals in software development.

  19. Defect Management in Agile Software Development

    Directory of Open Access Journals (Sweden)

    Rida Noor

    2014-03-01

    Full Text Available Agile development reduces the risk of developing low quality software in the first place by minimizing defects. In agile software development formal defect management processes help to build quality software. The core purpose of defect management is to make the software more effective and efficient in order to increase its quality. There are several methods for handling defects like defect prevention, defect discovery and resolution which are used by software developers and testers. Refactoring keeps the system clean by identifying and removing quality defects. To gain the full confidence of the customer defect management should be involved at every stage of development. Agile methodologies focus on delivering the software in form of short iterations. Thus each iteration helps to overcome defects and leads better development and end user satisfaction. This study paints the picture of handling the software defects using agile based Software Development Process.

  20. Software Engineering Education: Some Important Dimensions

    Science.gov (United States)

    Mishra, Alok; Cagiltay, Nergiz Ercil; Kilic, Ozkan

    2007-01-01

    Software engineering education has been emerging as an independent and mature discipline. Accordingly, various studies are being done to provide guidelines for curriculum design. The main focus of these guidelines is around core and foundation courses. This paper summarizes the current problems of software engineering education programs. It also…

  1. Ice cores

    DEFF Research Database (Denmark)

    Svensson, Anders

    2014-01-01

    Ice cores from Antarctica, from Greenland, and from a number of smaller glaciers around the world yield a wealth of information on past climates and environments. Ice cores offer unique records on past temperatures, atmospheric composition (including greenhouse gases), volcanism, solar activity, dustiness, and biomass burning, among others. In Antarctica, ice cores extend back more than 800,000 years before present (Jouzel et al. 2007), whereas Greenland ice cores cover the last 130,000 years

  2. Reliability Testing Strategy - Reliability in Software Engineering

    OpenAIRE

    Taylor-Sakyi, Kevin

    2016-01-01

    This paper presents the core principles of reliability in software engineering - outlining why reliability testing is critical and specifying the process of measuring reliability. The paper provides insight for both novice and experts in the software engineering field for assessing failure intensity as well as predicting failure of software systems. Measurements are conducted by utilizing information from an operational profile to further enhance a test plan and test cases, all of which this ...
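
    The failure-intensity measurement the abstract describes is often modelled, in its simplest textbook form, with a constant intensity estimated from operational-profile test data and an exponential reliability function. The sketch below uses that standard model with invented test figures; it is not drawn from the paper itself.

```python
import math

def failure_intensity(failures: int, exposure_hours: float) -> float:
    """Estimate a constant failure intensity (failures/hour) from test data."""
    return failures / exposure_hours

def reliability(t_hours: float, lam: float) -> float:
    """Probability of failure-free operation for t_hours under a
    constant-intensity (exponential) model: R(t) = exp(-lambda * t)."""
    return math.exp(-lam * t_hours)

# Hypothetical data: 4 failures observed in 1000 hours of operational testing.
lam = failure_intensity(failures=4, exposure_hours=1000)
print(f"lambda = {lam} /h, R(100 h) = {reliability(100, lam):.3f}")
```

    Real software reliability engineering refines this with growth models (intensity decreasing as defects are fixed), but the constant-intensity form shows how an operational profile turns test data into a prediction.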

  3. Space Software

    Science.gov (United States)

    1990-01-01

    Xontech, Inc.'s software package, XonVu, simulates the missions of Voyager 1 at Jupiter and Saturn, Voyager 2 at Jupiter, Saturn, Uranus and Neptune, and Giotto in close encounter with Comet Halley. With the program, the user can generate scenes of the planets, moons, stars or Halley's nucleus and tail as seen by Giotto, all graphically reproduced with high accuracy in wireframe representation. Program can be used on a wide range of computers, including PCs. User friendly and interactive, with many options, XonVu can be used by a space novice or a professional astronomer. With a companion user's manual, it sells for $79.

  4. Software architecture

    CERN Document Server

    Vogel, Oliver; Chughtai, Arif

    2011-01-01

    As a software architect you work in a wide-ranging and dynamic environment. You have to understand the needs of your customer, design architectures that satisfy both functional and non-functional requirements, and lead development teams in implementing the architecture. And it is an environment that is constantly changing: trends such as cloud computing, service orientation, and model-driven procedures open up new architectural possibilities. This book will help you to develop a holistic architectural awareness and knowledge base that extends beyond concrete methods, techniques, and technologi

  5. DEVELOPING SOFTWARE FOR CORPUS RESEARCH

    Directory of Open Access Journals (Sweden)

    Oliver Mason

    2008-06-01

    Full Text Available Despite the central role of the computer in corpus research, programming is generally not seen as a core skill within corpus linguistics. As a consequence, limitations in software for text and corpus analysis slow down the progress of research while analysts often have to rely on third party software or even manual data analysis if no suitable software is available. Apart from software itself, data formats are also of great importance for text processing. But again, many practitioners are not very aware of the options available to them, and thus idiosyncratic text formats often make sharing of resources difficult if not impossible. This article discusses some issues relating to both data and processing which should aid researchers to become more aware of the choices available to them when it comes to using computers in linguistic research. It also describes an easy way towards automating some common text processing tasks that can easily be acquired without knowledge of actual computer programming.

  6. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  7. Software Engineering to Professionalize Software Development

    OpenAIRE

    Juan Miguel Alonso; Fernando García

    2011-01-01

    The increasingly important role that software plays in systems with widespread effects presents new challenges for the education of software engineers. Not only is society's dependence on software increasing, but the character of software development is also changing, and with it the demand for certified software developers. This paper proposes some challenges and aspirations that guide the learning processes of Software Engineering and help to identify the need to...

  8. Validation of a new version of software for monitoring the core of nuclear power plant of Laguna Verde Unit 2, at the end of Cycle 10; Validacion de una nueva version del software para monitoreo del nucleo de la Central Laguna Verde Unidad 2, al final del Ciclo 10

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, G.; Calleros, G.; Mata, F. [Comision Federal de Electricidad, Central Nucleoelectrica de Laguna Verde, Carretera Cardel-Nautla Km 42.5, Veracruz (Mexico)], e-mail: gabriel.hernandez05@cfe.gob.mx

    2009-10-15

    This work shows the differences observed in the thermal limits established in the technical specifications of operation between the new software, installed at the end of Cycle 10 of Unit 2 of the Laguna Verde nuclear power plant, and the old software that had been installed from the beginning of the cycle. The methodology made it possible to validate the new software during the coast-down stage, before the end of the cycle, so that it could be used as a tool during the shutdown of Unit 2 at the end of Cycle 10. (Author)

  9. SOFTWARE METRICS VALIDATION METHODOLOGIES IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    K.P. Srinivasan

    2014-12-01

    Full Text Available In the software measurement validations, assessing the validation of software metrics in software engineering is a very difficult task due to lack of theoretical methodology and empirical methodology [41, 44, 45]. During recent years, there have been a number of researchers addressing the issue of validating software metrics. At present, software metrics are validated theoretically using properties of measures. Further, software measurement plays an important role in understanding and controlling software development practices and products. The major requirement in software measurement is that the measures must represent accurately those attributes they purport to quantify and validation is critical to the success of software measurement. Normally, validation is a collection of analysis and testing activities across the full life cycle and complements the efforts of other quality engineering functions and validation is a critical task in any engineering project. Further, validation objective is to discover defects in a system and assess whether or not the system is useful and usable in operational situation. In the case of software engineering, validation is one of the software engineering disciplines that help build quality into software. The major objective of software validation process is to determine that the software performs its intended functions correctly and provides information about its quality and reliability. This paper discusses the validation methodology, techniques and different properties of measures that are used for software metrics validation. In most cases, theoretical and empirical validations are conducted for software metrics validations in software engineering [1-50].
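
    A common form of the empirical validation the abstract discusses is checking whether a candidate metric actually tracks the external attribute it purports to quantify, e.g. by correlating it with observed defect counts. The sketch below uses a hand-rolled Pearson correlation on invented module data purely for illustration.

```python
# Empirical metric validation sketch: does a candidate metric (here, module
# size) correlate with an external quality attribute (defect count)?

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Invented data: metric value and observed defects for five modules.
sizes   = [120, 300, 450, 800, 1500]
defects = [  1,   2,   4,   6,   11]

r = pearson(sizes, defects)
print(f"correlation r = {r:.2f}")
```

    A high correlation alone does not validate a metric (theoretical validation against measurement properties is still required, as the abstract stresses), but it is the usual first empirical check.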

  10. SproutCore web application development

    CERN Document Server

    Keating, Tyler

    2013-01-01

    Written as a practical, step-by-step tutorial, Creating HTML5 Apps with SproutCore is full of engaging examples to help you learn in a practical context.This book is for any person looking to write software for the Web or already writing software for the Web. Whether your background is in web development or in software development, Creating HTML5 Apps with SproutCore will help you expand your skills so that you will be ready to apply the software development principles in the web development space.

  11. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  12. Exploring the Sources of Enterprise Agility in Software Organizations

    OpenAIRE

    Srinivasan, Jayakanth

    2009-01-01

    Software is one of the core elements that drive the modern economy, with visible use in areas such as personal computing, telecommunications and banking, and background use in areas such as aircraft traffic management, nuclear power generation, and automotive control systems. Organizations that build software are unique in that they span industrial domains, and at the core of what they do is codifying human knowledge. When we talk about software organizations, we think of organizations that...

  13. Developing CMS software documentation system

    CERN Document Server

    Stankevicius, Mantas

    2012-01-01

    CMSSW (CMS SoftWare) is the overall collection of software and services needed by the simulation, calibration and alignment, and reconstruction modules that process data so that physicists can perform their analyses. It is a long-term project with a large amount of source code. In large-scale and complex projects it is important to have software documentation that is as up-to-date and automated as possible. The core of the documentation should be version-based and available online with the source code. CMS uses Doxygen and Twiki as the main tools to provide automated and non-automated documentation. Both of them are heavily cross-linked to prevent duplication of information. Doxygen is used to generate functional documentation and dependency graphs from the source code. Twiki is divided into two parts: WorkBook and Software Guide. WorkBook contains tutorial-type instructions on accessing computing resources and using the software to perform analysis within the CMS collaboration, and Software Guide gives further details....

  14. Continuing progress on a lattice QCD software infrastructure

    International Nuclear Information System (INIS)

    We report on the progress of the software effort in the QCD application area of SciDAC. In particular, we discuss how the software developed under SciDAC enabled the aggressive exploitation of leadership computers, and we report on progress in the area of QCD software for multi-core architectures

  15. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  16. Software Model Of Software-Development Process

    Science.gov (United States)

    Lin, Chi Y.; Synott, Debra J.; Levary, Reuven R.

    1990-01-01

    Collection of computer programs constitutes software tool for simulation of medium- to large-scale software-development projects. Necessary to include easily identifiable and more-readily quantifiable characteristics like costs, times, and numbers of errors. Mathematical model incorporating these and other factors of dynamics of software-development process implemented in the Software Life Cycle Simulator (SLICS) computer program. Simulates dynamics of software-development process. In combination with input and output expert software systems and knowledge-based management software system, develops information for use in managing large software-development project. Intended to aid managers in planning, managing, and controlling software-development processes by reducing uncertainties in budgets, required personnel, and schedules.
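
    The kind of dynamics such a simulator captures can be sketched with a toy discrete-time model: each period some code is produced, defects are injected in proportion to it, and review removes a fraction of the latent pool. All rates below are invented for illustration and are not SLICS's actual model.

```python
# Toy discrete-time model of software-development dynamics: code production,
# defect injection, and partial defect removal per month (invented rates).

def simulate(months=12, loc_per_month=2000,
             defects_per_kloc=8.0, removal_rate=0.6):
    loc, latent = 0, 0.0
    history = []
    for _ in range(months):
        loc += loc_per_month
        latent += defects_per_kloc * loc_per_month / 1000  # defects injected
        latent *= (1 - removal_rate)                       # reviews remove some
        history.append(latent)
    return loc, latent, history

loc, latent, _ = simulate()
print(f"{loc} LOC written, ~{latent:.1f} latent defects remain")
```

    Even this toy model shows the feedback a manager cares about: the latent-defect level settles at an equilibrium set by the injection and removal rates, so schedule and staffing changes can be explored by varying the parameters.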

  17. Evolvability of Software Systems

    OpenAIRE

    Nasir, Muhammad-Iftikhar; Iqbal, Rizwan

    2008-01-01

    Software evolvability - meeting the future requirements of the customer - is one of the emerging challenges the software industry is facing nowadays. Software evolvability is the ability of a software system to accommodate future requirements. Studies have shown that software evolvability has large economic benefits, but at the same time it is difficult to assess. Over time many methods have been derived to assess software evolvability. Software evolvability depends upon various characteri...

  18. Software fault tolerance

    OpenAIRE

    Kazinov, Tofik Hasanaga; Mostafa, Jalilian Shahrukh

    2009-01-01

    Because of our present inability to produce error-free software, software fault tolerance is and will continue to be an important consideration in software systems. The root cause of software design errors is the complexity of the systems. This paper surveys various software fault tolerance techniques and methodologies. They fall into two groups: single-version and multi-version software fault tolerance techniques. It is expected that software fault tolerance research will benefit from this research...
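
    The multi-version idea can be illustrated with the simplest such scheme, N-version programming: run several independently developed implementations of the same function and vote on the result. The three "versions" below are invented stand-ins, one with a seeded design error.

```python
# N-version programming sketch: majority voting over independently
# developed implementations masks a design fault in one version.
from collections import Counter

def majority_vote(results):
    """Return the value agreed on by a strict majority, or raise."""
    value, count = Counter(results).most_common(1)[0]
    if count * 2 > len(results):
        return value
    raise RuntimeError("no majority -- fault could not be masked")

# Three hypothetical versions of the same routine; version_c is faulty.
version_a = lambda x: x * x
version_b = lambda x: x ** 2
version_c = lambda x: x + x          # seeded design error

print(majority_vote([v(9) for v in (version_a, version_b, version_c)]))  # 81
```

    Single-version techniques (checkpointing, recovery blocks' acceptance tests) instead detect and recover within one implementation; voting only helps when the versions fail independently.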

  19. Software Innovation in a Mission Critical Environment

    Science.gov (United States)

    Fredrickson, Steven

    2015-01-01

    Operating in mission-critical environments requires trusted solutions, and the preference for "tried and true" approaches presents a potential barrier to infusing innovation into mission-critical systems. This presentation explores opportunities to overcome this barrier in the software domain. It outlines specific areas of innovation in software development achieved by the Johnson Space Center (JSC) Engineering Directorate in support of NASA's major human spaceflight programs, including International Space Station, Multi-Purpose Crew Vehicle (Orion), and Commercial Crew Programs. Software engineering teams at JSC work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements for genuinely mission critical applications. The innovations described, including the use of NASA Core Flight Software and its associated software tool chain, can lead to software that is more affordable, more reliable, better modelled, more flexible, more easily maintained, better tested, and enabling of automation.

  20. Ice Cores

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Records of past temperature, precipitation, atmospheric trace gases, and other aspects of climate and environment derived from ice cores drilled on glaciers and ice...

  1. Neutronics computational methods for cores

    International Nuclear Information System (INIS)

    This engineering-oriented publication contains a detailed presentation of neutronics computational methods for cores. More precisely, it presents neutronics equations: Boltzmann equation for neutron transport, resolution principles, use of high performance computing. The next parts present the problematic (values to be computed, computation software and methods), nuclear data and their processing. Then the authors describe the application of the Monte Carlo method to reactor physics: resolution of the transport equation by the Monte Carlo method, convergence of a Monte Carlo calculation and notion of quality factor, and software. Deterministic methods are then addressed: discretization, processing of resonant absorption, network calculations, core calculation, deterministic software, fuel evolution, and kinetics. The next chapter addresses multi-physical aspects: necessity of a coupling, principles of neutronic/thermal hydraulic coupling, example of an accidental transient. The last part addresses the checking approach, and neutronics computational code validation
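
    The Monte Carlo resolution of the transport equation described above rests on sampling neutron free-flight distances from an exponential distribution. A minimal sketch (not from the publication itself): estimate the probability that a neutron crosses a slab of thickness d without a collision, which can be checked against the analytic answer exp(-sigma_t * d).

```python
# Minimal Monte Carlo transport sketch: uncollided transmission through a
# slab, with free-flight distances sampled from exp(-sigma_t * x).
import math
import random

def transmission(sigma_t: float, d: float, n: int = 100_000,
                 seed: int = 1) -> float:
    rng = random.Random(seed)
    crossed = 0
    for _ in range(n):
        flight = -math.log(rng.random()) / sigma_t  # sampled flight distance
        if flight > d:
            crossed += 1
    return crossed / n

est = transmission(sigma_t=0.5, d=2.0)
print(f"MC ~ {est:.3f}, analytic = {math.exp(-1.0):.3f}")
```

    Production codes add scattering, absorption, energy dependence and variance-reduction techniques, which is where the publication's "convergence and quality factor" discussion comes in, but the estimator above is the core mechanism.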

  2. Economics of Software Engineering

    OpenAIRE

    Mohamed Mohamed Al Hady

    2007-01-01

    An article about the economic aspects of software engineering; it discusses many important issues in this field, such as knowledge economics, the history of software engineering, the prospects of the software industry, and the economics of software, and then discusses the methods followed to improve software economics.

  3. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
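
    Monitoring an executing program against prespecified requirements constraints can be sketched as a runtime checker that evaluates predicates over snapshots of program state. The SAGE specification language itself is not reproduced here; the `Monitor` class and the tank/pump constraints below are hypothetical illustrations.

```python
# Runtime requirements-constraint monitoring sketch: constraints are
# expressed as plain Python predicates over a program-state snapshot.

class Monitor:
    def __init__(self):
        self.constraints = []   # (name, predicate) pairs
        self.violations = []

    def require(self, name, predicate):
        self.constraints.append((name, predicate))

    def check(self, state):
        """Evaluate every constraint against a snapshot of program state."""
        for name, pred in self.constraints:
            if not pred(state):
                self.violations.append((name, dict(state)))

monitor = Monitor()
monitor.require("tank level in range", lambda s: 0.0 <= s["level"] <= 100.0)
monitor.require("pump off when empty", lambda s: s["level"] > 0 or not s["pump_on"])

for state in [{"level": 50.0, "pump_on": True},
              {"level": 0.0,  "pump_on": True}]:   # second snapshot violates
    monitor.check(state)

print(f"{len(monitor.violations)} violation(s) detected")
```

    A visualization layer such as the one the report describes would then render the violation log rather than just printing it.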

  4. A model-driven traceability framework for software product lines

    OpenAIRE

    Anquetil, Nicolas; Kulesza, Uirá; Mitschke, Ralf; Moreira, Ana; Royer, Jean-Claude; Rummler, Andreas; Sousa, André

    2010-01-01

    Software product line (SPL) engineering is a recent approach to software development where a set of software products are derived for a well defined target application domain, from a common set of core assets using analogous means of production (for instance, through Model Driven Engineering). Therefore, such family of products are built from reuse, instead of developed individually from scratch. SPL promise to lower the costs of development, increase the quality of software, give clients mor...

  5. A Practical Evaluation of Next Generation Sequencing & Molecular Cloning Software

    OpenAIRE

    Meintjes, Peter; Qaadri, Kashef; Olsen, Christian

    2013-01-01

    Laboratories using Next Generation Sequencing (NGS) technologies and/ or high-throughput molecular cloning experiments can spend a significant amount of their research budget on data analysis and data management. The decision to develop in-house software, to rely on combinations of free software packages, or to purchase commercial software can significantly affect productivity and ROI. In this talk, we will describe a practical software evaluation process that was developed to assist core fac...

  6. Ontologies for software engineering and software technology

    CERN Document Server

    Calero, Coral; Piattini, Mario

    2006-01-01

    Covers two applications of ontologies in software engineering and software technology: sharing knowledge of the problem domain and using a common terminology among all stakeholders; and filtering the knowledge when defining models and metamodels. This book is of benefit to software engineering researchers in both academia and industry.

  7. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  8. Software Development Practices in Global Software Work : Developing Quality Software

    OpenAIRE

    2005-01-01

    This thesis is about software development practices, including the project management aspects, in the context of global software outsourcing. It was focused on the issues of achieving quality product namely here: software. It is built on the premise that the global context, in which the stakeholders are geographically separated by national boundaries, poses unique and inherent challenges derived from separation of place, time and culture.

  9. Software Development for JSA Source Jerk Measurement

    Institute of Scientific and Technical Information of China (English)

    LUO Huang-da; ZHANG Tao

    2013-01-01

    We have developed a series of experimental measurement systems for the Jordan sub-critical assembly. The source jerk measurement system is used for measuring the reactivity of the sub-critical reactor. It mainly consists of a BF3 neutron detector around the reactor core, a main amplifier, and data acquisition and processing software. The software acquires neutron pulse data by controlling the DAQ card, and displaying
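
    For context, the source jerk method infers reactivity from the prompt drop in count rate when the external source is removed. A minimal sketch of the standard point-kinetics prompt-jump relation (illustrative only; not necessarily the exact algorithm implemented in the JSA software):

```python
def source_jerk_reactivity_dollars(n0, n1):
    """Point-kinetics prompt-jump estimate of reactivity in dollars.

    n0: steady neutron count rate with the external source present
    n1: count rate immediately after the source is jerked out
    (rho/beta = (n1 - n0) / n1, negative for a subcritical core)
    """
    return (n1 - n0) / n1

# Example: the count rate halves when the source is removed,
# so the core is one dollar subcritical.
rho = source_jerk_reactivity_dollars(n0=1000.0, n1=500.0)  # -1.0
```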

  10. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  11. Software and systems traceability

    CERN Document Server

    Cleland-Huang, Jane; Zisman, Andrea

    2012-01-01

    "Software and Systems Traceability" provides a comprehensive description of the practices and theories of software traceability across all phases of the software development lifecycle. The term software traceability is derived from the concept of requirements traceability. Requirements traceability is the ability to track a requirement all the way from its origins to the downstream work products that implement that requirement in a software system. Software traceability is defined as the ability to relate the various types of software artefacts created during the development of software syst

  12. Maximizing ROI on software development

    CERN Document Server

    Sikka, Vijay

    2004-01-01

    A brief review of software development history. Software complexity crisis. Software development ROI. The case for global software development and testing. Software quality and test ROI. How do you implement global software development and testing. Case studies.

  13. Core strengthening.

    Science.gov (United States)

    Arendt, Elizabeth A

    2007-01-01

    Several recent studies have evaluated interventional techniques designed to reduce the risk of serious knee injuries, particularly noncontact anterior cruciate ligament injuries in female athletes. Maintenance of rotational control of the limb underneath the pelvis, especially in response to cutting and jumping activities, is a common goal in many training programs. Rotational control of the limb underneath the pelvis is mediated by a complex set of factors including the strength of the trunk muscles and the relationship between the core muscles. It is important to examine the interrelationship between lower extremity function and core stability. PMID:17472321

  14. cFE/CFS (Core Flight Executive/Core Flight System)

    Science.gov (United States)

    Wildermann, Charles P.

    2008-01-01

    This viewgraph presentation describes in detail the requirements and goals of the Core Flight Executive (cFE) and the Core Flight System (CFS). The Core Flight Software System is a mission-independent, platform-independent Flight Software (FSW) environment integrating a reusable core flight executive (cFE). The CFS goals include: 1) Reduce time to deploy high-quality flight software; 2) Reduce project schedule and cost uncertainty; 3) Directly facilitate formalized software reuse; 4) Enable collaboration across organizations; 5) Simplify sustaining engineering (i.e., FSW maintenance); 6) Scale from small instruments to systems of systems; 7) Provide a platform for advanced concepts and prototyping; and 8) Provide common standards and tools across the branch and NASA-wide.

  15. Evolvable Neural Software System

    Science.gov (United States)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENI s provide the basis for inter-NBF and inter-node connectivity.

  16. Ragnarok: An Architecture Based Software Development Environment

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    The Ragnarok project is an experimental computer science project within the field of software development environments. Taking current problems in software engineering as starting point, a small set of hypotheses are proposed, outlining plausible solutions for problems concerning the management...... entities. A major effort has been invested in the design, development and deployment of a prototype software development environment, Ragnarok, that implements the core of these models. The Ragnarok prototype has been used in three, small- to medium-sized, real development projects for nearly three years...

  17. Continuing engineering education for software engineering professionals

    Energy Technology Data Exchange (ETDEWEB)

    Davis, P.I.

    1992-02-19

    Designers of software for safety-critical applications are impelled to supplement their education through continuing engineering studies in the areas of requirements analysis, hazard identification, risk analysis, fault tolerance, failure modes, and psychology. Today's complex level of design is contributing to opportunities for catastrophic design errors in computer functions where failure of such functions is capable of causing injury and death. A syllabus of post-graduate, core studies within the curricula of five engineering specialties is suggested. Software Engineers are exhorted to undertake a professional, responsible role in safety-critical software design.

  19. Developing LHCb Grid Software: Experiences and Advances

    CERN Document Server

    Stokes-Rees, I; Cioffi, C; Tsaregorodtsev, A; Garonne, V; Graciani, R; Sanchez, M; Frank, M; Closier, J; Kuznetsov, G

    2004-01-01

    The LHCb grid software has been used for two Physics Data Challenges, the most recent of which will have produced 90 TB of data and required over 400 processor-years of computing power. This paper discusses the group's experience with developing Grid Services, interfacing to the LCG, running LHCb experiment software on the grid, and the integration of a number of new technologies into the LHCb grid software. Our experience and utilisation of the following core technologies will be discussed: OGSI, XML-RPC, grid services, LCG middleware, and instant messaging.

  20. Software Testing Techniques and Strategies

    OpenAIRE

    Isha; Sunita Sangwan

    2014-01-01

    Software testing provides a means to reduce errors and cut maintenance and overall software costs. Numerous software development and testing methodologies, tools, and techniques have emerged over the last few decades promising to enhance software quality. This paper describes software testing, the need for software testing, and software testing goals and principles. Further, it describes different software testing techniques and strategies.
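
    As one concrete illustration of a technique covered by such surveys, boundary-value analysis picks test inputs at and next to the edges of each equivalence partition. A minimal sketch, using a hypothetical `grade` function as the unit under test:

```python
def grade(score):
    """Hypothetical unit under test: pass/fail classification."""
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    return "pass" if score >= 50 else "fail"

# Equivalence partitions: invalid-low, fail, pass, invalid-high.
# Boundary-value analysis selects inputs at each partition edge.
cases = {0: "fail", 49: "fail", 50: "pass", 100: "pass"}
for score, expected in cases.items():
    assert grade(score) == expected

# Just outside the valid range must be rejected.
for invalid in (-1, 101):
    try:
        grade(invalid)
        raise RuntimeError("out-of-range score not rejected")
    except ValueError:
        pass
```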

  1. Software productivity improvement through software engineering technology

    Science.gov (United States)

    Mcgarry, F. E.

    1985-01-01

    It has been estimated that NASA expends anywhere from 6 to 10 percent of its annual budget on the acquisition, implementation and maintenance of computer software. Although researchers have produced numerous software engineering approaches over the past 5-10 years, each claiming to be more effective than the others, there is very limited quantitative information verifying the measurable impact that any of these technologies may have in a production environment. At NASA/GSFC, an extended research effort aimed at identifying and measuring software techniques that favorably impact productivity of software development has been active over the past 8 years. Specific, measurable software development technologies have been applied and measured in a production environment. The resulting software development approaches have been shown to be effective in improving both quality and productivity in this one environment.

  2. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact of the ...

  3. Software distribution using xnetlib

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (US), Dept. of Computer Science; Oak Ridge National Lab., TN (US)]; Rowan, T.H. [Oak Ridge National Lab., TN (US)]; Wade, R.C. [Univ. of Tennessee, Knoxville, TN (US), Dept. of Computer Science]

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.
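
    The socket-based retrieval model can be sketched as follows. This is a minimal illustration, not xnetlib's actual protocol or message format, and the `dgesv` entry is only placeholder content: the client asks for a routine by name and receives it in one round trip, instead of an e-mail request/reply cycle.

```python
import socket
import threading

# Placeholder "software collection" served by the sketch.
LIBRARY = {b"dgesv": b"subroutine dgesv(...)\n"}

def serve_once(server):
    """Accept one connection, read a routine name, send it back."""
    conn, _ = server.accept()
    with conn:
        name = conn.recv(1024).strip()
        conn.sendall(LIBRARY.get(name, b"not found"))

server = socket.socket()
server.bind(("127.0.0.1", 0))                  # pick any free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

# The "xnetlib-style" client: request and retrieval in seconds.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"dgesv")
    payload = client.recv(4096)
server.close()
```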

  4. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  5. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  6. Validation of reactor core protection system

    International Nuclear Information System (INIS)

    Reactor COre Protection System (RCOPS), an advanced core protection calculator system, is a digitized system which provides the core protection function based on two reactor core operation parameters, Departure from Nucleate Boiling Ratio (DNBR) and Local Power Density (LPD). It generates a reactor trip signal when the core condition exceeds the DNBR or LPD design limit. It consists of four independent channels adopting a two-out-of-four trip logic. The system configuration, hardware platform and an improved algorithm of the newly designed core protection calculator system are described in this paper. One channel of RCOPS was implemented as a single-channel facility for this R and D project, where we performed final integration software testing. To implement custom function blocks, pSET is used. Software testing is performed by two methods: the first is a 'Software Module Test' and the second is a 'Software Unit Test'. New features include improvement of the core thermal margin through a revised on-line DNBR algorithm, resolution of the latching problem of the control element assembly signal, and addition of pre-trip alarm generation. The change of the on-line DNBR calculation algorithm is expected to improve the DNBR net margin by 2.5%-3.3%. (author)
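
    The two-out-of-four coincidence logic described in the abstract can be sketched as follows (an illustrative model of the voting rule only, not the RCOPS implementation):

```python
def two_out_of_four_trip(channel_trips):
    """Coincidence logic: generate a reactor trip when at least two of
    the four independent channels report DNBR or LPD out of limit."""
    if len(channel_trips) != 4:
        raise ValueError("expected exactly four channel states")
    return sum(bool(t) for t in channel_trips) >= 2

# A single (possibly spurious) channel does not trip the reactor ...
assert not two_out_of_four_trip([True, False, False, False])
# ... but any two concurring channels do.
assert two_out_of_four_trip([True, False, True, False])
```

    The rule tolerates one failed-high channel without a spurious trip and one failed-low channel without losing the protection function, which is why 2oo4 voting is common in safety systems.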

  7. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  8. Software Engineering for Practiced Software Enhancement

    Directory of Open Access Journals (Sweden)

    Rashmi Yadav

    2011-03-01

    The software development scenario, particularly in IT industries, is very competitive and demands development with minimum resources. Software development started and prevailed to an extent in industry without the use of software engineering practices, which were perceived as an overhead. This approach causes overuse of resources such as money, man-hours, and hardware components. This paper attempts to present the causes of inefficiencies in an almost exhaustive way. Further, an attempt has been made to elaborate the software engineering methods as remedies against the listed causes of inefficiencies of development.

  9. EPIC 2011: Third Workshop on Leveraging Empirical Research Results for Software Business Success

    NARCIS (Netherlands)

    Daneva, Maya; Herrmann, Andrea; Regnell, Björn; van de Weerd, Inge; De Troyer, Olga

    2011-01-01

    For many companies, software development is their core business process. For this process to be economically viable, it is not enough that software companies deliver software products that satisfy customers' written specifications. Software businesses also deem other requirements important as to deli

  10. Software Cost Estimation Review

    OpenAIRE

    Ongere, Alphonce

    2013-01-01

    Software cost estimation is the process of predicting the effort, the time and the cost required to complete a software project successfully. It involves size measurement of the software project to be produced, estimating and allocating the effort, drawing the project schedules, and finally, estimating the overall cost of the project. Accurate estimation of software project cost is an important factor for business and the welfare of a software organization in general. If cost and effort estimat...
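
    The abstract does not name a particular model. As one concrete illustration of estimating effort from measured size, the basic COCOMO formula E = a · KLOC^b with the standard coefficients for an "organic" (small, familiar-domain) project:

```python
def basic_cocomo(kloc, a=2.4, b=1.05):
    """Basic COCOMO effort estimate in person-months.

    The defaults a=2.4, b=1.05 are the standard coefficients for an
    'organic' project; other modes use different constants."""
    return a * kloc ** b

effort = basic_cocomo(10.0)   # roughly 27 person-months for 10 KLOC
```

    Schedule and cost then follow from the effort estimate, which is why accurate size measurement matters so much to the downstream steps listed above.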

  11. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

    The field of testability is an active, well-established part of engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or depreciate the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the softwa...

  12. Core BPEL

    DEFF Research Database (Denmark)

    Hallwyl, Tim; Højsgaard, Espen

    The Web Services Business Process Execution Language (WS-BPEL) is a language for expressing business process behaviour based on web services. The language is intentionally not minimal but provides a rich set of constructs, allows omission of constructs by relying on defaults, and supports language....... To make the results of this work directly usable for practical purposes, we provide an XML Schema for Core BPEL and a set of XSLT 1.0 transformations that will transform any standard compliant WS-BPEL process into a Core BPEL process. We also provide an online service where one can apply...... the transformation. This work is part of the initial considerations on the implementation of a WS-BPEL engine within the Computer Supported Mobile Adaptive Business Processes (CosmoBiz) research project at the IT University of Copenhagen....
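
    One flavour of the normalization described above, making a defaulted attribute explicit so that later processing never relies on language defaults, can be sketched outside XSLT. This is an illustrative Python stand-in, not the project's actual XSLT 1.0 transformations; `suppressJoinFailure` defaults to "no" in WS-BPEL 2.0.

```python
import xml.etree.ElementTree as ET

# WS-BPEL 2.0 executable-process namespace.
BPEL_NS = "http://docs.oasis-open.org/wsbpel/2.0/process/executable"

def make_default_explicit(xml_text):
    """Add suppressJoinFailure="no" (the WS-BPEL default) to the process
    element when the attribute was omitted, leaving explicit values alone."""
    root = ET.fromstring(xml_text)
    for elem in root.iter():
        if elem.tag == f"{{{BPEL_NS}}}process":
            elem.attrib.setdefault("suppressJoinFailure", "no")
    return root

process = make_default_explicit(
    f'<process xmlns="{BPEL_NS}" name="order"/>'
)
```

    After normalization every process carries the attribute explicitly, so a Core BPEL engine never needs to know the default.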

  13. Requirement emergence computation of networked software

    Institute of Scientific and Technical Information of China (English)

    HE Keqing; LIANG Peng; PENG Rong; LI Bing; LIU Jing

    2007-01-01

    Emergence computation has become a hot topic in the research of complex systems in recent years. With the substantial increase in scale and complexity of network-based information systems, uncertain user requirements from the Internet and personalized application requirements result in frequent change of software requirements. Meanwhile, software systems with non-self-possessed resources become more and more complex. Furthermore, the interaction and cooperation requirements between software units and running environments in service computing increase the complexity of software systems. Software systems with complex-system characteristics are developing into "networked software" with characteristics of change-on-demand and change-with-cooperation. The concepts of "programming", "compiling" and "running" of software in the common sense are extended from the "desktop" to the "network". The core issue of software engineering is moving to requirements engineering, which becomes the research focus of complex-system software engineering. In this paper, we present the software network view based on complex system theory, and the concepts of networked software and networked requirements. We propose the challenge problem in the research of emergence computation of networked software requirements. A hierarchical and cooperative unified requirement modeling framework, URF (Unified Requirement Framework), and related RGPS (Role, Goal, Process and Service) meta-models are proposed. Five scales and the evolutionary growth mechanism in requirement emergence computation of networked software are given, with focus on user-dominant and domain-oriented requirements, and the rules and predictability in requirement emergence computation are analyzed. A case study in the application of networked e-Business with evolutionary growth based on the State design pattern is presented in the end.

  14. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  15. SLIMarray: Lightweight software for microarray facility management

    Directory of Open Access Journals (Sweden)

    Marzolf Bruz

    2006-10-01

    Background: Microarray core facilities are commonplace in biological research organizations, and need systems for accurately tracking various logistical aspects of their operation. Although these different needs could be handled separately, an integrated management system provides benefits in organization, automation and reduction in errors. Results: We present SLIMarray (System for Lab Information Management of Microarrays), an open source, modular database web application capable of managing microarray inventories, sample processing and usage charges. The software allows modular configuration and is well suited for further development, providing users the flexibility to adapt it to their needs. SLIMarray Lite, a version of the software that is especially easy to install and run, is also available. Conclusion: SLIMarray addresses the previously unmet need for free and open source software for managing the logistics of a microarray core facility.

  16. Promoting Science Software Best Practices: A Scientist's Perspective (Invited)

    Science.gov (United States)

    Blanton, B. O.

    2013-12-01

    Software is at the core of most modern scientific activities, and as societal awareness of, and impacts from, extreme weather, disasters, and climate and global change continue to increase, the roles that scientific software play in analyses and decision-making are brought more to the forefront. Reproducibility of research results (particularly those that enter into the decision-making arena) and open access to the software is essential for scientific and scientists' credibility. This has been highlighted in a recent article by Joppa et al (Troubling Trends in Scientific Software Use, Science Magazine, May 2013) that describes reasons for particular software being chosen by scientists, including that the "developer is well-respected" and on "recommendation from a close colleague". This reliance on recommendation, Joppa et al conclude, is fraught with risks to both sciences and scientists. Scientists must frequently take software for granted, assuming that it performs as expected and advertised and that the software itself has been validated and results verified. This is largely due to the manner in which much software is written and developed; in an ad hoc manner, with an inconsistent funding stream, and with little application of core software engineering best practices. Insufficient documentation, limited test cases, and code unavailability are significant barriers to informed and intelligent science software usage. This situation is exacerbated when the scientist becomes the software developer out of necessity due to resource constraints. Adoption of, and adherence to, best practices in scientific software development will substantially increase intelligent software usage and promote a sustainable evolution of the science as encoded in the software. 
We describe a typical scientist's perspective on using and developing scientific software in the context of storm surge research and forecasting applications that have real-time objectives and regulatory constraints

  17. Software and Hardware Design of an Embedded Test System Based on the NIOS II Soft-core Processor

    Institute of Scientific and Technical Information of China (English)

    张荣; 黄海莹; 李春枝; 卫剑峰; 蒋宇

    2012-01-01

    This paper introduces two system architectures for embedded test systems built around the NIOS II soft-core processor, and describes in detail the hardware and software design method for such systems. Finally, using an EP2C8Q-208C8 FPGA, the working timing logic of the A/D chip is described in Verilog, a serial-port processing unit is designed with the NIOS II soft-core processor, and the data sampled by the A/D converter are sent over the serial port to a computer for display. Practice shows that designing an embedded test system with the NIOS II soft-core processor offers a short development cycle, high system integration, and flexible functionality; compared with the traditional single-chip-microcomputer approach, it provides a higher clock frequency, faster execution and more convenient debugging, and is an embedded test system design method worth promoting.

  18. Development of software phantoms for software validation

    International Nuclear Information System (INIS)

    Nuclear medicine software is expected to meet certain criteria. The specifications are frequently not available to the user and, as a consequence, the performance of a particular software package may not meet the users' expectations. Under most circumstances this may be evident immediately, but frequently the user will assume certain specifications based upon the clinical procedure that is being performed, and assume that the software should function in a certain fashion to give the value of a desired parameter. To this end, it is useful to have a number of software phantoms which can act as standard data sets for validation of the software and ensure that the results obtained do meet expectations. A number of problems surround the development of a set of software phantoms that can be transported between different systems. One solution is the creation of mathematical phantoms, in which case algorithms or source code may be transportable. This paper describes four such mathematical phantoms that have been used to validate an ejection fraction and Fourier analysis package. This particular software package has been found lacking in several respects, none of which would have been evident from the documentation provided. (author). 12 refs, 4 figs
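
    The idea of a mathematical phantom with known ground truth can be sketched for the ejection-fraction case (an illustrative construction; the paper's own four phantoms are not reproduced here): generate a synthetic left-ventricular volume curve whose ejection fraction is known exactly, then check that the package under validation recovers it.

```python
import math

def lv_phantom(edv=120.0, esv=48.0, frames=16):
    """Synthetic LV volume curve over one cardiac cycle: a cosine
    swinging between end-diastolic (edv) and end-systolic (esv) volume,
    so the true ejection fraction is (edv - esv) / edv by construction."""
    mid, amp = (edv + esv) / 2.0, (edv - esv) / 2.0
    return [mid + amp * math.cos(2 * math.pi * k / frames)
            for k in range(frames)]

def ejection_fraction(curve):
    """The quantity a package under validation should reproduce."""
    edv, esv = max(curve), min(curve)
    return (edv - esv) / edv

curve = lv_phantom()              # ground truth: EF = (120-48)/120 = 0.6
ef = ejection_fraction(curve)
```

    Because the phantom is purely mathematical, the same data set can be regenerated on any system, which is the transportability argument the abstract makes.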

  19. ATLAS software packaging

    CERN Document Server

    Rybkin, G

    2012-01-01

    Software packaging is an indispensable part of the build process and a prerequisite for deployment. The full ATLAS software stack consists of TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present the tools, in the package PackDist, developed and used to package all this software except for the TDAQ project. PackDist is based on and driven by CMT, the ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is the CMT project. Each CMT project is packaged as several packages: platform-dependent (one per platform available), source code excluding header files, other platform-independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, the distribution kit, also includes configuration packages and contains some 120 packages for one platform. Also packaged are physics analysis pro...

  20. Improving Software Reliability Forecasting

    NARCIS (Netherlands)

    Burtsy, Bernard; Albeanu, Grigore; Boros, Dragos N.; Popentiu, Florin; Nicola, Victor

    1997-01-01

    This work investigates some methods for software reliability forecasting. A supermodel is presented as a suitable tool for prediction of reliability in software project development. Also, time series forecasting for cumulative interfailure time is proposed and illustrated.
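
    A minimal sketch of interfailure-time forecasting (a simple log-linear trend fit, shown here as one illustrative approach, not the paper's supermodel): under reliability growth the times between failures lengthen, so a line through the log interfailure times extrapolates the next one.

```python
import math

def forecast_next_interfailure(times):
    """Forecast the next interfailure time from a log-linear least-squares
    trend through the observed interfailure times."""
    n = len(times)
    xs = range(n)
    ys = [math.log(t) for t in times]
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    return math.exp(intercept + slope * n)

# Interfailure times doubling each interval: the forecast continues
# the growth trend.
pred = forecast_next_interfailure([1.0, 2.0, 4.0, 8.0])  # ~16
```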

  1. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  2. Imprinting Community College Computer Science Education with Software Engineering Principles

    Science.gov (United States)

    Hundley, Jacqueline Holliday

    2012-01-01

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and…

  3. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic impact in software production. As is also apparent from the call for papers, these perspectives focus on qualities such as reuse, adaptability, and maintainability…

  4. Healthcare Software Assurance

    OpenAIRE

    Cooper, Jason G.; Pauley, Keith A.

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Dru...

  5. SOFAS: Software Analysis Services

    OpenAIRE

    Ghezzi, G

    2010-01-01

    We propose a distributed and collaborative software analysis platform to enable seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. In particular, we devise software analysis tools as services that can be accessed and composed over the Internet. These distributed services shall be widely accessible through a software analysis broker where organizations and research groups can register and share their tools. To enable (semi)-automat...

  6. Software engineering measurement

    CERN Document Server

    Munson, PhD, John C

    2003-01-01

    By demonstrating how to develop simple experiments for the empirical validation of theoretical research and showing how to convert measurement data into meaningful and valuable information, this text fosters more precise use of software measurement in the computer science and software engineering literature. Software Engineering Measurement shows you how to convert your measurement data to valuable information that can be used immediately for software process improvement.

  7. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concerns in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and qualit

  8. Holistic Marketing of Software Products: The New Paradigm

    Directory of Open Access Journals (Sweden)

    Dr. Ashutosh Nigam

    2011-05-01

    Full Text Available Software product firms need to be competent in offering services amid the ever-changing demands of a dynamic marketing environment. To overcome these barriers, firms should deploy holistic marketing strategies based on established niche markets for specialized software products. Holistic marketing embraces all aspects of a software firm's products and customized solutions. The concept stresses the interrelationship with stakeholders to achieve distinction, with a core focus on customer requirements.

  9. Software Language Evolution

    NARCIS (Netherlands)

    Vermolen, S.D.

    2012-01-01

    Software plays a critical role in our daily life. Vast amounts of money are spent on more and more complex systems. All software, regardless of whether it controls a plane or the game on your phone, is never finished. Software changes when it contains bugs or when new functionality is added. This process of

  10. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  11. Java for flight software

    Science.gov (United States)

    Benowitz, E.; Niessner, A.

    2003-01-01

    This work involves developing representative mission-critical spacecraft software using the Real-Time Specification for Java (RTSJ). This work currently leverages actual flight software used in the design of NASA's Deep Space 1 (DS1), which flew in 1998.

  12. Software engineering: a roadmap

    OpenAIRE

    Finkelstein, A.; Kramer, J.

    2000-01-01

    This paper provides a roadmap for software engineering. It identifies the principal research challenges being faced by the discipline and brings together the threads derived from the key research specialisations within software engineering. The paper draws heavily on the roadmaps covering specific areas of software engineering research collected in this volume.

  13. Computer software quality assurance

    International Nuclear Information System (INIS)

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  14. Software Maintenance Success Recipes

    CERN Document Server

    Reifer, Donald J

    2011-01-01

    Dispelling much of the folklore surrounding software maintenance, Software Maintenance Success Recipes identifies actionable formulas for success based on in-depth analysis of more than 200 real-world maintenance projects. It details the set of factors that are usually present when effective software maintenance teams do their work and instructs on the methods required to achieve success. Donald J. Reifer--an award winner for his contributions to the field of software engineering and whose experience includes managing the DoD Software Initiatives Office--provides step-by-step guidance on how t

  15. Trustworthiness of Internet-based software

    Institute of Scientific and Technical Information of China (English)

    WANG Huaimin; TANG Yangbin; YIN Gang; LI Lei

    2006-01-01

    Recent years have seen increasing concern over the trustworthiness of Internet-based software. By analyzing the trustworthiness of Internet-based software and the nature of Internet applications, we point out that, on the one hand, due to the openness and dynamic nature of the Internet, the identity trustworthiness and the capability trustworthiness of the software face serious challenges; on the other hand, in order to ensure the trustworthiness of the whole system, emerging computing paradigms based on the collaboration of autonomous software must have some impact on the behavior of the software. Here we put forward a conceptual model for the trustworthiness of Internet-based software, and propose a trustworthy assurance framework for the Internet-based virtual computing environment (iVCE). This framework deals with the trustworthy properties of software on identity, capability and behavior in a combined way. Authorization management in the inter-domain computing environment, assurance of high availability of service, and an incentive mechanism for autonomic collaboration are taken as the three core mechanisms of iVCE trustworthy assurance.

  16. Mathematical software production

    Energy Technology Data Exchange (ETDEWEB)

    Cowell, W. R.; Fosdick, L. D.

    1977-01-01

    Locally constructed collections of mathematical routines are gradually being replaced by mathematical software that has been produced for broad dissemination and use. The process of producing such software begins with algorithmic analysis, and proceeds through software construction and documentation to extensive testing and, finally, to distribution and support of the software products. These are demanding and costly activities which require such a range of skills that they are carried out in collaborative projects. The costs and effort are justified by the utility of high-quality software, the efficiency of producing it for general distribution, and the benefits of providing a conduit from research to applications. This paper first reviews certain of the early developments in the field of mathematical software. Then it examines the technical problems that distinguish software production as an intellectual activity, problems whose descriptions also serve to characterize ideal mathematical software. Next, three mathematical software projects are sketched with attention to their emphasis, accomplishments, organization, and costs. Finally, comments are offered on possible future directions for mathematical software production, as extrapolations of the present involvement of universities, government laboratories, and private industry. 48 references.

  17. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performance. This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. We discuss these issues along with use case scenarios. Here in this paper we aim to identify challenges…

  18. Trace Software Pipelining

    Institute of Scientific and Technical Information of China (English)

    Wang Jian; Andreas Krall; et al.

    1995-01-01

    Global software pipelining is a complex but efficient compilation technique to exploit instruction-level parallelism for loops with branches. This paper presents a novel global software pipelining technique, called Trace Software Pipelining, targeted to instruction-level parallel processors such as Very Long Instruction Word (VLIW) and superscalar machines. Trace software pipelining applies a global code scheduling technique to compact the original loop body. The resulting loop is called a trace software pipelined (TSP) code. The trace software pipelined code can be directly executed with special architectural support or can be transformed into a globally software pipelined loop for current VLIW and superscalar processors. Thus, exploiting parallelism across all iterations of a loop can be completed through compacting the original loop body with any global code scheduling technique. This makes our new technique very promising in practical compilers. Finally, we also present preliminary experimental results to support our new approach.

  19. Social software in global software development

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2010-01-01

    Social software (SoSo) is defined by Farkas as tools that (1) allow people to communicate, collaborate, and build community online, (2) can be syndicated, shared, reused or remixed, and (3) let people learn easily from and capitalize on the behavior and knowledge of others [1]. SoSo includes a wide variety of tools such as instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, and virtual worlds. Though normally belonging rather to the private realm, the use of social software in a corporate context has been reported, e.g. as a way…

  20. Software Quality Assurance in Software Projects: A Study of Pakistan

    Directory of Open Access Journals (Sweden)

    Faisal Shafique Butt

    2013-05-01

    Full Text Available Software quality is a specific property which tells what kind of standard software should have. In a software project, quality is the key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software-related organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good quality software. Quality is divided into three main layers: Software Quality Assurance (SQA), Software Quality Plan (SQP) and Software Quality Control (SQC). In this study, we discuss the quality standards and principles of software projects in the Pakistan software industry and how these implemented quality standards are measured and managed. We will see how many software firms follow the rules of CMMI to create software, how many reach international standards, and how many firms measure the quality of their projects. The results show that some of the companies are using software quality assurance techniques in Pakistan.

  1. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model; Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models: Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept

  2. Software Preservation Benefits Framework

    OpenAIRE

    Chue Hong, Neil; Crouch, Steve; Hettrick, Simon; Parkinson, Tim; Shreeve, Matt

    2010-01-01

    An investigation of software preservation has been carried out by Curtis+Cartwright Consulting Limited, in partnership with the Software Sustainability Institute (SSI), on behalf of the JISC. The aim of the study was to raise awareness and build capacity throughout the Further and Higher Education (FE/HE) sector to engage with preservation issues as part of the process of software development. Part of this involved examining the purpose and benefits of employing preservation measures in relat...

  3. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners. It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice, with a focus on the software process policymaking and process control aspects of improvement efforts…

  4. The Other Software

    OpenAIRE

    McWilliams, Chandler B.

    2009-01-01

    This paper considers the absence of the human actor, specifically the programmer, from Friedrich Kittler’s analysis of software in his essay There is no Software. By focusing too intently on the machine and its specific, material existence, Kittler removes the human user / operator / writer from his analysis of software. Thus, he has no choice but to interpret the layers of language, assembler, opcode and WordPerfect, DOS, BIOS—both chains ending in an essentializing reduction to voltages—as ...

  5. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence;

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution… one particular framework named TranSAT, which addresses the above problems of software architecture evolution. TranSAT provides a new element in the software architecture description language, called an architectural aspect, for describing new concerns and their integration into an existing…

  6. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  7. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  8. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. Explains two maintenance standards: IEEE/EIA 1219 and ISO/IEC 14764. Discusses several commercial reverse and domain engineering toolkits. Slides for instructors are available online. Information is based on the IEEE SWEBOK (Software Engineering Body of Knowledge).

  9. Software Requirements Management

    OpenAIRE

    Ali Altalbe

    2015-01-01

    Requirements are defined as the desired set of characteristics of a product or a service. In the world of software development, it is estimated that more than half of failures are attributed to poor requirements management. This means that although the software functions correctly, it is not what the client requested. Modern software requirements management methodologies are available to reduce the occurrence of such incidents. This paper performs a review of the available literatur...

  10. Gammasphere software development

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1993-01-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an object-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere).

  11. Marketing Mix del Software.

    OpenAIRE

    Yudith del Carmen Rodríguez Pérez

    2006-01-01

    Software engineering and software quality models have consolidated their efforts in the software production process, yet they contribute little to the commercialization process. It is essential in computer science to develop a commercialization model for software-producing organizations in order to raise their productivity. However, it is first necessary to know the characteristics of the software product that differentiate it from other…

  12. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  13. MYOB software for dummies

    CERN Document Server

    Curtis, Veechi

    2012-01-01

    Your complete guide to MYOB® AccountRight software. Now in its seventh edition, MYOB® Software For Dummies walks you through everything you need to know, from starting your MYOB® file from scratch and recording payments and receipts, to tracking profit and analysing sales. This new edition includes all the information you need on the new generation of MYOB® AccountRight software, including the new cloud computing features. Set up MYOB® software - understand how to make it work the first time. Keep track of purchases and sales - monitor customer accounts and ensure you get pai

  14. Agile software development

    CERN Document Server

    Dingsoyr, Torgeir; Moe, Nils Brede

    2010-01-01

    Agile software development has become an umbrella term for a number of changes in how software developers plan and coordinate their work, how they communicate with customers and external stakeholders, and how software development is organized in small, medium, and large companies, from the telecom and healthcare sectors to games and interactive media. Still, after a decade of research, agile software development is the source of continued debate due to its multifaceted nature and insufficient synthesis of research results. Dingsoyr, Dyba, and Moe now present a comprehensive snapshot of the kno

  15. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  16. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

    This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives – Product, Project, Process, and People – to identify an outlook for software innovation. The paper then describes a new facility – Software Innovation Research Lab (SIRL) – and a new method concept for software innovation – Essence – based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research.

  17. Architecture for Verifiable Software

    Science.gov (United States)

    Reinholtz, William; Dvorak, Daniel

    2005-01-01

    Verifiable MDS Architecture (VMA) is a software architecture that facilitates the construction of highly verifiable flight software for NASA's Mission Data System (MDS), especially for smaller missions subject to cost constraints. More specifically, the purpose served by VMA is to facilitate aggressive verification and validation of flight software while imposing a minimum of constraints on overall functionality. VMA exploits the state-based architecture of the MDS and partitions verification issues into elements susceptible to independent verification and validation, in such a manner that scaling issues are minimized, so that relatively large software systems can be aggressively verified in a cost-effective manner.

  18. A hardware/software co-optimization approach for embedded software of MP3 decoder

    Institute of Scientific and Technical Information of China (English)

    ZHANG Wei; LIU Peng; ZHAI Zhi-bo

    2007-01-01

    In order to improve the efficiency of embedded software running on a processor core, this paper proposes a hardware/software co-optimization approach for embedded software from the system point of view. The proposed stepwise methods aim at exploiting the structure and resources of the processor as much as possible for software algorithm optimization. To achieve low memory usage and a low frequency requirement for the same performance, this co-optimization approach was used to optimize the embedded software of an MP3 decoder based on a 16-bit fixed-point DSP core. After the optimization, decoding 128 kbps, 44.1 kHz stereo MP3 on the DSP evaluation platform requires 45.9 MIPS and 20.4 kbytes of memory space. The optimization rates reach 65.6% for memory and 49.6% for frequency, respectively, compared with the results produced by the compiler using floating-point computation. The experimental results indicate the effectiveness of the hardware/software co-optimization approach, depending on the algorithm and architecture.
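
The core of moving from floating-point to 16-bit fixed-point computation on such a DSP can be sketched as below. This is a generic Q15 illustration, not the paper's decoder code; the format choice and scaling are assumptions, though Q15 is the common convention on 16-bit fixed-point cores.

```python
# Generic Q15 fixed-point arithmetic sketch (not taken from the paper).

Q15_ONE = 1 << 15  # 32768; Q15 represents values in [-1, 1)

def to_q15(x):
    """Convert a float in [-1, 1) to a saturated 16-bit Q15 integer."""
    return max(-Q15_ONE, min(Q15_ONE - 1, int(round(x * Q15_ONE))))

def q15_mul(a, b):
    """Fixed-point multiply: full product shifted back by 15 bits,
    as a 16-bit DSP multiply-accumulate unit would do in hardware."""
    return (a * b) >> 15

a, b = to_q15(0.5), to_q15(-0.25)
product = q15_mul(a, b)          # approximates 0.5 * -0.25 = -0.125
approx = product / Q15_ONE       # back to a float for inspection
```

Replacing compiler-generated floating-point emulation with such integer operations is what yields the frequency and memory savings the abstract reports.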

  19. A Prototype for the Support of Integrated Software Process Development and Improvement

    Science.gov (United States)

    Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian

    An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process can result in efficiency. This paper hence proposes a software process maintenance framework which consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.

  20. Architecture of the ATLAS High Level Trigger Event Selection Software

    CERN Document Server

    Grothe, M; Baines, J T M; Bee, C P; Biglietti, M; Bogaerts, A; Boisvert, V; Bosman, M; Brandt, S; Caron, B; Casado, M P; Cataldi, G; Cavalli, D; Cervetto, M; Comune, G; Corso-Radu, A; Di Mattia, A; Díaz-Gómez, M; Dos Anjos, A; Drohan, J; Ellis, Nick; Elsing, M; Epp, B; Etienne, F; Falciano, S; Farilla, A; George, S; Ghete, V M; González, S; Kaczmarska, A; Karr, K M; Khomich, A; Konstantinidis, N P; Krasny, W; Li, W; Lowe, A; Luminari, L; Ma, H; Meessen, C; Mello, A G; Merino, G; Morettini, P; Moyse, E; Nairz, A; Negri, A; Nikitin, N V; Nisati, A; Padilla, C; Parodi, F; Pérez-Réale, V; Pinfold, J L; Pinto, P; Polesello, G; Qian, Z; Rajagopalan, S; Resconi, S; Rosati, S; Scannicchio, D A; Schiavi, C; Schörner-Sadenius, T; Segura, E; De Seixas, J M; Shears, T G; Sivoklokov, S Yu; Smizanska, M; Soluk, R A; Stanescu, C; Tapprogge, Stefan; Touchard, F; Vercesi, V; Watson, A; Wengler, T; Werner, P; Wheeler, S; Wickens, F J; Wiedenmann, W; Wielers, M; Zobernig, G; CHEP 2003 Computing in High Energy Physics; Grothe, Monika

    2004-01-01

    The ATLAS High Level Trigger (HLT) consists of two selection steps: the second level trigger and the event filter. Both will be implemented in software, running on mostly commodity hardware. Both levels have a coherent approach to event selection, so a common core software framework has been designed to maximize this coherency, while allowing sufficient flexibility to meet the different interfaces and requirements of the two different levels. The approach is extended further to allow the software to run in an off-line simulation and reconstruction environment for the purposes of development. This paper describes the architecture and high level design of the software.

  1. Integrating Behaviour in Software Models: An Event Coordination Notation

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2011-01-01

    One of the main problems in model-based software engineering is modelling behaviour in such a way that the behaviour models can be easily integrated with each other, with the structural software models and with pre-existing software. In this paper, we propose an event coordination notation (ECNO) that deals with this problem. We present the main concepts and rationales behind this notation and discuss a prototype and run-time environment that executes these models, and provides an API so that other parts of the software can be easily integrated. The core concepts of the ECNO seem to be stabilizing…

  2. Systems and software variability management concepts, tools and experiences

    CERN Document Server

    Capilla, Rafael; Kang, Kyo-Chul

    2013-01-01

    The success of product line engineering techniques in the last 15 years has popularized the use of software variability as a key modeling approach for describing the commonality and variability of systems at all stages of the software lifecycle. Software product lines enable a family of products to share a common core platform, while allowing for product specific functionality being built on top of the platform. Many companies have exploited the concept of software product lines to increase the resources that focus on highly differentiating functionality and thus improve their competitiveness

  3. ATLAS software packaging

    Science.gov (United States)

    Rybkin, Grigory

    2012-12-01

    Software packaging is an indispensable part of the build process and a prerequisite for deployment. The full ATLAS software stack consists of TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present the tools, package PackDist, developed and used to package all this software except for the TDAQ project. PackDist is based on and driven by CMT, the ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is the CMT project. Each CMT project is packaged as several packages—platform dependent (one per available platform), source code excluding header files, other platform-independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, the distribution kit, also includes configuration packages and contains some 120 packages for one platform. Also packaged are physics analysis projects (currently 6) used by particular physics groups on top of the full release. The tools provide an installation test for the full distribution kit. Packaging is done in two formats, for use with the Pacman and RPM package managers. The tools are functional on the platforms supported by ATLAS—GNU/Linux and Mac OS X. The packaged software is used for software deployment on all ATLAS computing resources, from the detector and trigger computing farms, collaboration laboratories' computing centres, grid sites, to physicist laptops and CERN CVMFS, and covers the use cases of running all applications as well as software development.
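
The recursive, dependencies-first packaging the abstract describes can be sketched as a depth-first walk over a project dependency graph. The project names and graph below are a toy loosely mirroring the ATLAS stack layering; this is an illustration of the traversal order, not PackDist's actual code.

```python
# Hypothetical sketch: package each project once, dependencies first.

def package_recursively(project, deps, packaged=None):
    """Depth-first traversal that yields a dependencies-first packaging order."""
    if packaged is None:
        packaged = []
    if project in packaged:              # package each project only once
        return packaged
    for dep in deps.get(project, []):    # recurse into dependencies first
        package_recursively(dep, deps, packaged)
    packaged.append(project)             # then package the project itself
    return packaged

# Toy, assumed dependency graph (acyclic): Offline builds on HLT,
# which builds on the external packages.
deps = {"Offline": ["HLT"], "HLT": ["externals"], "externals": []}
order = package_recursively("Offline", deps)
```

In the real tool each entry in `order` would then be split into the several per-project packages (platform-dependent binaries, sources, documentation, and so on) listed in the abstract.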

  4. Software Validation Infrastructure for the ATLAS Trigger

    CERN Document Server

    Adorisio, C; Beauchemin, P; Bell, P; Biglietti, M; Coccaro, A; Damazio, D; Ehrenfeld, W; Faulkner, P; George, S; Giagu, S; Goncalo, R; Hamilton, A; Jones, G; Kirk, J; Kwee, R; Lane, J; Enoque Ferreira de Lima, D; Masik, J; Mincer, A; Monticelli, F; Omachi, C; Oyarzun, A; Panikashvili, N; Potter, C; Quinonez, F; Reinsch, A; Robinson, M; Rodríguez, D; Sarkisyan-Grinbaum, E; Sidoti, A; Sinev, N; Strom, D; Sutton, M; Ventura, A; Winklmeier, F; Zhao, L

    2009-01-01

    The ATLAS trigger system is responsible for selecting the interesting collision events delivered by the Large Hadron Collider (LHC). The ATLAS trigger will need to achieve a ~10^-7 rejection factor against random proton-proton collisions, while still efficiently selecting interesting events. After a first processing level based on hardware, the final event selection is based on custom software running on two CPU farms containing around two thousand multi-core machines. This is known as the high-level trigger (HLT). Running the trigger online during long periods demands very high quality software: it must be fast, performant, and essentially bug-free. With more than 100 contributors and around 250 different packages, a thorough validation of the HLT software is essential. This relies on a variety of unit and integration tests as well as on software metrics, and uses both in-house and open source software. This contribution presents the existing infrastructure used for validating the high-level trigger software...

  5. Agile Software Methodologies: Strength and Weakness

    OpenAIRE

    Dr. Adel Hamdan Mohammad; Dr. Tariq Alwada’n; Dr. Jafar "M.Ali" Ababneh

    2013-01-01

    Agile methodologies are great software development methodologies, and without doubt they have a widespread reputation. The core of agile methodologies is people: the customer and each team member in agile development teams are the key success or failure factors in the agile process. In this paper the authors demonstrate the strength and weakness points of agile methodologies, and also demonstrate how these strength and weakness factors can affect the overall results of the agile development process.

  6. Agile Software Methodologies: Strength and Weakness

    Directory of Open Access Journals (Sweden)

    Dr. Adel Hamdan Mohammad

    2013-03-01

    Full Text Available Agile methodologies are great software development methodologies, and without doubt they have a widespread reputation. The core of agile methodologies is people: the customer and each team member in agile development teams are the key success or failure factors in the agile process. In this paper the authors demonstrate the strength and weakness points of agile methodologies, and also demonstrate how these strength and weakness factors can affect the overall results of the agile development process.

  7. Multicore Considerations for Legacy Flight Software Migration

    Science.gov (United States)

    Vines, Kenneth; Day, Len

    2013-01-01

    In this paper we will discuss potential benefits and pitfalls when considering a migration from an existing single core code base to a multicore processor implementation. The results of this study present options that should be considered before migrating fault managers, device handlers and tasks with time-constrained requirements to a multicore flight software environment. Possible future multicore test bed demonstrations are also discussed.

  8. SOFTWARE MEASUREMENTS AND METRICS: ROLE IN EFFECTIVE SOFTWARE TESTING

    OpenAIRE

    Sheikh Umar Farooq; S. M. K. Quadri,; Nesar Ahmad

    2011-01-01

    Measurement has always been fundamental to progress in any engineering discipline, and software testing is no exception. Software metrics have been used in making quantitative and qualitative decisions as well as in risk assessment and reduction in software projects. In this paper we discuss software measurement and metrics and their fundamental role in the software development life cycle. Focusing on software test metrics, this paper discusses their key role in the software testing process and also cl...

  9. Who Owns Computer Software?

    Science.gov (United States)

    Branscomb, Anne Wells

    1995-01-01

    Discusses the protection of intellectual property as it applies to computer software and its impact on private enterprise and the public good. Highlights include the role of patents, copyrights, and trade secrets; some court cases; and recommendations for alternatives to the existing legal framework for protecting computer software. (KRN)

  10. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu;

    2004-01-01

    This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), an XML-based metaprogramming technique. As the software evolves, a large number of variants may arise, especially when such evolutions are related to multiple platforms as shown in o...

  11. Measuring software technology

    Science.gov (United States)

    Agresti, W. W.; Card, D. N.; Church, V. E.; Page, G.; Mcgarry, F. E.

    1983-01-01

    Results are reported from a series of investigations into the effectiveness of various methods and tools used in a software production environment. The basis for the analysis is a project data base, built through extensive data collection and process instrumentation. The project profiles become an organizational memory, serving as a reference point for an active program of measurement and experimentation on software technology.

  12. Fastbus software progress

    International Nuclear Information System (INIS)

    The current status of the Fastbus software development program of the Fastbus Software Working Group is reported, and future plans are discussed. A package of Fastbus interface subroutines has been prepared as a proposed standard, language support for diagnostics and bench testing has been developed, and new documentation to help users find these resources and use them effectively is being written

  13. Marketing Mix del Software.

    Directory of Open Access Journals (Sweden)

    Yudith del Carmen Rodríguez Pérez

    2006-03-01

    This paper therefore defines the concept of a software product, characterizes it, and presents its quality attributes. It also addresses the marketing mix that software requires, which differs from that of other products, for software to succeed in the market.

  14. Measuring software design

    Science.gov (United States)

    1986-01-01

    An extensive series of studies of software design measures conducted by the Software Engineering Laboratory is described. Included are the objectives and results of the studies, the method used to perform the studies, and the problems encountered. The document should be useful to researchers planning similar studies as well as to managers and designers concerned with applying quantitative design measures.

  15. Threats to Bitcoin Software

    OpenAIRE

    Kateraas, Christian H

    2014-01-01

    We collect and analyse threat models for the Bitcoin ecosystem and its software, then create misuse cases, attack trees, and sequence diagrams of the threats. From the gathered threat models we build a malicious client. Once the development of the client is complete, we test the client and evaluate its performance. From this, we assess the security of the Bitcoin software.

  16. Software business models and contexts for software innovation: key areas software business research

    OpenAIRE

    Käkölä, Timo

    2003-01-01

    This paper examines business, design, and product development aspects of software business models. Contexts of small and large companies for creating software innovations are also analysed. Finally, software business research is called for and an agenda for software business research is presented to better understand the dynamics of the software industry and help create and manage successful software-intensive ventures.

  17. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  18. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Full Text Available Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is through the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third part discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  19. Software Requirements Management

    Directory of Open Access Journals (Sweden)

    Ali Altalbe

    2015-04-01

    Full Text Available Requirements are defined as the desired set of characteristics of a product or a service. In the world of software development, it is estimated that more than half of the failures are attributed to poor requirements management. This means that although the software functions correctly, it is not what the client requested. Modern software requirements management methodologies are available to reduce the occurrence of such incidents. This paper reviews the available literature in the area while tabulating possible methods of managing requirements. It also highlights the benefits of following a proper guideline for the requirements management task. With the introduction of specific software tools for the requirements management task, better software products are now being developed with fewer resources.

  20. Revisiting software ecosystems research

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2016-01-01

    ‘Software ecosystems’ is argued to have first appeared as a concept more than 10 years ago, and software ecosystem research started to take off in 2010. We conduct a systematic literature study, based on the most extensive literature review in the field to date, with two primary aims: (a) to provide an updated overview of the field and (b) to document evolution in the field. In total, we analyze 231 papers from 2007 until 2014 and provide an overview of the research in software ecosystems. Our analysis reveals a field that is rapidly growing both in volume and empirical focus while becoming more mature, but also identifies issues that keep the field from evolving; we propose means for future research and the community to address them. Finally, our analysis shapes the view of the field as having evolved outside the existing definitions of software ecosystems, and we thus propose updating the definition of software ecosystems.

  1. DIVERSIFICATION IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    Er.Kirtesh Jailia,

    2010-06-01

    Full Text Available In this paper we examine the factors that have promoted the diversification of software process models. The intention is to understand more clearly the problem-solving process in software engineering and to find an efficient way to manage risk. A review of software process modeling is given first, followed by a discussion of process evaluation techniques. A taxonomy for categorizing process models, based on establishing decision criteria, is identified that can guide the selection of the appropriate model from a set of alternatives on the basis of model characteristics and software project needs. We propose a model in this paper for dealing with diversification in software engineering.

  2. Software licenses: Stay honest!

    CERN Multimedia

    Computer Security Team

    2012-01-01

    Do you recall our article about copyright violation in the last issue of the CERN Bulletin, “Music, videos and the risk for CERN”? Now let’s be more precise. “Violating copyright” not only means the illegal download of music and videos, it also applies to software packages and applications.   Users must respect proprietary rights in compliance with the CERN Computing Rules (OC5). Not having legitimately obtained a program or the required licenses to run that software is not a minor offense. It violates CERN rules and puts the Organization at risk! Vendors deserve credit and compensation. Therefore, make sure that you have the right to use their software. In other words, you have bought the software via legitimate channels and use a valid and honestly obtained license. This also applies to “Shareware” and software under open licenses, which might also come with a cost. Usually, only “Freeware” is complete...

  3. Software safety hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  4. Dual-core antiresonant hollow core fibers.

    Science.gov (United States)

    Liu, Xuesong; Fan, Zhongwei; Shi, Zhaohui; Ma, Yunfeng; Yu, Jin; Zhang, Jing

    2016-07-25

    In this work, dual-core antiresonant hollow core fibers (AR-HCFs) are numerically demonstrated, to the best of our knowledge, for the first time. Two fiber structures are proposed. One is a composite of two single-core nested nodeless AR-HCFs, exhibiting low confinement loss and a circular mode profile in each core. The other has a relatively simple structure, with a whole elliptical outer jacket, presenting a uniform and wide transmission band. The modal couplings of the dual-core AR-HCFs rely on a unique mechanism that transfers power through the air. The core separation and the gap between the two cores influence the modal coupling strength. With proper designs, both of the dual-core fibers can have low phase birefringence and short modal coupling lengths of several centimeters. PMID:27464191
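
    The centimeter-scale modal coupling lengths quoted above can be read against standard two-core coupled-mode theory, a general relation not derived in this record:

```latex
% Power launched into one core of an identical-core pair transfers fully
% to the other core over the coupling length L_c, set by the coupling
% coefficient \kappa or, equivalently, by the effective-index splitting
% of the even and odd supermodes:
L_c = \frac{\pi}{2\kappa} = \frac{\lambda}{2\left(n_{\mathrm{even}} - n_{\mathrm{odd}}\right)}
```

    A weaker coupling coefficient (larger core separation, smaller gap) therefore lengthens L_c, consistent with the dependence on core separation and gap described in the abstract.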

  5. Dual-core antiresonant hollow core fibers.

    Science.gov (United States)

    Liu, Xuesong; Fan, Zhongwei; Shi, Zhaohui; Ma, Yunfeng; Yu, Jin; Zhang, Jing

    2016-07-25

    In this work, dual-core antiresonant hollow core fibers (AR-HCFs) are numerically demonstrated, to the best of our knowledge, for the first time. Two fiber structures are proposed. One is a composite of two single-core nested nodeless AR-HCFs, exhibiting low confinement loss and a circular mode profile in each core. The other has a relatively simple structure, with a whole elliptical outer jacket, presenting a uniform and wide transmission band. The modal couplings of the dual-core AR-HCFs rely on a unique mechanism that transfers power through the air. The core separation and the gap between the two cores influence the modal coupling strength. With proper designs, both of the dual-core fibers can have low phase birefringence and short modal coupling lengths of several centimeters.

  6. Software abstractions logic, language, and analysis

    CERN Document Server

    Jackson, Daniel

    2011-01-01

    In Software Abstractions Daniel Jackson introduces an approach to software design that draws on traditional formal methods but exploits automated tools to find flaws as early as possible. This approach--which Jackson calls "lightweight formal methods" or "agile modeling"--takes from formal specification the idea of a precise and expressive notation based on a tiny core of simple and robust concepts but replaces conventional analysis based on theorem proving with a fully automated analysis that gives designers immediate feedback. Jackson has developed Alloy, a language that captures the essence of software abstractions simply and succinctly, using a minimal toolkit of mathematical notions. This revised edition updates the text, examples, and appendixes to be fully compatible with the latest version of Alloy (Alloy 4). The designer can use automated analysis not only to correct errors but also to make models that are more precise and elegant. This approach, Jackson says, can rescue designers from "the tarpit of...

  7. CoreDevRec: Automatic Core Member Recommendation for Contribution Evaluation

    Institute of Scientific and Technical Information of China (English)

    蒋竞; 贺佳欢; 陈学渊

    2015-01-01

    The pull-based software development model helps developers make contributions flexibly and efficiently. Core members evaluate code changes submitted by contributors, and decide whether or not to merge these code changes into repositories. Ideally, code changes are assigned to core members and evaluated within a short time after their submission. However, in reality, some popular projects receive many pull requests, and core members have difficulties in choosing which pull requests to evaluate. Therefore, there is a growing need for automatic core member recommendation, which improves the evaluation process. In this paper, we investigate pull requests with manual assignment. Results show that 3.2% to 40.6% of pull requests are manually assigned to specific core members. To assist with the manual assignment, we propose CoreDevRec to recommend core members for contribution evaluation in GitHub. CoreDevRec uses support vector machines to analyze different kinds of features, including file paths of modified code, relationships between contributors and core members, and activeness of core members. We evaluate CoreDevRec on 18 651 pull requests of five popular projects in GitHub. Results show that CoreDevRec achieves accuracy from 72.9% to 93.5% for top 3 recommendation. In comparison with a baseline approach, CoreDevRec improves the accuracy from 18.7% to 81.3% for top 3 recommendation. Moreover, CoreDevRec even has higher accuracy than manual assignment in the project TrinityCore. We believe that CoreDevRec can improve the assignment of pull requests.
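
    The feature-based idea behind such a recommender can be illustrated with a much-simplified sketch. CoreDevRec itself trains support vector machines over its features; this toy version (hypothetical member names, file paths, and weights) merely scores two of the feature families named above, path similarity and reviewer activeness, and ranks core members by total score:

```python
# Much-simplified sketch of recommending core members for a pull request.
# CoreDevRec uses SVMs over richer features; here we only score file-path
# overlap with past evaluations plus recent activeness. All names and the
# 0.5 activeness weight are hypothetical.
from collections import Counter

def recommend(pull_request, history, top_k=3):
    """Rank core members by path similarity to past PRs and activeness."""
    scores = Counter()
    for past in history:
        overlap = len(set(pull_request["paths"]) & set(past["paths"]))
        scores[past["assignee"]] += overlap       # file-path similarity
    for past in history[-10:]:
        scores[past["assignee"]] += 0.5           # recent activeness
    return [member for member, _ in scores.most_common(top_k)]

history = [
    {"paths": ["src/net.c"], "assignee": "alice"},
    {"paths": ["src/net.c", "src/io.c"], "assignee": "alice"},
    {"paths": ["docs/readme"], "assignee": "bob"},
]
print(recommend({"paths": ["src/net.c"]}, history))  # alice ranked first
```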

  8. Power laws in software systems

    OpenAIRE

    Tonelli, Roberto

    2012-01-01

    The main topic of my PhD has been the study of power laws in software systems within the perspective of describing software quality. My PhD research contributes to a recent stream of studies in software engineering, where the investigation of power laws in software systems has become widely popular in recent years, since they appear on an incredible variety of different software quantities and properties, like, for example, software metrics, software faults, refactoring, Java byte-code,...

  9. Software Developers’ Perceptions of Productivity

    OpenAIRE

    Meyer, André; Fritz, Thomas; Murphy, Gail C.; Zimmermann, Thomas

    2014-01-01

    The better the software development community becomes at creating software, the more software the world seems to demand. Although there is a large body of research about measuring and investigating productivity from an organizational point of view, there is a paucity of research about how software developers, those at the front-line of software construction, think about, assess and try to improve their productivity. To investigate software developers' perceptions of software development produ...

  10. LDUA software custodian's notebook

    International Nuclear Information System (INIS)

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements

  11. LANMAS core: Update and current directions

    International Nuclear Information System (INIS)

    Local Area Network Material Accountability System (LANMAS) core software will provide the framework of a material accountability system. LANMAS is a network-based nuclear material accountability system. It tracks the movement of material throughout a site and generates the required reports on material accountability. LANMAS will run in a client/server mode. The database of material type and location will reside on the server, while the user interface runs on the client. The user interface accesses the server via a network. The LANMAS core can be used as the foundation for building required Materials Control and Accountability (MC&A) functionality at any site requiring a new MC&A system. An individual site will build on the LANMAS core by supplying site-specific software. This paper will provide an update on the current LANMAS development activities and discuss the current direction of the LANMAS project

  12. Design and performance test of spacecraft test and operation software

    Science.gov (United States)

    Wang, Guohua; Cui, Yan; Wang, Shuo; Meng, Xiaofeng

    2011-06-01

    Main test processor (MTP) software is the key element of the Electrical Ground Support Equipment (EGSE) for spacecraft test and operation, and had been used at the Chinese Academy of Space Technology (CAST) for years without major revision. With the increasing demand for more efficient and agile MTP software, a new MTP software was developed. It adopts a layered and plug-in based software architecture, whose core runtime server provides message queue management, shared memory management, and process management services, and forms the framework for a configurable, open-architecture system. To investigate the MTP software's performance, test cases for network response time, test sequence management capability, and data-processing capability are introduced in detail. Test results show that the MTP software is general-purpose and has higher performance than the legacy one.

  13. Designing Scientific Software for Heterogeneous Computing

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig

    The main objective of the present study has been to investigate parallel numerical algorithms with the purpose of running efficiently and scalably on modern many-core heterogeneous hardware. In order to obtain good efficiency and scalability on modern multi- and many-core architectures, algorithms and data structures must be designed to utilize the underlying parallel architecture. The architectural changes in hardware design within the last decade, from single- to multi- and many-core architectures, require software developers to identify and properly implement methods that both exploit concurrency and maintain numerical efficiency. Graphical Processing Units (GPUs) have proven to be very effective units for computing the solution of scientific problems described by partial differential equations (PDEs). GPUs have today become standard devices in portable, desktop, and supercomputers, which...

  14. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  15. Advanced fingerprint verification software

    Science.gov (United States)

    Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.

    2016-05-01

    We have developed a fingerprint software package that can be used in a wide range of applications, from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine. Solid and robust performance is achieved in the presence of misplaced and low-quality fingerprints.

  16. Guide to software export

    CERN Document Server

    Philips, Roger A

    2014-01-01

    An ideal reference source for CEOs, marketing and sales managers, sales consultants, and students of international marketing, Guide to Software Export provides a step-by-step approach to initiating or expanding international software sales. It teaches you how to examine critically your candidate product for exportability; how to find distributors, agents, and resellers abroad; how to identify the best distribution structure for export; and much, much more!Not content with providing just the guidelines for setting up, expanding, and managing your international sales channels, Guide to Software

  17. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    SOFTWARE, LIKE ALL industry products, is the result of complex multinational supply chains with many partners, from concept to development to production and maintenance. Global software engineering (GSE), IT outsourcing, and business process outsourcing have shown growth rates of 10 to 20 percent per year during the past decade. This instalment of Practitioner’s Digest summarizes experiences and guidance from industry to facilitate knowledge and technology transfer for GSE. It’s based on industry feedback from the annual IEEE International Conference on Global Software Engineering, which had...

  18. CNEOST Control Software System

    Science.gov (United States)

    Wang, Xin; Zhao, Hai-bin; Xia, Yan; Lu, Hao; Li, Bin

    2016-01-01

    In 2013, CNEOST (China Near Earth Object Survey Telescope) adapted its hardware system for a new CCD camera. Based on the new system architecture, the control software was re-designed and implemented. The software system adopts a messaging mechanism based on the WebSocket protocol, giving it good flexibility and extensibility. The user interface, based on responsive web design, supports remote observations from both desktop and mobile devices. The stable operation of the software system has greatly enhanced operational efficiency while reducing complexity, and also serves as a successful trial for the future system design of the telescope and telescope cloud.

  19. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments for why the higher development cost is justified and what... project- and quality management and their implementation in practice. So far, our results suggest that the necessity of systematic software development is well recognized, while software development still follows an ad-hoc rather than a systematized style. Our results provide initial findings, which we...

  20. Speakeasy software development

    Science.gov (United States)

    Baskinger, Patricia J.; Ozarow, Larry; Chruscicki, Mary C.

    1993-08-01

    The Speakeasy Software Development Project had three primary objectives. The first objective was to perform Independent Verification and Validation (IV & V) of the software and documentation associated with the signal processor being developed by Hazeltine and TRW under the Speakeasy program. The IV & V task also included an analysis and assessment of the ability of the signal processor software to provide LPI communications functions. The second objective was to assist in the enhancement and modification of an existing Rome Lab signal processor workstation. Finally, TASC developed project management support tools and provided program management support to the Speakeasy Program Office.

  1. Orbit Software Suite

    Science.gov (United States)

    Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim

    2012-01-01

    Orbit Software Suite is used to support a variety of NASA/DM (Dependable Multiprocessor) mission planning and analysis activities on the IPS (Intrusion Prevention System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations, and is used to perform mission design and analysis tasks corresponding to trajectory/ launch window, rendezvous, and proximity operations flight segments. A list of tools in Orbit Software Suite represents tool versions established during/after the Equipment Rehost-3 Project.

  2. Software platform virtualization in chemistry research and university teaching

    Directory of Open Access Journals (Sweden)

    Kind Tobias

    2009-11-01

    Full Text Available Abstract Background Modern chemistry laboratories operate with a wide range of software applications under different operating systems, such as Windows, LINUX or Mac OS X. Instead of installing software on different computers it is possible to install those applications on a single computer using Virtual Machine software. Software platform virtualization allows a single guest operating system to execute multiple other operating systems on the same computer. We apply and discuss the use of virtual machines in chemistry research and teaching laboratories. Results Virtual machines are commonly used for cheminformatics software development and testing. Benchmarking multiple chemistry software packages we have confirmed that the computational speed penalty for using virtual machines is low and around 5% to 10%. Software virtualization in a teaching environment allows faster deployment and easy use of commercial and open source software in hands-on computer teaching labs. Conclusion Software virtualization in chemistry, mass spectrometry and cheminformatics is needed for software testing and development of software for different operating systems. In order to obtain maximum performance the virtualization software should be multi-core enabled and allow the use of multiprocessor configurations in the virtual machine environment. Server consolidation, by running multiple tasks and operating systems on a single physical machine, can lead to lower maintenance and hardware costs especially in small research labs. The use of virtual machines can prevent software virus infections and security breaches when used as a sandbox system for internet access and software testing. Complex software setups can be created with virtual machines and are easily deployed later to multiple computers for hands-on teaching classes. 
We discuss the popularity of bioinformatics compared to cheminformatics as well as the missing cheminformatics education at universities worldwide.
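
    The 5% to 10% overhead figure above comes from benchmarking real chemistry packages; as a point of reference, here is a minimal sketch of the kind of deterministic, CPU-bound micro-benchmark one could time once natively and once inside a virtual machine (the workload and constants are illustrative, not the authors' benchmark suite):

```python
import time

def cpu_benchmark(n=2_000_000):
    """Deterministic CPU-bound workload: a simple rolling-hash loop."""
    total = 0
    for i in range(n):
        total = (total * 31 + i) % 1_000_003
    return total

start = time.perf_counter()
checksum = cpu_benchmark()
elapsed = time.perf_counter() - start
print(f"checksum={checksum}, elapsed={elapsed:.3f}s")
```

    Running the same script on the bare operating system and again inside the virtual machine gives a relative penalty of (t_vm - t_native) / t_native; the checksum confirms both runs did identical work.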

  3. The Software Patent Debate

    OpenAIRE

    Guadamuz, Andres

    2006-01-01

    The paper discusses the proposed European Directive on the Patentability of Computer-Implemented Inventions and the subsequent debate that followed. Do software patents, as argued by policymakers, result in increased innovation?

  4. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  5. eSoftwareList

    Data.gov (United States)

    US Agency for International Development — USAID Software Database reporting tool created in Oracle Application Express (APEX). This version provides read only access to a database view of the JIRA SAR...

  6. Project Portfolio Management Software

    OpenAIRE

    Paul POCATILU

    2006-01-01

    In order to design a methodology for the development of project portfolio management (PPM) applications, the existing applications have to be studied. This paper describes the main characteristics of the leading project portfolio management software applications.

  7. Software for nuclear spectrometry

    International Nuclear Information System (INIS)

    The Advisory Group Meeting (AGM) on Software for Nuclear Spectrometry was dedicated to reviewing the present status of software for nuclear spectrometry and to advising on future activities in this field. Because similar AGMs and consultants' meetings had been held in the past, and in an effort to be more streamlined, this AGM was devoted to the specific field of software for gamma-ray spectrometry. Nevertheless, many of the issues discussed and the recommendations made are of general concern for any software for nuclear spectrometry. The report is organized in sections. The 'Summary' gives the conclusions and recommendations adopted at the AGM. These conclusions and recommendations resulted from the discussions held during and after presentations of the scientific and technical papers. The papers are reproduced in full in the following sections

  8. Banking Software Applications Security

    Directory of Open Access Journals (Sweden)

    Ioan Alexandru Bubu

    2015-03-01

    Full Text Available Computer software products are among the most complex artifacts, if not the most complex artifacts, mankind has created. Securing those artifacts against intelligent attackers who try to exploit flaws in software design and construction is a great challenge too. The purpose of this paper is to introduce a secure alternative to the banking software applications currently in use. This new application aims to cover most of the well-known vulnerabilities that plague the majority of current software. First we take a quick look at current security methods in use, and at a few known vulnerabilities. After this, we discuss the security measures implemented in my application, and finally, we present the results of implementing them.

  9. ACS: ALMA Common Software

    Science.gov (United States)

    Chiozzi, Gianluca; Šekoranja, Matej

    2013-02-01

    ALMA Common Software (ACS) provides a software infrastructure common to all ALMA partners and consists of a documented collection of common patterns and components which implement those patterns. The heart of ACS is based on a distributed Component-Container model, with ACS Components implemented as CORBA objects in any of the supported programming languages. ACS provides common CORBA-based services such as logging, error and alarm management, configuration database and lifecycle management. Although designed for ALMA, ACS can be and is being used in other control systems and distributed software projects, since it implements proven design patterns using state-of-the-art, reliable technology. Through the use of well-known standard constructs and components, it also allows team members who are not authors of ACS to easily understand the architecture of software modules, making maintenance affordable even on a very large project.
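
    The Component-Container pattern at the heart of ACS can be illustrated with a language-neutral miniature; this is a hypothetical sketch (all class and method names invented), not the actual ACS API with its CORBA objects and IDL interfaces:

```python
# Hypothetical miniature of the Component-Container idea, not the real ACS API.
class Component:
    """Base class: lifecycle hooks that the container, not the user, drives."""
    def initialize(self):
        pass
    def cleanup(self):
        pass

class Container:
    """Activates components, hands out references, and drives their lifecycle."""
    def __init__(self):
        self._components = {}

    def activate(self, name, component):
        component.initialize()
        self._components[name] = component
        return component

    def get_component(self, name):
        return self._components[name]

    def shutdown(self):
        # Release components in reverse activation order.
        for comp in reversed(list(self._components.values())):
            comp.cleanup()
        self._components.clear()

class Logger(Component):
    """A toy stand-in for a container-provided service such as logging."""
    def __init__(self):
        self.records = []
    def log(self, message):
        self.records.append(message)

container = Container()
logger = container.activate("Logger", Logger())
container.get_component("Logger").log("antenna connected")
print(logger.records)  # ['antenna connected']
container.shutdown()
```

    The design choice the pattern captures is that components never manage their own lifecycle or discovery; the container does, which is what makes the services uniform across implementation languages.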

  10. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  11. Advanced Software Protection Now

    CERN Document Server

    Bendersky, Diego; Notarfrancesco, Luciano; Sarraute, Carlos; Waissbein, Ariel

    2010-01-01

    Software digital rights management is a pressing need for the software development industry, and it remains unmet, as no practical solution has been acclaimed successful by the industry. We introduce a novel software-protection method, fully implemented with today's technologies, that provides traitor tracing and license enforcement and requires neither additional hardware nor inter-connectivity. Our work benefits from the use of secure triggers, a cryptographic primitive that is secure assuming the existence of an IND-CPA secure block cipher. Using our framework, developers may insert license checks and fingerprints, and obfuscate the code using secure triggers. As a result, this raises the cost for software analysis tools to detect and modify protection mechanisms, thus raising the complexity of cracking the system.
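
    The secure-trigger idea, a payload that stays encrypted and opaque until the right runtime input arrives, can be sketched as follows. This is an illustrative stand-in only: it uses a hash-derived keystream rather than a real IND-CPA block cipher, and the serial string is invented; it is not the authors' implementation:

```python
import hashlib

def _keystream(key: bytes, n: int) -> bytes:
    """Expand a key into n pseudo-random bytes (toy stand-in for a real cipher)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def protect(payload: bytes, trigger_input: bytes):
    """Encrypt the payload under a key derived from the triggering input."""
    key = hashlib.sha256(b"salt|" + trigger_input).digest()
    tag = hashlib.sha256(b"tag|" + key).digest()  # detects the right input
    cipher = bytes(a ^ b for a, b in zip(payload, _keystream(key, len(payload))))
    return tag, cipher

def try_trigger(tag, cipher, candidate_input):
    """Fire the trigger only if the candidate input derives the right key."""
    key = hashlib.sha256(b"salt|" + candidate_input).digest()
    if hashlib.sha256(b"tag|" + key).digest() != tag:
        return None  # trigger not fired: payload stays opaque to an analyst
    return bytes(a ^ b for a, b in zip(cipher, _keystream(key, len(cipher))))

tag, cipher = protect(b"print('licensed feature')", b"SERIAL-1234")
assert try_trigger(tag, cipher, b"WRONG") is None
assert try_trigger(tag, cipher, b"SERIAL-1234") == b"print('licensed feature')"
```

    The point for an attacker tracing the code is that the protected branch simply does not exist in cleartext anywhere in the binary until the trigger fires.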

  12. Spreadsheet Auditing Software

    CERN Document Server

    Nixon, David

    2010-01-01

    It is now widely accepted that errors in spreadsheets are both common and potentially dangerous. Further research has taken place to investigate how frequently these errors occur, what impact they have, how the risk of spreadsheet errors can be reduced by following spreadsheet design guidelines and methodologies, and how effective auditing of a spreadsheet is in the detection of these errors. However, little research exists to establish the usefulness of software tools in the auditing of spreadsheets. This paper documents and tests office software tools designed to assist in the audit of spreadsheets. The test was designed to identify the success of software tools in detecting different types of errors, to identify how the software tools assist the auditor and to determine the usefulness of the tools.
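
    One heuristic that spreadsheet-auditing tools of this kind commonly apply is flagging formulas that mix cell references with hard-coded constants, a frequent source of maintenance errors. A minimal sketch, using an invented dictionary model of a sheet rather than any of the tools the paper tests:

```python
import re

# Hypothetical mini-model of a sheet: cell address -> formula or value string.
sheet = {
    "A1": "100",
    "B1": "=A1*1.175",       # hard-coded rate mixed into a formula: flagged
    "C1": "=A1+B1",
    "D1": "=SUM(A1:C1)+42",  # magic constant: flagged
}

# A number is "hard-coded" if it is not the digit part of a cell reference.
HARDCODED = re.compile(r"(?<![A-Z0-9:])\d+\.?\d*")

def flag_hardcoded_constants(cells):
    """Classic audit heuristic: report formulas containing literal numbers."""
    flagged = []
    for addr, content in cells.items():
        if content.startswith("=") and HARDCODED.search(content):
            flagged.append(addr)
    return sorted(flagged)

print(flag_hardcoded_constants(sheet))  # ['B1', 'D1']
```

    A real auditing tool layers many such heuristics (unreferenced cells, inconsistent formulas across a row, circular references) on top of a proper formula parser.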

  13. Software Design Analyzer System

    Science.gov (United States)

    Tausworthe, R. C.

    1985-01-01

    CRISP80, the software design analyzer system, is a set of programs that supports top-down, hierarchic, modular, structured design and programming methodologies. CRISP80 allows a design to be expressed as a picture of the program.

  14. Software Patent and its Impact on Software Innovation in Japan

    OpenAIRE

    Motohashi, Kazuyuki

    2009-01-01

    In Japan, the software patent system has been reformed and software has now become patentable subject matter. In this paper, this pro-patent shift on software is surveyed and its impact on software innovation is analyzed. Before the 1990's, inventions related to software could not be patented by themselves, but patent applications could be filed when software was combined with hardware-related inventions. Therefore, integrated electronics firms used to be the major software patent applicants. However, during the per...

  15. NEW APPROACH FOR SOFTWARE PROCESSES REUSING BASED ON SOFTWARE ARCHITECTURES

    OpenAIRE

    Aoussat, Fadila; Ahmed-Nacer, Mohamed; Oussalah, Mourad Chabane

    2010-01-01

    This paper deals with the reuse of software process models. Motivated by the insufficiencies of existing software process reuse approaches (limited reusability of software process components), we propose a new approach that promotes large-scale reuse of existing, proven software process models, even those that are not component-oriented. Our approach is based on two steps: we use a domain ontology to capture software process knowledge, and we handle the inferred knowledge as software ...

  16. Software Project Documentation - An Essence of Software Development

    OpenAIRE

    Vikas S. Chomal; Dr. Jatinderkumar R. Saini

    2015-01-01

    Software documentation is a critical attribute of both software projects and software engineering in general. Documentation is a medium of communication among the parties involved during software development, as well as with those who will be using the software. It consists of written particulars concerning software specifications: what the software does, how it accomplishes the specified details, and even how to use it. In this paper, we tried to focus on the role of do...

  17. A Simple Complexity Measurement for Software Verification and Software Testing

    OpenAIRE

    Cheng, Zheng; Monahan, Rosemary; Power, James F.

    2012-01-01

    In this paper, we used a simple metric (i.e. Lines of Code) to measure the complexity involved in software verification and software testing. The goal is then, to argue for software verification over software testing, and motivate a discussion of how to reduce the complexity involved in software verification. We propose to reduce this complexity by translating the software to a simple intermediate representation which can be verified using an efficient verifier, such as Boog...

  18. Software Architecture: Architecture Constraints

    OpenAIRE

    Tibermacine, Chouki

    2014-01-01

    In this chapter, we introduce an additional, yet essential, concept in describing software architectures: architecture constraints. We explain the precise role of these entities and their importance in object-oriented, component-based or service-oriented software engineering. We then describe the way in which they are specified and interpreted. An architect can define architecture constraints and then associate them to architectural descriptions to limit their stru...

  19. Occupational radiation protection software

    International Nuclear Information System (INIS)

    This paper presents a reflection on the basic essentials of a Radiation Work Permit (RWP). Based on the latest WANO Recommendations, this paper considers the RWP as a complete process rather than a simple administrative procedure. This process is implemented via software which is also presented in this paper. The software has been designed to achieve the following objectives: - To configure the radiological map of the plant, to plan radiological surveillance, to input data, and to update radiological signposting and mandatory protective clothing in each area of the station. All this information can be checked from any personal computer connected to a network. - To collect radiological data by means of a palmtop (PDA) and to upload it to a personal computer, thereby speeding up the job and reducing human errors. - To implement the RWP by allowing on-line consultation of the permitted individual doses of the workers and the planned collective dose for each job. The software also supplies the radiological information to the workers. - To collect and arrange pictures, maps and sketches of equipment placed in rooms or in areas of the plant. - To allow the software to be used in real time from different workstations. - High reliability and speed of working. - Flexible data enquiry. The software provides a number of standard data enquiries such as the number of workers on each job and their individual dose received...etc. It also allows data to be exported to other well-known software applications such as Excel and Access for further data analysis. The software has been designed by radiation protection professionals and developed by computer programmers who were integrated into the radiological work environment. The software fulfills the Occupational Radiation Protection Department's requirements. (author)
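
    The "on-line consultation of permitted individual doses" described above boils down to a budget check before each job. A sketch with hypothetical numbers; the 20 mSv default follows the ICRP-recommended annual occupational limit, but real plants apply their own administrative limits, and this is not the paper's software:

```python
def may_start_job(dose_history_msv, planned_job_dose_msv, annual_limit_msv=20.0):
    """RWP-style check: would the planned job dose push the worker's
    accumulated annual dose over the limit? (Illustrative only.)"""
    remaining = annual_limit_msv - sum(dose_history_msv)
    return planned_job_dose_msv <= remaining

print(may_start_job([3.2, 5.1], 2.0))   # True:  8.3 + 2.0 <= 20
print(may_start_job([12.0, 7.5], 1.0))  # False: 19.5 + 1.0 > 20
```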

  20. Deprogramming Large Software Systems

    OpenAIRE

    Coppel, Yohann; Candea, George

    2008-01-01

    Developers turn ideas, designs and patterns into source code, then compile the source code into executables. Decompiling turns executables back into source code, and deprogramming turns code back into designs and patterns. In this paper we introduce DeP, a tool for deprogramming software systems. DeP abstracts code into a dependency graph and mines this graph for patterns. It also gives programmers visual means for manipulating the program. We describe DeP's use in several software engineerin...

  1. SDN : Software defined networks

    OpenAIRE

    Wiklund, Petter

    2014-01-01

    This report is a specialization in software defined networking. SDN really promises to revolutionize the industry and is under constant development. But is the technology ready to be launched into operation yet? The report initially covers a number of problems that today's network technology is facing. There then follows a deeper description of what this software-based networking technology really is and how it works. Further, the technique is tested in a lab assignment, using a prog...

  2. Personalised continuous software engineering

    OpenAIRE

    Papatheocharous, Efi; Belk, Marios; Nyfjord, Jaana; Germanakos, Panagiotis; Samaras, George

    2014-01-01

    This work describes how human factors can influence continuous software engineering. The reasoning begins from the Agile Manifesto promoting individuals and interactions over processes and tools. The organisational need to continuously develop, release and learn from software development in rapid cycles requires empowered and self-organised agile teams. However, these teams are formed without necessarily considering the members’ individual characteristics towards effective teamwork, from the ...

  3. Engineering and Software Engineering

    Science.gov (United States)

    Jackson, Michael

    The phrase ‘software engineering' has many meanings. One central meaning is the reliable development of dependable computer-based systems, especially those for critical applications. This is not a solved problem. Failures in software development have played a large part in many fatalities and in huge economic losses. While some of these failures may be attributable to programming errors in the narrowest sense—a program's failure to satisfy a given formal specification—there is good reason to think that most of them have other roots. These roots are located in the problem of software engineering rather than in the problem of program correctness. The famous 1968 conference was motivated by the belief that software development should be based on “the types of theoretical foundations and practical disciplines that are traditional in the established branches of engineering.” Yet after forty years of currency the phrase ‘software engineering' still denotes no more than a vague and largely unfulfilled aspiration. Two major causes of this disappointment are immediately clear. First, too many areas of software development are inadequately specialised, and consequently have not developed the repertoires of normal designs that are the indispensable basis of reliable engineering success. Second, the relationship between structural design and formal analytical techniques for software has rarely been one of fruitful synergy: too often it has defined a boundary between competing dogmas, at which mutual distrust and incomprehension deprive both sides of advantages that should be within their grasp. This paper discusses these causes and their effects. Whether the common practice of software development will eventually satisfy the broad aspiration of 1968 is hard to predict; but an understanding of past failure is surely a prerequisite of future success.

  4. Mining unstructured software data

    OpenAIRE

    Bacchelli, Alberto; Lanza, Michele

    2013-01-01

    Our thesis is that the analysis of unstructured data supports software understanding and evolution analysis, and complements the data mined from structured sources. To this aim, we implemented the necessary toolset and investigated methods for exploring, exposing, and exploiting unstructured data. To validate our thesis, we focused on development email data. We found two main challenges in using it to support program comprehension and software development: The disconnection between emai...

  5. Creative Software Engineering

    OpenAIRE

    Hooper, Clare J.; Millard, David E.

    2010-01-01

    Software engineering is traditionally seen as very structured and methodical. However, it often involves creative steps: consider requirements analysis, architecture engineering and GUI design. This poster describes three existing software engineering methods which include creative steps, alongside a method called 'experience deconstruction'. Deconstruction, which also includes a creative step, is used to help understand user experiences and re-provide these experiences in new contexts.

  6. Transformational Leadership in Software Projects

    OpenAIRE

    MOUSAVIKHAH, MARYAM

    2013-01-01

    Lack of management in software projects is among the most important reasons for the failure of such projects. Considering this fact, the high failure rate of IS (information system) projects, and the lack of leadership studies in the IS field, it is necessary to pay more attention to the concept of leadership in software projects. Transformational leadership, one of the most popular leadership theories, although it might bring specific advantages for this kind of project, has ...

  7. Developing high-quality educational software.

    Science.gov (United States)

    Johnson, Lynn A; Schleyer, Titus K L

    2003-11-01

    The development of effective educational software requires a systematic process executed by a skilled development team. This article describes the core skills required of the development team members for the six phases of successful educational software development. During analysis, the foundation of product development is laid including defining the audience and program goals, determining hardware and software constraints, identifying content resources, and developing management tools. The design phase creates the specifications that describe the user interface, the sequence of events, and the details of the content to be displayed. During development, the pieces of the educational program are assembled. Graphics and other media are created, video and audio scripts written and recorded, the program code created, and support documentation produced. Extensive testing by the development team (alpha testing) and with students (beta testing) is conducted. Carefully planned implementation is most likely to result in a flawless delivery of the educational software and maintenance ensures up-to-date content and software. Due to the importance of the sixth phase, evaluation, we have written a companion article on it that follows this one. The development of a CD-ROM product is described including the development team, a detailed description of the development phases, and the lessons learned from the project.

  8. Towards research on software cybernetics

    OpenAIRE

    Cai, KY; Chen, TY; Tse, TH

    2002-01-01

    Software cybernetics is a newly proposed area in software engineering. It makes better use of the interplay between control theory/engineering and software engineering. In this paper, we look into the research potentials of this emerging area.

  9. Visualizing Object-oriented Software for Understanding and Documentation

    OpenAIRE

    Al-Msie'Deen, Ra'Fat

    2016-01-01

    Understanding or comprehending source code is one of the core activities of software engineering. Understanding object-oriented source code is essential and required when a programmer maintains, migrates, reuses, documents or enhances source code. Source code that is not comprehended cannot be changed. The comprehension of object-oriented source code is a difficult problem-solving process. In order to document an object-oriented software system, one needs to understand its source code. ...

  10. Software bibliotecario abierto y gratuito

    OpenAIRE

    Lencinas, Verónica

    2001-01-01

    Free software, also known as open source software, has a number of advantages for implementation in libraries. It offers free and full access to the source code, which can be used to correct errors, modify the software and integrate it with other programs. Because of these advantages, free software offers better opportunities for libraries than closed software. Library management systems will soon be available and can be a real alternative to commercial software. The methodology used to develop the open sof...

  11. Management aspects of software maintenance

    OpenAIRE

    Henderson, Brian J.; Sullivan, Brenda J.

    1984-01-01

    Approved for public release; distribution is unlimited. The Federal government depends upon software systems to fulfill its missions. These software systems must be maintained and improved to continue to meet the growing demands placed on them. The process of software maintenance and improvement may be called "software evolution". The software manager must be educated in the complex nature of software maintenance to be able to properly evaluate and manage the software maintenance effort. ...

  12. Software Engineering Reviews and Audits

    CERN Document Server

    Summers, Boyd L

    2011-01-01

    Accurate software engineering reviews and audits have become essential to the success of software companies and military and aerospace programs. These reviews and audits define the framework and specific requirements for verifying software development efforts. Authored by an industry professional with three decades of experience, Software Engineering Reviews and Audits offers authoritative guidance for conducting and performing software first article inspections, and functional and physical configuration software audits. It prepares readers to answer common questions for conducting and perform

  13. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine
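
    The likelihood-based nest-survival estimators compared by Rotella et al. are related to the classic Mayfield method, which estimates a constant daily survival rate from exposure-days. As a point of reference, a sketch of that calculation with hypothetical field numbers (this is not the SAS or Program MARK code discussed in the session):

```python
def mayfield_dsr(exposure_days, failures):
    """Mayfield estimator: daily survival rate (DSR) from the total number
    of nest exposure-days observed and the number of failures among them."""
    return 1.0 - failures / exposure_days

def nest_success(dsr, nest_period_days):
    """Probability a nest survives the whole nesting period,
    assuming a constant daily survival rate."""
    return dsr ** nest_period_days

# Hypothetical data: 500 exposure-days, 25 failed nests, a 25-day nest period.
dsr = mayfield_dsr(500.0, 25)           # 0.95
print(round(nest_success(dsr, 25), 3))  # ~0.277
```

    The refinements the session compares let the daily survival rate vary with nest age and covariates instead of being constant, but all share this exposure-based likelihood at their core.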

  14. Review of the Unified Software Development Process%统一软件开发过程述评

    Institute of Scientific and Technical Information of China (English)

    麻志毅

    2002-01-01

    The Unified Software Development Process (USDP), published by leading experts in the software engineering field and Rational Software Corporation, and supported by the OMG, is attracting wide attention in the area of software engineering. After summarizing USDP, the paper introduces the phases and the core workflows of USDP in detail, then discusses the positive influence of USDP on the software development process, and points out some possible problems.

  15. SECURED CLOUD SUPPORT FOR GLOBAL SOFTWARE REQUIREMENT RISK MANAGEMENT

    OpenAIRE

    Shruti Patil; Roshani Ade

    2014-01-01

    This paper presents a solution to the core problem of securing global software development requirement information. Currently the major issue is the hacking of sensitive client information, which may lead to major financial as well as social loss. To avoid this, the system provides cloud security by encrypting data; deploying the tool over the cloud provides significant security to the whole global content management system. The core findings are presented in terms of how hac...

  16. Software Activation Using Multithreading

    Directory of Open Access Journals (Sweden)

    Jianrui Zhang

    2012-11-01

    Full Text Available Software activation is an anti-piracy technology designed to verify that software products have been legitimately licensed. Activation should be quick and simple while simultaneously being secure and protecting customer privacy. The most common form of software activation is for the user to enter a legitimate product serial number. However, software activation based on serial numbers appears to be weak, since cracks for many programs are readily available on the Internet. Users can employ such cracks to bypass software activation. Serial number verification logic usually executes sequentially in a single thread. Such an approach is relatively easy to break since attackers can trace the code to understand how the logic works. In this paper, we develop a practical multi-threaded verification design. Our results show that by proper use of multi-threading, the amount of traceable code in a debugger can be reduced to a low percentage of the total and the traceable code in each run can differ as well. This makes it significantly more difficult for an attacker to reverse engineer the code as a means of bypassing a security check. Finally, we attempt to quantify the increased effort needed to break our verification logic.
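
    The idea of spreading verification across threads can be illustrated with a toy check that validates independent slices of a serial in parallel. This is only a sketch of the concept: the serial format, slice size and digests are invented, and a real design would also vary the traceable code paths from run to run as the paper describes:

```python
import hashlib
import threading

def _check_slice(serial, idx, expected_digest, results):
    """Each thread independently validates one 4-character slice of the serial."""
    part = serial[idx * 4:(idx + 1) * 4]
    results[idx] = hashlib.sha256(part.encode()).hexdigest() == expected_digest

def verify_serial(serial, expected_digests):
    results = [False] * len(expected_digests)
    threads = [
        threading.Thread(target=_check_slice, args=(serial, i, digest, results))
        for i, digest in enumerate(expected_digests)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return all(results)

# Hypothetical 8-character serial split into two independently checked slices.
GOOD = "AB12CD34"
DIGESTS = [hashlib.sha256(GOOD[i * 4:(i + 1) * 4].encode()).hexdigest()
           for i in range(2)]
assert verify_serial("AB12CD34", DIGESTS)
assert not verify_serial("AB12XXXX", DIGESTS)
```

    Because the checks run concurrently, a debugger single-stepping one thread sees only a fraction of the verification logic, which is the property the paper tries to quantify.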

  17. Software reliability assessment

    International Nuclear Information System (INIS)

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom has spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  18. Microprocessor-based integrated LMFBR core surveillance

    International Nuclear Information System (INIS)

    This report results from a joint study of KfK and INTERATOM. The aim of this study is to explore the advantages of microprocessors and microelectronics for a more sophisticated core surveillance, which is based on the integration of separate surveillance techniques. Due to new developments in microelectronics and related software an approach to LMFBR core surveillance can be conceived that combines a number of measurements into a more intelligent decision-making data processing system. The following techniques are considered to contribute essentially to an integrated core surveillance system: - subassembly state and thermal hydraulics performance monitoring, - temperature noise analysis, - acoustic core surveillance, - failure characterization and failure prediction based on DND- and cover gas signals, and - flux tilting techniques. Starting from a description of these techniques it is shown that by combination and correlation of these individual techniques a higher degree of cost-effectiveness, reliability and accuracy can be achieved. (orig./GL)

  19. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    Science.gov (United States)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  20. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    Science.gov (United States)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  1. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  2. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burden tightly budgeted information technology (IT) organizations. The agile software development approach delivers business value early, but its implications for software maintainability are still unknown. The purpose of this quantitative study…

  3. SOFTWARE MEASUREMENTS AND METRICS: ROLE IN EFFECTIVE SOFTWARE TESTING

    Directory of Open Access Journals (Sweden)

    Sheikh Umar Farooq

    2011-01-01

Measurement has always been fundamental to the progress of any engineering discipline, and software testing is no exception. Software metrics have been used in making quantitative and qualitative decisions as well as in risk assessment and reduction in software projects. In this paper we discuss software measurement and metrics and their fundamental role in the software development life cycle. Focusing on software test metrics, the paper discusses their key role in the software testing process and also classifies and systematically analyzes the various test metrics.

  4. Test af Software

    DEFF Research Database (Denmark)

This document constitutes the final report of the network collaboration "Testnet", carried out in the period 1 April 2006 to 31 December 2008. The network deals primarily with topics within testing of embedded and technical software, but a number of examples of problems and solutions connected with testing of...... administrative software are also included. The report is divided into the following 3 parts: Overview, in which we give a summary of the network's purpose, activities and results, and outline the state of the art of software testing; we also mention that CISS and the network are taking new initiatives. The Network: purpose, participants and topics treated at the ten...

  5. Astronomers as Software Developers

    Science.gov (United States)

    Pildis, Rachel A.

    2016-01-01

    Astronomers know that their research requires writing, adapting, and documenting computer software. Furthermore, they often have to learn new computer languages and figure out how existing programs work without much documentation or guidance and with extreme time pressure. These are all skills that can lead to a software development job, but recruiters and employers probably won't know that. I will discuss all the highly useful experience that astronomers may not know that they already have, and how to explain that knowledge to others when looking for non-academic software positions. I will also talk about some of the pitfalls I have run into while interviewing for jobs and working as a developer, and encourage you to embrace the curiosity employers might have about your non-standard background.

  6. Lecture 2: Software Security

    CERN Document Server

    CERN. Geneva

    2013-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development, testing and deployment. Sebastian Lopienski is CERN’s deputy Computer Security Officer. He works on security strategy and policies; offers internal consultancy and audit services; develops and ...

  7. Secure software practices among Malaysian software practitioners: An exploratory study

    Science.gov (United States)

    Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina

    2016-08-01

Secure software practices are increasingly gaining importance among software practitioners and researchers due to the rise of computer crimes in the software industry. They have become one of the determinant factors for producing high-quality software. Even though their importance has been recognized, their current practice in the software industry is still scarce, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was utilized for data collection, whilst statistical methods such as frequency, mean, and cross tabulation were used for data analysis. Outcomes from this study reveal that software practitioners are becoming increasingly aware of the importance of secure software practices; however, they lack appropriate implementation, which could affect the quality of the produced software.

  8. A Software Reliability Estimation Method to Nuclear Safety Software

    Energy Technology Data Exchange (ETDEWEB)

    Park, Geeyong; Jang, Seung Cheol [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-02-15

A method for estimating the reliability of nuclear safety software is proposed in this paper. The method is based on the software reliability growth model (SRGM), in which the behavior of software failures is assumed to follow a nonhomogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects from very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was identified that these models are capable of reasonably estimating the remaining number of software defects, which directly affects the reactor trip functions. The software reliability can be estimated from these modeling equations, and one approach to obtaining a software reliability value is proposed in this paper.
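
The record above describes a covariate-augmented Bayesian SRGM. As a much simpler illustration of the underlying NHPP idea, the sketch below fits the classic Goel-Okumoto mean value function m(t) = a(1 - e^(-bt)) to hypothetical cumulative failure counts by grid-search least squares; the data, parameter grids, and fitting method are all illustrative stand-ins, not the paper's scheme.

```python
import math

def go_mean(t, a, b):
    """Goel-Okumoto NHPP mean value function: expected cumulative failures by time t."""
    return a * (1.0 - math.exp(-b * t))

def fit_goel_okumoto(times, counts, a_grid, b_grid):
    """Least-squares grid search for (a, b); a stand-in for full MLE or Bayesian fitting."""
    best = (None, None, float("inf"))
    for a in a_grid:
        for b in b_grid:
            sse = sum((go_mean(t, a, b) - n) ** 2 for t, n in zip(times, counts))
            if sse < best[2]:
                best = (a, b, sse)
    return best[0], best[1]

# Hypothetical failure data: cumulative defects found after each test interval.
times = [1, 2, 3, 4, 5, 6]
counts = [5, 9, 12, 14, 15, 16]
a, b = fit_goel_okumoto(times, counts,
                        a_grid=[x * 0.5 for x in range(20, 60)],
                        b_grid=[x * 0.01 for x in range(5, 100)])
remaining = a - counts[-1]   # expected residual defects under the fitted model
```

The parameter `a` estimates the total number of defects eventually observable, so `a` minus the defects already found gives the remaining-defect estimate the abstract refers to.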

  9. Evolutional development of controlling software for agricultural vehicles and robots

    DEFF Research Database (Denmark)

    Nakanishi, Tsuneo; Jæger-Hansen, Claes Lund; Griepentrog, Hans-Werner

Agricultural vehicles and robots expand their controlling software in size and complexity as their functions increase. Due to repeated, ad hoc additions and modifications, the software becomes structurally corrupted: low-performing, resource-consuming and unreliable. This paper presents...... managed core assets. By contrast, while XDDP is a less burdensome process that focuses only on the portion to be changed in the new system, it cannot prevent the software structure from corrupting, due to the absence of a global view of the system. The paper describes an adoption process for SPL, with an example...

  10. Updated Core Libraries of the ALPS Project

    CERN Document Server

    Gaenko, A; Carcassi, G; Chen, T; Chen, X; Dong, Q; Gamper, L; Gukelberger, J; Igarashi, R; Iskakov, S; Könz, M; LeBlanc, J P F; Levy, R; Ma, P N; Paki, J E; Shinaoka, H; Todo, S; Troyer, M; Gull, E

    2016-01-01

    The open source ALPS (Algorithms and Libraries for Physics Simulations) project provides a collection of physics libraries and applications, with a focus on simulations of lattice models and strongly correlated systems. The libraries provide a convenient set of well-documented and reusable components for developing condensed matter physics simulation code, and the applications strive to make commonly used and proven computational algorithms available to a non-expert community. In this paper we present an updated and refactored version of the core ALPS libraries geared at the computational physics software development community, rewritten with focus on documentation, ease of installation, and software maintainability.

  11. Conceptual Models Core to Good Design

    CERN Document Server

    Johnson, Jeff

    2011-01-01

    People make use of software applications in their activities, applying them as tools in carrying out tasks. That this use should be good for people--easy, effective, efficient, and enjoyable--is a principal goal of design. In this book, we present the notion of Conceptual Models, and argue that Conceptual Models are core to achieving good design. From years of helping companies create software applications, we have come to believe that building applications without Conceptual Models is just asking for designs that will be confusing and difficult to learn, remember, and use. We show how Concept

  12. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality...... requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...

  13. Processeringsoptimering med Canons software

    DEFF Research Database (Denmark)

    Precht, Helle

    2009-01-01

... Possibilities in software optimization were studied in relation to optimal image quality and control exposures, to investigate whether it was possible to accept diagnostic image quality and thereby take ALARA as the point of departure. Method and materials: a quantitative experimental study based on experiments with a technical and...... human phantom. The CD Rad phantom was used as the technical phantom; the images were analyzed with the CD Rad software, and the result was an objective IQF value. The human phantom was a lamb pelvis with femur, which according to the NRPB is comparable in absorption to a five-year-old child. The human experimental images were...

  14. Software product quality control

    CERN Document Server

    Wagner, Stefan

    2013-01-01

    Quality is not a fixed or universal property of software; it depends on the context and goals of its stakeholders. Hence, when you want to develop a high-quality software system, the first step must be a clear and precise specification of quality. Yet even if you get it right and complete, you can be sure that it will become invalid over time. So the only solution is continuous quality control: the steady and explicit evaluation of a product's properties with respect to its updated quality goals.This book guides you in setting up and running continuous quality control in your environment. Star

  15. Agile software development

    CERN Document Server

    Stober, Thomas

    2009-01-01

Software development is moving towards a more agile and more flexible approach. It turns out that the traditional 'waterfall' model is not supportive in an environment where technical, financial and strategic constraints are changing almost every day. But what is agility? What are today's major approaches? And especially: what is the impact of agile development principles on development teams, on project management and on software architects? How can large enterprises become more agile and improve their business processes, which have existed for many, many years? What are the limit

  16. Machine Tool Software

    Science.gov (United States)

    1988-01-01

A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using APT (Automatically Programmed Tool) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  17. Green in software engineering

    CERN Document Server

    Calero Munoz, Coral

    2015-01-01

    This is the first book that presents a comprehensive overview of sustainability aspects in software engineering. Its format follows the structure of the SWEBOK and covers the key areas involved in the incorporation of green aspects in software engineering, encompassing topics from requirement elicitation to quality assurance and maintenance, while also considering professional practices and economic aspects. The book consists of thirteen chapters, which are structured in five parts. First the "Introduction" gives an overview of the primary general concepts related to Green IT, discussing wha

  18. Agile Software Development

    OpenAIRE

    Stewart, Rhonda

    2009-01-01

One of the most noticeable changes to software process thinking in the last ten years has been the appearance of the word 'agile' (Fowler, 2005). In the Information Technology (IT) industry, Agile Software Development, or simply Agile, is used to refer to a family of lightweight development approaches that share a common set of values and principles focused around adapting to change and putting people first (Fowler, 2005). Such Agile methods provide an alternative to the well-established Wate...

  19. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-06-01

The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, books have been published whose titles refer to it as an art, role or process. But because software complexity is increasing every year, this paper proposes a new approach, conceiving testing as a science. This is because the processes by which tests are applied follow the steps of the scientific method: inputs, processes, outputs. The paper examines the similarities between testing and science and the characteristics of testing as a science.

  20. The PANIC software system

    Science.gov (United States)

    Ibáñez Mengual, José M.; Fernández, Matilde; Rodríguez Gómez, Julio F.; García Segura, Antonio J.; Storz, Clemens

    2010-07-01

    PANIC is the Panoramic Near Infrared Camera for the 2.2m and 3.5m telescopes at Calar Alto observatory. The aim of the project is to build a wide-field general purpose NIR camera. In this paper we describe the software system of the instrument, which comprises four main packages: GEIRS for the instrument control and the data acquisition; the Observation Tool (OT), the software used for detailed definition and pre-planning the observations, developed in Java; the Quick Look tool (PQL) for easy inspection of the data in real-time and a scientific pipeline (PAPI), both based on the Python programming language.

  1. Six Sigma software development

    CERN Document Server

    Tayntor, Christine B

    2002-01-01

    Since Six Sigma has had marked success in improving quality in other settings, and since the quality of software remains poor, it seems a natural evolution to apply the concepts and tools of Six Sigma to system development and the IT department. Until now however, there were no books available that applied these concepts to the system development process. Six Sigma Software Development fills this void and illustrates how Six Sigma concepts can be applied to all aspects of the evolving system development process. It includes the traditional waterfall model and in the support of legacy systems,

  2. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    and coordination in a successful, distributed software project between a Russian and a Danish company. The case study’s control aspects were investigated, drawing on Kirsch’s (2004) elements of control framework, to analyze how control is enacted in the project. This analysis showed that informal measurement...... showed that multimodal communication can facilitate collective minding in distributed software projects and can positively impact performance. In providing an approach for investigating the impact of multimodal communication practices on virtual team performance, we can further understand and support...

  3. k-core covers and the core

    NARCIS (Netherlands)

    Sanchez-Rodriguez, E.; Borm, Peter; Estevez-Fernandez, A.; Fiestras-Janeiro, G.; Mosquera, M.A.

    2015-01-01

    This paper extends the notion of individual minimal rights for a transferable utility game (TU-game) to coalitional minimal rights using minimal balanced families of a specific type, thus defining a corresponding minimal rights game. It is shown that the core of a TU-game coincides with the core of

  4. Academic Rigor: The Core of the Core

    Science.gov (United States)

    Brunner, Judy

    2013-01-01

While some educators see the Common Core State Standards as a reason for stress, most recognize the positive possibilities associated with them and are willing to make the professional commitment to implementing them so that academic rigor for all students will increase. But business leaders, parents, and the authors of the Common Core are not the only…

  5. How can Software Packages Certification Improve Software Process

    OpenAIRE

    Pivka, Marjan; Potočan, Vojko

    1997-01-01

Popular software assessment models such as CMM, BOOTSTRAP, SPICE or ISO 9000 ignore the impact of software product certification on software quality. The first standard for software product quality was the German DIN 66285. Based on this standard, the ISO developed an international standard for quality requirements and testing procedures for software packages: ISO/IEC 12119. This paper presents our experience with classical testing models based on ISO/IEC 12119 and DIN 66285 and with our improved ...

  6. Software Risk Management Practice: Evidence From Thai Software Industry

    OpenAIRE

    Tharwon Arnuphaptrairong

    2014-01-01

Software risk management has been around at least since it was introduced into the mainstream of the software management process in 1989 [1]-[3], but little has been reported about its industrial practice [4]-[6]. This paper reports the current software risk management practice in the Thai software industry. A questionnaire survey was designed to capture information on software project risk management practice. The questionnaire was sent to 141 companies and received a response rate of 28 percent. The...

  7. The optimal community detection of software based on complex networks

    Science.gov (United States)

    Huang, Guoyan; Zhang, Peng; Zhang, Bing; Yin, Tengteng; Ren, Jiadong

    2016-02-01

The community structure is important for software in terms of understanding its design patterns and controlling the development and maintenance process. In order to detect the optimal community structure in a software network, a method called Optimal Partition Software Network (OPSN) is proposed, based on the dependency relationships among software functions. First, by analyzing information from multiple execution traces of one piece of software, we construct a Software Execution Dependency Network (SEDN). Second, based on the relationships among the function nodes in the network, we define Fault Accumulation (FA) to measure the importance of each function node and sort the nodes by this measure. Third, we select the top K (K=1,2,…) nodes as the cores of the initial communities (each containing exactly one core node). By comparing the dependency relationships between each remaining node and the K communities, we put the node into the existing community with which it has the closest relationship. Finally, we calculate the modularity for different initial K to obtain the optimal division. Experiments verify that OPSN efficiently detects the optimal community structure in various software systems.
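
A minimal sketch of the OPSN-style procedure follows, using node degree as a stand-in for the paper's Fault Accumulation measure (the FA definition, SEDN construction, and the edge list below are illustrative assumptions, not the authors' data):

```python
from collections import defaultdict

def detect_communities(edges, k):
    """Seed K communities at the highest-degree functions, then attach each
    remaining node to the seeded community it shares the most edges with."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    nodes = sorted(adj, key=lambda n: len(adj[n]), reverse=True)
    comm = {s: i for i, s in enumerate(nodes[:k])}     # top-K cores
    for n in nodes[k:]:
        links = defaultdict(int)
        for nb in adj[n]:
            if nb in comm:
                links[comm[nb]] += 1                   # edges into each community
        comm[n] = max(links, key=links.get) if links else 0
    return comm

def modularity(edges, comm):
    """Newman modularity Q for an undirected graph and a node-to-community map."""
    m = len(edges)
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    inside = sum(1 for u, v in edges if comm[u] == comm[v])
    dsum = defaultdict(int)
    for n, c in comm.items():
        dsum[c] += deg[n]
    return inside / m - sum((d / (2 * m)) ** 2 for d in dsum.values())

# Two hypothetical call clusters joined by a single bridge edge c-d.
edges = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d"),
         ("d", "e"), ("e", "f"), ("d", "f")]
best = max(range(1, 4), key=lambda k: modularity(edges, detect_communities(edges, k)))
```

Scanning K and keeping the partition with the highest modularity mirrors the "obtain the optimal division" step; here the bridge between the two triangles makes K=2 the best split.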

  8. Implications of Responsive Space on the Flight Software Architecture

    Science.gov (United States)

    Wilmot, Jonathan

    2006-01-01

The Responsive Space initiative has several implications for flight software that need to be addressed not only within the run-time element, but within the development infrastructure and software life-cycle process elements as well. The run-time element must at a minimum support Plug & Play, while the development and process elements need to incorporate methods to quickly generate the needed documentation, code, tests, and all of the artifacts required of flight-quality software. Very rapid response times go even further and imply little or no new software development, requiring instead only pre-developed and certified software modules that can be integrated and tested through automated methods. These elements have typically been addressed individually, with significant benefits, but it is when they are combined that they can have the greatest impact on Responsive Space. The Flight Software Branch at NASA's Goddard Space Flight Center has been developing the runtime, infrastructure, and process elements needed for rapid integration with the Core Flight Software System (CFS) architecture. The CFS architecture consists of three main components: the core Flight Executive (cFE), the component catalog, and the Integrated Development Environment (IDE). This paper discusses the design of the components, how they facilitate rapid integration, and lessons learned as the architecture is utilized for an upcoming spacecraft.

  9. Software Communication Architecture Implementation and Its Waveform Application

    Institute of Scientific and Technical Information of China (English)

    SUN Pei-gang; ZHAO Hai; WANG Ting-chang; FAN Jian-hua

    2006-01-01

This paper presents research on the development of software-defined radio (SDR) based on the software communication architecture (SCA). Firstly, SCA is studied and a complete reference model of the SCA 3.0 core framework (CF) is realized. Secondly, an application-specific FM3TR waveform is implemented on a common software platform based on the reference model. Thirdly, from the point of view of real-time performance and software reuse, tests and validations are performed on the realized CF reference model and the FM3TR waveform. The results show that the SCA-compliant SDR has favorable interoperability and software portability and can satisfy real-time performance requirements that are not too rigorous.

  10. Advanced Core Monitoring Framework: An overview description

    International Nuclear Information System (INIS)

One of the most significant developments in nuclear power plant operations in recent years is the application of digital computers to monitor and manage power plant processes. The introduction of this technology, moreover, is not without its problems. At present, each of the advanced core monitoring systems, such as GE's MONICORE, EXXON's POWERPLEX, and EPRI's PSMS, works only by itself, in an operating configuration that makes it difficult to compare, benchmark, or replace it with alternative core monitoring packages. The Advanced Core Monitoring Framework (ACMF) was conceived to provide one standard software framework, on a number of different virtual-memory minicomputers, within which modules from any of the core monitoring systems (both BWR and PWR) could be installed. The primary theme of ACMF is to build a framework that allows software plug-in compatibility for a variety of core monitoring functional packages by carefully controlling (standardizing) module interfaces to a well-defined database and requiring a common man-machine interface to be installed
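
The plug-in idea can be illustrated with a small sketch: modules conforming to one standardized interface exchange data only through a shared database, so any conforming package can be swapped in. The interface, module, and field names below are hypothetical, invented for illustration, and not ACMF's actual design.

```python
from abc import ABC, abstractmethod

class CoreMonitorModule(ABC):
    """Hypothetical standardized module interface in the spirit of ACMF:
    every monitoring package reads from and writes to only the shared database."""
    @abstractmethod
    def compute(self, db: dict) -> dict:
        """Read plant inputs from db; return derived quantities to store back."""

class ThermalMarginMonitor(CoreMonitorModule):
    def compute(self, db):
        # Illustrative only: margin = temperature limit minus peak observed temperature.
        return {"thermal_margin": db["temp_limit"] - max(db["core_temps"])}

class Framework:
    """Runs any set of plug-compatible modules against one well-defined database."""
    def __init__(self, modules):
        self.modules = modules

    def scan(self, db):
        for m in self.modules:
            db.update(m.compute(db))   # each module's outputs become shared inputs
        return db

db = Framework([ThermalMarginMonitor()]).scan(
    {"temp_limit": 620.0, "core_temps": [540.0, 565.0, 590.0]})
```

Because every module touches only the database keys, replacing `ThermalMarginMonitor` with an alternative vendor's implementation would require no changes to the framework, which is the plug-in compatibility the abstract describes.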

  11. The fallacy of Software Patents

    CERN Document Server

    CERN. Geneva

    2015-01-01

Software patents are usually presented as an argument for innovation, but do they really promote innovation? Who really benefits from software patents? This talk attempts to show the problems with software patents and how they can actually harm innovation, while having little value for software users and our society in general.

  12. Evaluating Commercial Game System Software

    OpenAIRE

    Quinn, Kelly; クイン, ケリー

    2009-01-01

    This paper describes the TOEIC test DS Training software published by Obunsha for the Nintendo DS game system. This paper describes the different features of the software, the advantages and disadvantages of the software and the results of a survey of students' reactions to the software and using the DS as a platform for studying English.

  13. Patterns for Parallel Software Design

    CERN Document Server

    Ortega-Arjona, Jorge Luis

    2010-01-01

Essential reading to understand patterns for parallel programming. Software patterns have revolutionized the way we think about how software is designed, built, and documented, and the design of parallel software requires you to consider other particular design aspects and special skills. From clusters to supercomputers, success heavily depends on the design skills of software developers. Patterns for Parallel Software Design presents a pattern-oriented software architecture approach to parallel software design. This approach is not a design method in the classic sense, but a new way of managin

  14. The Art of Software Testing

    CERN Document Server

    Myers, Glenford J; Badgett, Tom

    2011-01-01

The classic, landmark work on software testing. The hardware and software of computing have changed markedly in the three decades since the first edition of The Art of Software Testing, but this book's powerful underlying analysis has stood the test of time. Whereas most books on software testing target particular development techniques, languages, or testing methods, The Art of Software Testing, Third Edition provides a brief but powerful and comprehensive presentation of time-proven software testing approaches. If your software development project is mission critical, this book is an investme

  15. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test management. This book examines the principles, concepts, and processes that are fundamental to the software testing function. The book is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c

  16. A Core Language for Separate Variability Modeling

    DEFF Research Database (Denmark)

    Iosif-Lazăr, Alexandru Florin; Wasowski, Andrzej; Schaefer, Ina

    2014-01-01

    Separate variability modeling adds variability to a modeling language without requiring modifications of the language or the supporting tools. We define a core language for separate variability modeling using a single kind of variation point to define transformations of software artifacts in object...... hierarchical dependencies between variation points via copying and flattening. Thus, we reduce a model with intricate dependencies to a flat executable model transformation consisting of simple unconditional local variation points. The core semantics is extremely concise: it boils down to two operational rules...
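
As a rough illustration of reducing hierarchical variation points to a flat list of unconditional local transformations, the sketch below assumes a variation point is a feature name guarding a list of edits plus nested child variation points; this representation and all names are invented for illustration and are not the paper's core language.

```python
def flatten(vp, config):
    """Resolve nested variation points into a flat sequence of unconditional
    local edits, a sketch of the copy-and-flatten step for a chosen configuration.
    A variation point is a tuple: (feature, edits, children)."""
    feature, edits, children = vp
    if not config.get(feature, False):
        return []                      # disabled feature: drop it and its children
    flat = list(edits)                 # copy this point's local edits
    for child in children:
        flat.extend(flatten(child, config))   # flatten hierarchical dependencies
    return flat

# Hypothetical model: a 'logging' feature with an optional 'remote' sub-feature.
model = ("logging",
         [("add_class", "Logger")],
         [("remote", [("add_field", "Logger.endpoint")], [])])
edits = flatten(model, {"logging": True, "remote": True})
```

After flattening, the resulting edit list can be applied in order with no remaining conditionals, matching the idea of a flat executable model transformation.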

  17. The NOvA software testing framework

    Science.gov (United States)

    Tamsett, M.; C Group

    2015-12-01

The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. NOvA has already produced more than one million Monte Carlo and detector-generated files amounting to more than 1 PB in size. This data is divided between a number of parallel streams, such as far and near detector beam spills, cosmic ray backgrounds, a number of data-driven triggers, and over 20 different Monte Carlo configurations. Each of these data streams must be processed through the appropriate steps of the rapidly evolving, multi-tiered, interdependent NOvA software framework. In total there are more than 12 individual software tiers, each of which performs a different function and can be configured differently depending on the input stream. In order to regularly test and validate that all of these software stages are working correctly, NOvA has designed a powerful, modular testing framework that enables detailed validation and benchmarking to be performed in a fast, efficient and accessible way with minimal expert knowledge. The core of this system is a novel series of Python modules which wrap, monitor and handle the underlying C++ software framework and then report the results to a slick web-based front-end interface. This interface utilises modern, cross-platform visualisation libraries to render the test results in a meaningful way. They are fast and flexible, allowing for the easy addition of new tests and datasets. In total, upwards of 14 individual streams are regularly tested, amounting to over 70 individual software processes and producing over 25 GB of output files. The rigour enforced through this flexible testing framework enables NOvA to rapidly verify configurations, results and software, and thus ensure that data is available for physics analysis in a timely and robust manner.
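
The wrap-and-report pattern described above can be sketched generically: a Python driver runs each tier of a stream as a subprocess, checks its exit status and output, and accumulates a structured result. The tier names, commands, and checks below are illustrative assumptions; NOvA's actual modules are not reproduced here.

```python
import subprocess
import sys

def run_tier(name, cmd, expect_output=None):
    """Run one software tier as a subprocess and record a structured result."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    ok = proc.returncode == 0
    if ok and expect_output is not None:
        ok = expect_output in proc.stdout        # simple output validation
    return {"tier": name, "passed": ok, "returncode": proc.returncode}

def run_stream(stream, tiers):
    """Run every tier of one data stream in order; stop at the first failure."""
    results = []
    for name, cmd, expect in tiers:
        r = run_tier(name, cmd, expect)
        results.append(r)
        if not r["passed"]:
            break                                # downstream tiers depend on this one
    return {"stream": stream, "results": results,
            "passed": all(r["passed"] for r in results)}

# Hypothetical two-tier stream; stand-in commands that print a known marker.
report = run_stream("far-detector-beam", [
    ("reco",  [sys.executable, "-c", "print('reco ok')"],  "reco ok"),
    ("calib", [sys.executable, "-c", "print('calib ok')"], "calib ok"),
])
```

A front end would then render the per-tier dictionaries; adding a new test is just appending a `(name, cmd, expect)` tuple, which reflects the flexibility the abstract emphasizes.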

  18. Writing testable software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Knirk, D. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

This tutorial identifies common problems in analyzing requirements in the problem domain and constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.

  19. Global Software Development

    DEFF Research Database (Denmark)

    Søderberg, Anne-Marie; Krishna, S.; Bjørn, Pernille

    2013-01-01

    accounts of close collaboration processes in two large and complex projects, where off-shoring of software development is moved to a strategic level, we found that the vendor was able to establish a strategic partnership through long-term engagement with the field of banking and insurance as well...

  20. Software for noise measurements

    International Nuclear Information System (INIS)

The CURS program library, comprising 38 Fortran programs designed for processing discrete experimental data in the form of random or deterministic periodic processes, is described. The library is based on the modular construction principle, which allows one to create from it any set of programs to solve tasks related to NPP operation, and to develop special software.

  1. Software Carpentry: Lessons Learned

    CERN Document Server

    Wilson, Greg

    2013-01-01

Over the last 15 years, Software Carpentry has evolved from a week-long training course at the US national laboratories into a worldwide volunteer effort to raise standards in scientific computing. This article explains what we have learned along the way, the challenges we now face, and our plans for the future.

  2. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.

  3. Iterative software kernels

    Energy Technology Data Exchange (ETDEWEB)

    Duff, I.

    1994-12-31

    This workshop focuses on kernels for iterative software packages. Specifically, the three speakers discuss various aspects of sparse BLAS kernels. Their topics are: "Current status of user-level sparse BLAS"; "Current status of the sparse BLAS toolkit"; and "Adding matrix-matrix and matrix-matrix-matrix multiply to the sparse BLAS toolkit".

  4. UAS-NAS Live Virtual Constructive Distributed Environment (LVC): LVC Gateway, Gateway Toolbox, Gateway Data Logger (GDL), SaaProc Software Design Description

    Science.gov (United States)

    Jovic, Srboljub

    2015-01-01

    This document provides the software design description for two core software components, the LVC Gateway and the LVC Gateway Toolbox, and for two participants, the LVC Gateway Data Logger and the SAA Processor (SaaProc).

  5. Generic Kalman Filter Software

    Science.gov (United States)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains code for a generic Kalman-filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions, and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on
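    The GKF library itself is ANSI C, but the predict/update cycle it generalizes can be sketched compactly in Python for the scalar linear case. The function names and the constant-signal example below are illustrative, not the library's API.

```python
def kf_predict(x, p, f, q):
    """Propagate the state estimate x and its variance p through x' = f*x,
    adding process noise variance q."""
    return f * x, f * p * f + q

def kf_update(x, p, z, h, r):
    """Fold in a measurement z = h*x + noise (variance r); return (x, p)."""
    k = p * h / (h * p * h + r)             # Kalman gain
    return x + k * (z - h * x), (1.0 - k * h) * p

# Estimate a constant signal of 5.0 from repeated measurements.
x, p = 0.0, 1.0                             # initial guess and its variance
for z in [5.0, 5.0, 5.0, 5.0, 5.0]:
    x, p = kf_predict(x, p, f=1.0, q=0.01)  # static process model
    x, p = kf_update(x, p, z, h=1.0, r=0.1)
```

The estimate converges toward 5.0 while the variance p shrinks, which is exactly the behavior the state- and covariance-update and -propagation functions generalize to the vector case.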

  6. Evaluation & Optimization of Software Engineering

    OpenAIRE

    Asaduzzaman Noman; Atik Ahmed Sourav; Shakh Md. Alimuzjaman Alim

    2016-01-01

    The term is made of two words, software and engineering. Software is more than just program code. A program is an executable code which serves some computational purpose. Software is considered to be a collection of executable programming code, associated libraries, and documentation. Software made for a specific requirement is called a software product. Engineering, on the other hand, is all about developing products using well-defined scientific principles and methods. The outcome ...

  7. Self-assembling software generator

    Science.gov (United States)

    Bouchard, Ann M.; Osbourn, Gordon C.

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
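    In the spirit of the technique described (though not the patented implementation itself), a minimal Python sketch might represent the task specification as a plain dictionary that is inspected to generate, link, and execute the software entities. All names below are invented for illustration.

```python
def generate_task(spec):
    """Assemble an executable task by inspecting a task-specification
    structure: which entities to generate, how to link them, and the
    logic each one runs. The dict layout is a hypothetical illustration."""
    # 1. Inspect the spec to determine which software entities to generate.
    entities = {name: spec["logic"][name] for name in spec["entities"]}

    # 2. Link the generated entities in the declared order.
    def task(value):
        # 3. Execute each entity's logic in sequence.
        for name in spec["links"]:
            value = entities[name](value)
        return value

    return task

spec = {
    "entities": ["scale", "shift"],
    "links": ["scale", "shift"],
    "logic": {"scale": lambda v: v * 2, "shift": lambda v: v + 1},
}
task = generate_task(spec)
```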

  8. Numerical software: science or alchemy

    Energy Technology Data Exchange (ETDEWEB)

    Gear, C.W.

    1979-06-01

    This is a summary of the Forsythe lecture presented at the Computer Science Conference, Dayton, Ohio, in February 1979. It examines the activity called Numerical Software, first to see what distinguishes numerical software from any other form of software and why numerical software is so much more difficult. Then it examines the scientific basis of such software and discusses what is lacking in that basis.

  9. Astropy: Community Python Software for Astronomy

    Science.gov (United States)

    Greenfield, Perry; Tollerud, E. J.; Robitaille, T.; Developers, Astropy

    2014-01-01

    The Astropy Project is a community effort to develop an open source Python package of common data structures and routines for use by other, more specialized astronomy software in Python, in order to foster software interoperability in the astronomical community. The project encompasses Astropy's "core" and "affiliated" packages that adopt Astropy's coding, testing and documentation standards. By doing so we aim to improve interoperability with other Python packages in astronomy, and help a broader community implement more Pythonic solutions to astronomy computing problems while minimizing duplication of effort. The project provides a template for other projects that use Astropy to reuse much of Astropy's development framework without reinventing the wheel. Here we present an overview of the key features of the core package (existing and upcoming), current and planned affiliated packages, and how we manage a large open source project with a diverse community of contributors.

  10. Software-Defined Cellular Mobile Network Solutions

    Institute of Scientific and Technical Information of China (English)

    Jiandong Li; Peng Liu; Hongyan Li

    2014-01-01

    The emergence of software-defined networking (SDN), especially in terms of the prototype associated with OpenFlow, provides new possibilities for innovating on network design. Researchers have started to extend SDN to cellular networks. Such a new programmable architecture is beneficial to the evolution of mobile networks and allows operators to provide better services. The typical cellular network comprises the radio access network (RAN) and the core network (CN); hence, the technique roadmap diverges in two ways. In this paper, we investigate SoftRAN, the latest SDN solution for the RAN, and SoftCell and MobileFlow, the latest solutions for the CN. We also define a series of control functions for CROWD. Unlike the other literature, we emphasize only software-defined cellular network solutions and specifications in order to provide possible research directions.

  11. SAPHIRE models and software for ASP evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B.; Schroeder, J.A.; Russell, K.D. [Idaho National Engineering Lab., Idaho Falls, ID (United States)] [and others]

    1995-04-01

    The Idaho National Engineering Laboratory (INEL) over the past year has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of conditional core damage probability (CCDP) evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both NRR and AEOD. This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events.

  12. Software for airborne radiation monitoring system

    International Nuclear Information System (INIS)

    The Airborne Radiation Monitoring System monitors radioactive contamination in the air or on the ground. The contamination source can be a radioactive plume or an area contaminated with radionuclides. This system is composed of two major parts: Airborne Unit carried by a helicopter, and Ground Station carried by a truck. The Airborne software is intended to be the core of a computerized airborne station. The software is written in C++ under MS-Windows with object-oriented methodology. It has been designed to be user-friendly: function keys and other accelerators are used for vital operations, a help file and help subjects are available, the Human-Machine-Interface is plain and obvious. (authors)

  13. Flight Software Math Library

    Science.gov (United States)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter order, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation, and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance, Navigation, and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the libraries to be maintained with the same strategy used in their initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as in other environments like the GN&C analysts' simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.
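    A minimal sketch of what such standardization buys: every routine follows one naming pattern and one parameter order, so callers never have to guess. The actual library is ANSI C; the Python names below are purely illustrative.

```python
# Illustrative conventions a reusable math library standardizes:
# one prefix per type ("vector3_"), operands in a fixed order, and a
# consistent docstring style. All names here are hypothetical.

def vector3_add(a, b):
    """Component-wise sum of two 3-vectors."""
    return [a[i] + b[i] for i in range(3)]

def vector3_dot(a, b):
    """Dot product of two 3-vectors."""
    return sum(a[i] * b[i] for i in range(3))

def vector3_scale(a, s):
    """3-vector a scaled by scalar s (vector first, scalar second)."""
    return [a[i] * s for i in range(3)]
```

Because the conventions are fixed once, flight software and analysts' simulators can share the same call syntax, which is the communication benefit the abstract describes.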

  14. Framework of Software Quality Management Using Object oriented Software Agent

    Directory of Open Access Journals (Sweden)

    Anand Pandey

    2013-01-01

    Development of software is a scientific and economic problem, particularly the design of complex systems, which requires evolving methods and approaches. Agent technology is currently one of the most active and vibrant areas of IT research and development. Object-oriented Software Engineering (OOSE) has become an active area of research in recent years. In this paper, we review a framework for software quality management that uses object-oriented methodology concepts for software agents. The software specification acts as a bridge between customers, architects, software developers, and testers. Using the object-oriented concept of a software agent and its standards may offer benefits even if the system is implemented without an object-based language or framework. We propose and discuss a software agent framework specifically to support software quality management. Although still in its initial phases, research indicates some promise in enabling software developers to meet market expectations and produce projects timeously, within budget, and to users' satisfaction. However, the software quality management environment has also changed and is continuously evolving. Currently, software projects are developed and deployed in distributed, pervasive, and collaborative environments, and their quality should be managed by applying the best standards. From the point of view of software engineering, this framework and its standards apply to developing software projects. We discuss the standards and benefits that can be gained by using object-oriented concepts, and where the concepts require further development.

  15. Software Polarization Spectrometer "PolariS"

    OpenAIRE

    Mizuno, Izumi; Kameno, Seiji; Kano, Amane; Kuroo, Makoto; Nakamura, Fumitaka; KAWAGUCHI, Noriyuki; Shibata, Katsunori M.; Kuji, Seisuke; Kuno, Nario

    2014-01-01

    We have developed a software-based polarization spectrometer, PolariS, to acquire full-Stokes spectra with a very high spectral resolution of 61 Hz. The primary aim of PolariS is to measure the magnetic fields in dense star-forming cores by detecting the Zeeman splitting of molecular emission lines. The spectrometer consists of a commercially available digital sampler and a Linux computer. The computer is equipped with a graphics processing unit (GPU) to process FFT and cross-correlation usin...

  16. Are Academic Programs Adequate for the Software Profession?

    Science.gov (United States)

    Koster, Alexis

    2010-01-01

    According to the Bureau of Labor Statistics, close to 1.8 million people, or 77% of all computer professionals, were working in the design, development, deployment, maintenance, and management of software in 2006. The ACM [Association for Computing Machinery] model curriculum for the BS in computer science proposes that about 42% of the core body…

  17. Software Development Practices, Software Complexity, and Software Maintenance Performance: A Field Study

    OpenAIRE

    Banker, Rajiv D.; Davis, Gordon B.; Sandra A. Slaughter

    1998-01-01

    Software maintenance claims a large proportion of organizational resources. It is thought that many maintenance problems derive from inadequate software design and development practices. Poor design choices can result in complex software that is costly to support and difficult to change. However, it is difficult to assess the actual maintenance performance effects of software development practices because their impact is realized over the software life cycle. To estimate the impact of develop...

  18. A software engineering process for safety-critical software application.

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-02-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature of a viable, high-quality, safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3, and 4 nuclear power generation plants. This process differs significantly from a conventional process in its rigorous software development phases and software design techniques. The process covers documentation, design, verification, and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verifying code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design, using a modular design technique so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author).
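    The information-hiding principle the process relies on can be illustrated with a small sketch (names invented, not from the Wolsung software): callers see only the interface, so a change to the hidden representation stays confined to one module.

```python
class TripSetpoints:
    """Information-hiding sketch: the trip-limit representation is private
    to this class, so a representation change cannot ripple outward.
    Names are hypothetical, not taken from the shutdown-system code."""

    def __init__(self):
        self._limits = {}                 # hidden representation

    def set_limit(self, channel, value):
        """Record the trip limit for a named channel."""
        self._limits[channel] = float(value)

    def exceeds(self, channel, reading):
        """Return True if a reading exceeds its channel's trip limit."""
        return reading > self._limits[channel]
```

If the storage ever changed (say, to a sorted table), only this module would be touched, which is exactly how the process localizes the scope of a change or a detected error.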

  19. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature of a viable, high-quality, safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3, and 4 nuclear power generation plants. This process differs significantly from a conventional process in its rigorous software development phases and software design techniques. The process covers documentation, design, verification, and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verifying code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design, using a modular design technique so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

  20. Software Architecture Design Reasoning

    Science.gov (United States)

    Tang, Antony; van Vliet, Hans

    Despite recent advancements in software architecture knowledge management and design rationale modeling, industrial practice is behind in adopting these methods. The lack of empirical proof and the lack of a practical process that can be easily incorporated by practitioners are some of the hindrances to adoption. In particular, a process to support systematic design reasoning is not available. To rectify this issue, we propose a design reasoning process to help architects cope with an architectural design environment where design concerns are cross-cutting and diversified. We use an industrial case study to validate that the design reasoning process can help improve the quality of software architecture design. The results indicate that associating design concerns and identifying design options are important steps in design reasoning.

  1. Chemical recognition software

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, J.S.; Trahan, M.W.; Nelson, W.E.; Hargis, P.H. Jr.; Tisone, G.C.

    1994-06-01

    We have developed a capability to make real time concentration measurements of individual chemicals in a complex mixture using a multispectral laser remote sensing system. Our chemical recognition and analysis software consists of three parts: (1) a rigorous multivariate analysis package for quantitative concentration and uncertainty estimates, (2) a genetic optimizer which customizes and tailors the multivariate algorithm for a particular application, and (3) an intelligent neural net chemical filter which pre-selects from the chemical database to find the appropriate candidate chemicals for quantitative analyses by the multivariate algorithms, as well as providing a quick-look concentration estimate and consistency check. Detailed simulations using both laboratory fluorescence data and computer synthesized spectra indicate that our software can make accurate concentration estimates from complex multicomponent mixtures, even when the mixture is noisy and contaminated with unknowns.
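    The core of the multivariate step is an ordinary least-squares fit of reference spectra to the measured mixture. Below is a toy Python sketch for two noise-free components; the reported system additionally propagates uncertainties and uses the genetic optimizer and neural-net pre-filter described above.

```python
def unmix(mixture, ref_a, ref_b):
    """Least-squares estimate of the concentrations of two reference
    spectra in a measured mixture. Toy illustration only: solves the
    normal equations (R^T R) c = R^T m for the 2-component case."""
    aa = sum(a * a for a in ref_a)
    bb = sum(b * b for b in ref_b)
    ab = sum(a * b for a, b in zip(ref_a, ref_b))
    am = sum(a * m for a, m in zip(ref_a, mixture))
    bm = sum(b * m for b, m in zip(ref_b, mixture))
    det = aa * bb - ab * ab
    return (bb * am - ab * bm) / det, (aa * bm - ab * am) / det

# Synthesize a mixture of 0.3 * A + 0.7 * B (noise-free for clarity).
ref_a = [1.0, 0.0, 2.0]
ref_b = [0.0, 1.0, 1.0]
mixture = [0.3 * a + 0.7 * b for a, b in zip(ref_a, ref_b)]
ca, cb = unmix(mixture, ref_a, ref_b)
```

With noisy spectra the same fit returns the best-fitting concentrations rather than the exact ones, which is where the uncertainty estimates of the full package come in.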

  2. Chemical recognition software

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, J.S.; Trahan, M.W.; Nelson, W.E.; Hargis, P.J. Jr.; Tisone, G.C.

    1994-12-01

    We have developed a capability to make real time concentration measurements of individual chemicals in a complex mixture using a multispectral laser remote sensing system. Our chemical recognition and analysis software consists of three parts: (1) a rigorous multivariate analysis package for quantitative concentration and uncertainty estimates, (2) a genetic optimizer which customizes and tailors the multivariate algorithm for a particular application, and (3) an intelligent neural net chemical filter which pre-selects from the chemical database to find the appropriate candidate chemicals for quantitative analyses by the multivariate algorithms, as well as providing a quick-look concentration estimate and consistency check. Detailed simulations using both laboratory fluorescence data and computer synthesized spectra indicate that our software can make accurate concentration estimates from complex multicomponent mixtures, even when the mixture is noisy and contaminated with unknowns.

  3. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: Choosing the right architecture for the market – laboratory, military, or commercial Hardware platforms – FPGAs, GPPs, specialized and hybrid devices Standardization efforts to ens...

  4. TOUGH2 software qualification

    International Nuclear Information System (INIS)

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  5. TOUGH2 software qualification

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  6. ThermalTracker Software

    Energy Technology Data Exchange (ETDEWEB)

    2016-08-10

    The software processes recorded thermal video and detects the flight tracks of birds and bats that passed through the camera's field of view. The output is a set of images that show complete flight tracks for any detections, with the direction of travel indicated and the thermal image of the animal delineated. A report of the descriptive features of each detected track is also output in the form of a comma-separated value text file.
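    A report of per-track descriptive features in comma-separated form might look like the following Python sketch. The column names and feature definitions are illustrative assumptions, not ThermalTracker's actual output schema.

```python
import csv
import io
import math

def track_report(tracks):
    """Write descriptive features of detected flight tracks as CSV text.

    `tracks` maps a track id to a list of (x, y) centroid positions, one
    per video frame. Field names are hypothetical examples of the kind of
    descriptive features such a report contains.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["track_id", "n_frames", "length_px", "direction_deg"]
    )
    writer.writeheader()
    for tid, points in tracks.items():
        (x0, y0), (x1, y1) = points[0], points[-1]
        writer.writerow({
            "track_id": tid,
            "n_frames": len(points),
            # straight-line displacement and heading from first to last frame
            "length_px": round(math.hypot(x1 - x0, y1 - y0), 1),
            "direction_deg": round(math.degrees(math.atan2(y1 - y0, x1 - x0)), 1),
        })
    return buf.getvalue()

report = track_report({1: [(0, 0), (3, 4)]})
```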

  7. Standard software for CAMAC

    International Nuclear Information System (INIS)

    The NIM Committee (National Instrumentation Methods Committee) of the U.S. Department of Energy and the ESONE Committee of European Laboratories have jointly specified standard software for use with CAMAC. Three general approaches were followed: the definition of a language called IML for use in CAMAC systems, the definition of a standard set of subroutine calls, and real-time extensions to the BASIC language. This paper summarizes the results of these efforts. 1 table

  8. Office software Individual coaching

    CERN Multimedia

    HR Department

    2010-01-01

    If one or several particular topics cause you sleepless nights, you can get the help of our trainer, who will come to your workplace for one or more 1-hour slots. All fields in which our trainer can help are detailed in the course descriptions in our training catalogue (Microsoft Office software, Adobe applications, i-applications, etc.). Please discover these new courses in our catalogue! Tel. 74924

  9. Software Defined Networking

    OpenAIRE

    Roncero Hervás, Óscar

    2014-01-01

    Software Defined Networks (SDN) is a paradigm in which routing decisions are taken by a control layer. In contrast to conventional network structures, the control plane and forwarding plane are separated and communicate through standard protocols like OpenFlow. Historically, network management was based on a layered approach, each one isolated from the others. SDN proposes a radically different approach by bringing together the management of all these layers into a single controller. It is th...

  10. Software for Spatial Statistics

    Directory of Open Access Journals (Sweden)

    Edzer Pebesma

    2015-02-01

    We give an overview of the papers published in this special issue on spatial statistics of the Journal of Statistical Software. 21 papers address issues covering visualization (micromaps, links to Google Maps or Google Earth), point pattern analysis, geostatistics, analysis of areal aggregated or lattice data, spatio-temporal statistics, Bayesian spatial statistics, and Laplace approximations. We also point to earlier publications in this journal on the same topic.

  11. Addressing Software Security

    Science.gov (United States)

    Bailey, Brandon

    2015-01-01

    Historically, security within organizations was thought of as an IT function (web sites/servers, email, workstation patching, etc.). The threat landscape has evolved (script kiddies, hackers, advanced persistent threats (APT), nation states, etc.) and the attack surface has expanded as networks become interconnected. Factors in an organization's security posture include the network layer (routers, firewalls, etc.), computer network defense (IPS/IDS, sensors, continuous monitoring, etc.), industrial control systems (ICS), and software security (COTS, FOSS, custom, etc.).

  12. Image Processing Software

    Science.gov (United States)

    1992-01-01

    To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.

  13. Spreadsheet Auditing Software

    OpenAIRE

    Nixon, David; O'Hara, Mike

    2010-01-01

    It is now widely accepted that errors in spreadsheets are both common and potentially dangerous. Further research has taken place to investigate how frequently these errors occur, what impact they have, how the risk of spreadsheet errors can be reduced by following spreadsheet design guidelines and methodologies, and how effective auditing of a spreadsheet is in the detection of these errors. However, little research exists to establish the usefulness of software tools in the auditing of spre...

  14. Banded transformer cores

    Science.gov (United States)

    Mclyman, C. W. T. (Inventor)

    1974-01-01

    A banded transformer core formed by positioning a pair of mated, similar core halves on a supporting pedestal. The core halves are encircled with a strap, selectively applying tension whereby a compressive force is applied to the core edge for reducing the innate air gap. A dc magnetic field is employed in supporting the core halves during initial phases of the banding operation, while an ac magnetic field subsequently is employed for detecting dimension changes occurring in the air gaps as tension is applied to the strap.

  15. Preliminaries on core image analysis using fault drilling samples; Core image kaiseki kotohajime (danso kussaku core kaisekirei)

    Energy Technology Data Exchange (ETDEWEB)

    Miyazaki, T.; Ito, H. [Geological Survey of Japan, Tsukuba (Japan)

    1996-05-01

    This paper introduces examples of image data analysis on fault drilling samples. The paper describes the following matters: the core samples used in the analysis are those obtained from wells drilled through the Nojima fault, which moved in the Hyogoken-Nanbu Earthquake; the CORESCAN system made by DMT Corporation, Germany, used in acquiring the image data, consists of a CCD camera, a light source and core rotation mechanism, and a personal computer, its resolution being about 5 pixels/mm in both axial and circumferential directions, and 24-bit full color; with respect to the opening fractures in core samples collected by using constant-azimuth coring, it was possible to derive values of the opening width, inclination angle, and travel from the image data by using commercially available software for the personal computer; and comparison of this core image with the BHTV record and the hydrophone VSP record (travel and inclination obtained from the BHTV record agree well with those obtained from the core image). 4 refs., 4 figs.

  16. Computer games and software engineering

    CERN Document Server

    Cooper, Kendra M L

    2015-01-01

Computer games represent a significant software application domain for innovative research in software engineering techniques and technologies. Game developers, whether focusing on entertainment-market opportunities or game-based applications in non-entertainment domains, thus share a common interest with software engineers and developers on how to best engineer game software. Featuring contributions from leading experts in software engineering, the book provides a comprehensive introduction to computer game software development that includes its history as well as emerging research on the inte

  17. Terminological recommendations for software localization

    Directory of Open Access Journals (Sweden)

    Klaus-Dirk Schmitz

    2012-08-01

After the explosive growth of data processing and software that began in the early 1980s, the software industry shifted strongly toward non-US markets at the beginning of the 1990s. Today we see the global marketing of software in almost all regions of the world. Since software is no longer used only by IT experts, and since European and national regulations require user interfaces, manuals and documentation to be provided in the language of the customer, the market for software translation, i.e. for software localization, is the fastest growing market in the translation business.

  18. Wildlife software: procedures for publication of computer software

    Science.gov (United States)

    Samuel, M.D.

    1990-01-01

    Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.

  19. JPI UML Software Modeling

    Directory of Open Access Journals (Sweden)

    Cristian Vidal Silva

    2015-12-01

Aspect-Oriented Programming (AOP) extends object-oriented programming (OOP) with aspects that modularize crosscutting behavior: aspects advise base code at join points selected by pointcut rules. However, join points introduce dependencies between aspects and base code, a major obstacle to truly independent development of software modules. Join Point Interfaces (JPI) represent join points as interfaces between classes and aspects, so that these modules do not depend on each other. Nevertheless, JPI, like AOP, is a programming methodology; a complete aspect-oriented software development process therefore also requires JPI requirements and JPI modeling phases. Toward that goal, this article proposes JPI UML class and sequence diagrams for modeling JPI software solutions. A purpose of these diagrams is to facilitate understanding of the structure and behavior of JPI programs. As an application example, this article applies the proposed JPI UML diagrams to a case study and analyzes the associated JPI code to demonstrate their benefits.

  20. Balloon Design Software

    Science.gov (United States)

    Farley, Rodger

    2007-01-01

    PlanetaryBalloon Version 5.0 is a software package for the design of meridionally lobed planetary balloons. It operates in a Windows environment, and programming was done in Visual Basic 6. By including the effects of circular lobes with load tapes, skin mass, hoop and meridional stress, and elasticity in the structural elements, a more accurate balloon shape of practical construction can be determined as well as the room-temperature cut pattern for the gore shapes. The computer algorithm is formulated for sizing meridionally lobed balloons for any generalized atmosphere or planet. This also covers zero-pressure, over-pressure, and super-pressure balloons. Low circumferential loads with meridionally reinforced load tapes will produce shapes close to what are known as the "natural shape." The software allows for the design of constant angle, constant radius, or constant hoop stress balloons. It uses the desired payload capacity for given atmospheric conditions and determines the required volume, allowing users to design exactly to their requirements. The formulations are generalized to use any lift gas (or mixture of gases), any atmosphere, or any planet as described by the local acceleration of gravity. PlanetaryBalloon software has a comprehensive user manual that covers features ranging from, but not limited to, buoyancy and super-pressure, convenient design equations, shape formulation, and orthotropic stress/strain.
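
The core sizing step the abstract describes, determining the required volume from the desired payload for given atmospheric conditions and lift gas, reduces to a buoyancy balance. A minimal sketch (illustrative functions, not PlanetaryBalloon's actual formulation; densities via the ideal gas law):

```python
R = 8.314  # J/(mol*K), universal gas constant

def gas_density(pressure_pa: float, molar_mass_kg: float, temp_k: float) -> float:
    """Ideal-gas density: rho = P*M / (R*T)."""
    return pressure_pa * molar_mass_kg / (R * temp_k)

def required_volume_m3(suspended_mass_kg: float,
                       rho_atmosphere: float,
                       rho_lift_gas: float) -> float:
    """Balloon volume at which buoyancy balances the suspended mass
    (payload plus skin): V * (rho_atm - rho_gas) = m."""
    return suspended_mass_kg / (rho_atmosphere - rho_lift_gas)

# Sea-level Earth example: air vs. helium at 288.15 K, 101325 Pa.
rho_air = gas_density(101325.0, 0.028964, 288.15)
rho_he = gas_density(101325.0, 0.0040026, 288.15)
print(required_volume_m3(100.0, rho_air, rho_he))  # ~95 m^3 for 100 kg
```

The generalization to any planet or gas mixture enters only through the density arguments, which matches the abstract's point that the formulation is parameterized by local gravity and atmosphere.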

  1. Evidence of Absence software

    Science.gov (United States)

    Dalthorp, Daniel; Huso, Manuela M. P.; Dail, David; Kenyon, Jessica

    2014-01-01

    Evidence of Absence software (EoA) is a user-friendly application used for estimating bird and bat fatalities at wind farms and designing search protocols. The software is particularly useful in addressing whether the number of fatalities has exceeded a given threshold and what search parameters are needed to give assurance that thresholds were not exceeded. The software is applicable even when zero carcasses have been found in searches. Depending on the effectiveness of the searches, such an absence of evidence of mortality may or may not be strong evidence that few fatalities occurred. Under a search protocol in which carcasses are detected with nearly 100 percent certainty, finding zero carcasses would be convincing evidence that overall mortality rate was near zero. By contrast, with a less effective search protocol with low probability of detecting a carcass, finding zero carcasses does not rule out the possibility that large numbers of animals were killed but not detected in the searches. EoA uses information about the search process and scavenging rates to estimate detection probabilities to determine a maximum credible number of fatalities, even when zero or few carcasses are observed.
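
The abstract's central point, that finding zero carcasses is only informative when detection probability is high, can be illustrated with a deliberately simplified calculation (not EoA's actual Bayesian estimator; the function names are invented): if each fatality is detected independently with probability g, the chance of finding zero carcasses when M animals were killed is (1-g)^M, and the maximum credible M is the largest count still consistent with a zero-carcass search.

```python
def p_zero_found(m_fatalities: int, detection_prob: float) -> float:
    """Probability of observing zero carcasses if m_fatalities occurred,
    with each carcass detected independently with detection_prob."""
    return (1.0 - detection_prob) ** m_fatalities

def max_credible_fatalities(detection_prob: float, credibility: float = 0.95) -> int:
    """Largest fatality count whose zero-carcass probability still
    exceeds 1 - credibility, i.e. still consistent with finding nothing."""
    m = 0
    while p_zero_found(m + 1, detection_prob) > 1.0 - credibility:
        m += 1
    return m

# Effective searches: zero finds rule out all but a single fatality.
print(max_credible_fatalities(0.90))  # 1
# Ineffective searches: zero finds are consistent with dozens of fatalities.
print(max_credible_fatalities(0.05))  # 58
```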

  2. FPGAs for software programmers

    CERN Document Server

    Hannig, Frank; Ziener, Daniel

    2016-01-01

    This book makes powerful Field Programmable Gate Array (FPGA) and reconfigurable technology accessible to software engineers by covering different state-of-the-art high-level synthesis approaches (e.g., OpenCL and several C-to-gates compilers). It introduces FPGA technology, its programming model, and how various applications can be implemented on FPGAs without going through low-level hardware design phases. Readers will get a realistic sense for problems that are suited for FPGAs and how to implement them from a software designer’s point of view. The authors demonstrate that FPGAs and their programming model reflect the needs of stream processing problems much better than traditional CPU or GPU architectures, making them well-suited for a wide variety of systems, from embedded systems performing sensor processing to large setups for Big Data number crunching. This book serves as an invaluable tool for software designers and FPGA design engineers who are interested in high design productivity through behavi...

  3. The ALMA software architecture

    Science.gov (United States)

    Schwarz, Joseph; Farris, Allen; Sommer, Heiko

    2004-09-01

The software for the Atacama Large Millimeter Array (ALMA) is being developed by many institutes on two continents. The software itself will function in a distributed environment, from the 0.5-14 km baselines that separate antennas to the larger distances that separate the array site at the Llano de Chajnantor in Chile from the operations and user support facilities in Chile, North America and Europe. Distributed development demands 1) interfaces that allow separated groups to work with minimal dependence on their counterparts at other locations; and 2) a common architecture to minimize duplication and ensure that developers can always perform similar tasks in a similar way. The Container/Component model provides a blueprint for the separation of functional from technical concerns: application developers concentrate on implementing functionality in Components, which depend on Containers to provide them with services such as access to remote resources, transparent serialization of entity objects to XML, logging, error handling and security. Early system integrations have verified that this architecture is sound and that developers can successfully exploit its features. The Containers and their services are provided by a system-oriented development team as part of the ALMA Common Software (ACS), middleware that is based on CORBA.
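
The separation the Container/Component model enforces can be sketched in a few lines (a hedged illustration; the real ACS containers are CORBA-based, and every name below is invented): the component implements only functional logic and obtains technical services, such as logging, from its container.

```python
class Container:
    """Provides technical services so components need not implement them."""
    def __init__(self, name: str):
        self.name = name
        self._components = {}

    def get_logger(self, component_name: str):
        # A trivial stand-in for a real logging service.
        def log(msg: str) -> None:
            print(f"[{self.name}/{component_name}] {msg}")
        return log

    def activate(self, component) -> None:
        # The container hands the component its services at activation.
        component.initialize(self)
        self._components[component.name] = component

class CorrelatorComponent:
    """Application code: only functional concerns live here."""
    name = "Correlator"

    def initialize(self, container: Container) -> None:
        self.log = container.get_logger(self.name)

    def process(self, samples) -> float:
        mean = sum(samples) / len(samples)
        self.log(f"processed {len(samples)} samples")
        return mean

container = Container("Container01")
comp = CorrelatorComponent()
container.activate(comp)
print(comp.process([1.0, 2.0, 3.0]))  # logs, then prints 2.0
```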

  4. Terra Harvest software architecture

    Science.gov (United States)

    Humeniuk, Dave; Klawon, Kevin

    2012-06-01

    Under the Terra Harvest Program, the DIA has the objective of developing a universal Controller for the Unattended Ground Sensor (UGS) community. The mission is to define, implement, and thoroughly document an open architecture that universally supports UGS missions, integrating disparate systems, peripherals, etc. The Controller's inherent interoperability with numerous systems enables the integration of both legacy and future UGS System (UGSS) components, while the design's open architecture supports rapid third-party development to ensure operational readiness. The successful accomplishment of these objectives by the program's Phase 3b contractors is demonstrated via integration of the companies' respective plug-'n'-play contributions that include controllers, various peripherals, such as sensors, cameras, etc., and their associated software drivers. In order to independently validate the Terra Harvest architecture, L-3 Nova Engineering, along with its partner, the University of Dayton Research Institute, is developing the Terra Harvest Open Source Environment (THOSE), a Java Virtual Machine (JVM) running on an embedded Linux Operating System. The Use Cases on which the software is developed support the full range of UGS operational scenarios such as remote sensor triggering, image capture, and data exfiltration. The Team is additionally developing an ARM microprocessor-based evaluation platform that is both energy-efficient and operationally flexible. The paper describes the overall THOSE architecture, as well as the design decisions for some of the key software components. Development process for THOSE is discussed as well.

  5. REUSING APPROACH FOR SOFTWARE PROCESSES BASED ON SOFTWARE ARCHITECTURES.

    OpenAIRE

    Aoussat, Fadila; Ahmed-Nacer, Mohamed; Oussalah, Mourad Chabane

    2010-01-01

Capitalizing and reusing knowledge in the field of software process engineering is the objective of this work. In order to ensure high quality for software process models with regard to the specific needs of new development techniques and methods, we propose an approach based on two essential points: capitalization of knowledge through a domain ontology, and reuse of this knowledge by handling software process models as software architectures.

  6. Software is a directed multigraph (and so is software process)

    OpenAIRE

    Dabrowski, Robert; Stencel, Krzysztof; Timoszuk, Grzegorz

    2011-01-01

For a software system, its architecture is typically defined as the fundamental organization of the system incorporated by its components, their relationships to one another and their environment, and the principles governing their design. If contributed to by the artifacts corresponding to engineering processes that govern the system's evolution, the definition gets naturally extended into the architecture of software and software process. Obviously, as long as there were no software systems, ...

  7. Harnessing software development contexts to inform software process selection decisions

    OpenAIRE

    Jeners, Simona; O'Connor, Rory V.; Clake, Paul; Lichter, Horst; Lepmets, Marion; Buglione, Luigi

    2013-01-01

Software development is a complex process for which numerous approaches have been suggested. However, no single approach to software development has been met with universal acceptance, which is not surprising, as there are many different software development concerns. In addition, there are a multitude of other contextual factors that influence the choice of software development process and process management decisions. The authors believe it is important to de...

  8. Bridging the Gap Between Software Process and Software Development

    OpenAIRE

    Rouillé, Emmanuelle; Combemale, Benoit; Barais, Olivier; David, Touzet; Jézéquel, Jean-Marc

    2011-01-01

Model Driven Engineering (MDE) benefits software development (a.k.a. Model Driven Software Development) as well as software processes (a.k.a. Software Process Modeling). Nevertheless, the gap between processes and development is still too great. Indeed, information from processes is not always used to improve development and vice versa. For instance, it is possible to define the development tools used in a process description without linking them to the real tools. This p...

  9. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by adaptation of standard software packages for manufacturing control. After investigation and test of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC programming. Experience gained from process control is planned to be investigated for discrete parts manufacturing.

  10. How can software SMEs become medical device software SMEs

    OpenAIRE

    Mc Caffery, Fergal; Casey, Valentine; Mc Hugh, Martin

    2011-01-01

Today the amount of software content within medical devices has grown considerably and will continue to do so as the level of complexity of medical devices continues to increase. This is driven by the fact that software is introduced to produce sophisticated medical devices that would not be possible using only hardware. This therefore presents opportunities for software development SMEs to become medical device software development organisations. However, some obstacles need...

  11. TOWARD SOFTWARE ENGINEERING PRINCIPLES BASED ON ISLAMIC ETHICAL VALUES

    Directory of Open Access Journals (Sweden)

    shihab A. Hameed

    2010-09-01

Software is the core of computer-based applications, which have become an essential part of critical control systems, health and human life safeguard systems, financial and banking systems, educational and other systems. This requires software engineers who are qualified both professionally and ethically. Literature review and survey results show that software engineering professionals face several ethics-related problems which are costly, harmful and affect a high ratio of people. Professional organizations like ACM, IEEE, ABET and CSAC have established codes of ethics to help software engineering professionals understand and manage their ethical responsibilities. Islam considers ethics an essential factor in building individuals, communities and society. Islamic ethics are a set of moral principles and guidance that recognizes right behavior from wrong; they are comprehensive, stable, fair, and historically proven in building an ethically great society. The 1.3 billion Muslims, with tens of thousands of software engineers among them, should have an effective role in software development and life, which requires them to understand and implement ethics, especially Islamic ethics, in their work. This paper is a framework for modeling software engineering principles. It focuses mainly on adopting a new version of software engineering principles based on Islamic ethical values.

  12. A Framework for Software Preservation

    Directory of Open Access Journals (Sweden)

    Brian Matthews

    2010-07-01

Software preservation has not had detailed consideration as a research topic or in practical application. In this paper, we present a conceptual framework to capture and organise the main notions of software preservation, which are required for a coherent and comprehensive approach. This framework has three main aspects. Firstly a discussion of what it means to preserve software via a performance model which considers how a software artefact can be rebuilt from preserved components and can then be seen to be representative of the original software product. Secondly the development of a model of software artefacts, describing the basic components of all software, loosely based on the FRBR model for representing digital artefacts and their history within a library context. Finally, the definition and categorisation of the properties of software artefacts which are required to ensure that the software product has been adequately preserved. These are broken down into a number of categories and related to the concepts defined in the OAIS standard. We also discuss our experience of recording these preservation properties for a number of BADC software products, which arose from a series of case studies conducted to evaluate the software preservation framework, and also briefly describe the SPEQS toolkit, a tool to capture software preservation properties within a software development.

  13. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  14. The software invention cube: A classification scheme for software inventions

    NARCIS (Netherlands)

    Bergstra, J.A.; Klint, P.

    2008-01-01

    The patent system protects inventions. The requirement that a software invention should make ‘a technical contribution’ turns out to be untenable in practice and this raises the question, what constitutes an invention in the realm of software. The authors developed the Software Invention Cube (SWIC)

  15. ATLAS software configuration and build tool optimisation

    Science.gov (United States)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    multi-core computing resources utilisation, and considerably improved software developer and user experience.

  16. Improving Software Citation and Credit

    CERN Document Server

    Allen, Alice; DuPrie, Kimberly; Mink, Jessica; Nemiroff, Robert; Robitaille, Thomas; Shamir, Lior; Shortridge, Keith; Taylor, Mark; Teuben, Peter; Wallin, John

    2015-01-01

    The past year has seen movement on several fronts for improving software citation, including the Center for Open Science's Transparency and Openness Promotion (TOP) Guidelines, the Software Publishing Special Interest Group that was started at January's AAS meeting in Seattle at the request of that organization's Working Group on Astronomical Software, a Sloan-sponsored meeting at GitHub in San Francisco to begin work on a cohesive research software citation-enabling platform, the work of Force11 to "transform and improve" research communication, and WSSSPE's ongoing efforts that include software publication, citation, credit, and sustainability. Brief reports on these efforts were shared at the BoF, after which participants discussed ideas for improving software citation, generating a list of recommendations to the community of software authors, journal publishers, ADS, and research authors. The discussion, recommendations, and feedback will help form recommendations for software citation to those publishers...

  17. Turning software into a service.

    OpenAIRE

    Turner, M; Budgen, D.; Brereton, P.

    2003-01-01

    The software as a service model composes services dynamically, as needed, by binding several lower-level services--thus overcoming many limitations that constrain traditional software use, deployment and evolution.

  18. ATF beam image monitor software

    International Nuclear Information System (INIS)

We report on software for beam image analysis at ATF. We developed the image analysis software on a Linux computer. It acquires image data from an analog video camera and from an IEEE 1394 digital camera. (author)

  19. Clustering Methodologies for Software Engineering

    Directory of Open Access Journals (Sweden)

    Mark Shtern

    2012-01-01

The size and complexity of industrial strength software systems are constantly increasing. This means that the task of managing a large software project is becoming even more challenging, especially in light of high turnover of experienced personnel. Software clustering approaches can help with the task of understanding large, complex software systems by automatically decomposing them into smaller, easier-to-manage subsystems. The main objective of this paper is to identify important research directions in the area of software clustering that require further attention in order to develop more effective and efficient clustering methodologies for software engineering. To that end, we first present the state of the art in software clustering research. We discuss the clustering methods that have received the most attention from the research community and outline their strengths and weaknesses. Our paper describes each phase of a clustering algorithm separately. We also present the most important approaches for evaluating the effectiveness of software clustering.
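
The decomposition step described above can be made concrete with a toy example (a generic similarity-threshold sketch, not any specific algorithm from the surveyed literature): modules whose dependency sets overlap strongly, measured here by Jaccard similarity, are grouped into the same subsystem.

```python
def jaccard(a: set, b: set) -> float:
    """Similarity of two dependency sets: |intersection| / |union|."""
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster_modules(deps: dict, threshold: float = 0.5) -> list:
    """Single-pass threshold clustering: each module joins the first
    cluster whose representative (first member) is similar enough,
    otherwise it starts a new cluster."""
    clusters = []  # list of lists of module names
    for name, d in deps.items():
        for cluster in clusters:
            if jaccard(d, deps[cluster[0]]) >= threshold:
                cluster.append(name)
                break
        else:
            clusters.append([name])
    return clusters

# Hypothetical modules and the names they depend on.
deps = {
    "ui_login":   {"auth", "http"},
    "ui_profile": {"auth", "http", "render"},
    "db_reader":  {"sql", "io"},
    "db_writer":  {"sql", "io", "log"},
}
print(cluster_modules(deps))
# [['ui_login', 'ui_profile'], ['db_reader', 'db_writer']]
```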

  20. Software Engineering for Human Spaceflight

    Science.gov (United States)

    Fredrickson, Steven E.

    2014-01-01

    The Spacecraft Software Engineering Branch of NASA Johnson Space Center (JSC) provides world-class products, leadership, and technical expertise in software engineering, processes, technology, and systems management for human spaceflight. The branch contributes to major NASA programs (e.g. ISS, MPCV/Orion) with in-house software development and prime contractor oversight, and maintains the JSC Engineering Directorate CMMI rating for flight software development. Software engineering teams work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements. They seek to infuse automation and autonomy into missions, and apply new technologies to flight processor and computational architectures. This presentation will provide an overview of key software-related projects, software methodologies and tools, and technology pursuits of interest to the JSC Spacecraft Software Engineering Branch.

  1. The Software Management Environment (SME)

    Science.gov (United States)

    Valett, Jon D.; Decker, William; Buell, John

    1988-01-01

    The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.

  2. Visual assessment of software evolution

    OpenAIRE

    Voinea, Lucian; Lukkien, Johan; Telea, Alexandru

    2007-01-01

    Configuration management tools have become well and widely accepted by the software industry. Software Configuration Management (SCM) systems hold minute information about the entire evolution of complex software systems and thus represent a good source for process accounting and auditing. However, it is still difficult to use the entire spectrum of information such tools maintain. Currently, significant effort is being done in the direction of mining this kind of software repositories for ex...

  3. GNSS Software Receiver for UAVs

    DEFF Research Database (Denmark)

    Olesen, Daniel Madelung; Jakobsen, Jakob; von Benzon, Hans-Henrik;

    2016-01-01

This paper describes the current activities of GPS/GNSS software receiver development at DTU Space. GNSS software receivers have received a great deal of attention in the last two decades and numerous implementations have already been presented. DTU Space has just recently started development of our own GNSS software receiver targeted at mini UAV applications, and we will in this paper present our current progress and briefly discuss the benefits of software receivers in relation to our research interests....

  4. Next generation software process improvement

    OpenAIRE

    Turnas, Daniel

    2003-01-01

    Approved for public release; distribution is unlimited Software is often developed under a process that can at best be described as ad hoc. While it is possible to develop quality software under an ad hoc process, formal processes can be developed to help increase the overall quality of the software under development. The application of these processes allows for an organization to mature. The software maturity level, and process improvement, of an organization can be measured with the Cap...

  5. Software quality and agile methods

    OpenAIRE

    Huo, Ming; Verner, June; Zhu, Liming; Ali Babar, Muhammad

    2004-01-01

Agile methods may produce software faster but we also need to know how they meet our quality requirements. In this paper we compare the waterfall model with agile processes to show how agile methods achieve software quality under time pressure and in an unstable requirements environment, i.e. we analyze agile software quality assurance. We present a detailed waterfall model showing its software quality support processes. We then show the quality pra...

  6. TESTING FOR OBJECT ORIENTED SOFTWARE

    Directory of Open Access Journals (Sweden)

    Jitendra S. Kushwah

    2011-02-01

This paper deals with the design and development of an automated testing tool for object-oriented software. By an automated testing tool, we mean a tool that automates a part of the testing process. It can include one or more of the following processes: test strategy generation, test case generation, test case execution, test data generation, and reporting and logging of results. By object-oriented software we mean software designed using an OO approach and implemented using an OO language. Testing of OO software is different from testing software created using procedural languages, and several new challenges are posed. In the past, most methods for testing OO software were simple extensions of existing methods for conventional software; however, they have been shown to be not very appropriate, and new techniques have been developed. This thesis work has mainly focused on testing design specifications for OO software. As described later, there is a lack of specification-based testing tools for OO software. An advantage of testing software specifications as compared to program code is that specifications are generally correct whereas code is flawed. Moreover, with software engineering principles firmly established in the industry, most software developed nowadays follows all the steps of the Software Development Life Cycle (SDLC). For this work, UML specifications created in Rational Rose are taken. UML has become the de-facto standard for analysis and design of OO software. Testing is conducted at three levels: unit, integration and system. At the system level there is no difference between the testing techniques used for OO software and other software created using a procedural language, and hence conventional techniques can be used. This tool provides features for testing at the unit (class) level as well as the integration level. Further, a maintenance-level component has also been incorporated.
Results of applying this tool to sample Rational Rose files have

  7. Gammasphere software development. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information.

  8. Developing energy-aware software

    OpenAIRE

te Brinke, Steven

    2015-01-01

    Awareness of environmental sustainability, together with an increasing use of software, makes optimization of software energy consumption evermore important. Energy is one of many resources that is managed by software, and reducing energy consumption cannot be considered without taking into account the trade-offs with other resources and services. Optimization techniques, implemented in software, can lead to substantial reduction of resource consumption, within both the computer system and th...

  9. Software engineering a practitioner's approach

    CERN Document Server

    Pressman, Roger S

    1997-01-01

    This indispensable guide to software engineering exploration enables practitioners to navigate the ins and outs of this rapidly changing field. Pressman's fully revised and updated Fourth Edition provides in-depth coverage of every important management and technical topic in software engineering. Moreover, readers will find the inclusion of the hottest developments in the field such as: formal methods and cleanroom software engineering, business process reengineering, and software reengineering.

  10. Design Principles for Interactive Software

    DEFF Research Database (Denmark)

    The book addresses the crucial intersection of human-computer interaction (HCI) and software engineering by asking both what users require from interactive systems and what developers need to produce well-engineered software. Needs are expressed as...

  11. Research on Software-Cell-Based Software System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The aim of research on software architecture is to improve the quality attributes of software systems, such as security, reliability, maintainability, testability, reassembility and evolvability. However, it is hard for a single running system to achieve all these goals. In this paper, the software-cell is introduced as the basic unit throughout the development process. It is then further advanced that a robust, safe and high-quality software system is composed of a running system and four supportive systems. This paper especially discusses the structure of the software-cell, the construction of the five systems, and the relations between them.

  12. Calculation Software versus Illustration Software for Teaching Statistics

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl; Boyle, Robin G.

    1999-01-01

    As personal computers have become more and more powerful, so have the software packages available to us for teaching statistics. This paper investigates what software packages are currently being used by progressive statistics instructors at university level, examines some of the deficiencies...... presented at the conference, and the experiences of the authors, give rise to certain recommendations for instructors and software developers. The main conclusions are that both calculation software and illustration software are needed in the teaching and learning process, and that no computing skills...

  13. IT & C Projects Duration Assessment Based on Audit and Software Reengineering

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Full Text Available This paper analyses the effect of applying the core elements of software engineering and reengineering, probabilistic simulations and system development auditing to software development projects. Our main focus is reducing software development project duration. Due to the fast-changing economy, the need for efficiency and productivity is greater than ever. Optimal allocation of resources has proved to be the main element contributing to an increase in efficiency.

  14. Software product lines : Organizational alternatives

    NARCIS (Netherlands)

    Bosch, J

    2001-01-01

    Software product lines enjoy increasingly wide adoption in the software industry. Most authors focus on the technical and process aspects and assume an organizational model consisting of a domain engineering unit and several application engineering units. In our cooperation with several software dev

  15. Visual assessment of software evolution

    NARCIS (Netherlands)

    Voinea, Lucian; Lukkien, Johan; Telea, Alexandru

    2007-01-01

    Configuration management tools have become well and widely accepted by the software industry. Software Configuration Management (SCM) systems hold minute information about the entire evolution of complex software systems and thus represent a good source for process accounting and auditing. However,

  16. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  17. The Ragnarok Software Development Environment

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1999-01-01

    Ragnarok is an experimental software development environment that focuses on enhanced support for managerial activities in large scale software development taking the daily work of the software developer as its point of departure. The main emphasis is support in three areas: management, navigation...... the Ragnarok prototype in a number of projects are outlined....

  18. Free Software and Free Textbooks

    Science.gov (United States)

    Takhteyev, Yuri

    2012-01-01

    Some of the world's best and most sophisticated software is distributed today under "free" or "open source" licenses, which allow the recipients of such software to use, modify, and share it without paying royalties or asking for permissions. If this works for software, could it also work for educational resources, such as books? The economics of…

  19. Software-Design-Analyzer System

    Science.gov (United States)

    Tausworthe, Robert C.

    1991-01-01

    CRISP-90 software-design-analyzer system, update of CRISP-80, is set of computer programs constituting software tool for design and documentation of other software and supporting top-down, hierarchical, modular, structured methodologies for design and programming. Written in Microsoft QuickBasic.

  20. A Learning Software Design Competition.

    Science.gov (United States)

    Hooper, Simon; Hokanson, Brad; Bernhardt, Paul; Johnson, Mark

    2002-01-01

    Explains the University of Minnesota Learning Software Design Competition, focusing on its goals and emphasis on innovation. Describes the review process to evaluate and judge the software, lists the winners, identifies a new class of educational software, and outlines plans for future competitions. (Author/LRW)

  1. Desiderata for Linguistic Software Design

    Science.gov (United States)

    Garretson, Gregory

    2008-01-01

    This article presents a series of guidelines both for researchers in search of software to be used in linguistic analysis and for programmers designing such software. A description of the intended audience and the types of software under consideration and a review of some relevant literature are followed by a discussion of several important…

  2. Software Vulnerability Taxonomy Consolidation

    Energy Technology Data Exchange (ETDEWEB)

    Polepeddi, S

    2004-12-08

    In today's environment, computers and networks are increasingly exposed to a number of software vulnerabilities. Information about these vulnerabilities is collected and disseminated via various large publicly available databases such as BugTraq, OSVDB and ICAT. None of these databases, individually, covers all aspects of a vulnerability, and they lack a standard format among them, making it difficult for end-users to easily compare various vulnerabilities. A central database of vulnerabilities has not been available until today for a number of reasons, such as the non-uniform methods by which current vulnerability database providers receive information, disagreement over which features of a particular vulnerability are important and how best to present them, and the non-utility of the information presented in many databases. The goal of this software vulnerability taxonomy consolidation project is to address the need for a universally accepted vulnerability taxonomy that classifies vulnerabilities in an unambiguous manner. A consolidated vulnerability database (CVDB) was implemented that coalesces and organizes vulnerability data from disparate data sources. Based on the work done in this paper, there is strong evidence that a consolidated taxonomy encompassing and organizing all relevant data can be achieved. However, three primary obstacles remain: the lack of a common "primary key" for cross-referencing, unstructured and free-form descriptions of necessary vulnerability data, and the lack of data on all aspects of a vulnerability. This work has only considered data that can be unambiguously extracted from various data sources by straightforward parsers. It is felt that even with the use of more advanced information-mining tools, which can wade through the sea of unstructured vulnerability data, this current integration methodology would still provide repeatable, unambiguous, and exhaustive results. Though the goal of coalescing all available data
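The consolidation step the abstract describes can be sketched as follows. The record layouts and field maps below are invented; the real feeds (BugTraq, OSVDB, ICAT) each had their own, richer schemas, and, as the abstract notes, a shared primary key often does not exist in practice, so the common `vuln_id` here is an assumption.

```python
def normalise(record, field_map):
    """Rename source-specific fields to the consolidated schema."""
    return {field_map.get(k, k): v for k, v in record.items()}

def consolidate(sources):
    """Merge records from several sources under one common key."""
    merged = {}
    for records, field_map in sources:
        for rec in records:
            rec = normalise(rec, field_map)
            # assumes every source can be mapped onto a shared key,
            # the very obstacle the paper identifies
            merged.setdefault(rec["vuln_id"], {}).update(rec)
    return merged

# Two toy sources describing the same vulnerability with different schemas.
source_a = ([{"id": "VULN-1", "desc": "buffer overflow in ftpd"}],
            {"id": "vuln_id", "desc": "description"})
source_b = ([{"ref": "VULN-1", "severity": "high"}],
            {"ref": "vuln_id"})

db = consolidate([source_a, source_b])
```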

  4. Software is a directed multigraph (and so is software process)

    CERN Document Server

    Dabrowski, Robert; Timoszuk, Grzegorz

    2011-01-01

    For a software system, its architecture is typically defined as the fundamental organization of the system incorporated by its components, their relationships to one another and their environment, and the principles governing their design. If contributed to by the artifacts corresponding to the engineering processes that govern the system's evolution, the definition gets naturally extended into the architecture of software and software process. Obviously, as long as there were no software systems, managing their architecture was no problem at all; when there were only small systems, managing their architecture became a mild problem; and now we have gigantic software systems, and managing their architecture has become an equally gigantic problem (to paraphrase Edsger Dijkstra). In this paper we propose a simple, yet we believe effective, model for organizing the architecture of software systems. First of all we postulate that only a holistic approach that supports continuous integration and verification for all softwar...

  5. Secured Reconfigurable Software Defined Radio using OTA software download

    Directory of Open Access Journals (Sweden)

    Dr.V.Jeyalakshmi

    2012-01-01

    Full Text Available Dynamic reconfiguration of the lower layers of the protocol stacks used in communication terminals is key to the development of future multimode software radio. Together with the use of software downloading, future terminals will become a platform to support the deployment of as-yet unspecified services and applications. Today software radio is viewed more as a technology enabling the reconfiguration of terminals at all stages of design and production. This paper proposes the concepts of software radio reconfiguration by software downloading on 3G and possible future 4G standards. The work discusses the main challenges of Software Defined Radio (SDR), including over-the-air software download, installation, execution and automatic reconfiguration.

  6. The software factory: A fourth generation software engineering environment

    Energy Technology Data Exchange (ETDEWEB)

    Evans, M.W.

    1989-01-01

    The software-development process and its management are examined in a text intended for engineering managers and students of computer science. A unified concept based on the principle that software design is an engineering science rather than an art is applied, and a software engineering environment (SEE) analogous to an industrial plant is proposed. Chapters are devoted to the classical software environment, the history of software engineering, the evolution of the SEE, the fourth-generation SEE, the engineering process, software-data relationships, the SEE data base, data control in the SEE, software life cycles, information-system product assurance, business management and control, and automating and adapting the SEE. 143 refs.

  7. Software Must Move! A Description of the Software Assembly Line

    CERN Document Server

    McGowan, Martin J

    2010-01-01

    This paper describes a set of tools for automating and controlling the development and maintenance of software systems. The mental model is a software assembly line. Program design and construction take place at individual programmer workstations. Integration of individual software components takes place at subsequent stations on the assembly line. Software is moved automatically along the assembly line toward final packaging. Software under construction or maintenance is divided into packages. Each package of software is composed of a recipe and ingredients. Some new terms are introduced to describe the ingredients. The recipe specifies how ingredients are transformed into products. The benefits of the Software Assembly Line for development, maintenance, and management of large-scale computer systems are explained.
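The package model in this abstract (ingredients transformed by a recipe into a product) can be sketched as below; the file names and the "recipe" are invented for illustration, not taken from the paper.

```python
def build(package):
    """Apply the package's recipe to its ingredients, yielding the product."""
    return package["recipe"](package["ingredients"])

# A toy package: two source files and a recipe that combines them.
package = {
    "ingredients": ["parser.c", "lexer.c"],
    "recipe": lambda srcs: "compiler built from " + " + ".join(srcs),
}

product = build(package)
```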

  8. Software Design Improvements. Part 1; Software Benefits and Limitations

    Science.gov (United States)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The question is: how can this hardware and software be made more reliable? Also, how can software quality be improved? What methodology needs to be provided on large and small software products to improve the design and how can software be verified?

  9. BLTC control system software

    Energy Technology Data Exchange (ETDEWEB)

    Logan, J.B., Fluor Daniel Hanford

    1997-02-10

    This is a direct revision to Rev. 0 of the BLTC Control System Software. The entire document is being revised and released as HNF-SD-FF-CSWD-025, Rev 1. The changes incorporated by this revision include the addition of a feature to automate the sodium drain when removing assemblies from sodium-wetted facilities. Other changes eliminate locked-in alarms during cold operation and improve the function of the Oxygen Analyzer. See FCN-620498 for further details regarding these changes. Note the change in the document number prefix, in accordance with HNF-MD-003.

  10. Office software Individual coaching

    CERN Multimedia

    HR Department

    2010-01-01

    If one or several particular topics cause you sleepless nights, you can get help from our trainer, who will come to your workplace for a multiple of 1-hour slots. All fields in which our trainer can help are detailed in the course description in our training catalogue (Microsoft Office software, Adobe applications, i-applications etc.) Discover these new courses in our catalogue! http://cta.cern.ch/cta2/f?p=110:9 Technical Training Service Technical.Training@cern.ch Tel 74924

  11. Self-organising software

    CERN Document Server

    Serugendo, Giovanna Di Marzo; Karageorgos, Anthony

    2011-01-01

    Self-organisation, self-regulation, self-repair and self-maintenance are promising conceptual approaches for dealing with complex distributed interactive software and information-handling systems. Self-organising applications dynamically change their functionality and structure without direct user intervention, responding to changes in requirements and the environment. This is the first book to offer an integrated view of self-organisation technologies applied to distributed systems, particularly focusing on multiagent systems. The editors developed this integrated book with three aims: to exp

  12. FASTBUS software workshop

    International Nuclear Information System (INIS)

    FASTBUS is a standard for modular high-speed data acquisition, data processing and control, developed for use in high-energy physics experiments incorporating different types of computers and microprocessors. This Workshop brought together users from different laboratories for a review of current software activities, using the standard both in experiments and for test equipment. There are also papers on interfacing and the present state of systems being developed for use in future LEP experiments. Also included is a discussion on the proposed revision of FASTBUS Standard Routines. (orig.)

  13. Entretian Model for Software Maintenance

    Directory of Open Access Journals (Sweden)

    Priya K Betala

    2013-10-01

    Full Text Available Maintenance refers to the act of modifying software after it is put into use in order to maintain its usability [1]. In other words, software maintenance is the process of providing services to customers after delivery of the software. Despite the fact that maintaining software is very challenging, it is the most important routine that must be carried out in the development cycle. If software is not maintained efficiently, it may lead to the death of the software. Maintenance of software may be carried out in two ways: the first is called "in-house maintenance" and the second "transition maintenance". The latter faces more drastic challenges than the former, as one team may not provide complete source code to the other, leading to unstructured code and a lack of appropriate technique and knowledge about the functioning of the current software. There are a few aspects of software maintenance that set it apart from the other phases. Software maintenance cost comprises more than half of the total software development cost. Also, without software maintenance it is impossible to fix problems within the product after its release, and many disasters can happen because of immature software. Recognising the importance of software maintenance, this paper proposes a model called the "Entretian Model" (Entretian, a French word meaning Maintenance), which consists of six basic steps to follow while maintaining a software system. This model overcomes certain misconceptions about the maintenance phase and is highly beneficial to the Maintenance Support Team (MST) in handling maintenance activities systematically and efficiently. By employing the proposed model, the MST is able to overcome the technical and managerial issues faced earlier in the maintenance phase. The advantage of using the Entretian Model is best illustrated in this paper with the help of an ERP package.

  14. K-core inflation

    OpenAIRE

    Wolman, Alexander L.

    2011-01-01

    K-core inflation is a new class of underlying inflation measures. The two most popular measures of underlying inflation are core inflation and trimmed mean inflation. The former removes fixed categories of goods and services (food and energy) from the inflation calculation, and the latter removes fixed percentiles of the weighted distribution of price changes. In contrast, k-core inflation specifies a size of relative price change to be removed from the inflation calculation. Thus, the catego...
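One plausible reading of the three measures described in this abstract, with invented figures, is sketched below: headline inflation is a weighted mean of all price changes, core inflation drops fixed categories (food and energy), and k-core drops any category whose price change exceeds size k.

```python
def weighted_mean(changes, weights):
    """Weighted average of price changes."""
    return sum(c * w for c, w in zip(changes, weights)) / sum(weights)

# category: (price change in %, expenditure weight); figures are invented
basket = {
    "food":    (4.0, 0.15),
    "energy":  (9.0, 0.10),
    "rent":    (2.0, 0.40),
    "apparel": (1.0, 0.15),
    "medical": (3.0, 0.20),
}

def measure(keep):
    """Inflation over the categories selected by the predicate `keep`."""
    kept = [v for name, v in basket.items() if keep(name, v)]
    return weighted_mean([c for c, _ in kept], [w for _, w in kept])

headline = measure(lambda name, v: True)
# Core inflation: remove fixed categories (food and energy).
core = measure(lambda name, v: name not in ("food", "energy"))
# k-core with k = 5: remove any category whose price change exceeds 5% in size.
k_core = measure(lambda name, v: abs(v[0]) <= 5.0)
```

With these numbers the three measures differ: core drops food (4%) by fiat, while k-core keeps it because it is below the 5% cutoff, so the two "underlying" measures bracket different subsets of the basket.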

  15. The core paradox.

    Science.gov (United States)

    Kennedy, G. C.; Higgins, G. H.

    1973-01-01

    Rebuttal of suggestions from various critics attempting to provide an escape from the seeming paradox originated by Higgins and Kennedy's (1971) proposed possibility that the liquid in the outer core was thermally stably stratified and that this stratification might prove a powerful inhibitor to circulation of the outer core fluid of the kind postulated for the generation of the earth's magnetic field. These suggestions are examined and shown to provide no reasonable escape from the core paradox.

  16. SIMD studies in the LHCb reconstruction software

    Science.gov (United States)

    Cámpora Pérez, Daniel Hugo; Couturier, Ben

    2015-12-01

    During the data taking process in the LHC at CERN, millions of collisions are recorded every second by the LHCb Detector. The LHCb Online computing farm, counting around 15000 cores, is dedicated to the reconstruction of the events in real-time, in order to filter those with interesting Physics. The ones kept are later analysed Offline in a more precise fashion on the Grid. This imposes very stringent requirements on the reconstruction software, which has to be as efficient as possible. Modern CPUs support so-called vector-extensions, which extend their Instruction Sets, allowing for concurrent execution across functional units. Several libraries expose the Single Instruction Multiple Data programming paradigm to issue these instructions. The use of vectorisation in our codebase can provide performance boosts, leading ultimately to Physics reconstruction enhancements. In this paper, we present vectorisation studies of significant reconstruction algorithms. A variety of vectorisation libraries are analysed and compared in terms of design, maintainability and performance. We also present the steps taken to systematically measure the performance of the released software, to ensure the consistency of the run-time of the vectorised software.
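The SIMD idea studied in this paper can be illustrated conceptually. The LHCb code itself is C++ using real vector extensions; this pure-Python sketch only mimics lane-wise execution by processing fixed-width chunks plus a scalar tail, which is the shape hand-vectorised loops typically take.

```python
LANES = 4  # pretend vector width: 4 floats per "register"

def scale_scalar(xs, factor):
    """One element per step (scalar loop)."""
    return [x * factor for x in xs]

def scale_vectorised(xs, factor):
    """LANES elements per step; a tail loop handles the remainder,
    just as hand-vectorised code must."""
    out = []
    n = len(xs) - len(xs) % LANES
    for i in range(0, n, LANES):
        chunk = xs[i:i + LANES]                 # "load" a vector
        out.extend(c * factor for c in chunk)   # one "vector" multiply
    out.extend(x * factor for x in xs[n:])      # scalar tail
    return out

data = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
```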

  17. ATLAS software stack on ARM64

    CERN Document Server

    Smith, Joshua Wyatt; The ATLAS collaboration

    2016-01-01

    The ATLAS experiment explores new hardware and software platforms that, in the future, may be more suited to its data intensive workloads. One such alternative hardware platform is the ARM architecture, which is designed to be extremely power efficient and is found in most smartphones and tablets. CERN openlab recently installed a small cluster of ARM 64-bit evaluation prototype servers. Each server is based on a single-socket ARM 64-bit system on a chip, with 32 Cortex-A57 cores. In total, each server has 128 GB RAM connected with four fast memory channels. This paper reports on the port of the ATLAS software stack onto these new prototype ARM64 servers. This included building the "external" packages that the ATLAS software relies on. Patches were needed to introduce this new architecture into the build as well as patches that correct for platform specific code that caused failures on non-x86 architectures. These patches were applied such that porting to further platforms will need no or only very little adj...

  18. Main: -300CORE [PLACE

    Lifescience Database Archive (English)

    Full Text Available -300CORE S000001 10-May-2006 (last modified) kehi TGTAAAG core motif in -300 elements of alpha-zein genes of maize; -300 element core; prolamin box by Vicente-Carbajosa et al. (Proc Natl Acad...; a DNA-binding protein of the DOF class of transcription factors; zein; core motif; maize; -300 element; promoter; prolamin-box; P-box; seed; endosperm; maize (Zea mays); wheat (Triticum aestivum); barley (Hordeum vulgare); tobacco (Nicotiana tabacum) TGTAAAG

  19. Core Research Center

    Science.gov (United States)

    Hicks, Joshua; Adrian, Betty

    2009-01-01

    The Core Research Center (CRC) of the U.S. Geological Survey (USGS), located at the Denver Federal Center in Lakewood, Colo., currently houses rock core from more than 8,500 boreholes representing about 1.7 million feet of rock core from 35 States and cuttings from 54,000 boreholes representing 238 million feet of drilling in 28 States. Although most of the boreholes are located in the Rocky Mountain region, the geologic and geographic diversity of samples have helped the CRC become one of the largest and most heavily used public core repositories in the United States. Many of the boreholes represented in the collection were drilled for energy and mineral exploration, and many of the cores and cuttings were donated to the CRC by private companies in these industries. Some cores and cuttings were collected by the USGS along with other government agencies. Approximately one-half of the cores are slabbed and photographed. More than 18,000 thin sections and a large volume of analytical data from the cores and cuttings are also accessible. A growing collection of digital images of the cores is also becoming available on the CRC Web site at http://geology.cr.usgs.gov/crc/.

  20. Characteristics for Software Optimization Projects

    Directory of Open Access Journals (Sweden)

    Iulian NITESCU

    2008-01-01

    Full Text Available The increasing complexity of software systems imposes the identification and implementation of methods and techniques to manage it. The software optimization project is one way in which software complexity is controlled. The software optimization project must also address the organization's need to earn a profit. The software optimization project is an integral part of the application life cycle because it shares the same resources, depends on other stages, and influences subsequent phases. The optimization project has some particularities because it works on a finished product, focusing on its quality. The process is quality- and performance-oriented, and it assumes that the product life cycle is almost finished.

  1. Robust Software Architecture for Robots

    Science.gov (United States)

    Aghazanian, Hrand; Baumgartner, Eric; Garrett, Michael

    2009-01-01

    Robust Real-Time Reconfigurable Robotics Software Architecture (R4SA) is the name of both a software architecture and software that embodies the architecture. The architecture was conceived in the spirit of current practice in designing modular, hard real-time aerospace systems. The architecture facilitates the integration of new sensory, motor, and control software modules into the software of a given robotic system. R4SA was developed for initial application aboard exploratory mobile robots on Mars, but is adaptable to terrestrial robotic systems, real-time embedded computing systems in general, and robotic toys.

  2. Software-Defined Cluster

    Institute of Scientific and Technical Information of China (English)

    聂华; 杨晓君; 刘淘英

    2015-01-01

    The cluster architecture has played an important role in high-end computing for the past 20 years. With the advent of Internet services, big data, and cloud computing, traditional clusters face three challenges: 1) providing flexible system balance among computing, memory, and I/O capabilities; 2) reducing resource pooling overheads; and 3) addressing low performance-power efficiency. This position paper proposes a software-defined cluster (SDC) architecture to deal with these challenges. The SDC architecture inherits two features of the traditional cluster: its architecture is multicomputer and it has loosely-coupled interconnect. SDC provides two new mechanisms: global I/O space (GIO) and hardware-supported native access (HNA) to remote devices. Application software can define a virtual cluster best suited to its needs from resource pools provided by a physical cluster, and traditional cluster ecosystems need no modification. We also discuss a prototype design and implementation of a 32-processor cloud server utilizing the SDC architecture.

  3. Software and Computing News

    CERN Multimedia

    Barberis, D

    The last several months have been very busy ones for the ATLAS software developers. They've been trying to cope with the competing demands of multiple software stress tests and testbeds. These include Data Challenge Two (DC2), the Combined Testbeam (CTB), preparations for the Physics Workshop to be held in Rome in June 2005, and other testbeds, primarily one for the High-Level Trigger. Data Challenge 2 (DC2) The primary goal of this was to validate the computing model and to provide a test of simulating a day's worth of ATLAS data (10 million events) and of fully processing it and making it available to the physicists within 10 days (i.e. a 10% scale test). DC2 consists of three parts - the generation, simulation, and mixing of a representative sample of physics events with background events; the reconstruction of the mixed samples with initial classification into the different physics signatures; and the distribution of the data to multiple remote sites (Tier-1 centers) for analysis by physicists. Figu...

  4. Computer software review procedures

    International Nuclear Information System (INIS)

    This article reviews the procedures used to review software written for computer-based instrumentation and control functions in nuclear facilities. Computer-based control systems are becoming much more prevalent in such installations, in addition to being retrofitted into existing systems. Currently, the Nuclear Regulatory Commission uses Regulatory Guide 1.152, "Criteria for Programmable Digital Computer System Software in Safety-Related Systems of Nuclear Power Plants," and ANSI/IEEE-ANS-7-4.3.2-1982, "Application Criteria for Programmable Digital Computer Systems in Safety Systems of Nuclear Power Generating Stations," for guidance when performing reviews of digital systems. There is great concern about the process of verification and validation of these codes, so when inspections of such systems are done, inspectors examine very closely the processes followed in developing the codes, the errors which were detected, how they were found, and the analysis which went into tracing down the causes behind the errors to ensure such errors are not propagated again in the future.

  5. IMAGE Software Suite

    Science.gov (United States)

    Gallagher, Dennis L.; Rose, M. Franklin (Technical Monitor)

    2000-01-01

    The IMAGE Mission is generating a truly unique set of magnetospheric measurements through a first-of-its-kind complement of remote, global observations. These data are being distributed in the Universal Data Format (UDF), which consists of data, calibration, and documentation. This is an open dataset, available to all by request to the National Space Science Data Center (NSSDC) at NASA Goddard Space Flight Center. Browse data, consisting of summary observations, are also available through the NSSDC in the Common Data Format (CDF), along with graphic representations of the browse data. Access to the browse data can be achieved through the NSSDC CDAWeb services or by use of NSSDC-provided software tools. This presentation documents the software tools, provided by the IMAGE team, for use in viewing and analyzing the UDF telemetry data. Like the IMAGE data, these tools are openly available. What these tools can do, how they can be obtained, and how they are expected to evolve will be discussed.

  6. The ATLAS Simulation Software

    CERN Document Server

    Marshall, Z

    2008-01-01

    We present the status of the ATLAS Simulation Project. Recent detector description improvements have focussed on commissioning layouts, implementation of inert material, and comparisons to the as-built detector. Core Simulation is reviewed with a focus on parameter optimizations, physics list choices, visualization, large-scale production, and validation. A fast simulation is also briefly described, and its performance is evaluated with respect to the full Simulation. Digitization, the last step of the Monte Carlo chain, is described, including developments in pile-up and data overlay.

  7. Quality of the Open Source Software

    OpenAIRE

    Tariq, Muhammad Tahir and Aleem

    2008-01-01

    Quality and security of software are key factors in software development. This thesis deals with the quality of open source software (OSS for short); different questions related to open source and closed source software are discussed in the thesis proposal. Open source software is a process by which we can produce cheap, good-quality software whose source can be reused in the development of other software. Closed source software is more expensive than open source software ...

  8. New ATLAS Software & Computing Organization

    CERN Multimedia

    Barberis, D

    Following the election by the ATLAS Collaboration Board of Dario Barberis (Genoa University/INFN) as Computing Coordinator and David Quarrie (LBNL) as Software Project Leader, it was considered necessary to modify the organization of the ATLAS Software & Computing ("S&C") project. The new organization is based upon the following principles: separation of the responsibilities for computing management from those of software development, with the appointment of a Computing Coordinator and a Software Project Leader who are both members of the Executive Board; hierarchical structure of responsibilities and reporting lines; coordination at all levels between TDAQ, S&C and Physics working groups; integration of the subdetector software development groups with the central S&C organization. A schematic diagram of the new organization can be seen in Fig.1. Figure 1: new ATLAS Software & Computing organization. Two Management Boards will help the Computing Coordinator and the Software Project...

  9. Managing the Software Development Process

    Science.gov (United States)

    Lubelczky, Jeffrey T.; Parra, Amy

    1999-01-01

    The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We also present an overview of the benefits of a mature software engineering environment based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. The depth and breadth of software engineering exceed the scope of this paper; various references are cited with the goal of raising awareness and encouraging further investigation into software engineering and project management practices.

  10. Managing the Software Development Process

    Science.gov (United States)

    Lubelczyk, J.; Parra, A.

    The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We also present an overview of the benefits of a mature software engineering environment based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. The depth and breadth of software engineering exceed the scope of this paper; various references are cited with the goal of raising awareness and encouraging further investigation into software engineering and project management practices.

  11. Software Process in Geant4

    Institute of Scientific and Technical Information of China (English)

    Gabriele Cosmo

    2001-01-01

    Since its earliest years of R&D [1], the GEANT4 simulation toolkit has been developed following software process standards which dictated the overall evolution of the project. The complexity of the software involved, the wide areas of application of the software product, the huge amount of code and Category complexity, and the size and distributed nature of the Collaboration itself are all ingredients which involve and correlate a wide variety of software processes. Although in "production" and available to the public since December 1998, the GEANT4 software product [1] includes Category Domains which are still under active development. Therefore they require different treatment, also in terms of improvement of the development cycle, system testing and user support. This article describes some of the software processes as they are applied in GEANT4 for development, testing and maintenance of the software.

  12. Software libre vs. software propietario: programando nuestro futuro

    Directory of Open Access Journals (Sweden)

    Rafael Gómez Sánchez

    2008-12-01

    Full Text Available This work studies the evolution of two opposed models: proprietary software and free software. While the former is fully established and supported by the traditional computing industry, free software appears as an attractive alternative that promises to correct many of that model's deficiencies. Based on the philosophy of respecting the user's liberties - the freedom to share, improve and use the programs - a growing number of administrations, companies and other users are opting for free software. The interaction between both models and its consequences, as well as the attempts of the software multinationals not to lose market share, are also studied.

  13. Evolution of the ATLAS Software Framework towards Concurrency

    CERN Document Server

    Jones, Roger; The ATLAS collaboration; Leggett, Charles; Wynne, Benjamin

    2015-01-01

    The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from early 2000 and the software and the physics code has been written using a single threaded, serial design. This programming model has increasing difficulty in exploiting the potential of current CPUs, which offer their best performance only through taking full advantage of multiple cores and wide vector registers. Future CPU evolution will intensify this trend, with core counts increasing and memory per core falling. Maximising performance per watt will be a key metric, so all of these cores must be used as efficiently as possible. In order to address the deficiencies of the current framework, ATLAS has embarked upon two projects: first, a practical demonstration of the use of multi-threading in our reconstruction software, using the GaudiHive framework; second, an exercise to gather r...
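    The move from a single-threaded event loop to a framework that keeps many cores busy can be sketched in a few lines. The following is an illustrative Python sketch of inter-event parallelism only, not the Gaudi/GaudiHive API (which is C++ and task-based); all names here are invented:

```python
from concurrent.futures import ThreadPoolExecutor

def reconstruct(event):
    # Stand-in for per-event reconstruction; events are independent,
    # which is what makes inter-event parallelism possible at all.
    return sum(hit * hit for hit in event)

def run_serial(events):
    # The classic serial event loop: one event at a time, one core used.
    return [reconstruct(e) for e in events]

def run_parallel(events, workers=4):
    # A pool schedules whole events onto worker threads. (For CPU-bound
    # C++ algorithms real cores are exploited; in pure Python the GIL
    # limits the speed-up, so this only illustrates the structure.)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(reconstruct, events))

events = [[1.0, 2.0, 3.0], [4.0, 5.0]] * 4
assert run_parallel(events) == run_serial(events)  # same results, any core count
```

The key design point is the last assertion: a multi-threaded framework must produce results identical to the serial loop, which is why preserving per-event independence matters as core counts rise and memory per core falls.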

  14. Applied software risk management a guide for software project managers

    CERN Document Server

    Pandian, C Ravindranath

    2006-01-01

    Few software projects are completed on time, on budget, and to their original specifications. Focusing on what practitioners need to know about risk in the pursuit of delivering software projects, Applied Software Risk Management: A Guide for Software Project Managers covers key components of the risk management process and the software development process, as well as best practices for software risk identification, risk planning, and risk analysis. Written in a clear and concise manner, this resource presents concepts and practical insight into managing risk. It first covers risk-driven project management, risk management processes, risk attributes, risk identification, and risk analysis. The book continues by examining responses to risk, the tracking and modeling of risks, intelligence gathering, and integrated risk management. It concludes with details on drafting and implementing procedures. A diary of a risk manager provides insight into implementing risk management processes. Bringing together concepts ...

  15. Mercury's core evolution

    Science.gov (United States)

    Deproost, Marie-Hélène; Rivoldini, Attilio; Van Hoolst, Tim

    2016-10-01

    Remote sensing data of Mercury's surface by MESSENGER indicate that Mercury formed under reducing conditions. As a consequence, silicon is likely the main light element in the core, together with a possible small fraction of sulfur. Compared to sulfur, which hardly partitions into solid iron at Mercury's core conditions and strongly decreases the melting temperature, silicon partitions almost equally well between solid and liquid iron and is not very effective at reducing the melting temperature of iron. Silicon as the major light-element constituent instead of sulfur therefore implies a significantly higher core liquidus temperature and a decrease in the vigor of compositional convection generated by the release of light elements upon inner core formation. Due to the immiscibility of liquid Fe-Si-S at low pressure (below 15 GPa), the core might also not be homogeneous, consisting instead of an inner S-poor Fe-Si core below a thinner Si-poor Fe-S layer. Here, we study the consequences of a silicon-rich core and the effect of the blanketing Fe-S layer on the thermal evolution of Mercury's core and on the generation of a magnetic field.

  16. Ice Core Investigations

    Science.gov (United States)

    Krim, Jessica; Brody, Michael

    2008-01-01

    What can glaciers tell us about volcanoes and atmospheric conditions? How does this information relate to our understanding of climate change? Ice Core Investigations is an original and innovative activity that explores these types of questions. It brings together popular science issues such as research, climate change, ice core drilling, and air…

  17. Neutron beam tomography software

    International Nuclear Information System (INIS)

    When a sample is traversed by a neutron beam, inhomogeneities in the sample will cause deflections, and the deflections will permit conclusions to be drawn concerning the location and size of the inhomogeneities. The associated computation is similar to problems in tomography, analogous to X-ray tomography though significantly different in detail. We do not have any point-sample information, but only mean values over short line segments. Since each mean value is derived from a separate neutron counter, the quantity of available data has to be modest; also, since each datum is an integral, its geometric precision is inferior to that of X-ray data. Our software is designed to cope with these difficulties. (orig.)
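    The reconstruction problem this record describes - recovering an image from mean values over short line segments rather than point samples - is a system of linear equations, and the Kaczmarz (ART) iteration is a standard way to solve it when the data are sparse. The following is a minimal sketch, not the original software, and it assumes unit weights per crossed cell (a real code would weight each cell by the intersection length):

```python
def kaczmarz(rays, values, n_cells, sweeps=100):
    """Algebraic reconstruction (Kaczmarz/ART) from line-segment means.

    rays:    list of index lists; rays[i] holds the cells crossed by ray i
             (unit weights for simplicity).
    values:  measured MEAN value along each ray, so the constraint is
             sum(x[j] for j in cells) == value * len(cells).
    n_cells: number of unknown image cells.
    """
    x = [0.0] * n_cells
    for _ in range(sweeps):
        for cells, v in zip(rays, values):
            residual = v * len(cells) - sum(x[j] for j in cells)
            step = residual / len(cells)  # orthogonal projection onto the constraint
            for j in cells:
                x[j] += step
    return x

# Two cells probed by three counters: cell 0 alone, cell 1 alone,
# and one ray crossing both (mean of 1.0 and 3.0 is 2.0).
image = kaczmarz([[0], [1], [0, 1]], [1.0, 3.0, 2.0], 2)
```

For this consistent toy system the iteration recovers the cell values 1.0 and 3.0 exactly; with noisy, geometrically imprecise counter data of the kind the record mentions, the same iteration converges to a compromise solution instead.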

  18. Software and Network Engineering

    CERN Document Server

    2012-01-01

    The series "Studies in Computational Intelligence" (SCI) publishes new developments and advances in the various areas of computational intelligence – quickly and with a high quality. The intent is to cover the theory, applications, and design methods of computational intelligence, as embedded in the fields of engineering, computer science, physics and life science, as well as the methodologies behind them. The series contains monographs, lecture notes and edited volumes in computational intelligence spanning the areas of neural networks, connectionist systems, genetic algorithms, evolutionary computation, artificial intelligence, cellular automata, self-organizing systems, soft computing, fuzzy systems, and hybrid intelligent systems. Critical to both contributors and readers are the short publication time and world-wide distribution - this permits a rapid and broad dissemination of research results.   The purpose of the first ACIS International Symposium on Software and Network Engineering held on Decembe...

  19. Open Source Software Acquisition

    DEFF Research Database (Denmark)

    Holck, Jesper; Kühn Pedersen, Mogens; Holm Larsen, Michael

    2005-01-01

    Lately we have seen a growing interest from both public and private organisations to adopt Open Source Software (OSS), not only for a few, specific applications but also on a more general level throughout the organisation. As a consequence, the organisations' decisions on adoption of OSS are becoming...... the decision criteria we have found. Our results indicate that for large-scale adoption of OSS, focus will be on architectural considerations: enterprise-wide architectures will at first be a barrier, but in the long term OSS's support of open standards can be a major enabler for OSS adoption. In contrast...... OSS can be adopted in niche-areas, without significantly violating an existing IT-architecture. Keywords: open source, COTS, IT architecture, governance...

  20. BNL multiparticle spectrometer software

    International Nuclear Information System (INIS)

    This paper discusses some solutions to problems common to the design, management and maintenance of a large high energy physics spectrometer software system. The experience of dealing with a large, complex program and the necessity of having the program controlled by various people at different levels of computer experience has led us to design a program control structure of mnemonic and self-explanatory nature. The use of this control language in both on-line and off-line operation of the program will be discussed. The solution of structuring a large program for modularity so that substantial changes to the program can be made easily for a wide variety of high energy physics experiments is discussed. Specialized tools for this type of large program management are also discussed
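    The mnemonic, self-explanatory control structure described above can be illustrated with a toy command dispatcher. This is purely hypothetical - the mnemonics and the interface are invented, and the original spectrometer control language is not reproduced here - but it shows the design idea: short commands bound to handlers plus help text, so the same control structure serves users at different levels of computer experience in both on-line and off-line running:

```python
class ControlInterpreter:
    """Toy mnemonic command interpreter (invented mnemonics, for
    illustration only). Each command is a short, self-explanatory
    mnemonic bound to a handler and a help string."""

    def __init__(self):
        self._commands = {}

    def register(self, mnemonic, handler, help_text):
        self._commands[mnemonic.upper()] = (handler, help_text)

    def execute(self, line):
        # Mnemonics are case-insensitive; remaining tokens are arguments.
        name, *args = line.split()
        handler, _help = self._commands[name.upper()]
        return handler(*args)

    def help(self, mnemonic):
        return self._commands[mnemonic.upper()][1]

ctl = ControlInterpreter()
ctl.register("RUNB", lambda n: f"run {n} started", "RUNB <n> - begin run n")
ctl.register("STAT", lambda: "status: idle", "STAT - print spectrometer status")
started = ctl.execute("runb 42")  # -> "run 42 started"
```

Because every command is registered through one narrow interface, new experiment-specific commands can be added without touching the interpreter - the same modularity argument the paper makes for restructuring a large physics program.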

  1. Software developments for gammasphere

    Energy Technology Data Exchange (ETDEWEB)

    Lauritsen, T.; Ahmad, I.; Carpenter, M.P. [and others

    1995-08-01

    This year marked the year when data acquisition development for Gammasphere evolved from planning to accomplishment, both in hardware and software. Two VME crates now contain about 10 crate-processors which are used to handle the data from VXI processors - which in turn collect the data from germanium and BGO detectors in the array. The signals from the detectors are processed and digitized in custom-built electronics boards. The processing power in the VME crates is used to digitally filter the data before they are written to tape. The goal is to have highly processed data flowing to tape, eliminating the off-line filtering and manipulation of data that was standard procedure in earlier experiments.
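    Digitally filtering detector data before it flows to tape can be as simple as a running average over the digitized samples. The sketch below is illustrative only - the actual Gammasphere processing used custom filter algorithms in the VME crate processors, not this code:

```python
def moving_average(samples, window=4):
    """Running-average digital filter over digitized detector samples
    (illustrative stand-in for the custom on-line filters)."""
    out = []
    acc = 0.0
    for i, s in enumerate(samples):
        acc += s
        if i >= window:                # drop the sample leaving the window
            acc -= samples[i - window]
        if i >= window - 1:            # window is full: emit one filtered value
            out.append(acc / window)
    return out

# A noiseless step pulse: the filter ramps smoothly across the edge.
filtered = moving_average([0, 0, 4, 4, 4, 4], window=4)  # -> [2.0, 3.0, 4.0]
```

Doing this kind of processing in the crate processors is exactly what lets "highly processed data" flow to tape, eliminating the off-line filtering pass that earlier experiments needed.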

  2. Improving Agile Software Practice

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte

    2006-01-01

    Software process improvement in small and agile organizations is often problematic, but achieving good SPI-assessments can still be necessary to stay in the market or to meet demands of multinational owners. The traditional norm-driven, centralized and control-centered improvement approaches has...... agile and innovative, while striving for certification according to traditional rational SPI norms? This paper reports from an action research project in a small agile Danish firm (Techsoft) conducting an improvement initiative. The firm had just been met by the demand for a CMMI level-3 certification...... from their new multinational owners. In the project we experimented to reach a less centralized and control-centered SPI approach, trying to meet the agile culture of the firm within diagnosing, improvement planning, process design and evaluation, even though the goal was to comply with the norm. After...

  3. Energy Tracking Software Platform

    Energy Technology Data Exchange (ETDEWEB)

    Ryan Davis; Nathan Bird; Rebecca Birx; Hal Knowles

    2011-04-04

    Acceleration has created an interactive energy tracking and visualization platform that supports decreasing electric, water, and gas usage. Homeowners have access to tools that allow them to gauge their use and track progress toward a smaller energy footprint. Real estate agents have access to consumption data, allowing for sharing a comparison with potential home buyers. Home builders have the opportunity to compare their neighborhood's energy efficiency with competitors. Home energy raters have a tool for gauging the progress of their clients after efficiency changes. And, social groups are able to help encourage members to reduce their energy bills and help their environment. EnergyIT.com is the business umbrella for all energy tracking solutions and is designed to provide information about our energy tracking software and promote sales. CompareAndConserve.com (Gainesville-Green.com) helps homeowners conserve energy through education and competition. ToolsForTenants.com helps renters factor energy usage into their housing decisions.

  4. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...... there? Are there new trends and emerging approaches? What are open issues? Still, we struggle to answer these questions about the current state of SPI and related research. In this article, we present results from an updated systematic mapping study to shed light on the field of SPI, to develop a big...... theories and models on SPI in general. In particular, standard SPI models like CMMI and ISO/IEC 15504 are analyzed, enhanced, and evaluated for applicability in practice, but these standards are also critically discussed, e.g., from the perspective of SPI in small-to-medium-sized companies, which leads...

  5. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter;

    2016-01-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...... there? Are there new emerging approaches? What are open issues? Still, we struggle to answer the question of what the current state of SPI and related research is. We present initial results from a systematic mapping study to shed light on the field of SPI and to draw conclusions for future research...... directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories. In particular, standard SPI models are analyzed and evaluated for applicability...

  6. Software, Software Engineering and Software Engineering Research:Some Unconventional Thoughts

    Institute of Scientific and Technical Information of China (English)

    David Notkin

    2009-01-01

    Software engineering is broadly discussed as falling far short of expectations. Data and examples are used to justify how software itself is often poor, how the engineering of software leaves much to be desired, and how research in software engineering has not made enough progress to help overcome these weaknesses. However, these data and examples are presented and interpreted in ways that are arguably imbalanced. This imbalance, usually taken at face value, may be distracting the field from making significant progress towards improving the effective engineering of software, a goal the entire community shares. Research dichotomies, which tend to pit one approach against another, often subtly hint that there is a best way to engineer software or a best way to perform research on software. This, too, may be distracting the field from important classes of progress.

  7. Mars' core and magnetism.

    Science.gov (United States)

    Stevenson, D J

    2001-07-12

    The detection of strongly magnetized ancient crust on Mars is one of the most surprising outcomes of recent Mars exploration, and provides important insight about the history and nature of the martian core. The iron-rich core probably formed during the hot accretion of Mars approximately 4.5 billion years ago and subsequently cooled at a rate dictated by the overlying mantle. A core dynamo operated much like Earth's current dynamo, but was probably limited in duration to several hundred million years. The early demise of the dynamo could have arisen through a change in the cooling rate of the mantle, or even a switch in convective style that led to mantle heating. Presently, Mars probably has a liquid, conductive outer core and might have a solid inner core like Earth.

  8. Verification of safety critical software

    International Nuclear Information System (INIS)

    To assure the quality of safety-critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high-quality, cost-effective product. 15 refs., 6 figs. (author)

  9. Software Configuration Management Problems and Solutions to Software Variability Management

    DEFF Research Database (Denmark)

    Bendix, Lars Gotfred

    2003-01-01

    These days more and more software is produced as product families. Products that have a lot in common, but all the same vary slightly in one or more aspects. Developing and maintaining these product families is a complex task. Software configuration management (SCM) can, in general, support...... the development and evolution of one single software product and to some degree also supports the concept of variants. It would be interesting to explore to what degree SCM already has solutions to some of the problems of product families and what are the problems where SCM has to invent new techniques to support...... software variability management....

  10. Software As A Service With Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ms. Rashmi A. Akojwar; Ms. Reshma V. Kothari; Mr. Sandip A. Kahate; Ms. Ruchika D. Ganvir

    2012-02-01

    Full Text Available Cloud Computing (CC) is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. The cloud is basically an extension of the object-oriented programming concept of abstraction: it hides the complex working details from the users. What users see is just an interface, which only involves receiving the inputs and providing the outputs; the process involved in generating the outputs is completely invisible. The cloud applies this abstraction to a physical computing environment by hiding the actual processes from the users. In a cloud hosting environment, data is placed on multiple servers in a cluster, but the details of the network connections are entirely hidden and users cannot access anyone else's data. The cloud environment follows the UNIX paradigm of having multiple elements, each excelling at an individual task, instead of a single huge element that takes care of all the tasks. One type of cloud service, Software as a Service (SaaS), is widely used and provides several benefits to service consumers. To realize these benefits, it is essential to evaluate the quality of SaaS and to maintain a relatively high level of quality based on the evaluation results. Hence, there is a high demand for devising a quality model to evaluate SaaS cloud services; conventional frameworks do not effectively support SaaS-specific quality aspects such as reusability and accessibility. Software as a Service is a software delivery paradigm in which the software is hosted off-premise and delivered via the web, with payment following a subscription model. SaaS helps organizations avoid capital expenditure and lets them focus on their core business instead of support services. Using the SaaS architecture, one can proactively assess applications. Implementing SaaS also

  11. Embracing Open Software Development in Solar Physics

    Science.gov (United States)

    Hughitt, V. K.; Ireland, J.; Christe, S.; Mueller, D.

    2012-12-01

    We discuss two ongoing software projects in solar physics that have adopted best practices of the open source software community. The first, the Helioviewer Project, is a powerful data visualization tool which includes online and Java interfaces inspired by Google Maps (tm). This effort allows users to find solar features and events of interest, and download the corresponding data. Having found data of interest, the user now has to analyze it. The dominant solar data analysis platform is an open-source library called SolarSoft (SSW). Although SSW itself is open-source, the programming language used is IDL, a proprietary language with licensing costs that are prohibitive for many institutions and individuals. SSW is composed of a collection of related scripts written by missions and individuals for solar data processing and analysis, without any consistent data structures or common interfaces. Further, at the time when SSW was initially developed, many of the best software development processes of today (mirrored and distributed version control, unit testing, continuous integration, etc.) were not standard, and have not since been adopted. The challenges inherent in developing SolarSoft led to a second software project known as SunPy. SunPy is an open-source Python-based library which seeks to create a unified solar data analysis environment including a number of core datatypes such as Maps, Lightcurves, and Spectra which have consistent interfaces and behaviors. By taking advantage of the large and sophisticated body of scientific software already available in Python (e.g. SciPy, NumPy, Matplotlib), and by adopting many of the best practices refined in open-source software development, SunPy has been able to develop at a very rapid pace while still ensuring a high level of reliability. The Helioviewer Project and SunPy represent two pioneering technologies in solar physics - simple yet flexible data visualization and a powerful, new data analysis environment. We
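    The "unified datatypes with consistent interfaces" idea at the heart of SunPy can be sketched with a minimal time-series-like class. The class and method names below are hypothetical, invented for illustration; SunPy's real Map, Lightcurve, and Spectrum types are far richer:

```python
class SeriesSketch:
    """Minimal sketch of a 'consistent datatype' in the SunPy spirit:
    data plus metadata behind a small, uniform interface
    (hypothetical names; not SunPy's actual API)."""

    def __init__(self, times, values, meta=None):
        self.times = list(times)
        self.values = list(values)
        self.meta = dict(meta or {})

    def peek(self):
        # Quick-look summary, analogous in intent to plotting helpers.
        return f"{self.meta.get('instrument', 'unknown')}: {len(self.values)} samples"

    def truncate(self, start, end):
        # Return a new object of the same type, preserving metadata.
        kept = [(t, v) for t, v in zip(self.times, self.values) if start <= t <= end]
        return SeriesSketch([t for t, _ in kept], [v for _, v in kept], self.meta)

lc = SeriesSketch(range(5), [10, 11, 12, 13, 14], {"instrument": "demo"})
sub = lc.truncate(1, 3)  # keeps the three samples at times 1, 2, 3
```

The design point is that every datatype carries its metadata with it and every operation returns the same kind of object, so analysis code behaves consistently regardless of which instrument produced the data - precisely what SSW's loose collection of scripts lacked.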

  12. Determining optimum aging time using novel core flooding equipment

    DEFF Research Database (Denmark)

    Ahkami, Mehrdad; Chakravarty, Krishna Hara; Xiarchos, Ioannis;

    2016-01-01

    New methods for enhanced oil recovery are typically developed using core flooding techniques. Establishing reservoir conditions is essential before the experimental campaign commences. The realistic oil-rock wettability can be obtained through optimum aging of the core. Aging time is affected...... the optimum aging time regardless of variations in crude oil, rock, and brine properties. State of the art core flooding equipment has been developed that can be used for consistently determining the resistivity of the coreplug during aging and waterflooding using advanced data acquisition software...

  13. Earth's inner core: Innermost inner core or hemispherical variations?

    NARCIS (Netherlands)

    Lythgoe, K. H.; Deuss, A.; Rudge, J. F.; Neufeld, J. A.

    2014-01-01

    The structure of Earth's deep inner core has important implications for core evolution, since it is thought to be related to the early stages of core formation. Previous studies have suggested that there exists an innermost inner core with distinct anisotropy relative to the rest of the inner core.

  14. Software engineering beyond the project

    DEFF Research Database (Denmark)

    Dittrich, Yvonne

    2014-01-01

    Context The main part of software engineering methods, tools and technologies has developed around projects as the central organisational form of software development. A project organisation depends on clear bounds regarding scope, participants, development effort and lead-time. What happens when...... these conditions are not given? The article claims that this is the case for software product specific ecosystems. As software is increasingly developed, adopted and deployed in the form of customisable and configurable products, software engineering as a discipline needs to take on the challenge to support...... of traditional software engineering, but makes perfect sense, considering that the frame of reference for product development is not a project but continuous innovation across the respective ecosystem. The article provides a number of concrete points for further research....

  15. SOFTWARE RELIABILITY OF PROFICIENT ENACTMENT

    Directory of Open Access Journals (Sweden)

    B.Anni Princy

    2014-07-01

    Full Text Available A software reliability model treats failures as the combined outcome of two processes: emerging faults and initial state values. The predominant classification uses a logistic testing-effort function for fitting efficient software models on real-time datasets. The drawbacks of the logistic model are effectively overcome by the Pareto distribution. The proposed outline presents a systematic technique for analyzing candidate distributions and determining the best fit for a software reliability growth model, whose parameters are estimated to evaluate the reliability of a software system. The resulting process permits software reliability estimates that can be used both as a quality indicator and for planning and controlling resources and development times, enabling efficient computation and reliable measurement of a software system.
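The logistic and Pareto testing-effort functions named in the abstract above can be sketched as follows; the exact functional forms, parameter names, and the interval-reliability formula are illustrative assumptions, not taken from the paper.

```python
import math

def logistic_failures(t, a, b, c):
    """Expected cumulative failures by time t under a logistic
    growth model: mu(t) = a / (1 + b * exp(-c * t)) (assumed form)."""
    return a / (1.0 + b * math.exp(-c * t))

def pareto_effort(t, alpha, beta):
    """Cumulative testing effort by time t under a Pareto-type
    effort function: W(t) = 1 - (beta / (beta + t)) ** alpha (assumed form)."""
    return 1.0 - (beta / (beta + t)) ** alpha

def reliability(t, x, a, b, c):
    """Probability of no failure in the mission interval (t, t + x],
    given the mean-value function mu (standard NHPP relation)."""
    return math.exp(-(logistic_failures(t + x, a, b, c)
                      - logistic_failures(t, a, b, c)))
```

With a = 100 total expected faults, b = 9 and c = 0.5, mu(0) = 10 and the expected failure count grows toward 100 as testing effort accumulates, so interval reliability rises over time.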

  16. Recommendation systems in software engineering

    CERN Document Server

    Robillard, Martin P; Walker, Robert J; Zimmermann, Thomas

    2014-01-01

    With the growth of public and private data stores and the emergence of off-the-shelf data-mining technology, recommendation systems have emerged that specifically address the unique challenges of navigating and interpreting software engineering data. This book collects, structures and formalizes knowledge on recommendation systems in software engineering. It adopts a pragmatic approach with an explicit focus on system design, implementation, and evaluation. The book is divided into three parts: "Part I - Techniques" introduces basics for building recommenders in software engineering, including techniques for collecting and processing software engineering data, but also for presenting recommendations to users as part of their workflow. "Part II - Evaluation" summarizes methods and experimental designs for evaluating recommendations in software engineering. "Part III - Applications" describes needs, issues and solution concepts involved in entire recommendation systems for specific software engineering tasks, fo...

  17. Modular Software-Defined Radio

    Directory of Open Access Journals (Sweden)

    Rhiemeier Arnd-Ragnar

    2005-01-01

    Full Text Available In view of the technical and commercial boundary conditions for software-defined radio (SDR), it is suggestive to reconsider the concept anew from an unconventional point of view. The organizational principles of signal processing (rather than the signal processing algorithms themselves) are the main focus of this work on modular software-defined radio. Modularity and flexibility are just two key characteristics of the SDR environment which extend smoothly into the modeling of hardware and software. In particular, the proposed model of signal processing software includes irregular, connected, directed, acyclic graphs with random node weights and random edges. Several approaches for mapping such software to a given hardware are discussed. Taking into account previous findings as well as new results from system simulations presented here, the paper finally concludes with the utility of pipelining as a general design guideline for modular software-defined radio.
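The software model described above (irregular, connected, directed, acyclic graphs with random node weights) can be sketched with a small generator plus a critical-path bound on schedule length; the generator, weight range, and function names are illustrative assumptions, not the paper's actual model.

```python
import random

def random_task_graph(n, edge_prob, seed=0):
    """Random directed acyclic task graph: nodes 0..n-1, edges only
    from lower to higher index (which guarantees acyclicity), each
    node carrying a random processing weight."""
    rng = random.Random(seed)
    weights = [rng.uniform(1.0, 10.0) for _ in range(n)]
    edges = [(i, j) for i in range(n) for j in range(i + 1, n)
             if rng.random() < edge_prob]
    return weights, edges

def critical_path(weights, edges):
    """Longest weighted path through the DAG -- a lower bound on the
    schedule length when mapping the tasks onto hardware."""
    finish = list(weights)          # earliest finish time per node
    for i, j in sorted(edges):      # low -> high edges: index order is topological
        finish[j] = max(finish[j], finish[i] + weights[j])
    return max(finish)
```

Because every edge points from a lower to a higher node index, iterating edges in sorted order visits all predecessors of a node before its successors, so a single pass computes the critical path.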

  18. Software Reliability Experimentation and Control

    Institute of Scientific and Technical Information of China (English)

    Kai-Yuan Cai

    2006-01-01

    This paper classifies software researches as theoretical researches, experimental researches, and engineering researches, and is mainly concerned with the experimental researches with focus on software reliability experimentation and control. The state-of-the-art of experimental or empirical studies is reviewed. A new experimentation methodology is proposed, which is largely theory discovering oriented. Several unexpected results of experimental studies are presented to justify the importance of software reliability experimentation and control. Finally, a few topics that deserve future investigation are identified.

  19. Reliability in open source software

    OpenAIRE

    Ullah, Najeeb

    2014-01-01

    Open Source Software is a component or an application whose source code is freely accessible and changeable by the users, subject to constraints expressed in a number of licensing modes. It implies a global alliance for developing quality software with quick bug fixing along with quick evolution of the software features. In recent years the tendency toward adoption of OSS in industrial projects has swiftly increased. Many commercial products use OSS in various fields such as embedded systems, ...

  20. Gammasphere software development. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1993-05-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an object-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere).

  1. Software maintenance : the need for standardization

    OpenAIRE

    Schneidewind, Norman F.

    1989-01-01

    Procedures are proposed to assist the Navy Management Systems Support Office in performing software maintenance. Hardware and software maintenance are contrasted. The key difference between the two -- the ease with which software can be changed -- leads to the need for managing software change. Standardization of software is proposed as the method for managing software change. A model of software maintenance is advanced as the function for standardizing software maintenance. (kr)

  2. Silverlight 4 Business Intelligence Software

    CERN Document Server

    Czernicki, Bart

    2010-01-01

    Business Intelligence (BI) software allows you to view different components of a business using a single visual platform, which makes comprehending mountains of data easier. BI is everywhere. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI. Currently, we are in the second generation of BI software - called BI 2.0 - which is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increasingly visually rich tech

  3. A Software Configuration Management Course

    DEFF Research Database (Denmark)

    Asklund, U.; Bendix, Lars Gotfred

    2003-01-01

    Software Configuration Management has been a big success in research and creation of tools. There are also many vendors in the market of selling courses to companies. However, in the education sector Software Configuration Management has still not quite made it - at least not into the university...... curriculum. It is either not taught at all or is just a minor part of a general course in software engineering. In this paper, we report on our experience with giving a full course entirely dedicated to Software Configuration Management topics and start a discussion of what ideally should be the goal...... and contents of such a course....

  4. Empirically Driven Software Engineering Research

    Science.gov (United States)

    Rombach, Dieter

    Software engineering is a design discipline. As such, its engineering methods are based on cognitive instead of physical laws, and their effectiveness depends highly on context. Empirical methods can be used to observe the effects of software engineering methods in vivo and in vitro, to identify improvement potentials, and to validate new research results. This paper summarizes both the current body of knowledge and further challenges wrt. empirical methods in software engineering as well as empirically derived evidence regarding typical software engineering methods. Finally, future challenges wrt. education, research, and technology transfer will be outlined.

  5. Searching publications on software testing

    CERN Document Server

    Middelburg, C A

    2010-01-01

    This note concerns a search for publications in which the pragmatic concept of a test as conducted in the practice of software testing is formalized, a theory about software testing based on such a formalization is presented, or it is demonstrated on the basis of such a theory that there are solid grounds to test software in cases where in principle other forms of analysis could be used. This note reports on the way in which the search has been carried out and the main outcomes of the search. The message of the note is that the fundamentals of software testing are not yet complete in some respects.

  6. The Art of Software Innovation

    CERN Document Server

    Pikkarainen, Minna; Boucart, Nick; Alvaro, Jose Antonio Heredia

    2011-01-01

    Imagine that you are the CEO of a software company. You know you compete in an environment that does not permit you to treat innovation as a secondary issue. But how should you manage your software innovation to get the most out of it? This book will provide you with the answer. Software innovation is multifaceted and the approaches used by companies can be very different. The team of authors that wrote this book took the assumption that there is no such thing as a universal software engineering process or innovation process. Some things work well for a certain company, others do not. The book

  7. Software Security Rules: SDLC Perspective

    Directory of Open Access Journals (Sweden)

    S. K. Pandey

    2009-10-01

    Full Text Available Software has become an integral part of everyday life. Every day, millions of people perform transactions through the internet, ATMs, and mobile phones; they send email and e-greetings, and use word processing and spreadsheets for various purposes. People use software bearing in mind that it is reliable, can be trusted, and that the operations they perform are secure. Now, if this software has exploitable security holes, then how can it be safe for use? Security brings value to software in terms of people’s trust. The value provided by secure software is of vital importance because many critical functions are entirely dependent on the software. That is why security is a serious topic which should be given proper attention during the entire SDLC, ‘right from the beginning’. For the proper implementation of security in the software, twenty-one security rules are proposed in this paper along with validation results. It is found that by applying these rules as per the given implementation mechanism, most of the vulnerabilities are eliminated in the software and more secure software can be built.

  8. Modularisation of Software Configuration Management

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2000-01-01

    The principle of modularisation is one of the main techniques that software designers use to tame the complexity of programming. A software project, however, is complex in many other areas than just programming. In this paper, we focus on one of these complex areas, namely software configuration management, and outline how modularisation is natural and powerful also in this context. The analysis is partly based on experiences from case studies where small- to medium-sized development projects are using a prototype tool that supports modular software configuration management.

  9. Assessing Core Competencies

    Science.gov (United States)

    Narayanan, M.

    2004-12-01

    Catherine Palomba and Trudy Banta offer the following definition of assessment, adapted from one provided by Marches in 1987. Assessment is the systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development. (Palomba and Banta 1999). It is widely recognized that sophisticated computing technologies are becoming a key element in today's classroom instructional techniques. Regardless, the Professor must be held responsible for creating an instructional environment in which the technology actually supplements learning outcomes of the students. Almost all academic disciplines have found a niche for computer-based instruction in their respective professional domain. In many cases, it is viewed as an essential and integral part of the educational process. Educational institutions are committing substantial resources to the establishment of dedicated technology-based laboratories, so that they will be able to accommodate and fulfill students' desire to master certain of these specific skills. This type of technology-based instruction may raise some fundamental questions about the core competencies of the student learner. Some of the most important questions are: 1. Is the utilization of these fast high-powered computers and user-friendly software programs creating a totally non-challenging instructional environment for the student learner? 2. Can technology itself all too easily overshadow the learning outcomes intended? 3. Are the educational institutions simply training students how to use technology rather than educating them in the appropriate field? 4. Are we still teaching content-driven courses and analysis oriented subject matter? 5. Are these sophisticated modern era technologies contributing to a decline in the Critical Thinking Capabilities of the 21st century technology-savvy students? The author tries to focus on technology as a tool and not on the technology

  10. Jitter Controller Software

    Science.gov (United States)

    Lansdowne, Chatwin; Schlensinger, Adam

    2011-01-01

    Sinusoidal jitter is produced by simply modulating a clock frequency sinusoidally with a given frequency and amplitude. But this can be expressed as phase jitter, frequency jitter, or cycle-to-cycle jitter, rms or peak, absolute units, or normalized to the base clock frequency. Jitter using other waveforms requires calculating and downloading these waveforms to an arbitrary waveform generator, and helping the user manage relationships among phase jitter crest factor, frequency jitter crest factor, and cycle-to-cycle jitter (CCJ) crest factor. Software was developed for managing these relationships, automatically configuring the generator, and saving test results documentation. Tighter management of clock jitter and jitter sensitivity is required by new codes that further extend the already high performance of space communication links, completely correcting symbol error rates higher than 10 percent, and therefore typically requiring demodulation and symbol synchronization hardware to operate at signal-to-noise ratios of less than one. To accomplish this, greater demands are also made on transmitter performance, and measurement techniques are needed to confirm performance. It was discovered early that sinusoidal jitter can be stepped on a grid such that one can connect points by constant phase jitter, constant frequency jitter, or constant cycle-cycle jitter. The tool automates adherence to a grid while also allowing adjustments off-grid. Also, the jitter can be set by the user on any dimension and the others are calculated. The calculations are all recorded, allowing the data to be rapidly plotted or re-plotted against different interpretations just by changing pointers to columns. A key advantage is taking data on a carefully controlled grid, which allowed a single data set to be post-analyzed many different ways.
Another innovation was building a software tool to provide very tight coupling between the generator and the recorded data product, and the operator
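The relationships the tool manages between phase, frequency, and cycle-to-cycle views of sinusoidal jitter can be sketched as follows. The formulas are standard identities for a sinusoidally phase-modulated clock; the function name, return keys, and the assumed edge-displacement model are my own, not taken from the NASA tool.

```python
import math

def sinusoidal_jitter(ui_peak, f_mod, f_clock):
    """Relate the common peak-jitter views of a sinusoidally modulated clock.
    ui_peak: peak phase jitter in unit intervals (UI, i.e. clock cycles),
    f_mod: modulation frequency [Hz], f_clock: base clock frequency [Hz].
    Assumes edge displacement dt(n) = (ui_peak/f_clock)*sin(2*pi*f_mod*n/f_clock)."""
    t_peak = ui_peak / f_clock                   # peak time displacement [s]
    freq_peak = 2.0 * math.pi * f_mod * ui_peak  # peak frequency deviation [Hz]
    # Period jitter and cycle-to-cycle jitter are the first and second
    # discrete differences of the edge displacement; each difference of
    # a sinusoid scales its amplitude by 2*sin(pi * f_mod / f_clock).
    scale = 2.0 * math.sin(math.pi * f_mod / f_clock)
    return {
        "phase_peak_s": t_peak,
        "freq_peak_hz": freq_peak,
        "period_peak_s": t_peak * scale,
        "ccj_peak_s": t_peak * scale ** 2,
    }
```

For modulation frequencies far below the clock rate, each successive difference shrinks the peak by roughly 2*pi*f_mod/f_clock, which is why holding one view constant while stepping another traces a grid.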

  11. IGCSE core mathematics

    CERN Document Server

    Wall, Terry

    2013-01-01

    Give your core level students the support and framework they require to get their best grades with this book dedicated to the core level content of the revised syllabus and written specifically to ensure a more appropriate pace. This title has been written for Core content of the revised Cambridge IGCSE Mathematics (0580) syllabus for first teaching from 2013. Gives students the practice they require to deepen their understanding through plenty of practice questions. Consolidates learning with unique digital resources on the CD, included free with every book. We are working with Cambridge

  12. Core shroud corner joints

    Science.gov (United States)

    Gilmore, Charles B.; Forsyth, David R.

    2013-09-10

    A core shroud is provided, which includes a number of planar members, a number of unitary corners, and a number of subassemblies each comprising a combination of the planar members and the unitary corners. Each unitary corner comprises a unitary extrusion including a first planar portion and a second planar portion disposed perpendicularly with respect to the first planar portion. At least one of the subassemblies comprises a plurality of the unitary corners disposed side-by-side in an alternating opposing relationship. A plurality of the subassemblies can be combined to form a quarter perimeter segment of the core shroud. Four quarter perimeter segments join together to form the core shroud.

  13. MGSim - simulation tools for multi-core processor architectures

    NARCIS (Netherlands)

    M. Lankamp; R. Poss; Q. Yang; J. Fu; I. Uddin; C.R. Jesshope

    2013-01-01

    MGSim is an open source discrete event simulator for on-chip hardware components, developed at the University of Amsterdam. It is intended to be a research and teaching vehicle to study the fine-grained hardware/software interactions on many-core and hardware multithreaded processors. It includes su
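The discrete-event core of a simulator like MGSim can be sketched in a few lines; this is a generic illustration of the technique, not MGSim's actual implementation.

```python
import heapq

class Simulator:
    """Minimal discrete-event engine: events are (time, seq, callback)
    tuples kept in a priority queue ordered by time; a callback may
    schedule further events, advancing the simulation."""
    def __init__(self):
        self._queue = []
        self._seq = 0       # tie-breaker so equal-time events stay FIFO
        self.now = 0.0

    def schedule(self, delay, callback):
        self._seq += 1
        heapq.heappush(self._queue, (self.now + delay, self._seq, callback))

    def run(self):
        while self._queue:
            self.now, _, callback = heapq.heappop(self._queue)
            callback(self)
```

Hardware components would be modelled as objects whose methods schedule their own future activity, so simulated time jumps directly from event to event instead of ticking every cycle.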

  14. Geocomputation and open source software: components and software stacks.

    OpenAIRE

    Bivand, Roger S.

    2011-01-01

    Geocomputation, with its necessary focus on software development and methods innovation, has enjoyed a close relationship with free and open source software communities. These extend from communities providing the numerical infrastructure for computation, such as BLAS (Basic Linear Algebra Subprograms), through language communities around Python, Java and others, to communities supporting spatial data handling, especially the projects of the Open Source Geospatial Foundation...

  15. Application Service Providers (ASP) Adoption in Core and Non-Core Functions

    Directory of Open Access Journals (Sweden)

    Aman Y.M. Chan

    2009-10-01

    Full Text Available With the further improvement in internet bandwidth, connection stability and data transmission security, a new wave of Application Service Providers (ASP) is on its way. The recent boom of models such as Software as a Service (SaaS) and On-Demand in 2008 has led to the emergence of the ASP model in core business functions. Traditional IS outsourcing covers the non-core business functions that are not critical to business performance and competitive advantages. Compared with traditional IS outsourcing, ASP is a new phenomenon that can be considered an emerging innovation, as it covers both core and non-core business functions. Most executives do not comprehend the difference and similarity between traditional IS outsourcing and the ASP model. Hence, we propose to conduct research to identify the determinants (cost benefit, gap in IS capability complementing the company's strategic goal, and trust in the ASP's service and security level) and moderating factors (management's attitude toward ownership and control, and company aggressiveness) of the ASP adoption decision in both core and non-core business functions.

  16. The EPOS Integrated Core Services

    Science.gov (United States)

    Jeffery, Keith; Michelini, Alberto; Bailo, Daniele

    2013-04-01

    EPOS also including other work packages in EPOS such as those concerned with legalistics and financing; (c) a prototype based on the woodman architecture in one domain (seismology) to provide assurance that the architecture is valid. The key aspect is the metadata catalog. In one dimension this is described in 3 levels: (1) discovery metadata using well-known and commonly used standards such as DC (Dublin Core) to enable users (via an intelligent user interface) to search for objects within the EPOS environment relevant to their needs; (2) contextual metadata providing the context of the object described in the catalog to enable a user or the system to determine the relevance of the discovered object(s) to their requirement - the context includes projects, funding, organisations involved, persons involved, related publications, facilities, equipment etc and utilises CERIF (Common European Research Information Format) see www.eurocris.org ; (3) detailed metadata which is specific to a domain or to a particular object and includes the schema describing the object to processing software. The other dimension of the metadata concerns the objects described. These are classified into users, services (including software), data and resources (computing, data storage, instruments and scientific equipment). The core services include not only user access to data, software, services, equipment and associated processing but also facilities for interaction and cooperative working between users and storage of history and experience. EPOS will operate a full e-Science environment including metadata and persistent identifiers.

  17. The SFXC software correlator for Very Long Baseline Interferometry: Algorithms and Implementation

    CERN Document Server

    Keimpema, A; Pogrebenko, S V; Campbell, R M; Cimó, G; Duev, D A; Eldering, B; Kruithof, N; van Langevelde, H J; Marchal, D; Calvés, G Molera; Ozdemir, H; Paragi, Z; Pidopryhora, Y; Szomoru, A; Yang, J

    2015-01-01

    In this paper a description is given of the SFXC software correlator, developed and maintained at the Joint Institute for VLBI in Europe (JIVE). The software is designed to run on generic Linux-based computing clusters. The correlation algorithm is explained in detail, as are some of the novel modes that software correlation has enabled, such as wide-field VLBI imaging through the use of multiple phase centres and pulsar gating and binning. This is followed by an overview of the software architecture. Finally, the performance of the correlator as a function of number of CPU cores, telescopes and spectral channels is shown.
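The FX correlation at the heart of a software correlator such as SFXC can be sketched as below: channelise each station's signal with an FFT (the "F" step), then cross-multiply the spectra and integrate (the "X" step). This is a simplified single-baseline illustration without the delay compensation, fringe rotation, or multiple-phase-centre modes the paper describes.

```python
import numpy as np

def fx_correlate(x, y, nchan):
    """Cross-power spectrum of two station streams: FFT each nchan-sample
    segment, multiply one spectrum by the conjugate of the other, and
    average over all segments."""
    nseg = min(len(x), len(y)) // nchan
    acc = np.zeros(nchan, dtype=complex)
    for k in range(nseg):
        X = np.fft.fft(x[k * nchan:(k + 1) * nchan])
        Y = np.fft.fft(y[k * nchan:(k + 1) * nchan])
        acc += X * np.conj(Y)
    return acc / nseg
```

Correlating a stream with itself yields its (real, non-negative) power spectrum; a geometric delay between stations would appear as a phase slope across the channels.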

  18. CMS Simulation Software

    CERN Document Server

    Banerjee, Sunanda

    2012-01-01

    The CMS simulation, based on the Geant4 toolkit, has been operational within the new CMS software framework for more than four years. The description of the detector including the forward regions has been completed and detailed investigation of detector positioning and material budget has been carried out using collision data. Detailed modeling of detector noise has been performed and validated with the collision data. In view of the high luminosity runs of the Large Hadron Collider, simulation of pile-up events has become a key issue. Challenges have arisen from the point of view of providing a realistic luminosity profile and modeling of out-of-time pileup events, as well as computing issues regarding memory footprint and IO access. These will be especially severe in the simulation of collision events for the LHC upgrades; a new pileup simulation architecture has been introduced to cope with these issues. The CMS detector has observed anomalous energy deposits in the calorimeters and there has been a sub...

  19. Building Software with Gradle

    CERN Document Server

    CERN. Geneva; Studer, Etienne

    2014-01-01

    In this presentation, we will give an overview of the key concepts and main features of Gradle, the innovative build system that has become the de-facto standard in the enterprise. We will cover task declaration and task graph execution, incremental builds, multi-project builds, dependency management, applying plugins, extracting reusable build logic, bootstrapping a build, and using the Gradle daemon. By the end of this talk, you will have a good understanding of what makes Gradle so powerful yet easy to use. You will also understand why companies like Pivotal, LinkedIn, Google, and other giants with complex builds count on Gradle. About the speakers Etienne is leading the Tooling Team at Gradleware. He has been working as a developer, architect, project manager, and CTO over the past 15 years. He has spent most of his time building software products from the ground up and successfully shipping them to happy customers. He had ...

  20. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  1. NASA Software Engineering Benchmarking Study

    Science.gov (United States)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    To identify best practices for the improvement of software engineering on projects, NASA's Offices of Chief Engineer (OCE) and Safety and Mission Assurance (OSMA) formed a team led by Heather Rarick and Sally Godfrey to conduct this benchmarking study. The primary goals of the study are to identify best practices that: Improve the management and technical development of software intensive systems; Have a track record of successful deployment by aerospace industries, universities [including research and development (R&D) laboratories], and defense services, as well as NASA's own component Centers; and Identify candidate solutions for NASA's software issues. Beginning in the late fall of 2010, focus topics were chosen and interview questions were developed, based on the NASA top software challenges. Between February 2011 and November 2011, the Benchmark Team interviewed a total of 18 organizations, consisting of five NASA Centers, five industry organizations, four defense services organizations, and four university or university R and D laboratory organizations. A software assurance representative also participated in each of the interviews to focus on assurance and software safety best practices. Interviewees provided a wealth of information on each topic area that included: software policy, software acquisition, software assurance, testing, training, maintaining rigor in small projects, metrics, and use of the Capability Maturity Model Integration (CMMI) framework, as well as a number of special topics that came up in the discussions. NASA's software engineering practices compared favorably with the external organizations in most benchmark areas, but in every topic, there were ways in which NASA could improve its practices. Compared to defense services organizations and some of the industry organizations, one of NASA's notable weaknesses involved communication with contractors regarding its policies and requirements for acquired software. One of NASA's strengths

  2. iPSC Core

    Data.gov (United States)

    Federal Laboratory Consortium — The induced Pluripotent Stem Cells (iPSC) Core was created in 2011 to accelerate stem cell research in the NHLBI by providing investigators consultation, technical...

  3. Reference: -300CORE [PLACE

    Lifescience Database Archive (English)

    Full Text Available -300CORE Forde BG, Heyworth A, Pywell J, Kreis M Nucleotide sequence of a B1 hordein gene and the identification of possible upstream regulatory elements in endosperm storage protein genes fr

  4. Organizing Core Tasks

    DEFF Research Database (Denmark)

    Boll, Karen

    Civil servants conduct the work which makes welfare states function on an everyday basis: policemen police, school teachers teach, and tax inspectors inspect. The focus in this paper is on the core tasks of tax inspectors. The paper argues that their core task of securing the collection of revenue...... has remained much the same within the last 10 years. However, how the core task has been organized has changed considerably under the influence of various “organizing devices”. The paper focusses on how organizing devices such as risk assessment, output-focus, effect orientation, and treatment...... projects influence the organization of core tasks within the tax administration. The paper shows that the organizational transformations based on the use of these devices have had consequences both for the overall collection of revenue and for the employees’ feeling of “making a difference”. All in all...

  5. Biospecimen Core Resource - TCGA

    Science.gov (United States)

    The Cancer Genome Atlas (TCGA) Biospecimen Core Resource centralized laboratory reviews and processes blood and tissue samples and their associated data using optimized standard operating procedures for the entire TCGA Research Network.

  6. Focusing on Core Business

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    China is regulating state-owned enterprises that are investing outside of their core business realms, concerned that poor investment decisions could lead to loss of state-owned assets, but some doubt the effect of the new regulation.

  7. PWR degraded core analysis

    International Nuclear Information System (INIS)

    A review is presented of the various phenomena involved in degraded core accidents and the ensuing transport of fission products from the fuel to the primary circuit and the containment. The dominant accident sequences found in the PWR risk studies published to date are briefly described. Then chapters deal with the following topics: the condition and behaviour of water reactor fuel during normal operation and at the commencement of degraded core accidents; the generation of hydrogen from the Zircaloy-steam and the steel-steam reactions; the way in which the core deforms and finally melts following loss of coolant; debris relocation analysis; containment integrity; fission product behaviour during a degraded core accident. (U.K.)

  8. Reference: -300CORE [PLACE

    Lifescience Database Archive (English)

    Full Text Available -300CORE Mena M, Vicente-Carbajosa J, Schmidt RJ, Carbonero P An endosperm-specific... DOF protein from barley, highly conserved in wheat, binds to and activates transcription from the prolamin-

  9. NICHD Zebrafish Core

    Data.gov (United States)

    Federal Laboratory Consortium — The core's goal is to help researchers of any expertise perform zebrafish experiments aimed at illuminating basic biology and human disease mechanisms,...

  10. SIMD studies in the LHCb reconstruction software

    CERN Document Server

    Campora Perez, D H

    2015-01-01

    During the data taking process in the LHC at CERN, millions of collisions are recorded every second by the LHCb Detector. The LHCb Online computing farm, counting around 15000 cores, is dedicated to the reconstruction of the events in real-time, in order to filter those with interesting Physics. The ones kept are later analysed offline in a more precise fashion on the Grid. This imposes very stringent requirements on the reconstruction software, which has to be as efficient as possible. Modern CPUs support so-called vector-extensions, which extend their Instruction Sets, allowing for concurrent execution across functional units. Several libraries expose the Single Instruction Multiple Data programming paradigm to issue these instructions. The use of vectorisation in our codebase can provide performance boosts, leading ultimately to Physics reconstruction enhancements. In this paper, we present vectorisation studies of significant reconstruction algorithms. A variety of vectorisation libraries are analysed a...
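    The speed-up that vectorisation offers can be illustrated with a small, purely hypothetical sketch (this is not LHCb code; the chi-square example and all values are invented). A scalar loop handles one element per iteration, while an array-based formulation lets the library issue SIMD instructions over whole arrays at once:

    ```python
    import numpy as np

    # Hypothetical example: summing squared, normalised residuals for many
    # detector hits. The scalar version processes one hit per loop iteration;
    # the NumPy version operates on whole arrays, allowing the library to use
    # the CPU's vector extensions internally.

    def chi2_scalar(residuals, sigmas):
        total = 0.0
        for r, s in zip(residuals, sigmas):
            total += (r / s) ** 2
        return total

    def chi2_vectorised(residuals, sigmas):
        # Element-wise division and squaring over the full arrays, then a sum.
        return float(np.sum((residuals / sigmas) ** 2))

    residuals = np.array([0.1, -0.2, 0.05, 0.3])
    sigmas = np.array([0.1, 0.1, 0.1, 0.1])

    # Both formulations compute the same quantity.
    assert abs(chi2_scalar(residuals, sigmas) - chi2_vectorised(residuals, sigmas)) < 1e-9
    ```

    Dedicated SIMD libraries in C++ (the setting of the paper) expose the same idea at a lower level, with explicit control over instruction sets and data layout.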

  11. Astropy: Building Blocks for Astronomy Software

    Science.gov (United States)

    Bray, E. M.

    2014-05-01

    The Astropy Project is a community effort to develop an open source Python package of common data structures and routines for use by other, more specialized astronomy software in order to foster interoperability. The project encompasses the “core” astropy Python package, “affiliated packages” that strive to implement Astropy's coding standards and interoperability with other affiliated packages, and a broader community aimed at implementing Pythonic solutions to astronomy computing problems while minimizing duplication of effort. The project also provides a template for other projects that use Astropy to reuse much of Astropy's development framework without reinventing the wheel. Here we present an overview of the key features of the core package (existing and upcoming), current and planned affiliated packages, and how we manage a large open source project with a diverse community of contributors.

  12. MCNP LWR Core Generator

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Noah A. [Los Alamos National Laboratory

    2012-08-14

    The reactor core input generator allows for MCNP input files to be tailored to design specifications and generated in seconds. Full reactor models can now easily be created by specifying a small set of parameters and generating an MCNP input for a full reactor core. Axial zoning of the core will allow for density variation in the fuel and moderator, with pin-by-pin fidelity, so that BWR cores can more accurately be modeled. LWR core work in progress: (1) Reflectivity option for specifying 1/4, 1/2, or full core simulation; (2) Axial zoning for moderator densities that vary with height; (3) Generating multiple types of assemblies for different fuel enrichments; and (4) Parameters for specifying BWR box walls. Fuel pin work in progress: (1) Radial and azimuthal zoning for generating further unique materials in fuel rods; (2) Options for specifying different types of fuel for MOX or multiple burn assemblies; (3) Additional options for replacing fuel rods with burnable poison rods; and (4) Control rod/blade modeling.
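    The parametric-generation idea can be sketched as follows (this is an illustrative toy, not the LANL tool, and the card contents are invented): a small set of parameters drives the emission of a full input deck as text.

    ```python
    # Toy sketch of parametric input-deck generation in the spirit of the
    # MCNP LWR core generator. The card syntax below is only indicative.
    def generate_core_deck(n_assemblies, enrichment_wt_pct, moderator_density):
        lines = ["c  auto-generated LWR core input (illustrative only)"]
        # One comment card per assembly; a real generator would emit cell,
        # surface, and material cards with pin-by-pin fidelity.
        for i in range(1, n_assemblies + 1):
            lines.append(f"c  assembly {i}: fuel enrichment {enrichment_wt_pct:.2f} wt%")
        # A single light-water moderator material card with a chosen density.
        lines.append(f"m1  1001.70c 2  8016.70c 1  $ moderator, rho={moderator_density} g/cc")
        return "\n".join(lines)

    deck = generate_core_deck(4, 3.50, 0.74)
    ```

    Axial zoning, as described above, would extend this by emitting separate material cards per axial layer so that fuel and moderator densities can vary with height.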

  13. Developing E-Learning Materials for Software Development Course

    OpenAIRE

    Hao Shi

    2010-01-01

    Software Development is a core second-year course currently offered to undergraduate students at Victoria University at its five local and international campuses. The project aims to redesign the existing course curriculum to support student-centred teaching and learning. It is intended to provide a learning context in which learners can reflect on new material, discuss their tentative understandings with others, actively search for new information, develop skills in communication and collabo...

  14. Agile practices: the impact on trust in software project teams

    OpenAIRE

    McHugh, Orla; Conboy, Kieran; Lang, Michael

    2012-01-01

    peer-reviewed People are core to any software development effort, but they're particularly important in an agile team. The Agile Manifesto places great emphasis on the team, encouraging autonomy and giving individuals the environment and support they need to get the job done.1 Leadership is shared, and the agile team has substantially more control, which dramatically changes the project manager's role.2 Managers must have greater trust that their team will make the right decisions and ...

  15. Probabilistic Reliability Assessment of Truss Construction in Matlab Software Platform

    OpenAIRE

    Vašek Jakub; Krejsa Martin

    2014-01-01

    This paper deals with the use of probabilistic methods in assessing the reliability of the planar truss support structure. The classical Monte Carlo simulation technique was chosen for calculation of the failure probability of structural elements and of the entire support system under assessment. Numerical calculation was performed in the MATLAB environment using its random number generator, with parallelization across multi-core processors. The aim of the study was to analyze the usability of MATLAB for pro...
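    The Monte Carlo approach described above can be sketched in a minimal form (this is not the authors' MATLAB code; the limit state, distributions, and values are assumptions for illustration): a member fails when its random load effect exceeds its random resistance, and the failure probability is estimated as the fraction of simulated failures.

    ```python
    import random

    # Minimal Monte Carlo reliability sketch (illustrative only): estimate the
    # failure probability of a single truss member with normally distributed
    # resistance R and load effect E; failure occurs when E > R.
    def failure_probability(n_samples, seed=0):
        rng = random.Random(seed)
        failures = 0
        for _ in range(n_samples):
            resistance = rng.gauss(mu=300.0, sigma=30.0)   # kN, assumed
            load_effect = rng.gauss(mu=200.0, sigma=40.0)  # kN, assumed
            if load_effect > resistance:
                failures += 1
        return failures / n_samples

    p_f = failure_probability(100_000)
    ```

    For these assumed distributions, R − E is normal with mean 100 kN and standard deviation 50 kN, so the exact failure probability is about 0.023; the estimate converges to it as the sample count grows. Parallelization, as in the paper, amounts to splitting the samples across worker processes with independent random streams.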

  16. Real-time Performance Verification of Core Protection and Monitoring System with Integrated Model for SMART Simulator

    International Nuclear Information System (INIS)

    In keeping with these purposes, a real-time model of the digital core protection and monitoring systems for simulator implementation was developed on the basis of the SCOPS and SCOMS algorithms. In addition, important features of the software models were explained for the application to the SMART simulator, and the real-time performance of the models linked with DLL was examined for various simulation scenarios. In this paper, a real-time performance verification of the core protection and monitoring software for the SMART simulator was performed with the integrated simulator model. Various DLL connection tests were done for software algorithm changes. In addition, typical accident scenarios of SMART were simulated with 3KEYMASTER and the simulated results were compared with those of the DLL-linked core protection and monitoring software. Each calculational result showed good agreement

  17. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their software

  18. Explicit models for dynamic software

    NARCIS (Netherlands)

    Bosloper, Ivor; Siljee, Johanneke; Nijhuis, Jos; Nord, R; Medvidovic, N; Krikhaar, R; Khrhaar, R; Stafford, J; Bosch, J

    2006-01-01

    A key aspect in creating autonomous dynamic software systems is the possibility of reasoning about properties of runtime variability and dynamic behavior, e.g. when and how to reconfigure the system. Currently these properties are often not made explicit in the software architecture. We argue that h

  19. Music Software for Special Needs.

    Science.gov (United States)

    McCord, Kimberly

    2001-01-01

    Discusses the use of computer software for students with special needs in the music classroom. Focuses on software programs that are appropriate for children with special needs such as: "Musicshop,""Band-in-a-Box,""Rock Rap'n Roll,""Music Mania,""Music Ace" and "Music Ace 2," and "Children's Songbook." (CMK)

  20. A layered software specification architecture

    NARCIS (Netherlands)

    M. Snoeck; S. Poelmans; G. Dedene

    2000-01-01

    Separation of concerns is a determining factor of the quality of object- oriented software development. Done well, it can provide substantial benefits such as additive rather than invasive change and improved adaptability, customizability, and reuse. In this paper we propose a software architecture

  1. Engineering the Irish software tiger

    OpenAIRE

    Ryan, Kevin

    2008-01-01

    Information and communication technologies, particularly software, play a crucial role in the Republic of Ireland's remarkable economic growth. Successful globalization has posed many challenges and fostered a major strategic investment in research. Ireland's unique position has a major influence on the realization of the software engineering research agenda.

  2. Future of Software Engineering Standards

    Science.gov (United States)

    Poon, Peter T.

    1997-01-01

    In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools as well as consumer-oriented systems.

  3. Real-time software correlation

    NARCIS (Netherlands)

    N.G.H. Kruithof; D.C.P.M. Marchal

    2008-01-01

    In this chapter we present the progress of the SCARIe project, where we investigate the capabilities of a next generation grid-based software correlator for VLBI. We will mostly focus on the current design of our software correlator and on the challenges of running real-time scientific experiments o

  4. Software Development at Belle II

    Science.gov (United States)

    Kuhr, Thomas; Hauth, Thomas

    2015-12-01

    Belle II is a next generation B-factory experiment that will collect 50 times more data than its predecessor Belle. This requires not only a major upgrade of the detector hardware, but also of the simulation, reconstruction, and analysis software. The challenges of the software development at Belle II and the tools and procedures to address them are reviewed in this article.

  5. Software Agent Techniques in Design

    DEFF Research Database (Denmark)

    Hartvig, Susanne C

    1998-01-01

    This paper briefly presents studies of software agent techniques and outlines aspects of these which can be applied in design agents in integrated civil engineering design environments.

  6. Risks Management in Software Engineering

    Directory of Open Access Journals (Sweden)

    Dishek Mankad

    2012-12-01

    Full Text Available Risk management is an activity that helps a software development team understand what kinds of risks exist in software development. Risk always concerns future uncertainty: it is a potential problem, so it might happen, it might not. It is better to identify its probability of occurrence.
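    A common way to act on the probability of occurrence mentioned above is the standard risk-exposure calculation, exposure = probability × cost of impact, used to rank risks for attention. The sketch below uses invented risks and values purely for illustration:

    ```python
    # Risk exposure = probability of occurrence x cost of impact.
    # The risks and numbers here are invented examples.
    risks = [
        {"name": "key developer leaves",      "probability": 0.10, "cost": 50_000},
        {"name": "requirements change late",  "probability": 0.30, "cost": 20_000},
    ]

    # Compute exposure for each risk.
    for r in risks:
        r["exposure"] = r["probability"] * r["cost"]

    # Rank risks so the team addresses the highest exposure first.
    ranked = sorted(risks, key=lambda r: r["exposure"], reverse=True)
    ```

    Here the lower-cost but more probable risk ranks first (exposure 6,000 vs. 5,000), which is exactly why estimating probability, not just impact, matters.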

  7. Software Defect Detection with Rocus

    Institute of Scientific and Technical Information of China (English)

    Yuan Jiang; Ming Li; Zhi-Hua Zhou

    2011-01-01

    Software defect detection aims to automatically identify defective software modules for efficient software testing in order to improve the quality of a software system. Although many machine learning methods have been successfully applied to the task, most of them fail to consider two practical yet important issues in software defect detection. First, it is rather difficult to collect a large amount of labeled training data for learning a well-performing model; second, in a software system there are usually much fewer defective modules than defect-free modules, so learning would have to be conducted over an imbalanced data set. In this paper, we address these two practical issues simultaneously by proposing a novel semi-supervised learning approach named Rocus. This method exploits the abundant unlabeled examples to improve the detection accuracy, as well as employs under-sampling to tackle the class-imbalance problem in the learning process. Experimental results of real-world software defect detection tasks show that Rocus is effective for software defect detection. Its performance is better than a semi-supervised learning method that ignores the class-imbalance nature of the task and a class-imbalance learning method that does not make effective use of unlabeled data.
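    The under-sampling step mentioned above can be sketched in isolation (the data and names below are invented; this is not the Rocus implementation): the majority defect-free class is randomly sampled down to the size of the minority defective class before training.

    ```python
    import random

    # Illustrative under-sampling for class-imbalanced defect data: keep all
    # defective modules and a random subset of defect-free ones of equal size.
    def under_sample(modules, seed=0):
        defective = [m for m in modules if m["defective"]]
        clean = [m for m in modules if not m["defective"]]
        rng = random.Random(seed)
        # Sample the majority class down to the minority-class size.
        clean_sample = rng.sample(clean, k=len(defective))
        balanced = defective + clean_sample
        rng.shuffle(balanced)
        return balanced

    # Toy data set: 10% of 100 modules are defective.
    modules = [{"id": i, "defective": i % 10 == 0} for i in range(100)]
    balanced = under_sample(modules)
    ```

    Rocus combines this idea with semi-supervised learning so that the discarded and unlabeled examples still contribute to the model; the sketch shows only the rebalancing step.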

  8. Software Development Standard Processes (SDSP)

    Science.gov (United States)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; Crean, Kathleen A.; Rinker, George C.; Smith, Thomas P.; Lum, Karen T.; Hanna, Robert A.; Erickson, Daniel E.; Gamble, Edward B., Jr.; Morgan, Scott C.; Kelsay, Michael G.; Newport, Brian J.; Lewicki, Scott A.; Stipanuk, Jeane G.; Cooper, Tonja M.; Meshkat, Leila

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  9. Software Build and Delivery Systems

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-10

    This presentation deals with the hierarchy of software build and delivery systems. One of the goals is to maximize the success rate of new users and developers when first trying your software. First impressions are important. Early successes are important. This also reduces critical documentation costs. This is a presentation focused on computer science and goes into detail about code documentation.

  10. The language of social software

    NARCIS (Netherlands)

    Eijck, D.J.N. van

    2010-01-01

    Computer software is written in languages like C, Java or Haskell. In many cases social software is expressed in natural language. The paper explores connections between the areas of natural language analysis and analysis of social protocols, and proposes an extended program for natural language semantics

  11. The language of social software

    OpenAIRE

    van Eijck, Jan

    2010-01-01

    Computer software is written in languages like C, Java or Haskell. In many cases social software is expressed in natural language. The paper explores connections between the areas of natural language analysis and analysis of social protocols, and proposes an extended program for natural language semantics, where the goals of natural language communication are derived from the demands of specific social protocols.

  12. Echelle spectrograph software design aid

    Science.gov (United States)

    Dantzler, A. A.

    1985-01-01

    A method for mapping, to first order, the spectrograms that result from echelle spectrographic systems is discussed. An in-depth description of the principles behind the method is given so that software may be generated. Such software is an invaluable echelle spectrograph design aid. Results from two applications are discussed.
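    The first-order mapping rests on the grating equation, m·λ = d·(sin α + sin β): for each diffraction order m, it gives the wavelength that lands at a chosen geometry. A minimal sketch follows (the groove density, angles, and orders below are assumed values for illustration, not taken from the paper):

    ```python
    import math

    # First-order echelle mapping sketch: central wavelength per diffraction
    # order from the grating equation m*lambda = d*(sin(alpha) + sin(beta)).
    def central_wavelength_nm(order, grooves_per_mm, alpha_deg, beta_deg):
        d_nm = 1e6 / grooves_per_mm  # groove spacing in nanometres
        return d_nm * (math.sin(math.radians(alpha_deg)) +
                       math.sin(math.radians(beta_deg))) / order

    # Map a few adjacent orders for an assumed 31.6 gr/mm echelle used
    # near its blaze angle (alpha = beta = 63.5 degrees).
    orders = {m: central_wavelength_nm(m, 31.6, 63.5, 63.5) for m in (49, 50, 51)}
    ```

    Because wavelength scales as 1/m, adjacent orders stack into the short, overlapping spectral strips that a cross-disperser then separates, which is the structure the mapping software lays out on the detector.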

  13. Reflight certification software design specifications

    Science.gov (United States)

    1984-01-01

    The PDSS/IMC Software Design Specification for the Payload Development Support System (PDSS)/Image Motion Compensator (IMC) is contained. The PDSS/IMC is to be used for checkout and verification of the IMC flight hardware and software by NASA/MSFC.

  14. Design Knowledge and Software Engineering

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper points out various relationships between Design Knowledge and Software Engineering. After an introduction to human design, the relationship between design knowledge and industrial Software Engineering is discussed; then further details of human design knowledge are revealed, with discussions on humanistic aspects of design.

  15. Knowledge modeling for software design

    Science.gov (United States)

    Shaw, Mildred L. G.; Gaines, Brian R.

    1992-01-01

    This paper develops a modeling framework for systems engineering that encompasses systems modeling, task modeling, and knowledge modeling, and allows knowledge engineering and software engineering to be seen as part of a unified developmental process. This framework is used to evaluate what novel contributions the 'knowledge engineering' paradigm has made and how these impact software engineering.

  16. In praise of open software

    CERN Multimedia

    2000-01-01

    Much scientific software is proprietary and beyond the reach of poorer scientific communities. This issue will become critical as companies build bioinformatics tools for genomics. The principle of open-source software needs to be defended by academic research institutions (1/2 p).

  17. Free software, Open source software, licenses. A short presentation including a procedure for research software and data dissemination

    OpenAIRE

    Gomez-Diaz, Teresa

    2014-01-01

    4 pages. Spanish version: "Software libre, software de código abierto, licencias" (in which a procedure for the distribution of research software and data is proposed). The main goal of this document is to help the research community to understand the basic concepts of software distribution: Free software, Open source software, licenses. This document also includes a procedure for research software and data dissemination.

  18. CMS Simulation Software

    Science.gov (United States)

    Banerjee, S.

    2012-12-01

    The CMS simulation, based on the Geant4 toolkit, has been operational within the new CMS software framework for more than four years. The description of the detector including the forward regions has been completed and detailed investigation of detector positioning and material budget has been carried out using collision data. Detailed modeling of detector noise has been performed and validated with the collision data. In view of the high luminosity runs of the Large Hadron Collider, simulation of pile-up events has become a key issue. Challenges have arisen in providing a realistic luminosity profile and in modeling out-of-time pileup events, as well as computing issues regarding memory footprint and IO access. These will be especially severe in the simulation of collision events for the LHC upgrades; a new pileup simulation architecture has been introduced to cope with these issues. The CMS detector has observed anomalous energy deposits in the calorimeters and there has been a substantial effort to understand these anomalous signal events present in the collision data. Emphasis has also been given to validation of the simulation code including the physics of the underlying models of Geant4. Test beam as well as collision data are used for this purpose. Measurements of mean response, resolution, energy sharing between the electromagnetic and hadron calorimeters, and shower shapes for single hadrons are directly compared with predictions from Monte Carlo. A suite of performance analysis tools has been put in place and has been used to drive several optimizations to allow the code to fit the constraints posed by the CMS computing model.

  19. Technology transfer in software engineering

    Science.gov (United States)

    Bishop, Peter C.

    1989-01-01

    The University of Houston-Clear Lake is the prime contractor for the AdaNET Research Project under the direction of NASA Johnson Space Center. AdaNET was established to promote the principles of software engineering to the software development industry. AdaNET will contain not only environments and tools, but also concepts, principles, models, standards, guidelines and practices. Initially, AdaNET will serve clients from the U.S. government and private industry who are working in software development. It will seek new clients from those who have not yet adopted the principles and practices of software engineering. Some of the goals of AdaNET are to become known as an objective, authoritative source of new software engineering information and parts, to provide easy access to information and parts, and to keep abreast of innovations in the field.

  20. Ada education in a software life-cycle context

    Science.gov (United States)

    Clough, Anne J.

    1986-01-01

    Some of the experience gained from a comprehensive educational program undertaken at The Charles Stark Draper Lab. to introduce the Ada language and to transition modern software engineering technology into the development of Ada and non-Ada applications is described. Initially, a core group, which included managers, engineers and programmers, received training in Ada. An Ada Office was established to assume the major responsibility for training, evaluation, acquisition and benchmarking of tools, and consultation on Ada projects. As a first step in this process, an in-house educational program was undertaken to introduce Ada to the Laboratory. Later, a software engineering course was added to the educational program as the need to address issues spanning the entire software life cycle became evident. Educational efforts to date are summarized, with an emphasis on the educational approach adopted. Finally, lessons learned in administering this program are addressed.