WorldWideScience

Sample records for testable software requirements

  1. Writing testable software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Knirk, D. [Sandia National Labs., Albuquerque, NM (United States)]

    1997-11-01

    This tutorial identifies common problems in analyzing the requirements of a problem and in constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.

  2. Improving the software fault localization process through testability information

    NARCIS (Netherlands)

    Gonzalez-Sanchez, A.; Abreu, R.; Gross, H.; Van Gemund, A.

    2010-01-01

    When failures occur during software testing, automated software fault localization helps to diagnose their root causes and identify the defective components of a program to support debugging. Diagnosis is carried out by selecting test cases in such a way that their pass or fail information will narrow
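
    A rough, generic illustration of the similarity-based fault localization that such pass/fail information feeds into (not necessarily the authors' exact diagnostic ranking): the sketch below scores program components with the standard Ochiai coefficient over a set of hypothetical test outcomes; the component and test data are made up.

        from math import sqrt

        # Hypothetical coverage data: for each test, the components it executes
        # and whether the test passed. Real tools collect this via instrumentation.
        tests = [
            {"covers": {"parse", "validate"}, "passed": True},
            {"covers": {"parse", "render"},   "passed": False},
            {"covers": {"render"},            "passed": False},
            {"covers": {"validate"},          "passed": True},
        ]

        def ochiai_scores(tests):
            """Rank components by Ochiai similarity to the failure pattern."""
            components = set().union(*(t["covers"] for t in tests))
            total_failed = sum(not t["passed"] for t in tests)
            scores = {}
            for c in components:
                failed_and_covered = sum(not t["passed"] and c in t["covers"] for t in tests)
                covered = sum(c in t["covers"] for t in tests)
                denom = sqrt(total_failed * covered)
                scores[c] = failed_and_covered / denom if denom else 0.0
            return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

        print(ochiai_scores(tests))  # 'render' ranks highest: it is covered only by failing tests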

  3. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements - and an effective system for managing them - the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  4. Imperfect Requirements in Software Development

    NARCIS (Netherlands)

    Noppen, J.A.R.; van den Broek, P.M.; Aksit, Mehmet; Sawyer, Pete; Paech, Barbara; Heymans, Patrick

    2007-01-01

    Requirement Specifications are very difficult to define. Due to lack of information and differences in interpretation, software engineers are faced with the necessity to redesign and iterate. This imperfection in software requirement specifications is commonly addressed by incremental design. In

  5. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic impact in software production. As is also apparent from the call for papers, these perspectives focus on qualities such as reuse, adaptability, and maintainability.

  6. Chip to System Testability

    National Research Council Canada - National Science Library

    McNamer, Michael

    1997-01-01

    The ultimate objective of the Chip-to-System Testability program was the development of a structured testability implementation methodology which will be used as a basis for a PC-based tool called TESPAD...

  7. UTM TCL2 Software Requirements

    Science.gov (United States)

    Smith, Irene S.; Rios, Joseph L.; McGuirk, Patrick O.; Mulfinger, Daniel G.; Venkatesan, Priya; Smith, David R.; Baskaran, Vijayakumar; Wang, Leo

    2017-01-01

    The Unmanned Aircraft Systems (UAS) Traffic Management (UTM) Technical Capability Level (TCL) 2 software implements the UTM TCL 2 software requirements described herein. These software requirements are linked to the higher level UTM TCL 2 System Requirements. Each successive TCL implements additional UTM functionality, enabling additional use cases. TCL 2 demonstrated how to enable expanded multiple operations by implementing automation for beyond visual line-of-sight, tracking operations, and operations flying over sparsely populated areas.

  8. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle: it provides analytical and documentary tools for software engineering as well as project management support. This paper introduces the methodology, which is fully described in [1].

  9. Software requirements: definition and specification.

    Science.gov (United States)

    Chevlin, D H; Jorgens, J

    1996-01-01

    The software requirements specification is the single most important document in the software development process. It provides the basis for development as well as for validation. The SRS needs to include adequate definition of all requirements without specifying implementation or project management issues. The SRS should be completed early in the development process. However, it is very likely that changes will occur during the development life cycle. This is not an excuse for approving and releasing the current version of the SRS. When changes occur, the SRS must be revised. In any case, the concept is to deal with the current, approved version of the SRS. Ultimately, the SRS should include all the information needed to proceed into the design phase of software development.

  10. Requirements Engineering in Building Climate Science Software

    Science.gov (United States)

    Batcheller, Archer L.

    2011-01-01

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling…

  11. Software Requirements Specification for the Information System-Based SINAPRA

    Directory of Open Access Journals (Sweden)

    Nur Hadi Waryanto

    2015-07-01

    The Facilities and Infrastructure Information System (SINAPRA) is one of several information systems used by Universitas Negeri Yogyakarta (UNY). SINAPRA is one of the systems to be developed within the Integrated Information System (SIPADU). The SINAPRA Software Requirements Specification serves as the developers' technical reference for the next stages of system development. The SINAPRA Software Requirements Specification was developed using the WSU-TC CptS 322 model, based on UNY's integrated information system. Keywords: Software Requirements Specification, WSU-TC CptS 322, SINAPRA

  12. Requirements Engineering for Software Integrity and Safety

    Science.gov (United States)

    Leveson, Nancy G.

    2002-01-01

    Requirements flaws are the most common cause of errors and software-related accidents in operational software. Most aerospace firms list requirements as one of their most important outstanding software development problems, and all of the recent NASA spacecraft losses related to software (including the highly publicized Mars Program failures) can be traced to requirements flaws. In light of these facts, it is surprising that relatively little research is devoted to requirements in contrast with other software engineering topics. The research proposed built on our previous work, including both criteria for determining whether a requirements specification is acceptably complete and a new approach to structuring system specifications called Intent Specifications. This grant was to fund basic research on how these ideas could be extended to leverage innovative approaches to the problems of (1) reducing the impact of changing requirements, (2) finding requirements specification flaws early through formal and informal analysis, and (3) avoiding common flaws entirely through appropriate requirements specification language design.

  13. Requirements engineering: foundation for software quality

    NARCIS (Netherlands)

    Daneva, Maia; Pastor, Oscar

    2016-01-01

    Welcome to the proceedings of the 22nd edition of REFSQ: the International Working Conference on Requirements Engineering – Foundation for Software Quality! Requirements engineering (RE) has been recognized as a critical factor that impacts the quality of software, systems, and services. Since the

  14. Identify and Manage the Software Requirements Volatility

    OpenAIRE

    Khloud Abd Elwahab; Mahmoud Abd EL Latif; Sherif Kholeif

    2016-01-01

    Managing software requirements volatility throughout the development life cycle is very important. It helps the team control significant impacts across the project (cost, time, and effort) and keeps the project on track, ultimately satisfying the user, which is the main success criterion for a software project. In this research paper, we have analysed the root causes of requirements volatility through a proposed framework presenting the requirements volatility causes and how...

  15. The neural basis of testable and non-testable beliefs.

    Directory of Open Access Journals (Sweden)

    Jonathon R Howlett

    Beliefs about the state of the world are an important influence on both normal behavior and psychopathology. However, understanding of the neural basis of belief processing remains incomplete, and several aspects of belief processing have only recently been explored. Specifically, different types of beliefs may involve fundamentally different inferential processes and thus recruit distinct brain regions. Additionally, neural processing of truth and falsity may differ from processing of certainty and uncertainty. The purpose of this study was to investigate the neural underpinnings of assessment of testable and non-testable propositions in terms of truth or falsity and the level of certainty in a belief. Functional magnetic resonance imaging (fMRI) was used to study 14 adults while they rated propositions as true or false and also rated the level of certainty in their judgments. Each proposition was classified as testable or non-testable. Testable propositions activated the DLPFC and posterior cingulate cortex, while non-testable statements activated areas including inferior frontal gyrus, superior temporal gyrus, and an anterior region of the superior frontal gyrus. No areas were more active when a proposition was accepted, while the dorsal anterior cingulate was activated when a proposition was rejected. Regardless of whether a proposition was testable or not, certainty that the proposition was true or false activated a common network of regions including the medial prefrontal cortex, caudate, posterior cingulate, and a region of middle temporal gyrus near the temporo-parietal junction. Certainty in the truth or falsity of a non-testable proposition (a strong belief without empirical evidence) activated the insula. The results suggest that different brain regions contribute to the assessment of propositions based on the type of content, while a common network may mediate the influence of beliefs on motivation and behavior based on the level of

  16. Requirements Engineering in Building Climate Science Software

    Science.gov (United States)

    Batcheller, Archer L.

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling Framework assists modeling applications, the Earth System Grid distributes data via a web portal, and the NCAR (National Center for Atmospheric Research) Command Language is used to convert, analyze and visualize data. Document analysis, observation, and interviews were used to investigate the requirements-related work. The first research question is about how and why stakeholders engage in a project, and what they do for the project. Two key findings arise. First, user counts are a vital measure of project success, which makes adoption important and makes counting tricky and political. Second, despite the importance of quantities of users, a few particular "power users" develop a relationship with the software developers and play a special role in providing feedback to the software team and integrating the system into user practice. The second research question focuses on how project objectives are articulated and how they are put into practice. The team seeks both to build a software system according to product requirements and to conduct their work according to process requirements such as user support. Support provides essential communication between users and developers that assists with refining and identifying requirements for the software. It also helps users to learn and apply the software to their real needs. User support is a vital activity for scientific software teams aspiring to create infrastructure. The third research question is about how change in scientific practice and knowledge leads to changes in the software, and vice versa. The "thickness" of a layer of software infrastructure impacts whether the

  17. Requirements engineering for software and systems

    CERN Document Server

    Laplante, Phillip A

    2014-01-01

    Solid requirements engineering has increasingly been recognized as the key to improved, on-time and on-budget delivery of software and systems projects. This book provides practical teaching for graduate and professional systems and software engineers. It uses extensive case studies and exercises to help students grasp concepts and techniques. With a focus on software-intensive systems, this text provides a probing and comprehensive review of recent developments in intelligent systems, soft computing techniques, and their diverse applications in manufacturing. The second edition contains 100% revised content and approximately 30% new material

  18. 78 FR 47015 - Software Requirement Specifications for Digital Computer Software Used in Safety Systems of...

    Science.gov (United States)

    2013-08-02

    ... COMMISSION Software Requirement Specifications for Digital Computer Software Used in Safety Systems of... 1 of RG 1.172, ``Software Requirement Specifications for Digital Computer Software used in Safety... (IEEE) Standard (Std.) 830-1998, ``IEEE Recommended Practice for Software Requirements Specifications...

  19. Model requirements for Biobank Software Systems.

    Science.gov (United States)

    Tukacs, Edit; Korotij, Agnes; Maros-Szabo, Zsuzsanna; Molnar, Agnes Marta; Hajdu, Andras; Torok, Zsolt

    2012-01-01

    Biobanks are essential tools in diagnostics and therapeutics research and development related to personalized medicine. Several international recommendations, standards and guidelines exist that discuss the legal, ethical, technological, and management requirements of biobanks. Today's biobanks are much more than just collections of biospecimens. They also store a huge amount of data related to biological samples which can be either clinical data or data coming from biochemical experiments. A well-designed biobank software system also provides the possibility of finding associations between stored elements. Modern research biobanks are able to manage multicenter sample collections while fulfilling all requirements of data protection and security. While developing several biobanks and analyzing the data stored in them, our research group recognized the need for a well-organized, easy-to-check requirements guideline that can be used to develop biobank software systems. International best practices along with relevant ICT standards were integrated into a comprehensive guideline: The Model Requirements for the Management of Biological Repositories (BioReq), which covers the full range of activities related to biobank development. The guideline is freely available on the Internet for the research community. The database is available for free at http://bioreq.astridbio.com/bioreq_v2.0.pdf.

  20. Designing Law-Compliant Software Requirements

    Science.gov (United States)

    Siena, Alberto; Mylopoulos, John; Perini, Anna; Susi, Angelo

    New laws, such as HIPAA and SOX, are increasingly impacting the design of software systems, as business organisations strive to comply. This paper studies the problem of generating a set of requirements for a new system which comply with a given law. Specifically, the paper proposes a systematic process for generating law-compliant requirements by using a taxonomy of legal concepts and a set of primitives to describe stakeholders and their strategic goals. Given a model of law and a model of stakeholders goals, legal alternatives are identified and explored. Strategic goals that can realise legal prescriptions are systematically analysed, and alternative ways of fulfilling a law are evaluated. The approach is demonstrated by means of a case study. This work is part of the Nomos framework, intended to support the design of law-compliant requirements models.

  1. Requirements engineering and management for software development projects

    CERN Document Server

    Chemuturi, Murali

    2012-01-01

    Requirements Engineering and Management for Software Development Projects presents a complete guide on requirements for software development including engineering, computer science and management activities. It is the first book to cover all aspects of requirements management in software development projects. This book introduces the understanding of the requirements, elicitation and gathering, requirements analysis, verification and validation of the requirements, establishment of requirements, different methodologies in brief, requirements traceability and change management among other topic

  2. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Science.gov (United States)

    2012-08-22

    ... COMMISSION Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in... issuing for public comment draft regulatory guide (DG), DG-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used in Safety Systems of Nuclear Power Plants.'' The DG...

  3. Traceability of Requirements and Software Architecture for Change Management

    OpenAIRE

    Göknil, Arda

    2011-01-01

    Software systems are becoming more and more complex. The requirements of software systems change continuously and new requirements emerge frequently. New and/or modified requirements are integrated with the existing ones, and adaptations to the architecture and source code of the system are made. The process of integration of the new/modified requirements and adaptations to the software system is called change management. The size and complexity of software systems make change mana...

  4. A Quantitative Study of Global Software Development Teams, Requirements, and Software Projects

    Science.gov (United States)

    Parker, Linda L.

    2016-01-01

    The study explored the relationship between global software development teams, effective software requirements, and stakeholders' perception of successful software development projects within the field of information technology management. It examined the critical relationship between Global Software Development (GSD) teams creating effective…

  5. Graph Based Verification of Software Evolution Requirements

    NARCIS (Netherlands)

    Ciraci, S.

    2009-01-01

    Due to market demands and changes in the environment, software systems have to evolve. However, the size and complexity of the current software systems make it time consuming to incorporate changes. During our collaboration with the industry, we observed that the developers spend much time on the

  6. Requirements: Towards an understanding on why software projects fail

    Science.gov (United States)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.

    2016-08-01

    Requirements engineering is at the foundation of every successful software project. There are many reasons for software project failures; however, a poorly engineered requirements process contributes immensely to why software projects fail. Software project failure is usually costly and risky and could also be life threatening. Projects that undermine requirements engineering suffer, or are likely to suffer, from failures, challenges and other attendant risks. The cost of project failures and overruns, when estimated, is very high. Furthermore, software project failures or overruns pose a challenge in today's competitive market environment: they affect the company's image, goodwill, and revenue drive and decrease the perceived satisfaction of customers and clients. In this paper, requirements engineering was discussed and its role in software project success was elaborated. The place of the software requirements process in relation to software project failure was explored and examined. Project success and failure factors were also discussed, with emphasis placed on requirements factors as they play a major role in software projects' challenges, successes and failures. The paper relied on secondary data and empirical statistics to explore and examine factors responsible for the successes, challenges and failures of software projects in large, medium and small scaled software companies.

  7. Traceability of Requirements and Software Architecture for Change Management

    NARCIS (Netherlands)

    Göknil, Arda

    2011-01-01

    Software systems are becoming more and more complex. The requirements of software systems change continuously and new requirements emerge frequently. New and/or modified requirements are integrated with the existing ones, and adaptations to the architecture and source code of the system

  8. Section 508 Electronic Information Accessibility Requirements for Software Development

    Science.gov (United States)

    Ellis, Rebecca

    2014-01-01

    Section 508 Subpart B 1194.21 outlines requirements for operating system and software development in order to create a product that is accessible to users with various disabilities. This portion of Section 508 contains a variety of standards to enable those using assistive technology and with visual, hearing, cognitive and motor difficulties to access all information provided in software. The focus on requirements was limited to the Microsoft Windows® operating system as it is the predominant operating system used at this center. Compliance with this portion of the requirements can be obtained by integrating the requirements into the software development cycle early and by remediating issues in legacy software if possible. There are certain circumstances with software that may arise necessitating an exemption from these requirements, such as design or engineering software using dynamically changing graphics or numbers to convey information. These exceptions can be discussed with the Section 508 Coordinator and another method of accommodation used.

  9. Proposing an Evidence-Based Strategy for Software Requirements Engineering.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2016-01-01

    This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based, since it uses publications on the specific problem as a surrogate for stakeholder interests, to formulate risks and testing experiences. This complements the idea that agile software development models are more relevant, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a Software Requirements list used to develop software systems for patient registries.

  10. Optimising software development policies for evolutionary system requirements

    NARCIS (Netherlands)

    Noppen, J.A.R.; Tekinerdogan, B.; Aksit, Mehmet; Glandrup, Maurice; Nicola, V.F.

    2002-01-01

    Anticipating future software requirements might support the evolution of software systems and as such reduce the cost of development and maintenance in due time. Unfortunately identifying the right set of evolution scenarios is difficult due to the uncertainty of occurrence of future requirements.

  11. Safety. [requirements for software to monitor and control critical processes]

    Science.gov (United States)

    Leveson, Nancy G.

    1991-01-01

    Software requirements, design, implementation, verification and validation, and especially management are affected by the need to produce safe software. This paper discusses the changes in the software life cycle that are necessary to ensure that software will execute without resulting in unacceptable risk. Software is being used increasingly to monitor and control safety-critical processes in which a run-time failure or error could result in unacceptable losses such as death, injury, loss of property, or environmental harm. Examples of such processes may be found in transportation, energy, aerospace, basic industry, medicine, and defense systems.

  12. The Use of UML for Software Requirements Expression and Management

    Science.gov (United States)

    Murray, Alex; Clark, Ken

    2015-01-01

    It is common practice to write English-language "shall" statements to embody detailed software requirements in aerospace software applications. This paper explores the use of the UML language as a replacement for the English language for this purpose. Among the advantages offered by the Unified Modeling Language (UML) is a high degree of clarity and precision in the expression of domain concepts as well as architecture and design. Can this quality of UML be exploited for the definition of software requirements? While expressing logical behavior, interface characteristics, timeliness constraints, and other constraints on software using UML is commonly done and relatively straight-forward, achieving the additional aspects of the expression and management of software requirements that stakeholders expect, especially traceability, is far less so. These other characteristics, concerned with auditing and quality control, include the ability to trace a requirement to a parent requirement (which may well be an English "shall" statement), to trace a requirement to verification activities or scenarios which verify that requirement, and to trace a requirement to elements of the software design which implement that requirement. UML Use Cases, designed for capturing requirements, have not always been satisfactory. Some applications of them simply use the Use Case model element as a repository for English requirement statements. Other applications of Use Cases, in which Use Cases are incorporated into behavioral diagrams that successfully communicate the behaviors and constraints required of the software, do indeed take advantage of UML's clarity, but not in ways that support the traceability features mentioned above. Our approach uses the Stereotype construct of UML to precisely identify elements of UML constructs, especially behaviors such as State Machines and Activities, as requirements, and also to achieve the necessary mapping capabilities. We describe this approach in the

  13. Automated hierarchical testable design of digital circuits

    Science.gov (United States)

    Kraak, M.

    1993-03-01

    The thesis gives an overview of approaches dealing with the selection of test strategies and methods for digital circuits and the incorporation of test in designs. A review is provided of existing testability analyzers. A new way to analyze testability at three hierarchical levels of abstraction is presented. It is shown how this approach is contained in an expert system rule-base called TRI Stage Testability Analysis (TRISTAN). The paper then deals with testability synthesis. It is shown that a new synthesis method had to be devised to be able to hierarchically select test strategies and methods. The testability synthesizer is also contained in a rule-base, called Intelligent Synthesis of Testable Designs (ISOLDE). TRISTAN and ISOLDE are parts of an expert system called WAGNER. The knowledge processor for WAGNER is covered, presenting its knowledge representation scheme, knowledge acquisition and inference mechanism. Results of experiments done with WAGNER on board and chip level designs are given. Concluding remarks provide an outlook on continued research.

  14. Semantic-Based Requirements Content Management for Cloud Software

    Directory of Open Access Journals (Sweden)

    Jianqiang Hu

    2015-01-01

    Cloud Software is a complex software system whose topology and behavior can evolve dynamically in Cloud-computing environments. Given the unpredictable, dynamic, elastic, and on-demand nature of the Cloud, it would be unrealistic to assume that traditional software engineering can "cleanly" satisfy the behavioral requirements of Cloud Software. In particular, the majority of traditional requirements management takes document-centric approaches, which have a low degree of automation, coarse-grained management, and limited support for requirements modeling activities. Facing these challenges, and building on the metamodeling framework RGPS (the Role-Goal-Process-Service international standard), this paper first presents a hierarchical framework for semantic-based requirements content management for Cloud Software. It then focuses on some of the important management techniques in this framework, such as the native storage scheme, an ordered index with keywords, requirements instance classification based on linear conditional random fields (CRFs), and a breadth-first search algorithm for associated instances. Finally, a prototype tool called RGPS-RM for semantic-based requirements content management is implemented to provide supporting services for the open requirements process of Cloud Software. The proposed framework applied to Cloud Software development is demonstrated to show its validity and applicability. RGPS-RM also demonstrates the effect of fine-grained retrieval and of the breadth-first search for associated instances through visualization.
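
    As a small illustration of the last technique mentioned above, the sketch below runs a breadth-first search over a toy graph of requirement instances linked by associations; the data model and instance names are hypothetical and are not taken from the RGPS-RM tool.

        from collections import deque

        # Hypothetical association graph: requirement instance -> directly associated instances.
        associations = {
            "Goal:SecureStorage": ["Process:EncryptData", "Role:DataOwner"],
            "Process:EncryptData": ["Service:KeyManagement"],
            "Role:DataOwner": [],
            "Service:KeyManagement": [],
        }

        def associated_instances(start, graph):
            """Return all instances reachable from `start`, in breadth-first order."""
            visited, order = {start}, []
            queue = deque([start])
            while queue:
                node = queue.popleft()
                order.append(node)
                for neighbour in graph.get(node, []):
                    if neighbour not in visited:
                        visited.add(neighbour)
                        queue.append(neighbour)
            return order

        print(associated_instances("Goal:SecureStorage", associations))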

  15. MODIS. Volume 1: MODIS level 1A software baseline requirements

    Science.gov (United States)

    Masuoka, Edward; Fleig, Albert; Ardanuy, Philip; Goff, Thomas; Carpenter, Lloyd; Solomon, Carl; Storey, James

    1994-01-01

    This document describes the level 1A software requirements for the moderate resolution imaging spectroradiometer (MODIS) instrument. This includes internal and external requirements. Internal requirements include functional, operational, and data processing as well as performance, quality, safety, and security engineering requirements. External requirements include those imposed by data archive and distribution systems (DADS); scheduling, control, monitoring, and accounting (SCMA); product management (PM) system; MODIS log; and product generation system (PGS). Implementation constraints and requirements for adapting the software to the physical environment are also included.

  16. Green Software Engineering Adaption In Requirement Elicitation Process

    Directory of Open Access Journals (Sweden)

    Umma Khatuna Jannat

    2015-08-01

    Green software engineering, a recent research direction, investigates environmental concerns in software systems. It is now widely accepted that green practices can be applied across the whole software development process, and they are also suitable for the requirements elicitation process. Software companies today rely heavily on requirements elicitation techniques, because this process plays an increasingly important role in software development, and most elicitation processes are being improved with various techniques and tools. This research therefore suggests adapting green software engineering to existing elicitation techniques and recommends suitable actions for improvement. The research involved qualitative data: keyword searches were run on IEEE, ACM, Springer, Elsevier, Google Scholar, Scopus, and Wiley for articles published between 2010 and 2016. From the literature review, 15 traditional requirements elicitation factors and 23 improvement techniques for moving toward green engineering were identified. The paper closes with a short review of the literature, a description of the grounded theory used, and the findings related to the need for requirements elicitation improvement techniques.

  17. Spectrum analysis on quality requirements consideration in software design documents.

    Science.gov (United States)

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
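
    As a rough, simplified illustration of the "spectrum" idea described above (the paper's actual derivation technique is more elaborate and is not reproduced here), the sketch below counts occurrences of hypothetical quality-attribute keywords in a requirements document and in a design document and compares the two normalized profiles.

        import re

        # Hypothetical keyword lists per quality attribute; a real application
        # would use the paper's term sets and weighting, not these examples.
        KEYWORDS = {
            "performance": ["response time", "throughput", "latency"],
            "security": ["encrypt", "authenticate", "access control"],
            "usability": ["usable", "learnable", "user interface"],
        }

        def spectrum(text):
            """Normalized frequency of each quality attribute's keywords in the text."""
            text = text.lower()
            counts = {attr: sum(len(re.findall(kw, text)) for kw in kws)
                      for attr, kws in KEYWORDS.items()}
            total = sum(counts.values()) or 1
            return {attr: c / total for attr, c in counts.items()}

        requirements_doc = "The system shall encrypt data and keep response time under 2 s."
        design_doc = "The design uses a cache to reduce latency and improve throughput."

        req_spec, des_spec = spectrum(requirements_doc), spectrum(design_doc)
        gaps = {a: round(req_spec[a] - des_spec[a], 2) for a in KEYWORDS}
        print(gaps)  # positive values flag attributes stressed in requirements but not in the design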

  18. More about software requirements thorny issues and practical advice

    CERN Document Server

    Wiegers, Karl E

    2006-01-01

    No matter how much instruction you've had on managing software requirements, there's no substitute for experience. Too often, lessons about requirements engineering processes lack the no-nonsense guidance that supports real-world solutions. Complementing the best practices presented in his book, Software Requirements, Second Edition, requirements engineering authority Karl Wiegers tackles even more of the real issues head-on in this book. With straightforward, professional advice and practical solutions based on actual project experiences, this book answers many of the tough questions rais

  19. Software Requirements Specification for Lunar IceCube

    Science.gov (United States)

    Glaser-Garbrick, Michael R.

    Lunar IceCube is a 6U satellite that will orbit the moon to measure water volatiles, in their various phases, as a function of position, altitude, and time. Lunar IceCube is a collaboration between Morehead State University, Vermont Technical University, Busek, and NASA. The Software Requirements Specification will serve as a contract between the overall team and the developers of the flight software. It will provide a system-level overview of the software that will be developed for Lunar IceCube, in that it will detail all of the interconnects and protocols for each subsystem that Lunar IceCube will utilize. The flight software will be written in SPARK to the fullest extent, due to SPARK's unique ability to make software free of any errors. The LIC flight software does make use of a general purpose, reusable application framework called CubedOS. This framework imposes some structuring requirements on the architecture and design of the flight software, but it does not impose any high level requirements. The document will also detail the tools that we will be using for Lunar IceCube, such as why we will be utilizing VxWorks.

  20. Timing-Driven-Testable Convergent Tree Adders

    Directory of Open Access Journals (Sweden)

    Johnnie A. Huang

    2002-01-01

    Carry lookahead adders have been, over the years, implemented in complex arithmetic units due to their regular structure, which leads to efficient VLSI implementation for fast adders. In this paper, timing-driven testability synthesis is first performed on a tree adder. It is shown that the structure of the tree adder provides for a high fanout with an imbalanced tree structure, which likely contributes to a racing effect and increases the delay of the circuit. The timing optimization is then realized by reducing the maximum fanout of the adder and by balancing the tree circuit. For a 56-b testable tree adder, the optimization produces a 6.37% increase in speed of the critical path while only contributing a 2.16% area overhead. The full testability of the circuit is achieved in the optimized adder design.

  1. A report on NASA software engineering and Ada training requirements

    Science.gov (United States)

    Legrand, Sue; Freedman, Glenn B.; Svabek, L.

    1987-01-01

    NASA's software engineering and Ada skill base are assessed and information that may result in new models for software engineering, Ada training plans, and curricula are provided. A quantitative assessment which reflects the requirements for software engineering and Ada training across NASA is provided. A recommended implementation plan including a suggested curriculum with associated duration per course and suggested means of delivery is also provided. The distinction between education and training is made. Although it was directed to focus on NASA's need for the latter, the key relationships to software engineering education are also identified. A rationale and strategy for implementing a life cycle education and training program are detailed in support of improved software engineering practices and the transition to Ada.

  2. A Method for Software Requirement Volatility Analysis Using QFD

    Directory of Open Access Journals (Sweden)

    Yunarso Anang

    2016-10-01

    Changes to software requirements are inevitable during the development life cycle. Rather than avoiding this circumstance, it is easier to accept it and find a way to anticipate those changes. This paper proposes a method to analyze the volatility of requirements by using the Quality Function Deployment (QFD) method and an introduced degree of volatility. Customer requirements are deployed to software functions and subsequently to architectural design elements. Then, after determining the potential for changes of the design elements, the degree of volatility of the software requirements is calculated. In this paper the method is described using a flow diagram, illustrated using a simple example, and evaluated using a case study.
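
    A rough illustration of the deployment idea described above: requirements can be linked to design elements through QFD-style relationship matrices, and the change potential of design elements can be propagated back to each requirement. The paper's actual matrices and volatility formula are not given in this record, so the weights and the weighted-sum scoring below are purely hypothetical.

        import numpy as np

        # Hypothetical QFD-style relationship matrices (rows deploy onto columns).
        # req_to_func[i, j]: strength with which requirement i relies on function j.
        req_to_func = np.array([[9, 3, 0],
                                [0, 9, 1],
                                [3, 0, 9]], dtype=float)
        # func_to_design[j, k]: strength with which function j relies on design element k.
        func_to_design = np.array([[9, 0],
                                   [3, 9],
                                   [0, 3]], dtype=float)
        # Estimated change potential of each architectural design element (0..1).
        design_change_potential = np.array([0.2, 0.8])

        # Propagate change potential back to requirements via the combined deployment,
        # then normalize to obtain an illustrative "degree of volatility" per requirement.
        raw = req_to_func @ func_to_design @ design_change_potential
        volatility = raw / raw.max()
        for i, v in enumerate(volatility, start=1):
            print(f"Requirement R{i}: degree of volatility = {v:.2f}")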

  3. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    Energy Technology Data Exchange (ETDEWEB)

    D. E. Shropshire; W. H. West

    2005-11-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  4. Facilitating Software Architecting by Ranking Requirements based on their Impact on the Architecture Process

    NARCIS (Netherlands)

    Galster, Matthias; Eberlein, Armin; Sprinkle, J; Sterritt, R; Breitman, K

    2011-01-01

    Ranking software requirements helps decide what requirements to implement during a software development project, and when. Currently, requirements ranking techniques focus on resource constraints or stakeholder priorities and neglect the effect of requirements on the software architecture process.

  5. Solid Waste Information and Tracking System (SWITS) Software Requirements Specification

    Energy Technology Data Exchange (ETDEWEB)

    MAY, D.L.

    2000-03-22

    This document is the primary document establishing requirements for the Solid Waste Information and Tracking System (SWITS) as it is converted to a client-server architecture. The purpose is to provide the customer and the performing organizations with the requirements for the SWITS in the new environment. This Software Requirements Specification (SRS) describes the system requirements for the SWITS Project, and follows the PHMC Engineering Requirements, HNF-PRO-1819, and Computer Software Quality Assurance Requirements, HNF-PRO-309, policies. This SRS includes sections on general description, specific requirements, references, appendices, and index. The SWITS system defined in this document stores information about the solid waste inventory on the Hanford site. Waste is tracked as it is generated, analyzed, shipped, stored, and treated. In addition to inventory reports, a number of reports for regulatory agencies are produced.

  6. On boolean combinations forming piecewise testable languages

    Czech Academy of Sciences Publication Activity Database

    Masopust, Tomáš; Thomazo, M.

    2017-01-01

    Vol. 682, June 19 (2017), pp. 165-179. ISSN 0304-3975. Institutional support: RVO:67985840. Keywords: automata; languages; k-piecewise testability. Subject RIV: BA - General Mathematics. Impact factor: 0.698, year: 2016. http://www.sciencedirect.com/science/article/pii/S030439751730066X

  7. Metric-based method of software requirements correctness improvement

    Directory of Open Access Journals (Sweden)

    Yaremchuk Svitlana

    2017-01-01

    The work highlights the most important principles of software reliability management (SRM). The SRM concept forms a basis for developing a method of requirements correctness improvement. The method assumes that complicated requirements contain more actual and potential design faults/defects. The method applies a new metric to evaluate requirements complexity and a double sorting technique that evaluates the priority and complexity of a particular requirement. The method improves requirements correctness by identifying a higher number of defects with restricted resources. Practical application of the proposed method in the course of requirements review yielded a tangible technical and economic effect.
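
    A minimal sketch of the double-sorting idea mentioned above: requirements are ordered first by priority and then by an estimated complexity score, so that review effort goes first to the items most likely to hide defects. The complexity values and field names are hypothetical; the paper's actual metric is not reproduced here.

        from dataclasses import dataclass

        @dataclass
        class Requirement:
            rid: str
            priority: int      # higher = more important (hypothetical scale 1..5)
            complexity: float  # higher = more complex, assumed to correlate with defect risk

        reqs = [
            Requirement("R1", priority=3, complexity=0.4),
            Requirement("R2", priority=5, complexity=0.9),
            Requirement("R3", priority=5, complexity=0.2),
            Requirement("R4", priority=1, complexity=0.8),
        ]

        # Double sort: primary key priority, secondary key complexity, both descending,
        # so the most important and most complex requirements are reviewed first.
        review_order = sorted(reqs, key=lambda r: (r.priority, r.complexity), reverse=True)
        print([r.rid for r in review_order])  # ['R2', 'R3', 'R1', 'R4']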

  8. REQUIREMENTS FOR DIDACTIC SOFTWARE AIMED AT PRIMARY SCHOOL

    Directory of Open Access Journals (Sweden)

    N. Olefirenko

    2012-07-01

    This article reviews the traditional requirements for didactic software, analyzes the age-related and individual characteristics of younger pupils, and defines additional requirements: the need to rely on a visual representation of information, to let pupils carry out practical actions with objects, to ensure a balance between playful and didactic content, to account for pupils' experience and individual capabilities, and to provide tools for creating situations of success.

  9. A software framework for developing measurement applications under variable requirements.

    Science.gov (United States)

    Arpaia, Pasquale; Buzio, Marco; Fiscarelli, Lucio; Inglese, Vitaliano

    2012-11-01

    A framework for easily developing software for measurement and test applications under highly and rapidly varying requirements is proposed. The framework allows the software quality, in terms of flexibility, usability, and maintainability, to be maximized. Furthermore, the development effort is reduced and finalized by relieving the test engineer of development details. The framework can be configured to satisfy a large set of measurement applications in a generic field for an industrial test division, a test laboratory, or a research center. As an experimental case study, the design, implementation, and assessment of the framework applied to a magnet-testing measurement scenario at the European Organization for Nuclear Research are reported.

  10. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally-stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.
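
    The buffer example lends itself to a toy illustration of the paradigm: a requirement stated declaratively, from which an implementation is derived mechanically. The tiny "specification language" (a dict with a capacity and an overflow policy) and the generator below are entirely hypothetical and only hint at what transformational synthesis automates.

        from collections import deque

        # Hypothetical declarative requirement for one buffer.
        buffer_requirement = {"name": "telemetry", "capacity": 4, "on_full": "drop_oldest"}

        def synthesize_buffer(req):
            """Derive a buffer implementation from a declarative requirement."""
            if req["on_full"] == "drop_oldest":
                # A bounded deque silently discards the oldest element when full.
                return deque(maxlen=req["capacity"])
            if req["on_full"] == "reject":
                class RejectingBuffer(list):
                    def append(self, item):
                        if len(self) >= req["capacity"]:
                            raise OverflowError(f"buffer '{req['name']}' is full")
                        super().append(item)
                return RejectingBuffer()
            raise ValueError(f"unsupported policy: {req['on_full']}")

        buf = synthesize_buffer(buffer_requirement)
        for sample in range(6):
            buf.append(sample)
        print(list(buf))  # [2, 3, 4, 5]: oldest samples were dropped, as required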

  11. SE and I system testability: The key to space system FDIR and verification testing

    Science.gov (United States)

    Barry, Thomas; Scheffer, Terrance; Small, Lynn R.; Monis, Richard

    1990-01-01

    The key to implementing self-diagnosing design is a systems engineering task focused on design for testability concurrent with design for functionality. The design for testability process described here is the product of several years of DOD study and experience. Its application to the space station has begun on Work Package II under NASA and McDonnell direction. Other work package teams are being briefed by Harris Corporation with the hope of convincing them to embrace the process. For the purpose of this discussion, the term testability is used to describe the systems engineering process by which designers can assure themselves and their reviewers that their designs are TESTABLE, that is, they will support the downstream process of determining their functionality. Due to the complexity and density of present-day state-of-the-art designs, such as pipeline processors and high-speed integrated circuit technology, testability feature design is a critical requirement of the functional design process. A systematic approach to space systems test and checkout, as well as fault detection, fault isolation, and reconfiguration (FDFIR), will minimize operational costs and maximize operational efficiency. An effective design-for-testability program must be implemented by all contractors to ensure meeting this objective. The process is well understood and technology is here to support it.

  12. Software use cases to elicit the software requirements analysis within the ASTRI project

    Science.gov (United States)

    Conforti, Vito; Antolini, Elisa; Bonnoli, Giacomo; Bruno, Pietro; Bulgarelli, Andrea; Capalbi, Milvia; Fioretti, Valentina; Fugazza, Dino; Gardiol, Daniele; Grillo, Alessandro; Leto, Giuseppe; Lombardi, Saverio; Lucarelli, Fabrizio; Maccarone, Maria Concetta; Malaguti, Giuseppe; Pareschi, Giovanni; Russo, Federico; Sangiorgi, Pierluca; Schwarz, Joseph; Scuderi, Salvatore; Tanci, Claudio; Tosti, Gino; Trifoglio, Massimo; Vercellone, Stefano; Zanmar Sanchez, Ricardo

    2016-07-01

    The Italian National Institute for Astrophysics (INAF) is leading the Astrofisica con Specchi a Tecnologia Replicante Italiana (ASTRI) project whose main purpose is the realization of small size telescopes (SST) for the Cherenkov Telescope Array (CTA). The first goal of the ASTRI project has been the development and operation of an innovative end-to-end telescope prototype using a dual-mirror optical configuration (SST-2M) equipped with a camera based on silicon photo-multipliers and very fast read-out electronics. The ASTRI SST-2M prototype has been installed in Italy at the INAF "M.G. Fracastoro" Astronomical Station located at Serra La Nave, on Mount Etna, Sicily. This prototype will be used to test several mechanical, optical, control hardware and software solutions which will be used in the ASTRI mini-array, comprising nine telescopes proposed to be placed at the CTA southern site. The ASTRI mini-array is a collaborative and international effort led by INAF and carried out by Italy, Brazil and South-Africa. We present here the use cases, through UML (Unified Modeling Language) diagrams and text details, that describe the functional requirements of the software that will manage the ASTRI SST-2M prototype, and the lessons learned thanks to these activities. We intend to adopt the same approach for the Mini Array Software System that will manage the ASTRI miniarray operations. Use cases are of importance for the whole software life cycle; in particular they provide valuable support to the validation and verification activities. Following the iterative development approach, which breaks down the software development into smaller chunks, we have analysed the requirements, developed, and then tested the code in repeated cycles. The use case technique allowed us to formalize the problem through user stories that describe how the user procedurally interacts with the software system. Through the use cases we improved the communication among team members, fostered

  13. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed a software to help manage the large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis core photos and images; waveforms and NMR; and external files documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed a software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes an emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  14. SWEPP assay system version 2.0 software requirements specification

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, S.D.; East, L.V.; Marwil, E.S.; Ferguson, J.J.

    1996-06-01

    The INEL Stored Waste Examination Pilot Plant (SWEPP) operations staff use nondestructive analysis methods to characterize the radiological contents of contact-handled radioactive waste containers. Containers of waste from Rocky Flats Environmental Technology Site and other DOE sites are currently stored at SWEPP. Before these containers can be shipped to WIPP, SWEPP must verify compliance with storage, shipping, and disposal requirements. One part of the SWEPP program measures neutron emissions from the containers and estimates the mass of Pu and other transuranic isotopes present. The code NEUT2 was originally used to perform data acquisition and reduction; the SWEPP Assay System (SAS) code replaced NEUT2 in early 1994. This document specifies the requirements for the SAS software as installed at INEL and was written to comply with RWMC (INEL Radioactive Waste Management Complex) quality requirements.

  15. Integrated Modular Avionics for Spacecraft Software Architecture and Requirements

    Science.gov (United States)

    Deredempt, Marie-Helene; Rossignol, Alain; Windsor, James; De-Ferluc, Regis; Sanmarti, Joaquim; Thorn, Jason; Parisis, Paul; Quartier, Fernand; Vatrinet, Francis; Schoofs, Tobias; Crespo, Alfons; Galizzi, Julien; Garcia, Gerald; Arberet, Paul

    2012-08-01

    Designers of space systems for scientific, observation, exploration and telecom missions now face requirements such as long lifetime, autonomy and guaranteed safe operation in case of failure. New technical and industrial challenges add further complexity. The key to the success of future industrial projects is to meet the increasing demand for on-board processing by designing more scalable and modular architectures that enable new missions while improving the lifecycle, the costs of design and qualification, and security. Focusing on data processing, time and space partitioning, a technology introduced as part of Integrated Modular Avionics (IMA), trialled in the aeronautical domain and industrialized in the new generation of aircraft, was first analyzed for security and feasibility in the space domain by an ESA project and working group on secure partitioning. To complete these studies, the current ESA project Integrated Modular Avionics for Space aims to confirm the feasibility of time and space partitioning in the space domain using existing hardware and based on ARINC653. By combining the efforts of industrial partners, the main goals of the IMA for Space (IMA SP) project are first to address major topics such as the computational model, the impact of caches, the impact on processes and tools, Failure Detection, Isolation and Recovery (FDIR), maintenance and I/O management in order to consolidate requirements; then to develop software solutions that meet those requirements; and lastly to implement these solutions in a demonstration phase with operational software.

  16. Testability of evolutionary game dynamics based on experimental economics data

    Science.gov (United States)

    Wang, Yijia; Chen, Xiaojie; Wang, Zhijian

    2017-11-01

    Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.
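
    A rough sketch of the two measurement variables named above (the exact estimators used in the paper may differ): the angular momentum and speed of a trajectory of observed social states can be computed as follows; the sample trajectory is made up for illustration.

        import numpy as np

        # Hypothetical trajectory of social states projected onto the plane: each row is
        # the population share playing Rock and Paper at one observation period
        # (the Scissors share is 1 - rock - paper and is omitted in this 2D projection).
        trajectory = np.array([[0.5, 0.3],
                               [0.4, 0.4],
                               [0.3, 0.35],
                               [0.35, 0.25],
                               [0.45, 0.25]])

        center = trajectory.mean(axis=0)          # reference point for the rotation
        steps = np.diff(trajectory, axis=0)       # displacement per period
        speed = np.linalg.norm(steps, axis=1)     # distance moved per period

        # Angular momentum per step: cross product of (position - center) with the step;
        # its sign indicates the direction of cycling around the center.
        rel = trajectory[:-1] - center
        angular_momentum = rel[:, 0] * steps[:, 1] - rel[:, 1] * steps[:, 0]

        print("mean speed:", speed.mean())
        print("mean angular momentum:", angular_momentum.mean())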

  17. From Regular to Strictly Locally Testable Languages

    Directory of Open Access Journals (Sweden)

    Stefano Crespi Reghizzi

    2011-08-01

    A classical result (often credited to Y. Medvedev) states that every language recognized by a finite automaton is the homomorphic image of a local language over a much larger, so-called local, alphabet, namely the alphabet of the edges of the transition graph. Local languages are characterized by the value k=2 of the sliding-window width in McNaughton and Papert's infinite hierarchy of strictly locally testable languages (k-slt). We generalize Medvedev's result in a new direction, studying the relationship between the width and the alphabetic ratio, which tells how much larger the local alphabet is. We prove that every regular language is the image of a k-slt language on an alphabet of doubled size, where the width depends logarithmically on the automaton size, and we exhibit regular languages for which any smaller alphabetic ratio is insufficient. More generally, we express the trade-off between alphabetic ratio and width as a mathematical relation derived from a careful encoding of the states. Finally, we mention some directions for theoretical development and application.
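
    To make the k=2 case concrete, a local language can be decided with a sliding window of width two; the sketch below (Python, with an invented example language, not taken from the paper) checks the first letter, the last letter and every length-2 factor of a word:

      def in_local_language(word, first, last, factors):
          """Membership test for a local (2-slt) language given its allowed
          first letters, last letters and length-2 factors."""
          if not word:
              return False
          return (word[0] in first
                  and word[-1] in last
                  and all(word[i:i + 2] in factors for i in range(len(word) - 1)))

      # Hypothetical example over {a, b}: words that start with 'a', end with 'b',
      # and never contain the factor 'bb'.
      first, last, factors = {"a"}, {"b"}, {"aa", "ab", "ba"}
      print(in_local_language("aabab", first, last, factors))  # True
      print(in_local_language("abba", first, last, factors))   # False ('bb' occurs)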

  18. How do Quality Requirements Contribute to Software Sustainability?

    NARCIS (Netherlands)

    Condori-Fernandez, O.N.; Lago, P.; Calero, Coral

    2016-01-01

    The concept of sustainable development has become an important objective of policy makers in the software industry. The most used definition of sustainability refers to dimensions of economic sustainability to ensure that software services can create economic value; technical sustainability that

  19. Toward an Agile Approach to Managing the Effect of Requirements on Software Architecture during Global Software Development

    Directory of Open Access Journals (Sweden)

    Abdulaziz Alsahli

    2016-01-01

    Requirement change management (RCM) is a critical activity during software development because poor RCM results in the occurrence of defects, thereby resulting in software failure. To achieve RCM, efficient impact analysis is mandatory. A common repository is a good approach to maintaining changed requirements, promoting reuse and reducing effort. Thus, a better approach is needed to tailor knowledge for better change management of requirements and architecture during global software development (GSD). The objective of this research is to introduce an innovative approach for handling requirements and architecture changes simultaneously during global software development. The approach makes use of Case-Based Reasoning (CBR) and agile practices. Agile practices make our approach iterative, whereas CBR stores requirements and makes them reusable. Twin Peaks is our base model, meaning that requirements and architecture are handled simultaneously. For this research, grounded theory has been applied, and interviews with domain experts were conducted. Interview and literature transcripts formed the basis of data collection in grounded theory. Physical saturation of the theory has been achieved through a published case study and a developed tool. Expert reviews and statistical analysis have been used for evaluation. The proposed approach resulted in effective change management of requirements and architecture simultaneously during global software development.

  20. Characterizing the contribution of quality requirements to software sustainability

    NARCIS (Netherlands)

    Condori-Fernandez, Nelly; Lago, Patricia

    2018-01-01

    Most respondents considered modifiability as relevant for addressing both technical and environmental sustainability. Functional correctness, availability, modifiability, interoperability and recoverability favor positively the endurability of software systems. This study has also identified

  1. The Assessment and Foundation of Bell-Shaped Testability Growth Effort Functions Dependent System Testability Growth Models Based on NHPP

    Directory of Open Access Journals (Sweden)

    Tian-Mei Li

    2015-01-01

    This paper investigates a type of STGM (system testability growth model) based on the nonhomogeneous Poisson process which incorporates a TGEF (testability growth effort function). First, we analyze the process of TGT (testability growth test) for equipment, which shows that the TGT can be divided into two steps: putting the unit under test into a faulty condition to identify a TDL (testability design limitation), and removing the TDL. We consider that the amount of TGF (testability growth effort) spent on identifying TDLs is a crucial issue which decides the shape of the testability growth curve, and that the TGF first increases and then decreases, at different rates, over the whole life cycle. Furthermore, we incorporate five TGEFs into the STGM: an exponential curve, a Rayleigh curve, a logistic curve, a delayed S-shaped curve and an inflected S-shaped curve, collectively referred to as Bell-shaped TGEFs. Results from applications to a real data set of a stable tracking platform are analyzed and evaluated with respect to testability prediction capability; they show that the Bell-shaped functions can be expressed as TGF curves and that the logistic-TGEF-dependent STGM gives better predictions on this data set.
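
    The five curve families can be pictured with closed forms that are common in the effort-function literature (the exact parameterizations in the paper may differ, so the Python sketch below is only indicative; alpha is the total effort eventually consumed, beta a scale parameter, A and psi shape parameters):

      import math

      def exponential(t, alpha, beta):
          return alpha * (1 - math.exp(-beta * t))

      def rayleigh(t, alpha, beta):
          return alpha * (1 - math.exp(-beta * t ** 2 / 2))

      def logistic(t, alpha, beta, A):
          return alpha / (1 + A * math.exp(-beta * t))

      def delayed_s_shaped(t, alpha, beta):
          return alpha * (1 - (1 + beta * t) * math.exp(-beta * t))

      def inflected_s_shaped(t, alpha, beta, psi):
          return alpha * (1 - math.exp(-beta * t)) / (1 + psi * math.exp(-beta * t))

      # Cumulative effort W(t) under the (hypothetical) logistic curve.
      for t in (0, 5, 10, 20):
          print(t, round(logistic(t, alpha=100, beta=0.4, A=20), 2))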

  2. Enhancing requirements engineering for patient registry software systems with evidence-based components.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2017-07-01

    Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches in the selection of software as well as the construction of proprietary systems are needed. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) A systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates and standards show a broad consensus but differences in issues regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be

  3. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of); Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Software in PLCs and FPGAs used to develop I&C systems should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests analysis techniques for software-affected hazards and states that software hazard analysis should be performed across the software life cycle phases such as requirements analysis, design, detailed design and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (hazard and operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430 and is a useful technique for applying guide phrases; HAZOP is sometimes used to analyze the safety of software. The NUREG/CR-6430 analysis method had previously been used for PLC software in Korean nuclear power plant development, where appropriate guide phrases and an analysis process were selected for efficient application and NUREG/CR-6430 was identified as providing applicable methods for software hazard analysis. We perform software hazard analysis of an FPGA software requirements specification with two approaches, NUREG/CR-6430 and HAZOP with general guide words, and carry out a comparative analysis of them. The NUREG/CR-6430 approach has several pros and cons compared with HAZOP using general guide words, and it is sufficiently applicable for analyzing the software requirements specification of an FPGA.
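
    For readers unfamiliar with the guide-phrase technique, the general idea can be sketched as pairing guide words with attributes of a requirement to generate deviation prompts for the analysis team; the sketch below (Python) uses the generic HAZOP guide-word set and a hypothetical requirement, not the tailored phrases of NUREG/CR-6430:

      GUIDE_WORDS = ["no", "more", "less", "as well as", "part of", "reverse", "other than"]

      def deviation_prompts(requirement, attributes):
          """Combine generic guide words with requirement attributes to produce
          deviation questions for a HAZOP-style review."""
          return [f"Deviation '{gw}' on '{attr}': how could [{requirement}] deviate "
                  f"this way, and what hazard would result?"
                  for attr in attributes for gw in GUIDE_WORDS]

      # Hypothetical requirement used only for illustration.
      req = "Trip signal shall be issued within 100 ms of an over-temperature condition"
      for prompt in deviation_prompts(req, ["timing", "value", "sequence"])[:4]:
          print(prompt)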

  4. Security Requirements Management in Software Product Line Engineering

    Science.gov (United States)

    Mellado, Daniel; Fernández-Medina, Eduardo; Piattini, Mario

    Security requirements engineering is both a central task and a critical success factor in product line development due to the complexity and extensive nature of product lines. However, most of the current product line practices in requirements engineering do not adequately address security requirements engineering. Therefore, in this chapter we will propose a security requirements engineering process (SREPPLine) driven by security standards and based on a security requirements decision model along with a security variability model to manage the variability of the artefacts related to security requirements. The aim of this approach is to deal with security requirements from the early stages of the product line development in a systematic way, in order to facilitate conformance with the most relevant security standards with regard to the management of security requirements, such as ISO/IEC 27001 and ISO/IEC 15408.

  5. Views on Software Engineering from the Twin Peaks of Requirements and Architecture

    NARCIS (Netherlands)

    Galster, Matthias; Mirakhorli, Mehdi; Cleland-Huang, Jane; Burge, Janet E.; Franch, Xavier; Roshandel, Roshanak; Avgeriou, Paris

    2013-01-01

    The disciplines of requirements engineering (RE) and software architecture (SA) are fundamental to the success of software projects. Even though RE and SA are often considered in isolation, drawing a line between RE and SA is neither feasible nor reasonable as requirements and architectural design

  6. Space shuttle orbital maneuvering system failure detection and identification software requirements (uncontrolled)

    Science.gov (United States)

    Damario, L. A.; Vullo, J. P.

    1976-01-01

    Candidate designs and their software implementation are presented for the Orbital Maneuvering System (OMS) Failure Detection and Identification (FDI) algorithms in the Redundancy Management (RM) module of the Space Shuttle Guidance, Navigation, and Control (GN&C) software. The OMS engine FDI algorithm monitors OMS engine thrust performance, and the OMS actuator FDI algorithm monitors OMS gimbal actuator performance. The software functional requirements of the algorithms are described along with the objective of each algorithm. For each algorithm, a list of the assumptions that have governed its design, input/output requirements, a functional description (including a functional block diagram), and input interface requirements are given. The HAL software formulation of the algorithms (HAL being the language of the space shuttle flight computer) is considered, including structured flowcharts of the procedures, estimates of flight computer core storage and CPU time, and processing requirements. A glossary of the symbols used to define the software requirements and formulations is included.
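
    The flavor of such monitoring can be sketched as a threshold test with a persistence counter on the discrepancy between expected and sensed behavior; this is only an illustration with made-up numbers, not the HAL formulation described in the document:

      def fdi_monitor(expected, measured, threshold, persistence):
          """Declare a failure when |expected - measured| exceeds a threshold
          for a given number of consecutive samples."""
          count = 0
          for k, (e, m) in enumerate(zip(expected, measured)):
              count = count + 1 if abs(e - m) > threshold else 0
              if count >= persistence:
                  return k  # sample index at which the failure is declared
          return None

      # Hypothetical commanded vs sensed chamber-pressure samples.
      expected = [100.0] * 10
      measured = [99.5, 100.2, 99.8, 80.0, 79.0, 78.5, 100.1, 100.0, 99.9, 100.0]
      print(fdi_monitor(expected, measured, threshold=5.0, persistence=3))  # -> 5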

  7. Assessing students' performance in software requirements engineering education using scoring rubrics

    Science.gov (United States)

    Mkpojiogu, Emmanuel O. C.; Hussain, Azham

    2017-10-01

    The study investigates how helpful the use of scoring rubrics is in the performance assessment of software requirements engineering students, and whether their use can lead to improved student performance in the development of software requirements artifacts and models. Scoring rubrics were used by two instructors to assess the cognitive performance of a student in the design and development of software requirements artifacts. The study results indicate that scoring rubrics are very helpful for objectively assessing the performance of software requirements or software engineering students. Furthermore, the results reveal that scoring rubrics also give a clear direction of achievement, showing whether or not a student is improving across repeated or iterative assessments. In a nutshell, their use leads to improved student performance. The results provide insights for further investigation and will be beneficial to researchers, requirements engineers, system designers, developers and project managers.

  8. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper a prototype of a Requirements Tracking and Verification System (RTVS) for a Distributed Control System (DCS) was implemented and tested. The RTVS is a software design and verification tool whose main functions are the management, tracking and verification of the software requirements listed in the documentation of the DCS. An analysis of DCS software design procedures and document interfaces was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  9. Surveillance Analysis Computer System (SACS): Software requirements specification (SRS). Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    Glasscock, J.A.

    1995-03-08

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) database, an Impact Level 3Q system. SACS stores information on tank temperatures, surface levels, and interstitial liquid levels. This information is retrieved by the customer through a PC-based interface and is then available to a number of other software tools. The software requirements specification (SRS) describes the system requirements for the SACS Project, and follows the Standard Engineering Practices (WHC-CM-6-1), Software Practices (WHC-CM-3-10) and Quality Assurance (WHC-CM-4-2, QR 19.0) policies.

  10. Questioning the Role of Requirements Engineering in the Causes of Safety-Critical Software Failures

    Science.gov (United States)

    Johnson, C. W.; Holloway, C. M.

    2006-01-01

    Many software failures stem from inadequate requirements engineering. This view has been supported both by detailed accident investigations and by a number of empirical studies; however, such investigations can be misleading. It is often difficult to distinguish between failures in requirements engineering and problems elsewhere in the software development lifecycle. Further pitfalls arise from the assumption that inadequate requirements engineering is a cause of all software related accidents for which the system fails to meet its requirements. This paper identifies some of the problems that have arisen from an undue focus on the role of requirements engineering in the causes of major accidents. The intention is to provoke further debate within the emerging field of forensic software engineering.

  11. Towards an Early Software Effort Estimation Based on Functional and Non-Functional Requirements

    NARCIS (Netherlands)

    Kassab, M.; Daneva, Maia; Ormanjieva, Olga; Abran, A.; Braungarten, R.; Dumke, R.; Cuadrado-Gallego, J.; Brunekreef, J.

    2009-01-01

    The increased awareness of the non-functional requirements as a key to software project and product success makes explicit the need to include them in any software project effort estimation activity. However, the existing approaches to defining size-based effort relationships still pay insufficient

  12. Separability by piecewise testable languages is PTime-complete

    Czech Academy of Sciences Publication Activity Database

    Masopust, Tomáš

    2018-01-01

    Vol. 711, February 8 (2018), pp. 109-114. ISSN 0304-3975. Institutional support: RVO:67985840. Keywords: separability; piecewise testable languages; complexity. Subject RIV: BA - General Mathematics. Impact factor: 0.698, year: 2016. https://www.sciencedirect.com/science/article/pii/S0304397517308319?via%3Dihub

  13. TileCal ROD Hardware and Software Requirements

    CERN Document Server

    Castelo, J; Cuenca, C; Ferrer, A; Fullana, E; Higón, E; Iglesias, C; Munar, A; Poveda, J; Ruiz-Martínez, A; Salvachúa, B; Solans, C; Valls, J A

    2005-01-01

    In this paper we present the specific hardware and firmware requirements and modifications to operate the Liquid Argon Calorimeter (LiArg) ROD motherboard in the Hadronic Tile Calorimeter (TileCal) environment. Although the use of the board is similar for both calorimeters there are still some differences in the operation of the front-end associated to both detectors which make the use of the same board incompatible. We review the evolution of the design of the ROD from the early prototype stages (ROD based on commercial and Demonstrator boards) to the production phases (ROD final board based on the LiArg design), with emphasis on the different operation modes for the TileCal detector. We start with a short review of the TileCal ROD system functionality and then we detail the different ROD hardware requirements for options, the baseline (ROD Demo board) and the final (ROD final high density board). We also summarize the performance parameters of the ROD motherboard based on the final high density option and s...

  14. Basic Requirements for Systems Software Research and Development

    Science.gov (United States)

    Kuszmaul, Chris; Nitzberg, Bill

    1996-01-01

    Our success over the past ten years evaluating and developing advanced computing technologies has been due to a simple research and development (R/D) model. Our model has three phases: (a) evaluating the state-of-the-art, (b) identifying problems and creating innovations, and (c) developing solutions, improving the state-of-the-art. This cycle has four basic requirements: a large production testbed with real users, a diverse collection of state-of-the-art hardware, facilities for evaluation of emerging technologies and development of innovations, and control over system management on these testbeds. Future research will be irrelevant and future products will not work if any of these requirements is eliminated. In order to retain our effectiveness, the numerical aerospace simulator (NAS) must replace out-of-date production testbeds in as timely a fashion as possible, and cannot afford to ignore innovative designs such as new distributed shared memory machines, clustered commodity-based computers, and multi-threaded architectures.

  15. Report on the working conference on requirements engineering: foundation for software quality (REFSQ'09)

    NARCIS (Netherlands)

    Glinz, Martin; Heymans, Patrick; Persson, Anne; Sindre, Guttorm; Aurum, Aybüke; Madhavji, Nazim; Madhavji, N.; Paech, Barbara; Regev, Gil; Wieringa, Roelf J.

    This report summarizes the presentations and discussions at REFSQ’09, the 15th International Working Conference on Requirements Engineering: Foundation for Software Quality which was held on June 8-9, 2009 in Amsterdam, The Netherlands.

  16. Digital flight control software design requirements. [for space shuttle orbiter

    Science.gov (United States)

    1973-01-01

    The objective of the integrated digital flight control system is to provide rotational and translational control of the space shuttle orbiter in all phases of flight: from launch ascent through orbit to entry and touchdown, and during powered horizontal flights. The program provides a versatile control system structure while maintaining uniform communications with other programs, sensors, and control effectors by using an executive routine/function subroutine format. The program reads all external variables at a single point, copies them into its dedicated storage, and then calls the required subroutines in the proper sequence. As a result, the flight control program is largely independent of other programs in the GN and C computer complex and is equally insensitive to the characteristics of the processor configuration. The integrated structure of the control system and the DFCS executive routine which embodies that structure are described. The specific estimation and control algorithms used in the various mission phases are shown. Attitude maneuver routines that interface with the DFCS are also described.
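
    The executive routine/function subroutine format can be pictured with a small sketch (Python, with hypothetical subroutine names): inputs are read at a single point, copied into the program's own storage, and the phase-specific subroutines are then called in sequence against that copy:

      class FlightControlExecutive:
          """Executive-routine pattern: one input-read point, dedicated storage,
          then subroutine calls in a fixed, phase-dependent sequence."""

          def __init__(self, phase_subroutines):
              self.state = {}                        # dedicated storage
              self.phase_subroutines = phase_subroutines

          def cycle(self, external_variables, phase):
              self.state = dict(external_variables)  # read and copy inputs once
              for subroutine in self.phase_subroutines[phase]:
                  subroutine(self.state)             # subroutines see only the copy

      # Hypothetical subroutines for an 'orbit' phase.
      def estimate_attitude(s): s["attitude_est"] = 0.98 * s["gyro"]
      def compute_command(s): s["cmd"] = -0.5 * s["attitude_est"]

      executive = FlightControlExecutive({"orbit": [estimate_attitude, compute_command]})
      executive.cycle({"gyro": 0.02}, phase="orbit")
      print(executive.state)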

  17. The seesaw portal in testable models of neutrino masses

    Science.gov (United States)

    Caputo, A.; Hernández, P.; López-Pavón, J.; Salvado, J.

    2017-06-01

    A Standard Model extension with two Majorana neutrinos can explain the measured neutrino masses and mixings, and also account for the matter-antimatter asymmetry in a region of parameter space that could be testable in future experiments. The testability of the model relies to some extent on its minimality. In this paper we address the possibility that the model might be extended by extra generic new physics which we parametrize in terms of a low-energy effective theory. We consider the effects of the operators of the lowest dimensionality, d = 5, and evaluate the upper bounds on the coefficients so that the predictions of the minimal model are robust. One of the operators gives a new production mechanism for the heavy neutrinos at the LHC via Higgs decays. The Higgs can decay to a pair of such neutrinos that, being long-lived, leave a powerful signal of two displaced vertices. We estimate the LHC reach for this process.

  18. Research preview: Prioritizing quality requirements based on software architecture evaluation feedback

    OpenAIRE

    Koziolek, Anne

    2012-01-01

    Context and motivation Quality requirements are a main driver for architectural decisions of software systems. Although the need for iterative handling of requirements and architecture has been identified, current architecture design processes do not provide systematic, quantitative feedback for the prioritization and cost/benefit considerations for quality requirements. Question/problem Thus, in practice stakeholders still often state and prioritize quality requirements before knowing the so...

  19. Qualification of Simulation Software for Safety Assessment of Sodium Cooled Fast Reactors. Requirements and Recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Nicholas R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pointer, William David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sieger, Matt [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Flanagan, George F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Moe, Wayne [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Holbrook, Mark [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)]

    2016-04-01

    The goal of this review is to enable application of codes or software packages for safety assessment of advanced sodium-cooled fast reactor (SFR) designs. To address near-term programmatic needs, the authors have focused on two objectives. First, the authors have focused on identification of requirements for software QA that must be satisfied to enable the application of software to future safety analyses. Second, the authors have collected best practices applied by other code development teams to minimize cost and time of initial code qualification activities and to recommend a path to the stated goal.

  20. Integrated Requirement Selection and Scheduling for the Release Planning of a Software Product

    NARCIS (Netherlands)

    Li, C.; Akkermans, J.M.; Brinkkemper, S.; Diepen, G.

    2007-01-01

    This paper investigates two integer linear programming models that integrate requirement scheduling into software release planning. The first model can schedule the development of the requirements for the new release exactly in time so that the project span is minimized and the resource and
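
    The selection side of such a model can be illustrated with a toy formulation (Python, brute force over subsets rather than an integer programming solver; requirement names, revenues and efforts are invented): choose the requirements that maximize revenue subject to a capacity for the next release. The scheduling component of the models is omitted here.

      from itertools import combinations

      # Hypothetical requirements: (name, revenue, effort in person-days).
      reqs = [("R1", 90, 30), ("R2", 60, 20), ("R3", 50, 25), ("R4", 40, 10), ("R5", 30, 15)]
      CAPACITY = 60  # person-days available for the next release

      best = (0, ())
      for r in range(len(reqs) + 1):
          for subset in combinations(reqs, r):
              effort = sum(e for _, _, e in subset)
              revenue = sum(v for _, v, _ in subset)
              if effort <= CAPACITY and revenue > best[0]:
                  best = (revenue, tuple(name for name, _, _ in subset))

      print(best)  # (190, ('R1', 'R2', 'R4')) for the numbers above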

  1. An integrated approach for requirement selection and scheduling in software release planning

    NARCIS (Netherlands)

    Li, C.; van den Akker, Marjan; Brinkkemper, Sjaak; Diepen, Guido

    It is essential for product software companies to decide which requirements should be included in the next release and to make an appropriate time plan of the development project. Compared to the extensive research done on requirement selection, very little research has been performed on time

  2. Effects of Using Requirements Catalogs on Effectiveness and Productivity of Requirements Specification in a Software Project Management Course

    Science.gov (United States)

    Fernández-Alemán, José Luis; Carrillo-de-Gea, Juan Manuel; Meca, Joaquín Vidal; Ros, Joaquín Nicolás; Toval, Ambrosio; Idri, Ali

    2016-01-01

    This paper presents the results of two educational experiments carried out to determine whether the process of specifying requirements (catalog-based reuse as opposed to conventional specification) has an impact on effectiveness and productivity in co-located and distributed software development environments. The participants in the experiments…

  3. A Systematic Mapping on Supporting Approaches for Requirements Traceability in the Context of Software Projects

    Directory of Open Access Journals (Sweden)

    MALCHER, P R.C.

    2015-12-01

    Requirements traceability is seen as a quality factor in software development and is present in standards and quality models. In this context, several techniques, models, frameworks and tools have been used to support it. The purpose of this paper is to present a systematic mapping carried out to find, in the literature, approaches that support requirements traceability in the context of software projects, and to categorize the data found in order to demonstrate, by means of a reliable, accurate and auditable method, how this area has developed and what the main approaches used to implement it are.

  4. Dependencies among Architectural Views Got from Software Requirements Based on a Formal Model

    Directory of Open Access Journals (Sweden)

    Osis Janis

    2014-12-01

    A system architect has software requirements and some unspecified knowledge about a problem domain (e.g., an enterprise) as source information for assessing and evaluating possible solutions and reaching the target point, a preliminary software design. The deciding factor is the architect’s experience and expertise in the problem domain (“AS-IS”). The proposed approach is dedicated to assisting a system architect in making an appropriate decision on the solution (“TO-BE”). It is based on a formal mathematical model, the Topological Functioning Model (TFM). Compliant TFMs can be transformed into software architectural views. The paper demonstrates and discusses tracing dependency links from the requirements to and between the architectural views.

  5. A Systematic Mapping Study on Empirical Evaluation of Software Requirements Specifications Techniques

    NARCIS (Netherlands)

    Condori-Fernandez, Nelly; Condori-Fernandez, Nelly; Daneva, Maia; Sikkel, Nicolaas; Wieringa, Roelf J.; Dieste, Oscar; Pastor, Oscar; Williams, L.; Miller, J.; Selby, R.

    2009-01-01

    This paper describes an empirical mapping study, which was designed to identify what aspects of Software Requirement Specifications (SRS) are empirically evaluated, in which context, and by using which research method. On the basis of 46 identified and categorized primary studies, we found that

  6. MODSARE-V: Validation of Dependability and Safety Critical Software Components with Model Based Requirements

    Science.gov (United States)

    Silveira, Daniel T. de M. M.; Schoofs, Tobias; Alana Salazar, Elena; Rodriguez Rodriguez, Ana Isabel; Devic, Marie-Odile

    2010-08-01

    The wide use of RAMS methods and techniques [1] (e.g. SFMECA, SFTA, HAZOP, HA...) in critical software development has resulted in the specification of new software requirements, design constraints and other issues such as mandatory coding rules. Given the large variety of RAMS requirements and techniques, different types of Verification and Validation (V&V) [14] are spread over the phases of the software engineering process. As a result, the V&V process becomes complex, and the cost and time required for a complete and consistent V&V process increase. By introducing the concept of a model-based approach to facilitate the RAMS requirements definition process, the V&V time and effort may be reduced. MODSARE-V demonstrates the feasibility of this concept based on case studies applied to ground or on-board space software projects with critical functions/components. This paper describes the approach adopted in MODSARE-V to realize the concept in a prototype and summarizes the results and conclusions reached after applying the prototype to the case studies.

  7. Evaluation of Using Course-Management Software: Supplementing a Course that Requires a Group Research Project

    Science.gov (United States)

    Korchmaros, Josephine D.; Gump, Nathaniel W.

    2009-01-01

    The benefits of course-management software (CMS) will not be realized if it is underused. The authors investigated one possible barrier to CMS use, students' perceptions of using CMS. After taking a course requiring a group research project, college students reported their perceptions of the use of CMS for the course. Overall, students did not…

  8. Second International Workshop on From Software Requirements to Architectures (STRAW'03)

    NARCIS (Netherlands)

    Berry, Daniel M.; Kazman, Rick; Wieringa, Roelf J.

    The Second International Workshop on From Software Requirements to Architectures (STRAW'03) was held in Portland, Oregon, USA on 9 May 2003, just after the Twenty-Fifth International Conference on Software Engineering (ICSE'03). This brief paper outlines the motivation, goals, and organization of the

  9. Design Requirements, Epistemic Uncertainty and Solution Development Strategies in Software Design

    DEFF Research Database (Denmark)

    Ball, Linden J.; Onarheim, Balder; Christensen, Bo Thomas

    2010-01-01

    This paper investigates the potential involvement of “epistemic uncertainty” in mediating between complex design requirements and strategic switches in software design strategies. The analysis revealed that the designers produced an initial “first-pass” solution to the given design brief in a breadth-first manner, with this solution addressing several easy-to-handle requirements. The designers then focused on adding relatively complex-to-handle requirements to this initial solution in what appeared to be a depth-first manner, as reflected, for example, by detailed mental simulations. Overall, the findings support a view of software design as involving a mixed breadth-first and depth-first solution development approach, with strategic switching to depth-first design being triggered by requirement complexity and being mediated by associated feelings of uncertainty.

  10. Clinical and practical requirements of online software for anesthesia documentation an experience report.

    Science.gov (United States)

    Benson, M; Junger, A; Quinzio, L; Fuchs, C; Sciuk, G; Michel, A; Marquardt, K; Hempelmann, G

    2000-07-01

    The aim of this paper is the presentation of a new version of the anesthesia documentation software, NarkoData, that has been used in routine clinical work in our department as part of an anesthesia information management system (AIMS) since 1995. The performance of this software is presented along with requirements for future development of such a system. The originally used version, NarkoData 3.0, is online anesthesia documentation software established by the software company ProLogic GmbH. It was primarily developed as a disk-based system for the MacOS operating system (Apple Computer Inc.). Based on our routine experience with the system, a catalogue of requirements was developed that concentrated on improvement in the sequence of work, administration and data management. In 1996, the concepts developed in our department, in close co-operation with medical personnel and the software company, led to a considerable enlargement of the program functions and the subsequent release of a new version of NarkoData. Since 1997, more than 20 000 anesthesia procedures have been recorded annually with this new version at 115 decentralized work stations at our university hospital.

  11. Waste Receiving and Processing Facility Module 1 Data Management System Software Requirements Specification

    Energy Technology Data Exchange (ETDEWEB)

    Brann, E.C. II

    1994-09-09

    This document provides the software requirements for Waste Receiving and Processing (WRAP) Module 1 Data Management System (DMS). The DMS is one of the plant computer systems for the new WRAP 1 facility (Project W-026). The DMS will collect, store and report data required to certify the low level waste (LLW) and transuranic (TRU) waste items processed at WRAP 1 as acceptable for shipment, storage, or disposal.

  12. Waste Receiving and Processing Facility Module 1 Data Management System software requirements specification

    Energy Technology Data Exchange (ETDEWEB)

    Rosnick, C.K.

    1996-04-19

    This document provides the software requirements for Waste Receiving and Processing (WRAP) Module 1 Data Management System (DMS). The DMS is one of the plant computer systems for the new WRAP 1 facility (Project W-0126). The DMS will collect, store and report data required to certify the low level waste (LLW) and transuranic (TRU) waste items processed at WRAP 1 as acceptable for shipment, storage, or disposal.

  13. Detailed requirements document for common software of shuttle program information management system

    Science.gov (United States)

    Everette, J. M.; Bradfield, L. D.; Horton, C. L.

    1975-01-01

    Common software was investigated as a method for minimizing development and maintenance cost of the shuttle program information management system (SPIMS) applications while reducing the time-frame of their development. Those requirements satisfying these criteria are presented along with the stand-alone modules which may be used directly by applications. The SPIMS applications, operating on the CYBER 74 computer, are specialized information management systems which use System 2000 as a data base manager. Common software provides the features to support user interactions on a CRT terminal using form input and command response capabilities. These features are available as subroutines to the applications.

  14. Designing for Change: Minimizing the Impact of Changing Requirements in the Later Stages of a Spaceflight Software Project

    Science.gov (United States)

    Allen, B. Danette

    1998-01-01

    In the traditional 'waterfall' model of the software project life cycle, the Requirements Phase ends and flows into the Design Phase, which ends and flows into the Development Phase. Unfortunately, the process rarely, if ever, works so smoothly in practice. Instead, software developers often receive new requirements, or modifications to the original requirements, well after the earlier project phases have been completed. In particular, projects with shorter than ideal schedules are highly susceptible to frequent requirements changes, as the software requirements analysis phase is often forced to begin before the overall system requirements and top-level design are complete. This results in later modifications to the software requirements, even though the software design and development phases may be complete. Requirements changes received in the later stages of a software project inevitably lead to modification of existing developed software. Presented here is a series of software design techniques that can greatly reduce the impact of last-minute requirements changes. These techniques were successfully used to add built-in flexibility to two complex software systems in which the requirements were expected to (and did) change frequently. These large, real-time systems were developed at NASA Langley Research Center (LaRC) to test and control the Lidar In-Space Technology Experiment (LITE) instrument which flew aboard the space shuttle Discovery as the primary payload on the STS-64 mission.

  15. Requirements for guidelines systems: implementation challenges and lessons from existing software-engineering efforts.

    Science.gov (United States)

    Shah, Hemant; Allard, Raymond D; Enberg, Robert; Krishnan, Ganesh; Williams, Patricia; Nadkarni, Prakash M

    2012-03-09

    A large body of work in the clinical guidelines field has identified requirements for guideline systems, but there are formidable challenges in translating such requirements into production-quality systems that can be used in routine patient care. Detailed analysis of requirements from an implementation perspective can be useful in helping define sub-requirements to the point where they are implementable. Further, additional requirements emerge as a result of such analysis. During such an analysis, study of examples of existing, software-engineering efforts in non-biomedical fields can provide useful signposts to the implementer of a clinical guideline system. In addition to requirements described by guideline-system authors, comparative reviews of such systems, and publications discussing information needs for guideline systems and clinical decision support systems in general, we have incorporated additional requirements related to production-system robustness and functionality from publications in the business workflow domain, in addition to drawing on our own experience in the development of the Proteus guideline system (http://proteme.org). The sub-requirements are discussed by conveniently grouping them into the categories used by the review of Isern and Moreno 2008. We cite previous work under each category and then provide sub-requirements under each category, and provide example of similar work in software-engineering efforts that have addressed a similar problem in a non-biomedical context. When analyzing requirements from the implementation viewpoint, knowledge of successes and failures in related software-engineering efforts can guide implementers in the choice of effective design and development strategies.

  16. Active Mirror Predictive and Requirements Verification Software (AMP-ReVS)

    Science.gov (United States)

    Basinger, Scott A.

    2012-01-01

    This software is designed to predict large active mirror performance at various stages in the fabrication lifecycle of the mirror. It was developed for 1-meter class powered mirrors for astronomical purposes, but is extensible to other geometries. The package accepts finite element model (FEM) inputs and laboratory measured data for large optical-quality mirrors with active figure control. It computes phenomenological contributions to the surface figure error using several built-in optimization techniques. These phenomena include stresses induced in the mirror by the manufacturing process and the support structure, the test procedure, high spatial frequency errors introduced by the polishing process, and other process-dependent deleterious effects due to light-weighting of the mirror. Then, depending on the maturity of the mirror, it either predicts the best surface figure error that the mirror will attain, or it verifies that the requirements for the error sources have been met once the best surface figure error has been measured. The unique feature of this software is that it ties together physical phenomenology with wavefront sensing and control techniques and various optimization methods including convex optimization, Kalman filtering, and quadratic programming to both generate predictive models and to do requirements verification. This software combines three distinct disciplines: wavefront control, predictive models based on FEM, and requirements verification using measured data in a robust, reusable code that is applicable to any large optics for ground and space telescopes. The software also includes state-of-the-art wavefront control algorithms that allow closed-loop performance to be computed. It allows for quantitative trade studies to be performed for optical systems engineering, including computing the best surface figure error under various testing and operating conditions. After the mirror manufacturing process and testing have been completed, the
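
    One way to picture the decomposition into phenomenological contributions is a least-squares fit of a measured surface map onto FEM-predicted influence shapes; the sketch below (Python/NumPy) is a strong simplification with synthetic shapes, not the software's actual algorithms:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200  # number of surface sample points (flattened map)

      # Hypothetical FEM-predicted deformation shapes (e.g. mount stress,
      # gravity sag, polishing print-through), one column per phenomenon.
      shapes = rng.standard_normal((n, 3))
      true_coeffs = np.array([2.0, -0.5, 0.8])

      # "Measured" surface figure error: a mix of the shapes plus noise.
      measured = shapes @ true_coeffs + 0.01 * rng.standard_normal(n)

      # Least-squares estimate of each phenomenon's contribution.
      est, *_ = np.linalg.lstsq(shapes, measured, rcond=None)
      rms_residual = float(np.sqrt(np.mean((measured - shapes @ est) ** 2)))
      print(np.round(est, 3), round(rms_residual, 4))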

  17. Importance of Requirements Analysis & Traceability to Improve Software Quality and Reduce Cost and Risk

    Science.gov (United States)

    Kapoor, Manju M.; Mehta, Manju

    2010-01-01

    The goal of this paper is to emphasize the importance of developing complete and unambiguous requirements early in the project cycle (prior to Preliminary Design Phase). Having a complete set of requirements early in the project cycle allows sufficient time to generate a traceability matrix. Requirements traceability and analysis are the key elements in improving verification and validation process, and thus overall software quality. Traceability can be most beneficial when the system changes. If changes are made to high-level requirements it implies that low-level requirements need to be modified. Traceability ensures that requirements are appropriately and efficiently verified at various levels whereas analysis ensures that a rightly interpreted set of requirements is produced.
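
    A bare-bones picture of the traceability idea (Python, with hypothetical requirement and test identifiers): a matrix from high-level to low-level requirements, and from low-level requirements to verification cases, makes the impact of a change to a high-level requirement easy to enumerate:

      # Hypothetical trace links.
      hl_to_ll = {"HLR-1": ["LLR-1.1", "LLR-1.2"], "HLR-2": ["LLR-2.1"]}
      ll_to_test = {"LLR-1.1": ["TC-5"], "LLR-1.2": ["TC-6", "TC-7"], "LLR-2.1": ["TC-9"]}

      def impact_of_change(hlr):
          """Return the low-level requirements and test cases touched by a change
          to the given high-level requirement."""
          lls = hl_to_ll.get(hlr, [])
          tests = [t for ll in lls for t in ll_to_test.get(ll, [])]
          return lls, tests

      print(impact_of_change("HLR-1"))  # (['LLR-1.1', 'LLR-1.2'], ['TC-5', 'TC-6', 'TC-7'])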

  18. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-09-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE were already underway. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).

  19. A Methodology for Measuring the Risk Associated with A Software Requirements Specification

    Directory of Open Access Journals (Sweden)

    Trevor Moores

    1996-11-01

    This paper presents a six-step metrics-based methodology for assessing the risks associated with - and hence the resources required to implement - the requirements contained within a software requirements specification (SRS). The method seeks to eliminate the use of subjective probability assessments in models of risk exposure (RE) and risk reduction leverage (RRL). Measurements are taken of the number of requirements and the class of risk, the number of change requests and their date of issue, and the cost of each requirement change. The class of requirements risk is tailored to a given organisation using the Delphi method. The information collected is stored as an historical database for use in the analysis of subsequent SRSs.
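
    For context, risk exposure and risk reduction leverage are commonly written as follows (standard textbook definitions, not necessarily the paper's exact formulation; P(UO) is the probability of an unsatisfactory outcome and L(UO) the loss if it occurs):

      RE = P(UO) \times L(UO), \qquad
      RRL = \frac{RE_{\mathrm{before}} - RE_{\mathrm{after}}}{\text{cost of risk reduction}}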

  20. Meta-Model and UML Profile for Requirements Management of Software and Embedded Systems

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2011-01-01

    Software and embedded system companies today encounter problems related to requirements management tool integration, incorrect tool usage, and lack of traceability. This is because the tools in use have no clear meta-model and semantics for communicating requirements between different stakeholders. This paper presents a comprehensive meta-model for requirements management, with a focus on the software and embedded system domains. The goal is to define generic requirements management domain concepts and abstract interfaces between requirements management and system development. This leads to a portable requirements management meta-model which can be adapted to various system modeling languages. The created meta-model is prototyped by translating it into a UML profile. The profile is imported into a UML tool which is used for rapid evaluation of the meta-model concepts in practice. The developed profile is associated with a proof-of-concept report generator tool that automatically produces up-to-date documentation from the models in the form of web pages. The profile is then used to create an example embedded system requirements specification model.

  1. The DOE Meteorological Coordinating Council Perspective on the Application to Meteorological Software of DOE's Software Quality Assurance Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Mazzola, Carl; Schalk, Walt; Glantz, Clifford S.

    2008-03-12

    Since 1994, the DOE Meteorological Coordinating Council (DMCC) has been addressing meteorological monitoring and meteorological applications program issues at U.S. Department of Energy/National Nuclear Security Administration (DOE/NNSA) sites and providing solutions to these issues. The fundamental objectives of the DMCC include promoting cost-effective meteorological support and facilitating the use of common meteorological methods, procedures, and standards at all DOE/NNSA sites. In 2005, the DOE established strict software quality assurance (SQA) requirements for safety software, including consequence assessment software used for hazard assessments and safety analyses. These evaluations often utilize meteorological data supplied by DOE/NNSA site-based meteorological programs. However, the DOE has not established SQA guidance for this type of safety-related meteorological software. To address this gap, the DMCC is developing this guidance for the use of both public- and private-sector organizations. The goal for the DMCC is to mimic the SQA requirements for safety software but allow a much greater degree of “grading” in determining exactly what specific activities are needed. The emphasis of the DMCC SQA guidelines is on three key elements: 1) design and implementation documentation, 2) configuration management, and 3) verification and validation testing. These SQA guidelines should provide owners and users of meteorological software with a fair degree of assurance that their software is reliable, documented, and tested without putting an undue burden on meteorological system software developers.

  2. Adding Timing Requirements to the CODARTS Real-Time Software Design Method

    DEFF Research Database (Denmark)

    Bach, K.R.

    The CODARTS software design method considers how concurrent, distributed and real-time applications can be designed. Although it accounts for the important issues of tasks and communication, the method does not provide means for expressing the timeliness of the tasks and communication directly in the design, as is otherwise the case with task and communication specifics. In this paper we propose an extension scheme which enables specifying timing requirements for tasks and communications within the CODARTS model.

  3. System requirements for one-time-use ENRAF control panel software

    Energy Technology Data Exchange (ETDEWEB)

    HUBER, J.H.

    1999-08-19

    An Enraf Densitometer is installed on tank 241-AY-102. The Densitometer will frequently be tasked to obtain and log density profiles. The activity can be effected in a number of ways. Enraf Incorporated provides a software package called "Logger18" to its customers for the purpose of in-shop testing of their gauges. Logger18 is capable of accepting an input file which can direct the gauge to obtain a density profile for a given tank level and bottom limit. Logger18 is a complex, DOS-based program which will require trained technicians and/or tank farm entries to obtain the data. ALARA considerations have prompted the development of a more user-friendly, computer-based interface to the Enraf densitometers. This document records the plan by which this new Enraf data acquisition software will be developed, reviewed, verified, and released. This plan applies to the development and implementation of a one-time-use software program, which will be called "Enraf Control Panel." The software will be primarily used for remote operation of Enraf Densitometers for the purpose of obtaining and logging tank product density profiles.

  4. Tool Support for Distributed Software Development : The past - present - and future of gaps between user requirements and tool functionalities

    NARCIS (Netherlands)

    Herrera, Miles; van Hillegersberg, Jos; Harmsen, Frank; Amrit, Chintan Amrit; Geisberger, Eva; Keil, Patrick; Kuhrmann, Marco

    2007-01-01

    This paper presents the past, present, and our view on future user requirements and tool functionalities supporting Globally Distributed Software Teams and highlights the changing emphasis in these user requirements.

  5. Software requirements elicitation to support internal monitoring of quality assurance system for higher education in Indonesia

    Science.gov (United States)

    Amalia, A.; Gunawan, D.; Hardi, S. M.; Rachmawati, D.

    2018-02-01

    The Internal Quality Assurance System (in Indonesian: SPMI, Sistem Penjaminan Mutu Internal) is a systemic quality assurance activity for higher education in Indonesia. SPMI must be carried out by all higher education institutions in Indonesia under Regulation Number 62 of 2016 of the Minister of Research, Technology and Higher Education of the Republic of Indonesia. Implementation of SPMI must follow the SPMI principles of being independent, standardized, accurate, well planned and sustainable, documented and systematic. To support the SPMI cycle properly, universities need supporting software to monitor all SPMI activities. In reality, however, many universities struggle to build such an SPMI monitoring system; one obstacle is that determining system requirements that support the SPMI principles is difficult. In this paper, we examine the initial phase of requirements engineering, elicitation. Unlike other methods that collect system requirements from users and stakeholders, we derive the system requirements for the SPMI principles from the SPMI guideline book. The results of this paper can be used as one option in determining SPMI software requirements, and the paper can also help developers and users understand the SPMI scenario so that misunderstandings between these two parties can be overcome.

  6. Incorporating Software Requirements into the System RFP: Survey of RFP Language for Software by Topic, v. 2.0

    Science.gov (United States)

    2009-05-01

    [Extraction fragment: a role-assignment table (System - Terminal, TMOS - SEIT, TMOS - IA/Crypto, Space - Payload, Space - SEIT, Space - IA/Crypto, with a role legend) and sample RFP language on the strategy for maintaining the currency of technology, e.g., through Commercial off-the-shelf software (COTS) insertion, reusable Non-Developmental Items (NDI), and technology refresh.]

  7. DEVELOPMENT OF METHODOLOGY FOR DESIGNING TESTABLE COMPONENT STRUCTURE OF DISCIPLINARY COMPETENCE

    Directory of Open Access Journals (Sweden)

    Vladimir I. Freyman

    2014-01-01

    The aim of the study is to present new methods for assessing the quality of educational results in line with the requirements of the Federal State Educational Standards (FSES) of the Third Generation developed for higher education. The urgency of the search for adequate tools for measuring the quality of competencies and their elements, formed in the course of specialists' preparation, is emphasized. Methods. The interplay of competency components such as knowledge, abilities and mastery has to be considered in order to make the procedures for assessing students' achievements within a separate discipline or curriculum section more convenient, effective and exact. When modeling the component structure of a disciplinary competence, a testable design of components is used, an approach borrowed from technical diagnostics. Results. The research outcomes include the definition and analysis of a general iterative methodology for the testable design of the component structure of a disciplinary competence. Application of the proposed methodology is illustrated with the example of an abstract academic discipline with specified data and labour-requirement index. Methodology restrictions are noted and practical recommendations are given. Scientific novelty. Basic data and a detailed step-by-step implementation of the proposed common iterative approach to developing a testable component structure of disciplinary competence are considered. Tests and diagnostic tables for different design options are proposed. Practical significance. The research findings can help to increase learning efficiency, support the choice of adequate control tools and the accuracy of assessment, and promote efficient use of the personnel, time and material resources of higher education institutions. The proposed algorithms, methods and approaches to organizing the assessment of developed competences and their components can be used as a methodical base while

  8. Knowledge Base for an Intelligent System in order to Identify Security Requirements for Government Agencies Software Projects

    Directory of Open Access Journals (Sweden)

    Adán Beltrán G.

    2016-01-01

    It has been shown that one of the most common causes of software security failure is the lack of identification and specification of information security requirements, an activity given insufficient importance in software development and software acquisition. We propose the knowledge base of CIBERREQ, an intelligent knowledge-based system used for the identification and specification of security requirements in the software development cycle or in software acquisition. CIBERREQ receives functional software requirements written in natural language and produces non-functional security requirements through a semi-automatic risk management process. The knowledge base is formed by an ontology developed collaboratively by experts in information security. In this process, six types of assets were identified: electronic data, physical data, hardware, software, person and service; as well as six types of risk: competitive disadvantage, loss of credibility, economic risks, strategic risks, operational risks and legal sanctions. In addition, 95 vulnerabilities, 24 threats, 230 controls, and 515 associations between concepts are defined. Additionally, automatic expansion with Wikipedia was used for the asset types Software and Hardware, obtaining 7125 and 5894 software and hardware subtypes respectively, thereby achieving a 10% improvement in the identification of candidate information assets, one of the most important phases of the proposed system.
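
    To give a feel for how such a knowledge base can drive requirement generation, the sketch below (Python, with invented entries far smaller than the ontology described above) maps asset types to threats and candidate controls and emits candidate security requirements for the assets involved in a functional requirement:

      # Tiny, invented stand-in for the knowledge base: asset type -> (threats, controls).
      KB = {
          "electronic data": (["unauthorized disclosure"], ["encrypt data at rest"]),
          "service": (["denial of service"], ["rate limiting", "redundancy"]),
      }

      def security_requirements(functional_req, assets):
          """Suggest non-functional security requirements for the assets involved."""
          suggestions = []
          for asset in assets:
              threats, controls = KB.get(asset, ([], []))
              for threat in threats:
                  suggestions.append(f"For '{functional_req}': mitigate '{threat}' on "
                                     f"{asset} via {', '.join(controls)}.")
          return suggestions

      for s in security_requirements("Citizens can download their tax records",
                                     ["electronic data", "service"]):
          print(s)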

  9. Software Prototyping: A Case Report of Refining User Requirements for a Health Information Exchange Dashboard.

    Science.gov (United States)

    Nelson, Scott D; Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R

    2016-01-01

    Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. The objective was to describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included 1) having subject-matter experts heavily involved with the design; 2) flexibility to make rapid changes; 3) the ability to minimize software development efforts early in the design stage; 4) rapid finalization of requirements; 5) early visualization of designs; and 6) a powerful vehicle for communication of the design to the programmers. Challenges included 1) the time and effort needed to develop the prototypes and case scenarios; 2) no simulation of system performance; 3) not having all proposed functionality available in the final product; and 4) missing needed data elements in the PCC information system.

  10. Potential and requirements for a standarized pan-European food consumption survey using the EPIC-Soft software

    NARCIS (Netherlands)

    Ocke, M.C.; Slimani, N.; Brants, H.A.M.; Buurma-Rethans, E.; Casagrande, C.; Nicolas, G.; Dofkova, M.; Donne, le C.; Freisling, H.; Geelen, A.; Huybrechts, I.; Keyzer, de W.; Laan, van der J.D.; Lafay, L.; Lillegaard, I.T.L.; Niekerk, E.M.; Vries, de J.H.M.; Wilson-van den Hooven, E.C.; Boer, de E.J.

    2011-01-01

    Background/Objectives: To describe the strengths, limitations and requirements of using EPIC-Soft software (the software developed to conduct 24-h dietary recalls in the European Prospective Investigation into Cancer and Nutrition (EPIC) Study) in pan-European food consumption surveys, and to

  11. Evaluation of a Game to Teach Requirements Collection and Analysis in Software Engineering at Tertiary Education Level

    Science.gov (United States)

    Hainey, Thomas; Connolly, Thomas M.; Stansfield, Mark; Boyle, Elizabeth A.

    2011-01-01

    A highly important part of software engineering education is requirements collection and analysis which is one of the initial stages of the Database Application Lifecycle and arguably the most important stage of the Software Development Lifecycle. No other conceptual work is as difficult to rectify at a later stage or as damaging to the overall…

  12. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7.

  13. Investigation of the current requirements engineering practices among software developers at the Universiti Utara Malaysia Information Technology (UUMIT) centre

    Science.gov (United States)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Abdullah, Inam

    2016-08-01

    Requirements Engineering (RE) is a systematic and integrated process of eliciting, elaborating, negotiating, validating and managing the requirements of a system in a software development project. UUM has been supported by various systems developed and maintained by the UUM Information Technology (UUMIT) Centre. The aim of this study was to assess the current requirements engineering practices at UUMIT. The main problem that prompted this research is the lack of studies that support software development activities at the UUMIT. The study is geared at helping UUMIT produce quality, time- and cost-saving software products by implementing cutting-edge, state-of-the-art requirements engineering practices. The study also contributes to UUM by identifying the activities needed for software development so that the management will be able to allocate budget to provide adequate and precise training for the software developers. Three variables were investigated: Requirements Description, Requirements Development (comprising Requirements Elicitation, Requirements Analysis and Negotiation, and Requirements Validation), and Requirements Management. The results from the study showed that the current practice of requirements engineering at UUMIT is encouraging, but still needs further development and improvement because a few RE practices were seldom practiced.

  14. Investigation of Classification and Design Requirements for Digital Software for Advanced Research Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Park, Gee Young; Jung, H. S.; Ryu, J. S.; Park, C

    2005-06-15

    software for use in I and C systems in nuclear power plants, and describes the requirements for software development recommended by international standards.

  15. APPROACHES DEVELOPMENT TO FORMALIZED DESCRIPTION OF THE DISCIPLINARY COMPETENCE OF TESTABLE COMPONENT STRUCTURE

    Directory of Open Access Journals (Sweden)

    Efim L. Kon

    2015-01-01

    Full Text Available The purpose of this paper is to develop approaches and recommendations for selecting the quantitative and qualitative component structure of disciplinary competencies, as well as ways of describing it formally. One of the main problems to be solved while developing the studying and methodical complex of a discipline (for example, a discipline program, a fund of assessment tools, etc.) of a competence-oriented educational program is the design of the component structure of each part of competence (a disciplinary competence) that is formed within the discipline. In this case, a significant impact on this process is exerted not only by the content of the thematic plan and the selected kinds of class and self-study work of students, but also by the proposed control tools and methods for diagnosing learning outcomes specified in the competency format. Methods. It is proposed to use a joint (testable) design of the component structure of disciplinary competencies and control tools (tests and test materials) that check the level of development of their constituent elements, described in the triad of «to know», «to be able to», «to master». Requirements for the basic quantitative and qualitative properties of the disciplinary competence component structure are formulated and substantiated. A structure of a diagnostic table is proposed and analyzed; it makes it possible to set a correspondence between disciplinary competence elements and components and the tests that control them, and also to record the outcomes of current control of the development level (test reactions) in binary and non-binary alphabets. A classification of diagnostic tests is given; their impact on the format and properties of the diagnostic table is shown. Scientific novelty. An approach to designing a testable component structure of disciplinary competence is proposed; it allows setting some properties of the control object, which can increase the effectiveness of the procedure and the precision of decoding the diagnosis of learning outcomes
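
    The abstract proposes a diagnostic table that relates competence components («to know», «to be able to», «to master») to the tests that control them and records binary test reactions. The sketch below is a small, hypothetical Python illustration of how such a table could be used to decode which components appear undeveloped; the component names and test coverage are invented for the example and are not taken from the paper.

```python
# Sketch of a diagnostic table: each test controls a set of competence
# components; test outcomes are recorded in a binary alphabet
# (1 = passed, 0 = failed).  Component and test names are illustrative only.

DIAGNOSTIC_TABLE = {
    # test id -> set of controlled components
    "T1": {"know:definitions"},
    "T2": {"know:definitions", "able:apply_method"},
    "T3": {"able:apply_method", "master:tool_usage"},
    "T4": {"master:tool_usage"},
}

def decode(test_reactions: dict) -> dict:
    """Classify each component as developed, not developed, or ambiguous."""
    components = set().union(*DIAGNOSTIC_TABLE.values())
    verdict = {}
    for comp in sorted(components):
        passed = [t for t, comps in DIAGNOSTIC_TABLE.items()
                  if comp in comps and test_reactions[t] == 1]
        failed = [t for t, comps in DIAGNOSTIC_TABLE.items()
                  if comp in comps and test_reactions[t] == 0]
        if failed and not passed:
            verdict[comp] = "not developed"
        elif passed and not failed:
            verdict[comp] = "developed"
        else:
            verdict[comp] = "ambiguous (needs an additional diagnostic test)"
    return verdict

if __name__ == "__main__":
    reactions = {"T1": 1, "T2": 1, "T3": 0, "T4": 0}   # binary test outcomes
    for component, state in decode(reactions).items():
        print(f"{component:20s} -> {state}")
```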

  16. Supporting Early Math--Rationales and Requirements for High Quality Software

    Science.gov (United States)

    Haake, Magnus; Husain, Layla; Gulz, Agneta

    2015-01-01

    There is substantial evidence that preschoolers' performance in early math is highly correlated with math performance throughout school as well as academic skills in general. One way to help children attain early math skills is by using targeted educational software, and the paper discusses potential gains of using such software to support early math…

  17. Developing visualisation software for rehabilitation: investigating the requirements of patients, therapists and the rehabilitation process

    Science.gov (United States)

    Loudon, David; Macdonald, Alastair S.; Carse, Bruce; Thikey, Heather; Jones, Lucy; Rowe, Philip J.; Uzor, Stephen; Ayoade, Mobolaji; Baillie, Lynne

    2012-01-01

    This paper describes the ongoing process of the development and evaluation of prototype visualisation software, designed to assist in the understanding and the improvement of appropriate movements during rehabilitation. The process of engaging users throughout the research project is detailed in the paper, including how the design of the visualisation software is being adapted to meet the emerging understanding of the needs of patients and professionals, and of the rehabilitation process. The value of the process for the design of the visualisation software is illustrated with a discussion of the findings of pre-pilot focus groups with stroke survivors and therapists. PMID:23011812

  18. Understanding quality requirements engineering in contract-based projects from the perspective of software architects: an exploratory study

    NARCIS (Netherlands)

    Daneva, Maya; Herrmann, Andrea; Buglione, Luigi; Mistrik, Ivan; Bahsoon, Rami; Eeles, Peter; Roshandel, Roshanak; Stal, Michael

    2014-01-01

    This chapter discusses how software architects from 21 European project organizations cope with quality requirements (QRs) in large, contract-based systems delivery projects. It reports on the roles that architects played in QRs engineering, their interactions with other project roles, the specific

  19. Comparison on testability of visual acuity, stereo acuity and colour vision tests between children with learning disabilities and children without learning disabilities in government primary schools.

    Science.gov (United States)

    Abu Bakar, Nurul Farhana; Chen, Ai-Hong

    2014-02-01

    Children with learning disabilities might have difficulties communicating effectively and giving reliable responses as required in various visual function testing procedures. The purpose of this study was to compare the testability of visual acuity using the modified Early Treatment Diabetic Retinopathy Study (ETDRS) and Cambridge Crowding Cards, stereo acuity using the Lang Stereo test II and Butterfly stereo tests, and colour perception using the Colour Vision Test Made Easy (CVTME) and Ishihara's Test for Colour Deficiency (Ishihara Test) between children in mainstream classes and children with learning disabilities in special education classes in government primary schools. A total of 100 primary school children (50 children from mainstream classes and 50 children from special education classes) matched in age were recruited in this cross-sectional comparative study. The testability was determined by the percentage of children who were able to give a reliable response as required by the respective tests. 'Unable to test' was defined as an inappropriate response or uncooperativeness despite the best efforts of the screener. The testability of the modified ETDRS, Butterfly stereo test and Ishihara test was found to be lower among children in special education classes (P < 0.001), but not that of the Cambridge Crowding Cards, Lang Stereo test II and CVTME. Non-verbal or "matching" approaches were found to be superior in testing visual functions in children with learning disabilities. Modifications of vision testing procedures are essential for children with learning disabilities.

  20. Comparison on testability of visual acuity, stereo acuity and colour vision tests between children with learning disabilities and children without learning disabilities in government primary schools

    Directory of Open Access Journals (Sweden)

    Nurul Farhana Abu Bakar

    2014-01-01

    Full Text Available Context: Children with learning disabilities might have difficulties communicating effectively and giving reliable responses as required in various visual function testing procedures. Aims: The purpose of this study was to compare the testability of visual acuity using the modified Early Treatment Diabetic Retinopathy Study (ETDRS) and Cambridge Crowding Cards, stereo acuity using the Lang Stereo test II and Butterfly stereo tests, and colour perception using the Colour Vision Test Made Easy (CVTME) and Ishihara's Test for Colour Deficiency (Ishihara Test) between children in mainstream classes and children with learning disabilities in special education classes in government primary schools. Materials and Methods: A total of 100 primary school children (50 children from mainstream classes and 50 children from special education classes) matched in age were recruited in this cross-sectional comparative study. The testability was determined by the percentage of children who were able to give a reliable response as required by the respective tests. 'Unable to test' was defined as an inappropriate response or uncooperativeness despite the best efforts of the screener. Results: The testability of the modified ETDRS, Butterfly stereo test and Ishihara test was found to be lower among children in special education classes (P < 0.001), but not that of the Cambridge Crowding Cards, Lang Stereo test II and CVTME. Conclusion: Non-verbal or "matching" approaches were found to be superior in testing visual functions in children with learning disabilities. Modifications of vision testing procedures are essential for children with learning disabilities.

  1. Analysis of free geo-server software usability from the viewpoint of INSPIRE requirements

    Directory of Open Access Journals (Sweden)

    Tomasz  Grasza

    2014-06-01

    Full Text Available The paper presents selected server platforms based on free and open source licenses, consistent with the standards of the Open Geospatial Consortium. The presented programs are evaluated in the context of the INSPIRE Directive. The first part describes the requirements of the Directive; afterwards the pros and cons of each platform in meeting these demands are presented. This article answers the question of whether the use of free software can provide interoperable network services in accordance with the requirements of the INSPIRE Directive, while presenting application examples and practical tips on the use of the particular programs. Keywords: GIS, INSPIRE, free software, OGC, geoportal, network services, GeoServer, deegree, GeoNetwork

  2. Improving Reliability of Spectrum Analysis for Software Quality Requirements Using TCM

    OpenAIRE

    Kaiya, Haruhiko; Tanigawa, Masaaki; Suzuki, Shunichi; Sato, Tomonori; Osada, Akira; Kaijiri, Kenji

    2010-01-01

    Quality requirements are scattered over a requirements specification; thus it is hard to measure and trace such quality requirements to validate the specification against stakeholders' needs. We proposed a technique called "spectrum analysis for quality requirements" which enabled analysts to sort a requirements specification to measure and track quality requirements in the specification. In the same way as a spectrum in optics, a quality spectrum of a specification shows a quantitative feature...

  3. An Approach for Implementing State Machines with Online Testability

    Directory of Open Access Journals (Sweden)

    P. K. Lala

    2010-01-01

    Full Text Available During the last two decades, a significant amount of research has been performed to simplify the detection of transient or soft errors in VLSI-based digital systems. This paper proposes an approach for implementing state machines that uses a 2-hot code for state encoding. State machines designed using this approach allow online detection of soft errors in registers and output logic. The 2-hot code considerably reduces the number of required flip-flops and leads to a relatively straightforward implementation of next-state and output logic. A new way of designing output logic for online fault detection has also been presented.
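
    As a rough illustration of the encoding idea described above, the Python sketch below models a small state machine whose states are 2-hot encoded and checks after every transition that exactly two register bits are set; any single-bit upset violates this invariant and is flagged online. The state set and transition table are invented for the example, and the sketch does not model the hardware next-state or output logic of the paper.

```python
from itertools import combinations

# 2-hot encoding: every valid state word has exactly two bits set, so any
# single bit flip (soft error) yields a word with one or three bits set
# and is detectable online.  States and transitions are illustrative only.

WIDTH = 4
CODES = [(1 << a) | (1 << b) for a, b in combinations(range(WIDTH), 2)]
STATES = {f"S{i}": code for i, code in enumerate(CODES)}   # 6 states from 4 flip-flops

TRANSITIONS = {"S0": "S1", "S1": "S2", "S2": "S0"}         # toy next-state table

def is_valid(word):
    """Online check: a fault-free 2-hot register always has exactly two bits set."""
    return bin(word).count("1") == 2

def step(current_word, name):
    if not is_valid(current_word):
        raise RuntimeError(f"soft error detected in state register: {current_word:04b}")
    nxt = TRANSITIONS[name]
    return STATES[nxt], nxt

if __name__ == "__main__":
    word, name = STATES["S0"], "S0"
    for _ in range(3):
        word, name = step(word, name)
        print(f"entered {name}: {word:04b}")
    # Inject a single-bit upset and show that the online check catches it.
    corrupted = word ^ 0b0001
    print("valid after upset?", is_valid(corrupted))
```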

  4. Design-for-Delay Testability Techniques for High-Speed Digital Circuits

    NARCIS (Netherlands)

    Vermaak, H.J.

    2005-01-01

    The importance of delay faults is enhanced by the ever-increasing clock rates and decreasing geometry sizes of today's circuits. This thesis focuses on the development of Design-for-Delay-Testability (DfDT) techniques for high-speed circuits and embedded cores. The rising costs of IC testing and

  5. A Research Agenda for Identifying and Developing Required Competencies in Software Engineering

    Directory of Open Access Journals (Sweden)

    Yvonne Sedelmaier

    2013-04-01

    Full Text Available Various issues make learning and teaching software engineering a challenge for both students and instructors. Since there are no standard curricula and no cookbook recipes for successful software engineering, it is fairly hard to figure out which specific topics and competencies should be learned or acquired by a particular group of students. Furthermore, it is not clear which particular didactic approaches might work well for a specific topic and a particular group of students. This contribution presents a research agenda that aims at identifying relevant competencies and environmental constraints as well as their effect on learning and teaching software engineering. To that end, an experimental approach will be taken. As a distinctive feature, this approach iteratively introduces additional or modified didactical methods into existing courses and carefully evaluates their appropriateness. Thus, it continuously improves these methods.

  6. RELAP-7 Software Verification and Validation Plan - Requirements Traceability Matrix (RTM) Part 2: Code Assessment Strategy, Procedure, and RTM Update

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. The Requirements Traceability Matrices (RTMs) proposed in the previous document (INL-EXT-15-36684) are then updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.

  7. Specification of problems from the business goals in the context of early software requirements elicitation

    Directory of Open Access Journals (Sweden)

    Carlos Mario Zapata-J.

    2014-01-01

    Full Text Available One of the main activities of early software requirements elicitation is the recognition and specification of the organization's problems. The purpose of this activity is to define the initial requirements and to satisfy the stakeholders' needs. These problems must be related to the organization's goals in order to achieve a software application that is contextualized and aligned with the organization's reason for being. In current goal- and problem-based elicitation methods, the relationships are detected with the help of the experience and knowledge of the analyst and the stakeholder. However, traceability between goals and problems has not yet been achieved. This article proposes a method for the specification of problems from organizational goals. The method consists of a set of syntactic and semantic rules that the analyst uses to express problems from goal statements. A laboratory example based on a KAOS goal diagram is also presented.

  8. Handling requirements dependencies in agile projects: A focus group with agile software development practitioners

    NARCIS (Netherlands)

    Martakis, Aias; Daneva, Maia; Wieringa, Roelf J.; Jean-Louis Cavarero, S.; Rolland, C; Cavarero, J.L.

    2013-01-01

    Agile practices on requirements dependencies are a relatively unexplored topic in literature. Empirical studies on it are scarce. This research sets out to uncover concepts that practitioners in companies of various sizes across the globe and in various industries, use for dealing with requirements

  9. Report on functional requirements and software architecture for the IDTO prototype phase 2 : central Florida demonstration.

    Science.gov (United States)

    2015-05-01

    This report documents the System Requirements and Architecture for the Phase 2 implementation of the Integrated Dynamic : Transit Operations (IDTO) Prototype bundle within the Dynamic Mobility Applications (DMA) portion of the Connected Vehicle : Pro...

  10. Report on functional requirements and software architecture for the IDTO prototype : phase I demonstration site (Columbus).

    Science.gov (United States)

    2013-08-01

    This report documents the System Requirements and Architecture for the Phase I implementation of the Integrated Dynamic : Transit Operations (IDTO) Prototype bundle within the Dynamic Mobility Applications (DMA) portion of the Connected Vehicle : Pro...

  11. Requirements and Matching Software Technologies for Sustainable and Agile Manufacturing Systems

    NARCIS (Netherlands)

    Pascal Muller; Daniël Telgen; Ing. Erik Puik; Leo van Moergestel

    2013-01-01

    Sustainable and Agile manufacturing is expected of future generation manufacturing systems. The goal is to create scalable, reconfigurable and adaptable manufacturing systems which are able to produce a range of products without new investments into new manufacturing equipment. This requires a new

  12. Potential and requirements for a standardized pan-European food consumption survey using the EPIC-Soft software.

    Science.gov (United States)

    Ocké, M C; Slimani, N; Brants, H; Buurma-Rethans, E; Casagrande, C; Nicolas, G; Dofkova, M; le Donne, C; Freisling, H; Geelen, A; Huybrechts, I; De Keyzer, W; van der Laan, J D; Lafay, L; Lillegaard, I T; Niekerk, E M; de Vries, J H; Wilson-van den Hooven, E C; de Boer, E J

    2011-07-01

    To describe the strengths, limitations and requirements of using EPIC-Soft software (the software developed to conduct 24-h dietary recalls in the European Prospective Investigation into Cancer and Nutrition (EPIC) Study) in pan-European food consumption surveys, and to discuss potentials and barriers for a harmonized pan-European food consumption survey. The paper is based on the experiences in the 'European Food Consumption and Validation' Project, which included updating six existing and preparing one new country-specific EPIC-Soft version, applying EPIC-Soft in validation and feasibility studies, and estimating the intake of nutrients and flavoring substances. The experiences were discussed in the September 2009 workshop 'Pan-European Food Consumption Surveys--for Standardized and Comparable Transnational Data Collection'. EPIC-Soft is suitable for detailed and standardized food consumption data collection in pan-European food consumption surveys. A thorough preparation of all aspects of the food consumption survey is important for the quality and efficiency during data collection and processing. The preparation and data-handling phase of working with EPIC-Soft is labor intensive and requires trained, motivated and qualified personnel. Given the suitability of EPIC-Soft as standardized dietary assessment tool in European dietary monitoring, the proposed strategy toward a pan-European food consumption survey is to prepare well, to allow flexibility in national extensions and to start with a limited number of countries that are interested.

  13. SATISFACTION OF QUALIFICATION REQUIREMENTS OF EMPLOYERS APPLIED TO SOFTWARE ENGINEERS IN THE PROCESS OF TRAINING AT HIGHER EDUCATIONAL INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Vladislav Kruhlyk

    2017-03-01

    Full Text Available In this article, based on an analysis of the problems of the professional training of software engineers in higher educational institutions, it is shown that the contents of the curricula for training software engineers in basic IT specialties generally meet the requirements of the labor market. It is stated that at present there are certain changes in the job market: not only an increasing demand for IT professionals, but also changes in the requirements set for future specialists. In scientists' opinion, there is currently a gap between the level of employers' expectations and the level of education of graduates of IT specialties of universities. Due to the extremely fast pace of IT development, students' knowledge may already be obsolete by the end of their studies. At stake is the complex of competencies offered by the university during the training of specialists, and their relevance and competitiveness at the labor market. At the same time, the practical training of students does not fully correspond to the current state of information technology. Therefore, it is necessary to ensure that the contents of the academic disciplines are updated with the aim of providing quality training of specialists.

  14. Development of an expert system prototype for determining software functional requirements for command management activities at NASA Goddard

    Science.gov (United States)

    Liebowitz, J.

    1986-01-01

    The development of an expert system prototype for software functional requirement determination for NASA Goddard's Command Management System, as part of its process of transforming general requests into specific near-earth satellite commands, is described. The present knowledge base was formulated through interactions with domain experts and was then linked to the existing Knowledge Engineering Systems (KES) expert system application generator. Steps in the knowledge-base development include problem-oriented attribute hierarchy development, knowledge management approach determination, and knowledge base encoding. The KES Parser and Inspector, in addition to backcasting and analogical mapping, were used to validate the expert-system-derived requirements for one of the major missions of a spacecraft, the Solar Maximum Mission. Knowledge refinement, evaluation, and implementation procedures of the expert system were then accomplished.
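
    As a loose illustration of the knowledge-base idea described in this record, the Python sketch below encodes a few if-then rules that map command-management attributes to candidate software functional requirements and fires them against a set of facts. The attributes, rules and requirement texts are entirely hypothetical and are not taken from the NASA Goddard system or from KES.

```python
# Toy rule base: facts about a mission are mapped to candidate software
# functional requirements.  All attributes, rules and requirement strings
# are hypothetical placeholders for illustration only.

RULES = [
    ({"commands_are_time_tagged"},
     "maintain an on-board command queue ordered by execution time"),
    ({"ground_contact_is_intermittent"},
     "buffer command loads for delayed uplink"),
    ({"commands_are_time_tagged", "ground_contact_is_intermittent"},
     "verify time-tag consistency before each uplink window"),
]

def derive_requirements(facts):
    """Fire every rule whose conditions are satisfied by the given facts."""
    derived = []
    for conditions, requirement in RULES:
        if conditions <= facts:                      # all conditions present
            derived.append(f"The command management software shall {requirement}.")
    return derived

if __name__ == "__main__":
    mission_facts = {"commands_are_time_tagged", "ground_contact_is_intermittent"}
    for req in derive_requirements(mission_facts):
        print(req)
```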

  15. Soy-Based Therapeutic Baby Formulas: Testable Hypotheses Regarding the Pros and Cons.

    Science.gov (United States)

    Westmark, Cara J

    2016-01-01

    Soy-based infant formulas have been consumed in the United States since 1909, and currently constitute a significant portion of the infant formula market. There are efforts underway to generate genetically modified soybeans that produce therapeutic agents of interest with the intent to deliver those agents in a soy-based infant formula platform. The threefold purpose of this review article is to first discuss the pros and cons of soy-based infant formulas, then present testable hypotheses to discern the suitability of a soy platform for drug delivery in babies, and finally start a discussion to inform public policy on this important area of infant nutrition.

  16. Embedded design-for-testability strategies to test high-resolution SD modulators

    Science.gov (United States)

    Escalera, Sara; Espin, Alvaro; Guerra, Oscar; de la Rosa, Jose M.; Medeiro, Fernando; Perez-Verdu, Belen

    2005-06-01

    This paper describes the design-for-testability strategies integrated in a 0.35μm CMOS 17-bit@40-kS/s chopper-stabilized Switched-Capacitor 2-1 cascade ΣΔ modulator for automotive sensor interfaces. After a brief review of the most important effects degrading the circuit performance, a test technique based on the division of the circuit into several blocks that are tested separately is presented. Experimental results show the utility of the implemented test technique for detecting errors in the circuit and for characterizing the most important blocks with a minimum increase of extra area for the additional test circuitry.

  17. Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit

    Science.gov (United States)

    Penn, John

    2014-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.

  18. Online testable concept maps: benefits for learning about the pathogenesis of disease.

    Science.gov (United States)

    Ho, Veronica; Kumar, Rakesh K; Velan, Gary

    2014-07-01

    Concept maps have been used to promote meaningful learning and critical thinking. Although these are crucially important in all disciplines, evidence for the benefits of concept mapping for learning in medicine is limited. We performed a randomised crossover study to assess the benefits of online testable concept maps for learning in pathology by volunteer junior medical students. Participants (n = 65) were randomly allocated to either of two groups with equivalent mean prior academic performance, in which they were given access to either online maps or existing online resources for a 2-week block on renal disease. Groups then crossed over for a 2-week block on hepatic disease. Outcomes were assessed using timed online quizzes, which included questions unrelated to topics in the pathogenesis maps as an internal control. Questionnaires were administered to evaluate students' acceptance of the maps. In both blocks, the group with access to pathogenesis maps achieved significantly higher average scores than the control group on quiz questions related to topics covered by the maps (Block 1: p online testable pathogenesis maps are well accepted and can improve learning of concepts in pathology by medical students. © 2014 John Wiley & Sons Ltd.

  19. A Requirements-Based Exploration of Open-Source Software Development Projects--Towards a Natural Language Processing Software Analysis Framework

    Science.gov (United States)

    Vlas, Radu Eduard

    2012-01-01

    Open source projects do have requirements; they are, however, mostly informal, text descriptions found in requests, forums, and other correspondence. Understanding such requirements provides insight into the nature of open source projects. Unfortunately, manual analysis of natural language requirements is time-consuming, and for large projects,…

  20. Software Tools to Support the Assessment of System Health

    Science.gov (United States)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDiMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDiMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard, and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDiMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements that are presumed to exist. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of
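
    The S4 description above centres on searching over candidate sensor combinations against a user-defined performance metric and constraints. The sketch below is a minimal, hypothetical Python illustration of that kind of search (exhaustive over small suites, with a cost cap); the sensor names, scores and cost model are invented and do not reflect S4's actual algorithms.

```python
from itertools import combinations

# Hypothetical candidate sensors: name -> (diagnostic value, cost).
SENSORS = {
    "N1_speed":  (0.30, 1.0),
    "EGT":       (0.25, 1.5),
    "fuel_flow": (0.20, 1.0),
    "P30":       (0.15, 2.0),
    "vibration": (0.18, 2.5),
}

def suite_score(suite):
    """User-defined performance metric; here additive with a mild redundancy penalty."""
    return sum(SENSORS[s][0] for s in suite) - 0.02 * len(suite) ** 2

def suite_cost(suite):
    return sum(SENSORS[s][1] for s in suite)

def best_suites(max_size, cost_budget, top_k=3):
    """Exhaustively evaluate all suites up to max_size that fit the cost budget."""
    candidates = []
    for k in range(1, max_size + 1):
        for suite in combinations(SENSORS, k):
            if suite_cost(suite) <= cost_budget:
                candidates.append((suite_score(suite), suite))
    return sorted(candidates, reverse=True)[:top_k]

if __name__ == "__main__":
    for score, suite in best_suites(max_size=3, cost_budget=4.0):
        print(f"{score:5.2f}  {suite}  cost={suite_cost(suite):.1f}")
```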

  1. Frame rate required for speckle tracking echocardiography: A quantitative clinical study with open-source, vendor-independent software.

    Science.gov (United States)

    Negoita, Madalina; Zolgharni, Massoud; Dadkho, Elham; Pernigo, Matteo; Mielewczik, Michael; Cole, Graham D; Dhutia, Niti M; Francis, Darrel P

    2016-09-01

    To determine the optimal frame rate at which reliable heart wall velocities can be assessed by speckle tracking. Assessing left ventricular function with speckle tracking is useful in patient diagnosis but requires a temporal resolution that can follow myocardial motion. In this study we investigated the effect of different frame rates on the accuracy of speckle tracking results, highlighting the temporal resolution at which reliable results can be obtained. 27 patients were scanned at two different frame rates at their resting heart rate. From all acquired loops, lower temporal resolution image sequences were generated by dropping frames, decreasing the frame rate by up to 10-fold. Tissue velocities were estimated by automated speckle tracking. Above 40 frames/s the peak velocity was reliably measured. When the frame rate was lower, the inter-frame interval containing the instant of highest velocity also contained lower velocities, and therefore the average velocity in that interval was an underestimate of the clinically desired instantaneous maximum velocity. The higher the frame rate, the more accurately maximum velocities are identified by speckle tracking; above 40 frames/s there is little further increase in measured peak velocity. We provide in an online supplement the vendor-independent software we used for automatic speckle-tracked velocity assessment to help others working in this field. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
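
    To make the frame-rate argument concrete, the sketch below simulates a smooth wall-velocity trace, derives frame-to-frame velocities from displacement at several frame rates, and reports how far the measured peak falls below the true instantaneous peak. It is a self-contained numerical illustration under assumed waveform parameters, not the study's speckle-tracking pipeline.

```python
import math

# Simulate one cardiac cycle of wall velocity as a smooth pulse (cm/s), then
# estimate peak velocity from displacement sampled at different frame rates.
# Waveform shape and parameters are assumed purely for illustration.

PEAK, T_PEAK, SIGMA, CYCLE = 10.0, 0.15, 0.03, 1.0   # cm/s, s, s, s

def velocity(t):
    return PEAK * math.exp(-((t - T_PEAK) ** 2) / (2 * SIGMA ** 2))

def position(t, dt=1e-4):
    # numerical integral of velocity from 0 to t
    steps = int(t / dt)
    return sum(velocity(i * dt) for i in range(steps)) * dt

def measured_peak(frame_rate):
    """Peak of inter-frame average velocities (displacement / frame interval)."""
    dt = 1.0 / frame_rate
    times = [i * dt for i in range(int(CYCLE / dt) + 1)]
    vels = [(position(t2) - position(t1)) / dt for t1, t2 in zip(times, times[1:])]
    return max(vels)

if __name__ == "__main__":
    for fps in (100, 60, 40, 20, 10):
        peak = measured_peak(fps)
        print(f"{fps:3d} frames/s -> measured peak {peak:5.2f} cm/s "
              f"({100 * peak / PEAK:5.1f}% of true peak)")
```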

  2. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  3. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ... Section 227.7202 Federal Acquisition...

  4. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ... Section 227.7203 Federal Acquisition...

  5. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model. Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models; Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept...

  6. Some design constraints required for the assembly of software components: The incorporation of atomic abstract types into generically structured abstract types

    Science.gov (United States)

    Johnson, Charles S.

    1986-01-01

    It is nearly axiomatic that to take the greatest advantage of the useful features available in a development system, and to avoid the negative interactions of those features, requires the exercise of a design methodology which constrains their use. A major design support feature of the Ada language is abstraction: for data, functions, processes, resources, and system elements in general. Atomic abstract types can be created in packages defining those private types and all of the overloaded operators, functions, and hidden data required for their use in an application. Generically structured abstract types can be created in generic packages defining those structured private types, as buildups from the user-defined data types which are input as parameters. A study is made of the design constraints required for software incorporating either atomic or generically structured abstract types, if the integration of software components based on them is to be subsequently performed. The impact of these techniques on the reusability of software and the creation of project-specific software support environments is also discussed.

  7. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  8. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  9. Software Reviews.

    Science.gov (United States)

    Science Software Quarterly, 1984

    1984-01-01

    Provides extensive reviews of computer software, examining documentation, ease of use, performance, error handling, special features, and system requirements. Includes statistics, problem-solving (TK Solver), label printing, database management, experimental psychology, Encyclopedia Britannica biology, and DNA-sequencing programs. A program for…

  10. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square-feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  11. Runtime Instrumentation of SystemC/TLM2 Interfaces for Fault Tolerance Requirements Verification in Software Cosimulation

    Directory of Open Access Journals (Sweden)

    Antonio da Silva

    2014-01-01

    Full Text Available This paper presents the design of a SystemC transaction level modelling wrapping library that can be used for the assertion of system properties, protocol compliance, or fault injection. The library uses C++ virtual table hooks as a dynamic binary instrumentation technique to inline wrappers in the TLM2 transaction path. This technique can be applied after the elaboration phase and needs neither source code modifications nor recompilation of the top level SystemC modules. The proposed technique has been successfully applied to the robustness verification of the on-board boot software of the Instrument Control Unit of the Solar Orbiter’s Energetic Particle Detector.
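
    The record describes inlining wrappers into the TLM2 transaction path via C++ vtable hooks, without modifying or recompiling the module sources. The Python sketch below illustrates the same interposition idea in a much simpler setting: a transport method on an existing object is wrapped at runtime to assert a protocol property and optionally inject a fault. It is an analogy only; the class, method and payload names are hypothetical and none of it corresponds to the actual SystemC/TLM2 library code.

```python
import random

# A stand-in for an already-elaborated component whose source we do not modify.
class BootLoader:
    def b_transport(self, payload):
        # pretend to serve a memory read request
        return {"status": "OK", "data": payload.get("addr", 0) * 2}

def wrap_transport(target, assertion, fault_rate=0.0):
    """Interpose a wrapper on target.b_transport at runtime (no recompilation)."""
    original = target.b_transport

    def wrapper(payload):
        if random.random() < fault_rate:           # optional fault injection
            return {"status": "ERROR", "data": None}
        response = original(payload)
        assert assertion(payload, response), "protocol property violated"
        return response

    target.b_transport = wrapper                    # rebinding plays the role of the vtable hook
    return target

if __name__ == "__main__":
    dut = wrap_transport(
        BootLoader(),
        assertion=lambda req, rsp: rsp["status"] != "OK" or rsp["data"] is not None,
        fault_rate=0.3,
    )
    for addr in range(5):
        print(dut.b_transport({"addr": addr}))
```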

  12. "PolyMin": software for identification of the minimum number of polymorphisms required for haplotype and genotype differentiation

    DEFF Research Database (Denmark)

    Frei, Ursala K; Wollenweber, Bernd; Lübberstedt, Thomas

    2009-01-01

    this information from available allele sequence data, resulting in an error-prone multi-step process of data handling. Results: PolyMin, a computer program combining the detection of a minimum set of single nucleotide polymorphisms (SNPs) and/or insertions/deletions (INDELs) necessary for allele differentiation...... differentiation has been developed, and its performance compared to other relevant software. The main advantage of PolyMin, especially for plant scientists, is the integration of procedures from sequence analysis to polymorphism selection within a single program, including both haplotype and genotype...
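
    To illustrate the core computational problem this record describes (finding a minimum number of polymorphisms that still distinguishes every pair of alleles), the sketch below applies a simple greedy set-cover heuristic to a toy haplotype table. It is not PolyMin's actual algorithm, and the SNP positions and alleles are invented.

```python
from itertools import combinations

# Toy haplotype table: allele name -> bases observed at candidate SNP positions.
# Data are invented; columns play the role of SNPs/INDELs.
HAPLOTYPES = {
    "H1": "ACGTA",
    "H2": "ACGAA",
    "H3": "TCGTA",
    "H4": "TCTTA",
}
POSITIONS = range(len(next(iter(HAPLOTYPES.values()))))

def minimal_snp_set():
    """Greedy heuristic: repeatedly pick the position separating most unresolved pairs."""
    unresolved = set(combinations(sorted(HAPLOTYPES), 2))
    chosen = []
    while unresolved:
        best = max(POSITIONS, key=lambda p: sum(
            HAPLOTYPES[a][p] != HAPLOTYPES[b][p] for a, b in unresolved))
        newly = {(a, b) for a, b in unresolved
                 if HAPLOTYPES[a][best] != HAPLOTYPES[b][best]}
        if not newly:
            raise ValueError("remaining haplotype pairs are identical at every position")
        chosen.append(best)
        unresolved -= newly
    return chosen

if __name__ == "__main__":
    print("positions needed to differentiate all haplotypes:", minimal_snp_set())
```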

  13. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  14. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA Metrics that have been used in other projects that are not currently being used by the SA team, and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  15. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  16. Runtime Testability in Dynamic Highly-Availability Component-based Systems

    NARCIS (Netherlands)

    Gonzalez, A.; Piel, E.; Gross, H.G.; Van Gemund, A.J.C.

    2010-01-01

    Runtime testing is emerging as the solution for the integration and assessment of highly dynamic, high availability software systems where traditional development-time integration testing cannot be performed. A prerequisite for runtime testing is knowledge about the extent to which the system can be

  17. RiTMO : A Method for Runtime Testability Measurement and Optimisation

    NARCIS (Netherlands)

    Gonzalez, A.; Piel, E.; Gross, H.G.

    2009-01-01

    Version: Accepted as short paper at QSIC 2009. Runtime testing is emerging as the solution for the integration and assessment of highly dynamic, high availability software systems where traditional development-time integration testing is too costly, or cannot be performed. However, in many

  18. A Model for the Measurement of the Runtime Testability of Component-based Systems

    NARCIS (Netherlands)

    González, A.; Piel, E.; Gross, H.G.

    2009-01-01

    Version note: Paper submitted for review at the 5th AMOST Workshop. Runtime testing is emerging as the solution for the integration and validation of software systems where traditional development-time integration testing cannot be performed, such as Systems of Systems or Service Oriented

  19. Developing Software Requirements for a Knowledge Management System That Coordinates Training Programs with Business Processes and Policies in Large Organizations

    Science.gov (United States)

    Kiper, J. Richard

    2013-01-01

    For large organizations, updating instructional programs presents a challenge to keep abreast of constantly changing business processes and policies. Each time a process or policy changes, significant resources are required to locate and modify the training materials that convey the new content. Moreover, without the ability to track learning…

  20. Interplay between requirements, software architecture, and hardware constraints in the development of a home control user interface

    DEFF Research Database (Denmark)

    Loft, M.S.; Nielsen, S.S.; Nørskov, Kim

    2012-01-01

    is to propose the hardware platform as a third Twin Peaks element that must be given attention in projects such as the one described in this paper. Specifically, we discuss how the presence of severe hardware constraints exacerbates making trade-offs between requirements and architecture....

  1. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  2. The Paradox of "Structured" Methods for Software Requirements Management: A Case Study of an e-Government Development Project

    Science.gov (United States)

    Conboy, Kieran; Lang, Michael

    This chapter outlines the alternative perspectives of "rationalism" and "improvisation" within information systems development and describes the major shortcomings of each. It then discusses how these shortcomings manifested themselves within an e-government case study where a "structured" requirements management method was employed. Although this method was very prescriptive and firmly rooted in the "rational" paradigm, it was observed that users often resorted to improvised behaviour, such as privately making decisions on how certain aspects of the method should or should not be implemented.

  3. Software Open Source, Software Gratis?

    OpenAIRE

    Rakhmawati, Nur Aini

    2006-01-01

    The entry into force of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: using open source software. The use of open source software is spreading along with current global issues in Information Communication Technology (ICT). Several organizations and companies have started to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software without a license. Not all of the issues surrounding open source software...

  4. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    Software maintenance is the process of modifying existing operational software by correcting errors, migration of the software to new technologies and platforms, and adapting it to deal with new environmental requirements. It denotes any change made to a software product before and after delivery to customer or user.

  5. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  6. The Systems Test Architect: Enabling The Leap From Testable To Tested

    Science.gov (United States)

    2016-09-01

    not an argument on the value of test and evaluation in systems engineering, that has been covered by multiple sources (Bodmer 2003; Barret 2009; United...verification and validation approach and requirements. Buede (2009, 344–345) makes a similar argument regarding requirements language and attributes, stating...hierarchies of functions, stated as verb-noun sets, into an ever more detailed hierarchy of functions. The systems engineer and systems architect perform this

  7. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  8. Some design constraints required for the use of generic software in embedded systems: Packages which manage abstract dynamic structures without the need for garbage collection

    Science.gov (United States)

    Johnson, Charles S.

    1986-01-01

    The embedded systems running real-time applications, for which Ada was designed, require their own mechanisms for the management of dynamically allocated storage. There is a need for packages which manage their own internal structures to control their deallocation as well, due to the performance implications of garbage collection by the KAPSE. This places a requirement upon the design of generic packages which manage generically structured private types built up from application-defined input types. These kinds of generic packages should figure greatly in the development of lower-level software such as operating systems, schedulers, controllers, and device drivers, and will manage structures such as queues, stacks, linked lists, files, and binary and multary (hierarchical) trees, controlled to prevent inadvertent de-designation of dynamic elements, which is implicit in the assignment operation. A study was made of the use of limited private types in solving the problems of controlling the accumulation of anonymous, detached objects in running systems, and of the use of deallocator procedures for the run-down of application-defined input types during deallocation operations.

  9. 48 CFR 227.7203-16 - Providing computer software or computer software documentation to foreign governments, foreign...

    Science.gov (United States)

    2010-10-01

    ... software or computer software documentation to foreign governments, foreign contractors, or international... REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software...

  10. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The entry into force of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: using open source software. The use of open source software is spreading along with current global issues in Information Communication Technology (ICT). Several organizations and companies have started to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software without a license. Not all of the issues surrounding open source software are true; it is therefore necessary to introduce the concept of open source software, covering its history, licenses and how to choose a license, as well as considerations in choosing among the available open source software. Keywords: licensing, open source, HAKI

  11. Software Metrics and Software Metrology

    CERN Document Server

    Abran, Alain

    2010-01-01

    Most of the software measures currently proposed to the industry bring few real benefits to either software managers or developers. This book looks at the classical metrology concepts from science and engineering, using them as criteria to propose an approach to analyze the design of current software measures and then design new software measures (illustrated with the design of a software measure that has been adopted as an ISO measurement standard). The book includes several case studies analyzing strengths and weaknesses of some of the software measures most often quoted. It is meant for sof

  12. DEVELOPMENT OF METHODOLOGY FOR DESIGNING TESTABLE COMPONENT STRUCTURE OF DISCIPLINARY COMPETENCE

    OpenAIRE

    Vladimir I. Freyman

    2014-01-01

    The aim of the study is to present new methods of quality results assessment of the education corresponding to requirements of Federal State Educational Standards (FSES) of the Third Generation developed for the higher school. The urgency of search of adequate tools for quality competency measurement and its elements formed in the course of experts’ preparation are specified. Methods. It is necessary to consider interference of competency components such as knowledge, abilities, possession in...

  13. Patterns for Parallel Software Design

    CERN Document Server

    Ortega-Arjona, Jorge Luis

    2010-01-01

    Essential reading to understand patterns for parallel programming Software patterns have revolutionized the way we think about how software is designed, built, and documented, and the design of parallel software requires you to consider other particular design aspects and special skills. From clusters to supercomputers, success heavily depends on the design skills of software developers. Patterns for Parallel Software Design presents a pattern-oriented software architecture approach to parallel software design. This approach is not a design method in the classic sense, but a new way of managin

  14. Increasing the impact of usability work in software development

    DEFF Research Database (Denmark)

    Uldall-Espersen, Tobias; Frøkjær, Erik

    2006-01-01

    Usability, Case Study, Software Engineering, Software Quality, Organizational Impact, Usability Requirement Management, CHI 2007 workshop

  15. Design Principles for Interactive Software

    DEFF Research Database (Denmark)

    The book addresses the crucial intersection of human-computer interaction (HCI) and software engineering by asking both what users require from interactive systems and what developers need to produce well-engineered software. Needs are expressed as...

  16. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    Full Text Available In this paper, the definition and the key features of Software Radar, a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar and Intelligent Radar, and the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open system architecture that decouples application software from low-level hardware can easily adopt a "user requirements-oriented" development methodology instead of the traditional "specific function-oriented" development methodology. Compared with traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, which has been developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are also given in the conclusion.
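
    A minimal C++ sketch of the decoupling idea described above: application-level processing is written against an abstract front-end interface so the hardware can be swapped without touching the signal-processing code. The interface and class names are hypothetical and not taken from RadarLab 2.0.

        #include <complex>
        #include <cstddef>
        #include <iostream>
        #include <memory>
        #include <vector>

        // Hypothetical hardware-facing interface: application-level processing
        // depends only on this abstraction, not on any particular digitizer.
        struct IReceiverFrontEnd {
            virtual ~IReceiverFrontEnd() = default;
            virtual std::vector<std::complex<float>> acquire(std::size_t samples) = 0;
        };

        // One concrete back end; a real system would provide others (different
        // digitizers, file playback, simulators) behind the same interface.
        struct SimulatedFrontEnd : IReceiverFrontEnd {
            std::vector<std::complex<float>> acquire(std::size_t samples) override {
                return std::vector<std::complex<float>>(samples, std::complex<float>{1.0f, 0.0f});
            }
        };

        // Application-level component: reconfigurable because it only sees the interface.
        float averagePower(IReceiverFrontEnd& rx, std::size_t samples) {
            auto block = rx.acquire(samples);
            float sum = 0.0f;
            for (const auto& s : block) sum += std::norm(s);
            return samples ? sum / samples : 0.0f;
        }

        int main() {
            std::unique_ptr<IReceiverFrontEnd> rx = std::make_unique<SimulatedFrontEnd>();
            std::cout << "average power: " << averagePower(*rx, 1024) << '\n';
        }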

  17. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  18. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution. The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss... Following the early aspect paradigm, TranSAT allows the software architect to design a software architecture stepwise in terms of aspects at the design stage. It realises the evolution as the weaving of new architectural aspects into an existing software architecture.

  19. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performance. This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. We discuss these issues along with use case scenarios and aim to identify the challenges...

  20. Viable and testable SUSY GUTs with Yukawa unification: the case of split trilinears

    CERN Document Server

    Guadagnoli, Diego; Straub, David M

    2009-01-01

    We explore general SUSY GUT models with exact third-generation Yukawa unification, but where the requirement of universal soft terms at the GUT scale is relaxed. We consider the scenario in which the breaking of universality inherits from the Yukawa couplings, i.e. is of minimal flavor violating (MFV) type. In particular, the MFV principle allows for a splitting between the up-type and the down-type soft trilinear couplings. We explore the viability of this trilinear splitting scenario by means of a fitting procedure to electroweak observables, quark masses as well as flavor-changing neutral current processes. Phenomenological viability singles out one main scenario. This scenario is characterized by a sizable splitting between the trilinear soft terms and a large mu term. Remarkably, this scenario does not invoke a partial decoupling of the sparticle spectrum, as in the case of universal soft terms, but instead it requires part of the spectrum, notably the lightest stop, the gluino and the lightest charginos...

  1. Choosing a software design method for real-time Ada applications: JSD process inversion as a means to tailor a design specification to the performance requirements and target machine

    Science.gov (United States)

    Withey, James V.

    1986-01-01

    The validity of real-time software is determined by its ability to execute on a computer within the time constraints of the physical system it is modeling. In many applications the time constraints are so critical that the details of process scheduling are elevated to the requirements analysis phase of the software development cycle. It is not uncommon to find specifications for a real-time cyclic executive program included in or assumed by such requirements. It was found that preliminary designs structured around this implementation obscure the data flow of the real-world system being modeled, and that it is consequently difficult and costly to maintain, update and reuse the resulting software. A cyclic executive is a software component that schedules and implicitly synchronizes the real-time software through periodic and repetitive subroutine calls. Therefore a design method is sought that allows the deferral of process scheduling to the later stages of design. The appropriate scheduling paradigm must be chosen given the performance constraints, the target environment and the software's lifecycle. The concept of process inversion is explored with respect to the cyclic executive.
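
    A minimal C++ sketch of a cyclic executive as characterized above: a fixed frame table whose entries are called periodically, so scheduling is implicit in the table rather than in the tasks themselves. The tasks, frame contents and timing are illustrative only.

        #include <chrono>
        #include <iostream>
        #include <thread>
        #include <vector>

        // Each minor frame is a fixed list of task subroutines; the executive calls
        // them in order, then sleeps until the next frame boundary.
        using Task = void (*)();

        void readSensors()   { std::cout << "read sensors\n"; }
        void updateControl() { std::cout << "update control law\n"; }
        void sendTelemetry() { std::cout << "send telemetry\n"; }

        int main() {
            const std::vector<std::vector<Task>> frames = {
                {readSensors, updateControl},                 // frame 0
                {readSensors, updateControl, sendTelemetry},  // frame 1
            };
            const auto frameLength = std::chrono::milliseconds(100);

            auto next = std::chrono::steady_clock::now();
            for (int i = 0; i < 4; ++i) {                     // a few cycles for demonstration
                for (Task t : frames[i % frames.size()]) t(); // periodic subroutine calls
                next += frameLength;
                std::this_thread::sleep_until(next);          // wait for the frame boundary
            }
        }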

  2. What Counts in Software Process?

    DEFF Research Database (Denmark)

    Cohn, Marisa

    2009-01-01

    In software development, there is an interplay between Software Process models and Software Process enactments. The former tends to be abstract descriptions or plans. The latter tends to be specific instantiations of some ideal procedure. In this paper, we examine the role of work artifacts and conversations in negotiating between prescriptions from a model and the contingencies that arise in an enactment. A qualitative field study at two Agile software development companies was conducted to investigate the role of artifacts in the software development work and the relationship between these artifacts and the Software Process. Documentation of software requirements is a major concern among software developers and software researchers. Agile software development denotes a different relationship to documentation, one that warrants investigation. Empirical findings are presented which suggest a new understanding...

  3. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition...

  4. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal...

  5. The Software Invention Cube: A classification scheme for software inventions

    NARCIS (Netherlands)

    Bergstra, J.A.; Klint, P.

    2008-01-01

    The patent system protects inventions. The requirement that a software invention should make ‘a technical contribution’ turns out to be untenable in practice and this raises the question, what constitutes an invention in the realm of software. The authors developed the Software Invention Cube

  6. The software invention cube: A classification scheme for software inventions

    NARCIS (Netherlands)

    J.A. Bergstra; P. Klint (Paul)

    2008-01-01

    The patent system protects inventions. The requirement that a software invention should make ‘a technical contribution’ turns out to be untenable in practice and this raises the question, what constitutes an invention in the realm of software. The authors developed the Software Invention

  7. Scientific Software Component Technology

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, S.; Dykman, N.; Kumfert, G.; Smolinski, B.

    2000-02-16

    We are developing new software component technology for high-performance parallel scientific computing to address issues of complexity, re-use, and interoperability for laboratory software. Component technology enables cross-project code re-use, reduces software development costs, and provides additional simulation capabilities for massively parallel laboratory application codes. The success of our approach will be measured by its impact on DOE mathematical and scientific software efforts. Thus, we are collaborating closely with library developers and application scientists in the Common Component Architecture forum, the Equation Solver Interface forum, and other DOE mathematical software groups to gather requirements, write and adopt a variety of design specifications, and develop demonstration projects to validate our approach. Numerical simulation is essential to the science mission at the laboratory. However, it is becoming increasingly difficult to manage the complexity of modern simulation software. Computational scientists develop complex, three-dimensional, massively parallel, full-physics simulations that require the integration of diverse software packages written by outside development teams. Currently, the integration of a new software package, such as a new linear solver library, can require several months of effort. Current industry component technologies such as CORBA, JavaBeans, and COM have all been used successfully in the business domain to reduce software development costs and increase software quality. However, these existing industry component infrastructures will not scale to support massively parallel applications in science and engineering. In particular, they do not address issues related to high-performance parallel computing on ASCI-class machines, such as fast in-process connections between components, language interoperability for scientific languages such as Fortran, parallel data redistribution between components, and massively

  8. 48 CFR 12.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...

  9. DiPS: Filling the Gap between System Software and Testing

    OpenAIRE

    Michiels, Sam; Walravens, Dirk; Janssens, Nico; Verbaeten, Pierre

    2002-01-01

    Testing system software (such as protocol stacks or file systems) is often a tedious and error-prone process. The reason for this is that such software is very complex and often not designed to be tested. This paper presents DiPS, a component framework that enforces the development of testable software, and DiPSUnit, a JUnit extension for testing DiPS units in a uniform way. Although non-trivial test support is provided, using DiPSUnit keeps testing simple and intuitive thanks to...
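
    A minimal C++ sketch of the design-for-testability idea: a component written against a small interface can be exercised in isolation with a recording test double. The names are hypothetical; this is not the DiPS or DiPSUnit API.

        #include <cassert>
        #include <string>

        // Hypothetical downstream interface a component depends on.
        struct IPacketSink {
            virtual ~IPacketSink() = default;
            virtual void deliver(const std::string& packet) = 0;
        };

        // Unit under test: frames a payload and pushes it downstream.
        class Framer {
            IPacketSink& downstream_;
        public:
            explicit Framer(IPacketSink& downstream) : downstream_(downstream) {}
            void send(const std::string& payload) {
                downstream_.deliver("<" + payload + ">");
            }
        };

        // Test double that records what the unit emitted.
        struct RecordingSink : IPacketSink {
            std::string last;
            void deliver(const std::string& packet) override { last = packet; }
        };

        int main() {
            RecordingSink sink;
            Framer framer(sink);
            framer.send("hello");
            assert(sink.last == "<hello>");  // the component is testable in isolation
            return 0;
        }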

  10. Software licenses: Stay honest!

    CERN Multimedia

    Computer Security Team

    2012-01-01

    Do you recall our article about copyright violation in the last issue of the CERN Bulletin, “Music, videos and the risk for CERN”? Now let’s be more precise. “Violating copyright” not only means the illegal download of music and videos, it also applies to software packages and applications.   Users must respect proprietary rights in compliance with the CERN Computing Rules (OC5). Not having legitimately obtained a program or the required licenses to run that software is not a minor offense. It violates CERN rules and puts the Organization at risk! Vendors deserve credit and compensation. Therefore, make sure that you have the right to use their software. In other words, you have bought the software via legitimate channels and use a valid and honestly obtained license. This also applies to “Shareware” and software under open licenses, which might also come with a cost. Usually, only “Freeware” is complete...

  11. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Full Text Available Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is through the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third part discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  12. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  13. A MAINTAINABILITY ENHANCEMENT PROCEDURE FOR REDUCING AGILE SOFTWARE DEVELOPMENT RISK

    OpenAIRE

    Sen-Tarng Lai

    2015-01-01

    In the mobile communications age, the environment changes rapidly, and requirements change is a challenge every software project must face. If the impact of requirements change can be overcome, software development risk can be effectively decreased. In order to reduce the risk of software requirements change, this paper investigates the major software development models and recommends software development that is adaptable to requirements change. Agile development applies the Iterative and Incremental Develop...

  14. Towards a software profession

    Science.gov (United States)

    Berard, Edward V.

    1986-01-01

    An increasing number of programmers have attempted to change their image. They have made it plain that they wish not only to be taken seriously, but also to be regarded as professionals. Many programmers now wish to be referred to as software engineers. If programmers wish to be considered professionals in every sense of the word, two obstacles must be overcome: the inability to think of software as a product, and the idea that little or no skill is required to create and handle software throughout its life cycle. The steps to be taken toward professionalization are outlined along with recommendations.

  15. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  16. Computer-aided design of microfluidic very large scale integration (mVLSI) biochips design automation, testing, and design-for-testability

    CERN Document Server

    Hu, Kai; Ho, Tsung-Yi

    2017-01-01

    This book provides a comprehensive overview of flow-based, microfluidic VLSI. The authors describe and solve in a comprehensive and holistic manner practical challenges such as control synthesis, wash optimization, design for testability, and diagnosis of modern flow-based microfluidic biochips. They introduce practical solutions, based on rigorous optimization and formal models. The technical contributions presented in this book will not only shorten the product development cycle, but also accelerate the adoption and further development of modern flow-based microfluidic biochips, by facilitating the full exploitation of design complexities that are possible with current fabrication techniques. Offers the first practical problem formulation for automated control-layer design in flow-based microfluidic biochips and provides a systematic approach for solving this problem; Introduces a wash-optimization method for cross-contamination removal; Presents a design-for-testability (DfT) technique that can achieve 100...

  17. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

    Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application. Traditional forms of technical education pay little attention to creativity - often encouraging overly rationalistic ways of thinking which stifle the ability to innovate. Professional software developers are often drowned in commercial drudgery and overwhelmed by work pressure and deadlines. The topic that will both ensure success in the market and revitalize their work lives is never addressed. This book sets out the new field of software innovation. It organizes the existing scientific research into eight simple heuristics - guiding principles for organizing a system developer's work-life so that it focuses on innovation.

  18. Software quality assurance: in large scale and complex software-intensive systems

    NARCIS (Netherlands)

    Mistrik, I.; Soley, R.; Ali, N.; Grundy, J.; Tekinerdogan, B.

    2015-01-01

    Software Quality Assurance in Large Scale and Complex Software-intensive Systems presents novel and high-quality research related approaches that relate the quality of software architecture to system requirements, system architecture and enterprise-architecture, or software testing. Modern software

  19. The LUCIFER control software

    Science.gov (United States)

    Jütte, Marcus; Knierim, Volker; Polsterer, Kai; Lehmitz, Michael; Storz, Clemens; Seifert, Walter; Ageorges, Nancy

    2010-07-01

    The successful roll-out of the control software for a complex NIR imager/spectrograph with MOS calls for flexible development strategies due to changing requirements during different phases of the project. A waterfall strategy used in the beginning has to change to a more iterative and agile process in the later stages. The choice of an appropriate program language as well as suitable software layout is crucial. For example the software has to accomplish multiple demands of different user groups, including a high level of flexibility for later changes and extensions. Different access levels to the instrument are mandatory to afford direct control mechanisms for lab operations and inspections of the instrument as well as tools to accomplish efficient science observations. Our hierarchical software structure with four layers of increasing abstract levels and the use of an object oriented language ideally supports these requirements. Here we describe our software architecture, the software development process, the different access levels and our commissioning experiences with LUCIFER 1.

  20. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    This article presents MIAWARE, a software package for Medical Image Analysis With Automated Reporting Engine, which was designed and developed for doctor/radiologist assistance. It allows an image stack from a computed axial tomography scan of the lungs (thorax) to be analyzed and, at the same time, all pathologies to be marked on the images and their characteristics reported. The reporting process is normalized - radiologists cannot describe pathological changes with their own words, but can only use terms from a specific vocabulary set provided by the software. Consequently, a normalized radiological report is automatically generated. Furthermore, the MIAWARE software is accompanied by an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through a specially developed ontology. As a result...

  1. Software survey

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2007-07-15

    This article presented a guide to new software applications designed to facilitate exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In addition to a description of the software application, this article listed the names of software providers and the new features available in products. The featured software of Calgary-based providers included: PetroLOOK by Alcaro Softworks Inc.; ProphetFM and MasterDRIL by Advanced Measurements Inc.; the EDGE screening tool by Canadian Discovery Ltd.; Emission Manager and Regulatory Document Manager by Envirosoft Corporation; ResSurveil, ResBalance and ResAssist by Epic Consulting Services Ltd; FAST WellTest and FAST RTA by Fekete Associates Inc.; OMNI 3D and VISTA 2D/3D by Gedco; VisualVoxAT, SBED and SBEDStudio by Geomodeling Technology Corporation; geoSCOUT, petroCUBE and gDC by GeoLOGIC Systems Ltd.; IHS Enerdeq Desktop and PETRA by IHS; DataVera by Intervera Data Solutions; FORGAS, PIPEFLO and WELLFLO by Neotechnology Consultants Ltd.; E and P Workflow Solutions by Neuralog Inc.; Oil and Gas Solutions by the RiskAdvisory division of SAS; Petrel, GeoFrame, ECLIPSE, OFM, Osprey Risk and Avocet modeler, PIPESIM and Merak by Schlumberger Information Solutions; esi.manage and esi.executive by 3esi; and dbAFE and PROSPECTOR by Winfund Corporation. Tower Management and Maintenance System, OverSite and Safety Orientation Management System software by Edmonton-based 3C Information Solutions Inc. were also highlighted along with PowerSHAPE, PowerMILL and FeatureCAM software by Windsor, Ontario-based Delcam. Software products by Texas-based companies featured in this article included the HTRI Xchanger Suite by Heat Transfer Research Inc.; Drillworks by Knowledge Systems; and GeoProbe, PowerView, GeoGraphix, AssetPlanner, Nexus software, Decision Management System, AssetSolver, and OpenWorks by Landmark; and, eVIN, Rig

  2. Effective methods for software and systems integration

    CERN Document Server

    Summers, Boyd L

    2012-01-01

    Before software engineering builds and installations can be implemented into software and/or systems integrations in military and aerospace programs, a comprehensive understanding of the software development life cycle is required. Covering all the development life cycle disciplines, Effective Methods for Software and Systems Integration explains how to select and apply a life cycle that promotes effective and efficient software and systems integration. The book defines time-tested methods for systems engineering, software design, software engineering informal/formal builds, software engineeri

  3. [Software version and medical device software supervision].

    Science.gov (United States)

    Peng, Liang; Liu, Xiaoyan

    2015-01-01

    The importance of the software version in medical device software supervision does not receive enough attention at present. First, the role of the software version in medical device software supervision is discussed, and then its necessity is analyzed based on a discussion of common misunderstandings of software versions. Finally, concrete suggestions are proposed on software version naming rules, supervision of software versions for the software in medical devices, and a software version supervision scheme.

  4. Software Reviews.

    Science.gov (United States)

    Mathematics and Computer Education, 1987

    1987-01-01

    Presented are reviews of several microcomputer software programs. Included are reviews of: (1) Microstat (Zenith); (2) MathCAD (MathSoft); (3) Discrete Mathematics (True Basic); (4) CALCULUS (True Basic); (5) Linear-Kit (John Wiley); and (6) Geometry Sensei (Broderbund). (RH)

  5. Software Modernization.

    Science.gov (United States)

    1986-05-01

    It is desirable to automate these error-prone tasks to the maximum extent practicable. For even modestly-sized software activities, the costs of auto...

  6. Software Architecture

    NARCIS (Netherlands)

    Tekinerdogan, B.; Zdun, Uwe; Babar, Ali

    2016-01-01

    This book constitutes the proceedings of the 10th European Conference on Software Architecture, ECSA 2016, held in Copenhagen, Denmark, in November/December 2016.

    The 13 full papers presented together with 12 short papers were carefully reviewed and selected from 84 submissions. They are

  7. Ten recommendations for software engineering in research.

    Science.gov (United States)

    Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph

    2014-01-01

    Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software.

  8. Software archeology: a case study in software quality assurance and design

    Energy Technology Data Exchange (ETDEWEB)

    Macdonald, John M [Los Alamos National Laboratory; Lloyd, Jane A [Los Alamos National Laboratory; Turner, Cameron J [COLORADO SCHOOL OF MINES

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  9. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up–to–date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non–linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK require that each new model’s likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine

  10. Computing and software

    Science.gov (United States)

    White, Gary C.; Hines, J.E.

    2004-01-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up–to–date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non–linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK require that each new model’s likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood
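
    As a hedged sketch of the likelihood these records refer to, the commonly used constant-daily-survival-rate form is given below; the packages compared differ in covariate links and implementation details, and this form is not taken from the records themselves. Here s is the daily survival probability and t_i the exposure length (in days) of interval i.

        % Hedged sketch of the standard nest-survival likelihood:
        % an interval survived contributes s^{t_i}; an interval in which the
        % nest failed contributes 1 - s^{t_j}. Covariates typically enter
        % through a logit link (the logistic-regression formulation).
        L(s) = \prod_{i \in \mathrm{survived}} s^{t_i}
               \prod_{j \in \mathrm{failed}} \bigl(1 - s^{t_j}\bigr),
        \qquad \operatorname{logit}(s) = \beta_0 + \beta_1 x_1 + \cdots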

  11. Calculation Software

    Science.gov (United States)

    1994-01-01

    MathSoft Plus 5.0 is a calculation software package for electrical engineers and computer scientists who need advanced math functionality. It incorporates SmartMath, an expert system that determines a strategy for solving difficult mathematical problems. SmartMath was the result of the integration into Mathcad of CLIPS, a NASA-developed shell for creating expert systems. By using CLIPS, MathSoft, Inc. was able to save the time and money involved in writing the original program.

  12. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, communication switchboards). Extensive knowledge and practical experience with digital long-term preservation technologies have also been acquired. This wide spectrum of activities puts us in the position to discuss an often overlooked aspect of digital preservation - the preservation of software programs. There are many resources dedicated to the digital preservation of digital data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - dynamic response to inputs - make computer programs rich compared to documents or linear multimedia. The article opens the questions that stand at the beginning of the road to permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...), where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  13. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...

  14. CMS software and computing

    CERN Document Server

    Charlot, C

    2003-01-01

    CMS is one of the two general-purpose HEP experiments currently under construction for the Large Hadron Collider at CERN. The handling of multi-petabyte data samples in a worldwide context requires computing and software systems with unprecedented scale and complexity. We describe how CMS is meeting the many data analysis challenges in the LHC era. We cover in particular our system of globally distributed regional centers, the status of our object-oriented software, and our strategies for Grid-enriched data analysis.

  15. Green in software engineering

    CERN Document Server

    Calero Munoz, Coral

    2015-01-01

    This is the first book that presents a comprehensive overview of sustainability aspects in software engineering. Its format follows the structure of the SWEBOK and covers the key areas involved in the incorporation of green aspects in software engineering, encompassing topics from requirement elicitation to quality assurance and maintenance, while also considering professional practices and economic aspects. The book consists of thirteen chapters, which are structured in five parts. First the "Introduction" gives an overview of the primary general concepts related to Green IT, discussing wha

  16. Software Development at Belle II

    Science.gov (United States)

    Kuhr, Thomas; Hauth, Thomas

    2015-12-01

    Belle II is a next generation B-factory experiment that will collect 50 times more data than its predecessor Belle. This requires not only a major upgrade of the detector hardware, but also of the simulation, reconstruction, and analysis software. The challenges of the software development at Belle II and the tools and procedures to address them are reviewed in this article.

  17. Reviews in innovative software development

    DEFF Research Database (Denmark)

    Aaen, Ivan; Boelsmand, Jeppe Vestergaard; Jensen, Rasmus

    2009-01-01

    This paper proposes a new review approach for innovative software development. Innovative software development implies that requirements are rarely available as a basis for reviewing and that the purpose of a review is as much to forward additional ideas, as to validate what has been accomplished...

  18. Evaluation of Students’ Skills in Software Project

    OpenAIRE

    Pinar Cihan; Oya Kalipsiz

    2014-01-01

    The software sector has probably witnessed the highest rate of project failure in the world. The industry claims that software engineering graduates are not able to meet its requirements. This is surprising to the academia that offers software engineering specialization. This situation creates a barrier between the software industry and academia, and efforts are being made to reduce the gap. So it is important to identify weaknesses of project...

  19. CONRAD Software Architecture

    Science.gov (United States)

    Guzman, J. C.; Bennett, T.

    2008-08-01

    The Convergent Radio Astronomy Demonstrator (CONRAD) is a collaboration between the computing teams of two SKA pathfinder instruments, MeerKAT (South Africa) and ASKAP (Australia). Our goal is to produce the required common software to operate, process and store the data from the two instruments. Both instruments are synthesis arrays composed of a large number of antennas (40 - 100) operating at centimeter wavelengths with wide-field capabilities. Key challenges are the processing of high volume of data in real-time as well as the remote mode of operations. Here we present the software architecture for CONRAD. Our design approach is to maximize the use of open solutions and third-party software widely deployed in commercial applications, such as SNMP and LDAP, and to utilize modern web-based technologies for the user interfaces, such as AJAX.

  20. Software Epistemology

    Science.gov (United States)

    2016-03-01

    in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment... reduction and technology exploration, as well as incubated a commercial offering with Draper white-label support to DoD and IC customers—dramatically... [Figure 9: OpHash Map] Opcode Hash (OpHash): a custom hashing scheme was developed in our SWE research to enable fast, but fuzzy, matching of basic

  1. Software Engineering Improvement Activities/Plan

    Science.gov (United States)

    2003-01-01

    bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement engineering activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.

  2. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  3. Software essentials design and construction

    CERN Document Server

    Dingle, Adair

    2014-01-01

    About the Cover: Although capacity may be a problem for a doghouse, other requirements are usually minimal. Unlike skyscrapers, doghouses are simple units. They do not require plumbing, electricity, fire alarms, elevators, or ventilation systems, and they do not need to be built to code or pass inspections. The range of complexity in software design is similar. Given available software tools and libraries-many of which are free-hobbyists can build small or short-lived computer apps. Yet, design for software longevity, security, and efficiency can be intricate-as is the design of large-scale sy

  4. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check the software for compliance with the stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. These methods differ both in how they operate and in how they achieve their results. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described. The pros and cons of each particular method are emphasized, and a classification of test techniques for each method is considered. We present and analyze the characteristics and mechanisms of the static analysis of dependencies, as well as their kinds, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, along with some kinds of tools which can be applied to the software when using dynamic analysis methods. Based on this work a conclusion is drawn which describes the most relevant problems of analysis techniques, methods of their solutions and
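
    A minimal C++ sketch of one of the dynamic techniques mentioned above (runtime monitoring): the operation is wrapped with precondition and postcondition checks so violations surface while tests run. The function and checks are illustrative only and are not taken from the article.

        #include <algorithm>
        #include <cassert>
        #include <iostream>
        #include <vector>

        // Dynamic verification by runtime monitoring: a precondition and a
        // postcondition are checked every time the monitored operation runs.
        int checkedBinarySearch(const std::vector<int>& v, int key) {
            assert(std::is_sorted(v.begin(), v.end()));      // precondition
            int lo = 0, hi = static_cast<int>(v.size()) - 1;
            int result = -1;
            while (lo <= hi) {
                int mid = lo + (hi - lo) / 2;
                if (v[mid] == key) { result = mid; break; }
                if (v[mid] < key) lo = mid + 1; else hi = mid - 1;
            }
            assert(result == -1 || v[result] == key);        // postcondition
            return result;
        }

        int main() {
            std::vector<int> data = {1, 3, 5, 7, 9};
            std::cout << checkedBinarySearch(data, 7) << '\n';  // prints 3
            std::cout << checkedBinarySearch(data, 4) << '\n';  // prints -1
        }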

  5. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  6. Importance of Requirement Management : A Requirement Engineering Concern

    OpenAIRE

    Dhirendra Pandey; Vandana Pandey

    2012-01-01

    Requirement engineering is the first phase of the software development process and the most important phase in every software development model. In the requirement engineering phase we gather requirements from the user and use them during software development to produce a software product that satisfies the user's needs. In this research paper we give a fundamental description of requirement engineering and present its basic dimensions. Also, in this research...

  7. Comparative Study on Agile software development methodologies

    OpenAIRE

    Moniruzzaman, A. B. M.; Hossain, Dr Syed Akhter

    2013-01-01

    Today's business environment is very dynamic, and organisations are constantly changing their software requirements to adjust to the new environment. They also demand fast delivery of software products and acceptance of changing requirements. In this respect, traditional plan-driven development fails to meet these requirements. Though traditional software development methodologies, such as life cycle-based structured and object oriented approaches, continue to dominate the sys...

  8. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  9. Mechanisms of Component-Oriented Software Development.

    Science.gov (United States)

    Hofmann, Holger D.; Muench, Volker; Stynes, Jeanne

    1999-01-01

    Explains componentware, a new paradigm in software development that is based on the concept of a software component, a self-contained unit of software which can be distributed over large networks. Discusses the need for new, Internet-based search and retrieval mechanisms, and the architectural requirements and mechanisms of componentware.…

  10. Software Prototyping: Designing Systems for Users.

    Science.gov (United States)

    Spies, Phyllis Bova

    1983-01-01

    Reports on major change in computer software development process--the prototype model, i.e., implementation of skeletal system that is enhanced during interaction with users. Expensive and unreliable software, software design errors, traditional development approach, resources required for prototyping, success stories, and systems designer's role…

  11. Software Architecture for Big Data Systems

    Science.gov (United States)

    2014-03-27

    [Presentation slides: Software Architecture: Trends and New Directions, #SEIswArch, © 2014 Carnegie Mellon University] NoSQL landscape... 2. Identify the architecturally-significant requirements and decision criteria. 3. Evaluate candidate technologies against quality... Software Architecture for Big Data Systems

  12. Effective Software Engineering Leadership for Development Programs

    Science.gov (United States)

    Cagle West, Marsha

    2010-01-01

    Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…

  13. Non-intrusive Instance Level Software Composition

    NARCIS (Netherlands)

    Hatun, Kardelen

    2014-01-01

    A software system is comprised of parts, which interact through shared interfaces. Certain qualities of integration, such as loose-coupling, requiring minimal changes to the software and fine-grained localisation of dependencies, have impact on the overall software quality. Current general-purpose

  14. Physics Validation of the LHC Software

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    The LHC Software will be confronted with unprecedented challenges as soon as the LHC turns on. We summarize the main Software requirements coming from the LHC detectors, triggers and physics, and we discuss several examples of Software components developed by the experiments and the LCG project (simulation, reconstruction, etc.), their validation, and their adequacy for LHC physics.

  15. Architectural viewpoints for global software development

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Tekinerdogan, B.

    Global Software Development (GSD) can be considered as the coordinated activity of software development that is not localized and central but geographically distributed. Designing an appropriate software architecture of a GSD system is important to meet the requirements for the communication,

  16. SWiFT Software Quality Assurance Plan.

    Energy Technology Data Exchange (ETDEWEB)

    Berg, Jonathan Charles [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

    This document describes the software development practice areas and processes which contribute to the ability of SWiFT software developers to provide quality software. These processes are designed to satisfy the requirements set forth by the Sandia Software Quality Assurance Program (SSQAP).

  17. 48 CFR 227.7203-8 - Deferred delivery and deferred ordering of computer software and computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-8 Deferred delivery and deferred ordering of computer software and computer... deferred ordering of computer software and computer software documentation. 227.7203-8 Section 227.7203-8...

  18. Implementing Software Safety in the NASA Environment

    Science.gov (United States)

    Wetherholt, Martha S.; Radley, Charles F.

    1994-01-01

    Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer-related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g. mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies, and sound software and assurance practices to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of

  19. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically-oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...

  20. Prospective observer and software-based assessment of magnetic resonance imaging quality in head and neck cancer: Should standard positioning and immobilization be required for radiation therapy applications?

    Science.gov (United States)

    Ding, Yao; Mohamed, Abdallah S R; Yang, Jinzhong; Colen, Rivka R; Frank, Steven J; Wang, Jihong; Wassal, Eslam Y; Wang, Wenjie; Kantor, Michael E; Balter, Peter A; Rosenthal, David I; Lai, Stephen Y; Hazle, John D; Fuller, Clifton D

    2015-01-01

    The purpose of this study was to investigate the potential of a head and neck magnetic resonance simulation and immobilization protocol on reducing motion-induced artifacts and improving positional variance for radiation therapy applications. Two groups (group 1, 17 patients; group 2, 14 patients) of patients with head and neck cancer were included under a prospective, institutional review board-approved protocol and signed informed consent. A 3.0-T magnetic resonance imaging (MRI) scanner was used for anatomic and dynamic contrast-enhanced acquisitions with standard diagnostic MRI setup for group 1 and radiation therapy immobilization devices for group 2 patients. The impact of magnetic resonance simulation/immobilization was evaluated qualitatively by 2 observers in terms of motion artifacts and positional reproducibility and quantitatively using 3-dimensional deformable registration to track intrascan maximum motion displacement of voxels inside 7 manually segmented regions of interest. The image quality of group 2 (29 examinations) was significantly better than that of group 1 (50 examinations) as rated by both observers in terms of motion minimization and imaging reproducibility (P quality of head and neck MRI in terms of motion-related artifacts and positional reproducibility was greatly improved by use of radiation therapy immobilization devices. Consequently, immobilization with external and intraoral fixation in MRI examinations is required for radiation therapy application. Copyright © 2015 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

  1. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  2. Flight Software Math Library

    Science.gov (United States)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter orders, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance navigation and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance Navigation and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the library to be maintained with the same strategy used in its initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as other environments like the GN&C analyst's simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.
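
    The record does not reproduce the library's actual interface; as a rough illustration of the kind of consistent, well-documented ANSI C utility it describes, the sketch below normalizes a 3-vector using a uniform status-code convention. The type and function names (Vector3, MathStatus, vec3_normalize) are assumptions for illustration, not the library's real API.

```c
/* Illustrative sketch only: a consistent, documented ANSI C math utility of the
 * kind described above. Names (Vector3, MATH_OK, vec3_normalize) are assumptions,
 * not the actual FSW math library API. */
#include <math.h>

typedef struct { double x, y, z; } Vector3;

typedef enum { MATH_OK = 0, MATH_ERR_NULL_PTR = 1, MATH_ERR_ZERO_MAG = 2 } MathStatus;

/* Normalize a vector in place; a uniform status-code convention avoids the
 * per-mission differences in error handling mentioned in the abstract. */
MathStatus vec3_normalize(Vector3 *v)
{
    double mag;

    if (v == 0) {
        return MATH_ERR_NULL_PTR;
    }
    mag = sqrt(v->x * v->x + v->y * v->y + v->z * v->z);
    if (mag == 0.0) {
        return MATH_ERR_ZERO_MAG;   /* caller decides how to recover */
    }
    v->x /= mag;
    v->y /= mag;
    v->z /= mag;
    return MATH_OK;
}
```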

  3. Four Phase Methodology for Developing Secure Software

    OpenAIRE

    Carlos Gonzalez-Flores; Ernesto Liñan-García

    2016-01-01

    A simple and robust approach for developing secure software. The Four Phase methodology consists of developing the non-secure software in phase one, followed by one phase for each of the secure development types (i.e. self-protected software, secure code transformation, and the secure shield). Our methodology first requires determining and understanding the level of security needed for the software. The methodology proposes the use of several teams to accomplish ...

  4. The ATLAS Trigger Simulation with Legacy Software

    CERN Document Server

    Bernius, Catrin; The ATLAS collaboration

    2017-01-01

    Physics analyses at the LHC require accurate simulations of the detector response and the event selection processes, generally done with the most recent software releases. The trigger response simulation is crucial for determination of overall selection efficiencies and signal sensitivities and should be done with the same software release with which data were recorded. This requires potentially running with software dating many years back, the so-called legacy software. Therefore having a strategy for running legacy software in a modern environment becomes essential when data simulated for past years start to represent a sizeable fraction of the total. The requirements and possibilities for such a simulation scheme within the ATLAS software framework were examined and a proof-of-concept simulation chain has been successfully implemented. One of the greatest challenges was the choice of a data format which promises long-term compatibility with old and new software releases. Over the time periods envisaged, data...

  5. Risk Assessment Methodology for Software Supportability (RAMSS): guidelines for Adapting Software Supportability Evaluations

    Science.gov (United States)

    1986-04-14

    and software during the system’s operational life. The computer software development cycle consists of six activities: requirements analysis...Although computer software development typically occurs in the Full-Scale Development Phase, it may also occur during other phases. For example...support an evolving system requirement. In fact, it is common for the system life cycle to entail computer software development in several phases. The

  6. Generic Kalman Filter Software

    Science.gov (United States)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from months to weeks. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains a code for a generic Kalman filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on
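
    The GKF data structures and subfunctions are not shown in the record; the linear Kalman filtering it implements follows the standard predict/update recursion, which the one-dimensional C sketch below illustrates in its simplest form. This is a generic textbook recursion, not the GKF code itself.

```c
/* Generic 1-D linear Kalman filter recursion, for illustration only;
 * the GKF software implements the full matrix form with its own data
 * structures and user-supplied subfunctions. */
#include <stdio.h>

typedef struct {
    double x;   /* state estimate             */
    double p;   /* estimate error variance    */
    double q;   /* process noise variance     */
    double r;   /* measurement noise variance */
} Kalman1D;

static void kalman_step(Kalman1D *kf, double z)
{
    double k;

    kf->p += kf->q;                      /* predict: propagate covariance */
    k = kf->p / (kf->p + kf->r);         /* update: compute Kalman gain   */
    kf->x += k * (z - kf->x);            /* correct state with innovation */
    kf->p *= (1.0 - k);                  /* shrink covariance             */
}

int main(void)
{
    Kalman1D kf = { 0.0, 1.0, 1e-4, 0.25 };
    double meas[] = { 1.1, 0.9, 1.05, 0.98 };
    int i;

    for (i = 0; i < 4; i++) {
        kalman_step(&kf, meas[i]);
        printf("estimate after z=%.2f: %.4f\n", meas[i], kf.x);
    }
    return 0;
}
```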

  7. Software fault tolerance

    OpenAIRE

    Kazinov, Tofik Hasanaga; Mostafa, Jalilian Shahrukh

    2009-01-01

    Because of our present inability to produce error-free software, software fault tolerance is and will continue to be an important consideration in software systems. The root cause of software design errors is the complexity of the systems. This paper surveys various software fault tolerance techniques and methodologies. They fall into two groups: single-version and multi-version software fault tolerance techniques. It is expected that software fault tolerance research will benefit from this research...

  8. Software not as a service

    Science.gov (United States)

    Teal, Tracy

    2017-01-01

    With the expansion in the variety, velocity and volume of data being produced, computing and software development has become a crucial element of astronomy research. However, while we value the research, we place less importance on the development of the software itself, viewing software as a service to research. By viewing software as a service, we derate the effort and expertise it takes to produce, and the training required, for effective research computing. We also don’t provide support for the people doing the development, often expecting individual developers to provide systems administration, user support and training and produce documentation and user interfaces. With our increased reliance on research computing, accurate and reproducible research requires that software not be separate from the act of conducting research, but an integral component - a part of, rather than a service to research. Shifts in how we provide data skills and software development training, integrate development into research programs and academic departments and value software as a product can have an impact on the quality, creativity and types of research we can conduct.

  9. NASA Software Engineering Benchmarking Study

    Science.gov (United States)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    To identify best practices for the improvement of software engineering on projects, NASA's Offices of Chief Engineer (OCE) and Safety and Mission Assurance (OSMA) formed a team led by Heather Rarick and Sally Godfrey to conduct this benchmarking study. The primary goals of the study are to identify best practices that: Improve the management and technical development of software intensive systems; Have a track record of successful deployment by aerospace industries, universities [including research and development (R&D) laboratories], and defense services, as well as NASA's own component Centers; and Identify candidate solutions for NASA's software issues. Beginning in the late fall of 2010, focus topics were chosen and interview questions were developed, based on the NASA top software challenges. Between February 2011 and November 2011, the Benchmark Team interviewed a total of 18 organizations, consisting of five NASA Centers, five industry organizations, four defense services organizations, and four university or university R and D laboratory organizations. A software assurance representative also participated in each of the interviews to focus on assurance and software safety best practices. Interviewees provided a wealth of information on each topic area that included: software policy, software acquisition, software assurance, testing, training, maintaining rigor in small projects, metrics, and use of the Capability Maturity Model Integration (CMMI) framework, as well as a number of special topics that came up in the discussions. NASA's software engineering practices compared favorably with the external organizations in most benchmark areas, but in every topic, there were ways in which NASA could improve its practices. Compared to defense services organizations and some of the industry organizations, one of NASA's notable weaknesses involved communication with contractors regarding its policies and requirements for acquired software. One of NASA's strengths

  10. Software Development Standard Processes (SDSP)

    Science.gov (United States)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; hide

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  11. Lean software development in action

    CERN Document Server

    Janes, Andrea

    2014-01-01

    This book illustrates how goal-oriented, automated measurement can be used to create Lean organizations and to facilitate the development of Lean software, while also demonstrating the practical implementation of Lean software development by combining tried and trusted tools. In order to be successful, a Lean orientation of software development has to go hand in hand with a company's overall business strategy. To achieve this, two interrelated aspects require special attention: measurement and experience management. In this book, Janes and Succi provide the necessary knowledge to establish "

  12. Software Development Risk Management Model

    OpenAIRE

    Islam, Shareeful

    2011-01-01

    Risk management is an effective tool to control risks in software projects and increases the likelihood of project success. Risk management needs to be integrated as early as possible in the project. This dissertation proposes a Goal-driven Software Development Risk Management Model (GSRM) and explicitly integrates it into requirements engineering phase. This integration provides an early warning of potential problems so that both preventive and corrective actions can be undertaken to avoid t...

  13. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  14. Ontologies for software engineering and software technology

    CERN Document Server

    Calero, Coral; Piattini, Mario

    2006-01-01

    Covers two applications of ontologies in software engineering and software technology: sharing knowledge of the problem domain and using a common terminology among all stakeholders; and filtering the knowledge when defining models and metamodels. This book is of benefit to software engineering researchers in both academia and industry.

  15. Evaluating Educational Software

    Directory of Open Access Journals (Sweden)

    Paula Escudeiro

    2010-04-01

    This paper presents the overall evaluation of the Quantitative Evaluation Framework (QEF) approach which has been applied in an operational teaching environment for the last six years. During this period we have evaluated the difference between educational software systems that were developed using the Techno-Didactical Extension for Instruction/Learning Based on Computer (X-TEC) model and educational software systems using other models. The X-TEC model is used in the development of educational software in order to strengthen the potential quality of e-Learning systems. We selected the QEF approach for this evaluation to highlight the strengths and limitations of the X-TEC model. We adapted the approach in a way where the essential criteria are assessed in a pre-evaluation phase which will cover the general usage requirements. In this research project we conduct experiments with groups of students and teachers in Multimedia Information Systems classes of Oporto Polytechnic, to examine the influence of training in an instructional system design approach on their attitude to re-use this approach and on their performances in design, using this approach.
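
    The QEF's actual dimensions, requirements, and weights are defined in the paper and are not reproduced in this record; the sketch below only illustrates the general shape of a quantitative, weighted-criteria evaluation, with made-up criteria and numbers.

```c
/* Made-up weighted-criteria score, illustrating the general idea of a
 * quantitative evaluation framework; the real QEF dimensions, requirements,
 * and weights come from the paper, not from this sketch. */
#include <stdio.h>

typedef struct {
    const char *criterion;
    double weight;       /* relative importance, sums to 1.0 */
    double fulfilment;   /* assessed fulfilment in [0, 1]    */
} Criterion;

int main(void)
{
    Criterion c[] = {
        { "usability",           0.40, 0.80 },
        { "pedagogical content", 0.35, 0.90 },
        { "technical adequacy",  0.25, 0.70 },
    };
    double score = 0.0;
    int i;

    for (i = 0; i < 3; i++) {
        score += c[i].weight * c[i].fulfilment;
    }
    printf("overall quality: %.0f%%\n", 100.0 * score);   /* 81% for these numbers */
    return 0;
}
```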

  16. Controlling Software Piracy.

    Science.gov (United States)

    King, Albert S.

    1992-01-01

    Explains what software manufacturers are doing to combat software piracy, recommends how managers should deal with this problem, and provides a role-playing exercise to help students understand the issues in software piracy. (SR)

  17. GIMS-Software for asset market experiments.

    Science.gov (United States)

    Palan, Stefan

    2015-03-01

    In this article we lay out requirements for an experimental market software for financial and economic research. We then discuss existing solutions. Finally, we introduce GIMS, an open source market software which is characterized by extensibility and ease of use, while offering nearly all of the required functionality.

  18. GIMS—Software for asset market experiments

    Science.gov (United States)

    Palan, Stefan

    2015-01-01

    In this article we lay out requirements for an experimental market software for financial and economic research. We then discuss existing solutions. Finally, we introduce GIMS, an open source market software which is characterized by extensibility and ease of use, while offering nearly all of the required functionality. PMID:26525085

  19. Automated Estimation Of Software-Development Costs

    Science.gov (United States)

    Roush, George B.; Reini, William

    1993-01-01

    COSTMODL is automated software development-estimation tool. Yields significant reduction in risk of cost overruns and failed projects. Accepts description of software product developed and computes estimates of effort required to produce it, calendar schedule required, and distribution of effort and staffing as function of defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
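
    COSTMODL's internal cost model is not given in the record; the sketch below shows only the general shape of a parametric effort and schedule estimate, using the published basic COCOMO "organic" coefficients as an assumed stand-in rather than COSTMODL's actual equations.

```c
/* Illustration of a parametric effort/schedule estimate of the kind COSTMODL
 * automates. The coefficients are the published basic COCOMO "organic" values,
 * used here only as an assumed stand-in for COSTMODL's internal model. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    double ksloc = 32.0;                          /* estimated size, KSLOC */
    double effort = 2.4 * pow(ksloc, 1.05);       /* person-months         */
    double schedule = 2.5 * pow(effort, 0.38);    /* calendar months       */

    printf("effort   : %.1f person-months\n", effort);
    printf("schedule : %.1f months\n", schedule);
    printf("avg staff: %.1f people\n", effort / schedule);
    return 0;
}
```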

  20. Software Metrics: Measuring Haskell

    OpenAIRE

    Ryder, Chris; Thompson, Simon

    2005-01-01

    Software metrics have been used in software engineering as a mechanism for assessing code quality and for targeting software development activities, such as testing or refactoring, at areas of a program that will most benefit from them. Haskell has many tools for software engineering, such as testing, debugging and refactoring tools, but software metrics have mostly been neglected. The work presented in this paper identifies a collection of software metrics for use with Haskell programs. Thes...

  1. Software Intensive Systems

    National Research Council Canada - National Science Library

    Horvitz, E; Katz, D. J; Rumpf, R. L; Shrobe, H; Smith, T. B; Webber, G. E; Williamson, W. E; Winston, P. H; Wolbarsht, James L

    2006-01-01

    .... Recommend that DoN create a software acquisition specialty, mandate basic schooling for software acquisition specialists, close certain acquisition loopholes that permit poor development practices...

  2. Interface-based software integration

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-07-01

    Enterprise architecture frameworks define the goals of enterprise architecture in order to make business processes and IT operations more effective, and to reduce the risk of future investments. These enterprise architecture frameworks offer different architecture development methods that help in building enterprise architecture. In practice, the larger organizations become, the larger their enterprise architecture and IT become. This leads to an increasingly complex system of enterprise architecture development and maintenance. Application software architecture is one type of architecture that, along with business architecture, data architecture and technology architecture, composes enterprise architecture. From the perspective of integration, enterprise architecture can be considered a system of interaction between multiple examples of application software. Therefore, effective software integration is a very important basis for the future success of the enterprise architecture in question. This article will provide interface-based integration practice in order to help simplify the process of building such a software integration system. The main goal of interface-based software integration is to solve problems that may arise with software integration requirements and developing software integration architecture.
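
    The article's own integration practice is not reproduced in the record; as a rough sketch of the underlying idea, the C example below lets integration code depend on a shared interface (a struct of function pointers) rather than on the concrete implementations wired in behind it. All names are hypothetical.

```c
/* Hypothetical illustration of interface-based integration in C:
 * callers depend only on the MessageSender interface, not on the
 * concrete transports plugged in behind it. */
#include <stdio.h>

typedef struct {
    const char *name;
    int (*send)(const char *payload);   /* returns 0 on success */
} MessageSender;

static int send_over_queue(const char *payload)
{
    printf("[queue] %s\n", payload);
    return 0;
}

static int send_over_http(const char *payload)
{
    printf("[http] %s\n", payload);
    return 0;
}

/* Integration code is written once against the interface. */
static int publish(const MessageSender *sender, const char *payload)
{
    return sender->send(payload);
}

int main(void)
{
    MessageSender queue = { "queue", send_over_queue };
    MessageSender http  = { "http",  send_over_http  };

    publish(&queue, "order created");
    publish(&http,  "order created");
    return 0;
}
```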

  3. Modern Tools for Modern Software

    Energy Technology Data Exchange (ETDEWEB)

    Kumfert, G; Epperly, T

    2001-10-31

    This is a proposal for a new software configure/build tool for building, maintaining, deploying, and installing software. At its completion, this new tool will replace current standard tool suites such as "autoconf", "automake", "libtool", and the de facto standard build tool, "make". This ambitious project is born out of the realization that as scientific software has grown in size and complexity over the years, the difficulty of configuring and building software has increased as well. For high performance scientific software, additional complexities often arise from the need for portability to multiple platforms (including many one-of-a-kind platforms), multilanguage implementations, use of third-party libraries, and a need to adapt algorithms to the specific features of the hardware. Development of scientific software is being hampered by the quality of configuration and build tools commonly available. Inordinate amounts of time and expertise are required to develop and maintain the configure and build system for a moderately complex project. Better build and configure tools will increase developer productivity. This proposal is a first step in a process of shoring up the foundation upon which DOE software is created and used.

  4. Gravity with free initial conditions: A solution to the cosmological constant problem testable by CMB B -mode polarization

    Science.gov (United States)

    Totani, Tomonori

    2017-10-01

    In standard general relativity the universe cannot be started with arbitrary initial conditions, because four of the ten components of the Einstein's field equations (EFE) are constraints on initial conditions. In the previous work it was proposed to extend the gravity theory to allow free initial conditions, with a motivation to solve the cosmological constant problem. This was done by setting four constraints on metric variations in the action principle, which is reasonable because the gravity's physical degrees of freedom are at most six. However, there are two problems about this theory; the three constraints in addition to the unimodular condition were introduced without clear physical meanings, and the flat Minkowski spacetime is unstable against perturbations. Here a new set of gravitational field equations is derived by replacing the three constraints with new ones requiring that geodesic paths remain geodesic against metric variations. The instability problem is then naturally solved. Implications for the cosmological constant Λ are unchanged; the theory converges into EFE with nonzero Λ by inflation, but Λ varies on scales much larger than the present Hubble horizon. Then galaxies are formed only in small Λ regions, and the cosmological constant problem is solved by the anthropic argument. Because of the increased degrees of freedom in metric dynamics, the theory predicts new non-oscillatory modes of metric anisotropy generated by quantum fluctuation during inflation, and CMB B -mode polarization would be observed differently from the standard predictions by general relativity.
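
    For orientation, the standard Einstein field equations with a cosmological constant, whose four constraint components the abstract refers to, can be written as follows; this is the textbook form, not the modified field equations derived in the paper.

```latex
% Standard Einstein field equations with cosmological constant (textbook form),
% shown only for orientation; the paper derives a modified set of equations.
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu},
\qquad
G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu}
```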

  5. Practical support for Lean Six Sigma software process definition using IEEE software engineering standards

    CERN Document Server

    Land, Susan K; Walz, John W

    2012-01-01

    Practical Support for Lean Six Sigma Software Process Definition: Using IEEE Software Engineering Standards addresses the task of meeting the specific documentation requirements in support of Lean Six Sigma. This book provides a set of templates supporting the documentation required for basic software project control and management and covers the integration of these templates for their entire product development life cycle. Find detailed documentation guidance in the form of organizational policy descriptions, integrated set of deployable document templates, artifacts required in suppo

  6. Artificial intelligence approaches to software engineering

    Science.gov (United States)

    Johannes, James D.; Macdonald, James R.

    1988-01-01

    Artificial intelligence approaches to software engineering are examined. The software development life cycle is a sequence of not so well-defined phases. Improved techniques for developing systems have been formulated over the past 15 years, but pressure to reduce current costs continues. Software development technology seems to be standing still. The primary objective of the knowledge-based approach to software development presented in this paper is to avoid problem areas that lead to schedule slippages, cost overruns, or software products that fall short of their desired goals. Identifying and resolving software problems early, often in the phase in which they first occur, has been shown to contribute significantly to reducing risks in software development. Software development is not a mechanical process but a basic human activity. It requires clear thinking, work, and rework to be successful. The artificial intelligence approaches to software engineering presented support the software development life cycle through the use of software development techniques and methodologies in terms of changing current practices and methods. These should be replaced by better techniques that improve the process of software development and the quality of the resulting products. The software development process can be structured into well-defined steps, whose interfaces are standardized, supported and checked by automated procedures that provide error detection, production of the documentation and ultimately support the actual design of complex programs.

  7. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Today, it is important for software companies to build software systems in a short time interval, to reduce costs, and to achieve a good market position. Therefore, well-organized and systematic development approaches are required. Reusing software components that are well tested can be a good way to develop software applications in an effective manner. The reuse of software components is less expensive and less time-consuming than development from scratch. But it is dangerous to think that software components can be matched together without any problems. Software components themselves are well tested, of course, but when they are composed together problems can occur. Most problems stem from interaction and communication. To avoid such errors, a framework has to be developed for analysing software components. That framework determines the compatibility of corresponding software components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment will be designed that checks the compatibility of black-box software components. This article is concerned with the question of how coupled software components can be verified using an analyser framework, and describes the usage of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed, and results are shown using a test environment.

  8. Programming Makes Software; Support Makes Users

    Science.gov (United States)

    Batcheller, A. L.

    2010-12-01

    Skilled software engineers may build fantastic software for climate modeling, yet fail to achieve their project’s objectives. Software support and related activities are just as critical as writing software. This study followed three different software projects in the climate sciences, using interviews, observation, and document analysis to examine the value added by support work. Supporting the project and interacting with users was a key task for software developers, who often spent 50% of their time on it. Such support work most often involved replying to questions on an email list, but also included talking to users on teleconference calls and in person. Software support increased adoption by building the software’s reputation and showing individuals how the software can meet their needs. In the process of providing support, developers often learned of new requirements as users reported features they desired and bugs they found. As software matures and gains widespread use, support work often increases. In fact, such increases can be one signal that the software has achieved broad acceptance. Maturing projects also find demand for instructional classes, online tutorials and detailed examples of how to use the software. The importance of support highlights the fact that building software systems involves both social and technical aspects. Yes, we need to build the software, but we also need to “build” the users and practices that can take advantage of it.

  9. Maximizing ROI on software development

    CERN Document Server

    Sikka, Vijay

    2004-01-01

    A brief review of software development history. Software complexity crisis. Software development ROI. The case for global software development and testing. Software quality and test ROI. How do you implement global software development and testing. Case studies.

  10. Test Suite Cooperative Framework on Software Quality

    Science.gov (United States)

    Liu, Zhenyu; Yang, Genxing; Cai, Lizhi

    Software testing has gradually played an important role in controlling the quality of software product. In this paper, we study the characteristics of test suites in software testing and analyze their structure. A novel test suite cooperative framework is presented for software testing based on the existing test suite. The framework can analyze different test suites with ontology and taxonomy, and help cooperation among the test suites to some extent. A tool has been developed with .NET platform to meet the requirements of designing cooperative test suite in software testing projects.

  11. Software Product Manager: A Mechanism to manage software products in small and medium ISVs

    NARCIS (Netherlands)

    Katchow, R.; van de Weerd, I.|info:eu-repo/dai/nl/304836664; Brinkkemper, S.|info:eu-repo/dai/nl/07500707X; Rooswinkel, A.

    2009-01-01

    In this paper, we present SP Manager as an innovative tool for managing software products in small and medium independent software vendors (ISVs). This tool incorporates the operational software product management (SPM) processes focused on requirements management and release planning. By using

  12. Quality Assurance in Software Development: An Exploratory Investigation in Software Project Failures and Business Performance

    Science.gov (United States)

    Ichu, Emmanuel A.

    2010-01-01

    Software quality is perhaps one of the most sought-after attributes in product development; however, this goal is unattained. Problem factors in software development, and how these have affected the maintainability of the delivered software systems, require a thorough investigation. It was, therefore, very important to understand software…

  13. SAGA: A project to automate the management of software production systems

    Science.gov (United States)

    Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1986-01-01

    Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirements to build large, reliable, and maintainable software systems increase with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process, i.e., the SAGA project, comprising the research necessary to design and build a software development environment which automates the software engineering process. Progress under SAGA is described.

  14. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process use to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment

  15. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

    The indexed abstract consists of citation fragments on multiversion software, including: "... multiversion software subject to coincident errors," IEEE Trans. Software Eng. SE-11:1511-1517; Eckhardt, D.E., A.K. Caglayan, J.C. Knight, L.D. Lee, D.F. ...; and Knight, J.C. and N.G. Leveson, 1986, "Experimental evaluation of the assumption of independence in multiversion software," IEEE Trans. Software Eng.

  16. Software distribution using xnetlib

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science]|[Oak Ridge National Lab., TN (US); Rowan, T.H. [Oak Ridge National Lab., TN (US); Wade, R.C. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.

  17. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  18. Testability Design Rating System: Testability Handbook. Volume 1

    Science.gov (United States)

    1992-02-01

    Fragments from the handbook's list of figures and body text: Figure 13-4, Addition of Test Point Following Frequency Conversion; Figure 13-5, Partitioning of Matched Components; testing and fault isolation are major cost factors in the design verification, manufacturing, and support phases of a product's life; testing and fault isolation are... NY 13441-5700; in addition, contractors must send a copy of their approved DD Form 2345; information on the DD Form 2345 may be obtained from the...

  19. Software Language Evolution

    OpenAIRE

    Vermolen, S.D.

    2012-01-01

    Software plays a critical role in our daily life. Vast amounts of money are spent on more and more complex systems. All software, regardless of whether it controls a plane or the game on your phone, is never finished. Software changes when it contains bugs or when new functionality is added. This process of change is called software evolution. Despite what the name suggests, this is in practice a rapid process. Software is described in a software language. Not only software can evolve, also the langua...

  20. The MINERVA Software Development Process

    Science.gov (United States)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
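
    The formally specified and verified geo-containment algorithms themselves are not reproduced in the record; the C sketch below only illustrates the kind of containment test involved, using a standard ray-crossing point-in-polygon check, and carries none of the formal verification the MINERVA process provides.

```c
/* Standard ray-crossing point-in-polygon test, shown only to illustrate the
 * kind of containment check that MINERVA-verified algorithms provide; this
 * sketch has none of the formal guarantees described above. */
#include <stdio.h>

/* Returns 1 if (px, py) lies inside the polygon given by n vertices, else 0. */
static int point_in_polygon(const double *vx, const double *vy, int n,
                            double px, double py)
{
    int i, j, inside = 0;

    for (i = 0, j = n - 1; i < n; j = i++) {
        int crosses = (vy[i] > py) != (vy[j] > py);
        if (crosses &&
            px < (vx[j] - vx[i]) * (py - vy[i]) / (vy[j] - vy[i]) + vx[i]) {
            inside = !inside;
        }
    }
    return inside;
}

int main(void)
{
    double vx[] = { 0.0, 4.0, 4.0, 0.0 };   /* simple rectangular region */
    double vy[] = { 0.0, 0.0, 3.0, 3.0 };

    printf("inside:  %d\n", point_in_polygon(vx, vy, 4, 2.0, 1.5));
    printf("outside: %d\n", point_in_polygon(vx, vy, 4, 5.0, 1.5));
    return 0;
}
```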

  1. Agile software assessment

    OpenAIRE

    Nierstrasz Oscar; Lungu Mircea

    2012-01-01

    Informed decision making is a critical activity in software development but it is poorly supported by common development environments which focus mainly on low level programming tasks. We posit the need for agile software assessment which aims to support decision making by enabling rapid and effective construction of software models and custom analyses. Agile software assessment entails gathering and exploiting the broader context of software information related to the system at hand as well ...

  2. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Science.gov (United States)

    2010-10-01

    ... REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE; GENERAL CONTRACTING REQUIREMENTS; PATENTS, DATA, AND COPYRIGHTS; Rights in Computer Software and Computer Software Documentation. Section 227.7203-10, Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  3. Analog Input Data Acquisition Software

    Science.gov (United States)

    Arens, Ellen

    2009-01-01

    DAQ Master Software allows users to easily set up a system to monitor up to five analog input channels and save the data after acquisition. This program was written in LabVIEW 8.0, and requires the LabVIEW runtime engine 8.0 to run the executable.

  4. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, E; van Gurp, J; Bosch, J; Bastide, R; Palanque, P; Roth, J

    2005-01-01

    Studies of software engineering projects show that a large number of usability-related change requests are made after deployment. Fixing usability problems during the later stages of development often proves to be costly, since many of the necessary changes require changes to the system that

  5. RELAP-7 Software Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Choi, Yong-Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Zou, Ling [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support

    2014-09-25

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  6. Testable baryogenesis in seesaw models

    Energy Technology Data Exchange (ETDEWEB)

    Hernández, P.; Kekic, M. [Instituto de Física Corpuscular, Universidad de Valencia and CSIC,Edificio Institutos Investigación,Catedrático José Beltrán 2, 46980 (Spain); López-Pavón, J. [INFN, Sezione di Genova,via Dodecaneso 33, 16146 Genova (Italy); Racker, J.; Salvado, J. [Instituto de Física Corpuscular, Universidad de Valencia and CSIC,Edificio Institutos Investigación,Catedrático José Beltrán 2, 46980 (Spain)

    2016-08-26

    We revisit the production of baryon asymmetries in the minimal type I seesaw model with heavy Majorana singlets in the GeV range. In particular we include “washout” effects from scattering processes with gauge bosons, Higgs decays and inverse decays, besides the dominant top scatterings. We show that in the minimal model with two singlets, and for an inverted light neutrino ordering, future measurements from SHiP and neutrinoless double beta decay could in principle provide sufficient information to predict the matter-antimatter asymmetry in the universe. We also show that SHiP measurements could provide very valuable information on the PMNS CP phases.

  7. Amerind taxonomy and testable hypotheses.

    Science.gov (United States)

    Pichardo, M

    1998-06-01

    The acceptance of a 30,000 yr B.P. age for Valsequillo sets new parameters for hypotheses of Paleoindian entry into America. A review of Amerind taxonomy defines the early groups as Otamid-Sundadonts. Isolation in America led to an adaptive radiation that has implications for the origin and dispersal of Pithecanthropus.

  8. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  9. Amalgamation of Personal Software Process in Software ...

    African Journals Online (AJOL)

    Today, concern for quality has become an international movement. Even though most industrial organizations have now adopted modern quality principles, the software community has continued to rely on testing as the principal quality management method. Different decades have different trends in software engineering.

  10. Software as a service approach to sensor simulation software deployment

    Science.gov (United States)

    Webster, Steven; Miller, Gordon; Mayott, Gregory

    2012-05-01

    Traditionally, military simulation has been problem domain specific. Executing an exercise currently requires multiple simulation software providers to specialize, deploy, and configure their respective implementations, integrate the collection of software to achieve a specific system behavior, and then execute for the purpose at hand. This approach leads to rigid system integrations which require simulation expertise for each deployment due to changes in location, hardware, and software. Our alternative is Software as a Service (SaaS) predicated on the virtualization of Night Vision Electronic Sensors (NVESD) sensor simulations as an exemplary case. Management middleware elements layer self provisioning, configuration, and integration services onto the virtualized sensors to present a system of services at run time. Given an Infrastructure as a Service (IaaS) environment, enabled and managed system of simulations yields a durable SaaS delivery without requiring user simulation expertise. Persistent SaaS simulations would provide on demand availability to connected users, decrease integration costs and timelines, and benefit the domain community from immediate deployment of lessons learned.

  11. Software Engineering Institute: Year in Review 2008

    Science.gov (United States)

    2008-01-01

    into a single package composed of a single integrated circuit. The increasing availability of processors with many computing cores requires better... interaction among Army software experts that the force will be able to assure that it obtains high-quality and effective software products. In short, the... the Impact of Scale,” 5th International Workshop on Model-Based Methodologies for Pervasive and Embedded Software (part of the ETAPS Conference

  12. The ATLAS Trigger Simulation with Legacy Software

    CERN Document Server

    Bernius, Catrin; The ATLAS collaboration

    2017-01-01

    Physics analyses at the LHC which search for rare physics processes or measure Standard Model parameters with high precision require accurate simulations of the detector response and the event selection processes. The accurate simulation of the trigger response is crucial for determination of overall selection efficiencies and signal sensitivities. For the generation and the reconstruction of simulated event data, generally the most recent software releases are used to ensure the best agreement between simulated data and real data. For the simulation of the trigger selection process, however, the same software release with which real data were taken should ideally be used. This requires potentially running with software dating many years back, the so-called legacy software. Therefore having a strategy for running legacy software in a modern environment becomes essential when data simulated for past years start to represent a sizeable fraction of the total. The requirements and possibilities for such a simulatio...

  13. Dynamic visualization techniques for high consequence software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-02-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.
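
    The SAGE constraint language and the SAVAnT tool are not shown in the record; the C sketch below is only a generic illustration of monitoring an executing program against prespecified requirements constraints, with a hypothetical state and constraint table.

```c
/* Hypothetical illustration of runtime requirement monitoring: each constraint
 * is a named predicate over program state, checked as execution proceeds.
 * This is not the SAGE language or the SAVAnT tool itself. */
#include <stdio.h>

typedef struct { double tank_pressure; double valve_open_fraction; } State;

typedef struct {
    const char *requirement;
    int (*holds)(const State *s);   /* returns nonzero if satisfied */
} Constraint;

static int pressure_in_range(const State *s)
{
    return s->tank_pressure >= 10.0 && s->tank_pressure <= 95.0;
}

static int valve_fraction_valid(const State *s)
{
    return s->valve_open_fraction >= 0.0 && s->valve_open_fraction <= 1.0;
}

static void check_all(const Constraint *c, int n, const State *s)
{
    int i;
    for (i = 0; i < n; i++) {
        if (!c[i].holds(s)) {
            printf("VIOLATION: %s\n", c[i].requirement);
        }
    }
}

int main(void)
{
    Constraint reqs[] = {
        { "tank pressure stays within 10..95", pressure_in_range },
        { "valve command stays within 0..1",   valve_fraction_valid },
    };
    State s = { 97.5, 0.4 };            /* simulated program state */

    check_all(reqs, 2, &s);             /* flags the pressure constraint */
    return 0;
}
```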

  14. Improving Agile Software Practice

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte

    2006-01-01

    Software process improvement in small and agile organizations is often problematic, but achieving good SPI assessments can still be necessary to stay in the market or to meet demands of multinational owners. The traditional norm-driven, centralized and control-centered improvement approaches has... from their new multinational owners. In the project we experimented to reach a less centralized and control-centered SPI approach trying to meet the agile culture of the firm both within diagnosing, improvement planning, process design and evaluation, even though the goal was applying to the norm. After... having chosen the improvement area, requirement management, a more formal culture assessment and comparisons with the culture of the CMM norm helped guide the design of the new processes and tools. The paper suggests an SPI approach based on problem diagnosis instead of formal CMMI assessment, culture...

  15. A Probabilistic Software System Attribute Acceptance Paradigm for COTS Software Evaluation

    Science.gov (United States)

    Morris, A. Terry

    2005-01-01

    Standard software requirement formats are written from top-down perspectives only, that is, from an ideal notion of a client's needs. Despite the exactness of the standard format, software and system errors in designed systems have abounded. Bad and inadequate requirements have resulted in cost overruns, schedule slips and lost profitability. Commercial off-the-shelf (COTS) software components are even more troublesome than designed systems because they are often provided as is and subsequently delivered with unsubstantiated validation of described capabilities. For COTS software, there needs to be a way to express the client's software needs in a consistent and formal manner using software system attributes derived from software quality standards. Additionally, the format needs to be amenable to software evaluation processes that integrate observable evidence garnered from historical data. This paper presents a paradigm that effectively bridges the gap between what a client desires (top-down) and what has been demonstrated (bottom-up) for COTS software evaluation. The paradigm addresses the specification of needs before the software evaluation is performed and can be used to increase the shared understanding between clients and software evaluators about what is required and what is technically possible.

  16. Operational excellence (six sigma) philosophy: Application to software quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Lackner, M.

    1997-11-01

    This report contains viewgraphs on the operational excellence philosophy of six sigma applied to software quality assurance. The report outlines the following: goal of six sigma; six sigma tools; manufacturing vs. administrative processes; software quality assurance document inspections; mapping the software quality assurance requirements document; failure mode effects analysis for the requirements document; measuring the right response variables; and questions.

  17. Mapping Social Network to Software Architecture to Detect Structure Clashes in Agile Software Development

    NARCIS (Netherlands)

    Amrit, Chintan Amrit; van Hillegersberg, Jos

    2007-01-01

    Software development is rarely an individual effort and generally involves teams of developers collaborating together in order to generate reliable code. Such collaborations require proper communication and regular coordination among the team members. In addition, coordination is required to sort

  18. Strengthening Software Authentication with the ROSE Software Suite

    Energy Technology Data Exchange (ETDEWEB)

    White, G

    2006-06-15

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.
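
    As a language-neutral illustration of the rule-based scanning idea described above (ROSE itself is a C/C++ source-to-source infrastructure; the sketch below uses Python's standard ast module only to show the general approach, and the rule set is hypothetical):

      import ast

      # Hypothetical rule set: call names that warrant manual review.
      SUSPICIOUS_CALLS = {"eval", "exec", "system"}

      def flag_suspicious(source, filename="<unknown>"):
          """Walk the syntax tree and report calls matching the rule set."""
          findings = []
          for node in ast.walk(ast.parse(source, filename=filename)):
              if isinstance(node, ast.Call):
                  name = getattr(node.func, "id", getattr(node.func, "attr", None))
                  if name in SUSPICIOUS_CALLS:
                      findings.append((filename, node.lineno, name))
          return findings

      sample = "import os\nos.system('rm -rf /tmp/x')\n"
      for path, line, name in flag_suspicious(sample, "sample.py"):
          print(f"{path}:{line}: suspicious call to {name}")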

  19. Evolvable Neural Software System

    Science.gov (United States)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.

  20. Model-driven specification of software services

    NARCIS (Netherlands)

    Shishkov, Boris; van Sinderen, Marten J.; Tekinerdogan, B.

    2007-01-01

    Adequately aligning business requirements and software functionality, as well as achieving ‘loose coupling’ for service functionalities, are identified as challenges relevant to service-oriented software design. Furthering previous related work, we propose in this paper an application design process

  1. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets......, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation......, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  2. Improving Software Reliability Forecasting

    NARCIS (Netherlands)

    Burtsy, Bernard; Albeanu, Grigore; Boros, Dragos N.; Popentiu, Florin; Nicola, V.F.

    1996-01-01

    This work investigates some methods for software reliability forecasting. A supermodel is presented as a suitable tool for predicting reliability in software project development. Also, time series forecasting for cumulative interfailure time is proposed and illustrated.
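
    A minimal sketch of the second idea, assuming hypothetical failure data and an ordinary polynomial trend as a stand-in for the forecasting methods (the paper's supermodel is not reproduced here):

      import numpy as np

      # Hypothetical interfailure times (hours) observed during testing.
      interfailure = np.array([12.0, 15.5, 19.0, 25.0, 31.5, 40.0])
      cumulative = np.cumsum(interfailure)
      index = np.arange(1, len(cumulative) + 1)

      # Fit a quadratic trend to the cumulative interfailure time series
      # and extrapolate one step ahead to forecast the next failure.
      coeffs = np.polyfit(index, cumulative, deg=2)
      next_cumulative = np.polyval(coeffs, len(cumulative) + 1)
      print(f"forecast cumulative time at next failure: {next_cumulative:.1f} h")
      print(f"forecast next interfailure time: {next_cumulative - cumulative[-1]:.1f} h")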

  3. Software Assurance in Acquisition: Mitigating Risks to the Enterprise. A Reference Guide for Security-Enhanced Software Acquisition and Outsourcing

    Science.gov (United States)

    2009-02-01

    IEC] 17011) and accrediting Information Technology Testing Laboratories (working under ISO/IEC 17025) maintain a cadre of commercial software testing...software product is provided to an independent accredited software testing organization (ISO/IEC 17025) to verify that not only functional requirements but...2004 Conformity assessment—General Requirements for Accreditation Bodies Accrediting Conformity Assessment Bodies. [ISO/IEC 17025] ISO/IEC 17025:2005

  4. Agent Building Software

    Science.gov (United States)

    2000-01-01

    AgentBuilder is a software component developed under an SBIR contract between Reticular Systems, Inc., and Goddard Space Flight Center. AgentBuilder allows software developers without experience in intelligent agent technologies to easily build software applications using intelligent agents. Agents are components of software that will perform tasks automatically, with no intervention or command from a user. AgentBuilder reduces the time and cost of developing agent systems and provides a simple mechanism for implementing high-performance agent systems.

  5. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concerns in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and qualit

  6. Software Architecture Simulation

    OpenAIRE

    Mårtensson, Frans; Jönsson, Per

    2002-01-01

    A software architecture is one of the first steps towards a software system. A software architecture can be designed in different ways. During the design phase, it is important to select the most suitable design of the architecture, in order to create a good foundation for the system. The selection process is performed by evaluating architecture alternatives against each other. We investigate the use of continuous simulation of a software architecture as a support tool for architecture evalua...

  7. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...... and time estimation skills but that the productivity did not decrease and the resulting product quality was improved. The implications of these findings are briefly addressed....

  8. Viking Software Data

    Science.gov (United States)

    1977-05-01

    SWSG were then identified to be an Integration Contractor Software System Engineer (ICSSE), a NMC Software System Engineer (VLSSE), an Orbiter Software...

  9. Software variability management

    NARCIS (Netherlands)

    Bosch, J; Nord, RL

    2004-01-01

    During recent years, the amount of variability that has to be supported by a software artefact is growing considerably and its management is evolving into a major challenge during development, usage, and evolution of software artefacts. Successful management of variability in software leads to

  10. Software Language Evolution

    NARCIS (Netherlands)

    Vermolen, S.D.

    2012-01-01

    Software plays a critical role in our daily life. Vast amounts of money are spent on more and more complex systems. All software, regardless of whether it controls a plane or the game on your phone, is never finished. Software changes when it contains bugs or when new functionality is added. This process of

  11. Microcomputer Software Collections.

    Science.gov (United States)

    Demas, Samuel

    1985-01-01

    Presents overview of special considerations in developing microcomputer software collections, review of standardized cataloging practices, and discussion of problems of selection and acquisition of software. Policies governing loan procedures for microcomputer software which involve four types of copy protection (patent, trade secret, contract,…

  12. Copyright and Computer Software.

    Science.gov (United States)

    Haugness, C. A.

    1985-01-01

    Explores some of the difficult areas of copyright protection for computer software including legal and illegal copying; values imparted to students if illegal software copies are used; and teacher and administrator responsibility. A suggested school district policy on software copyright is presented. (MBR)

  13. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component in software evolution. It requires a huge amount of time and manpower, as well as financial resources. The challenges are size, seniority and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result the model layer will be the central part in further e

  14. Estimating The Cost Of Developing Software

    Science.gov (United States)

    Tausworthe, Robert C.

    1991-01-01

    Software Cost Estimation Model program, SOFTCOST, developed to provide consistent automated resource-and-schedule mathematical model more formalized than guesswork model. Combines several software-cost models found in open literature into one comprehensive set of algorithms compensating for nearly 50 implementation factors relative to size of task, inherited baseline, organizational and system environment, and difficulty of task. Produces mean and variance estimates of software size, implementation productivity, recommended staff level, probable duration, amount of computer resources required, and amount and cost of software documentation. Written in Microsoft BASIC.
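
    A minimal sketch of the style of parametric estimate such a model produces (the constants, factor names, and relations below are hypothetical illustrations, not SOFTCOST's actual algorithms):

      import math

      def estimate_effort(ksloc, factors):
          """Toy parametric model: effort = a * size**b * product of adjustment factors."""
          a, b = 2.8, 1.05                           # hypothetical calibration constants
          adjustment = math.prod(factors.values())
          effort_pm = a * ksloc ** b * adjustment    # person-months
          duration_mo = 2.5 * effort_pm ** 0.35      # hypothetical schedule relation
          return effort_pm, duration_mo, effort_pm / duration_mo

      factors = {"inherited_baseline": 0.9, "environment": 1.1, "task_difficulty": 1.2}
      effort, duration, staff = estimate_effort(32.0, factors)
      print(f"effort ~{effort:.0f} PM, duration ~{duration:.1f} months, staff ~{staff:.1f}")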

  15. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what kind of standard the software should have. In a software project, quality is a key factor in the success or decline of a software organization. Much research has been done regarding software quality. Software organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good-quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  16. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Ritsch, Elmar; The ATLAS collaboration

    2017-01-01

    In the last year ATLAS has radically updated its software development infrastructure hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system over the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. This resulted in a far more standard build process that was married to the method of building ATLAS software as a series of 12 separate projects from Subversion. We then proceeded with a migration of the code base from Subversion to Git. As the Subversion repository had been structured to manage each package more or less independently there was no simple mapping that could be used to manage the migration into Git. Instead a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repositor...

  17. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Gaycken, Goetz; The ATLAS collaboration

    2017-01-01

    In the last year ATLAS has radically updated its software development infrastructure hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system over the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. This resulted in a far more standard build process that was married to the method of building ATLAS software as a series of 12 separate projects from SVN. We then proceeded with a migration of its code base from SVN to git. As the SVN repository had been structured to manage each package more or less independently there was no simple mapping that could be used to manage the migration into git. Instead a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repository and the policy of onl...

  18. Software Complexity Threatens Performance Portability

    Energy Technology Data Exchange (ETDEWEB)

    Gamblin, T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-09-11

    Modern HPC software packages are rarely self-contained. They depend on a large number of external libraries, and many spend large fractions of their runtime in external subroutines. Performance portability depends not only on the effort of application teams, but also on the availability of well-tuned libraries. At most sites, the burden of maintaining libraries is shared by code teams and facilities. Facilities typically provide well-tuned default versions, but code teams frequently build with bleeding-edge compilers to achieve high performance. For this reason, HPC has no “standard” software stack, unlike other domains where performance is not critical. Incompatibilities among compilers and software versions force application teams and facility staff to re-build custom versions of libraries for each new toolchain. Because the number of potential configurations is combinatorial, and because HPC software is notoriously difficult to port to new machines [3, 7, 8], the tuning effort required to support and maintain performance-portable libraries outstrips the available manpower at most sites. Software complexity is a growing obstacle to performance portability for HPC.

  19. ENACTED SOFTWARE DEVELOPMENT PROCESS BASED ON AGILE AND AGENT METHODOLOGIES

    OpenAIRE

    DR. NACHAMAI. M; M. SENTHIL VADIVU; VINITA TAPASKAR

    2011-01-01

    Software engineering gives the procedures and practices to be followed in software development and acts as a backbone for computer science engineering techniques. This paper deals with current trends in software engineering methodologies, the agile and agent-oriented software development processes. Agile methodology aims to meet the needs of the dynamically changing requirements of customers. This model is iterative and incremental and accepts changes in requirements at any stage of development. ...

  20. Open software in small enterprises. Private medical practise example

    Directory of Open Access Journals (Sweden)

    Dominik Meller

    2011-06-01

    Full Text Available The article covers basic concepts of free and open software and its implementation at small health care facilities. It summarizes the costs of possession and maintenance of free/open and proprietary software. A functional analysis is conducted on the example of office software. Further analysis covers barriers to open software implementation based on the requirements of small health care facilities. Finally, according to the stated requirements, suggestions of ways to overcome these barriers are made.

  1. Funding Research Software Development

    Science.gov (United States)

    Momcheva, Ivelina G.

    2017-01-01

    Astronomical software is used by each and every member of our scientific community. Purpose-built software is becoming ever more critical as we enter the regime of large datasets and simulations of increasing complexity. However, financial investments in building, maintaining and renovating the software infrastructure have been uneven. In this talk I will summarize past and current funding sources for astronomical software development, discuss other models of funding and introduce a new initiative for supporting community software at STScI. The purpose of this talk is to prompt discussion about how we allocate resources to this vital infrastructure.

  2. Software Innovation in a Mission Critical Environment

    Science.gov (United States)

    Fredrickson, Steven

    2015-01-01

    Operating in mission-critical environments requires trusted solutions, and the preference for "tried and true" approaches presents a potential barrier to infusing innovation into mission-critical systems. This presentation explores opportunities to overcome this barrier in the software domain. It outlines specific areas of innovation in software development achieved by the Johnson Space Center (JSC) Engineering Directorate in support of NASA's major human spaceflight programs, including International Space Station, Multi-Purpose Crew Vehicle (Orion), and Commercial Crew Programs. Software engineering teams at JSC work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements for genuinely mission critical applications. The innovations described, including the use of NASA Core Flight Software and its associated software tool chain, can lead to software that is more affordable, more reliable, better modelled, more flexible, more easily maintained, better tested, and enabling of automation.

  3. Evaluating software architecture using fuzzy formal models

    Directory of Open Access Journals (Sweden)

    Payman Behbahaninejad

    2012-04-01

    Full Text Available Unified Modeling Language (UML) has been recognized as one of the most popular techniques to describe static and dynamic aspects of software systems. One of the primary issues in designing software packages is the existence of uncertainty associated with such models. Fuzzy-UML describes software architecture from both static and dynamic perspectives simultaneously. Evaluating the software architecture at the design phase always helps us find additional requirements, which helps reduce the cost of design. In this paper, we use a fuzzy data model to describe the static aspects of software architecture and the fuzzy sequence diagram to illustrate the dynamic aspects of software architecture. We also transform these diagrams into Petri Nets and evaluate the reliability of the architecture. A web-based hotel reservation system has been studied for further explanation.
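
    A minimal sketch of the Petri net side of such an evaluation (the places, transition, and marking below are hypothetical; the paper's fuzzy data model and reliability calculation are not reproduced):

      # Toy Petri net: a transition fires when all of its input places hold tokens,
      # consuming one token from each input and producing one in each output.
      marking = {"request_received": 1, "room_available": 1, "reservation_made": 0}
      transitions = {
          "book_room": {"inputs": ["request_received", "room_available"],
                        "outputs": ["reservation_made"]},
      }

      def enabled(name):
          return all(marking[p] > 0 for p in transitions[name]["inputs"])

      def fire(name):
          if not enabled(name):
              raise RuntimeError(f"transition {name} is not enabled")
          for p in transitions[name]["inputs"]:
              marking[p] -= 1
          for p in transitions[name]["outputs"]:
              marking[p] += 1

      fire("book_room")
      print(marking)  # {'request_received': 0, 'room_available': 0, 'reservation_made': 1}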

  4. A Software Development Platform for Mechatronic Systems

    DEFF Research Database (Denmark)

    Guan, Wei

    Software has become increasingly determinative for development of mechatronic systems, which underscores the importance of demands for shortened time-to-market, increased productivity, higher quality, and improved dependability. As the complexity of systems is dramatically increasing, these demands...... present a challenge to the practitioners who adopt conventional software development approach. An effective approach towards industrial production of software for mechatronic systems is needed. This approach requires a disciplined engineering process that encompasses model-driven engineering and component......-based software engineering, whereby we enable incremental software development using component models to address the essential design issues of real-time embedded systems. To this end, this dissertation presents a software development platform that provides an incremental model-driven development process based...

  5. Software Tools for Fault Management Technologies Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault Management (FM) is a key requirement for safety, efficient onboard and ground operations, maintenance, and repair. QSI's TEAMS Software suite is a leading...

  6. A software tool for network intrusion detection

    CSIR Research Space (South Africa)

    Van der Walt, C

    2012-10-01

    Full Text Available This presentation illustrates how a recently developed software tool enables operators to easily monitor a network and detect intrusions without requiring expert knowledge of network intrusion detection....

  7. Learning Software Engineering via Internet

    Directory of Open Access Journals (Sweden)

    Debora Weber-Wulff

    2005-12-01

    Full Text Available In recent years, education authorities worldwide, including the German Federal Government, have invested heavily in the development of e-learning and multimedia materials for institutions of higher education. While for some subject matters the benefits of e-learning seem obvious, there are subjects, often consisting of a number of tenuously connected topics or requiring a balance of learning and training, for which it is a valid question whether appropriate learning materials can be presented via the Internet. Software Engineering belongs to this second group, both for its broad collection of topics and, particularly, for the required emphasis on teamwork and communication training. This paper reports on a successful e-learning module on Software Engineering, which is used at the VFH (Virtual University of Applied Sciences) in the third semester of the Bachelor program for Media and Computing. The report concentrates on two major aspects: the conceptual approach of producing didactically adequate online course material, and the didactics and techniques required for training communication and teamwork in an online course, which are considered an essential part of Software Engineering. In a concluding passage, the authors share some practical experiences with this course and consider the teaching efforts required to make it work successfully.

  8. Jitter Controller Software

    Science.gov (United States)

    Lansdowne, Chatwin; Schlensinger, Adam

    2011-01-01

    Sinusoidal jitter is produced by simply modulating a clock frequency sinusoidally with a given frequency and amplitude. But this can be expressed as phase jitter, frequency jitter, or cycle-to-cycle jitter, rms or peak, absolute units, or normalized to the base clock frequency. Jitter using other waveforms requires calculating and downloading these waveforms to an arbitrary waveform generator, and helping the user manage relationships among phase jitter crest factor, frequency jitter crest factor, and cycle-to-cycle jitter (CCJ) crest factor. Software was developed for managing these relationships, automatically configuring the generator, and saving test results documentation. Tighter management of clock jitter and jitter sensitivity is required by new codes that further extend the already high performance of space communication links, completely correcting symbol error rates higher than 10 percent, and therefore typically requiring demodulation and symbol synchronization hardware to operate at signal-to-noise ratios of less than one. To accomplish this, greater demands are also made on transmitter performance, and measurement techniques are needed to confirm performance. It was discovered early that sinusoidal jitter can be stepped on a grid such that one can connect points by constant phase jitter, constant frequency jitter, or constant cycle-to-cycle jitter. The tool automates adherence to a grid while also allowing adjustments off-grid. Also, the jitter can be set by the user on any dimension and the others are calculated. The calculations are all recorded, allowing the data to be rapidly plotted or re-plotted against different interpretations just by changing pointers to columns. A key advantage is taking data on a carefully controlled grid, which allowed a single data set to be post-analyzed many different ways. Another innovation was building a software tool to provide very tight coupling between the generator and the recorded data product, and the operator
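
    A minimal sketch of the relationships the tool manages, assuming a sinusoidally phase-modulated clock phi(t) = A*sin(2*pi*fm*t) with A in radians and peak values throughout (the tool's grid management and instrument control are not reproduced):

      import math

      def sinusoidal_jitter(clock_hz, mod_hz, phase_jitter_rad_pk):
          """Convert peak sinusoidal phase jitter into time, frequency, and CCJ jitter."""
          time_jitter_s = phase_jitter_rad_pk / (2 * math.pi * clock_hz)
          freq_jitter_hz = phase_jitter_rad_pk * mod_hz      # peak frequency deviation
          # Cycle-to-cycle jitter is the second difference of the edge-time jitter;
          # sampled once per clock cycle, a sinusoid gives this peak value:
          ccj_s = time_jitter_s * 4 * math.sin(math.pi * mod_hz / clock_hz) ** 2
          return time_jitter_s, freq_jitter_hz, ccj_s

      tj, fj, ccj = sinusoidal_jitter(clock_hz=10e6, mod_hz=1e3, phase_jitter_rad_pk=0.1)
      print(f"time jitter {tj * 1e9:.2f} ns pk, frequency jitter {fj:.1f} Hz pk, "
            f"cycle-to-cycle jitter {ccj * 1e15:.1f} fs pk")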

  9. Reconfigurability Function Deployment in Software Development

    Directory of Open Access Journals (Sweden)

    Stelian BRAD

    2011-01-01

    Full Text Available In the forthcoming highly dynamic and complex business environment, high-speed and cost-effective development of software applications targeting a precise, unique and momentary set of requirements (no more, no less) associated with a customized business case will bring significant benefits both for producers and users. This requires a life-cycle, change-oriented approach in software development. In this respect, designing software with intrinsic evolutionary resources for reconfiguration represents a sound approach. A methodology for concurrent deployment of reconfigurability characteristics in software applications is introduced in this paper. Its potential is exemplified in a case study dealing with web-based software tools to support systematic product innovation projects.

  10. Modeling software behavior a craftsman's approach

    CERN Document Server

    Jorgensen, Paul C

    2009-01-01

    A common problem with most texts on requirements specifications is that they emphasize structural models to the near exclusion of behavioral models-focusing on what the software is, rather than what it does. If they do cover behavioral models, the coverage is brief and usually focused on a single model. Modeling Software Behavior: A Craftsman's Approach provides detailed treatment of various models of software behavior that support early analysis, comprehension, and model-based testing. Based on the popular and continually evolving course on requirements specification models taught by the auth

  11. Quality in Software Development: a pragmatic approach using metrics

    Directory of Open Access Journals (Sweden)

    Daniel Acton

    2014-06-01

    Full Text Available As long as software has been produced, there have been efforts to strive for quality in software products. In order to understand quality in software products, researchers have built models of software quality that rely on metrics in an attempt to provide a quantitative view of software quality. The aim of these models is to provide software producers with the capability to define and evaluate metrics related to quality and use these metrics to improve the quality of the software they produce over time. The main disadvantage of these models is that they require effort and resources to define and evaluate metrics from software projects. This article briefly describes some prominent models of software quality in the literature and continues to describe a new approach to gaining insight into quality in software development projects. A case study based on this new approach is described and results from the case study are discussed.

  12. Cooperative and human aspects of software engineering: CHASE 2010

    DEFF Research Database (Denmark)

    Dittrich, Yvonne; Sharp, Helen C.; Winschiers Theophilus, Heike

    2010-01-01

    Software is created by people -- software engineers in cooperation with domain experts, users and other stakeholders -- in varied environments, under various conditions. Thus understanding cooperative and human aspects of software development is crucial to comprehend how and which methods and tools...... are required, to improve the creation and maintenance of software. The 3rd workshop on Cooperative and Human Aspects of Software Engineering held at the International Conference on Software Engineering continued the tradition from earlier workshops and provided a lively forum to discuss current developments...... and high quality research in the field. Further dissemination of research results will lead to an improvement of software development and deployment across the globe.

  13. Roadmap for Peridynamic Software Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Littlewood, David John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    The application of peridynamics for engineering analysis requires an efficient and robust software implementation. Key elements include processing of the discretization, the proximity search for identification of pairwise interactions, evaluation of the constitutive model, application of a bond-damage law, and contact modeling. Additional requirements may arise from the choice of time integration scheme, for example estimation of the maximum stable time step for explicit schemes, and construction of the tangent stiffness matrix for many implicit approaches. This report summarizes progress to date on the software implementation of the peridynamic theory of solid mechanics. Discussion is focused on parallel implementation of the meshfree discretization scheme of Silling and Askari [33] in three dimensions, although much of the discussion applies to computational peridynamics in general.
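
    A minimal serial sketch of the proximity search step named above, building the pairwise bond list for nodes within a horizon (brute force for clarity; the report's parallel, meshfree implementation is not reproduced):

      import numpy as np

      def build_bond_list(coords, horizon):
          """Return node pairs (i, j), i < j, separated by less than the horizon."""
          bonds = []
          for i in range(len(coords)):
              # Brute-force O(n^2) search; production codes use cell lists or k-d trees.
              dist = np.linalg.norm(coords[i + 1:] - coords[i], axis=1)
              for offset in np.nonzero(dist < horizon)[0]:
                  bonds.append((i, i + 1 + offset))
          return bonds

      # Small regular grid of nodes with unit spacing and a horizon of 1.5.
      grid = np.array([[x, y, 0.0] for x in range(4) for y in range(4)], dtype=float)
      print(len(build_bond_list(grid, horizon=1.5)), "bonds")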

  14. High performance in software development

    CERN Multimedia

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever tried. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or with other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from the distributed storage and large-scale organization of computation and data onto the lowest level of processor and data bus behavior. Integrating performance behavior over these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures due to the deep memory hierarchies of modern general-purpose computers. As a r...

  15. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    Science.gov (United States)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  16. MANAGING INTERACTING SOFTWARE PROJECT RISKS

    OpenAIRE

    Dey, Pradip Peter; Khan, Muzibul; Amin, Mohammad; Sinha, Bhaskar Raj; Badkoobehi, Hassan; Jawad, Shatha; Any, Laith Al

    2016-01-01

    Managing risks in a software project can be challenging. There are many risk categories including communication risks, project planning risks, technical risks, budget risks, scheduling risks, legal risks, ethical risks, operational risks, security risks, and personnel risks that require timely attention. Potential risks should be identified, analyzed and evaluated. Appropriate strategies should be developed for managing imminent risks in a timely manner. This paper advocates a strategy that a...

  17. SAGA: A project to automate the management of software production systems

    Science.gov (United States)

    Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1987-01-01

    The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.

  18. Software Architecture Evolution

    Science.gov (United States)

    2013-12-01

    Software Architecture Evolution. Jeffrey M. Barnes, December 2013, CMU-ISR-13-118, Institute for Software Research, School of Computer Science, Carnegie...systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute

  19. Software systems for astronomy

    CERN Document Server

    Conrad, Albert R

    2014-01-01

    This book covers the use and development of software for astronomy. It describes the control systems used to point the telescope and operate its cameras and spectrographs, as well as the web-based tools used to plan those observations. In addition, the book also covers the analysis and archiving of astronomical data once it has been acquired. Readers will learn about existing software tools and packages, develop their own software tools, and analyze real data sets.

  20. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners....... It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice with a focus on the software process policymaking and process control aspects of improvement efforts...

  1. Solar Asset Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Iverson, Aaron [Ra Power Management, Inc., Oakland, CA (United States); Zviagin, George [Ra Power Management, Inc., Oakland, CA (United States)

    2016-09-30

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  2. Computing and software

    OpenAIRE

    White, G C; Hines, J.E.

    2004-01-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up–to–date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential ...

  3. Computing and software

    OpenAIRE

    White, G C; Hines, J.E.

    2004-01-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up–to–date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential anal...

  4. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. Explains two maintenance standards: IEEE/EIA 1219 and ISO/IEC 14764. Discusses several commercial reverse and domain engineering toolkits. Slides for instructors are available online. Information is based on the IEEE SWEBOK (Software Engineering Body of Knowledge)

  5. Testing unconstrained optimization software

    Energy Technology Data Exchange (ETDEWEB)

    More, J.J.; Garbow, B.S.; Hillstrom, K.E.

    1978-07-01

    Much of the testing of optimization software is inadequate because the number of test functions is small or the starting points are close to the solution. In addition, there has been too much emphasis on measuring the efficiency of the software and not enough on testing reliability and robustness. To address this need, a relatively large but easy-to-use collection of test functions was produced and guidelines for testing the reliability and robustness of unconstrained optimization software were designed. 9 tables.
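
    A minimal sketch in the spirit of those guidelines, using the well-known Rosenbrock function as one test case and a deliberately distant starting point to probe robustness (the solver choice here is illustrative, not part of the report):

      from scipy.optimize import minimize

      def rosenbrock(x):
          """Classic banana-valley test function; the minimum f = 0 is at (1, 1)."""
          return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

      # A standard starting point and a far-away one, per the guideline of not
      # starting close to the solution.
      for x0 in ([-1.2, 1.0], [50.0, -30.0]):
          result = minimize(rosenbrock, x0, method="BFGS")
          print(x0, "->", result.x.round(4), "f =", round(result.fun, 8),
                "evaluations =", result.nfev)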

  6. Software evolution in prototyping

    OpenAIRE

    Berzins, V.; Qi, Lu

    1996-01-01

    This paper proposes a model of software changes for supporting the evolution of software prototypes. The software evolution steps are decomposed into primitive substeps that correspond to monotonic specification changes. This structure is used to rearrange chronological derivation sequences into structures containing only meaning-preserving changes. The authors indicate how this structure can be used to automatically combine different changes to a specification. A set of examples illustrates ...

  7. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  8. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  9. Software product lines

    OpenAIRE

    Cortés Verdín, María Karen

    2005-01-01

    A Software Product Line is a “set of software-intensive systems sharing a common, managed set of features that satisfy the specific needs of a particular market segment or mission and that are developed from a common set of core assets in a prescribed way”1. A software product line (or software product family) approach promotes planned and proactive reuse of core assets and architecture-centric development, achieving a substantial increment in product quality and a reduced time to market. Bec...

  10. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

      This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives – Product, Project, Process, and People –to identify an outlook for software innovation. The paper then describes a new facility–Software Innovation Research Lab (SIRL......) – and a new method concept for software innovation – Essence – based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research....

  11. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  12. Agile software development

    CERN Document Server

    Dingsoyr, Torgeir; Moe, Nils Brede

    2010-01-01

    Agile software development has become an umbrella term for a number of changes in how software developers plan and coordinate their work, how they communicate with customers and external stakeholders, and how software development is organized in small, medium, and large companies, from the telecom and healthcare sectors to games and interactive media. Still, after a decade of research, agile software development is the source of continued debate due to its multifaceted nature and insufficient synthesis of research results. Dingsoyr, Dyba, and Moe now present a comprehensive snapshot of the kno

  13. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
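
    A minimal sketch of the scan-and-run pattern described above (the directory pattern, the tests.cfg file name, and its one-command-per-line format are hypothetical stand-ins; dtest's actual configuration syntax is not reproduced):

      import concurrent.futures
      import pathlib
      import subprocess

      def find_test_dirs(root, pattern="test_*"):
          """Recursively find matching directories that contain a configuration file."""
          return [p for p in pathlib.Path(root).rglob(pattern)
                  if p.is_dir() and (p / "tests.cfg").exists()]

      def run_tests(test_dir):
          """Treat each non-comment line of tests.cfg as a shell command to execute."""
          results = []
          for line in (test_dir / "tests.cfg").read_text().splitlines():
              cmd = line.strip()
              if not cmd or cmd.startswith("#"):
                  continue
              proc = subprocess.run(cmd, shell=True, cwd=test_dir, capture_output=True)
              results.append((test_dir.name, cmd, proc.returncode == 0))
          return results

      if __name__ == "__main__":
          # Distribute test directories over the available CPU cores.
          with concurrent.futures.ProcessPoolExecutor() as pool:
              for batch in pool.map(run_tests, find_test_dirs(".")):
                  for name, cmd, ok in batch:
                      print(f"[{'PASS' if ok else 'FAIL'}] {name}: {cmd}")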

  14. Contractor Software Charges

    National Research Council Canada - National Science Library

    Granetto, Paul

    1994-01-01

    .... Examples of computer software costs that contractors charge through indirect rates are material management systems, security systems, labor accounting systems, and computer-aided design and manufacturing...

  15. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than...... and experience reported at the IEEE International Conference on Global Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage...

  16. Optimization of Antivirus Software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimization criteria are identified and defined that help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyzes some of the optimization concepts applied to this category of applications

  17. Open source software to control Bioflo bioreactors.

    Science.gov (United States)

    Burdge, David A; Libourel, Igor G L

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32 and 64 bit windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.
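
    A minimal sketch of the CSV-scripted setpoint control described above (the column names, setpoints, and time scaling are hypothetical; the actual protocol format is documented with the software at the repository linked above):

      import csv
      import io
      import time

      # Hypothetical protocol: each row gives an elapsed time (minutes) and setpoints.
      PROTOCOL = "\n".join([
          "elapsed_min,agitation_rpm,temperature_c",
          "0,200,30.0",
          "60,400,30.0",
          "120,400,37.0",
      ])

      def run_protocol(protocol_text, apply_setpoints, seconds_per_minute=0.01):
          """Apply each row's setpoints at its scheduled elapsed time.

          seconds_per_minute compresses protocol time so the sketch runs quickly;
          a real run against the bioreactor would use 60.
          """
          start = time.monotonic()
          for row in csv.DictReader(io.StringIO(protocol_text)):
              target = float(row["elapsed_min"]) * seconds_per_minute
              while time.monotonic() - start < target:
                  time.sleep(seconds_per_minute / 10)
              apply_setpoints({k: float(v) for k, v in row.items() if k != "elapsed_min"})

      # Stand-in for the plugin that would write setpoints to the bioreactor.
      run_protocol(PROTOCOL, lambda sp: print("apply:", sp))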

  18. Orbiter subsystem hardware/software interaction analysis. Volume 8: AFT reaction control system, part 2

    Science.gov (United States)

    Becker, D. D.

    1980-01-01

    The orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are examined. Potential interaction with the software is examined through an evaluation of the software requirements. The analysis is restricted to flight software requirements and excludes utility/checkout software. The results of the hardware/software interaction analysis for the forward reaction control system are presented.

  19. 35 years of EDS software.

    Science.gov (United States)

    Schamber, Frederick H

    2009-12-01

    The computerized multichannel analyzer running software specifically designed for X-ray analysis appeared very early in the commercialization of the energy dispersive X-ray spectrometer (EDS) and, like the solid-state X-ray detector itself, was built on a technology foundation originally developed for nuclear spectroscopy. However, software techniques employed for gamma-ray spectra could not accommodate the continuum component of EDS spectra, and a new approach was required. Least-squares fitting with "top-hat" filtered spectra proved to be an effective solution that is still widely used today. Though modern computer technology has subsequently contributed greatly to the speed and convenience of present-day EDS software, it seems that the achievable accuracy and precision of spectrum analysis has not fundamentally improved, and most of the early challenges are still quite relevant, although they may appear in new guises. The availability of the high speed silicon drift detector, however, may provide both the incentive and the data precision to drive future advances. This article traces the formative years of EDS software from the personalized perspective of a participant. Factors that shaped the development of the industry are identified, and future directions are speculated.
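
    A minimal sketch of the top-hat filtering step mentioned above (the kernel widths and synthetic spectrum are illustrative; production EDS software derives widths from detector resolution and follows the filter with least-squares fitting of similarly filtered reference spectra):

      import numpy as np

      def top_hat_kernel(width):
          """Zero-area kernel: positive central lobe flanked by negative side lobes."""
          return np.concatenate([np.full(width, -0.5), np.full(width, 1.0),
                                 np.full(width, -0.5)])

      def top_hat_filter(spectrum, width=4):
          """Suppress the slowly varying continuum while keeping peak information."""
          return np.convolve(spectrum, top_hat_kernel(width), mode="same")

      # Synthetic spectrum: an exponential continuum plus one Gaussian peak.
      channels = np.arange(1024)
      continuum = 200.0 * np.exp(-channels / 700.0)
      peak = 300.0 * np.exp(-0.5 * ((channels - 500) / 6.0) ** 2)
      spectrum = np.random.poisson(continuum + peak).astype(float)

      filtered = top_hat_filter(spectrum)
      print("continuum region mean (near zero):", round(filtered[100:300].mean(), 2))
      print("peak region maximum:", round(filtered[480:520].max(), 1))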

  20. Motorola Secure Software Development Model

    Directory of Open Access Journals (Sweden)

    Francis Mahendran

    2008-08-01

    Full Text Available In today's world, the key to meeting the demand for improved security is to implement repeatable processes that reliably deliver measurably improved security. While many organizations have announced efforts to institutionalize a secure software development process, there is little or no industry acceptance for a common process improvement framework for secure software development. Motorola has taken the initiative to develop such a framework, and plans to share this with the Software Engineering Institute for possible inclusion into its Capability Maturity Model Integration (CMMI®). This paper will go into the details of how Motorola is addressing this issue. The model that is being developed is designed as an extension of the existing CMMI structure. The assumption is that the audience will have a basic understanding of the SEI CMM® / CMMI® process framework. The paper will not describe implementation details of a security process model or improvement framework, but will address WHAT security practices are required for a company with many organizations operating at different maturity levels. It is left to the implementing organization to answer the HOW, WHEN, WHO and WHERE aspects. The paper will discuss how the model is being implemented in the Motorola Software Group.

  1. Social software in global software development

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2010-01-01

variety of tools such as: instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, virtual worlds. Though such tools normally belong to the private realm, the use of social software in a corporate context has been reported, e.g. as a way...

  2. Advanced information processing system: Input/output network management software

    Science.gov (United States)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.

  3. A measurement system for large, complex software programs

    Science.gov (United States)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
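
    The abstract describes these relationships only qualitatively, so the sketch below is a purely hypothetical stand-in: a toy quality model that scales a baseline defect density by criticality, process maturity, and team experience, and a toy cost relation that grows the independent V&V labor fraction as the tolerated error rate shrinks. None of the functional forms or constants come from the paper.

    ```python
    import math

    def expected_defect_density(criticality_factor, process_maturity, team_experience):
        """Hypothetical quality model: expected latent defects per KLOC.

        The baseline and scaling factors are illustrative assumptions, not the
        models actually used on the programs described above.
        """
        baseline = 6.0  # assumed industry-average defects per KLOC
        return baseline * criticality_factor / (process_maturity * team_experience)

    def ivv_labor_fraction(target_defects_per_kloc, baseline=6.0):
        """Hypothetical cost relation: a lower tolerated error rate requires a
        larger fraction of project labor for independent verification and validation."""
        reduction = max(baseline / max(target_defects_per_kloc, 1e-6), 1.0)
        return min(0.05 + 0.05 * math.log(reduction), 0.5)

    size_kloc = 120
    density = expected_defect_density(criticality_factor=0.8,
                                      process_maturity=1.2,
                                      team_experience=1.1)
    print(f"expected latent defects: {density * size_kloc:.0f}")
    print(f"suggested IV&V labor fraction: {ivv_labor_fraction(1.0):.0%}")
    ```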

  4. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    Full Text Available Software methodologies provide guidelines for the development of software applications. Studies reveal that customer interaction in the software development process improves the chances that software applications will meet customers’ needs. Despite...

  5. Safe Integration of Concerns in a Software Architecture

    DEFF Research Database (Denmark)

    Barais, Olivier; Lawall, Julia Laetitia; Le Meur, Anne-Francoise

    2006-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, existing architecture description languages provide little or no support for this kind of evolution. The software architect must modify......, such that a software architect can confidently apply a pattern obtained from a third-party developer....

  6. ClassCompass: A Software Design Mentoring System

    Science.gov (United States)

    Coelho, Wesley; Murphy, Gail

    2007-01-01

    Becoming a quality software developer requires practice under the guidance of an expert mentor. Unfortunately, in most academic environments, there are not enough experts to provide any significant design mentoring for software engineering students. To address this problem, we present a collaborative software design tool intended to maximize an…

  7. Automating the management of software projects in a developing it ...

    African Journals Online (AJOL)

    Software project management is the control of the transformation of users' requirements and resources into a successful software result (product). This work automates the management of software projects in an emerging IT economy like Nigeria. It also explores the simulation of management practices such as configuration ...

  9. 48 CFR 212.7003 - Technical data and computer software.

    Science.gov (United States)

    2010-10-01

48 CFR 212.7003 (Federal Acquisition Regulations System, Defense Acquisition...): Technical data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...

  10. OpenArgue: Supporting Argumentation to Evolve Secure Software Systems

    NARCIS (Netherlands)

    Yu, Yijun; Tun, Thein Tan; Tedeschi, Alessandra; Nunes Leal Franqueira, V.; Nuseibeh, Bashar

    When software systems are verified against security requirements, formal and informal arguments provide a structure for organizing the software artifacts. Our recent work on the evolution of security-critical software systems demonstrates that our argumentation technique is useful in limiting the

  11. Business engineering. Generic Software Architecture in an Object Oriented View

    Directory of Open Access Journals (Sweden)

    Mihaela MURESAN

    2006-01-01

Full Text Available The generic software architecture offers a solution for the information system's development and implementation. A generic software/non-software model could be developed by integrating the enterprise blueprint concept (Zachman) and the object oriented paradigm (Coad's archetype concept). The standardization of the generic software architecture for various specific software components could be a direction of crucial importance, offering the guarantee of the quality of the model and increasing the efficiency of the design, development and implementation of the software. This approach is also useful for the implementation of the ERP systems designed to fit the user’s particular requirements.

  12. Sustainability in Software Engineering

    NARCIS (Netherlands)

    Wolfram, N.J.E.; Lago, P.; Osborne, Francesco

    2017-01-01

    The intersection between software engineering research and issues related to sustainability and green IT has been the subject of increasing attention. In spite of that, we observe that sustainability is still not clearly defined, or understood, in the field of software engineering. This lack of

  13. Computer software profiles

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2009-04-15

A review of various computer software programs designed for use in the petroleum industry was presented with reference to each program's capabilities, efficiencies, and operational parameters. This article highlighted 3 software packages developed by Epic Consulting Services Ltd. for reservoir surveillance, forecasting and waterflooding optimization. Two oil and gas software solutions developed by Energy Navigator were presented, notably AFE Navigator for capital tracking and Value Navigator for full suite engineering applications. Six reservoir simulation packages developed by Fekete Associates Inc. were also presented along with software programs developed by 3esi for resource planning, reservoir forecasting, scheduling tracking, archiving, variance reporting and project performance monitoring. Streamsim Technologies Inc. has also developed software packages known as studioSL and 3DSL for geologists and reservoir engineers. Three multiphase flow modelling software packages developed by Neotec were also highlighted. These included Wellflo, Pipeflo and Forgas to model pipelines, flow conditions and wellbore conditions. The consulting software packages developed by Schlumberger Information Solutions for use in geology and geophysics were also highlighted along with software solutions for reservoir engineering; production optimization; reserves risk management; and information management. tabs., figs.

  14. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    basic functioning of federal agencies. GSA Advantage is an online purchasing service created by the GSA organization. Its mission is to provide a...managing enterprise software. Agreement negotiations and retail contracting actions are performed by IT acquisition and contracting professionals...deploying CICO technology, commercial IT industry leaders, such as Amazon and Apple, have been loaning out software titles, albeit video titles, for

  15. Marketing Mix del Software.

    Directory of Open Access Journals (Sweden)

    Yudith del Carmen Rodríguez Pérez

    2006-03-01

For this reason, this paper defines the concept of the software product, characterizes it, and presents its quality attributes. In addition, it addresses the marketing mix that software requires, which differs from that of other products, so that it can succeed in the market.

  16. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu

    2004-01-01

This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), which is an XML-based metaprogramming technique. As the software evolves, a large number of variants may arise, especially when such evolutions are related to multiple platforms as shown in our...

  17. Who Owns Computer Software?

    Science.gov (United States)

    Branscomb, Anne Wells

    1995-01-01

    Discusses the protection of intellectual property as it applies to computer software and its impact on private enterprise and the public good. Highlights include the role of patents, copyrights, and trade secrets; some court cases; and recommendations for alternatives to the existing legal framework for protecting computer software. (KRN)

  18. Software for multistate analysis

    NARCIS (Netherlands)

    Willekens, Frans; Putter, H.

    2014-01-01

    Background: The growing interest in pathways, the increased availability of life-history data, innovations in statistical and demographic techniques, and advances in software technology have stimulated the development of software packages for multistate modeling of life histories. Objective: In the

  20. Software cost estimation

    NARCIS (Netherlands)

Heemstra, F.J.

    1993-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be
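
    The abstract's last question, how development effort can be estimated, is classically answered with parametric models. As one well-known example (not necessarily one treated in the paper), the basic COCOMO model estimates effort in person-months from size in KLOC and a development mode:

    ```python
    def cocomo_basic_effort(kloc, mode="organic"):
        """Basic COCOMO effort estimate in person-months (Boehm, 1981).

        Offered only as a familiar example of a parametric software cost model;
        it is not taken from the paper summarized above.
        """
        coefficients = {
            "organic":      (2.4, 1.05),
            "semidetached": (3.0, 1.12),
            "embedded":     (3.6, 1.20),
        }
        a, b = coefficients[mode]
        return a * kloc ** b

    print(f"{cocomo_basic_effort(50, 'semidetached'):.0f} person-months for 50 KLOC")
    ```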

  1. Software product family evaluation

    NARCIS (Netherlands)

van der Linden, F; Bosch, J; Kamsties, E; Kansala, K; Krzanik, L; Obbink, H

    2004-01-01

    This paper proposes a 4-dimensional software product family engineering evaluation model. The 4 dimensions relate to the software engineering concerns of business, architecture, organisation and process. The evaluation model is meant to be used within organisations to determine the status of their

  2. Upgradable Software Product Customization by Code Query

    DEFF Research Database (Denmark)

    Vaucouleur, Sebastien

of a subset of software systems that we call software products: software that needs special support for customization. Through customization, external companies can modify part of the original product to better fit the needs of a niche market. Upon the release of a new version of the original software product..., external companies must port their customizations to the latest version of the base software product, a process called an upgrade. Companies typically consider upgrades as mandatory, and hence must bear their high cost on a regular basis. The objectives of customizability and upgradability are conflicting... be anticipated accurately. This result puts an important constraint on the solution and calls for an approach that complements the traditional customization techniques. We present the novel concept of code query by example, an approach that (a) requires little anticipation, (b) is simple and (c) may be adopted...

  3. Using containers with ATLAS offline software

    CERN Document Server

    Vogel, Marcelo; The ATLAS collaboration

    2017-01-01

    This paper describes the deployment of ATLAS offline software in containers for software development. For this we are using Docker, which is a lightweight virtualization technology that encapsulates a piece of software inside a complete file system. The deployment of offline releases via containers removes the strict requirement of compatibility between the runtime environment needed for job execution and the configuration of worker nodes at computing sites. If these two are decoupled from each other, sites can upgrade their nodes whenever and however they see fit. In this work, ATLAS software is distributed in containers either via the CernVM File System (CVMFS) or by means of a full ATLAS offline release installation. In software development, separating the build and runtime environment from the development environment allows users to take advantage of many modern code development tools that may not be available in production runtime setups like SLC6. It also frees developers from depending on resources lik...

  4. Progress towards the professionalization of Software Engineering

    Directory of Open Access Journals (Sweden)

    Janeth McAlister

    2014-12-01

Full Text Available Software Engineering provides a theoretical framework, methods, and tools needed to develop quality software, and it has driven the revolution of the Information and Knowledge Society, because without its contributions computers would be just tools without a specific purpose. Furthermore, despite advances in hardware, the impact and potential of technological development have only been realized thanks to software products. On the other hand, today's society is starting to be recognized as software-dependent, since in this century software is part of all the devices required to manipulate information, which people use in their daily activities. This article presents an analysis of the process of professionalizing software engineering and its products, based on the work developed since GSwE2009.

  5. An overview of 3D software visualization.

    Science.gov (United States)

    Teyseyre, Alfredo R; Campo, Marcelo R

    2009-01-01

    Software visualization studies techniques and methods for graphically representing different aspects of software. Its main goal is to enhance, simplify and clarify the mental representation a software engineer has of a computer system. During many years, visualization in 2D space has been actively studied, but in the last decade, researchers have begun to explore new 3D representations for visualizing software. In this article, we present an overview of current research in the area, describing several major aspects like: visual representations, interaction issues, evaluation methods and development tools. We also perform a survey of some representative tools to support different tasks, i.e., software maintenance and comprehension, requirements validation and algorithm animation for educational purposes, among others. Finally, we conclude identifying future research directions.

  6. Knowledge Assessment Software in Mining Specialist Training

    Science.gov (United States)

    Lebedev, Vladimir; Puhova, Olga

    2017-11-01

The article reviews the knowledge assessment software module for electronic teaching and testing in mining specialist training. A state-of-the-art integrated programming environment is used to develop the software module. Its advantages are small computer resource consumption, simple editing, and protection against users trying to find out the correct answers to test tasks. The software makes it possible to study the learning material systematically and consistently, as well as to assess current knowledge in mining. The developed module meets the following requirements: a user-friendly interface, storage of passed test results for subsequent viewing, analysis, and evaluation, fast troubleshooting in case of problems with stable module operation, and further extension and upgrading of the software's functions.

  7. Cleanroom software development

    Science.gov (United States)

    Dyer, M.; Mills, H. D.

    1981-01-01

    The 'cleanroom' software development process is a technical and organizational approach to developing software with certifiable reliability. Key ideas behind the process are well structured software specifications, randomized testing methods and the introduction of statistical controls; but the main point is to deny entry for defects during the development of software. This latter point suggests the use of the term 'cleanroom' in analogy to the defect prevention controls used in the manufacturing of high technology hardware. In the 'cleanroom', the entire software development process is embedded within a formal statistical design, in contrast to executing selected tests and appealing to the randomness of operational settings for drawing statistical inferences. Instead, random testing is introduced as a part of the statistical design itself so that when development and testing are completed, statistical inferences are made about the operation of the system.
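
    A minimal sketch of the statistical-testing idea: test cases are drawn at random from an operational usage profile, pass/fail outcomes are recorded, and a reliability bound is inferred from the sample. The usage profile, the executor, and the 'rule of three' bound below are illustrative assumptions, not the certification model used in the cleanroom literature.

    ```python
    import random

    def run_statistical_test(operational_profile, execute, n_cases, seed=0):
        """Sample test cases from an operational usage profile and record pass/fail.

        `operational_profile` maps usage classes to probabilities; `execute` is a
        hypothetical callable that runs one sampled case and returns True on success.
        """
        rng = random.Random(seed)
        classes, weights = zip(*operational_profile.items())
        failures = 0
        for _ in range(n_cases):
            usage_class = rng.choices(classes, weights=weights)[0]
            if not execute(usage_class):
                failures += 1
        # With zero observed failures, the 'rule of three' gives an approximate
        # 95% upper confidence bound on the per-run failure probability.
        upper_bound = 3.0 / n_cases if failures == 0 else failures / n_cases
        return failures, upper_bound

    profile = {"query": 0.7, "update": 0.2, "admin": 0.1}   # assumed usage profile
    failures, bound = run_statistical_test(profile, lambda c: True, n_cases=300)
    print(f"{failures} failures; per-run failure probability <= {bound:.3%} (approx. 95% bound)")
    ```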

  8. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD) the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  9. Trends in software testing

    CERN Document Server

    Mohanty, J; Balakrishnan, Arunkumar

    2017-01-01

    This book is focused on the advancements in the field of software testing and the innovative practices that the industry is adopting. Considering the widely varied nature of software testing, the book addresses contemporary aspects that are important for both academia and industry. There are dedicated chapters on seamless high-efficiency frameworks, automation on regression testing, software by search, and system evolution management. There are a host of mathematical models that are promising for software quality improvement by model-based testing. There are three chapters addressing this concern. Students and researchers in particular will find these chapters useful for their mathematical strength and rigor. Other topics covered include uncertainty in testing, software security testing, testing as a service, test technical debt (or test debt), disruption caused by digital advancement (social media, cloud computing, mobile application and data analytics), and challenges and benefits of outsourcing. The book w...

  10. Revisiting software ecosystems research

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2016-01-01

The term 'software ecosystems' is argued to have first appeared as a concept more than 10 years ago, and software ecosystem research started to take off in 2010. We conduct a systematic literature study, based on the most extensive literature review in the field to date, with two primary aims: (a) to provide... an updated overview of the field and (b) to document evolution in the field. In total, we analyze 231 papers from 2007 until 2014 and provide an overview of the research in software ecosystems. Our analysis reveals a field that is rapidly growing both in volume and empirical focus while becoming more mature... from evolving. We propose means for future research and the community to address them. Finally, our analysis shapes the view of the field having evolved outside the existing definitions of software ecosystems, and thus we propose updating the definition of software ecosystems....

  11. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    makes these countries an attractive host for software companies. Often, high-quality engineering and excellent quality of products, e.g., machinery and equipment, are mentioned. Yet, the question is: Can such arguments be also found for the software industry? We aim at investigating the degree...... project- and quality management and their implementation in practice. So far, our results suggest that the necessity for a systematic software development is well recognized, while software development still follows an ad-hoc rather than a systematized style. Our results provide initial findings, which we......The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what...

  12. 48 CFR 227.7203-14 - Conformity, acceptance, and warranty of computer software and computer software documentation.

    Science.gov (United States)

    2010-10-01

48 CFR 227.7203-14 (Federal Acquisition Regulations System): Conformity, acceptance, and warranty of computer software and computer software documentation. Conformity and acceptance. Solicitations and contracts requiring the delivery of computer software shall...

  13. Publishing Platform for Scientific Software - Lessons Learned

    Science.gov (United States)

    Hammitzsch, Martin; Fritzsch, Bernadette; Reusser, Dominik; Brembs, Björn; Deinzer, Gernot; Loewe, Peter; Fenner, Martin; van Edig, Xenia; Bertelmann, Roland; Pampel, Heinz; Klump, Jens; Wächter, Joachim

    2015-04-01

    the life sciences. Based on the developed blueprints a scientific software publishing platform will be iteratively implemented, tested, and evaluated. Thus the platform should be developed continuously on the basis of gained experiences and results. The platform services will be extended one by one corresponding to the requirements of the communities. Thus the implemented platform for the publication of scientific software can be improved and stabilized incrementally as a tool with software, science, publishing, and user oriented features.

  14. Software cost estimation, benchmarking, and risk assessment the software decision-makers' guide to predictable software development

    CERN Document Server

    Trendowicz, Adam

    2012-01-01

    Software effort estimation is a key element of software project planning and management. Yet, in industrial practice, the important role of effort estimation is often underestimated and/or misunderstood. In this book, Adam Trendowicz presents the CoBRA method (an abbreviation for Cost Estimation, Benchmarking, and Risk Assessment) for estimating the effort required to successfully complete a software development project, which uniquely combines human judgment and measurement data in order to systematically create a custom-specific effort estimation model. CoBRA goes far beyond simply predictin

  15. Assessment Environment for Complex Systems Software Guide

    Science.gov (United States)

    2013-01-01

    This Software Guide (SG) describes the software developed to test the Assessment Environment for Complex Systems (AECS) by the West Virginia High Technology Consortium (WVHTC) Foundation's Mission Systems Group (MSG) for the National Aeronautics and Space Administration (NASA) Aeronautics Research Mission Directorate (ARMD). This software is referred to as the AECS Test Project throughout the remainder of this document. AECS provides a framework for developing, simulating, testing, and analyzing modern avionics systems within an Integrated Modular Avionics (IMA) architecture. The purpose of the AECS Test Project is twofold. First, it provides a means to test the AECS hardware and system developed by MSG. Second, it provides an example project upon which future AECS research may be based. This Software Guide fully describes building, installing, and executing the AECS Test Project as well as its architecture and design. The design of the AECS hardware is described in the AECS Hardware Guide. Instructions on how to configure, build and use the AECS are described in the User's Guide. Sample AECS software, developed by the WVHTC Foundation, is presented in the AECS Software Guide. The AECS Hardware Guide, AECS User's Guide, and AECS Software Guide are authored by MSG. The requirements set forth for AECS are presented in the Statement of Work for the Assessment Environment for Complex Systems authored by NASA Dryden Flight Research Center (DFRC). The intended audience for this document includes software engineers, hardware engineers, project managers, and quality assurance personnel from WVHTC Foundation (the suppliers of the software), NASA (the customer), and future researchers (users of the software). Readers are assumed to have general knowledge in the field of real-time, embedded computer software development.

  16. GENII Version 2 Software Design Document

    Energy Technology Data Exchange (ETDEWEB)

    Napier, Bruce A.; Strenge, Dennis L.; Ramsdell, James V.; Eslinger, Paul W.; Fosmire, Christian J.

    2004-03-08

This document describes the architectural design for the GENII-V2 software package. This document defines details of the overall structure of the software, the major software components, their data file interfaces, and specific mathematical models to be used. The design represents a translation of the requirements into a description of the software structure, software components, interfaces, and necessary data. The design focuses on the major components and data communication links that are key to the implementation of the software within the operating framework. The purpose of the GENII-V2 software package is to provide the capability to perform dose and risk assessments of environmental releases of radionuclides. The software also has the capability of calculating environmental accumulation and radiation doses from surface water, groundwater, and soil (buried waste) media when an input concentration of radionuclide in these media is provided. This report represents a detailed description of the capabilities of the software product with exact specifications of mathematical models that form the basis for the software implementation and testing efforts. This report also presents a detailed description of the overall structure of the software package, details of main components (implemented in the current phase of work), details of data communication files, and content of basic output reports. The GENII system includes the capabilities for calculating radiation doses following chronic and acute releases. Radionuclide transport via air, water, or biological activity may be considered. Air transport options include both puff and plume models, each allowing use of an effective stack height or calculation of plume rise from buoyant or momentum effects (or both). Building wake effects can be included in acute atmospheric release scenarios. The code provides risk estimates for health effects to individuals or populations; these can be obtained using the code by applying
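
    Since the abstract mentions plume models with an effective stack height, a standard textbook Gaussian plume expression can illustrate the kind of calculation involved. The dispersion parameters and release values below are assumptions for illustration; they are not GENII's own parameterization, which the abstract does not detail.

    ```python
    import math

    def gaussian_plume_concentration(q, u, y, z, stack_height, sigma_y, sigma_z):
        """Steady-state Gaussian plume concentration with ground reflection.

        q: emission rate, u: mean wind speed, y: crosswind offset, z: receptor
        height, stack_height: effective release height (plume rise included).
        sigma_y and sigma_z are dispersion coefficients evaluated for the
        receptor's downwind distance and stability class.
        """
        lateral = math.exp(-y**2 / (2 * sigma_y**2))
        vertical = (math.exp(-(z - stack_height)**2 / (2 * sigma_z**2)) +
                    math.exp(-(z + stack_height)**2 / (2 * sigma_z**2)))
        return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

    # Illustrative numbers only: 1 Bq/s release, 3 m/s wind, ground-level receptor
    # on the plume centerline, assumed sigmas for a receptor about 1 km downwind.
    conc = gaussian_plume_concentration(q=1.0, u=3.0, y=0.0, z=0.0,
                                        stack_height=50.0, sigma_y=70.0, sigma_z=35.0)
    print(f"concentration ~ {conc:.2e} Bq/m^3")
    ```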

  17. Computational intelligence and quantitative software engineering

    CERN Document Server

    Succi, Giancarlo; Sillitti, Alberto

    2016-01-01

    In a down-to-the earth manner, the volume lucidly presents how the fundamental concepts, methodology, and algorithms of Computational Intelligence are efficiently exploited in Software Engineering and opens up a novel and promising avenue of a comprehensive analysis and advanced design of software artifacts. It shows how the paradigm and the best practices of Computational Intelligence can be creatively explored to carry out comprehensive software requirement analysis, support design, testing, and maintenance. Software Engineering is an intensive knowledge-based endeavor of inherent human-centric nature, which profoundly relies on acquiring semiformal knowledge and then processing it to produce a running system. The knowledge spans a wide variety of artifacts, from requirements, captured in the interaction with customers, to design practices, testing, and code management strategies, which rely on the knowledge of the running system. This volume consists of contributions written by widely acknowledged experts ...

  18. Software design for resilient computer systems

    CERN Document Server

    Schagaev, Igor

    2016-01-01

    This book addresses the question of how system software should be designed to account for faults, and which fault tolerance features it should provide for highest reliability. The authors first show how the system software interacts with the hardware to tolerate faults. They analyze and further develop the theory of fault tolerance to understand the different ways to increase the reliability of a system, with special attention on the role of system software in this process. They further develop the general algorithm of fault tolerance (GAFT) with its three main processes: hardware checking, preparation for recovery, and the recovery procedure. For each of the three processes, they analyze the requirements and properties theoretically and give possible implementation scenarios and system software support required. Based on the theoretical results, the authors derive an Oberon-based programming language with direct support of the three processes of GAFT. In the last part of this book, they introduce a simulator...

  19. Beyond Reactive Planning: Self Adaptive Software and Self Modeling Software in Predictive Deliberation Management

    National Research Council Canada - National Science Library

    Lenahan, Jack; Nash, Michael P; Charles, Phil

    2008-01-01

    .... We present the following hypothesis: predictive deliberation management using self-adapting and self-modeling software will be required to provide mission planning adjustments after the start of a mission...

  20. ISO and software quality assurance - licensing and certification of software professionals

    Energy Technology Data Exchange (ETDEWEB)

    Hare, J.; Rodin, L.

    1997-11-01

This report contains viewgraphs on licensing and certifying of software professionals. Discussed in this report are: certification programs; licensing programs; why become certified; certification as a condition of employment; certification requirements; and examination structures.

  1. LDUA software custodian's notebook

    Energy Technology Data Exchange (ETDEWEB)

    Aftanas, B.L.

    1998-08-20

This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements.

  2. Precise Documentation: The Key to Better Software

    Science.gov (United States)

    Parnas, David Lorge

The prime cause of the sorry “state of the art” in software development is our failure to produce good design documentation. Poor documentation is the cause of many errors and reduces efficiency in every phase of a software product's development and use. Most software developers believe that “documentation” refers to a collection of wordy, unstructured, introductory descriptions, thousands of pages that nobody wanted to write and nobody trusts. In contrast, engineers in more traditional disciplines think of precise blueprints, circuit diagrams, and mathematical specifications of component properties. Software developers do not know how to produce precise documents for software. Software developers also think that documentation is something written after the software has been developed. In other fields of engineering much of the documentation is written before and during the development. It represents forethought, not afterthought. Among the benefits of better documentation would be: easier reuse of old designs, better communication about requirements, more useful design reviews, easier integration of separately written modules, more effective code inspection, more effective testing, and more efficient corrections and improvements. This paper explains how to produce and use precise software documentation and illustrates the methods with several examples.

  3. Software Quality Metrics Enhancements. Volume 1

    Science.gov (United States)

    1980-04-01

requirements are levied on the developer for deliverable documentation. The requirements are usually a rigorous application of military standards with...1978. [KEEP77] Keen, P. G. W., Gerson, E. M., "The Politics of Software Systems Design," Datamation, November 1977.

  4. Software survey 2008

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2008-07-15

    This article presented a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In addition to a description of the software application, this article listed the name of software providers and the new features available in each product. The featured software developed by Calgary-based providers included: PetroLOOK by Alcaro Softworks Inc.; StabView and RocksBank by Advanced Geotechnology; the Edge screening tool by Canadian Discovery Ltd.; AFE Navigator by Energy Navigator Inc.; evMosaic by Entero Corp.; ResSurveil, ResBalance and ResAssist by Epic Consulting Services Ltd; FAST WellTest and FAST Piper by Fekete Associates Inc.; OMNI 3D and VISTA 2D/3D by Gedco; geoSCOUT, petroCUBE and gDC by GeoLOGIC Systems Ltd.; DataVera by Intervera Data Solutions; PIPEFLO, WELLFLO and FORGAS by Neotechnology Consultants Ltd.; AFENexus, EANexus; ForeFront, GeoNexus and JVNexus management software by Pandell Technology Corp.; Oil and Gas Solutions by Risk Advisory; Petrel, GeoFrame, ECLIPSE, and Frontsim by Schlumberger Information Solutions; esi.manage and esi.executive by 3esi; and SeisWare by Zokero Inc. The featured software developed by Texas-based providers included the HTRI Xchanger Suite by Heat Transfer Research Inc.; GeoProbe, PowerView, GeoGraphix, AssetPlanner, Nexus software, Decision Management System, AssetConnect, and OpenWorks by Landmark; NeuraScanner, NeuraLaser, NeuraLog, NeuraMap, NeuraSection and NeuraView by Neuralog Inc.; and software by OpenSpirit. The article also featured management and tracking solutions developed by Vancouver-based SustaiNet Software Solutions Inc.

  5. Formal Verification of Mathematical Software. Volume 2

    Science.gov (United States)

    1990-05-01

RADC-TR-90-53, Vol I (of two), Final Technical Report, May 1990, AD-A223 633: Formal Verification of Mathematical Software (Odyssey...). ...1 May 1986; Contract Expiration Date: 31 July 1989; Short Title of Work: Formal Verification of SDI Mathematical Software; Period of Work Covered: May 86

  6. Knowledge coordination in distributed software management

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars

    2012-01-01

    Software organizations are increasingly relying on cross-organizational and cross-border collaboration, requiring effective coordination of distributed knowledge. However, such coordination is challenging due to spatial separation, diverging communities-of-practice, and unevenly distributed...... communication breakdowns on recordings of their combined teleconferencing and real-time collaborative modeling. As a result, we offer theoretical propositions that explain how distributed software managers can deal with communication breakdowns and effectively coordinate knowledge through multimodal virtual...

  7. Advanced Software Development Workstation Project, phase 3

    Science.gov (United States)

    1991-01-01

    ACCESS provides a generic capability to develop software information system applications which are explicitly intended to facilitate software reuse. In addition, it provides the capability to retrofit existing large applications with a user friendly front end for preparation of input streams in a way that will reduce required training time, improve the productivity even of experienced users, and increase accuracy. Current and past work shows that ACCESS will be scalable to much larger object bases.

  8. Common System and Software Testing Pitfalls

    Science.gov (United States)

    2014-11-03

connecting servers and data libraries (e.g., SAN) – busses within systems (embedded software) • Software must meet quality requirements (thresholds of...). General pitfalls – stakeholder involvement and commitment: Wrong Testing Mindset (GEN-SIC-1) → Unrealistic Testing Expectations (GEN-SIC-2), Lack of Stakeholder Commitment to Testing (GEN-SIC-3).

  9. Hardware and software reliability estimation using simulations

    Science.gov (United States)

    Swern, Frederic L.

    1994-01-01

    The simulation technique is used to explore the validation of both hardware and software. It was concluded that simulation is a viable means for validating both hardware and software and associating a reliability number with each. This is useful in determining the overall probability of system failure of an embedded processor unit, and improving both the code and the hardware where necessary to meet reliability requirements. The methodologies were proved using some simple programs, and simple hardware models.
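
    A minimal Monte Carlo sketch of how a reliability number can be associated with a simple configuration: sample component outcomes from assumed per-mission reliabilities and count system failures for a series arrangement. The paper's simulations inject faults into hardware and software models rather than sampling fixed reliability figures, so this is only an illustration of the general idea.

    ```python
    import random

    def simulate_series_failure_probability(component_reliabilities, n_trials=100_000, seed=1):
        """Monte Carlo estimate of mission failure probability for a series system
        (any component failure fails the system). Reliabilities are assumed inputs."""
        rng = random.Random(seed)
        failures = 0
        for _ in range(n_trials):
            if any(rng.random() > r for r in component_reliabilities):
                failures += 1
        return failures / n_trials

    # Hypothetical per-mission reliabilities: processor, sensor interface, software.
    estimate = simulate_series_failure_probability([0.999, 0.995, 0.99])
    analytic = 1 - 0.999 * 0.995 * 0.99
    print(f"simulated {estimate:.4f} vs analytic {analytic:.4f}")
    ```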

  10. Goal Driven Iterative Software Project Management

    OpenAIRE

    Wautelet, Yves; Kolp, Manuel

    2011-01-01

Iterative development has gained popularity in the software industry, notably in the development of enterprise applications where requirements and needs are difficult for users to express and business processes are difficult for analysts to understand. Such a software development life cycle is nevertheless often used in an ad-hoc manner. Even when templates such as the Unified Process are furnished, poor documentation is provided on how to break down the project into manageable units and to plan...

  11. Multiphase flow calculation software

    Science.gov (United States)

    Fincke, James R.

    2003-04-15

    Multiphase flow calculation software and computer-readable media carrying computer executable instructions for calculating liquid and gas phase mass flow rates of high void fraction multiphase flows. The multiphase flow calculation software employs various given, or experimentally determined, parameters in conjunction with a plurality of pressure differentials of a multiphase flow, preferably supplied by a differential pressure flowmeter or the like, to determine liquid and gas phase mass flow rates of the high void fraction multiphase flows. Embodiments of the multiphase flow calculation software are suitable for use in a variety of applications, including real-time management and control of an object system.
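
    For orientation only, the sketch below shows the familiar single-phase relation between a differential pressure and a mass flow rate through an orifice-type element. The multiphase method summarized above combines several pressure differentials with additional parameters to separate gas and liquid phase flow rates, and that algorithm is not reproduced here.

    ```python
    import math

    def orifice_mass_flow(delta_p, density, orifice_area, discharge_coefficient=0.61):
        """Single-phase mass flow through a differential-pressure element:
        m_dot = Cd * A * sqrt(2 * rho * delta_p). Illustrative only."""
        return discharge_coefficient * orifice_area * math.sqrt(2.0 * density * delta_p)

    # Illustrative numbers: 25 kPa differential, water-like density, 5 cm^2 orifice.
    m_dot = orifice_mass_flow(delta_p=25_000.0, density=998.0, orifice_area=5e-4)
    print(f"mass flow ~ {m_dot:.2f} kg/s")
    ```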

  12. Calidad de componentes software

    OpenAIRE

    Carvallo Vega, Juan Pablo; Franch Gutiérrez, Javier; Quer Bosor, Maria Carme

    2010-01-01

In recent years there has been a growing tendency for organizations to develop their software systems by combining components, instead of developing those systems from scratch. This tendency is due to several factors. Notable among them are: the need for organizations to reduce the cost and time devoted to the development of software systems; the growth of the software component market; and the reduction of the distance...

  13. Software takes command

    CERN Document Server

    Manovich, Lev

    2013-01-01

Software has replaced a diverse array of physical, mechanical, and electronic technologies used before the 21st century to create, store, distribute and interact with cultural artifacts. It has become our interface to the world, to others, to our memory and our imagination - a universal language through which the world speaks, and a universal engine on which the world runs. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. Offering the first theoretical and historical account of software for media authoring and its effects on the prac

  14. Beginning software engineering

    CERN Document Server

    Stephens, Rod

    2015-01-01

    Beginning Software Engineering demystifies the software engineering methodologies and techniques that professional developers use to design and build robust, efficient, and consistently reliable software. Free of jargon and assuming no previous programming, development, or management experience, this accessible guide explains important concepts and techniques that can be applied to any programming language. Each chapter ends with exercises that let you test your understanding and help you elaborate on the chapter's main concepts. Everything you need to understand waterfall, Sashimi, agile, RAD, Scrum, Kanban, Extreme Programming, and many other development models is inside!

  15. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

SOFTWARE, LIKE ALL industry products, is the result of complex multinational supply chains with many partners from concept to development to production and maintenance. Global software engineering (GSE), IT outsourcing, and business process outsourcing during the past decade have shown growth rates of 10 to 20 percent per year. This instalment of Practitioner’s Digest summarizes experiences and guidance from industry to facilitate knowledge and technology transfer for GSE. It’s based on industry feedback from the annual IEEE International Conference on Global Software Engineering, which had...

  16. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  17. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  18. Guide to software export

    CERN Document Server

    Philips, Roger A

    2014-01-01

    An ideal reference source for CEOs, marketing and sales managers, sales consultants, and students of international marketing, Guide to Software Export provides a step-by-step approach to initiating or expanding international software sales. It teaches you how to examine critically your candidate product for exportability; how to find distributors, agents, and resellers abroad; how to identify the best distribution structure for export; and much, much more!Not content with providing just the guidelines for setting up, expanding, and managing your international sales channels, Guide to Software

  19. Intellectual Property Protection of Software – At the Crossroads of Software Patents and Open Source Software

    OpenAIRE

    Tantarimäki, Maria

    2018-01-01

The thesis considers the intellectual property protection of software in Europe and in the US, which is an increasingly important subject as the world is globalizing and digitalizing. The special nature of software has challenged intellectual property rights. The current protection of software is based on copyright protection but in this thesis, two other options are considered: software patents and open source software. Software patents provide strong protection for software whereas the pur...

  20. Software For Genetic Algorithms

    Science.gov (United States)

    Wang, Lui; Bayer, Steve E.

    1992-01-01

    SPLICER computer program is genetic-algorithm software tool used to solve search and optimization problems. Provides underlying framework and structure for building genetic-algorithm application program. Written in Think C.
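
    SPLICER itself is written in Think C, but the kind of framework it provides can be sketched in a few lines: a population of candidate solutions is evolved through selection, crossover, and mutation against a user-supplied fitness function. The toy "one-max" problem and all parameter values below are illustrative, not SPLICER's actual implementation.

    ```python
    import random

    def genetic_search(fitness, genome_length=20, population_size=30,
                       generations=50, mutation_rate=0.02, seed=42):
        """Minimal genetic algorithm: tournament selection, one-point crossover,
        bit-flip mutation over fixed-length bit strings."""
        rng = random.Random(seed)
        population = [[rng.randint(0, 1) for _ in range(genome_length)]
                      for _ in range(population_size)]

        def tournament():
            a, b = rng.sample(population, 2)
            return a if fitness(a) >= fitness(b) else b

        for _ in range(generations):
            next_population = []
            for _ in range(population_size):
                parent1, parent2 = tournament(), tournament()
                cut = rng.randrange(1, genome_length)           # one-point crossover
                child = parent1[:cut] + parent2[cut:]
                child = [bit ^ 1 if rng.random() < mutation_rate else bit
                         for bit in child]                      # bit-flip mutation
                next_population.append(child)
            population = next_population
        return max(population, key=fitness)

    # Toy problem ("one-max"): maximize the number of 1s in the bit string.
    best = genetic_search(fitness=sum)
    print(best, sum(best))
    ```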

  1. Collaborative software development

    NARCIS (Netherlands)

    M. de Jonge (Merijn); E. Visser; J.M.W. Visser (Joost)

    2001-01-01

We present an approach to collaborative software development where obtaining components and contributing components across organizational boundaries are explicit phases in the development process. A lightweight generative infrastructure supports this approach with an online package base,

  2. Software Design Analyzer System

    Science.gov (United States)

    Tausworthe, R. C.

    1985-01-01

The CRISP80 software design analyzer system is a set of programs that supports top-down, hierarchic, modular, structured design and programming methodologies. CRISP80 allows for the expression of a design as a picture of the program.

  3. eSoftwareList

    Data.gov (United States)

    US Agency for International Development — USAID Software Database reporting tool created in Oracle Application Express (APEX). This version provides read only access to a database view of the JIRA SAR...

  4. Core Flight Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The mission of the CFS project is to provide reusable software in support of human space exploration programs.   The top-level technical approach to...

  5. Tier2 Submit Software

    Science.gov (United States)

    Download this tool for Windows or Mac, which helps facilities prepare a Tier II electronic chemical inventory report. The data can also be exported into the CAMEOfm (Computer-Aided Management of Emergency Operations) emergency planning software.

  6. Test af Software

    DEFF Research Database (Denmark)

This document constitutes the final report of the network collaboration "Testnet", carried out in the period 1 April 2006 to 31 December 2008. The network deals in particular with topics within testing of embedded and technical software, but a number of examples of problems and solutions connected with testing of administrative software are also included. The report is divided into the following 3 parts: Overview. Here we give a summary of the network's purpose, activities and results. The state of the art of software testing is outlined. We mention that CISS and the network are taking new initiatives. The Network. Purpose, participants and topics addressed at the... a number of Danish software, electronics and IT companies.

  7. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines...... the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides...... essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  8. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suited for automatically developing ultra reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  9. Project Portfolio Management Software

    OpenAIRE

    Paul POCATILU

    2006-01-01

    In order to design a methodology for the development of project portfolio management (PPM) applications, the existing applications have to be studied. This paper describes the main characteristics of the leading project portfolio management software applications.

  10. A Framework for Instituting Software Metrics in Small Software Organizations

    OpenAIRE

    Hisham M. Haddad; Nancy C. Ross; Donald E. Meredith

    2012-01-01

    The role of metrics in software quality is well-recognized; however, software metrics are yet to be standardized and integrated into development practices across the software industry. Literature reports indicate that software companies with less than 50 employees may represent up to 85% of the software organizations in several countries, including the United States. While process, project, and product metrics share a common goal of contributing to software quality and reliability, utilizatio...

  11. Survey on Impact of Software Metrics on Software Quality

    OpenAIRE

    Mrinal Singh Rawat; Arpita Mittal; Sanjay Kumar Dubey

    2012-01-01

    Software metrics provide a quantitative basis for planning and predicting software development processes. Therefore the quality of software can be controlled and improved easily. Quality in fact aids higher productivity, which has brought software metrics to the forefront. This research paper focuses on different views on software quality. Moreover, many metrics and models have been developed; promoted and utilized resulting in remarkable successes. This paper examines the realm of software e...

  12. Global software development

    DEFF Research Database (Denmark)

    Matthiesen, Stina

    2016-01-01

    This overview presents the mid stages of my doctoral research-based on ethnographic work conducted in IT companies in India and in Denmark-on collaborative work within global software development (GSD). In the following I briefly introduce how this research seeks to spark a debate in CSCW...... by challenging contemporary ideals about software development outsourcing through the exploration of the multiplicities and asymmetric dynamics inherent in the collaborative work of GSD....

  13. Transformational Leadership in Software Projects

    OpenAIRE

    MOUSAVIKHAH, MARYAM

    2013-01-01

Lack of management in software projects is among the most important reasons for the failure of this kind of project. Considering this fact, in addition to the high rate of IS (Information System) project failure and the lack of leadership studies in the IS field, it is necessary to pay more attention to the concept of leadership in software projects. Transformational leadership, as one of the most popular leadership theories, although it might bring specific advantages for this kind of project, has ...

  14. Benchmarking Software Assurance Implementation

    Science.gov (United States)

    2011-05-18

(a.k.a. Process Focused Assessment) – Management Systems (ISO 9001, ISO 27001, ISO 20000) – Capability Maturity Models (CMMI...). Benchmarking Software Assurance Implementation, Michele Moss, SSTC Conference, May 18, 2011.

  15. Mining unstructured software data

    OpenAIRE

    Bacchelli, Alberto; Lanza, Michele

    2013-01-01

    Our thesis is that the analysis of unstructured data supports software understanding and evolution analysis, and complements the data mined from structured sources. To this aim, we implemented the necessary toolset and investigated methods for exploring, exposing, and exploiting unstructured data.To validate our thesis, we focused on development email data. We found two main challenges in using it to support program comprehension and software development: The disconnection between emai...

  16. Software Process Assessment (SPA)

    Science.gov (United States)

    Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.

    1994-01-01

    NASA's environment mirrors the changes taking place in the nation at large, i.e. workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effects of this change are that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools, on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.
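
    The product data described above (size, complexity, readability) can be collected with fairly small tooling. The following is an illustrative Python sketch, not the SPA toolset itself; the directory layout, file suffix, and the comment-ratio readability proxy are assumptions made for the example.

```python
"""Illustrative product-metrics sketch -- not the SPA tooling described above."""
from pathlib import Path


def product_metrics(source_dir: str, suffix: str = ".py") -> dict:
    """Collect simple size and readability numbers for every matching source file."""
    total = blank = comments = 0
    for path in Path(source_dir).rglob(f"*{suffix}"):
        for line in path.read_text(errors="ignore").splitlines():
            stripped = line.strip()
            total += 1
            if not stripped:
                blank += 1
            elif stripped.startswith("#"):  # assumed comment convention
                comments += 1
    code = total - blank - comments
    return {
        "total_lines": total,
        "code_lines": code,
        "comment_ratio": comments / code if code else 0.0,  # crude readability proxy
    }


if __name__ == "__main__":
    print(product_metrics("."))
```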

  17. 7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering

    Science.gov (United States)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    The presentation reviews Agency process requirements and the purpose, benefits, and experiences or seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.

  18. Eprints Institutional Repository Software: A Review

    Directory of Open Access Journals (Sweden)

    Mike R. Beazley

    2011-01-01

    Full Text Available Setting up an institutional repository (IR) can be a daunting task. There are many software packages out there, some commercial, some open source, all of which offer different features and functionality. This article will provide some thoughts about one of these software packages: Eprints. Eprints was one of the first IR software packages to appear and has been available for 10 years. It is under continual development by its creators at the University of Southampton and the current version is v3.2.3. Eprints is open-source, meaning that anyone can download and make use of the software for free and the software can be modified however the user likes. This presents clear advantages for institutions with smaller budgets and also for institutions that have programmers on staff. Eprints requires some additional software to run: Linux, Apache, MySQL, and Perl. This software is all open-source and already present on the servers of many institutions. There is now a version of Eprints that will run on Windows servers as well, which will make the adoption of Eprints even easier for some. In brief, Eprints is an excellent choice for any institution looking to get an IR up and running quickly and easily. Installation is straightforward, as is the initial configuration. Once the IR is up and running, users may upload documents and provide the necessary metadata for the records by filling out a simple web form. Embargoes on published documents are handled elegantly by the software, and the software links to the SHERPA/RoMEO database so authors can easily verify their rights regarding IR submissions. Eprints has some drawbacks, which will be discussed later in the review, but on the whole it is easy to recommend to anyone looking to start an IR. However, it is less clear that an institution with an existing IR based on another software package should migrate to Eprints.

  19. 2006 XSD Scientific Software Workshop report.

    Energy Technology Data Exchange (ETDEWEB)

    Evans, K., Jr.; De Carlo, F.; Jemian, P.; Lang, J.; Lienert, U.; Maclean, J.; Newville, M.; Tieman, B.; Toby, B.; van Veenendaal, B.; Univ. of Chicago

    2006-01-22

    In May of 2006, a committee was formed to assess the fundamental needs and opportunities in scientific software for x-ray data reduction, analysis, modeling, and simulation. This committee held a series of discussions throughout the summer, conducted a poll of the members of the x-ray community, and held a workshop. This report details the findings and recommendations of the committee. Each experiment performed at the APS requires three crucial ingredients: the powerful x-ray source, an optimized instrument to perform measurements, and computer software to acquire, visualize, and analyze the experimental observations. While the APS has invested significant resources in the accelerator, investment in other areas such as scientific software for data analysis and visualization has lagged behind. This has led to the adoption of a wide variety of software with variable levels of usability. In order to maximize the scientific output of the APS, it is essential to support the broad development of real-time analysis and data visualization software. As scientists attack problems of increasing sophistication and deal with larger and more complex data sets, software is playing an ever more important role. Furthermore, our need for excellent and flexible scientific software can only be expected to increase, as the upgrade of the APS facility and the implementation of advanced detectors create a host of new measurement capabilities. New software analysis tools must be developed to take full advantage of these capabilities. It is critical that the APS take the lead in software development and the implementation of theory to software to ensure the continued success of this facility. The topics described in this report are relevant to the APS today and critical for the APS upgrade plan. Implementing these recommendations will have a positive impact on the scientific productivity of the APS today and will be even more critical in the future.

  20. Critical Software for Human Spaceflight

    Science.gov (United States)

    Preden, Antonio; Kaschner, Jens; Rettig, Felix; Rodriggs, Michael

    2017-01-01

    The NASA Orion vehicle that will fly to the moon in the coming years is propelled along its mission by the European Service Module (ESM), developed by ESA and its prime contractor Airbus Defense and Space. This paper describes the development of the Propulsion Drive Electronics (PDE) Software that provides the interface between the propulsion hardware of the European Service Module and the Orion flight computers, and highlights the challenges that have been faced during the development. In particular, the specific aspects relevant to Human Spaceflight in an international cooperation are presented, such as compliance with both European and US standards and the software criticality classification to the highest category, A. An innovative aspect of the PDE SW is its Time-Triggered Ethernet interface with the Orion Flight Computers, which has never been flown so far on any European spacecraft. Finally, the verification aspects are presented, applying the most exigent quality requirements defined in the European Cooperation for Space Standardization (ECSS) standards, such as structural coverage analysis of the object code and recourse to an independent software verification and validation activity carried out in parallel by a different team.

  1. Hardware impacts to software development strategies - The history of the development of the Mars Observer Payload Data Subsystem embedded real-time software

    Science.gov (United States)

    Elson, Anne B.

    1989-01-01

    Ways in which parallel hardware development and high level requirements changes have influenced Mars Observer Payload Data Subsystem (PDS) flight software development are discussed. Particular attention is given to ways in which the evolving hardware product and changing requirements have led to repeated modification to software requirements, design, code, and test tools and a delay in the closure of corresponding phases of the software development life cycle. Design and implementation problems which were encountered during the PDS software development effort are described.

  2. Evaluation Framework for Quality Management Software

    Directory of Open Access Journals (Sweden)

    Nadica Hrgarek

    2008-06-01

    Full Text Available Identifying and specifying user requirements is an integral part of information systems design and is critical for the project success. More than 50% of the reasons for the project failure presented in the CHAOS report [36] and study of a US Air Force project by Sheldon et al. [33] are related to requirements. The goal of this paper is to assess the relevant user and software requirements which are the basis for an electronic quality management system selection in medical device companies. This paper describes the structured evaluation and selection process of different quality management software tools that shall support business processes. The purpose of this paper is to help the small to medium size medical device companies to choose the right quality management software which meets the company's business needs.

  3. On the impact of medical device regulations on software architecture

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Manikas, Konstantinos

    2016-01-01

    Compliance to regulations and regulatory approval are requirements for many medical device software systems. In this paper, we investigate the implications of medical device software regulations to the design of software systems. We do so by focusing on the American and European regulatory authorities and review the legal requirements for regulatory approval of medical devices. We define a simplified process for regulatory approval, consisting of five steps, and enhance this process by descriptions of how to decide whether a software system is a medical device and how to identify the class of the device. Moreover, we review software modularity in the implementation of software medical device and propose a set of preliminary principles for architectural design of software medical device based on a set of constraints identified from the reviewed regulations.

  4. A Model for Quality Optimization in Software Design Processes

    NARCIS (Netherlands)

    Noppen, J.A.R.; van den Broek, P.M.; Aksit, Mehmet

    The main objective of software engineers is to design and implement systems that implement all functional and non-functional requirements. Unfortunately, it is very difficult or even generally impossible to deliver a software system that satisfies all the requirements. Even more seriously, failures

  5. HydroShare: Applying professional software engineering to a new NSF-funded large software project

    Science.gov (United States)

    Idaszak, R.; Tarboton, D. G.; Ames, D.; Saleem Arrigo, J. A.; Band, L. E.; Bedig, A.; Castronova, A. M.; Christopherson, L.; Coposky, J.; Couch, A.; Dash, P.; Gan, T.; Goodall, J.; Gustafson, K.; Heard, J.; Hooper, R. P.; Horsburgh, J. S.; Jackson, S.; Johnson, H.; Maidment, D. R.; Mbewe, P.; Merwade, V.; Miles, B.; Reeder, S.; Russell, T.; Song, C.; Taylor, A.; Thakur, S.; Valentine, D. W.; Whiteaker, T. L.

    2013-12-01

    HydroShare is an online, collaborative system being developed for sharing hydrologic data and models as part of the NSF's Software Infrastructure for Sustained Innovation (SI2) program (NSF collaborative award numbers 1148453 and 1148090). HydroShare involves a large software development effort requiring cooperative research and distributed software development between domain scientists, professional software engineers (here 'professional' denotes previous commercial experience in the application of modern software engineering), and university software developers. HydroShare expands upon the data sharing capabilities of the Hydrologic Information System of the Consortium of Universities for the Advancement of Hydrologic Sciences, Inc. (CUAHSI) by broadening the classes of data accommodated, expanding capability to include the sharing of models and model components, and taking advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. With a goal of enabling better science concomitant with improved sustainable software practices, we will describe our approach, experiences, and lessons learned thus-far in applying professional software engineering to a large NSF-funded software project from the project's onset.

  6. Autonomous Real Time Requirements Tracing

    Science.gov (United States)

    Plattsmier, George; Stetson, Howard

    2014-01-01

    One of the more challenging aspects of software development is the ability to verify and validate the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Ensuring the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX™ Auto-Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena, and as such the operational requirements and testing will be more structured and rigorous. The auto-procedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASA-STD-8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739) and also the Human Rating Requirements (NPR-8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX™ Language for development of autonomous command and control software. The Timeliner-TLX™ system has the unique feature of providing the current line of the statement in execution during real-time execution of the software. The feature of execution line number internal reporting unlocks the capability of monitoring the execution autonomously by use of a companion Timeliner-TLX™ sequence, as the line number reporting is embedded inside the Timeliner-TLX™ execution engine. This negates I/O processing of this type of data, as the line number status of executing sequences is built in as a function reference. This paper will outline the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are being met during real-time execution of the
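
    The tracing idea described above can be sketched in miniature: a trace hook reports the line number of each executing statement, and a tracker logs the first time a line mapped to an SRS requirement runs. The sketch below is illustrative Python, not Timeliner-TLX or the AFTS tracker; the procedure, the requirement IDs, and the line-to-requirement map are invented.

```python
"""Illustrative requirements tracer -- not the AFTS/Timeliner-TLX implementation."""
import sys
import time


def fluid_transfer():
    """Stand-in for an auto-procedure whose execution we want to trace."""
    valve_open = True
    flow_started = True
    return valve_open and flow_started


# Hypothetical map from executable line numbers to SRS requirement IDs.
start = fluid_transfer.__code__.co_firstlineno
REQUIREMENT_MAP = {start + 2: "SRS-031", start + 3: "SRS-047"}
satisfied = {}


def tracer(frame, event, arg):
    """Log the first time a line mapped to a requirement executes."""
    if event == "line":
        req = REQUIREMENT_MAP.get(frame.f_lineno)
        if req and req not in satisfied:
            satisfied[req] = time.time()
            print(f"{req} exercised at line {frame.f_lineno}")
    return tracer


sys.settrace(tracer)
fluid_transfer()
sys.settrace(None)
print("requirements met:", sorted(satisfied))
```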

  7. System and Software Reliability (C103)

    Science.gov (United States)

    Wallace, Dolores

    2003-01-01

    Within the last decade better reliability models (hardware, software, system) than those currently used have been theorized and developed but not implemented in practice. Previous research on software reliability has shown that while some existing software reliability models are practical, they are not accurate enough. New paradigms of development (e.g. OO) have appeared and associated reliability models have been proposed but not investigated. Hardware models have been extensively investigated but not integrated into a system framework. System reliability modeling is the weakest of the three. NASA engineers need better methods and tools to demonstrate that the products meet NASA requirements for reliability measurement. For the new models for the software component of the last decade, there is a great need to bring them into a form in which they can be used on software intensive systems. The Statistical Modeling and Estimation of Reliability Functions for Systems (SMERFS'3) tool is an existing vehicle that may be used to incorporate these new modeling advances. Adapting some existing software reliability modeling changes to accommodate major changes in software development technology may also show substantial improvement in prediction accuracy. With some additional research, the next step is to identify and investigate system reliability. System reliability models could then be incorporated in a tool such as SMERFS'3. This tool with better models would greatly add value in assessing GSFC projects.
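
    One widely used family of software reliability growth models (not necessarily the specific models discussed above) is the Goel-Okumoto type, where the expected cumulative number of failures by time t is a(1 - exp(-bt)). Below is a hedged sketch of fitting such a curve to cumulative failure counts; it is illustrative Python with invented data and is neither SMERFS'3 nor a NASA dataset.

```python
"""Illustrative Goel-Okumoto reliability-growth fit -- not SMERFS'3, data invented."""
import numpy as np
from scipy.optimize import curve_fit


def goel_okumoto(t, a, b):
    """Expected cumulative failures by time t: a * (1 - exp(-b * t))."""
    return a * (1.0 - np.exp(-b * t))


# Invented test data: week number vs. cumulative failures observed so far.
weeks = np.arange(1, 13)
failures = np.array([5, 9, 13, 16, 18, 20, 21, 22, 23, 23, 24, 24])

(a, b), _ = curve_fit(goel_okumoto, weeks, failures, p0=(30.0, 0.1))
print(f"estimated total defects a = {a:.1f}, detection rate b = {b:.3f}")
print(f"expected residual defects: {a - failures[-1]:.1f}")
```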

  8. Guidance and Control Software Project Data - Volume 1: Planning Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.

  9. Requirements Reasoning for Distributed Requirements Analysis using Semantic Wiki

    NARCIS (Netherlands)

    Liang, Peng; Avgeriou, Paris; Clerc, Viktor

    2009-01-01

    In large-scale collaborative software projects, thousands of requirements with complex interdependencies and different granularity spreading in different levels are elicited, documented, and evolved during the project lifecycle. Non-technical stakeholders involved in requirements engineering

  10. From Pragmatic to Systematic Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    Software process improvement (SPI) is a challenging task, as many different stakeholders, project settings, and contexts and goals need to be considered. SPI projects are often operated in a complex and volatile environment and, thus, require a sound management that is resource-intensive, requiring many stakeholders to contribute to the process assessment, analysis, design, realisation, and deployment. Although there exist many valuable SPI approaches, none address the needs of both process engineers and project managers. This article presents an Artefact-based Software Process Improvement...

  11. Cost effective software internationalisation

    Directory of Open Access Journals (Sweden)

    Tim Hunt

    Full Text Available This paper describes the design and implementation of a method for allowing the user interface of a software application to be translated by the end user into any other language. It is proposed that, if used by the software industry, this technique will increase the availability of software to minority groups. The research involved the modification of an existing email application for children (www.mifrenz.com) by providing a tool for parents to modify the pre-installed translations created using automated translation tools. Standard software internationalisation techniques available with modern programming languages were extensively used. This work resulted in a fully implemented product that has been sold in 11 countries and has confirmed usage in Dutch, French, German, Spanish, Norwegian, Russian, Swedish, and English. It is concluded that, with the advent of automated translation tools and by giving the end user the ability to modify the translation as described in this paper, it is now possible for all software to be delivered with any interface language at minimal cost.
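
    A minimal sketch of the mechanism being described, end-user-editable translation files with an English fallback, is given below. It is illustrative Python; the file name, key=value format, and UI strings are assumptions for the example and are not taken from the mifrenz application.

```python
"""Illustrative lookup of user-editable UI translations (hypothetical file format)."""
from pathlib import Path

DEFAULTS = {"send": "Send", "inbox": "Inbox", "new_message": "New message"}


def load_translations(path: str = "translations_nl.txt") -> dict:
    """Read user-edited 'key=translated text' lines, falling back to English."""
    table = dict(DEFAULTS)
    file = Path(path)
    if file.exists():
        for line in file.read_text(encoding="utf-8").splitlines():
            if "=" in line and not line.lstrip().startswith("#"):
                key, _, value = line.partition("=")
                table[key.strip()] = value.strip()
    return table


ui = load_translations()
print(ui["send"], "/", ui["inbox"])
```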

  12. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but it also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand the management of distributed software projects, but it also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques into a framework for managing risks in distributed contexts. This framework was developed iteratively...

  13. SWEBOS – The Software Engineering Body of Skills

    Directory of Open Access Journals (Sweden)

    Yvonne Sedelmaier

    2015-02-01

    Full Text Available The development of complex software systems requires a mixture of various technical and non-technical competencies. While some guidelines exist regarding which technical knowledge is required to make a good software engineer, there is a lack of insight as to which non-technical or soft skills are necessary to master complex software projects. This paper proposes a body of skills (SWEBOS) for software engineering. The collection of necessary skills is developed on the basis of a clear, data-driven research design. The resulting required soft skills for software engineering are described precisely and in a semantically rich, three-level structure. This approach guarantees that skills are not just characterized in a broad and general manner, but rather they are specifically adapted to the domain of software engineering.

  14. REQUIREMENTS TO SOFTWARE FOR TEACHING ADOLESCENTS THE INTERNET SECURITY

    Directory of Open Access Journals (Sweden)

    Denys V. Stolbov

    2015-02-01

    Full Text Available The article presents requirements for software for teaching adolescents Internet security. The requirements were systematized and described in three groups. The first group includes requirements for the content of the software, formulated according to fundamental didactic principles: accessibility, correctness, visualization, problem orientation, and logical description of learning materials. The second group consists of requirements for the interface of the software: authenticity, accuracy, dynamism, simplicity and detailing. An interactivity requirement and an assistance system in the software were also analyzed. A group of requirements for the functionality and conditions of use of the software was defined.

  15. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  16. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  17. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  18. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    Science.gov (United States)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  19. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    Science.gov (United States)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  20. User Requirements for Wireless

    DEFF Research Database (Denmark)

    In most IT system development processes, the identification or elicitation of user requirements is recognized as a key building block. In practice, the identification of user needs and wants is a challenge, and inadequate or faulty identifications in this step of an IT system development can cause...... huge problems with the final product. The elicitation of user requirements as such changes according to age group, gender, cultural setting, time, and experience in the use of the system/software. User requirements, therefore, cannot be used between projects, IT systems......, and different software. That makes the elicitation of user requirements an inherent part of any software development project and a resourceful activity as well. This book provides insights into the process of identifying user requirements and into different types by describing varying case studies in which...

  1. Crowd-Centric Requirements Engineering

    NARCIS (Netherlands)

    Snijders, Remco; Dalpiaz, Fabiano; Hosseini, Mahmood; Shahri, Alimohammad; Ali, Raian

    2014-01-01

    Requirements engineering is a preliminary and crucial phase for the correctness and quality of software systems. Despite the agreement on the positive correlation between user involvement in requirements engineering and software success, current development methods employ a too narrow concept of

  2. New Media as Software

    Directory of Open Access Journals (Sweden)

    Manuel Portela

    2014-03-01

    Full Text Available Review of Lev Manovich, Software Takes Command: Extending the Language of New Media. London: Bloomsbury, 2013, 358 pp. ISBN 978-1-6235-6817-7. In Lev Manovich’s most recent book, this programmatic interrogation of our medial condition leads to the following question: do media still exist after software? This is the question that triggers Manovich’s dialogue both with computing history and with theories of digital media of recent decades, including the extension of his own previous formulations in The Language of New Media, published in 2001, and which became a major reference work in the field. The subtitle of the new book points precisely to this critical revisiting of his earlier work in the context of ubiquitous computing and accelerated transcoding of social, cultural and artistic practices by software.

  3. Lecture 2: Software Security

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development, testing and deployment. Sebastian Lopienski is CERN’s deputy Computer Security Officer. He works on security strategy and policies; offers internal consultancy and audit services; develops and ...

  4. Software to Control and Monitor Gas Streams

    Science.gov (United States)

    Arkin, C.; Curley, Charles; Gore, Eric; Floyd, David; Lucas, Damion

    2012-01-01

    This software package interfaces with various gas stream devices such as pressure transducers, flow meters, flow controllers, valves, and analyzers such as a mass spectrometer. The software provides excellent user interfacing with various windows that provide time-domain graphs, valve state buttons, priority-colored messages, and warning icons. The user can configure the software to save as much or as little data as needed to a comma-delimited file. The software also includes an intuitive scripting language for automated processing. The configuration allows for the assignment of measured values or calibration so that raw signals can be viewed as usable pressures, flows, or concentrations in real time. The software is based on those used in two safety systems for shuttle processing and one volcanic gas analysis system. Mass analyzers typically have unique applications and vary from job to job. As such, software available on the market is usually inadequate or targeted at a specific application (such as EPA methods). The goal was to develop powerful software that could be used with prototype systems. The key problem was to generalize the software to be easily and quickly reconfigurable. At Kennedy Space Center (KSC), the prior art consists of two primary methods. The first method was to utilize LabVIEW and a commercial data acquisition system. This method required rewriting code for each different application and only provided raw data. To obtain data in engineering units, manual calculations were required. The second method was to utilize one of the embedded computer systems developed for another system. This second method had the benefit of providing data in engineering units, but was limited in the number of control parameters.
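
    The raw-signal-to-engineering-units step mentioned above is essentially a per-channel calibration applied in real time. Below is a hedged Python sketch; the channel names, linear gains/offsets, and alarm limits are assumptions for illustration, not values from the KSC or volcanic-gas systems.

```python
"""Illustrative conversion of raw sensor counts to engineering units."""
from dataclasses import dataclass


@dataclass
class Channel:
    name: str
    gain: float        # engineering units per raw count (assumed linear calibration)
    offset: float
    alarm_high: float


CHANNELS = [
    Channel("He supply pressure", gain=0.25, offset=-12.0, alarm_high=3000.0),  # psia
    Channel("GN2 flow",           gain=0.01, offset=0.0,   alarm_high=150.0),   # slpm
]


def to_engineering(channel: Channel, raw_counts: int) -> float:
    return channel.gain * raw_counts + channel.offset


def check(channel: Channel, raw_counts: int) -> str:
    value = to_engineering(channel, raw_counts)
    status = "WARNING" if value > channel.alarm_high else "ok"
    return f"{channel.name}: {value:.1f} [{status}]"


for ch, counts in zip(CHANNELS, (12200, 9000)):
    print(check(ch, counts))
```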

  5. Software Safety Risk in Legacy Safety-Critical Computer Systems

    Science.gov (United States)

    Hill, Janice; Baggs, Rhoda

    2007-01-01

    Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally performing a software safety risk assessment that provides software safety measurements for legacy systems, which may or may not have the suite of software engineering documentation that is now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.
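
    The scoring idea behind a questionnaire-based assessment can be sketched very simply: weighted questions, with risk accumulating for each unfavourable answer. The Python sketch below is purely illustrative; the questions, weights, and thresholds are invented and are not those of the NASA Software Safety Standard or the Taxonomy-Based Questionnaire.

```python
"""Illustrative questionnaire-style risk score -- items, weights, thresholds invented."""

QUESTIONS = {  # question -> weight added to the risk score when the answer is "no"
    "Software safety requirements documented?": 5,
    "Hazard analysis traceable to code?": 4,
    "Regression test suite exists?": 3,
    "Original design documents available?": 2,
}


def risk_score(answers: dict) -> tuple:
    """Sum the weights of unmet items and bucket the total into a risk level."""
    score = sum(w for q, w in QUESTIONS.items() if not answers.get(q, False))
    level = "high" if score >= 8 else "moderate" if score >= 4 else "low"
    return score, level


legacy_system = {
    "Software safety requirements documented?": False,
    "Hazard analysis traceable to code?": False,
    "Regression test suite exists?": True,
    "Original design documents available?": False,
}
print(risk_score(legacy_system))  # -> (11, 'high')
```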

  6. Omics Metadata Management Software (OMMS).

    Science.gov (United States)

    Perez-Arriaga, Martha O; Wilson, Susan; Williams, Kelly P; Schoeniger, Joseph; Waymire, Russel L; Powell, Amy Jo

    2015-01-01

    Next-generation sequencing projects have underappreciated information management tasks requiring detailed attention to specimen curation, nucleic acid sample preparation and sequence production methods required for downstream data processing, comparison, interpretation, sharing and reuse. The few existing metadata management tools for genome-based studies provide weak curatorial frameworks for experimentalists to store and manage idiosyncratic, project-specific information, typically offering no automation supporting unified naming and numbering conventions for sequencing production environments that routinely deal with hundreds, if not thousands of samples at a time. Moreover, existing tools are not readily interfaced with bioinformatics executables (e.g., BLAST, Bowtie2, custom pipelines). Our application, the Omics Metadata Management Software (OMMS), answers both needs, empowering experimentalists to generate intuitive, consistent metadata, and perform analyses and information management tasks via an intuitive web-based interface. Several use cases with short-read sequence datasets are provided to validate installation and integrated function, and suggest possible methodological road maps for prospective users. Provided examples highlight possible OMMS workflows for metadata curation, multistep analyses, and results management and downloading. The OMMS can be implemented as a stand-alone package for individual laboratories, or can be configured for web-based deployment supporting geographically dispersed projects. The OMMS was developed using an open-source software base, is flexible, extensible and easily installed and executed. The OMMS can be obtained at http://omms.sandia.gov.

  7. Agile software development

    CERN Document Server

    Stober, Thomas

    2009-01-01

    Software Development is moving towards a more agile and more flexible approach. It turns out that the traditional 'waterfall' model is not supportive in an environment where technical, financial and strategic constraints are changing almost every day. But what is agility? What are today's major approaches? And especially: What is the impact of agile development principles on the development teams, on project management and on software architects? How can large enterprises become more agile and improve their business processes, which have existed for many, many years? What are the limit

  8. Software product quality control

    CERN Document Server

    Wagner, Stefan

    2013-01-01

    Quality is not a fixed or universal property of software; it depends on the context and goals of its stakeholders. Hence, when you want to develop a high-quality software system, the first step must be a clear and precise specification of quality. Yet even if you get it right and complete, you can be sure that it will become invalid over time. So the only solution is continuous quality control: the steady and explicit evaluation of a product's properties with respect to its updated quality goals.This book guides you in setting up and running continuous quality control in your environment. Star

  9. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-06-01

    Full Text Available The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, books have been published widely whose titles refer to it as an art, role or process. But because software complexity is increasing every year, this paper proposes a new approach, conceiving the test as a science. This is because the processes by which they are applied are the steps of the scientific method: inputs, processes, outputs. The contents of this paper examine the similarities between testing and science and the characteristics of testing as a science.

  10. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-07-01

    Full Text Available The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, books have been published widely whose titles refer to it as an art, role or process. But because software complexity is increasing every year, this paper proposes a new approach, conceiving the test as a science. This is because the processes by which they are applied are the steps of the scientific method: inputs, processes, outputs. The contents of this paper examine the similarities between testing and science and the characteristics of testing as a science.

  11. Processing Optimization with Canon's Software

    DEFF Research Database (Denmark)

    Precht, Helle

    2009-01-01

    Possibilities for software optimization were studied in relation to optimal image quality and control exposures, in order to investigate whether it was possible to accept diagnostic image quality and thereby take ALARA as the starting point. Method and materials: a quantitative experimental study based on experiments with a technical and...... a human phantom. A CD Rad phantom was used as the technical phantom; the images were analyzed with the CD Rad software, and the result was an objective IQF value. The human phantom was a lamb pelvis with femur, which according to the NRPB is comparable to the absorption of a five-year-old child. The human test images were...

  12. The Software Engineering Prototype.

    Science.gov (United States)

    1983-06-01

    sequential problem solving in which the cycles form networks. An essential part of this model is the continuous feedback between the designer and the... "Systems, Reality and the Systems Practitioner", Journal of Systems Management, January 1981, pp. 26-28; Gilb, Tom, "High-Level Systems...Development", Software Engineering Notes, vol. 6, no. 2, April 1981; Stavely, Allan, "Design Feedback and its Use in Software Design"

  13. Software Safety and Security

    CERN Document Server

    Nipkow, T; Hauptmann, B

    2012-01-01

    Recent decades have seen major advances in methods and tools for checking the safety and security of software systems. Automatic tools can now detect security flaws not only in programs of the order of a million lines of code, but also in high-level protocol descriptions. There has also been something of a breakthrough in the area of operating system verification. This book presents the lectures from the NATO Advanced Study Institute on Tools for Analysis and Verification of Software Safety and Security; a summer school held at Bayrischzell, Germany, in 2011. This Advanced Study Institute was

  14. Six Sigma software development

    CERN Document Server

    Tayntor, Christine B

    2002-01-01

    Since Six Sigma has had marked success in improving quality in other settings, and since the quality of software remains poor, it seems a natural evolution to apply the concepts and tools of Six Sigma to system development and the IT department. Until now however, there were no books available that applied these concepts to the system development process. Six Sigma Software Development fills this void and illustrates how Six Sigma concepts can be applied to all aspects of the evolving system development process. It includes the traditional waterfall model and in the support of legacy systems,

  15. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using APT (Automatically Programmed Tool) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in APT Language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  16. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Full Text Available Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. A change in a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of artifacts that form a bridge between software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the “big picture model”. Flow diagram analysis helped to avoid a large number of defects whose repair cost in extreme cases could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the “big picture model” improves the control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.
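
    As a rough illustration of the kind of automatic check such a "big picture model" enables, the sketch below builds a message flow graph from requirements formalized as sender/message/receiver triples and reports components that are unreachable from a chosen root. It is illustrative Python; the component names and flows are invented, and the real system's formalism is certainly richer.

```python
"""Illustrative message-flow-graph check over formalized requirements."""
from collections import defaultdict

# Hypothetical formalized requirements: (sender, message, receiver).
FLOWS = [
    ("Console",    "START_SCAN", "Controller"),
    ("Controller", "CONFIGURE",  "SensorA"),
    ("SensorA",    "SCAN_DATA",  "Controller"),
    ("Controller", "REPORT",     "Console"),
    ("Operator",   "SHUTDOWN",   "SensorB"),  # disconnected from the main flow
]


def build_graph(flows):
    graph = defaultdict(set)
    for sender, _message, receiver in flows:
        graph[sender].add(receiver)
    return graph


def unreachable_from(graph, root):
    """Components not reachable by following message flows outward from `root`."""
    seen, stack = set(), [root]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, ()))
    every = set(graph) | {r for _, _, r in FLOWS}
    return sorted(every - seen)


graph = build_graph(FLOWS)
print("unreachable from Console:", unreachable_from(graph, "Console"))
```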

  17. Some challenges facing software engineers developing software for scientists

    OpenAIRE

    Segal, Judith

    2009-01-01

    In this paper, the author discusses two types of challenges facing software engineers as they develop software for scientists. The first type is those challenges that arise from the experience that scientists might have of developing their own software. From this experience, they internalise a model of software development but may not realise the contextual factors which make such a model successful. They thus have expectations and assumptions which prove challenging to software engineers. Th...

  18. Architecture-driven requirements prioritization

    OpenAIRE

    Koziolek, Anne

    2012-01-01

    Quality requirements are main drivers for architectural decisions of software systems. However, in practice they are often dismissed during development, because of initially unknown dependencies and consequences that complicate implementation. To decide for meaningful, feasible quality requirements and trade them off with functional requirements, tighter integration of software architecture evaluation and requirements prioritization is necessary. In this position paper, we propose a tool-supp...

  19. Technology Assessment of Software Engineering

    Science.gov (United States)

    1989-02-01

    POSTON, Robert, et al., "Counting Down to Zero Software Failures," IEEE Software, September 1987. PRIETO-DIAZ, Ruben, et al., "Classifying Software for...Revisited," IEEE Software, January 1987. TAFT, Darryl, "People Still Seen as No. 1 Challenge for Technology," Government Computer News, April 1, 1988

  20. The fallacy of Software Patents

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Software patents are usually used as an argument for innovation, but do they really promote innovation? Who really benefits from software patents? This talk attempts to show the problems with software patents and how they can actually harm innovation while having little value for software users and our society in general.

  1. The Impact of Software Patents.

    Science.gov (United States)

    Kahin, Brian

    1989-01-01

    Discusses issues involved in obtaining patents for computer software. Topics discussed include copyright protection; algorithms and other software processes within programs; the software industry and the effects of patenting; perspectives on software patents within higher education; cost factors; and the use of patents to control information…

  2. Happy software developers solve problems better: psychological measurements in empirical software engineering

    Directory of Open Access Journals (Sweden)

    Daniel Graziotin

    2014-03-01

    Full Text Available For more than thirty years, it has been claimed that a way to improve software developers’ productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states—emotions and moods—deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a

  3. Happy software developers solve problems better: psychological measurements in empirical software engineering.

    Science.gov (United States)

    Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka

    2014-01-01

    For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states-emotions and moods-deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint.

  4. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

    Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test management. This book examines the principles, concepts, and processes that are fundamental to the software testing function. This book is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c

  5. The Art of Software Testing

    CERN Document Server

    Myers, Glenford J; Badgett, Tom

    2011-01-01

    The classic, landmark work on software testing The hardware and software of computing have changed markedly in the three decades since the first edition of The Art of Software Testing, but this book's powerful underlying analysis has stood the test of time. Whereas most books on software testing target particular development techniques, languages, or testing methods, The Art of Software Testing, Third Edition provides a brief but powerful and comprehensive presentation of time-proven software testing approaches. If your software development project is mission critical, this book is an investme

  6. A Generic Software Architecture For Prognostics

    Science.gov (United States)

    Teubert, Christopher; Daigle, Matthew J.; Sankararaman, Shankar; Goebel, Kai; Watkins, Jason

    2017-01-01

    Prognostics is a systems engineering discipline focused on predicting end-of-life of components and systems. As a relatively new and emerging technology, there are few fielded implementations of prognostics, due in part to practitioners perceiving a large hurdle in developing the models, algorithms, architecture, and integration pieces. As a result, no open software frameworks for applying prognostics currently exist. This paper introduces the Generic Software Architecture for Prognostics (GSAP), an open-source, cross-platform, object-oriented software framework and support library for creating prognostics applications. GSAP was designed to make prognostics more accessible and enable faster adoption and implementation by industry, by reducing the effort and investment required to develop, test, and deploy prognostics. This paper describes the requirements, design, and testing of GSAP. Additionally, a detailed case study involving battery prognostics demonstrates its use.
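
    The core prediction step in a prognostics application can be sketched as propagating a degrading health state forward until it crosses an end-of-life threshold. The sketch below is illustrative Python under assumed battery numbers; it is not GSAP, not its battery model, and not its API.

```python
"""Illustrative end-of-life prediction -- not the GSAP framework or its battery model."""


def predict_eol_cycles(capacity_ah: float, fade_per_cycle: float,
                       rated_ah: float = 2.2, eol_fraction: float = 0.8) -> int:
    """Count charge/discharge cycles until capacity drops below the EOL threshold."""
    threshold = eol_fraction * rated_ah       # assumed end-of-life definition
    cycles = 0
    while capacity_ah > threshold:
        capacity_ah -= fade_per_cycle         # assumed constant linear capacity fade
        cycles += 1
    return cycles


# Assumed current state: 2.0 Ah remaining, fading 0.002 Ah per cycle.
print("remaining useful life:", predict_eol_cycles(2.0, 0.002), "cycles")
```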

  7. ATLAS software configuration and build tool optimisation

    CERN Document Server

    Rybkin, G; The ATLAS collaboration

    2013-01-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance that was optimised within several approaches: reduction of the number of re...

  8. ATLAS software configuration and build tool optimisation

    CERN Document Server

    Rybkin, G; The ATLAS collaboration

    2014-01-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance that was optimised within several approaches: reduction of the number of re...

  9. Software Reliability Study

    Science.gov (United States)

    1976-08-01

    being solved by the software (algorithms, vector algebra, modeling code, etc.) and equations used in a bookkeeping sense (computation of indices)... A standard linear regression analysis (Phase II) was performed on the independent parameters with no constraints on the influence coefficients

  10. Software Engineering Education Directory

    Science.gov (United States)

    1990-04-01

    Macintosh, PC, Sun. Additional Information: Design and Documentation and Software Leadership are proposed as part of a revised curriculum. Master's Project is a... Computer Science, Nashville, TN 37208-3051. Degrees: BS CS, BS M. Contact: Ms. Vivan J. Fielder, Assistant Professor. Update: February 1990. Courses

  11. AOFlagger: RFI Software

    Science.gov (United States)

    Offringa, A. R.

    2010-10-01

    The RFI software presented here can automatically flag data and can be used to analyze the data in a measurement. The purpose of flagging is to mark samples that are affected by interfering sources such as radio stations, airplanes, electrical fences or other transmitting interferers. The tools in the package are meant for offline use. The software package contains a graphical interface ("rfigui") that can be used to visualize a measurement set and analyze mitigation techniques. It also contains a console flagger ("rficonsole") that can execute a script of mitigation functions without the overhead of a graphical environment. All tools were written in C++. The software has been tested extensively on low radio frequencies (150 MHz or lower) produced by the WSRT and LOFAR telescopes. LOFAR is the Low Frequency Array that is built in and around the Netherlands. Higher frequencies should work as well. Some of the methods implemented are the SumThreshold, the VarThreshold and the singular value decomposition (SVD) method. Included also are several surface fitting algorithms. The software is published under the GNU General Public License version 3.
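
    A single-baseline, one-dimensional sketch of the SumThreshold idea is given below: runs of consecutive samples are flagged when their combined (mean) power exceeds a threshold that decreases as the window grows. This is illustrative Python, not the AOFlagger implementation; the window sizes, threshold scaling, and test data are assumptions.

```python
"""Simplified SumThreshold-style flagging -- illustrative, not the AOFlagger code."""
import numpy as np


def sum_threshold(power, chi1, rho=1.5, windows=(1, 2, 4, 8)):
    """Flag runs of samples whose mean power exceeds a window-dependent threshold."""
    flags = np.zeros(power.size, dtype=bool)
    for m in windows:
        chi_m = chi1 / rho ** np.log2(m)      # larger windows get a lower threshold
        for start in range(power.size - m + 1):
            window = slice(start, start + m)
            keep = ~flags[window]
            if keep.any() and power[window][keep].mean() > chi_m:
                flags[window] = True
    return flags


rng = np.random.default_rng(0)
data = rng.rayleigh(1.0, 200)
data[50:55] += 8.0                            # injected broadband interference
print("flagged samples:", np.flatnonzero(sum_threshold(data, chi1=4.0)))
```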

  12. Application software profiles 2010

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2010-04-15

    This article presented information on new software applications designed to facilitate petroleum exploration, drilling and production activities. Computer modelling and analysis enables oil and gas producers to characterize reservoirs, estimate reserves, forecast production, plan operations and manage assets. Seven Calgary-based organizations were highlighted along with their sophisticated software tools, the applications and the new features available in each product. The geoSCOUT version 7.7 by GeoLOGIC Systems Ltd. integrates public and proprietary data on wells, well logs, reserves, pipelines, production, ownership and seismic location data. The Value Navigator and AFE Navigator by Energy Navigator provide control over reserves, production and cash flow forecasting. FAST Harmony, FAST Evolution, FAST CBM, FAST FieldNotes, Fast Piper, FAST RTA, FAST VirtuWell and FAST WellTest by Fekete Associates Inc. provide reserve evaluations for reservoir engineering projects and production data analysis. The esi.manage software program by 3esi improves business results for upstream oil and gas companies through enhanced decision making and workforce effectiveness. WELLFLO, PIPEFLO, FORGAS, OLGA, Drillbench, and MEPO wellbore solutions by Neotec provide unique platforms for flow simulation to optimize oil and gas production systems. Petrel, ECLIPSE, Avocet, PipeSim and Merak software tools by Schlumberger Information Solutions are petroleum systems modelling tools for geologic mapping, visualization modelling and reservoir engineering. StudioSL by Streamsim Technologies Inc. is a modelling tool for optimizing flood management. figs.

  13. JSATS Decoder Software Manual

    Energy Technology Data Exchange (ETDEWEB)

    Flory, Adam E.; Lamarche, Brian L.; Weiland, Mark A.

    2013-05-01

    The Juvenile Salmon Acoustic Telemetry System (JSATS) Decoder is a software application that converts a digitized acoustic signal (a waveform stored in the .bwm file format) into a list of potential JSATS Acoustic MicroTransmitter (AMT) tagcodes along with other data about the signal, including time of arrival and signal-to-noise ratios (SNR). This software is capable of decoding single files or entire directories and of viewing raw acoustic waveforms. When coupled with the JSATS Detector, the Decoder is capable of decoding in ‘real-time’ and can also provide statistical information about acoustic beacons placed within receive range of hydrophones within a JSATS array. This document details the features and functionality of the software. The document begins with software installation instructions (section 2), followed in order by instructions for decoder setup (section 3), decoding process initiation (section 4), then monitoring of beacons (section 5) using real-time decoding features. The last section in the manual describes the beacon, beacon statistics, and the results file formats. This document does not consider the raw binary waveform file format.
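
    As an illustration of the per-signal quantities the Decoder reports (time of arrival and SNR), the hypothetical Python sketch below locates a known pulse template in a digitized waveform by cross-correlation; the JSATS file formats and the actual tag-code decoding algorithm are not reproduced here.

        import numpy as np

        def detect_pulse(waveform, template, sample_rate_hz):
            """Estimate time of arrival and SNR of a pulse in a digitized waveform.

            Hypothetical sketch only: the real JSATS Decoder additionally
            demodulates the AMT tag code from the detected signal.
            """
            # The cross-correlation peak marks the most likely pulse position.
            corr = np.correlate(waveform, template, mode="valid")
            peak = int(np.argmax(np.abs(corr)))
            toa_seconds = peak / sample_rate_hz

            # Compare power inside the detected window with power outside it.
            window = waveform[peak:peak + len(template)]
            noise = np.concatenate([waveform[:peak], waveform[peak + len(template):]])
            snr_db = 10.0 * np.log10(np.mean(window ** 2) / np.mean(noise ** 2))
            return toa_seconds, snr_db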

  14. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...

  15. Open Source Software Acquisition

    DEFF Research Database (Denmark)

    Holck, Jesper; Kühn Pedersen, Mogens; Holm Larsen, Michael

    2005-01-01

    Lately we have seen a growing interest from both public and private organisations to adopt Open Source Software (OSS), not only for a few, specific applications but also on a more general level throughout the organisation. As a consequence, the organisations' decisions on adoption of OSS are becoming...

  16. Global Software Development

    DEFF Research Database (Denmark)

    Søderberg, Anne-Marie; Krishna, S.; Bjørn, Pernille

    2013-01-01

    accounts of close collaboration processes in two large and complex projects, where off-shoring of software development is moved to a strategic level, we found that the vendor was able to establish a strategic partnership through long-term engagement with the field of banking and insurance as well...

  17. Reflections on Software Research

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 17, Issue 8, August 2012, pp 810-816 (Classics). Author: Dennis M Ritchie. Permanent link: http://www.ias.ac.in/article/fulltext/reso/017/08/0810-0816

  18. MOCASSIN-prot software

    Science.gov (United States)

    MOCASSIN-prot is a software package, implemented in Perl and Matlab, for constructing protein similarity networks to classify proteins. Both domain composition and quantitative sequence similarity information are utilized in constructing the directed protein similarity networks. For each reference protein i...
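
    As a toy illustration of what a directed similarity network is (this is not the MOCASSIN-prot algorithm, whose scoring combines domain composition with sequence similarity), the Python sketch below adds an edge from protein i to protein j when their pairwise score, normalised by i's self-score, exceeds a cutoff; because each endpoint normalises differently, the edge i -> j can exist without j -> i.

        def build_directed_similarity_network(pair_scores, self_scores, cutoff=0.5):
            """Toy builder for a directed similarity network.

            pair_scores[(i, j)] holds one symmetric similarity score per
            unordered pair of distinct proteins, and self_scores[i] holds the
            score of protein i against itself.
            """
            edges = []
            for (i, j), score in pair_scores.items():
                if i == j:
                    continue
                if score / self_scores[i] >= cutoff:
                    edges.append((i, j))  # j is similar enough from i's perspective
                if score / self_scores[j] >= cutoff:
                    edges.append((j, i))  # i is similar enough from j's perspective
            return edges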

  19. Green Software Products

    NARCIS (Netherlands)

    Jagroep, E.A.

    2017-01-01

    The rising energy consumption of the ICT industry has triggered a quest for greener, more energy-efficient ICT solutions. The role of software as the true consumer of power and its potential contribution to reaching sustainability goals has increasingly been acknowledged. At the same time, it is shown to

  20. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.