WorldWideScience

Sample records for core software requirement

  1. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

Without formal, verifiable software requirements, and an effective system for managing them, the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  2. Core Flight Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The mission of the CFS project is to provide reusable software in support of human space exploration programs.   The top-level technical approach to...

  3. Software Requirements Management

    Directory of Open Access Journals (Sweden)

    Ali Altalbe

    2015-04-01

Full Text Available Requirements are defined as the desired set of characteristics of a product or a service. In the world of software development, it is estimated that more than half of the failures are attributed to poor requirements management. This means that although the software functions correctly, it is not what the client requested. Modern software requirements management methodologies are available to reduce the occurrence of such incidents. This paper reviews the available literature in the area while tabulating possible methods of managing requirements. It also highlights the benefits of following a proper guideline for the requirements management task. With the introduction of specific software tools for the requirements management task, better software products are now being developed with fewer resources.

  4. Writing testable software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Knirk, D. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This tutorial identifies common problems in analyzing requirements in the problem and constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.

  5. Software Security Requirements Gathering Instrument

    OpenAIRE

    2011-01-01

Security breaches are largely caused by vulnerable software. Since individuals and organizations mostly depend on software, it is important to produce it in a secure manner. The first step towards producing secure software is gathering security requirements. This paper describes the Software Security Requirements Gathering Instrument (SSRGI), which helps gather security requirements from the various stakeholders. This will guide the developers to gather security requirements along with th...

  6. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic impact in software production. As is also apparent from the call for papers, these perspectives focus on qualities such as reuse, adaptability, and maintainability.

  8. Software Security Requirements Gathering Instrument

    Directory of Open Access Journals (Sweden)

    Smriti Jain

    2011-08-01

Full Text Available Security breaches are largely caused by vulnerable software. Since individuals and organizations mostly depend on software, it is important to produce it in a secure manner. The first step towards producing secure software is gathering security requirements. This paper describes the Software Security Requirements Gathering Instrument (SSRGI), which helps gather security requirements from the various stakeholders. This will guide the developers to gather security requirements along with the functional requirements and further incorporate security during other phases of software development. We subsequently present case studies that describe the integration of the SSRGI instrument with the Software Requirements Specification (SRS) document as specified in standard IEEE 830-1998. The proposed SSRGI will support software developers in gathering detailed security requirements during the requirements gathering phase.

  9. Requirement emergence computation of networked software

    Institute of Scientific and Technical Information of China (English)

    HE Keqing; LIANG Peng; PENG Rong; LI Bing; LIU Jing

    2007-01-01

Emergence computation has become a hot topic in the research of complex systems in recent years. With the substantial increase in the scale and complexity of network-based information systems, uncertain user requirements from the Internet and personalized application requirements result in frequent changes to software requirements. Meanwhile, software systems with non-self-possessed resources become more and more complex. Furthermore, the interaction and cooperation requirements between software units and the running environment in service computing increase the complexity of software systems. Software systems with complex-system characteristics are developing into "Networked Software" with the characteristics of change-on-demand and change-with-cooperation. The common-sense concepts of "programming", "compiling" and "running" of software are extended from the "desktop" to the "network". The core issue of software engineering is moving to requirements engineering, which is becoming the research focus of complex-system software engineering. In this paper, we present the software network view based on complex system theory, and the concepts of networked software and networked requirements. We pose the challenge problem in the research of emergence computation of networked software requirements. A hierarchical and cooperative unified requirement modeling framework, URF (Unified Requirement Framework), and related RGPS (Role, Goal, Process and Service) meta-models are proposed. Five scales and the evolutionary growth mechanism in requirement emergence computation of networked software are given, with a focus on user-dominant and domain-oriented requirements, and the rules and predictability in requirement emergence computation are analyzed. A case study in the application of networked e-Business with evolutionary growth based on the State design pattern is presented at the end.
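
    The closing case study models evolutionary growth with the State design pattern. As a rough illustration (the class and method names below are hypothetical, not the paper's), a networked service can swap in richer behaviour as new requirements emerge:

    ```python
    from abc import ABC, abstractmethod

    class ServiceState(ABC):
        """A stage in the evolutionary growth of a networked service."""
        @abstractmethod
        def handle(self, order: str) -> str: ...

    class BasicCatalog(ServiceState):
        """Initial requirement: items can only be browsed."""
        def handle(self, order: str) -> str:
            return f"browse-only: {order} listed in catalog"

    class OnlineOrdering(ServiceState):
        """Emerged requirement: items can be ordered online."""
        def handle(self, order: str) -> str:
            return f"order accepted: {order}"

    class EBusinessService:
        """Context whose behaviour changes as new requirements emerge."""
        def __init__(self) -> None:
            self._state: ServiceState = BasicCatalog()

        def evolve(self, new_state: ServiceState) -> None:
            # A newly emerged requirement swaps in richer behaviour.
            self._state = new_state

        def handle(self, order: str) -> str:
            return self._state.handle(order)

    svc = EBusinessService()
    print(svc.handle("book"))   # behaviour of the initial state
    svc.evolve(OnlineOrdering())
    print(svc.handle("book"))   # behaviour after evolutionary growth
    ```

    The client code never changes; only the installed state object does, which is what makes the pattern a natural fit for change-on-demand systems.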

  10. UTM TCL2 Software Requirements

    Science.gov (United States)

    Smith, Irene S.; Rios, Joseph L.; McGuirk, Patrick O.; Mulfinger, Daniel G.; Venkatesan, Priya; Smith, David R.; Baskaran, Vijayakumar; Wang, Leo

    2017-01-01

The Unmanned Aircraft Systems (UAS) Traffic Management (UTM) Technical Capability Level (TCL) 2 software implements the UTM TCL 2 software requirements described herein. These software requirements are linked to the higher-level UTM TCL 2 system requirements. Each successive TCL implements additional UTM functionality, enabling additional use cases. TCL 2 demonstrated how to enable expanded multiple operations by implementing automation for beyond-visual-line-of-sight flight, tracking of operations, and operations over sparsely populated areas.

  11. Software package requirements and procurement

    OpenAIRE

    1996-01-01

    This paper outlines the problems of specifying requirements and deploying these requirements in the procurement of software packages. Despite the fact that software construction de novo is the exception rather than the rule, little or no support for the task of formulating requirements to support assessment and selection among existing software packages has been developed. We analyse the problems arising in this process and review related work. We outline the key components of a programme of ...

  12. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short-term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long-term goal is to enable a smooth transition from high-level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle, encompassing the analytical and documentary tools for software engineering as well as project management support. This paper introduces the methodology, which is fully described in [1].

  13. Conflict Resolution (CORE) for Software Quality Factors

    Science.gov (United States)

    1993-05-01

Evaluate recent algorithm and concept developments for possible use in CORE.

  14. Requirements Engineering in Building Climate Science Software

    Science.gov (United States)

    Batcheller, Archer L.

    2011-01-01

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling…

  16. DETERMINING THE CORE PART OF SOFTWARE DEVELOPMENT CURRICULUM APPLYING ASSOCIATION RULE MINING ON SOFTWARE JOB ADS IN TURKEY

    Directory of Open Access Journals (Sweden)

    Ilkay Yelmen

    2016-01-01

Full Text Available Software technology is advancing rapidly. To keep pace with this advancement, software development employees must renew their skills consistently. During this rapid change, it is vital to train software developers according to the criteria desired by the industry. Therefore, the curriculum of university programs related to software development should be revised according to software industry requirements. In this study, the core part of a software development curriculum is determined by applying association rule mining to software job ads in Turkey. The courses in the core part are chosen with respect to the IEEE/ACM computer science curriculum. As a future study, it is also important to gather academic personnel and software company professionals to determine the compulsory and elective courses so that newly graduated software dev
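
    The mining step this abstract describes can be sketched with a toy support/confidence computation over invented job-ad skill sets (the data, skills, and thresholds below are illustrative only, not the study's):

    ```python
    from itertools import combinations

    # Toy corpus: each job ad is a set of required skills (hypothetical data).
    ads = [
        {"java", "sql", "git"},
        {"java", "spring", "sql"},
        {"python", "sql", "git"},
        {"java", "spring", "git"},
        {"java", "sql", "spring"},
    ]

    def support(itemset, transactions):
        """Fraction of ads that contain every skill in the itemset."""
        return sum(itemset <= t for t in transactions) / len(transactions)

    # Frequent skill pairs above a minimum support threshold.
    min_sup = 0.4
    items = sorted({s for ad in ads for s in ad})
    pairs = [frozenset(p) for p in combinations(items, 2)]
    frequent = {p: support(p, ads) for p in pairs if support(p, ads) >= min_sup}

    # Rules X -> Y with confidence = sup(X ∪ Y) / sup(X).
    for pair, sup in sorted(frequent.items(), key=lambda kv: -kv[1]):
        for x in pair:
            y = next(iter(pair - {x}))
            conf = sup / support(frozenset({x}), ads)
            if conf >= 0.75:
                print(f"{x} -> {y}  support={sup:.2f} confidence={conf:.2f}")
    ```

    High-confidence rules (e.g. "ads demanding spring almost always also demand java") are the signal used to group skills, and hence courses, into a core curriculum.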

  17. Requirements Engineering for Software Integrity and Safety

    Science.gov (United States)

    Leveson, Nancy G.

    2002-01-01

Requirements flaws are the most common cause of errors and software-related accidents in operational software. Most aerospace firms list requirements as one of their most important outstanding software development problems, and all of the recent NASA spacecraft losses related to software (including the highly publicized Mars Program failures) can be traced to requirements flaws. In light of these facts, it is surprising that relatively little research is devoted to requirements in contrast with other software engineering topics. The research proposed built on our previous work, including both criteria for determining whether a requirements specification is acceptably complete and a new approach to structuring system specifications called Intent Specifications. This grant was to fund basic research on how these ideas could be extended to leverage innovative approaches to the problems of (1) reducing the impact of changing requirements, (2) finding requirements specification flaws early through formal and informal analysis, and (3) avoiding common flaws entirely through appropriate requirements specification language design.

  18. Requirements engineering: foundation for software quality

    NARCIS (Netherlands)

    Daneva, Maia; Pastor, Oscar

    2016-01-01

    Welcome to the proceedings of the 22nd edition of REFSQ: the International Working Conference on Requirements Engineering – Foundation for Software Quality! Requirements engineering (RE) has been recognized as a critical factor that impacts the quality of software, systems, and services. Since the

  19. Analyzing the Core Flight Software (CFS) with SAVE

    Science.gov (United States)

    Ganesan, Dharmalingam; Lindvall, Mikael; McComas, David

    2008-01-01

This viewgraph presentation describes the SAVE tool and its application to the Core Flight Software (CFS). The contents include: 1) Fraunhofer - a short intro; 2) Context of this Collaboration; 3) CFS - Core Flight Software; 4) The SAVE Tool; 5) Applying SAVE to CFS - a few example analyses; and 6) Goals.

  20. SECURED CLOUD SUPPORT FOR GLOBAL SOFTWARE REQUIREMENT RISK MANAGEMENT

    OpenAIRE

    Shruti Patil; Roshani Ade

    2014-01-01

This paper presents a core solution to the security of Global Software Development requirement information. Currently, the major issue is the hacking of sensitive client information, which may lead to major financial as well as social loss. To avoid this, the system provides cloud security through encryption of data; deploying the tool over the cloud provides significant security to the whole global content management system. The core findings are presented in terms of how hac...

  1. Identifying dependability requirements for space software systems

    Directory of Open Access Journals (Sweden)

    Edgar Toshiro Yano

    2010-09-01

Full Text Available Computer systems are increasingly used in space, whether in launch vehicles, satellites, ground support and payload systems. Software applications used in these systems have become more complex, mainly due to the high number of features to be met, thus contributing to a greater probability of hazards related to software faults. Therefore, it is fundamental that the requirements specification activity has a decisive role in the effort of obtaining systems with high quality and safety standards. In critical systems like the embedded software of the Brazilian Satellite Launcher, ambiguity, incompleteness, and lack of good requirements can cause serious accidents with economic, material and human losses. One way to assure quality with safety, reliability and other dependability attributes may be the use of safety analysis techniques during the initial phases of the project in order to identify the most adequate dependability requirements to minimize possible fault or failure occurrences during the subsequent phases. This paper presents a structured software dependability requirements analysis process that uses system software requirement specifications and traditional safety analysis techniques. The main goal of the process is to help identify a set of essential software dependability requirements which can be added to the software requirements previously specified for the system. The final results are more complete, consistent, and reliable specifications.

  2. Core software security security at the source

    CERN Document Server

    Ransome, James

    2013-01-01

First and foremost, Ransome and Misra have made an engaging book that will empower readers in both large and small software development and engineering organizations to build security into their products. This book clarifies to executives the decisions to be made on software security and then provides guidance to managers and developers on process and procedure. Readers are armed with firm solutions for the fight against cyber threats. - Dr. Dena Haritos Tsamitis, Carnegie Mellon University. In the wake of cloud computing and mobile apps, the issue of software security has never been more importan

  3. Core Flight Software (CFS) Maturation Towards Human Rating Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Core Flight Software (CFS) system developed by Goddard Space Flight Center, through experience on Morpheus, has proven to be a quality product and a viable...

  4. Requirements engineering for software and systems

    CERN Document Server

    Laplante, Phillip A

    2014-01-01

    Solid requirements engineering has increasingly been recognized as the key to improved, on-time and on-budget delivery of software and systems projects. This book provides practical teaching for graduate and professional systems and software engineers. It uses extensive case studies and exercises to help students grasp concepts and techniques. With a focus on software-intensive systems, this text provides a probing and comprehensive review of recent developments in intelligent systems, soft computing techniques, and their diverse applications in manufacturing. The second edition contains 100% revised content and approximately 30% new material

  5. Software Engineering for Multi-core Platforms

    NARCIS (Netherlands)

    Arbab, F.; Jongmans, S.-S.T.Q.

    2012-01-01

    Decades after Turing proposed his model of computation, we still lack suitable means to tackle the complexity of getting more than a few Turing Machines to interact with one another in a verifiably coherent manner. This dearth currently hampers software engineering in unleashing the full potential o

  6. Capturing security requirements for software systems

    Directory of Open Access Journals (Sweden)

    Hassan El-Hadary

    2014-07-01

Full Text Available Security is often an afterthought during software development. Realizing security early, especially in the requirements phase, is important so that security problems can be tackled early enough before going further in the process, avoiding rework. A more effective approach to security requirements engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirements elicitation based on problem frames. The methodology aims at early integration of security with software development. The main goal of the methodology is to assist developers in eliciting adequate security requirements in a more systematic way during the requirements engineering process. A security catalog, based on the problem frames, is constructed in order to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We have made use of evaluation criteria to evaluate the resulting security requirements, concentrating on the identification of conflicts among requirements. We have shown that more complete security requirements can be elicited by such a methodology, in addition to the assistance offered to developers to elicit security requirements in a more systematic way.
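
    The catalog-driven elicitation described here could be sketched as a simple lookup from problem-frame category to candidate security requirements. The categories and requirement texts below are invented for illustration; the paper's actual catalog differs:

    ```python
    # Hypothetical catalog: problem-frame category -> candidate security requirements.
    CATALOG = {
        "commanded_behaviour": [
            "Authenticate the operator before accepting commands.",
            "Log every issued command for audit.",
        ],
        "information_display": [
            "Restrict displayed data to authorized roles.",
        ],
        "workpieces": [
            "Protect stored work items against unauthorized modification.",
        ],
    }

    def elicit(frames):
        """Collect candidate security requirements for the problem frames in a system model."""
        reqs = []
        for frame in frames:
            reqs.extend(CATALOG.get(frame, []))
        return reqs

    # A system modelled with two problem frames yields three candidate requirements,
    # which developers then refine against threats modelled as abuse frames.
    for r in elicit(["commanded_behaviour", "information_display"]):
        print("-", r)
    ```

    The point of the catalog is reuse of previous security knowledge: each frame match surfaces requirements a developer might otherwise overlook.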

  7. Capturing security requirements for software systems.

    Science.gov (United States)

    El-Hadary, Hassan; El-Kassas, Sherif

    2014-07-01

Security is often an afterthought during software development. Realizing security early, especially in the requirements phase, is important so that security problems can be tackled early enough before going further in the process, avoiding rework. A more effective approach to security requirements engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirements elicitation based on problem frames. The methodology aims at early integration of security with software development. The main goal of the methodology is to assist developers in eliciting adequate security requirements in a more systematic way during the requirements engineering process. A security catalog, based on the problem frames, is constructed in order to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We have made use of evaluation criteria to evaluate the resulting security requirements, concentrating on the identification of conflicts among requirements. We have shown that more complete security requirements can be elicited by such a methodology, in addition to the assistance offered to developers to elicit security requirements in a more systematic way.

  8. A Framework for Modelling Software Requirements

    Directory of Open Access Journals (Sweden)

    Dhirendra Pandey

    2011-05-01

Full Text Available Requirements engineering plays an important role in producing quality software products. In recent years, some requirements framework approaches have been designed to provide an end-to-end solution for the system development life cycle. Textual requirements specifications are difficult to learn, design, understand, review, and maintain, whereas pictorial modelling is widely recognized as an effective requirements analysis tool. In this paper, we present a requirements modelling framework along with an analysis of modern requirements modelling techniques. We also discuss various domains of requirements engineering with the help of modelling elements such as a semantic map of business concepts, lifecycles of business objects, business processes, business rules, a system context diagram, use cases and their scenarios, constraints, and user interface prototypes. The proposed framework is illustrated with a case study of an inventory management system.

  9. 78 FR 47015 - Software Requirement Specifications for Digital Computer Software Used in Safety Systems of...

    Science.gov (United States)

    2013-08-02

    ... COMMISSION Software Requirement Specifications for Digital Computer Software Used in Safety Systems of... 1 of RG 1.172, ``Software Requirement Specifications for Digital Computer Software used in Safety... (IEEE) Standard (Std.) 830-1998, ``IEEE Recommended Practice for Software Requirements Specifications...

  10. Standards and methods for software requirements specification

    OpenAIRE

    2014-01-01

This thesis presents a comparison of three selected standards and methods for software requirements specification. The IEEE 830 specification and use cases represent an older generation of methods for specification writing, while user stories represent a newer one. Each method is first explained in theory and then on a practical example. E-študent, an application well known to the students of the Faculty of Computer and Information Science, serves as our practical example. The application is onl...

  11. Designing Law-Compliant Software Requirements

    Science.gov (United States)

    Siena, Alberto; Mylopoulos, John; Perini, Anna; Susi, Angelo

New laws, such as HIPAA and SOX, are increasingly impacting the design of software systems, as business organisations strive to comply. This paper studies the problem of generating a set of requirements for a new system which comply with a given law. Specifically, the paper proposes a systematic process for generating law-compliant requirements by using a taxonomy of legal concepts and a set of primitives to describe stakeholders and their strategic goals. Given a model of law and a model of stakeholders' goals, legal alternatives are identified and explored. Strategic goals that can realise legal prescriptions are systematically analysed, and alternative ways of fulfilling a law are evaluated. The approach is demonstrated by means of a case study. This work is part of the Nomos framework, intended to support the design of law-compliant requirements models.

  12. Revisiting the Core Ontology and Problem in Requirements Engineering

    CERN Document Server

    Jureta, Ivan; Faulkner, Stephane; 10.1109/RE.2008.13

    2008-01-01

    In their seminal paper in the ACM Transactions on Software Engineering and Methodology, Zave and Jackson established a core ontology for Requirements Engineering (RE) and used it to formulate the "requirements problem", thereby defining what it means to successfully complete RE. Given that stakeholders of the system-to-be communicate the information needed to perform RE, we show that Zave and Jackson's ontology is incomplete. It does not cover all types of basic concerns that the stakeholders communicate. These include beliefs, desires, intentions, and attitudes. In response, we propose a core ontology that covers these concerns and is grounded in sound conceptual foundations resting on a foundational ontology. The new core ontology for RE leads to a new formulation of the requirements problem that extends Zave and Jackson's formulation. We thereby establish new standards for what minimum information should be represented in RE languages and new criteria for determining whether RE has been successfully comple...

  13. Core Requirements for the Economics Major

    Science.gov (United States)

    Petkus, Marie; Perry, John J.; Johnson, Bruce K.

    2014-01-01

    In this article, the authors are the first to describe the core economics curriculum requirements for economics majors at all American colleges and universities, as opposed to a sample of institutions. Not surprisingly, principles of economics is nearly universally required and implemented as a two-semester course in 85 percent of economics major…

  15. Requirements engineering and management for software development projects

    CERN Document Server

    Chemuturi, Murali

    2012-01-01

Requirements Engineering and Management for Software Development Projects presents a complete guide on requirements for software development, including engineering, computer science and management activities. It is the first book to cover all aspects of requirements management in software development projects. This book introduces the understanding of requirements, elicitation and gathering, requirements analysis, verification and validation of requirements, establishment of requirements, different methodologies in brief, requirements traceability and change management, among other topics.

  16. A Quantitative Study of Global Software Development Teams, Requirements, and Software Projects

    Science.gov (United States)

    Parker, Linda L.

    2016-01-01

    The study explored the relationship between global software development teams, effective software requirements, and stakeholders' perception of successful software development projects within the field of information technology management. It examined the critical relationship between Global Software Development (GSD) teams creating effective…

  18. Graph Based Verification of Software Evolution Requirements

    NARCIS (Netherlands)

    Ciraci, S.

    2009-01-01

    Due to market demands and changes in the environment, software systems have to evolve. However, the size and complexity of the current software systems make it time consuming to incorporate changes. During our collaboration with the industry, we observed that the developers spend much time on the

  19. Secure and Resilient Software Requirements, Test Cases, and Testing Methods

    CERN Document Server

    Merkow, Mark S

    2011-01-01

    Secure and Resilient Software: Requirements, Test Cases, and Testing Methods provides a comprehensive set of requirements for secure and resilient software development and operation. It supplies documented test cases for those requirements as well as best practices for testing nonfunctional requirements for improved information assurance. This resource-rich book includes: Pre-developed nonfunctional requirements that can be reused for any software development project Documented test cases that go along with the requirements and can be used to develop a Test Plan for the software Testing method

  20. Requirements: Towards an understanding on why software projects fail

    Science.gov (United States)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.

    2016-08-01

Requirements engineering is at the foundation of every successful software project. There are many reasons for software project failures; however, a poorly engineered requirements process contributes immensely to why software projects fail. Software project failure is usually costly and risky and could even be life-threatening. Projects that undermine requirements engineering suffer, or are likely to suffer, from failures, challenges and other attendant risks. The estimated cost of project failures and overruns is very large. Furthermore, software project failures or overruns pose a challenge in today's competitive market environment, affecting the company's image, goodwill, and revenue drive and decreasing the perceived satisfaction of customers and clients. In this paper, requirements engineering is discussed and its role in software project success elaborated. The place of the software requirements process in relation to software project failure is explored and examined. Project success and failure factors are also discussed, with emphasis placed on requirements factors, as they play a major role in software projects' challenges, successes and failures. The paper relies on secondary data and empirical statistics to explore and examine factors responsible for the successes, challenges and failures of software projects in large, medium and small-scale software companies.

  1. Training Requirements and Information Management System. Software user guide

    Energy Technology Data Exchange (ETDEWEB)

    Cillan, T.F.; Hodgson, M.A.

    1992-05-01

    This is the software user's guide for the Training Requirements and Information Management System. This guide defines and describes the software operating procedures as they apply to the end user of the software program. It is intended as a reference tool for the user who already has an in-depth knowledge of the Training Requirements and Information Management System functions and data reporting requirements.

  2. Section 508 Electronic Information Accessibility Requirements for Software Development

    Science.gov (United States)

    Ellis, Rebecca

    2014-01-01

    Section 508 Subpart B 1194.21 outlines requirements for operating system and software development in order to create a product that is accessible to users with various disabilities. This portion of Section 508 contains a variety of standards to enable those using assistive technology and with visual, hearing, cognitive and motor difficulties to access all information provided in software. The focus on requirements was limited to the Microsoft Windows® operating system as it is the predominant operating system used at this center. Compliance with this portion of the requirements can be obtained by integrating the requirements into the software development cycle early and by remediating issues in legacy software if possible. There are certain circumstances with software that may arise necessitating an exemption from these requirements, such as design or engineering software using dynamically changing graphics or numbers to convey information. These exceptions can be discussed with the Section 508 Coordinator and another method of accommodation used.

  3. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Science.gov (United States)

    2012-08-22

    ... COMMISSION Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in... Digital Computer Software and Complex Electronics used in Safety Systems of Nuclear Power Plants.'' The DG... National Standards Institute and Institute of Electrical and Electronics Engineers (ANSI/IEEE) Standard...

  4. Proposing an Evidence-Based Strategy for Software Requirements Engineering.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2016-01-01

    This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based, since it uses publications on the specific problem as a surrogate for stakeholder interests, to formulate risks and testing experiences. This complements the idea that agile software development models are more relevant, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a Software Requirements list used to develop software systems for patient registries.

  5. Software requirements specification for an ammunition management system

    OpenAIRE

    Alderman, Robert Bruce

    1986-01-01

    Approved for public release; distribution is unlimited. This thesis concerns the software requirements necessary to automate the present manual effort associated with ammunition inventory management and reporting at the afloat end-user level. Functional characteristics for the application software are developed, program and data structures are proposed and possible sources of data are identified. The end-product of this research is the software requirements specification. This document sup...

  6. Feature-Oriented Nonfunctional Requirement Analysis for Software Product Line

    Institute of Scientific and Technical Information of China (English)

    Xin Peng; Seok-Won Lee; Wen-Yun Zhao

    2009-01-01

    Domain analysis in software product line (SPL) development provides a basis for core assets design and implementation by a systematic and comprehensive commonality/variability analysis. In feature-oriented SPL methods, products of the domain analysis are domain feature models and corresponding feature decision models to facilitate application-oriented customization. As in requirement analysis for a single system, the domain analysis in the SPL development should consider both functional and nonfunctional domain requirements. However, the nonfunctional requirements (NFRs) are often neglected in the existing domain analysis methods. In this paper, we propose a context-based method of the NFR analysis for the SPL development. In the method, NFRs are materialized by connecting nonfunctional goals with real-world context, thus NFR elicitation and variability analysis can be performed by context analysis for the whole domain with the assistance of NFR templates and NFR graphs. After the variability analysis, our method integrates both functional and nonfunctional perspectives by incorporating the nonfunctional goals and operationalizations into an initial functional feature model. NFR-related constraints are also elicited and integrated. Finally, a decision model with both functional and nonfunctional perspectives is constructed to facilitate application-oriented feature model customization. A computer-aided grading system (CAGS) product line is employed to demonstrate the method throughout the paper.

  7. Analyzing Software Requirements Errors in Safety-Critical, Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1993-01-01

    This paper analyzes the root causes of safety-related software errors in safety-critical, embedded systems. The results show that software errors identified as potentially hazardous to the system tend to be produced by different error mechanisms than non- safety-related software errors. Safety-related software errors are shown to arise most commonly from (1) discrepancies between the documented requirements specifications and the requirements needed for correct functioning of the system and (2) misunderstandings of the software's interface with the rest of the system. The paper uses these results to identify methods by which requirements errors can be prevented. The goal is to reduce safety-related software errors and to enhance the safety of complex, embedded systems.

  8. The caCORE Software Development Kit: Streamlining construction of interoperable biomedical information services

    Directory of Open Access Journals (Sweden)

    Warzel Denise

    2006-01-01

    Full Text Available Abstract Background Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs. The National Cancer Institute (NCI developed the cancer common ontologic representation environment (caCORE to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results The caCORE SDK requires a Unified Modeling Language (UML tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG program, to create compatible data services. 
caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has

  9. The Use of UML for Software Requirements Expression and Management

    Science.gov (United States)

    Murray, Alex; Clark, Ken

    2015-01-01

    It is common practice to write English-language "shall" statements to embody detailed software requirements in aerospace software applications. This paper explores the use of the UML language as a replacement for the English language for this purpose. Among the advantages offered by the Unified Modeling Language (UML) is a high degree of clarity and precision in the expression of domain concepts as well as architecture and design. Can this quality of UML be exploited for the definition of software requirements? While expressing logical behavior, interface characteristics, timeliness constraints, and other constraints on software using UML is commonly done and relatively straight-forward, achieving the additional aspects of the expression and management of software requirements that stakeholders expect, especially traceability, is far less so. These other characteristics, concerned with auditing and quality control, include the ability to trace a requirement to a parent requirement (which may well be an English "shall" statement), to trace a requirement to verification activities or scenarios which verify that requirement, and to trace a requirement to elements of the software design which implement that requirement. UML Use Cases, designed for capturing requirements, have not always been satisfactory. Some applications of them simply use the Use Case model element as a repository for English requirement statements. Other applications of Use Cases, in which Use Cases are incorporated into behavioral diagrams that successfully communicate the behaviors and constraints required of the software, do indeed take advantage of UML's clarity, but not in ways that support the traceability features mentioned above. Our approach uses the Stereotype construct of UML to precisely identify elements of UML constructs, especially behaviors such as State Machines and Activities, as requirements, and also to achieve the necessary mapping capabilities. 
We describe this approach in the

  10. Semantic-Based Requirements Content Management for Cloud Software

    Directory of Open Access Journals (Sweden)

    Jianqiang Hu

    2015-01-01

    Full Text Available Cloud Software is a complex software system whose topology and behavior can evolve dynamically in Cloud-computing environments. Given the unpredictable, dynamic, elastic, and on-demand nature of the Cloud, it would be unrealistic to assume that traditional software engineering can "cleanly" satisfy the behavioral requirements of Cloud Software. In particular, most traditional requirements management takes a document-centric approach, with a low degree of automation, coarse-grained management, and limited support for requirements modeling activities. Facing these challenges, and based on the metamodeling framework RGPS (Role-Goal-Process-Service), an international standard, this paper first presents a hierarchical framework of semantic-based requirements content management for Cloud Software. It then focuses on some of the important management techniques in this framework, such as the native storage scheme, an ordered index with keywords, requirements instance classification based on linear conditional random fields (CRFs), and a breadth-first search algorithm for associated instances. Finally, a prototype tool called RGPS-RM for semantic-based requirements content management is implemented to provide supporting services for the open requirements process of Cloud Software. The proposed framework, applied to Cloud Software development, is demonstrated to show its validity and applicability. RGPS-RM also demonstrates the effect of fine-grained retrieval and of the breadth-first search algorithm for associated instances in visualization.

  11. MODIS. Volume 1: MODIS level 1A software baseline requirements

    Science.gov (United States)

    Masuoka, Edward; Fleig, Albert; Ardanuy, Philip; Goff, Thomas; Carpenter, Lloyd; Solomon, Carl; Storey, James

    1994-01-01

    This document describes the level 1A software requirements for the moderate resolution imaging spectroradiometer (MODIS) instrument. This includes internal and external requirements. Internal requirements include functional, operational, and data processing as well as performance, quality, safety, and security engineering requirements. External requirements include those imposed by data archive and distribution systems (DADS); scheduling, control, monitoring, and accounting (SCMA); product management (PM) system; MODIS log; and product generation system (PGS). Implementation constraints and requirements for adapting the software to the physical environment are also included.

  12. Impact Propagation of Human Errors on Software Requirements Volatility

    Directory of Open Access Journals (Sweden)

    Zahra Askarinejadamiri

    2017-02-01

    Full Text Available Requirements volatility (RV) is one of the key risk sources in software development and maintenance projects because of the frequent changes made to the software. Human faults and errors are major factors contributing to requirements change in software development projects, so predicting requirements volatility is a challenge for risk management in the software area. Previous studies focused only on certain aspects of human error in this area. This study specifically identifies and analyses the impact of human errors on requirements gathering and requirements volatility. It proposes a model based on responses to a survey questionnaire administered to 215 participants with experience in software requirements gathering. Exploratory factor analysis (EFA) and structural equation modelling (SEM) were used to analyse the correlation between human errors and requirements volatility. The results of the analysis confirm this correlation and show that human actions have a higher impact on RV than human perception. The study provides insights that help software management understand the socio-technical aspects of requirements volatility in order to control risk. Human actions and perceptions are root causes contributing to the human errors that lead to RV.

  13. Green Software Engineering Adaption In Requirement Elicitation Process

    Directory of Open Access Journals (Sweden)

    Umma Khatuna Jannat

    2015-08-01

    Full Text Available A recent line of work investigates the role of environmental concerns in software, that is, green software systems. It is now widely accepted that green software practices can fit every process of software development, including the requirements elicitation process. Software companies today use requirements elicitation techniques in the vast majority of projects, because this process plays an increasingly important role in software development, and most elicitation work is already supported by dedicated techniques and tools. The intention of this research is therefore to adapt green software engineering to existing elicitation techniques and to recommend suitable actions for improvement. The research involved qualitative data: using a small set of keywords, IEEE, ACM, Springer, Elsevier, Google Scholar, Scopus and Wiley were searched for articles published between 2010 and 2016. The literature review identified 15 traditional requirements elicitation factors and 23 improvement techniques for converting to green engineering. The paper includes a short review of the literature, a description of the grounded theory, and a discussion of issues related to the need for improved requirements elicitation techniques.

  14. Dynamic optical resource allocation for mobile core networks with software defined elastic optical networking.

    Science.gov (United States)

    Zhao, Yongli; Chen, Zhendong; Zhang, Jie; Wang, Xinbo

    2016-07-25

    Driven by the advent of 5G mobile communications, the all-IP architecture of mobile core networks, i.e. the evolved packet core (EPC) proposed by 3GPP, has been greatly challenged by users' demands for higher data rates and more reliable end-to-end connections, as well as operators' demands for low operational cost. These challenges can potentially be met by software defined optical networking (SDON), which enables dynamic resource allocation according to users' requirements. In this article, a novel network architecture for the mobile core network is proposed based on SDON. A software defined network (SDN) controller is designed to realize coordinated control over the different entities in EPC networks. We analyze the requirements of the EPC-lightpath (EPCL) in the data plane and propose an optical switch load balancing (OSLB) algorithm for resource allocation in the optical layer. The procedure for establishment and adjustment of EPCLs is demonstrated on an SDON-based EPC testbed with an extended OpenFlow protocol. We also evaluate the OSLB algorithm through simulation in terms of bandwidth blocking ratio, traffic load distribution, and resource utilization ratio, compared with link-based load balancing (LLB) and MinHops algorithms.
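The abstract names the OSLB algorithm but does not spell it out. As a rough, hypothetical sketch of the underlying load-balancing idea only (the switch names and load figures below are invented, not taken from the paper), a least-loaded choice for a new EPC lightpath could look like:

```python
# Hypothetical per-switch load: number of active EPC lightpaths (EPCLs)
# currently routed through each optical switch.
loads = {"OS1": 12, "OS2": 5, "OS3": 9}

def pick_switch(candidates, loads):
    """Load-balancing choice: place the new EPCL on the least-loaded switch."""
    return min(candidates, key=lambda s: loads[s])

chosen = pick_switch(["OS1", "OS2", "OS3"], loads)
loads[chosen] += 1  # account for the newly established lightpath
```

A real SDN controller would of course also check wavelength continuity and path feasibility before committing the lightpath; this sketch isolates the load-balancing criterion alone.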

  15. RePizer:a framework for prioritization of software requirements

    Institute of Scientific and Technical Information of China (English)

    Saif Ur Rehman KHAN; Sai Peck LEE; Mohammad DABBAGH; Muhammad TAHIR; Muzafar KHAN; Muhammad ARIF

    2016-01-01

    The standard software development life cycle heavily depends on requirements elicited from stakeholders. Based on those requirements, software development is planned and managed from its inception phase to closure. Due to time and resource constraints, it is imperative to identify the high-priority requirements that need to be considered first during the software development process. Moreover, existing prioritization frameworks lack a store of historical data useful for selecting the most suitable prioritization technique for any similar project domain. In this paper, we propose a framework for prioritization of software requirements, called RePizer, to be used in conjunction with a selected prioritization technique to rank software requirements based on defined criteria such as implementation cost. RePizer assists requirements engineers in a decision-making process by retrieving historical data from a requirements repository. RePizer also provides a panoramic view of the entire project to ensure the judicious use of software development resources. We compared the performance of RePizer in terms of expected accuracy and ease of use while separately adopting two different prioritization techniques, planning game (PG) and analytical hierarchy process (AHP). The results showed that RePizer performed better when used in conjunction with the PG technique.
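Of the two techniques compared, AHP derives a ranking from pairwise importance judgments. A minimal sketch, using the common column-average approximation to the principal-eigenvector weights that AHP formally prescribes (the matrix values are purely illustrative):

```python
def ahp_weights(pairwise):
    """Approximate AHP priority weights by averaging the normalized columns
    of a pairwise-comparison matrix (a common stand-in for the eigenvector)."""
    n = len(pairwise)
    col_sums = [sum(pairwise[i][j] for i in range(n)) for j in range(n)]
    normalized = [[pairwise[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(row) / n for row in normalized]

# Hypothetical judgments for three requirements R1..R3:
# R1 is 3x as important as R2 and 5x as important as R3, etc.
matrix = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(matrix)
ranking = sorted(range(len(weights)), key=lambda i: -weights[i])  # R1 first
```

The quadratic growth in pairwise judgments is one reason the study could find the simpler planning game easier to use at scale.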

  16. Review of Requirements Management Issues in Software Development

    Directory of Open Access Journals (Sweden)

    Muhammad Naeem Ahmed Khan

    2013-01-01

    Full Text Available A requirement is a capability to which a product or service should conform. Meticulous attention to requirements engineering acts as the backbone of software projects. Ambiguous and unrealistic requirements are a major source of failure in software-intensive systems. Requirements engineering processes are complex, as most requirements engineering documentation is written in natural languages, which are less formal and often distract designers and developers. Requirements management is a continuous process throughout the project lifecycle and involves documenting, analyzing, tracing and prioritizing requirements and then, finally, controlling changes. The main issues related to requirements management are usually social, political and cultural. Software requirements engineers who gather the requirements generally consider such issues to be beyond the scope of their profession, deeming them to fall within the project management ambit. In this study, we highlight the management issues that arise in the requirements engineering process and explore the possibilities for tackling them amicably. The study is supplemented with a critical review of the existing methodologies for resolving and managing software requirements.

  17. Requirements Specifications Checking of Embedded Real-Time Software

    Institute of Scientific and Technical Information of China (English)

    WU Guoqing(毋国庆); SHU Fengdi(舒风笛); WANG Min(王敏); CHEN Weiqing(陈伟清)

    2002-01-01

    After introducing an overview of our requirements description model HRFSM, the paper presents a dynamic execution model (DERTS) of embedded real-time software, which can integrate control flow, data flow and time. Based on DERTS, a checking method is also presented. It consists of three kinds of checking and can check the consistency and completeness of the requirements specifications of embedded real-time software. Besides providing information helpful for improving the efficiency of analyzing and checking specifications, the checking method is flexible and easy for the analyst to understand and use.

  18. Spectrum analysis on quality requirements consideration in software design documents.

    Science.gov (United States)

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
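The abstract does not define how a document's "spectrum" is computed. One plausible reading, sketched here with invented keyword lists and sample sentences, is a per-attribute count of quality-related terms in each document, which makes it possible to flag quality requirements that a design document fails to carry forward:

```python
import re
from collections import Counter

# Hypothetical quality attributes and keyword lists; a real tool would use a
# curated term catalog rather than these few illustrative strings.
QUALITY_TERMS = {
    "security": ["encrypt", "authenticate", "password"],
    "performance": ["latency", "throughput", "response time"],
    "usability": ["usable", "intuitive", "accessible"],
}

def quality_spectrum(text):
    """Count quality-attribute keyword hits: a crude 'spectrum' of the document."""
    text = text.lower()
    return Counter({attr: sum(len(re.findall(kw, text)) for kw in kws)
                    for attr, kws in QUALITY_TERMS.items()})

req_doc = "The system shall encrypt data and keep response time under 2s."
design_doc = "Module A stores data in plain text; latency budget is 1s."

req_spec = quality_spectrum(req_doc)
des_spec = quality_spectrum(design_doc)
# Attributes present in the requirements but absent from the design:
dropped = [a for a in QUALITY_TERMS if req_spec[a] > 0 and des_spec[a] == 0]
```

Here the mismatch on the security attribute would surface exactly the kind of requirements/design gap the method aims to expose.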

  19. More about software requirements thorny issues and practical advice

    CERN Document Server

    Wiegers, Karl E

    2006-01-01

    No matter how much instruction you've had on managing software requirements, there's no substitute for experience. Too often, lessons about requirements engineering processes lack the no-nonsense guidance that supports real-world solutions. Complementing the best practices presented in his book, Software Requirements, Second Edition, requirements engineering authority Karl Wiegers tackles even more of the real issues head-on in this book. With straightforward, professional advice and practical solutions based on actual project experiences, this book answers many of the tough questions rais

  20. Geolocating thermal binoculars based on a software defined camera core incorporating HOT MCT grown by MOVPE

    Science.gov (United States)

    Pillans, Luke; Harmer, Jack; Edwards, Tim; Richardson, Lee

    2016-05-01

    Geolocation is the process of calculating a target position based on bearing and range relative to the known location of the observer. A high performance thermal imager with integrated geolocation functions is a powerful long range targeting device. Firefly is a software defined camera core incorporating a system-on-a-chip processor running the Android™ operating system. The processor has a range of industry standard serial interfaces which were used to interface to peripheral devices including a laser rangefinder and a digital magnetic compass. The core has built in Global Positioning System (GPS) which provides the third variable required for geolocation. The graphical capability of Firefly allowed flexibility in the design of the man-machine interface (MMI), so the finished system can give access to extensive functionality without appearing cumbersome or over-complicated to the user. This paper covers both the hardware and software design of the system, including how the camera core influenced the selection of peripheral hardware, and the MMI design process which incorporated user feedback at various stages.

  1. A report on NASA software engineering and Ada training requirements

    Science.gov (United States)

    Legrand, Sue; Freedman, Glenn B.; Svabek, L.

    1987-01-01

    NASA's software engineering and Ada skill base are assessed and information that may result in new models for software engineering, Ada training plans, and curricula are provided. A quantitative assessment which reflects the requirements for software engineering and Ada training across NASA is provided. A recommended implementation plan including a suggested curriculum with associated duration per course and suggested means of delivery is also provided. The distinction between education and training is made. Although it was directed to focus on NASA's need for the latter, the key relationships to software engineering education are also identified. A rationale and strategy for implementing a life cycle education and training program are detailed in support of improved software engineering practices and the transition to Ada.

  2. A Method for Software Requirement Volatility Analysis Using QFD

    Directory of Open Access Journals (Sweden)

    Yunarso Anang

    2016-10-01

    Full Text Available Changes of software requirements are inevitable during the development life cycle. Rather than avoiding the circumstance, it is easier to just accept it and find a way to anticipate those changes. This paper proposes a method to analyze the volatility of requirement by using the Quality Function Deployment (QFD method and the introduced degree of volatility. Customer requirements are deployed to software functions and subsequently to architectural design elements. And then, after determining the potential for changes of the design elements, the degree of volatility of the software requirements is calculated. In this paper the method is described using a flow diagram and illustrated using a simple example, and is evaluated using a case study.
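The paper's exact formula for the degree of volatility is not given in the abstract. A hedged sketch of the general idea only: requirements are deployed onto design elements via a QFD-style relationship matrix, and each requirement's volatility is the change potential of its design elements weighted by relationship strength (all values below are invented for illustration):

```python
# Hypothetical QFD deployment matrix: rows = requirements, cols = design
# elements, cell = relationship strength (0, 1, 3, 9 as in common QFD practice).
deploy = [
    [9, 3, 0],   # R1
    [0, 9, 1],   # R2
]
# Hypothetical change potential per design element, 0.0 (stable) .. 1.0 (volatile).
change_potential = [0.2, 0.8, 0.5]

def degree_of_volatility(row, potential):
    """Weight each element's change potential by its relationship strength."""
    total = sum(row)
    return sum(s * p for s, p in zip(row, potential)) / total if total else 0.0

dov = [degree_of_volatility(r, change_potential) for r in deploy]
# R2 deploys mostly onto the volatile element, so it scores higher than R1.
```

A requirement with a high score is one whose supporting design elements are likely to change, which is the signal the method uses to anticipate requirement changes.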

  3. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    Energy Technology Data Exchange (ETDEWEB)

    D. E. Shropshire; W. H. West

    2005-11-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and to Generation IV reactor development studies.

  4. Facilitating Software Architecting by Ranking Requirements based on their Impact on the Architecture Process

    NARCIS (Netherlands)

    Galster, Matthias; Eberlein, Armin; Sprinkle, J; Sterritt, R; Breitman, K

    2011-01-01

    Ranking software requirements helps decide what requirements to implement during a software development project, and when. Currently, requirements ranking techniques focus on resource constraints or stakeholder priorities and neglect the effect of requirements on the software architecture process.

  6. Solid Waste Information and Tracking System (SWITS) Software Requirements Specification

    Energy Technology Data Exchange (ETDEWEB)

    MAY, D.L.

    2000-03-22

    This document is the primary document establishing requirements for the Solid Waste Information and Tracking System (SWITS) as it is converted to a client-server architecture. The purpose is to provide the customer and the performing organizations with the requirements for SWITS in the new environment. This Software Requirements Specification (SRS) describes the system requirements for the SWITS project, and follows the PHMC Engineering Requirements, HNF-PRO-1819, and Computer Software Quality Assurance Requirements, HNF-PRO-309, policies. This SRS includes sections on general description, specific requirements, references, appendices, and an index. The SWITS system defined in this document stores information about the solid waste inventory on the Hanford site. Waste is tracked as it is generated, analyzed, shipped, stored, and treated. In addition to inventory reports, a number of reports for regulatory agencies are produced.

  7. A SYSTEMATIC LITERATURE REVIEW ABOUT SOFTWARE REQUIREMENTS ELICITATION

    Directory of Open Access Journals (Sweden)

    LENIS R. WONG

    2017-02-01

    Full Text Available Requirements elicitation is recognized as one of the most important activities in the software development process, as it has a direct impact on its success. Although there are many proposals for improving this task, issues remain to be solved. This paper aims to identify the current status of the latest research related to software requirements elicitation through a general framework for literature review, in order to answer the following research questions: Q1: What aspects have been covered by different proposals for requirements elicitation? Q2: What activities of the requirements elicitation process have been covered? Q3: What factors influence requirements elicitation, and how? A cross-analysis of the outcome was performed. One of the results showed that the requirements elicitation process needs improvement.

  8. Requirements Prioritization: Challenges and Techniques for Quality Software Development

    Directory of Open Access Journals (Sweden)

    Muhammad Abdullah Awais

    2016-04-01

    Full Text Available Every organization is aware of the consequences and importance of requirements for the development of a quality software product, whether local or global. The requirements engineering phase of development, with its focus on the prioritization of requirements, is under intensive research, because in any development methodology all requirements cannot be implemented at the same time; requirements are instead prioritized and implemented in phases, as scheduled, in incremental fashion. Numerous frameworks and practices have been devised, are in progress, or are being discovered day by day. With such a huge knowledge base and so much research available, it has always been confusing to decide which technique to follow to gain maximum results, and many projects fail because of the wrong choice in requirements prioritization: it is genuinely difficult to employ the right technique and framework at the right time. The problems do not end there; due to strict deadlines, systems are often best developed in parts by team members dispersed globally, with diverse methodologies and differences, and in this situation it becomes even more difficult to prioritize requirements. The main focus is on ETVX-based prioritization [1] for in-house development and on requirements prioritization for software developed globally by diverse team members [2]. This paper provides an overview of different prioritization techniques for software requirements, and a critical analysis of the ETVX-based model is presented to highlight issues and challenges in this proposed model of requirement prioritization [1]; an improved version of this model is presented, along with an analysis of requirements prioritization for software developed in a global environment [2].

  9. Managing Software Requirements Changes Based on Negotiation-Style Revision

    Institute of Scientific and Technical Information of China (English)

    Ke-Dian Mu; Weiru Liu; Zhi Jin; Jun Hong; David Bell

    2011-01-01

    For any proposed software project, when the software requirements specification has been established, requirements changes may result not only in a modification of the requirements specification but also in a series of modifications of all existing artifacts during the development. It is therefore necessary to provide effective and flexible management of requirements changes. In this paper, we present an approach to managing requirements changes based on Booth's negotiation-style framework for belief revision. Informally, we consider the current requirements specification as a belief set about the system-to-be. A request for a requirements change is viewed as new information about the same system-to-be. The process of executing the requirements change is then a process of revising beliefs about the system-to-be. We design a family of belief negotiation models appropriate for different processes of requirements revision, including the setting in which the request for a requirements change is fully accepted, the setting in which the current requirements specification is fully preserved, and that in which the current specification and the change request reach a compromise. In particular, the prioritization of requirements plays an important role in reaching an agreement in each belief negotiation model designed in this paper.
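
    Booth's framework is formal; purely as a loose, hypothetical illustration of the "compromise" setting described above, this sketch accepts a change request only where it outranks the existing requirements it conflicts with (requirement names, priorities, and the conflict relation are all invented):

```python
def revise(spec, change, conflicts):
    """Merge a change request into a prioritized spec (lower number = higher priority).

    spec, change: dict mapping requirement -> priority.
    conflicts: set of frozenset pairs of mutually inconsistent requirements.
    Keeps the higher-priority side of each conflict -- a crude stand-in for
    the compromise reached by belief negotiation.
    """
    merged = dict(spec)
    for req, prio in change.items():
        clash = [r for r in merged if frozenset((r, req)) in conflicts]
        if all(prio < merged[r] for r in clash):  # change outranks every conflict
            for r in clash:
                del merged[r]
            merged[req] = prio
        # otherwise the existing requirements are preserved and the request is dropped
    return merged

spec = {"store-data-locally": 1, "fast-startup": 2}
change = {"sync-to-cloud": 0}  # outranks the conflicting local-storage requirement
conflicts = {frozenset(("store-data-locally", "sync-to-cloud"))}
print(revise(spec, change, conflicts))
```

    The actual negotiation models are iterative and symmetric between the two belief sets; this one-pass rule only conveys the role prioritization plays in deciding which side yields.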

  10. Metric-based method of software requirements correctness improvement

    Directory of Open Access Journals (Sweden)

    Yaremchuk Svitlana

    2017-01-01

    Full Text Available This work highlights the most important principles of software reliability management (SRM). The SRM concept forms the basis for developing a method of improving requirements correctness. The method assumes that complicated requirements contain more actual and potential design faults/defects. It applies a new metric to evaluate requirements complexity, together with a double-sorting technique that evaluates the priority and complexity of each particular requirement. The method improves requirements correctness by enabling the identification of a higher number of defects with restricted resources. Practical application of the proposed method during requirements review yielded a measurable technical and economic effect.
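
    The metric itself is not given in the abstract; assuming each requirement carries a priority and a computed complexity score, the double-sorting step could be sketched as follows (names and scores are invented):

```python
# Hypothetical (requirement, priority, complexity) triples; higher numbers mean
# higher priority and higher complexity. Reviewing high-priority, high-complexity
# requirements first targets the items most likely to hide defects.
reqs = [("R1", 2, 7.5), ("R2", 3, 1.0), ("R3", 3, 6.2), ("R4", 1, 9.0)]

# Sort by priority first, then by complexity within equal priorities.
review_order = sorted(reqs, key=lambda r: (r[1], r[2]), reverse=True)
print([name for name, _, _ in review_order])
```

    Under a fixed review budget, truncating this ordered list is what concentrates the available effort on the most defect-prone requirements.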

  11. Software requirements flow-down and preliminary software design for the G-CLEF spectrograph

    Science.gov (United States)

    Evans, Ian N.; Budynkiewicz, Jamie A.; DePonte Evans, Janet; Miller, Joseph B.; Onyuksel, Cem; Paxson, Charles; Plummer, David A.

    2016-08-01

    The Giant Magellan Telescope (GMT)-Consortium Large Earth Finder (G-CLEF) is a fiber-fed, precision radial velocity (PRV) optical echelle spectrograph that will be the first light instrument on the GMT. The G-CLEF instrument device control subsystem (IDCS) provides software control of the instrument hardware, including the active feedback loops that are required to meet the G-CLEF PRV stability requirements. The IDCS is also tasked with providing operational support packages that include data reduction pipelines and proposal preparation tools. A formal but ultimately pragmatic approach is being used to establish a complete and correct set of requirements for both the G-CLEF device control and operational support packages. The device control packages must integrate tightly with the state-machine driven software and controls reference architecture designed by the GMT Organization. A model-based systems engineering methodology is being used to develop a preliminary design that meets these requirements. Through this process we have identified some lessons that have general applicability to the development of software for ground-based instrumentation. For example, tasking an individual with overall responsibility for science/software/hardware integration is a key step to ensuring effective integration between these elements. An operational concept document that includes detailed routine and non-routine operational sequences should be prepared in parallel with the hardware design process to tie together these elements and identify any gaps. Appropriate time-phasing of the hardware and software design phases is important, but revisions to driving requirements that impact software requirements and preliminary design are inevitable. Such revisions must be carefully managed to ensure efficient use of resources.

  12. VERTAF/Multi-Core: A SysML-Based Application Framework for Multi-Core Embedded Software Development

    Institute of Scientific and Technical Information of China (English)

    Chao-Sheng Lin; Chun-Hsien Lu; Shang-Wei Lin; Yean-Ru Chen; Pao-Ann Hsiung

    2011-01-01

    Multi-core processors are becoming prevalent rapidly in personal computing and embedded systems. Nevertheless, the programming environment for multi-core processor-based systems is still quite immature and lacks efficient tools. In this work, we present a new VERTAF/Multi-Core framework and show how software code can be automatically generated from SysML models of multi-core embedded systems. We illustrate how model-driven design based on SysML can be seamlessly integrated with Intel's threading building blocks (TBB) and the quantum framework (QF) middleware. We use a digital video recording system to illustrate the benefits of the framework. Our experiments show how SysML/QF/TBB help in making multi-core embedded system programming model-driven, easy, and efficient.

  13. Experience with Intel's Many Integrated Core Architecture in ATLAS Software

    CERN Document Server

    Fleischmann, S; The ATLAS collaboration; Lavrijsen, W; Neumann, M; Vitillo, R

    2013-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks. This should make it possible to develop for both throughput and latency devices using a single code base.

  14. Experience with Intel's Many Integrated Core Architecture in ATLAS Software

    CERN Document Server

    Fleischmann, S; The ATLAS collaboration; Lavrijsen, W; Neumann, M; Vitillo, R

    2014-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks. This should make it possible to develop for both throughput and latency devices using a single code base.

  15. Software requirements specification for the HAWK operating system

    Energy Technology Data Exchange (ETDEWEB)

    Holmes, V.P.; Harris, D.L.; Borgman, C.R.; Davidson, G.S.

    1989-03-01

    This document represents the original requirements specification for the HAWK operating system. HAWK is the operating system for the SANDAC V, a real-time embedded multiprocessor based on the Motorola 68020 microprocessor. When the effort to create the operating system was first undertaken, it was clear that a careful specification of the requirements would be vital. Unfortunately, there were few models to work from since requirement documents for operating systems of any kind are seldom published. The final form of the requirements used a functional organization adapted from the IEEE Guide to Software Requirements Specifications (ANSI/IEEE Std 830-1984). Hopefully, this document will provide a historical case study from which others can benefit when faced with similar circumstances. 2 refs., 4 figs.

  16. Software use cases to elicit the software requirements analysis within the ASTRI project

    Science.gov (United States)

    Conforti, Vito; Antolini, Elisa; Bonnoli, Giacomo; Bruno, Pietro; Bulgarelli, Andrea; Capalbi, Milvia; Fioretti, Valentina; Fugazza, Dino; Gardiol, Daniele; Grillo, Alessandro; Leto, Giuseppe; Lombardi, Saverio; Lucarelli, Fabrizio; Maccarone, Maria Concetta; Malaguti, Giuseppe; Pareschi, Giovanni; Russo, Federico; Sangiorgi, Pierluca; Schwarz, Joseph; Scuderi, Salvatore; Tanci, Claudio; Tosti, Gino; Trifoglio, Massimo; Vercellone, Stefano; Zanmar Sanchez, Ricardo

    2016-07-01

    The Italian National Institute for Astrophysics (INAF) is leading the Astrofisica con Specchi a Tecnologia Replicante Italiana (ASTRI) project whose main purpose is the realization of small size telescopes (SST) for the Cherenkov Telescope Array (CTA). The first goal of the ASTRI project has been the development and operation of an innovative end-to-end telescope prototype using a dual-mirror optical configuration (SST-2M) equipped with a camera based on silicon photo-multipliers and very fast read-out electronics. The ASTRI SST-2M prototype has been installed in Italy at the INAF "M.G. Fracastoro" Astronomical Station located at Serra La Nave, on Mount Etna, Sicily. This prototype will be used to test several mechanical, optical, control hardware and software solutions which will be used in the ASTRI mini-array, comprising nine telescopes proposed to be placed at the CTA southern site. The ASTRI mini-array is a collaborative and international effort led by INAF and carried out by Italy, Brazil and South-Africa. We present here the use cases, through UML (Unified Modeling Language) diagrams and text details, that describe the functional requirements of the software that will manage the ASTRI SST-2M prototype, and the lessons learned thanks to these activities. We intend to adopt the same approach for the Mini Array Software System that will manage the ASTRI miniarray operations. Use cases are of importance for the whole software life cycle; in particular they provide valuable support to the validation and verification activities. Following the iterative development approach, which breaks down the software development into smaller chunks, we have analysed the requirements, developed, and then tested the code in repeated cycles. The use case technique allowed us to formalize the problem through user stories that describe how the user procedurally interacts with the software system. Through the use cases we improved the communication among team members, fostered

  17. SMV model-based safety analysis of software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kwang Yong [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Seong, Poong Hyun [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)], E-mail: phseong@kaist.ac.kr

    2009-02-15

    Fault tree analysis (FTA) is one of the most frequently applied safety analysis techniques when developing safety-critical industrial systems such as software-based emergency shutdown systems of nuclear power plants and has been used for safety analysis of software requirements in the nuclear industry. However, the conventional method for safety analysis of software requirements has several problems in terms of correctness and efficiency; the fault tree generated from natural language specifications may contain flaws or errors while the manual work of safety verification is very labor-intensive and time-consuming. In this paper, we propose a new approach to resolve problems of the conventional method; we generate a fault tree from a symbolic model verifier (SMV) model, not from natural language specifications, and verify safety properties automatically, not manually, by a model checker SMV. To demonstrate the feasibility of this approach, we applied it to shutdown system 2 (SDS2) of Wolsong nuclear power plant (NPP). In spite of subtle ambiguities present in the approach, the results of this case study demonstrate its overall feasibility and effectiveness.

  18. The future of commodity computing and many-core versus the interests of HEP software

    CERN Document Server

    CERN. Geneva

    2012-01-01

    As the mainstream computing world has shifted from multi-core to many-core platforms, the situation for software developers has changed as well. With the numerous hardware and software options available, choices balancing programmability and performance are becoming a significant challenge. The expanding multiplicative dimensions of performance offer a growing number of possibilities that need to be assessed and addressed on several levels of abstraction. This paper reviews the major tradeoffs forced upon the software domain by the changing landscape of parallel technologies – hardware and software alike. Recent developments, paradigms and techniques are considered with respect to their impact on the rather traditional HEP programming models. Other considerations addressed include aspects of efficiency and reasonably achievable targets for the parallelization of large scale HEP workloads.

  19. The future of commodity computing and many-core versus the interests of HEP software

    CERN Document Server

    Jarp, Sverre; Nowak, Andrzej

    2012-01-01

    As the mainstream computing world has shifted from multi-core to many-core platforms, the situation for software developers has changed as well. With the numerous hardware and software options available, choices balancing programmability and performance are becoming a significant challenge. The expanding multiplicative dimensions of performance offer a growing number of possibilities that need to be assessed and addressed on several levels of abstraction. This paper reviews the major trade-offs forced upon the software domain by the changing landscape of parallel technologies - hardware and software alike. Recent developments, paradigms and techniques are considered with respect to their impact on the rather traditional HEP programming models. Other considerations addressed include aspects of efficiency and reasonably achievable targets for the parallelization of large scale HEP workloads.

  20. Optimal hardware/software co-synthesis for core-based SoC designs

    Institute of Scientific and Technical Information of China (English)

    Zhan Jinyu; Xiong Guangze

    2006-01-01

    A hardware/software co-synthesis method is presented for SoC designs consisting of both hardware IP cores and software components, based on a graph-theoretic formulation. Given a SoC integrating a set of functions and a set of performance factors, a core for each function is selected from a set of alternative IP cores and software components, and optimal partitions are found in a way that evenly balances the performance factors and ultimately reduces the overall cost, size, power consumption, and runtime of the core-based SoC. The algorithm formulates IP cores and components as corresponding mathematical models, presents a graph-theoretic model for finding the optimal partitions of a SoC design, and transforms the SoC hardware/software co-synthesis problem into finding optimal paths in a weighted, directed graph. Overcoming three main deficiencies of traditional methods, this method works automatically, evaluates more performance factors at the same time, and meets the particular characteristics of SoC designs. Finally, the approach is shown to be practical and effective through the partitioning of a practical system.
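
    The graph formulation above can be illustrated with a toy layered-graph search; the implementation names and costs below are invented, and real formulations weight several performance factors and add interaction terms between choices rather than a single purely additive cost:

```python
import heapq

# One layer per function; each alternative is (implementation, cost).
layers = [
    [("fft-ip-core", 5.0), ("fft-software", 2.0)],
    [("codec-ip-core", 3.0), ("codec-software", 6.0)],
]

def cheapest_path(layers):
    """Dijkstra over the layered choice graph: node = layer index, edge = one
    hardware/software alternative for that function."""
    heap = [(0.0, 0, [])]  # (cost so far, next layer, choices made)
    while heap:
        cost, i, picks = heapq.heappop(heap)
        if i == len(layers):
            return cost, picks
        for impl, c in layers[i]:
            heapq.heappush(heap, (cost + c, i + 1, picks + [impl]))

print(cheapest_path(layers))
```

    With a single additive cost this reduces to picking the cheapest alternative per function; the path formulation earns its keep once edge weights depend on neighbouring choices (e.g., bus contention between two hardware cores).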

  1. SWEPP assay system version 2.0 software requirements specification

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, S.D.; East, L.V.; Marwil, E.S.; Ferguson, J.J.

    1996-06-01

    The INEL Stored Waste Examination Pilot Plant (SWEPP) operations staff use nondestructive analysis methods to characterize the radiological contents of contact-handled radioactive waste containers. Containers of waste from Rocky Flats Environmental Technology Site and other DOE sites are currently stored at SWEPP. Before these containers can be shipped to WIPP, SWEPP must verify compliance with storage, shipping, and disposal requirements. One part of the SWEPP program measures neutron emissions from the containers and estimates the mass of Pu and other transuranic isotopes present. The code NEUT2 was originally used to perform data acquisition and reduction; the SWEPP Assay System (SAS) code replaced NEUT2 in early 1994. This document specifies the requirements for the SAS software as installed at INEL and was written to comply with RWMC (INEL Radioactive Waste Management Complex) quality requirements.

  2. Work in Progress: Malleable Software Pipelines for Efficient Many-core System Utilization

    OpenAIRE

    Jahn, Janmartin; Kobbe, Sebastian; Pagani, Santiago; Chen, Jian-Jia; Henkel, Jörg

    2012-01-01

    International audience; This paper details our current research project on the efficient utilization of many-core systems by utilizing applications based on a novel kind of software pipelines. These pipelines form malleable applications that can change their degree of parallelism at runtime. This allows not only for a well-balanced load, but also for an efficient distribution of the cores to the individual, competing applications to maximize the overall system performance. We are convinced th...

  3. Core Community Specifications for Electron Microprobe Operating Systems: Software, Quality Control, and Data Management Issues

    Science.gov (United States)

    Fournelle, John; Carpenter, Paul

    2006-01-01

    Modern electron microprobe systems have become increasingly sophisticated. These systems utilize either UNIX or PC computer systems for measurement, automation, and data reduction. These systems have undergone major improvements in processing, storage, display, and communications, due to increased capabilities of hardware and software. Instrument specifications are typically utilized at the time of purchase and concentrate on hardware performance. The microanalysis community includes analysts, researchers, software developers, and manufacturers, who could benefit from exchange of ideas and the ultimate development of core community specifications (CCS) for hardware and software components of microprobe instrumentation and operating systems.

  4. Exploring the impact of socio-technical core-periphery structures in open source software development

    NARCIS (Netherlands)

    Amrit, Chintan; Hillegersberg, van Jos

    2010-01-01

    In this paper we apply the social network concept of core-periphery structure to the socio-technical structure of a software development team. We propose a socio-technical pattern that can be used to locate emerging coordination problems in Open Source projects. With the help of our tool and method

  5. A MODEL FOR ALIGNING SOFTWARE PROJECTS REQUIREMENTS WITH PROJECT TEAM MEMBERS REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    Robert Hans

    2013-02-01

    Full Text Available The fast-paced, dynamic environment within which information and communication technology (ICT) projects are run, as well as ICT professionals' constantly changing requirements, present a challenge for project managers in terms of aligning projects' requirements with project team members' requirements. This research paper argues that if projects' requirements are properly aligned with team members' requirements, the result is a balanced decision approach; moreover, such an alignment realizes employees' needs while also meeting the project's needs. The paper presents a Project's requirements and project Team members' requirements (PrTr) alignment model and argues that a balanced decision which meets both software project requirements and team member requirements can be achieved through the application of the PrTr alignment model.

  6. The development of test software for the inadequate core cooling monitoring system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Soon Sung

    1996-06-01

    The test software, including the ICCMS simulator necessary for dynamic testing of the ICCMS software in PWRs, has been developed. The developed dynamic test software consists of the module test simulator, the integration test simulator, and the test result analyser. The simulator was programmed in C according to the same algorithm requirements as the FORTRAN version of the ICCMS software, and also serves the Factory Acceptance Test (FAT). The simulator can furthermore be used as a training tool for reactor operators and as a system development tool for performance improvement. (author). 4 tabs., 8 figs., 11 refs.

  7. Computer software requirements specification for the world model light duty utility arm system

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, J.E.

    1996-02-01

    This Computer Software Requirements Specification defines the software requirements for the world model of the Light Duty Utility Arm (LDUA) System. It is intended to be used to guide the design of the application software, to be a basis for assessing the application software design, and to establish what is to be tested in the finished application software product. (The LDUA deploys end effectors into underground storage tanks by means of a robotic arm on the end of a telescoping mast.)

  8. Code forking in open-source software: a requirements perspective

    CERN Document Server

    Ernst, Neil A; Mylopoulos, John

    2010-01-01

    To fork a project is to copy the existing code base and move in a direction different than that of the erstwhile project leadership. Forking provides a rapid way to address new requirements by adapting an existing solution. However, it can also create a plethora of similar tools, and fragment the developer community. Hence, it is not always clear whether forking is the right strategy. In this paper, we describe a mixed-methods exploratory case study that investigated the process of forking a project. The study concerned the forking of an open-source tool for managing software projects, Trac. Trac was forked to address differing requirements in an academic setting. The paper makes two contributions to our understanding of code forking. First, our exploratory study generated several theories about code forking in open source projects, for further research. Second, we investigated one of these theories in depth, via a quantitative study. We conjectured that the features of the OSS forking process would allow new...

  9. ESAIR: A Behavior-Based Robotic Software Architecture on Multi-Core Processor Platforms

    Directory of Open Access Journals (Sweden)

    Chin-Yuan Tseng

    2013-03-01

    Full Text Available This paper introduces an Embedded Software Architecture for Intelligent Robot systems (ESAIR) that addresses the issues of parallel thread executions on multi-core processor platforms. ESAIR provides a thread scheduling interface to improve the execution performance of a robot system by assigning a dedicated core to a running thread on the fly and dynamically rescheduling the priority of the thread. In the paper, we describe the object-oriented design and the control functions of ESAIR. The modular design of ESAIR helps improve the software quality, reliability and scalability in research and real practice. We prove the improvement by realizing ESAIR on an autonomous robot, named AVATAR. AVATAR implements various human-robot interactions, including speech recognition, human following, face recognition, speaker identification, etc. With the support of ESAIR, AVATAR can integrate a comprehensive set of behaviors and peripherals with better resource utilization.

  10. Development of requirements tracking and verification technology for the NPP software

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Lee, Jang Soo; Song, Soon Ja; Lee, Dong Young; Kwon, Kee Choon

    1998-12-30

    We searched and analyzed requirements engineering technology in the aerospace and defense, medical, and nuclear industries; summarized the status of tools for software design and requirements management; analyzed the software design methodology for NPP safety software; developed the design requirements for the requirements tracking and verification system; and developed the background technology for designing a prototype requirements tracking and verification tool.

  11. A Comparative Study of Software Requirement, Elicitation, Prioritization and Decision Making.

    Directory of Open Access Journals (Sweden)

    Pradeep Kumar. G

    2016-04-01

    Full Text Available The failure of many software systems is mainly due to a lack of requirements engineering, in which software requirements play a vital role in the field of software engineering. The main tasks of requirements engineering are eliciting requirements from the customer and prioritizing those requirements so that decisions can be made in the software design. Prioritization of software requirements is very useful for assigning priority within the set of requirements. Requirement prioritization is particularly important when there are strict constraints on schedule and resources: the software engineer must then decide which requirements to defer and which to prioritize among those added to the project, in order to make it successful. This paper presents a framework for comparing various techniques and proposes the most competent method among them.

  12. Software-Defined Radio FPGA Cores: Building towards a Domain-Specific Language

    Directory of Open Access Journals (Sweden)

    Lekhobola Tsoeunyane

    2017-01-01

    Full Text Available This paper reports on the design and implementation of an open-source library of parameterizable and reusable Hardware Description Language (HDL) Intellectual Property (IP) cores designed for the development of Software-Defined Radio (SDR) applications that are deployed on FPGA-based reconfigurable computing platforms. The library comprises a set of cores that were chosen, together with their parameters and interfacing schemas, based on recommendations from industry and academic SDR experts. The operation of the SDR cores is first validated and then benchmarked against two other core libraries of a similar type to show that our cores do not take many more logic elements than existing cores and that they support a comparable maximum clock speed. Finally, we propose our design for a Domain-Specific Language (DSL) and supporting tool-flow, which we are in the process of building using our SDR library and the Delite DSL framework. We intend to take this DSL and supporting framework further to provide a rapid prototyping system for SDR application development to programmers not experienced in HDL coding. We conclude with a summary of the main characteristics of our SDR library and reflect on how our DSL tool-flow could assist other developers working in the SDR field.

  13. Toward an Agile Approach to Managing the Effect of Requirements on Software Architecture during Global Software Development

    Directory of Open Access Journals (Sweden)

    Abdulaziz Alsahli

    2016-01-01

    Full Text Available Requirement change management (RCM) is a critical activity during software development, because poor RCM results in defects and thereby in software failure. To achieve RCM, efficient impact analysis is mandatory. A common repository is a good approach to maintain changed requirements, enabling reuse and reducing effort. Thus, a better approach is needed to tailor knowledge for better change management of requirements and architecture during global software development (GSD). The objective of this research is to introduce an innovative approach for handling requirements and architecture changes simultaneously during global software development. The approach makes use of Case-Based Reasoning (CBR) and agile practices. Agile practices make our approach iterative, whereas CBR stores requirements and makes them reusable. Twin Peaks is our base model, meaning that requirements and architecture are handled simultaneously. For this research, grounded theory has been applied, and interviews with domain experts were conducted. Interview and literature transcripts formed the basis of data collection in grounded theory. Physical saturation of the theory has been achieved through a published case study and a developed tool. Expert reviews and statistical analysis have been used for evaluation. The proposed approach resulted in effective change management of requirements and architecture simultaneously during global software development.
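
    The abstract does not describe the CBR retrieval step in detail; a crude, hypothetical stand-in that retrieves the resolution of the most similar past change request by token overlap might look like:

```python
def jaccard(a, b):
    """Token-overlap similarity between two change-request descriptions."""
    a, b = set(a.lower().split()), set(b.lower().split())
    return len(a & b) / len(a | b)

# Invented case base: (past change request, how it was resolved).
case_base = [
    ("add oauth login to web client", "extend auth module; touches session layer"),
    ("change database from mysql to postgres", "rework persistence adapters"),
]

def retrieve(request):
    """Return the stored resolution of the most similar past change request."""
    return max(case_base, key=lambda case: jaccard(case[0], request))[1]

print(retrieve("add saml login to web client"))
```

    A production CBR system would also adapt the retrieved case to the new context and retain the outcome, closing the retrieve-reuse-revise-retain cycle.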

  14. Prediction of Software Requirements Stability Based on Complexity Point Measurement Using Multi-Criteria Fuzzy Approach

    Directory of Open Access Journals (Sweden)

    D. Francis Xavier Christopher

    2012-12-01

    Full Text Available Many software projects fail due to unstable requirements and a lack of efficient management of requirements changes. The Software Requirements Stability Index (RSI) metric helps to evaluate the overall stability of requirements and also keeps track of the project status: the higher the stability, the fewer changes tend to propagate. The existing system uses Function Point modeling for measuring requirements stability. However, its main drawback is that the complexity of non-functional requirements is not measured, even though non-functional factors play a vital role in assessing requirements stability. Numerous measurement methods have been proposed for measuring software complexity. This paper proposes a multi-criteria fuzzy-based approach for finding the complexity weight based on requirement complexity attributes such as functional requirement complexity, non-functional requirement complexity, input/output complexity, and interface and file complexity. Based on the complexity weight, the paper computes the software complexity point and then predicts software requirements stability based on changes in the software complexity point. The advantage of this model is that it estimates software complexity early, which in turn predicts software requirements stability during the software development life cycle.
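
    Neither the fuzzy weighting nor the RSI formula is reproduced in the abstract; the sketch below uses invented fixed weights in place of the fuzzy multi-criteria step and one commonly cited form of the stability index:

```python
def complexity_point(fr, nfr, io, interface_file, weights=(0.3, 0.3, 0.2, 0.2)):
    """Weighted sum over the four complexity attributes named in the abstract.
    The weights stand in for the fuzzy multi-criteria step and are invented."""
    return sum(w * x for w, x in zip(weights, (fr, nfr, io, interface_file)))

def rsi(total, added, deleted, modified):
    """Requirements Stability Index: 1 means no churn; lower means less stable."""
    return (total - (added + deleted + modified)) / total

print(round(complexity_point(8, 6, 4, 2), 2))
print(rsi(total=50, added=3, deleted=1, modified=6))
```

    In the paper's model the weights come from a fuzzy evaluation of each attribute rather than fixed constants, and stability is tracked as the complexity points change across iterations.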

  15. Evaluating the scalability of HEP software and multi-core hardware

    CERN Document Server

    Jarp, S; Leduc, J; Nowak, A

    2011-01-01

    As researchers have reached the practical limits of processor performance improvements by frequency scaling, it is clear that the future of computing lies in the effective utilization of parallel and multi-core architectures. Since this significant change in computing is well underway, it is vital for HEP programmers to understand the scalability of their software on modern hardware and the opportunities for potential improvements. This work aims to quantify the benefit of new mainstream architectures to the HEP community through practical benchmarking on recent hardware solutions, including the usage of parallelized HEP applications.
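
    Scaling measurements of this kind are typically read against Amdahl's law; a quick sketch of the speedup bound for a workload with a given serial fraction (the 5% figure is illustrative, not from the paper):

```python
def amdahl_speedup(serial_fraction, cores):
    """Upper bound on speedup when only (1 - serial_fraction) of the work parallelizes."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Even a small serial fraction quickly caps the benefit of adding cores.
for cores in (2, 8, 32):
    print(cores, round(amdahl_speedup(0.05, cores), 2))
```

    Benchmarking parallelized HEP applications on real hardware, as the paper does, reveals where memory bandwidth and synchronization push measured scaling below this idealized bound.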

  16. Enhancing requirements engineering for patient registry software systems with evidence-based components.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2017-07-01

    Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches in the selection of software as well as the construction of proprietary systems are needed. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) A systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates and standards show a broad consensus but differences in issues regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. 

  17. The ERP System for an Effective Management of a Small Software Company – Requirements Analysis

    Directory of Open Access Journals (Sweden)

    Jan Mittner

    2014-01-01

    Full Text Available As a questionnaire survey found, a significant proportion of small software companies are not satisfied with the way their company processes are supported by software systems. To change this situation, it is first necessary to specify the requirements for such software systems in small software companies. Based on an analysis of the literature and the market, together with the authors' own experience, a first version of the ERP system requirements specification for small software companies was framed and subsequently validated by interviewing executives of the target group companies.

  18. Choropleth Mapping on Personal Computers: Software Sources and Hardware Requirements.

    Science.gov (United States)

    Lewis, Lawrence T.

    1986-01-01

    Describes the hardware and some of the choropleth mapping software available for the IBM-PC, PC compatible and Apple II microcomputers. Reviewed are: Micromap II, Documap, Desktop Information Display System (DIDS), Multimap, Execuvision, Iris Gis, Mapmaker, PC Map, Statmap, and Atlas Map. Vendors' addresses are provided. (JDH)

  19. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung; Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of); Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Software in PLCs and FPGAs used to develop I and C systems should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests an analysis technique for software-affected hazards and states that software hazard analysis should be performed across the phases of the software life cycle, such as requirements analysis, design, detailed design, and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (hazard and operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430, and it is a useful technique for applying guide phrases; it is sometimes used to analyze the safety of software. The analysis method of NUREG/CR-6430 has been used for PLC software development in a Korean nuclear power plant: appropriate guide phrases and analysis processes were selected for efficient application, and these studies identified NUREG/CR-6430 as providing applicable methods for software hazard analysis. We perform software hazard analysis of an FPGA software requirements specification with two approaches, NUREG/CR-6430 and HAZOP with general guide words (GW), and carry out a comparative analysis of them. The NUREG/CR-6430 approach has several pros and cons compared with the HAZOP approach using general guide words, and it is applicable enough to analyze the software requirements specification of an FPGA.

  20. A Study of the Speedups and Competitiveness of FPGA Soft Processor Cores using Dynamic Hardware/Software Partitioning

    CERN Document Server

    Lysecky, Roman

    2011-01-01

    Field programmable gate arrays (FPGAs) provide designers with the ability to quickly create hardware circuits. Increases in FPGA configurable logic capacity and decreasing FPGA costs have enabled designers to more readily incorporate FPGAs in their designs. FPGA vendors have begun providing configurable soft processor cores that can be synthesized onto their FPGA products. While FPGAs with soft processor cores provide designers with increased flexibility, such processors typically have degraded performance and energy consumption compared to hard-core processors. Previously, we proposed warp processing, a technique capable of optimizing a software application by dynamically and transparently re-implementing critical software kernels as custom circuits in on-chip configurable logic. In this paper, we study the potential of a MicroBlaze soft-core based warp processing system to eliminate the performance and energy overhead of a soft-core processor compared to a hard-core processor. We demonstrate that the soft-c...

  1. A Methodology to Evaluate Object oriented Software Systems Using Change Requirement Traceability Based on Impact Analysis

    Directory of Open Access Journals (Sweden)

    Sunil T. D

    2014-06-01

    Full Text Available It is well known that software maintenance plays a major role in the software development life cycle. As object-oriented programming has become the standard, it is very important to understand the problems of maintaining object-oriented software systems. This paper aims at evaluating object-oriented software systems through a change requirement traceability-based impact analysis methodology for non-functional requirements using functional requirements. The major issues relate to change impact algorithms and inheritance of functionality.

  2. Software Licenses: DOD’s Plan to Collect Inventory Data Meets Statutory Requirements

    Science.gov (United States)

    2014-07-01

    Software Licenses: DOD’s Plan to Collect Inventory Data Meets Statutory Requirements. Report to congressional committees; highlights of GAO-14-625, July 2014.

  3. Engineering Safety- and Security-Related Requirements for Software-Intensive Systems

    Science.gov (United States)

    2016-06-30

    Engineering Safety- and Security-Related Requirements for Software-Intensive Systems. ICCBSS 2007 conference tutorial, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213. Donald Firesmith, 27 February 2007.

  4. A Study on the Quantitative Assessment Method of Software Requirement Documents Using Software Engineering Measures and Bayesian Belief Networks

    Energy Technology Data Exchange (ETDEWEB)

    Eom, Heung Seop; Kang, Hyun Gook; Park, Ki Hong; Kwon, Kee Choon; Chang, Seung Cheol [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2005-07-01

    One of the major challenges in using digital systems in an NPP is the reliability estimation of the safety-critical software embedded in the digital safety systems. Precise quantitative assessment of the reliability of safety-critical software is nearly impossible, since many of the aspects to be considered are qualitative in nature and not directly measurable, yet they have to be estimated for practical use. Therefore, expert judgment plays an important role in estimating the reliability of software embedded in safety-critical systems in practice, because experts can deal with all the diverse evidence relevant to reliability and can perform an inference based on that evidence. But, in general, the experts' way of combining the diverse evidence and performing an inference is usually informal and qualitative, which is hard to discuss and will eventually lead to a debate about the conclusion. We have been carrying out research on a quantitative assessment of the reliability of safety-critical software using Bayesian belief networks (BBN). BBN has proven to be a useful modeling formalism because a user can represent a complex set of events and relationships in a fashion that can easily be interpreted by others. In previous work we assessed a software requirements specification of a reactor protection system using our BBN-based assessment model. The BBN model mainly employed experts' subjective probabilities as inputs. In the process of assessing the software requirements documents we found that the BBN model was in large part excessively dependent on experts' subjective judgments. Therefore, to overcome this weakness of our methodology, we incorporated conventional software engineering measures into the BBN model, as shown in this paper. The quantitative relationship between conventional software measures and software reliability was not well identified in the past. Then recently there appeared a few
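
    The core mechanism the record describes, combining diverse pieces of evidence into a reliability estimate via Bayesian updating, can be sketched as follows. All node names and probabilities here are invented for illustration; the paper's actual BBN model and software measures are far richer.

```python
# Minimal sketch of combining independent evidence about software reliability
# with Bayes' rule. The evidence names and likelihoods are hypothetical, not
# taken from the paper's BBN model.

def posterior_reliable(p_prior, likelihoods, evidence):
    """P(reliable | evidence), assuming conditionally independent evidence."""
    num = p_prior          # running product for the "reliable" hypothesis
    den = 1.0 - p_prior    # running product for the "not reliable" hypothesis
    for name, observed in evidence.items():
        p_given_r, p_given_not_r = likelihoods[name]
        if observed:
            num *= p_given_r
            den *= p_given_not_r
        else:
            num *= 1.0 - p_given_r
            den *= 1.0 - p_given_not_r
    return num / (num + den)

# Hypothetical likelihoods: P(finding | reliable), P(finding | not reliable).
likelihoods = {
    "spec_review_passed": (0.9, 0.4),
    "low_defect_density": (0.8, 0.3),
}
p = posterior_reliable(0.5, likelihoods,
                       {"spec_review_passed": True, "low_defect_density": True})
print(round(p, 3))  # → 0.857
```

    Feeding conventional software measures into such a network, as the record proposes, amounts to replacing the subjective likelihood entries with ones calibrated from measured data.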

  5. Views on Software Engineering from the Twin Peaks of Requirements and Architecture

    NARCIS (Netherlands)

    Galster, Matthias; Mirakhorli, Mehdi; Cleland-Huang, Jane; Burge, Janet E.; Franch, Xavier; Roshandel, Roshanak; Avgeriou, Paris

    2013-01-01

    The disciplines of requirements engineering (RE) and software architecture (SA) are fundamental to the success of software projects. Even though RE and SA are often considered in isolation, drawing a line between RE and SA is neither feasible nor reasonable as requirements and architectural design i

  6. Large area crop inventory experiment crop assessment subsystem software requirements document

    Science.gov (United States)

    1975-01-01

    The functional data processing requirements are described for the Crop Assessment Subsystem of the Large Area Crop Inventory Experiment. These requirements are used as a guide for software development and implementation.

  7. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper, a prototype of a Requirement Tracking and Verification System (RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking and verification of the software requirements listed in the documentation of the DCS. An analysis of the DCS software design procedures and interfaces with documents was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)
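
    The bookkeeping at the heart of such a tool, mapping each requirement to downstream design elements and flagging gaps, can be sketched in a few lines. The identifiers and fields below are hypothetical, not the KAERI prototype's data model.

```python
# Illustrative traceability bookkeeping in the spirit of an RTVS.
# Requirement IDs, design-document IDs and fields are invented.

requirements = {
    "SRS-001": {"traces_to": ["DD-10"], "verified": True},
    "SRS-002": {"traces_to": [], "verified": False},   # orphan: no design trace
    "SRS-003": {"traces_to": ["DD-11", "DD-12"], "verified": False},
}

def untraced(reqs):
    """Requirements with no downstream design element."""
    return sorted(r for r, info in reqs.items() if not info["traces_to"])

def unverified(reqs):
    """Traced requirements still awaiting verification."""
    return sorted(r for r, info in reqs.items()
                  if info["traces_to"] and not info["verified"])

print(untraced(requirements))    # → ['SRS-002']
print(unverified(requirements))  # → ['SRS-003']
```

    Tracking and verification then reduce to keeping these two report lists empty as the design documents evolve.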

  8. STUDY THE IMPACT OF REQUIREMENTS MANAGEMENT CHARACTERISTICS IN GLOBAL SOFTWARE DEVELOPMENT PROJECTS: AN ONTOLOGY BASED APPROACH

    Directory of Open Access Journals (Sweden)

    S. Arun Kumar

    2011-11-01

    Full Text Available Requirements management is one of the challenging and key tasks in the development of software products in a distributed software development environment. One of the key reasons found in our literature survey for the failure of software projects is poor project management and requirements management activity. The main aims of this paper are: 1. to formulate a framework for successful and efficient requirements management for global software development (GSD) projects; 2. to design a mixed organization structure combining both traditional and agile approaches for global software development projects; 3. to apply ontology-based knowledge management systems to both approaches, in order to address requirements issues such as missing or inconsistent requirements, communication and knowledge management issues, and to improve project management activities in a global software development environment; 4. to propose requirements management metrics to measure and manage the software process during the development of information systems. The major contribution of this paper is to analyze the requirements management issues and challenges associated with global software development projects. Two hypotheses have been formulated and tested through statistical techniques such as correlation and regression analysis.
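
    The statistical step the record mentions, testing hypotheses about requirements-management metrics with correlation and regression, can be sketched as below. The metric values are invented for illustration; the paper's actual metrics and data differ.

```python
# Sketch of the correlation/regression analysis applied to hypothetical
# requirements-management metrics (data invented for illustration).
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def linreg(xs, ys):
    """Least-squares slope and intercept for y = a*x + b."""
    a = pearson(xs, ys) * statistics.stdev(ys) / statistics.stdev(xs)
    b = statistics.mean(ys) - a * statistics.mean(xs)
    return a, b

# Hypothetical metric pair: requirement inconsistencies found per iteration
# vs. rework hours spent resolving them.
inconsistencies = [2, 4, 5, 7, 9]
rework_hours = [5, 9, 11, 15, 19]
print(round(pearson(inconsistencies, rework_hours), 3))  # → 1.0
print(linreg(inconsistencies, rework_hours))             # → (2.0, 1.0)
```

    A strong correlation like this would support a hypothesis that requirements inconsistency drives rework cost, which is the kind of relationship the paper validates statistically.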

  9. REAS: An Interactive Semi-Automated System for Software Requirements Elicitation Assistance

    Directory of Open Access Journals (Sweden)

    Hanan Hamed Elazhary

    2010-05-01

    Full Text Available Faulty requirements specifications lead to developing a faulty software system. This may require repeating the whole software engineering cycle, wasting time and money. This paper presents an interactive semi-automated system that is a compromise between two approaches. The first tries to avoid the introduction of imprecision while the software requirements are being written. The other attempts to detect and possibly correct many types of imprecision after the software requirements are written. This is achieved by imposing the use of a good writing style and by interactively emulating a conversation between the requirements engineer and the user. This helps free the requirements engineer from such a systematic task, helps in processing many ill-structured statements, and helps maintain consistency in the used terminology. Explanations produced by the system help in detecting and correcting any missed imprecision. The proposed techniques are easy enough to be used by non-technical stakeholders in different domains.

  10. Extension of the AMBER molecular dynamics software to Intel's Many Integrated Core (MIC) architecture

    Science.gov (United States)

    Needham, Perri J.; Bhuiyan, Ashraf; Walker, Ross C.

    2016-04-01

    We present an implementation of explicit solvent particle mesh Ewald (PME) classical molecular dynamics (MD) within the PMEMD molecular dynamics engine, that forms part of the AMBER v14 MD software package, that makes use of Intel Xeon Phi coprocessors by offloading portions of the PME direct summation and neighbor list build to the coprocessor. We refer to this implementation as pmemd MIC offload and in this paper present the technical details of the algorithm, including basic models for MPI and OpenMP configuration, and analyze the resultant performance. The algorithm provides the best performance improvement for large systems (>400,000 atoms), achieving a ∼35% performance improvement for satellite tobacco mosaic virus (1,067,095 atoms) when 2 Intel E5-2697 v2 processors (2 ×12 cores, 30M cache, 2.7 GHz) are coupled to an Intel Xeon Phi coprocessor (Model 7120P-1.238/1.333 GHz, 61 cores). The implementation utilizes a two-fold decomposition strategy: spatial decomposition using an MPI library and thread-based decomposition using OpenMP. We also present compiler optimization settings that improve the performance on Intel Xeon processors, while retaining simulation accuracy.

  11. AthenaMT: Upgrading the ATLAS Software Framework for the Many-Core World with Multi-Threading

    CERN Document Server

    Leggett, Charles; The ATLAS collaboration

    2017-01-01

    ATLAS's current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single threaded design has been recognized for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run2. After concluding a rigorous requirements phase, where many design components were examined in detail, ATLAS has begun the migration to a new data-flow driven, multi-threaded framework, which enables the simultaneous processing of singleton, thread unsafe legacy Algorithms, cloned Algorithms that execute concurrently in their own threads with different Event contexts, and fully re-entrant, thread safe Algorithms. In this paper we will report on the process of modifying the framework to safely process multiple concurrent events in different threads, which entails significant changes in the underlying...
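
    The scheduling idea described here, running re-entrant algorithms concurrently across events while serializing thread-unsafe legacy algorithms, can be illustrated with a toy sketch. This is not ATLAS/Gaudi code; all names are invented and Python threads stand in for the framework's scheduler.

```python
# Toy sketch of concurrent event processing: a thread-unsafe legacy algorithm
# is guarded by a lock, while a re-entrant algorithm runs freely per event.
import threading
from concurrent.futures import ThreadPoolExecutor

legacy_lock = threading.Lock()
legacy_log = []  # shared mutable state the legacy algorithm touches

def legacy_algorithm(event_id):
    """Thread-unsafe: must never run for two events at once."""
    with legacy_lock:
        legacy_log.append(event_id)
        return event_id * 2

def reentrant_algorithm(event_id):
    """Thread-safe: touches only its own event context."""
    return event_id + 100

def process_event(event_id):
    # Each event carries its own context; only the legacy step serializes.
    return legacy_algorithm(event_id), reentrant_algorithm(event_id)

with ThreadPoolExecutor(max_workers=4) as pool:
    results = sorted(pool.map(process_event, range(8)))
print(results[0])  # → (0, 100)
```

    In the real framework the gain comes from the fact that only the legacy algorithms pay the serialization cost, while cloned and re-entrant algorithms keep all cores busy across concurrent events.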

  12. AthenaMT: Upgrading the ATLAS Software Framework for the Many-Core World with Multi-Threading

    CERN Document Server

    Leggett, Charles; The ATLAS collaboration; Bold, Tomasz; Calafiura, Paolo; Farrell, Steven; Malon, David; Ritsch, Elmar; Stewart, Graeme; Snyder, Scott; Tsulaia, Vakhtang; Wynne, Benjamin; van Gemmeren, Peter

    2016-01-01

    ATLAS's current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single threaded design has been recognised for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run2. After concluding a rigorous requirements phase, where many design components were examined in detail, ATLAS has begun the migration to a new data-flow driven, multi-threaded framework, which enables the simultaneous processing of singleton, thread unsafe legacy Algorithms, cloned Algorithms that execute concurrently in their own threads with different Event contexts, and fully re-entrant, thread safe Algorithms. In this paper we will report on the process of modifying the framework to safely process multiple concurrent events in different threads, which entails significant changes in the underlying...

  13. Questioning the Role of Requirements Engineering in the Causes of Safety-Critical Software Failures

    Science.gov (United States)

    Johnson, C. W.; Holloway, C. M.

    2006-01-01

    Many software failures stem from inadequate requirements engineering. This view has been supported both by detailed accident investigations and by a number of empirical studies; however, such investigations can be misleading. It is often difficult to distinguish between failures in requirements engineering and problems elsewhere in the software development lifecycle. Further pitfalls arise from the assumption that inadequate requirements engineering is a cause of all software related accidents for which the system fails to meet its requirements. This paper identifies some of the problems that have arisen from an undue focus on the role of requirements engineering in the causes of major accidents. The intention is to provoke further debate within the emerging field of forensic software engineering.

  14. The Automation of Government Publications: Functional Requirements and Selected Software Systems for Serials Controls.

    Science.gov (United States)

    Stephenson, Mary Sue; Purcell, Gary R.

    1985-01-01

    Describes computer-based software and network systems for application to serials and government publications. General and specific functional requirements (hardware, software, file structure) are discussed, and descriptive information about commercially available serials control systems and a list of distributors are provided. (CLB)

  15. Towards an Early Software Effort Estimation Based on Functional and Non-Functional Requirements

    NARCIS (Netherlands)

    Kassab, M.; Daneva, Maia; Ormanjieva, Olga; Abran, A.; Braungarten, R.; Dumke, R.; Cuadrado-Gallego, J.; Brunekreef, J.

    2009-01-01

    The increased awareness of the non-functional requirements as a key to software project and product success makes explicit the need to include them in any software project effort estimation activity. However, the existing approaches to defining size-based effort relationships still pay insufficient

  16. Investigation of Adherence Degree of Agile Requirements Engineering Practices in Non-Agile Software Development Organizations

    Directory of Open Access Journals (Sweden)

    Mennatallah H. Ibrahim

    2015-01-01

    Full Text Available Requirements are critical for the success of software projects. Requirements are practically difficult to produce, as the hardest part of building a software system is deciding what the system should do. Moreover, requirements errors are expensive to fix in the later phases of the software development life cycle. The rapidly changing business environment severely challenges traditional Requirements Engineering (RE) practices. Most software development organizations work in such a dynamic environment; as a result, whether they are aware of it or not, agile methodologies are adopted in various phases of their software development cycles. The aim of this paper is to investigate the adherence degree of agile RE practices in various software development organizations that classify themselves as adopting traditional (i.e. non-agile) software development methodologies. An approach is proposed for achieving this aim and is applied to five different projects from four different organizations. The results show that even non-agile software development organizations apply agile RE practices, with differing degrees of adherence.

  17. The astrometric core solution for the Gaia mission. Overview of models, algorithms and software implementation

    CERN Document Server

    Lindegren, Lennart; Hobbs, David; O'Mullane, William; Bastian, Ulrich; Hernández, José

    2011-01-01

    The Gaia satellite will observe about one billion stars and other point-like sources. The astrometric core solution will determine the astrometric parameters (position, parallax, and proper motion) for a subset of these sources, using a global solution approach which must also include a large number of parameters for the satellite attitude and optical instrument. The accurate and efficient implementation of this solution is an extremely demanding task, but crucial for the outcome of the mission. We provide a comprehensive overview of the mathematical and physical models applicable to this solution, as well as its numerical and algorithmic framework. The astrometric core solution is a simultaneous least-squares estimation of about half a billion parameters, including the astrometric parameters for some 100 million well-behaved so-called primary sources. The global nature of the solution requires an iterative approach, which can be broken down into a small number of distinct processing blocks (source, attitude,...

  18. GRAPES: a software for parallel searching on biological graphs targeting multi-core architectures.

    Directory of Open Access Journals (Sweden)

    Rosalba Giugno

    Full Text Available Biological applications, from genomics to ecology, deal with graphs that represent the structure of interactions. Analyzing such data requires searching for subgraphs in collections of graphs, a computationally expensive task. Even though multicore architectures, from commodity computers to more advanced symmetric multiprocessing (SMP) systems, offer scalable computing power, currently published software implementations for indexing and graph matching are fundamentally sequential. As a consequence, such software implementations (i) do not fully exploit available parallel computing power and (ii) do not scale with respect to the size of graphs in the database. We present GRAPES, software for parallel searching on databases of large biological graphs. GRAPES implements a parallel version of well-established graph searching algorithms and introduces new strategies which naturally lead to a faster parallel searching system, especially for large graphs. GRAPES decomposes graphs into subcomponents that can be efficiently searched in parallel. We show the performance of GRAPES on representative biological datasets containing antiviral chemical compounds, DNA, RNA, proteins, protein contact maps and protein interaction networks.
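
    The decompose-and-search-in-parallel idea can be sketched with a toy graph database scanned by a pool of workers. This is only the parallelization pattern, not the GRAPES indexing algorithm; the graphs, labels and pattern below are invented.

```python
# Toy sketch: search a database of labelled graphs for an edge pattern,
# one worker per graph. Data and pattern are invented for illustration.
from concurrent.futures import ThreadPoolExecutor

# Graph name -> (node labels, edge list)
database = {
    "g1": ({"a": "C", "b": "N", "c": "C"}, [("a", "b"), ("b", "c")]),
    "g2": ({"a": "C", "b": "C"}, [("a", "b")]),
    "g3": ({"a": "O", "b": "N"}, [("a", "b")]),
}

def has_edge_pattern(item, want=("C", "N")):
    """Return the graph's name if it contains an edge whose endpoint
    labels match `want`, else None."""
    name, (labels, edges) = item
    for u, v in edges:
        if {labels[u], labels[v]} == set(want):
            return name
    return None

with ThreadPoolExecutor() as pool:
    hits = sorted(n for n in pool.map(has_edge_pattern, database.items()) if n)
print(hits)  # → ['g1']
```

    GRAPES goes further by also decomposing individual large graphs so that a single match query is itself parallelized, rather than only running one query per graph.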

  19. A Requirements Engineering Environment for Embedded Real- Time Software-SREE

    Institute of Scientific and Technical Information of China (English)

    LI Yonghua; SHU Fengdi; WU Guoqing; LIANG Zhengping

    2006-01-01

    The paper presents SREE, a requirements engineering environment for embedded real-time software. It covers the whole software requirements engineering process, including the definition, analysis and checking of requirements specifications. We first explain the principles of the executable specification language RTRSM. Subsequently, we introduce the main functions of SREE and illustrate the methods and techniques for checking requirements specifications, especially how to perform simulation execution, combining a prototyping method with RTRSM and animated representations. Finally, we compare SREE with other requirements specification methods and summarize SREE's advantages.

  20. Basic Requirements for Systems Software Research and Development

    Science.gov (United States)

    Kuszmaul, Chris; Nitzberg, Bill

    1996-01-01

    Our success over the past ten years evaluating and developing advanced computing technologies has been due to a simple research and development (R/D) model. Our model has three phases: (a) evaluating the state-of-the-art, (b) identifying problems and creating innovations, and (c) developing solutions, improving the state-of-the-art. This cycle has four basic requirements: a large production testbed with real users, a diverse collection of state-of-the-art hardware, facilities for evaluation of emerging technologies and development of innovations, and control over system management on these testbeds. Future research will be irrelevant and future products will not work if any of these requirements is eliminated. In order to retain our effectiveness, the numerical aerospace simulator (NAS) must replace out-of-date production testbeds in as timely a fashion as possible, and cannot afford to ignore innovative designs such as new distributed shared memory machines, clustered commodity-based computers, and multi-threaded architectures.

  1. Software Configuration and Management System. User requirements document Scam/ URD/ WORD/ Issue 1/ revision 0

    CERN Document Server

    Bartolomé, R; Hatziangeli, Eugenia; Last, I; Ninin, P; CERN. Geneva. SPS and LEP Division

    1997-01-01

    This document is the output of the User Requirements phase of the project. It contains a classification of the requirements stated by the users and it will serve as a basis for the selection of a software configuration management tool and for the derivation of a software configuration management procedure for SL and ST. This document has been reviewed by the SLAPS administrators and SL/CO management.

  2. Review of the SOR Development Process and the Requirement for SOR-Spec Maker Software

    Science.gov (United States)

    1998-04-01

    DCIEM No. 98-CR-55. Review of the SOR Development Process and the Requirement for SOR-Spec Maker Software, by Michael P. Greenley and David... Statement of Operational Requirement for SOR-Spec Maker Requirements Management Tool. Draft, by Mike Greenley, April 1998.

  3. Digital flight control software design requirements. [for space shuttle orbiter

    Science.gov (United States)

    1973-01-01

    The objective of the integrated digital flight control system is to provide rotational and translational control of the space shuttle orbiter in all phases of flight: from launch ascent through orbit to entry and touchdown, and during powered horizontal flights. The program provides a versatile control system structure while maintaining uniform communications with other programs, sensors, and control effects by using an executive routine/function subroutine format. The program reads all external variables at a single point, copies them into its dedicated storage, and then calls the required subroutines in the proper sequence. As a result, the flight control program is largely independent of other programs in the GN and C computer complex and is equally insensitive to the characteristics of the processor configuration. The integrated structure of the control system and the DFCS executive routine which embodies that structure are described. The specific estimation and control algorithms used in the various mission phases are shown. Attitude maneuver routines that interface with the DFCS are also described.

  4. Software safety analysis on the model specified by NuSCR and SMV input language at requirements phase of software development life cycle using SMV

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kwang Yong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2005-07-01

    The safety-critical software process is composed of a development process, a verification and validation (V and V) process, and a safety analysis process. The safety analysis process has often been treated as an additional process and is not found in a conventional software process. But software safety analysis (SSA) is required if software is applied to a safety system, and the SSA shall be performed independently for the safety software throughout the software development life cycle (SDLC). Of all the phases in software development, requirements engineering is generally considered to play the most critical role in determining the overall software quality. NASA data demonstrate that nearly 75% of failures found in operational software were caused by errors in the requirements. The verification process in the requirements phase checks the correctness of the software requirements specification, and the safety analysis process analyzes the safety-related properties in detail. In this paper, a method for safety analysis at the requirements phase of the software development life cycle using the symbolic model verifier (SMV) is proposed. Hazards are discovered by hazard analysis, and in order to use SMV for the safety analysis, the safety-related properties are expressed in computation tree logic (CTL).
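
    What it means to check a CTL safety property such as AG p (p holds on all reachable states) can be illustrated with a tiny explicit-state sketch. This is a toy in Python, not SMV's symbolic algorithm, and the transition system below is invented.

```python
# Toy explicit-state check of the CTL safety property AG p: explore every
# reachable state and confirm the invariant p holds in each one.
from collections import deque

def successors(state):
    """Hypothetical transition system: a bounded counter."""
    nxt = [state]              # stutter step
    if state < 3:
        nxt.append(state + 1)  # increment step
    return nxt

def check_AG(initial, prop):
    """Return (True, None) iff prop holds in every reachable state,
    else (False, counterexample_state)."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        if not prop(s):
            return False, s
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True, None

print(check_AG(0, lambda s: s <= 3))  # → (True, None)
print(check_AG(0, lambda s: s < 3))   # → (False, 3)
```

    SMV performs the same check symbolically over much larger state spaces; the counterexample state it returns on failure is what ties the analysis back to a concrete hazard.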

  5. On the Design of Energy Efficient Optical Networks with Software Defined Networking Control Across Core and Access Networks

    DEFF Research Database (Denmark)

    Wang, Jiayuan; Yan, Ying; Dittmann, Lars

    2013-01-01

    This paper presents a Software Defined Networking (SDN) control plane based on an overlay GMPLS control model. The SDN control platform manages optical core networks (WDM/DWDM networks) and the associated access networks (GPON networks), which makes it possible to gather global information...

  7. BROCCOLI: Software for Fast fMRI Analysis on Many-Core CPUs and GPUs

    Directory of Open Access Journals (Sweden)

    Anders eEklund

    2014-03-01

    Full Text Available Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm3 brain template in 4-6 seconds, and run a second-level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/).

  8. BROCCOLI: Software for fast fMRI analysis on many-core CPUs and GPUs.

    Science.gov (United States)

    Eklund, Anders; Dufort, Paul; Villani, Mattias; Laconte, Stephen

    2014-01-01

    Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU, and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm(3) brain template in 4-6 s, and run a second level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/).

  9. BROCCOLI: Software for fast fMRI analysis on many-core CPUs and GPUs

    Science.gov (United States)

    Eklund, Anders; Dufort, Paul; Villani, Mattias; LaConte, Stephen

    2014-01-01

    Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU, and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm3 brain template in 4–6 s, and run a second level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/). PMID:24672471

  10. 78 FR 32988 - Core Principles and Other Requirements for Designated Contract Markets; Correction

    Science.gov (United States)

    2013-06-03

    ... COMMISSION 17 CFR Part 38 RIN 3038-AD09 Core Principles and Other Requirements for Designated Contract...: This document corrects the Federal Register release of the final rule regarding Core Principles and... language for the previously published Federal Register release of the final rule regarding Core...

  11. 78 FR 47154 - Core Principles and Other Requirements for Swap Execution Facilities; Correction

    Science.gov (United States)

    2013-08-05

    ... COMMISSION 17 CFR Part 37 RIN 3038-AD18 Core Principles and Other Requirements for Swap Execution Facilities... Acceptable Practices in, Compliance With Core Principles 2. On page 33600, in the second column, under the heading Core Principle 3 of Section 5h of the Act--Swaps Not Readily Susceptible to Manipulation,...

  12. 76 FR 14825 - Core Principles and Other Requirements for Designated Contact Markets

    Science.gov (United States)

    2011-03-18

    ... COMMISSION 17 CFR Parts 1, 16, and 38 RIN 3038-AD09 Core Principles and Other Requirements for Designated... Commission in the Federal Register release for the notice of proposed rulemaking for ``Core Principles and... comment period for the proposed rulemaking closed on February 22, 2011. \\2\\ See Core Principles and...

  13. Minimum Requirements for Core Competency in Pediatric Pharmacy Practice.

    Science.gov (United States)

    Boucher, Elizabeth A; Burke, Margaret M; Johnson, Peter N; Klein, Kristin C; Miller, Jamie L

    2015-01-01

    Colleges of pharmacy provide varying amounts of didactic and clinical hours in pediatrics resulting in variability in the knowledge, skills, and perceptions of new graduates toward pediatric pharmaceutical care. The Pediatric Pharmacy Advocacy Group (PPAG) endorses the application of a minimum set of core competencies for all pharmacists involved in the care of hospitalized children.

  14. Qualification of Simulation Software for Safety Assessment of Sodium Cooled Fast Reactors. Requirements and Recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Nicholas R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pointer, William David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sieger, Matt [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Flanagan, George F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Moe, Wayne [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Holbrook, Mark [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    The goal of this review is to enable application of codes or software packages for safety assessment of advanced sodium-cooled fast reactor (SFR) designs. To address near-term programmatic needs, the authors focus on two objectives: first, identifying the software QA requirements that must be satisfied to enable the application of software to future safety analyses; and second, collecting best practices applied by other code development teams to minimize the cost and time of initial code qualification activities and to recommend a path to the stated goal.

  15. Software architects’ experiences of quality requirements : what we know and what we do not know?

    NARCIS (Netherlands)

    Daneva, Maia; Buglione, Luigi; Herrmann, Andrea; Doerr, J.; Opdahl, A.

    2013-01-01

    [Context/motivation] Quality requirements (QRs) are a concern of both requirement engineering (RE) specialists and software architects (SAs). However, the majority of empirical studies on QRs take the RE analysts’/clients’ perspectives, and only recently very few included the SAs’ perspective. As a

  16. An integrated approach for requirement selection and scheduling in software release planning

    NARCIS (Netherlands)

    Li, C.; van den Akker, Marjan; Brinkkemper, Sjaak; Diepen, Guido

    It is essential for product software companies to decide which requirements should be included in the next release and to make an appropriate time plan of the development project. Compared to the extensive research done on requirement selection, very little research has been performed on time scheduling.

  17. An integrated approach for requirement selection and scheduling in software release planning

    NARCIS (Netherlands)

    Li, Chen; Akker, van den Marjan; Brinkkemper, Sjaak; Diepen, Guido

    2010-01-01

    It is essential for product software companies to decide which requirements should be included in the next release and to make an appropriate time plan of the development project. Compared to the extensive research done on requirement selection, very little research has been performed on time scheduling.

  18. Core Logistics Capability Policy Applied to USAF Combat Aircraft Avionics Software: A Systems Engineering Analysis

    Science.gov (United States)

    2010-06-01

    cannot make a distinction between software maintenance and development” (Sharma, 2004). ISO/IEC 12207 Software Lifecycle Processes offers a guide to...synopsis of ISO/IEC 12207, Raghu Singh of the Federal Aviation Administration states “Whenever a software product needs modifications, the development...Sustainment for the F-35 Lightning II. (C. Johnstun, Ed.) Crosstalk: The Journal of Defense Software Engineering, 20(12), 9-14.

  19. Development and Evaluation of Vectorised and Multi-Core Event Reconstruction Algorithms within the CMS Software Framework

    Science.gov (United States)

    Hauth, T.; Innocente, V.; Piparo, D.

    2012-12-01

    The processing of data acquired by the CMS detector at LHC is carried out with an object-oriented C++ software framework: CMSSW. With the increasing luminosity delivered by the LHC, the treatment of recorded data requires extraordinarily large computing resources, also in terms of CPU usage. A possible solution to cope with this task is the exploitation of the features offered by the latest microprocessor architectures. Modern CPUs present several vector units, the capacity of which is growing steadily with the introduction of new processor generations. Moreover, an increasing number of cores per die is offered by the main vendors, even on consumer hardware. Most recent C++ compilers provide facilities to take advantage of such innovations, either by explicit statements in the program sources or by automatically adapting the generated machine instructions to the available hardware, without the need to modify the existing code base. Programming techniques to implement reconstruction algorithms and optimised data structures are presented that aim at scalable vectorization and parallelization of the calculations. One of their features is the usage of new language features of the C++11 standard. Portions of the CMSSW framework are illustrated which have been found to be especially profitable for the application of vectorization and multi-threading techniques. Specific utility components have been developed to help vectorization and parallelization; they can easily become part of a larger common library. To conclude, careful measurements are described, which show the execution speedups achieved via vectorised and multi-threaded code in the context of CMSSW.
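The rewrite the abstract describes, replacing an explicit scalar loop with a single operation over whole arrays so the hardware's vector units do the work, is language-agnostic even though CMSSW applies it in C++. A toy illustration of the same transformation in NumPy (the quantity and names are ours, not CMSSW code):

```python
import numpy as np

def transverse_momentum_loop(px, py):
    """Scalar loop: one element at a time, as naive reconstruction
    code would compute sqrt(px^2 + py^2) per track."""
    out = np.empty(len(px))
    for i in range(len(px)):
        out[i] = (px[i] ** 2 + py[i] ** 2) ** 0.5
    return out

def transverse_momentum_vec(px, py):
    """Vectorized: the whole array in one expression, letting the
    runtime use SIMD units -- the same rewrite CMSSW applies at the
    C++ level with compiler auto-vectorization or intrinsics."""
    return np.hypot(px, py)

px = np.array([3.0, 6.0, 0.0])
py = np.array([4.0, 8.0, 5.0])
```

Both functions give identical results; the vectorized form simply exposes the data-parallelism to the hardware instead of hiding it inside a loop.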

  20. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores.

    Science.gov (United States)

    Chikkagoudar, Satish; Wang, Kai; Li, Mingyao

    2011-05-26

    Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) also have hundreds of cores and have been recently used to implement faster scientific software. However, currently there are no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis for binary traits. Here we present a novel software package GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes: 1) the interaction of SNPs within it in parallel, and 2) the interaction between the SNPs of the current fragment and other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode run. GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
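GENIE's partitioning scheme can be sketched independently of the GPU details: split the SNP list into non-overlapping fragments, then enumerate within-fragment pairs and between-fragment pairs, which are the two parallel work sets the abstract describes. A hypothetical pure-Python sketch (the real GENIE computes interaction statistics for each pair; here the pairs are just collected):

```python
from itertools import combinations

def partition(snps, frag_size):
    """Split the SNP list into non-overlapping fragments, as GENIE does."""
    return [snps[i:i + frag_size] for i in range(0, len(snps), frag_size)]

def interaction_pairs(snps, frag_size):
    """Enumerate every SNP pair exactly once, organised the way GENIE
    parallelises the work: (1) pairs within each fragment, and
    (2) pairs between the current fragment and later fragments."""
    frags = partition(snps, frag_size)
    within, between = [], []
    for i, frag in enumerate(frags):
        within.extend(combinations(frag, 2))                     # work set 1
        for later in frags[i + 1:]:
            between.extend((a, b) for a in frag for b in later)  # work set 2
    return within, between

snps = ["rs1", "rs2", "rs3", "rs4", "rs5"]
within, between = interaction_pairs(snps, frag_size=2)
```

Because the pair sets are disjoint, each can be farmed out to a separate GPU or CPU core with no synchronization beyond the final merge.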

  1. Design Requirements, Epistemic Uncertainty and Solution Development Strategies in Software Design

    DEFF Research Database (Denmark)

    Ball, Linden J.; Onarheim, Balder; Christensen, Bo Thomas

    2010-01-01

    This paper investigates the potential involvement of “epistemic uncertainty” in mediating between complex design requirements and strategic switches in software design strategies. The analysis revealed that the designers produced an initial “first-pass” solution to the given design brief in a breadth-first manner. The findings support a view of software design as involving a mixed breadth-first and depth-first solution development approach, with strategic switching to depth-first design being triggered by requirement complexity and mediated by associated feelings of uncertainty.

  2. A discussion of higher order software concepts as they apply to functional requirements and specifications. [space shuttles and guidance

    Science.gov (United States)

    Hamilton, M.

    1973-01-01

    The entry guidance software functional requirements (requirements design phase), its architectural requirements (specifications design phase), and the entry guidance software verified code are discussed. It was found that the proper integration of designs at both the requirements and specifications levels is a high-priority consideration.

  3. Dependencies among Architectural Views Got from Software Requirements Based on a Formal Model

    Directory of Open Access Journals (Sweden)

    Osis Janis

    2014-12-01

    Full Text Available A system architect has software requirements and some unspecified knowledge about a problem domain (e.g., an enterprise) as source information for assessing and evaluating possible solutions and reaching the target point, a preliminary software design. The deciding factor is the architect’s experience and expertise in the problem domain (“AS-IS”). The proposed approach is intended to assist a system architect in making an appropriate decision on the solution (“TO-BE”). It is based on a formal mathematical model, the Topological Functioning Model (TFM). Compliant TFMs can be transformed into software architectural views. The paper demonstrates and discusses tracing dependency links from the requirements to and between the architectural views.

  4. Core reserve expansion requirement/Long-term SKR HCP

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This letter is regarding the Stephens Kangaroo Rat Habitat Conservation Plan (SKR HCP) which requires that the Riverside County Habitat Conservation Agency establish...

  5. Vías de internacionalización de la Industria Argentina de Software: El caso de Core Security Technologies

    Directory of Open Access Journals (Sweden)

    ALEJANDRO ARTOPOULOS

    2013-05-01

    Full Text Available This paper presents the case of Core Security Technologies, one of the few Argentine software companies to have internationalized into the North American market. It illustrates one of the classic internationalization strategies of garage software start-ups, followed most notably in Israel, and documents a new population of internationalized, knowledge-based Argentine software and IT-services companies. The firm is characterized by operating under a new post-import-substitution innovation regime, which it shares with companies that preceded it and with its contemporaries. The case invites reflection not on the conditions of possibility for radical innovation in the recent history of peripheral countries, but on the difficulty of replicating cases of this kind.

  6. Thirty Meter Telescope: observatory software requirements, architecture, and preliminary implementation strategies

    Science.gov (United States)

    Silva, David R.; Angeli, George; Boyer, Corinne; Sirota, Mark; Trinh, Thang

    2008-07-01

    The Thirty Meter Telescope (TMT) will be a ground-based, 30-m optical-IR alt-az telescope with a highly segmented primary mirror located in a remote location. Efficient science operations require the asynchronous coordination of many different sub-systems including telescope mount, three independent active optics sub-systems, adaptive optics, laser guide stars, and the user-configured science instrument. An important high-level requirement is that target acquisition and observatory system configuration must be completed in less than 5 minutes (or 10 minutes if moving to a new instrument). To meet this coordination challenge and target acquisition time requirement, a distributed software architecture is envisioned, consisting of software components linked by a service-based software communications backbone. A master sequencer coordinates the activities of mid-layer sequencers for the telescope, adaptive optics, and selected instrument. In turn, these mid-layer sequencers coordinate the activities of groups of sub-systems. In this paper, TMT observatory requirements are presented in more detail, followed by a description of the design reference software architecture and a discussion of preliminary implementation strategies.
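The layered sequencer architecture described above, a master sequencer that fires its mid-layer sequencers concurrently rather than one after another, is what makes the 5-minute acquisition budget plausible: the total configuration time is the maximum of the parts, not the sum. A hypothetical asyncio sketch of the coordination pattern (the sequencer names follow the text; the timings are invented):

```python
import asyncio

async def configure(subsystem, seconds):
    """Stand-in for a mid-layer sequencer configuring its sub-systems."""
    await asyncio.sleep(seconds)        # simulated configuration time
    return f"{subsystem} ready"

async def master_sequencer():
    """Run the telescope, AO, and instrument sequencers concurrently;
    asyncio.gather waits for all of them and preserves their order."""
    results = await asyncio.gather(
        configure("telescope mount", 0.03),
        configure("adaptive optics", 0.02),
        configure("instrument", 0.01),
    )
    return results

results = asyncio.run(master_sequencer())
```

The same pattern nests one level down: each mid-layer sequencer can itself `gather` the sub-systems it owns.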

  7. MODSARE-V: Validation of Dependability and Safety Critical Software Components with Model Based Requirements

    Science.gov (United States)

    Silveira, Daniel T. de M. M.; Schoofs, Tobias; Alana Salazar, Elena; Rodriguez Rodriguez, Ana Isabel; Devic, Marie-Odile

    2010-08-01

    The wide use of RAMS methods and techniques [1] (e.g., SFMECA, SFTA, HAZOP, HA...) in critical software development has resulted in the specification of new software requirements, design constraints, and other issues such as mandatory coding rules. Given the large variety of RAMS requirements and techniques, different types of Verification and Validation (V&V) [14] are spread over the phases of the software engineering process. As a result, the V&V process becomes complex, and the cost and time required for a complete and consistent V&V process increase. By introducing a model-based approach to facilitate the RAMS requirements definition process, V&V time and effort may be reduced. MODSARE-V demonstrates the feasibility of this concept through case studies applied to ground and on-board software space projects with critical functions/components. This paper describes the approach adopted in MODSARE-V to realize the concept as a prototype and summarizes the results and conclusions reached after applying the prototype to the case studies.

  8. Incorporating Computer-Aided Software in the Undergraduate Chemical Engineering Core Courses

    Science.gov (United States)

    Alnaizy, Raafat; Abdel-Jabbar, Nabil; Ibrahim, Taleb H.; Husseini, Ghaleb A.

    2014-01-01

    Introductions of computer-aided software and simulators are implemented during the sophomore-year of the chemical engineering (ChE) curriculum at the American University of Sharjah (AUS). Our faculty concurs that software integration within the curriculum is beneficial to our students, as evidenced by the positive feedback received from industry…

  9. Incorporating Computer-Aided Software in the Undergraduate Chemical Engineering Core Courses

    Science.gov (United States)

    Alnaizy, Raafat; Abdel-Jabbar, Nabil; Ibrahim, Taleb H.; Husseini, Ghaleb A.

    2014-01-01

    Introductions of computer-aided software and simulators are implemented during the sophomore-year of the chemical engineering (ChE) curriculum at the American University of Sharjah (AUS). Our faculty concurs that software integration within the curriculum is beneficial to our students, as evidenced by the positive feedback received from industry…

  10. Requirements Engineering Challenges in Service Oriented Software Engineering: an exploratory online survey

    Directory of Open Access Journals (Sweden)

    Muneera Bano

    2013-07-01

    Full Text Available Service Oriented Software Engineering (SOSE) is an emerging field for developing software using web services. One of the main tasks of a requirements engineer in SOSE is matchmaking between requirements and available services. Published literature indicates that Requirements Engineering (RE) in SOSE faces different challenges. In this study, we report the results of an online survey conducted with practitioners and researchers working on service-oriented projects. The aim is to gain insight into the issues and challenges faced during requirements engineering in SOSE. The results show an interesting pattern of differing views between researchers and practitioners on the reported challenges. The difference in opinion is mostly because SOSE is a new field and most of its concepts are not fully understood and appreciated by designers and developers, resulting in poor implementation of SOSE concepts.

  11. FEM simulation of formation of metamorphic core complex with ANSYS software

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This study uses ANSYS to establish a finite element model of a metamorphic core complex and applies coupled thermal-structural analysis to simulate the complex's temperature and stress fields. The formation mechanism of metamorphic core complexes is discussed. The simulation results show that the temperature is lowest at the earth's surface and highest in the nucleus of the metamorphic core complex, increasing with depth, while the largest stress occurs in the nucleus and the next largest at the top of the cover.
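The steady-state temperature field such a thermal analysis computes can be illustrated in one dimension: assemble a stiffness matrix from linear elements, fix the temperature at the surface and at depth, and solve for the interior nodes. A toy finite-element sketch, not the paper's ANSYS model:

```python
import numpy as np

def steady_temperature(n_elems, t_surface, t_depth):
    """1-D steady-state heat conduction with linear finite elements:
    assemble the global stiffness matrix from unit-conductance elements,
    fix the temperature at both ends, and solve for the interior nodes."""
    n_nodes = n_elems + 1
    k_local = np.array([[1.0, -1.0], [-1.0, 1.0]])   # element stiffness
    K = np.zeros((n_nodes, n_nodes))
    for e in range(n_elems):                          # assembly
        K[e:e + 2, e:e + 2] += k_local
    T = np.zeros(n_nodes)
    T[0], T[-1] = t_surface, t_depth                  # Dirichlet boundaries
    # Move the known boundary terms to the right-hand side and solve.
    rhs = -K[1:-1, [0, -1]] @ np.array([t_surface, t_depth])
    T[1:-1] = np.linalg.solve(K[1:-1, 1:-1], rhs)
    return T

# Surface fixed at 0, depth fixed at 100: temperature rises linearly with depth.
profile = steady_temperature(4, 0.0, 100.0)
```

The linear profile recovered here mirrors the qualitative result in the abstract (temperature increasing with depth); the real model adds 2-D/3-D geometry, material properties, and structural coupling.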

  12. Waste Receiving and Processing Facility Module 1 Data Management System software requirements specification

    Energy Technology Data Exchange (ETDEWEB)

    Rosnick, C.K.

    1996-04-19

    This document provides the software requirements for Waste Receiving and Processing (WRAP) Module 1 Data Management System (DMS). The DMS is one of the plant computer systems for the new WRAP 1 facility (Project W-0126). The DMS will collect, store and report data required to certify the low level waste (LLW) and transuranic (TRU) waste items processed at WRAP 1 as acceptable for shipment, storage, or disposal.

  13. Waste Receiving and Processing Facility Module 1 Data Management System Software Requirements Specification

    Energy Technology Data Exchange (ETDEWEB)

    Brann, E.C. II

    1994-09-09

    This document provides the software requirements for Waste Receiving and Processing (WRAP) Module 1 Data Management System (DMS). The DMS is one of the plant computer systems for the new WRAP 1 facility (Project W-026). The DMS will collect, store and report data required to certify the low level waste (LLW) and transuranic (TRU) waste items processed at WRAP 1 as acceptable for shipment, storage, or disposal.

  14. Validation of a new software version for monitoring of the core of the Unit 2 of the Laguna Verde power plant with ARTS; Validacion de una nueva version del software para monitoreo del nucleo de la Unidad 2 de la Central Laguna Verde con ARTS

    Energy Technology Data Exchange (ETDEWEB)

    Calleros, G.; Riestra, M.; Ibanez, C.; Lopez, X.; Vargas, A.; Mendez, A.; Gomez, R. [CFE, Central Nucleoelectrica de Laguna Verde, Alto Lucero, Veracruz (Mexico)]. e-mail: gcm9acpp@cfe.gob.mx

    2005-07-01

    This work proposes a methodology for validating a new version of the software used to monitor the reactor core, which requires evaluating the thermal limits established in the Operational Technical Specifications for Unit 2 of Laguna Verde with ARTS (improvements to the APRMs, Rod Block Monitor, and Technical Specifications). Following the proposed methodology, the differences found between the thermal limits determined with the new and previous versions of the core monitoring software are shown. (Author)

  15. A PRIORITY-BASED NEGOTIATIONS APPROACH FOR HANDLING INCONSISTENCIES IN MULTI-PERSPECTIVE SOFTWARE REQUIREMENTS

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Inconsistency in multi-perspective requirements specifications is a pervasive issue during the requirements process. However, managing inconsistency is not a purely technical problem; it is always associated with a process of interaction and competition among the corresponding stakeholders. The main contribution of this paper is a negotiation approach to handling inconsistencies in multi-perspective software requirements. In particular, the priority of requirements relative to each perspective plays an important role in conducting negotiation over resolving inconsistencies among different stakeholders. An algorithm for generating negotiation proposals and an approach to evaluating proposals are also presented.
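The paper's central mechanism, using per-perspective priorities to rank candidate resolutions, can be illustrated with a toy scorer: each proposal keeps a consistent subset of the requirements, and the proposal that preserves the most total priority weight across stakeholders wins. This is our own simplification of the idea, not the paper's algorithm:

```python
def best_proposal(priorities, proposals):
    """Score each proposal (a set of requirement ids to keep) by the
    summed priority, across every stakeholder perspective, of the
    requirements it preserves; the highest-scoring proposal wins."""
    def score(keep):
        return sum(
            weight
            for perspective in priorities.values()
            for req, weight in perspective.items()
            if req in keep
        )
    return max(proposals, key=score)

# Two perspectives weight the conflicting requirements R1 and R2 differently.
priorities = {
    "customer":  {"R1": 9, "R2": 2, "R3": 5},
    "developer": {"R1": 3, "R2": 8, "R3": 4},
}
proposals = [{"R1", "R3"}, {"R2", "R3"}]   # each drops one side of the conflict
winner = best_proposal(priorities, proposals)
```

Here {R1, R3} scores 21 against 19 for {R2, R3}, so the customer's strong preference for R1 outweighs the developer's preference for R2; a real negotiation process would iterate, letting stakeholders revise priorities between rounds.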

  16. Detailed requirements document for common software of shuttle program information management system

    Science.gov (United States)

    Everette, J. M.; Bradfield, L. D.; Horton, C. L.

    1975-01-01

    Common software was investigated as a method for minimizing the development and maintenance cost of the shuttle program information management system (SPIMS) applications while reducing the time-frame of their development. Those requirements satisfying these criteria are presented along with the stand-alone modules which may be used directly by applications. The SPIMS applications, operating on the CYBER 74 computer, are specialized information management systems which use System 2000 as a data base manager. Common software provides the features to support user interactions on a CRT terminal using form input and command response capabilities. These features are available as subroutines to the applications.

  17. Requirements for guidelines systems: implementation challenges and lessons from existing software-engineering efforts.

    Science.gov (United States)

    Shah, Hemant; Allard, Raymond D; Enberg, Robert; Krishnan, Ganesh; Williams, Patricia; Nadkarni, Prakash M

    2012-03-09

    A large body of work in the clinical guidelines field has identified requirements for guideline systems, but there are formidable challenges in translating such requirements into production-quality systems that can be used in routine patient care. Detailed analysis of requirements from an implementation perspective can be useful in helping define sub-requirements to the point where they are implementable. Further, additional requirements emerge as a result of such analysis. During such an analysis, study of examples of existing software-engineering efforts in non-biomedical fields can provide useful signposts to the implementer of a clinical guideline system. In addition to requirements described by guideline-system authors, comparative reviews of such systems, and publications discussing information needs for guideline systems and clinical decision support systems in general, we have incorporated additional requirements related to production-system robustness and functionality from publications in the business workflow domain, as well as drawing on our own experience in the development of the Proteus guideline system (http://proteme.org). The sub-requirements are discussed by conveniently grouping them into the categories used by the review of Isern and Moreno (2008). We cite previous work under each category, provide sub-requirements under each category, and give examples of similar work in software-engineering efforts that have addressed a similar problem in a non-biomedical context. When analyzing requirements from the implementation viewpoint, knowledge of successes and failures in related software-engineering efforts can guide implementers in the choice of effective design and development strategies.

  18. 78 FR 31563 - Ryan White HIV/AIDS Program Core Medical Services Waiver; Application Requirements

    Science.gov (United States)

    2013-05-24

    ... Administration Ryan White HIV/AIDS Program Core Medical Services Waiver; Application Requirements AGENCY: Health... XXVI of the Public Health Service Act, as amended by the Ryan White HIV/AIDS Treatment Extension Act of... on core medical services, including antiretroviral drugs, for individuals with HIV/AIDS identified...

  19. Active Mirror Predictive and Requirements Verification Software (AMP-ReVS)

    Science.gov (United States)

    Basinger, Scott A.

    2012-01-01

    This software is designed to predict large active mirror performance at various stages in the fabrication lifecycle of the mirror. It was developed for 1-meter class powered mirrors for astronomical purposes, but is extensible to other geometries. The package accepts finite element model (FEM) inputs and laboratory measured data for large optical-quality mirrors with active figure control. It computes phenomenological contributions to the surface figure error using several built-in optimization techniques. These phenomena include stresses induced in the mirror by the manufacturing process and the support structure, the test procedure, high spatial frequency errors introduced by the polishing process, and other process-dependent deleterious effects due to light-weighting of the mirror. Then, depending on the maturity of the mirror, it either predicts the best surface figure error that the mirror will attain, or it verifies that the requirements for the error sources have been met once the best surface figure error has been measured. The unique feature of this software is that it ties together physical phenomenology with wavefront sensing and control techniques and various optimization methods including convex optimization, Kalman filtering, and quadratic programming to both generate predictive models and to do requirements verification. This software combines three distinct disciplines: wavefront control, predictive models based on FEM, and requirements verification using measured data in a robust, reusable code that is applicable to any large optics for ground and space telescopes. The software also includes state-of-the-art wavefront control algorithms that allow closed-loop performance to be computed. It allows for quantitative trade studies to be performed for optical systems engineering, including computing the best surface figure error under various testing and operating conditions. After the mirror manufacturing process and testing have been completed, the

  20. Importance of Requirements Analysis & Traceability to Improve Software Quality and Reduce Cost and Risk

    Science.gov (United States)

    Kapoor, Manju M.; Mehta, Manju

    2010-01-01

    The goal of this paper is to emphasize the importance of developing complete and unambiguous requirements early in the project cycle (prior to the Preliminary Design Phase). Having a complete set of requirements early in the project cycle allows sufficient time to generate a traceability matrix. Requirements traceability and analysis are the key elements in improving the verification and validation process, and thus overall software quality. Traceability is most beneficial when the system changes: if changes are made to high-level requirements, the low-level requirements need to be modified accordingly. Traceability ensures that requirements are appropriately and efficiently verified at various levels, whereas analysis ensures that a correctly interpreted set of requirements is produced.
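    As a minimal illustration of the traceability idea described above (not from the paper; the requirement IDs and the audit helper are hypothetical), a traceability matrix can be checked mechanically for untraced high-level requirements and unverified low-level ones:

```python
# Illustrative sketch: a tiny requirements traceability audit.
# All requirement IDs and the mapping below are hypothetical.

# High-level requirements mapped to the low-level requirements that trace to them.
trace = {
    "HLR-1": ["LLR-1.1", "LLR-1.2"],
    "HLR-2": ["LLR-2.1"],
    "HLR-3": [],  # no downstream trace yet
}

# Low-level requirements that have actually been verified.
verified = {"LLR-1.1", "LLR-1.2", "LLR-2.1"}

def audit(trace, verified):
    """Return (untraced high-level reqs, unverified low-level reqs)."""
    untraced = [hlr for hlr, llrs in trace.items() if not llrs]
    unverified = [llr for llrs in trace.values() for llr in llrs
                  if llr not in verified]
    return untraced, unverified

untraced, unverified = audit(trace, verified)
print(untraced)    # → ['HLR-3']
print(unverified)  # → []
```

    The same set-difference check, run whenever a high-level requirement changes, is what makes traceability cheap to re-verify.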

  1. Rapid Development of Guidance, Navigation, and Control Core Flight System Software Applications Using Simulink Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We will demonstrate the usefulness of SIL for GSFC missions by attempting to compile the SIL source code with an autocoded sample GNC application flight software....

  2. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-09-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes, and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE were already underway. IV&V reviewed the requirements specified in the NRC Form 189s to verify these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).

  3. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes, and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE were already underway. IV&V reviewed the requirements specified in the NRC Form 189s to verify these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).

  4. Software requirements specification for the GIS-T/ISTEA pooled fund study phase C linear referencing engine

    Energy Technology Data Exchange (ETDEWEB)

    Amai, W.; Espinoza, J. Jr. [Sandia National Lab., Albuquerque, NM (United States); Fletcher, D.R. [Univ. of New Mexico, Albuquerque, NM (United States). Alliance for Transportation Research

    1997-06-01

    This Software Requirements Specification (SRS) describes the features to be provided by the software for the GIS-T/ISTEA Pooled Fund Study Phase C Linear Referencing Engine project. This document conforms to the recommendations of IEEE Standard 830-1984, IEEE Guide to Software Requirements Specifications (Institute of Electrical and Electronics Engineers, Inc., 1984). The software specified in this SRS is a proof-of-concept implementation of the Linear Referencing Engine as described in the GIS-T/ISTEA Pooled Fund Study Phase B Summary, specifically Sheet 13 of the Phase B object model. The software allows an operator to convert between two linear referencing methods and a datum network.

  5. SEMANTIC WEB-BASED SOFTWARE ENGINEERING BY AUTOMATED REQUIREMENTS ONTOLOGY GENERATION IN SOA

    Directory of Open Access Journals (Sweden)

    Vahid Rastgoo

    2014-04-01

    Full Text Available This paper presents an approach for the automated generation of a requirements ontology from UML diagrams in service-oriented architecture (SOA). The goal is to facilitate software engineering processes such as software design, software reuse, and service discovery. The proposed method is based on four conceptual layers. The first layer contains the requirements obtained from stakeholders; the second designs service-oriented diagrams from the data in the first layer and extracts their XMI code; the third comprises a requirements ontology and a protocol ontology that semantically describe the behavior of services and the relationships between them; finally, the fourth layer standardizes the concepts present in the ontologies of the previous layer. The generated ontology goes beyond a pure domain ontology because it captures the behavior of services as well as their hierarchical relationships. Experimental results on a set of UML4Soa diagrams from different scopes demonstrate the improvement offered by the proposed approach from several points of view: completeness of the requirements ontology, automatic generation, and support for SOA.
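    The XMI-extraction step in the second layer can be sketched roughly as follows. This is an illustrative guess at the flavor of the processing, using a simplified XMI-like fragment; real XMI exported by UML tools uses namespaced attributes (xmi:type, xmi:id) and varies between tools:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified XMI fragment (not real tool output).
xmi = """
<Model>
  <packagedElement type="uml:Class" name="Customer"/>
  <packagedElement type="uml:Class" name="Order"/>
  <packagedElement type="uml:Association" name="places"/>
</Model>
"""

root = ET.fromstring(xmi)
# UML classes become candidate ontology concepts; associations become relations.
concepts = [e.get("name") for e in root.iter("packagedElement")
            if e.get("type") == "uml:Class"]
relations = [e.get("name") for e in root.iter("packagedElement")
             if e.get("type") == "uml:Association"]
print(concepts)   # → ['Customer', 'Order']
print(relations)  # → ['places']
```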

  6. Meta-Model and UML Profile for Requirements Management of Software and Embedded Systems

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2011-01-01

    Full Text Available Software and embedded system companies today encounter problems related to requirements management tool integration, incorrect tool usage, and lack of traceability. This is because the tools in use have no clear meta-model and semantics for communicating requirements between different stakeholders. This paper presents a comprehensive meta-model for requirements management, focusing on the software and embedded system domains. The goal is to define generic requirements management domain concepts and abstract interfaces between requirements management and system development. This leads to a portable requirements management meta-model which can be adapted to various system modeling languages. The created meta-model is prototyped by translating it into a UML profile. The profile is imported into a UML tool, which is used for rapid evaluation of the meta-model concepts in practice. The developed profile is associated with a proof-of-concept report generator tool that automatically produces up-to-date documentation from the models in the form of web pages. Finally, the profile is used to create an example embedded system requirements specification model.

  7. Software Infusion: Using Computers to Enhance Instruction. Part Two: What Kind of Training Does Software Infusion Require?

    Science.gov (United States)

    Schiffman, Shirl S.

    1986-01-01

    Presents a four-step conceptual framework for designing workshops to teach educators software infusion (SI), i.e., the use of computer software to enhance instructional effectiveness in school academic areas. Suggestions for implementation and sample worksheets accompany the discussions of each step. (MBR)

  8. CorelDRAW Software Applications in Textile and Garment Design Digitization

    Institute of Scientific and Technical Information of China (English)

    陈凤琴

    2014-01-01

    With the development of society, the progress of science and technology, and the arrival of the information technology era, computers are used in various industries. This paper mainly discusses CorelDRAW software applications in digitizing textile and garment design, and explores the applicability of the software.

  9. System requirements for one-time-use ENRAF control panel software

    Energy Technology Data Exchange (ETDEWEB)

    HUBER, J.H.

    1999-08-19

    An Enraf densitometer is installed on tank 241-AY-102. The densitometer will frequently be tasked to obtain and log density profiles. This activity can be effected in a number of ways. Enraf Incorporated provides a software package called "Logger18" to its customers for in-shop testing of their gauges. Logger18 is capable of accepting an input file that directs the gauge to obtain a density profile for a given tank level and bottom limit. Logger18 is a complex, DOS-based program that requires trained technicians and/or tank farm entries to obtain the data. ALARA considerations have prompted the development of a more user-friendly, computer-based interface to the Enraf densitometers. This document records the plan by which this new Enraf data acquisition software will be developed, reviewed, verified, and released. The plan applies to the development and implementation of a one-time-use software program called "Enraf Control Panel." The software will be used primarily for remote operation of Enraf densitometers to obtain and log tank product density profiles.

  10. Knowledge Base for an Intelligent System in order to Identify Security Requirements for Government Agencies Software Projects

    Directory of Open Access Journals (Sweden)

    Adán Beltrán G.

    2016-01-01

    Full Text Available It has been evidenced that one of the most common causes of failure in software security is the lack of identification and specification of information security requirements; this activity is given insufficient importance during software development or acquisition. We propose the knowledge base of CIBERREQ, an intelligent knowledge-based system used for the identification and specification of security requirements in the software development cycle or in software acquisition. CIBERREQ receives functional software requirements written in natural language and produces non-functional security requirements through a semi-automatic process of risk management. The knowledge base is formed by an ontology developed collaboratively by experts in information security. In this process, six types of assets have been identified: electronic data, physical data, hardware, software, person, and service; as well as six types of risk: competitive disadvantage, loss of credibility, economic risks, strategic risks, operational risks, and legal sanctions. In addition, 95 vulnerabilities, 24 threats, 230 controls, and 515 associations between concepts are defined. Additionally, automatic expansion with Wikipedia was used for the asset types software and hardware, obtaining 7125 and 5894 software and hardware subtypes respectively, thereby achieving a 10% improvement in the identification of candidate information assets, one of the most important phases of the proposed system.
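    The semi-automatic derivation of non-functional security requirements from identified assets might be sketched as a simple rule lookup. The asset types below come from the abstract, but the controls and the security_requirements helper are invented for illustration, not taken from CIBERREQ:

```python
# Hypothetical asset-to-control rules; the real system uses an expert-built
# ontology of 95 vulnerabilities, 24 threats, and 230 controls.
RULES = {
    "electronic data": ["define retention policy", "encrypt data at rest"],
    "software": ["patch known vulnerabilities", "require code review"],
    "person": ["enforce role-based access control"],
}

def security_requirements(assets):
    """Derive candidate non-functional security requirements from asset types."""
    reqs = []
    for asset in assets:
        reqs.extend(RULES.get(asset, []))
    return sorted(set(reqs))  # deduplicate and give a stable order

print(security_requirements(["electronic data", "software"]))
```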

  11. Design and Implementation of an Efficient Software Communications Architecture Core Framework for a Digital Signal Processors Platform

    Directory of Open Access Journals (Sweden)

    Wael A. Murtada

    2011-01-01

    Full Text Available Problem statement: The Software Communications Architecture (SCA) was developed to improve software reuse and interoperability in Software Defined Radios (SDR). However, there have been performance concerns since its conception. Arguably, the majority of the problems and inefficiencies associated with the SCA can be attributed to the assumption of modular distributed platforms relying on General Purpose Processors (GPPs) to perform all signal processing. Approach: Significant improvements in cost and power consumption can be obtained by utilizing specialized and more efficient platforms. Digital Signal Processors (DSPs) present such a platform and have been widely used in the communications industry. Improvements in development tools and middleware technology opened the possibility of fully integrating DSPs into the SCA. This approach takes advantage of the exceptional power, cost, and performance characteristics of DSPs, while still enjoying the flexibility and portability of the SCA. Results: This study presents the design and implementation of an SCA Core Framework (CF) for a TI TMS320C6416 DSP. The framework is deployed on a C6416 device cycle-accurate simulator and a TI C6416 development board. The SCA CF is implemented by leveraging OSSIE, an open-source implementation of the SCA, to support the DSP platform. OIS’s ORBExpress DSP and DSP/BIOS are used as the middleware and operating system, respectively. A sample waveform was developed to demonstrate the framework’s functionality. Benchmark results for the framework and sample applications are provided. Conclusion: Benchmark results show that using OIS’s ORBExpress DSP ORB middleware decreases the software memory footprint and increases system performance compared with PrismTech's e*ORB middleware.

  12. Software requirements and support for image-algebraic analysis, detection, and recognition of small targets

    Science.gov (United States)

    Schmalz, Mark S.; Ritter, Gerhard X.; Forsman, Robert H.; Yang, Chyuan-Huei T.; Hu, Wen-Chen; Porter, Ryan A.; McTaggart, Gary; Hranicky, James F.; Davis, James F.

    1995-06-01

    The detection of hazardous targets frequently requires a multispectral approach to image acquisition and analysis, which we have implemented in a software system called MATRE (multispectral automated target recognition and enhancement). MATRE provides capabilities for image enhancement, image database management, spectral signature extraction and visualization, and statistical analysis of greyscale imagery, as well as 2D and 3D image processing operations. Our system is based upon a client-server architecture that is amenable to distributed implementation. In this paper, we discuss salient issues and requirements for multispectral recognition of hazardous targets, and show that our software fulfills or exceeds such requirements. MATRE's capabilities, as well as statistical and morphological analysis results, are exemplified with emphasis upon computational cost, ease of installation, and maintenance on various Unix platforms. Additionally, MATRE's image processing functions can be coded in vector-parallel form, for ease of implementation on SIMD-parallel processors. Our algorithms are expressed in terms of image algebra, a concise, rigorous notation that unifies linear and nonlinear mathematics in the image domain. An image algebra class library for the C++ language has been incorporated into our system, which facilitates fast algorithm prototyping without the numerous drawbacks of discrete coding.

  13. Hanford Soil Inventory Model (SIM) Rev. 1 Software Documentation – Requirements, Design, and Limitations

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Brett C.; Corbin, Rob A.; Anderson, Michael J.; Kincaid, Charles T.

    2006-09-25

    The objective of this document is to support the simulation results reported by Corbin et al. (2005) by documenting the requirements, conceptual model, simulation methodology, testing, and quality assurance associated with the Hanford Soil Inventory Model (SIM). There is no conventional software life-cycle documentation associated with the Hanford SIM because of the research and development nature of the project. Because of the extensive use of commercial-off-the-shelf software products, there was little actual software development as part of this application. This document is meant to provide historical context and technical support for Corbin et al. (2005), which is a significant revision and update of an earlier product (Simpson et al. 2001). The SIM application computed waste discharges composed of 75 analytes at 377 waste sites (liquid disposal, unplanned releases, and tank farm leaks) over an operational period of approximately 50 years. The development and application of SIM was an effort to develop a probabilistic approach to estimating comprehensive, mass-balance-based contaminant inventories for the Hanford Site post-closure setting. A computer model capable of calculating inventories and the associated uncertainties as a function of time was identified to address the needs of the Remediation and Closure Science (RCS) Project.
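    A probabilistic, mass-balance inventory estimate of the kind SIM performs can be sketched as a small Monte Carlo calculation. The distributions and numbers below are entirely hypothetical illustrations, not Hanford data or SIM's actual model:

```python
import random

# Illustrative sketch (not the SIM code): Monte Carlo estimate of the
# discharged mass of one analyte at one waste site, with uncertainty.
random.seed(1)

def sample_inventory(n=10_000):
    """Sample mass = volume * concentration; return mean and a 90% interval."""
    masses = []
    for _ in range(n):
        volume = random.gauss(5.0e6, 5.0e5)        # liters discharged (assumed)
        conc = random.lognormvariate(-2.0, 0.5)    # grams per liter (assumed)
        masses.append(volume * conc)
    masses.sort()
    mean = sum(masses) / n
    p05, p95 = masses[int(0.05 * n)], masses[int(0.95 * n)]
    return mean, (p05, p95)

mean, (p05, p95) = sample_inventory()
print(f"mean ≈ {mean:,.0f} g, 90% interval ≈ ({p05:,.0f}, {p95:,.0f}) g")
```

    Repeating the sampling per analyte, per site, per year is what turns a deterministic mass balance into an inventory with quantified uncertainty.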

  14. Evaluation of a Game to Teach Requirements Collection and Analysis in Software Engineering at Tertiary Education Level

    Science.gov (United States)

    Hainey, Thomas; Connolly, Thomas M.; Stansfield, Mark; Boyle, Elizabeth A.

    2011-01-01

    A highly important part of software engineering education is requirements collection and analysis which is one of the initial stages of the Database Application Lifecycle and arguably the most important stage of the Software Development Lifecycle. No other conceptual work is as difficult to rectify at a later stage or as damaging to the overall…

  16. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7.

  17. Investigation of the current requirements engineering practices among software developers at the Universiti Utara Malaysia Information Technology (UUMIT) centre

    Science.gov (United States)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Abdullah, Inam

    2016-08-01

    Requirements Engineering (RE) is a systemic and integrated process of eliciting, elaborating, negotiating, validating, and managing the requirements of a system in a software development project. UUM has been supported by various systems developed and maintained by the UUM Information Technology (UUMIT) Centre. The aim of this study was to assess the current requirements engineering practices at UUMIT. The main problem that prompted this research is the lack of studies that support software development activities at the UUMIT. The study is geared toward helping UUMIT produce quality yet time- and cost-saving software products by implementing cutting-edge, state-of-the-art requirements engineering practices. The study also contributes to UUM by identifying the activities needed for software development so that management can allocate budget to provide adequate and precise training for the software developers. Three variables were investigated: Requirements Description, Requirements Development (comprising Requirements Elicitation, Requirements Analysis and Negotiation, and Requirements Validation), and Requirements Management. The results show that the current practice of requirements engineering at UUMIT is encouraging, but still needs further development and improvement because a few RE practices were seldom applied.

  18. Harmonic Domain Modelling of Transformer Core Nonlinearities Using the DIgSILENT PowerFactory Software

    DEFF Research Database (Denmark)

    Bak, Claus Leth; Bak-Jensen, Birgitte; Wiechowski, Wojciech

    2008-01-01

    This paper demonstrates the results of implementation and verification of an already existing algorithm that allows for calculating saturation characteristics of single-phase power transformers. The algorithm was described for the first time in 1993. This algorithm has now been implemented using the DIgSILENT Programming Language (DPL) as an external script in the harmonic domain calculations of the power system analysis tool PowerFactory [10]. The algorithm is verified by harmonic measurements on a single-phase power transformer. A theoretical analysis of the core nonlinearity phenomena in single- and three-phase transformers is also presented. This analysis leads to the conclusion that the method can be applied for modelling nonlinearities of three-phase autotransformers.

  19. Medical Device Software Requirements and Evaluation

    Institute of Scientific and Technical Information of China (English)

    何涛; 吴夷; 杜堃

    2011-01-01

    This paper discusses the safety and effectiveness requirements of medical device software, and describes how to evaluate medical device software from two aspects: the design and development process of the software, and the software products themselves.

  20. Investigation of Classification and Design Requirements for Digital Software for Advanced Research Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Park, Gee Young; Jung, H. S.; Ryu, J. S.; Park, C

    2005-06-15

    software for use in I and C systems in nuclear power plants and describes the requirements for software development recommended by international standards.

  1. Reservoir characterization using core, well log, and seismic data and intelligent software

    Science.gov (United States)

    Soto Becerra, Rodolfo

    We have developed intelligent software, Oilfield Intelligence (OI), as an engineering tool to improve the characterization of oil and gas reservoirs. OI integrates neural networks and multivariate statistical analysis. It is composed of five main subsystems: data input, preprocessing, architecture design, graphics design, and inference engine modules. More than 1,200 lines of programming code have been written as M-files in the MATLAB language. The degree of success of many oil and gas drilling, completion, and production activities depends upon the accuracy of the models used in a reservoir description. Neural networks have been applied to the identification of nonlinear systems in almost every scientific field, and reservoir characterization problems are no exception. Neural networks have a number of attractive features that help to extract and recognize underlying patterns, structures, and relationships among data. However, before developing a neural network model, we must address the problem of dimensionality, such as determining dominant and irrelevant variables. We can apply principal component analysis and factor analysis to reduce the dimensionality and help the neural networks formulate more realistic models. We validated OI by obtaining confident models in three different oil field problems: (1) a neural network in-situ stress model using lithology and gamma ray logs for the Travis Peak formation of east Texas; (2) a neural network permeability model using porosity and gamma ray logs, and a neural network pseudo-gamma ray log model using 3D seismic attributes, for the reservoir VLE 196, Lamar field, located in Block V of south-central Lake Maracaibo (Venezuela); and (3) neural network primary ultimate oil recovery (PRUR), initial waterflooding ultimate oil recovery (IWUR), and infill drilling ultimate oil recovery (IDUR) models using reservoir parameters for the San Andres and Clearfork carbonate formations in west Texas.
In all cases, we compared the results from
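    The dimensionality-reduction step described above (principal components computed before neural network modeling) can be sketched with an SVD-based PCA on synthetic data. The 95% variance threshold and the data are illustrative assumptions, not values from the dissertation:

```python
import numpy as np

# Synthetic well-log-like data: 200 samples, 6 variables, one nearly redundant.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=200)   # near-duplicate log

# PCA via SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)

# Keep enough components to explain 95% of the variance.
k = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1
scores = Xc @ Vt[:k].T   # reduced inputs for a neural network or regression
print(k, scores.shape)
```

    With one nearly redundant variable, the 6-dimensional input collapses to fewer effective components, which is exactly the "dominant versus irrelevant variables" problem the abstract describes.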

  2. 76 FR 34287 - ITS Joint Program Office; Core System Requirements Walkthrough and Architecture Proposal Review...

    Science.gov (United States)

    2011-06-13

    ... ITS Joint Program Office; Core System Requirements Walkthrough and Architecture Proposal Review..., U.S. Department of Transportation. ACTION: Notice. The U.S. Department of Transportation (USDOT) ITS Joint Program Office (ITS JPO) will host two free public meetings with accompanying webinars to...

  3. Attributed Goal-Oriented Analysis Method for Selecting Alternatives of Software Requirements

    Science.gov (United States)

    Yamamoto, Kazuma; Saeki, Motoshi

    During software requirements analysis, developers and stakeholders have many alternative requirements to achieve and must make decisions to select one alternative out of them. Two significant points must be considered to support these decision-making processes in requirements analysis: 1) dependencies among alternatives, and 2) evaluation based on multiple criteria and their trade-offs. This paper proposes a technique that addresses both issues using an extended version of goal-oriented analysis. In goal-oriented analysis, elicited goals and their dependencies are represented as an AND-OR acyclic directed graph. We use this technique to model the dependencies among the alternatives. Furthermore, we associate attribute values and their propagation rules with nodes and edges in a goal graph in order to evaluate the alternatives. The attributes and their calculation rules depend greatly on the characteristics of a development project; thus, in our approach, we select and use the attributes and rules appropriate for the project. The TOPSIS method is adopted to rank the alternatives by their resulting attribute values.
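    The TOPSIS step can be illustrated with a minimal implementation. The alternatives, criteria, and weights below are hypothetical, not from the paper:

```python
import math

# Minimal TOPSIS sketch for ranking requirement alternatives on multiple criteria.
def topsis(matrix, weights, benefit):
    """matrix[i][j]: score of alternative i on criterion j.
    benefit[j]: True if larger is better for criterion j.
    Returns each alternative's closeness to the ideal solution (0..1)."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each criterion column, then apply weights.
    norms = [math.sqrt(sum(matrix[i][j]**2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # Ideal and anti-ideal points per criterion.
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_best = math.dist(row, ideal)
        d_worst = math.dist(row, worst)
        scores.append(d_worst / (d_best + d_worst))
    return scores

# Three alternatives scored on (cost, reliability); cost is "smaller is better".
scores = topsis([[250, 0.9], [200, 0.7], [300, 0.95]],
                weights=[0.4, 0.6], benefit=[False, True])
best = max(range(3), key=scores.__getitem__)
print(scores, best)
```

    The closeness score balances distance from the ideal against distance from the anti-ideal, so an alternative that is merely cheapest or merely most reliable need not win.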

  4. CORE

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Hansen, Jonas; Hundebøll, Martin

    2013-01-01

    different flows. Instead of maintaining these approaches separately, we propose a protocol (CORE) that brings together these coding mechanisms. Our protocol uses random linear network coding (RLNC) for intra-session coding but allows nodes in the network to set up inter-session coding regions where flows intersect. Routes for unicast sessions are agnostic to other sessions and set up beforehand; CORE will then discover and exploit intersecting routes. Our approach allows the inter-session regions to leverage RLNC to compensate for losses or failures in the overhearing or transmitting process. Thus, we increase the benefits of XORing by exploiting the underlying RLNC structure of individual flows. This goes beyond providing additional reliability to each individual session and beyond exploiting coding opportunistically. Our numerical results show that CORE outperforms both forwarding and COPE...

  6. Incorporating Software Requirements into the System RFP: Survey of RFP Language for Software by Topic, v. 2.0

    Science.gov (United States)

    2009-05-01

    Bergey 2005]. EXAMPLE 2 The contractor will develop and document scenarios required to conduct architecture evaluation using the method...requirements that are part of the RFP [Bergey 2002]. The statement of work (SOW) describes what the supplier must accomplish. In terms of any evaluation...and the potential system supplier [Bergey 2002]. 7.14.2 Section M - Evaluation EXAMPLE 1 To incorporate architecture evaluation, Section M must

  7. Phasing software for a free flyer space-based sparse mirror array not requiring laser interferometry

    Science.gov (United States)

    Maker, David J.

    2004-10-01

    This paper presents new software (and simulations) that would phase a space-based free-flyer sparse array telescope. This particular sparse array method uses mirrors that are far enough away for sensors at the focal point module to detect tip/tilt by simply using the deflection of the beam from each mirror. The large distance also allows these circle-six array mirrors to be actuated flats. For piston, the secondary actuated mirrors (one for each large mirror segment of these widely spaced sparse array mirrors distributed on a parabola) are moved in real time to maximize the Strehl ratio, using the light from the star the planet revolves around, since that star usually has an extremely high SNR (signal-to-noise ratio). There is then no need for a 6DOF spider web of laser interferometric beams and deep-dish mirrors (as in the competing Darwin and JPL methods) to accomplish this. Also, the distance between the six 3-meter-aperture mirrors could be large (kilometer range), guaranteeing high resolution as well as substantial light-gathering power (with these six large mirrors) for imaging details on the surface of extrasolar terrestrial-type planets. In any case, such a multi-satellite free-flyer concept would be no more complex than the European Cluster mission, which is now operational. This is a viable concept and a compelling way to image surface detail on extrasolar Earth-like planets. It is an ideal engineering solution to the problem of space-based large-baseline sparse arrays. Significant details of the software requirements have recently been developed. In this paper, the Fortran code needed to both simulate and operate the actuators in the secondary mirror for this type of sparse array is discussed.

  8. HDAC Activity Is Required for Efficient Core Promoter Function at the Mouse Mammary Tumor Virus Promoter

    Directory of Open Access Journals (Sweden)

    Sang C. Lee

    2011-01-01

    Full Text Available Histone deacetylases (HDACs have been shown to be required for basal or inducible transcription at a variety of genes by poorly understood mechanisms. We demonstrated previously that HDAC inhibition rapidly repressed transcription from the mouse mammary tumor virus (MMTV promoter by a mechanism that does not require the binding of upstream transcription factors. In the current study, we find that HDACs work through the core promoter sequences of MMTV as well as those of several cellular genes to facilitate transcriptional initiation through deacetylation of nonhistone proteins.

  9. Implosion and core heating requirements in subignition experiments FIREX-I

    Science.gov (United States)

    Johzaki, Tomoyuki; Nakao, Yasuyuki; Mima, Kunioki

    2008-06-01

    In the fast ignition realization experiment project phase-I (FIREX-I) [H. Azechi and the FIREX Project, Plasma Phys. Control. Fusion 48, B267 (2006)], core heating up to an ion temperature of 5 keV is expected for subignition-class carbon-deuterium (CD) and deuterium-tritium (DT) fuels. The dependence of the achieved ion temperature on heating pulse parameters and core density is investigated using two-dimensional simulations. Since the core size in FIREX-I is insufficient for self-ignition, and the confinement time is comparable to the heating duration (~10 ps), the temperature relaxation between the bulk electrons and ions is important for efficient ion heating. High compression (a core density of ρ > 200 g/cm³) is required for pure DT fuel to shorten the relaxation time. In this case, a heating energy of Eh > 2 kJ and a heating duration of τh < 10 ps are required; in the CD case, Eh > 2 kJ and τh ~ 10 ps.
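    The electron-ion temperature relaxation that drives these requirements can be illustrated with a zero-dimensional two-temperature model. This is a schematic sketch assuming equal electron and ion heat capacities and a constant equilibration time, not the paper's two-dimensional simulation; all numbers are illustrative, not FIREX-I values:

```python
# Schematic two-temperature relaxation: hot electrons equilibrate with cold
# ions on a timescale tau_eq. Equal heat capacities are assumed, so the
# total (Te + Ti) is conserved.
def relax(Te, Ti, tau_eq, dt, steps):
    """Explicit-Euler integration of dTi/dt = (Te-Ti)/tau, dTe/dt = -(Te-Ti)/tau."""
    for _ in range(steps):
        dT = (Te - Ti) / tau_eq * dt
        Ti += dT
        Te -= dT
    return Te, Ti

# Electrons heated to 20 keV, ions at 0.5 keV; 10 ps window, 5 ps equilibration.
Te, Ti = relax(Te=20.0, Ti=0.5, tau_eq=5.0, dt=0.01, steps=1000)  # times in ps
print(round(Te, 2), round(Ti, 2))
```

    When the equilibration time is comparable to the heating duration, the ions only partially catch up to the electrons within the pulse, which is why shortening the relaxation time via high compression matters.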

  10. Abstracting and Metrics of Core Frame Structure in Large-Scale Software Based on k-Core

    Institute of Scientific and Technical Information of China (English)

    李辉; 赵海; 郝立颖; 何滨

    2011-01-01

    A case study on the hierarchical structure of large-scale open-source software shows that software systems exhibit a flat hierarchical structure. On this basis, we used k-core decomposition to divide the software system structure into layers and to abstract the core frame structure (CFS) of the software system. Statistics on the weighted connection degree between the CFS and the other layers show that the CFS communicates tightly with the other layers and that its nodes have great influence on the nodes of the other layers. Metrics computed on the network parameters of the CFS show that the CFS exhibits scale-free and small-world network characteristics and reflects a high degree of software reuse. In addition, the CFS holds a dominant position in the overall structure of the software system.
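
The k-core layering used to extract the CFS can be sketched with the standard peeling algorithm (repeatedly removing a remaining vertex of minimum degree). The graph and helper names below are illustrative, not taken from the paper; libraries such as networkx provide equivalent routines (`core_number`, `k_core`):

```python
def core_numbers(adj):
    """Assign each node its k-core number by repeatedly peeling off a
    remaining vertex of minimum degree (the standard decomposition).
    `adj` maps node -> set of neighbours (undirected, no self-loops)."""
    degree = {v: len(ns) for v, ns in adj.items()}
    removed, core, k = set(), {}, 0
    while len(removed) < len(adj):
        v = min((u for u in adj if u not in removed), key=degree.get)
        k = max(k, degree[v])   # core number never decreases along the peel
        core[v] = k
        removed.add(v)
        for w in adj[v]:
            if w not in removed:
                degree[w] -= 1
    return core

def core_frame(adj):
    """The CFS in this sketch: the innermost (maximum-k) core subgraph."""
    core = core_numbers(adj)
    kmax = max(core.values())
    return {v for v, c in core.items() if c == kmax}

# Toy dependency graph: a 4-clique (the dense "core frame") plus a pendant chain.
demo = {
    "a": {"b", "c", "d"}, "b": {"a", "c", "d"}, "c": {"a", "b", "d"},
    "d": {"a", "b", "c", "e"}, "e": {"d", "f"}, "f": {"e"},
}
cfs = core_frame(demo)
```

On the toy graph the clique survives to the 3-core while the chain peels away at k = 1, mirroring how the paper separates a dominant core from the outer layers.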

  11. Using CORE Model-Based Systems Engineering Software to Support Program Management in the U.S. Department of Energy Office of the Biomass Project: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Riley, C.; Sandor, D.; Simpkins, P.

    2006-11-01

    This paper describes how CORE, a model-based systems engineering software tool, is helping the U.S. Department of Energy's Office of the Biomass Program bring biomass-derived biofuels to the market. The tool provides information to guide informed decision-making as biomass-to-biofuels systems advance from concept to commercial adoption. It facilitates management and communication of program status by automatically generating custom reports, Gantt charts, and tables using the widely available Microsoft Word, Project, and Excel.

  12. Understanding quality requirements engineering in contract-based projects from the perspective of software architects: an exploratory study

    NARCIS (Netherlands)

    Daneva, Maya; Herrmann, Andrea; Buglione, Luigi; Mistrik, Ivan; Bahsoon, Rami; Eeles, Peter; Roshandel, Roshanak; Stal, Michael

    2014-01-01

    This chapter discusses how software architects from 21 European project organizations cope with quality requirements (QRs) in large, contract-based systems delivery projects. It reports on the roles that architects played in QRs engineering, their interactions with other project roles, the specific...

  13. Interplay between requirements, software architecture, and hardware constraints in the development of a home control user interface

    DEFF Research Database (Denmark)

    Loft, M.S.; Nielsen, S.S.; Nørskov, Kim;

    2012-01-01

    We have developed a new graphical user interface for a home control device for a large industrial customer. In this industrial case study, we first present our approaches to requirements engineering and to software architecture; we also describe the given hardware platform. Then we make two...

  14. DoD Related Software Technology Requirements, Practices, and Prospects for the Future

    Science.gov (United States)

    1984-06-01

    Program Testing." IEEE Trans. on Software Engineering, 6:2, March 1980, 162-169. 21. S. T. Redwine, Jr. "An Engineering Approach to Software Test Data... local and rapid programming capability; voice recognition; multiple-path message routing; access to all supporting databases; common decision... local and wide-area weather control; topographic systems: position reporting and recording, terrain information, and video terrain displays

  15. DSA-WDS Common Requirements: Developing a New Core Data Repository Certification

    Science.gov (United States)

    Minster, J. B. H.; Edmunds, R.; L'Hours, H.; Mokrane, M.; Rickards, L.

    2016-12-01

    The Data Seal of Approval (DSA) and the International Council for Science - World Data System (ICSU-WDS) have both developed minimally intensive core certification standards whereby digital repositories supply evidence that they are trustworthy and have a long-term outlook. Both DSA and WDS applicants have found core certification to be beneficial: building stakeholder confidence, enhancing the repository's reputation, and demonstrating that it is following good practices, as well as stimulating the repository to focus on processes and procedures, thereby achieving ever higher levels of professionalism over time. The DSA and WDS core certifications evolved independently, initially serving different communities, but both initiatives are multidisciplinary, with catalogues of criteria and review procedures based on the same principles. Hence, to realize efficiencies, simplify assessment options, stimulate more certifications, and increase impact on the community, the Repository Audit and Certification DSA-WDS Partnership Working Group (WG) was established under the umbrella of the Research Data Alliance (RDA). The WG conducted a side-by-side analysis of both frameworks to unify the wording and criteria, ultimately leading to a harmonized Catalogue of Common Requirements for core certification of repositories, as well as a set of Common Procedures for their assessment. This presentation will focus on the collaborative effort by DSA and WDS to establish (1) a testbed comprising DSA- and WDS-certified data repositories to validate both the new Catalogue and Procedures, and (2) a joint Certification Board towards their practical implementation. We will describe: the purpose and methodology of the testbed, including the selection of repositories to be assessed against the common standard; the results of the testbed, with an in-depth look at some of the comments received and issues highlighted; and general insights gained from evaluating the testbed results, the subsequent

  16. Analysis of free geo-server software usability from the viewpoint of INSPIRE requirements

    Directory of Open Access Journals (Sweden)

    Tomasz Grasza

    2014-06-01

    Full Text Available The paper presents selected server platforms based on free and open-source licenses and coherent with the standards of the Open Geospatial Consortium. The presented programs are evaluated in the context of the INSPIRE Directive. The first part describes the requirements of the Directive; afterwards, the pros and cons of each platform in meeting these demands are presented. The article answers the question of whether free software can provide interoperable network services in accordance with the requirements of the INSPIRE Directive, presenting application examples and practical tips on the use of the particular programs. Keywords: GIS, INSPIRE, free software, OGC, geoportal, network services, GeoServer, deegree, GeoNetwork

  17. EPICS: porting iocCore to multiple operating systems.

    Energy Technology Data Exchange (ETDEWEB)

    Kraimer, M.

    1999-09-30

    An important component of EPICS (Experimental Physics and Industrial Control System) is iocCore, which is the core software in the IOC (input/output controller) front-end processors. Currently iocCore requires the vxWorks operating system. This paper describes the porting of iocCore to other operating systems.

  18. Adding Timing Requirements to the CODARTS Real-Time Software Design Method

    DEFF Research Database (Denmark)

    Bach, K.R.

    The CODARTS software design method considers how concurrent, distributed, and real-time applications can be designed. Although it accounts for the important issues of tasking and communication, the method does not provide means for expressing the timeliness of the tasks and communication directly...

  19. Meeting the International Health Regulations (2005) surveillance core capacity requirements at the subnational level in Europe

    DEFF Research Database (Denmark)

    Ziemann, Alexandra; Rosenkötter, Nicole; Riesgo, Luis Garcia-Castrillo;

    2015-01-01

    public health emergencies of international concern: (i) can syndromic surveillance support countries, especially at the subnational level, to meet the International Health Regulations (2005) core surveillance capacity requirements, (ii) are European syndromic surveillance systems comparable enough to enable cross-border surveillance, and (iii) at which administrative level should syndromic surveillance best be applied? DISCUSSION: Despite the ongoing criticism of the usefulness of syndromic surveillance, which is related to its clinically nonspecific output, we demonstrate that it was a suitable supplement for timely assessment of the impact of three different public health emergencies affecting Europe. Subnational syndromic surveillance analysis in some cases proved advantageous for detecting an event earlier compared to national-level analysis. However, in many cases, syndromic surveillance did not detect local...

  20. Multiple metal-binding cores are required for metalloregulation by M-box riboswitch RNAs.

    Science.gov (United States)

    Wakeman, Catherine A; Ramesh, Arati; Winkler, Wade C

    2009-09-25

    Riboswitches are regulatory RNAs that control downstream gene expression in response to direct association with intracellular metabolites or metals. Typically, riboswitch aptamer domains bind to a single small-molecule metabolite. In contrast, an X-ray crystallographic structural model for the M-box riboswitch aptamer revealed the absence of an organic metabolite ligand but the presence of at least six tightly associated magnesiums. This observation agrees well with the proposed role of the M-box riboswitch in functioning as a sensor of intracellular magnesium, although additional nonspecific metal interactions are also undoubtedly required for these purposes. To gain greater functional insight into the metalloregulatory capabilities of M-box RNAs, we sought to determine whether all or a subset of the RNA-chelated magnesium ions were required for riboswitch function. To accomplish this task, each magnesium-binding site was simultaneously yet individually perturbed through random incorporation of phosphorothioate nucleotide analogues, and RNA molecules were investigated for their ability to fold in varying levels of magnesium. These data revealed that all of the magnesium ions observed in the structural model are important for magnesium-dependent tertiary structure formation. Additionally, these functional data revealed a new core of potential metal-binding sites that are likely to assist formation of key tertiary interactions and were previously unobserved in the structural model. It is clear from these data that M-box RNAs require specific binding of a network of metal ions for partial fulfillment of their metalloregulatory functions.

  1. Preliminary input to the space shuttle reaction control subsystem failure detection and identification software requirements (uncontrolled)

    Science.gov (United States)

    Bergmann, E.

    1976-01-01

    The current baseline method and software implementation of the space shuttle reaction control subsystem failure detection and identification (RCS FDI) system is presented. This algorithm is recommended for inclusion in the redundancy management (RM) module of the space shuttle guidance, navigation, and control system. Supporting software is presented and recommended for inclusion in the system management (SM) and display and control (D&C) systems. RCS FDI uses data from sensors in the jets, in the manifold isolation valves, and in the RCS fuel and oxidizer storage tanks. A list of jet failures and fuel imbalance warnings is generated for use by the jet selection algorithm of the on-orbit and entry flight control systems, and to inform the crew and ground controllers of RCS failure status. Manifold isolation valve close commands are generated in the event of failed-on or leaking jets to prevent loss of large quantities of RCS fuel.

  2. A Research Agenda for Identifying and Developing Required Competencies in Software Engineering

    Directory of Open Access Journals (Sweden)

    Yvonne Sedelmaier

    2013-04-01

    Full Text Available Various issues make learning and teaching software engineering a challenge for both students and instructors. Since there are no standard curricula and no cookbook recipes for successful software engineering, it is fairly hard to figure out which specific topics and competencies should be learned or acquired by a particular group of students. Furthermore, it is not clear which particular didactic approaches might work well for a specific topic and a particular group of students. This contribution presents a research agenda that aims at identifying relevant competencies and environmental constraints as well as their effect on learning and teaching software engineering. To that end, an experimental approach will be taken. As a distinctive feature, this approach iteratively introduces additional or modified didactical methods into existing courses and carefully evaluates their appropriateness. Thus, it continuously improves these methods.

  3. Estimation of Barley (Hordeum vulgare L.) Crop Water Requirements Using CROPWAT Software in the Ksar-Chellala Region, Algeria

    Directory of Open Access Journals (Sweden)

    M. B. Laouisset

    2016-09-01

    Full Text Available This paper estimates the reference evapotranspiration (ET0) and the water requirements of barley (Hordeum vulgare L.) in the Ksar-Chellala region, Algeria, for one dry year using the CROPWAT software. Determination of evapotranspiration (ET) is important in applications such as irrigation design, irrigation scheduling, water resource management, hydrology, and cropping systems modeling. The estimation of the crop water requirements of barley (CWRb) followed the methodology adopted by the development and management service of FAO, based on the use of the CROPWAT 8.0 software. The total water requirements for barley depend on a variety of target yields and crop management. The climatic data used cover 23 years (1990-2012); the average rainfall over this period is 254 mm, and the total rainfall of the dry year is 190 mm. The results show that, over the six-month vegetative cycle of barley, the calculated ET0 is 453 mm, the potential water use of the barley crop is estimated at 281.4 mm, the effective rainfall is 69 mm, and the total water requirement of barley (CWRb) equals 211 mm, an amount distributed over three months that coincide with important development stages of barley. Supplementary irrigation under these conditions, with optimal amounts equal to the water requirements estimated by the CROPWAT software, significantly increases the grain yield of barley. Consequently, the gross irrigation water requirements (GIWR) of the 1,250,000 ha projected for barley in the Algerian steppe regions are estimated at 3.77 billion for a dry year and an irrigation efficiency of 70%.
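
The FAO-style bookkeeping that CROPWAT automates can be sketched in a few lines: crop evapotranspiration is ETc = Kc × ET0 per period, and the net irrigation requirement is the part of ETc not covered by effective rainfall. The stage lengths, Kc values, and monthly figures below are illustrative assumptions, not the paper's data:

```python
def crop_et(et0_mm, kc):
    """Monthly crop evapotranspiration ETc = Kc * ET0 (mm/month)."""
    return [k * e for k, e in zip(kc, et0_mm)]

def net_irrigation(etc_mm, eff_rain_mm):
    """Monthly net irrigation need: the part of ETc not met by rain."""
    return [max(0.0, e - r) for e, r in zip(etc_mm, eff_rain_mm)]

# Six months of a barley cycle -- illustrative values, not the paper's data.
et0 = [40, 50, 70, 90, 110, 93]       # reference evapotranspiration, mm/month
kc = [0.3, 0.5, 0.9, 1.15, 1.1, 0.4]  # crop coefficient: initial -> mid -> late
rain = [20, 18, 12, 8, 5, 6]          # effective rainfall, mm/month

etc = crop_et(et0, kc)
need = net_irrigation(etc, rain)
total_need = sum(need)  # seasonal net irrigation requirement, mm
```

Summing the monthly shortfalls gives the seasonal supplementary irrigation figure, the same quantity the abstract reports as CWRb minus effective rainfall.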

  4. RELAP-7 Software Verification and Validation Plan - Requirements Traceability Matrix (RTM) Part 2: Code Assessment Strategy, Procedure, and RTM Update

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan for assuring the independence of RELAP-7 assessment throughout the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. The Requirements Traceability Matrices (RTMs) proposed in a previous document (INL-EXT-15-36684) are then updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.

  5. Software Requirements Specification of the UIFA's UUIS -- a Team 4 COMP5541-W10 Project Approach

    CERN Document Server

    Alhazmi, Ali; Liu, Bing; Oliveira, Deyvisson; Sobh, Kanj; Mayantz, Max; de Bled, Robin; Zhang, Yu Ming

    2010-01-01

    This document presents the business requirements of the Unified University Inventory System (UUIS) in a technology-independent manner. All attempts have been made to use mostly business terminology and business language while describing the requirements; only minimal and commonly understood technical terminology is used. A use-case approach is used to model the business requirements in this document.

  6. Applicability of SREM to the Verification of Management Information System Software Requirements. Volume I.

    Science.gov (United States)

    1981-04-30

    approach during R ,ET development is required during the verification effort. The approach used for verifying the MOM 3FER to prepare for TD . X was to... the value resident in TRACK NR is equal to the value resident in TRACK NR 'N. The portion of the VMH requirement described above requires that the

  7. HSC90 is required for nascent hepatitis C virus core protein stability in yeast cells.

    Science.gov (United States)

    Kubota, Naoko; Inayoshi, Yasutaka; Satoh, Naoko; Fukuda, Takashi; Iwai, Kenta; Tomoda, Hiroshi; Kohara, Michinori; Kataoka, Kazuhiro; Shimamoto, Akira; Furuichi, Yasuhiro; Nomoto, Akio; Naganuma, Akira; Kuge, Shusuke

    2012-07-30

    Hepatitis C virus core protein (Core) contributes to HCV pathogenicity. Here, we demonstrate that Core impairs growth in budding yeast. We identify HSP90 inhibitors as compounds that reduce intracellular Core protein level and restore yeast growth. Our results suggest that HSC90 (Hsc82) may function in the protection of the nascent Core polypeptide against degradation in yeast and the C-terminal region of Core corresponding to the organelle-interaction domain was responsible for Hsc82-dependent stability. The yeast system may be utilized to select compounds that can direct the C-terminal region to reduce the stability of Core protein. Copyright © 2012 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  8. Specification of problems from the business goals in the context of early software requirements elicitation

    Directory of Open Access Journals (Sweden)

    Carlos Mario Zapata-J.

    2014-01-01

    Full Text Available One of the main activities of early software requirements elicitation is the recognition and specification of the organization's problems. This activity aims at defining the initial requirements and satisfying stakeholder needs. These problems must be related to the goals of the organization in order to obtain a software application that is contextualized and aligned with the organization's mission. In current goal- and problem-based elicitation methods, these relationships are detected with the help of the experience and knowledge of the analyst and the stakeholder; however, traceability between goals and problems has not yet been achieved. In this paper we propose a method for specifying problems from organizational goals. The method comprises a set of syntactic and semantic rules that the analyst uses to express problems from goal statements. We also present a laboratory example based on a KAOS goal diagram.

  9. A Functional Core of IncA Is Required for Chlamydia trachomatis Inclusion Fusion.

    Science.gov (United States)

    Weber, Mary M; Noriea, Nicholas F; Bauler, Laura D; Lam, Jennifer L; Sager, Janet; Wesolowski, Jordan; Paumet, Fabienne; Hackstadt, Ted

    2016-04-01

    Chlamydia trachomatis is an obligate intracellular pathogen that is the etiological agent of a variety of human diseases, including blinding trachoma and sexually transmitted infections. Chlamydiae replicate within a membrane-bound compartment, termed an inclusion, which they extensively modify by the insertion of type III secreted proteins called Inc proteins. IncA is an inclusion membrane protein that encodes two coiled-coil domains that are homologous to eukaryotic SNARE (soluble N-ethylmaleimide-sensitive factor attachment receptor) motifs. Recent biochemical evidence suggests that a functional core, composed of SNARE-like domain 1 (SLD-1) and part of SNARE-like domain 2 (SLD-2), is required for the characteristic homotypic fusion of C. trachomatis inclusions in multiply infected cells. To verify the importance of IncA in homotypic fusion in Chlamydia, we generated an incA::bla mutant. Insertional inactivation of incA resulted in the formation of nonfusogenic inclusions, a phenotype that was completely rescued by complementation with full-length IncA. Rescue of homotypic inclusion fusion was dependent on the presence of the functional core consisting of SLD-1 and part of SLD-2. Collectively, these results confirm in vitro membrane fusion assays identifying functional domains of IncA and expand the genetic tools available for identification of chlamydia with a method for complementation of site-specific mutants. Chlamydia trachomatis replicates within a parasitophorous vacuole termed an inclusion. The chlamydial inclusions are nonfusogenic with vesicles in the endocytic pathway but, in multiply infected cells, fuse with each other to form a single large inclusion. This homotypic fusion is dependent upon the presence of a chlamydial inclusion membrane-localized protein, IncA. 
Specificity of membrane fusion in eukaryotic cells is regulated by SNARE (soluble N-ethylmaleimide sensitive factor attachment receptor) proteins on the cytosolic face of vesicles and target

  10. Handling requirements dependencies in agile projects: a focus group with agile software development practitioners

    NARCIS (Netherlands)

    Martakis, Aias; Daneva, Maya; Wieringa, R.J.; Jean-Louis Cavarero, S.; Rolland, C.; Cavarero, J.-L.

    2013-01-01

    Agile practices on requirements dependencies are a relatively unexplored topic in literature. Empirical studies on it are scarce. This research sets out to uncover concepts that practitioners in companies of various sizes across the globe and in various industries, use for dealing with requirements

  11. An evaluation of the metal fuel core performance for commercial FBR requirements

    Energy Technology Data Exchange (ETDEWEB)

    Ohta, H.; Yokoo, T. [Central Research Inst. of Electric Power Industry, Komae, Tokyo (Japan)

    2001-07-01

    Neutronic and thermal-hydraulic design studies are conducted on 3,900 MWt and 800 MWt metal fuel fast breeder reactor (FBR) cores that achieve a high burnup of 150 GWd/t, which is considered one of the goals of future commercial FBRs. The results show that a large-scale metal fuel core with a homogeneous configuration is a promising design for future commercial FBRs from the viewpoint of economics, core safety, effective use of uranium resources, and reduction of environmental load. In small-scale high-burnup cores, a radial heterogeneous design that can improve neutronic performance and safety parameters is suitable. (author)

  12. Application Design of an STM32 Dual-core Board and Slave-core Software Upgrade via ISP

    Institute of Scientific and Technical Information of China (English)

    綦声波; 刘英男; 王圣南; 刘群

    2015-01-01

    Aiming at MCU application problems such as limited resources, poor reliability, and software upgrading, a dual-core board based on the Cortex-M3 kernel is designed in this paper. The dual-core board nearly doubles the available MCU resources without increasing the difficulty of development, and it improves the reliability of the whole system through a rational division of tasks and mutual supervision between the two MCUs. The upgrade program can be transmitted between the master core and the host computer over the CAN bus, and slave-core software upgrading based on ISP is realized through a control pin.
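
The mutual-supervision idea above (each core watches its peer and flags a fault if the peer stops responding) can be modeled abstractly: each core posts a heartbeat counter, and its supervisor declares a fault if the counter stalls for too many of its own ticks. This is a hypothetical plain-Python model of the scheme, not the paper's firmware; a real dual-MCU design would exchange heartbeats over GPIO or CAN rather than shared memory:

```python
class Supervisor:
    """One core's watchdog view of its peer's heartbeat counter."""

    def __init__(self, timeout_ticks=3):
        self.timeout = timeout_ticks
        self.last_seen = None   # last heartbeat value observed
        self.stale_ticks = 0    # local ticks since the counter last changed

    def observe(self, peer_heartbeat):
        """Call once per local tick; returns True if the peer looks hung."""
        if peer_heartbeat != self.last_seen:
            self.last_seen = peer_heartbeat
            self.stale_ticks = 0
        else:
            self.stale_ticks += 1
        return self.stale_ticks >= self.timeout

master = Supervisor()
# Peer alive: its counter advances every tick, so no fault is raised.
alive = [master.observe(n) for n in (1, 2, 3, 4)]
# Peer hangs at 4: the counter stops advancing and the fault latches.
hung = [master.observe(4) for _ in range(4)]
```

Running the model shows the fault flag staying clear while the counter advances and tripping after three stale ticks, which is the behavior the mutual-supervision scheme relies on to trigger a reset or an ISP re-flash of the hung core.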

  13. Core requirements for successful data linkage: an example of a triangulation method.

    Science.gov (United States)

    Hopf, Y M; Francis, J; Helms, P J; Haughney, J; Bond, C

    2016-10-21

    The aim was to explore the views of professional stakeholders and healthcare professionals (HCPs) on the linkage of UK National Health Service (NHS) data for paediatric pharmacovigilance purposes and to make recommendations for such a system. A mixed methods approach including a literature review, interviews, focus groups and a three-round Delphi survey with HCPs in Scotland was followed by a triangulation process using a systematic protocol. The survey was structured using the Theoretical Domains Framework of behaviour change. Items retained after applying the matrix-based triangulation process were thematically coded. Ethical approval was granted by the North of Scotland Research Ethics Service. Results from 18 papers, 23 interviewees, 23 participants of focus groups and 61 completed questionnaires in the Delphi survey contributed to the triangulation process. A total of 25 key findings from all four studies were identified during triangulation. There was good convergence; 21 key findings were agreed and remained to inform recommendations. The items were coded as practical/technical (eg, decision about the unique patient identifier to use), mandatory (eg, governed by statute), essential (consistently mentioned in all studies and therefore needed to ensure professional support) or preferable. The development of a paediatric linked database has support from professional stakeholders and HCPs in Scotland. The triangulation identified three sets of core requirements for a new system of data linkage. An additional fourth set of 'preferable' requirements might increase engagement of HCPs and their support for the new system. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  14. Hardware/Software Co-design for Heterogeneous Multi-core Platforms: The hArtes Toolchain

    CERN Document Server

    2012-01-01

    This book describes the results and outcome of the FP6 project known as hArtes, which focuses on the development of an integrated toolchain targeting a heterogeneous multi-core platform comprising a general-purpose processor (ARM or PowerPC), a DSP (the Diopsis), and an FPGA. The toolchain takes existing source code and proposes transformations and mappings such that legacy code can easily be ported to a modern multi-core platform. Benefits of the hArtes approach, described in this book, include: Uses a familiar programming paradigm: hArtes proposes a familiar programming paradigm which is compatible with widely used programming practice, irrespective of the target platform. Enables users to view multiple cores as a single processor: the hArtes approach abstracts away the heterogeneity as well as the multi-core aspect of the underlying hardware, so the developer can view the platform as consisting of a single general-purpose processor. Facilitates easy porting of existing applications: hArtes provid...

  15. Software requirements specification for the program analysis and control system risk management module

    Energy Technology Data Exchange (ETDEWEB)

    SCHAEFER, J.C.

    1999-06-02

    TWR Program Analysis and Control System Risk Module is used to facilitate specific data processes surrounding the Risk Management program of the Tank Waste Retrieval environment. This document contains the Risk Management system requirements of the database system.

  16. Family Skills for General Psychiatry Residents: Meeting ACGME Core Competency Requirements

    Science.gov (United States)

    Berman, Ellen M.; Heru, Alison M.; Grunebaum, Henry; Rolland, John; Wood, Beatrice; Bruty, Heidi

    2006-01-01

    Objective: The authors discuss the knowledge, attitudes, and skills needed for a resident to be competent in supporting and working with families, as mandated by the residency review committee (RRC) core competencies. Methods: The RRC core competencies, as they relate to patients and their families, are reviewed. The Group for Advancement of…

  17. Comparison of the Number of Image Acquisitions and Procedural Time Required for Transarterial Chemoembolization of Hepatocellular Carcinoma with and without Tumor-Feeder Detection Software

    Directory of Open Access Journals (Sweden)

    Jin Iwazawa

    2013-01-01

    Full Text Available Purpose. To compare the number of image acquisitions and the procedural time required for transarterial chemoembolization (TACE) with and without tumor-feeder detection software in cases of hepatocellular carcinoma (HCC). Materials and Methods. We retrospectively reviewed 50 cases involving software-assisted TACE (September 2011-February 2013) and 84 cases involving TACE without software assistance (January 2010-August 2011). We compared the number of image acquisitions, the overall procedural time, and the therapeutic efficacy in both groups. Results. The number of angiography acquisitions per session decreased from 6.6 to 4.6 with software assistance (P<0.001). Total image acquisitions decreased significantly from 10.4 to 8.7 with software usage (P=0.004). The mean procedural time required for a single session of software-assisted TACE (103 min) was significantly lower than that for a session without software (116 min, P=0.021). The complete (68% versus 63%, respectively) and objective (78% versus 80%, respectively) response rates did not differ significantly between TACE with and without software usage. Conclusion. In comparison with software-unassisted TACE, automated feeder-vessel detection software-assisted TACE for HCC involved fewer image acquisitions and could be completed faster while maintaining a comparable treatment response.

  18. Impact of Software Requirement Volatility Pattern on Project Dynamics: Evidences from a Case Study

    CERN Document Server

    Thakurta, Rahul

    2011-01-01

    Requirements are found to change in various ways during the course of a project, which can affect the process in widely different manners and extents. Here we present a case study in which we investigate the impact of the requirement volatility pattern on project performance. The project setting described in the case is emulated on a validated system dynamics model representing the waterfall model. The findings indicate deviations of the project outcome from the estimates, thereby corroborating previous findings. The results reinforce the applicability of the system dynamics approach to analyzing project performance under requirement volatility, which is expected to speed up its adoption in organizations and, in the process, contribute to more project successes.

  19. Transfer of neuroplasticity from nucleus accumbens core to shell is required for cocaine reward.

    Directory of Open Access Journals (Sweden)

    Nicolas Marie

    Full Text Available It is well established that cocaine induces an increase in dendritic spine density in some brain regions. However, few studies have addressed the role of these neuroplastic changes in cocaine's rewarding effects, and they have often led to contradictory results. We therefore hypothesized that a rigorous time- and subject-matched protocol would demonstrate the role of this spine increase in cocaine reward. We designed our experiments so that the same animals (rats) were used for spine analysis and behavioral studies. Cocaine's rewarding effects were assessed with the conditioned place preference paradigm. Spine densities were measured in the two subdivisions of the nucleus accumbens (NAcc): core and shell. We showed a correlation between the increase in spine density in the NAcc core and shell and cocaine's rewarding effects. Interestingly, when cocaine was administered in home cages, spine density increased in the NAcc core only. With anisomycin, a protein synthesis inhibitor, injected into the core, we blocked the spine increase in core and shell and also cocaine's rewarding effects. Strikingly, whereas injection of this inhibitor into the shell immediately after conditioning had no effect on neuroplasticity or behavior, its injection 4 hours after conditioning blocked neuroplasticity in the shell only as well as cocaine-induced place preference. Thus, it clearly appears that neuronal plasticity in the NAcc core is essential to induce plasticity in the shell, which is necessary for cocaine reward. Altogether, our data reveal a new mechanism of NAcc functioning in which a neuroplasticity transfer occurs from core to shell.

  20. Core Placement Algorithm for Multicast Routing with QoS Requirements

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Differing from source-oriented algorithms, the Core-Based Tree (CBT) multicast routing architecture establishes a single shared tree for multiple connections in a multicast group, which results in a higher ratio of network resource utilization. Addressing the core placement problem, we propose a simple method (QOCP) for locating an optimal core node that minimizes the multicast delay and the inter-destination delay variation simultaneously. The simulation results show that our method is very effective and outperforms the other algorithms studied in this paper.
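
The abstract does not give QOCP's details; the sketch below (graph representation, cost weighting, and example network are all assumptions for illustration) shows the underlying objective of choosing a core that jointly minimizes the maximum multicast delay and the inter-destination delay variation:

```python
import heapq

def shortest_delays(graph, src):
    """Dijkstra's algorithm: minimum delay from src to every reachable node."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def place_core(graph, members):
    """Pick the core minimizing (max delay) + (inter-destination delay spread)."""
    best, best_cost = None, float("inf")
    for cand in graph:
        d = shortest_delays(graph, cand)
        delays = [d[m] for m in members]
        cost = max(delays) + (max(delays) - min(delays))
        if cost < best_cost:
            best, best_cost = cand, cost
    return best

# Hypothetical four-node network with symmetric link delays
net = {"A": {"B": 1, "C": 3}, "B": {"A": 1, "C": 1},
       "C": {"A": 3, "B": 1, "D": 1}, "D": {"C": 1}}
core = place_core(net, ["A", "D"])  # a middle node, not an endpoint
```

An exhaustive scan over candidates is O(V·E log V); the point of a placement heuristic like QOCP is to approach the same optimum more cheaply on large topologies.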

  1. Rab18 is required for viral assembly of hepatitis C virus through trafficking of the core protein to lipid droplets.

    Science.gov (United States)

    Dansako, Hiromichi; Hiramoto, Hiroki; Ikeda, Masanori; Wakita, Takaji; Kato, Nobuyuki

    2014-08-01

    During persistent infection of HCV, the HCV core protein (HCV-JFH-1 strain of genotype 2a) is recruited to lipid droplets (LDs) for viral assembly, but the mechanism of recruitment of the HCV core protein is uncertain. Here, we demonstrated that one of the Ras-related small GTPases, Rab18, was required for trafficking of the core protein around LDs. The knockdown of Rab18 reduced intracellular and extracellular viral infectivity, but not intracellular viral replication in HCV-JFH-1-infected RSc cells (an HuH-7-derived cell line). Exogenous expression of Rab18 increased extracellular viral infectivity almost two-fold. Furthermore, Rab18 was co-localized with the core protein in HCV-JFH-1-infected RSc cells, and the knockdown of Rab18 blocked recruitment of the HCV-JFH-1 core protein to LDs. These results suggest that Rab18 has an important role in viral assembly through the trafficking of the core protein to LDs.

  2. Requirement of cellular DDX3 for hepatitis C virus replication is unrelated to its interaction with the viral core protein.

    Science.gov (United States)

    Angus, Allan G N; Dalrymple, David; Boulant, Steeve; McGivern, David R; Clayton, Reginald F; Scott, Martin J; Adair, Richard; Graham, Susan; Owsianka, Ania M; Targett-Adams, Paul; Li, Kui; Wakita, Takaji; McLauchlan, John; Lemon, Stanley M; Patel, Arvind H

    2010-01-01

    The cellular DEAD-box protein DDX3 was recently shown to be essential for hepatitis C virus (HCV) replication. Prior to that, we had reported that HCV core binds to DDX3 in yeast two-hybrid and transient transfection assays. Here, we confirm by co-immunoprecipitation that this interaction occurs in cells replicating the JFH1 virus. Consistent with this result, immunofluorescence staining of infected cells revealed a dramatic redistribution of cytoplasmic DDX3 by core protein to the virus assembly sites around lipid droplets. Given this close association of DDX3 with core and lipid droplets, and its involvement in virus replication, we investigated the importance of this host factor in the virus life cycle. Mutagenesis studies located a single amino acid in the N-terminal domain of JFH1 core that, when changed to alanine, significantly abrogated this interaction. Surprisingly, this mutation did not alter infectious virus production and RNA replication, indicating that the core-DDX3 interaction is dispensable in the HCV life cycle. Consistent with previous studies, siRNA-mediated knockdown of DDX3 lowered virus production and RNA replication levels of both WT JFH1 and the mutant virus unable to bind DDX3. Thus, our study shows for the first time that the requirement of DDX3 for HCV replication is unrelated to its interaction with the viral core protein.

  3. SATISFACTION OF QUALIFICATION REQUIREMENTS OF EMPLOYERS APPLIED TO SOFTWARE ENGINEERS IN THE PROCESS OF TRAINING AT HIGHER EDUCATIONAL INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Vladislav Kruhlyk

    2017-03-01

    Full Text Available In the article, based on an analysis of the problems of the professional training of software engineers in higher educational institutions, it is shown that the contents of the curricula for training software engineers in basic IT specialties generally meet labor-market requirements. It is noted that the job market is currently changing, not only through increasing demand for IT professionals but also through changed requirements for future specialists. In scientists' opinion, there is at present a gap between employers' expectations and the education level of graduates of university IT specialties. Due to the extremely fast pace of IT development, students' knowledge may become obsolete even by the end of their studies. At issue is the complex of competencies a university develops during specialist training, and its relevance and competitiveness at the labor market. At the same time, the practical training of students does not fully correspond to the current state of information technology. Therefore, it is necessary to update the contents of the academic disciplines with the aim of providing quality training of specialists.

  4. Targeted MRI-guided prostate biopsy: are two biopsy cores per MRI-lesion required?

    Energy Technology Data Exchange (ETDEWEB)

    Schimmoeller, L.; Quentin, M.; Blondin, D.; Dietzel, F.; Schleich, C.; Thomas, C.; Antoch, G. [University Dusseldorf, Medical Faculty, Department of Diagnostic and Interventional Radiology, Dusseldorf (Germany); Hiester, A.; Rabenalt, R.; Albers, P.; Arsov, C. [University Dusseldorf, Medical Faculty, Department of Urology, Dusseldorf (Germany); Gabbert, H.E. [University Dusseldorf, Medical Faculty, Department of Pathology, Dusseldorf (Germany)

    2016-11-15

    This study evaluates the feasibility of performing fewer than two core biopsies per MRI-lesion when performing targeted MR-guided in-bore prostate biopsy. Retrospectively evaluated were 1545 biopsy cores of 774 intraprostatic lesions (two cores per lesion) in 290 patients (66 ± 7.8 years; median PSA 8.2 ng/ml) regarding prostate cancer (PCa) detection, Gleason score, and tumor infiltration of the first biopsy core (FBC) compared to the second biopsy core (SBC). Biopsies were acquired under in-bore MR guidance. Of the biopsy cores, 491 were PCa positive: 239 of 774 (31 %) FBC and 252 of 771 (33 %) SBC (p = 0.4). Patient-level PCa detection rates based on the FBC vs. SBC were 46 % vs. 48 % (p = 0.6). For clinically significant PCa (Gleason score ≥4 + 3 = 7) the detection rate was 18 % for both FBC and SBC (p = 0.9). Six hundred and eighty-seven SBC (89 %) showed no histologic difference. On the lesion level, 40 SBC detected PCa with a negative FBC (7.5 %). Twenty SBC showed a Gleason upgrade from 3 + 3 = 6 to ≥3 + 4 = 7 (2.6 %), and 4 to ≥4 + 3 = 7 (0.5 %). The benefit of a second targeted biopsy core per suspicious MRI-lesion is likely minor, especially regarding the PCa detection rate and significant Gleason upgrading. Therefore, a further reduction of biopsy cores is reasonable when performing a targeted MR-guided in-bore prostate biopsy. (orig.)

  5. Software system requirements for the Army Tactical Missile System (ATACMS) End-To-End System using the Computer Aided Prototyping System (CAPS) multi-file approach

    OpenAIRE

    Angrisani, David Stuart; Whitbeck, George Steven.

    1996-01-01

    The Department of Defense (DOD) is seeking software system requirements for the Army Tactical Missile System (ATACMS) End to End System, which comprises both ATACMS and all sensors, links, and command centers that enable integration across system and service boundaries. The complexity, multiple interfaces, and joint nature of planned ATACMS operations demand accurate specification of software system requirements. DOD also desires automated tools capable of developing rapid prototypes to ass...

  6. Analysis of software engineering requirement analysis and structure construction

    Institute of Scientific and Technical Information of China (English)

    任延璞

    2016-01-01

    Software engineering has developed rapidly and is applied in many different fields, each with its own development standards; certain differences likewise exist in software structure construction. By explaining the meaning of software engineering requirement analysis, this paper analyzes the importance of requirement analysis in software engineering and the existing problems in it, and then discusses software engineering requirement analysis and structure construction, aiming to provide some ideas for promoting their orderly development.

  7. Solid Waste Information Tracking System (SWITS), Backlog Waste Modifications, Software Requirements Specification (SRS)

    Energy Technology Data Exchange (ETDEWEB)

    Clark, R.E. [USDOE Richland Operations Office, WA (United States)

    1995-05-05

    The purpose of this document is to define the system requirements necessary to improve computer support for the WHC backlog waste business process through enhancements to the backlog waste function of the SWITS system. This SRS document covers enhancements to the SWITS system to support changes to the existing Backlog Waste screens, including new data elements, label changes, and new pop-up screens. The pop-ups will allow the user to flag the processes that must be performed on a waste container and will provide history tracking of changes to data. A new screen will also be provided allowing Acceptable Services to perform mass updates to specific data in the Backlog Waste table. The SWITS Backlog Waste enhancements in this document will support the project goals in WHC-SD-WM-003 and its Revision 1 (Radioactive Solid Waste Tracking System Conceptual Definition) for the control, tracking, and inventory management of waste as packages are generated and moved through final disposal (cradle-to-grave).

  8. Cronos 2: a neutronic simulation software for reactor core calculations; Cronos 2: un logiciel de simulation neutronique des coeurs de reacteurs

    Energy Technology Data Exchange (ETDEWEB)

    Lautard, J.J.; Magnaud, C.; Moreau, F.; Baudron, A.M. [CEA Saclay, Dept. de Mecanique et de Technologie (DMT/SERMA), 91 - Gif-sur-Yvette (France)

    1999-07-01

    The CRONOS2 software is the part of the SAPHYR code system dedicated to neutronic core calculations. CRONOS2 is a powerful tool for reactor design, fuel management, and safety studies. Its modular structure and great flexibility make CRONOS2 a unique simulation tool for research and development on a wide variety of reactor systems. CRONOS2 is a versatile tool that covers a large range of applications, from the very fast calculations used in training simulators to the time- and memory-consuming reference calculations needed to understand complex physical phenomena. CRONOS2 has a procedure library named CPROC that allows users to create their own application environment fitted to a specific industrial use. (authors)

  9. Is American Teacher Education Fully up to the Common Core Requirements?

    Science.gov (United States)

    Murray, Frank B.

    2014-01-01

    This narrative considers how well equipped today's teacher education students and faculty are to meet the demands of the new Common Core State Standards. Data from the Teacher Education Accreditation Council's national evaluation of teacher education programs gives a mixed picture that, while mostly encouraging, also reveals that some…

  10. A National Survey of Core Course Requirements, Department Names, and Undergraduate Program Titles in Communication.

    Science.gov (United States)

    King, Corwin P.

    1998-01-01

    Assesses the typical number of core courses departments have, the most commonly used ones, and, in a general way, their contents. Provides data on department names and undergraduate program areas for a picture of the communication disciplines in the 1990s. (RS)

  11. Software and systems traceability

    CERN Document Server

    Cleland-Huang, Jane; Zisman, Andrea

    2012-01-01

    "Software and Systems Traceability" provides a comprehensive description of the practices and theories of software traceability across all phases of the software development lifecycle. The term software traceability is derived from the concept of requirements traceability. Requirements traceability is the ability to track a requirement all the way from its origins to the downstream work products that implement that requirement in a software system. Software traceability is defined as the ability to relate the various types of software artefacts created during the development of software syst

  12. Design of a computer software for calculation of required barrier against radiation at the diagnostic x-ray units

    Directory of Open Access Journals (Sweden)

    S.A. Rahimi

    2005-01-01

    Full Text Available Background and purpose: Installation of protective barriers against diagnostic x-rays is generally done based on the recommendations of NCRP 49. Analytic methods for designing protective barriers exist; however, they lack sufficient efficiency. According to the NCRP 49 reports, the primary x-ray beam and the scattered and leakage radiation differ in beam quality and absorption; therefore, the protective barrier for each type of radiation is calculated separately. In this study, a computer software was designed to calculate the needed barrier with high accuracy. Materials and methods: Calculating the required protective barrier is time-consuming and practically impossible to do manually, particularly when two or more generators are in use at a diagnostic x-ray unit, when the installed equipment does not have proper room space, or when other parameter changes impose limitations. For proper determination of the thickness of the protective barrier, relevant information such as radiation attenuation curves and dose limits should be entered. The program was written for Windows and designed so that the operator works easily; its flexibility is acceptable, and its accuracy and sensitivity are high. Results: Results of this program indicate that, in most cases, x-ray units did not use the required protective barrier, while sometimes the shielding is more than required, which lacks technical standards and cost effectiveness. When the application index differs from zero, the thickness calculated per NCRP 49 is about 20% less than that calculated by the method of this study. When the application index is equal to zero (that is, the only situation where the secondary barrier is considered), the thickness of the required lead barrier is about 15% less, and that of the concrete barrier calculated in this project 8% less, than those calculated by the McGuire method. Conclusion: In this study proper
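
As a hedged illustration of the kind of calculation such software automates: the first function is the standard NCRP 49 broad-beam transmission relation, but the workload, dose-limit, and tenth-value-layer numbers below are invented for the example, not actual NCRP data:

```python
import math

def required_transmission(P, d, W, U, T):
    """NCRP 49 primary-barrier transmission factor: B = P * d^2 / (W * U * T).
    P: weekly design dose limit, d: distance to occupied area (m),
    W: workload (mA-min/week), U: use factor, T: occupancy factor."""
    return P * d * d / (W * U * T)

def barrier_thickness(B, tvl):
    """Thickness via the number of tenth-value layers, n = -log10(B)."""
    n = max(-math.log10(B), 0.0)  # no barrier needed if B >= 1
    return n * tvl

# Illustrative numbers only:
B = required_transmission(P=0.02, d=3.0, W=400.0, U=0.25, T=1.0)  # 0.0018
t_mm = barrier_thickness(B, tvl=0.28)  # assumed lead TVL in mm, ~0.77 mm
```

A real implementation, like the software described above, would additionally distinguish primary from secondary (scatter and leakage) barriers and interpolate measured attenuation curves per tube voltage rather than assume a constant tenth-value layer.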

  13. The N-terminus of murine leukaemia virus p12 protein is required for mature core stability.

    Directory of Open Access Journals (Sweden)

    Darren J Wight

    2014-10-01

    Full Text Available The murine leukaemia virus (MLV) gag gene encodes a small protein called p12 that is essential for the early steps of viral replication. The N- and C-terminal regions of p12 are sequentially acting domains, both required for p12 function. Defects in the C-terminal domain can be overcome by introducing a chromatin binding motif into the protein. However, the function of the N-terminal domain remains unknown. Here, we undertook a detailed analysis of the effects of p12 mutation on incoming viral cores. We found that both reverse transcription complexes and isolated mature cores from N-terminal p12 mutants have altered capsid complexes compared to wild type virions. Electron microscopy revealed that mature N-terminal p12 mutant cores have different morphologies, although immature cores appear normal. Moreover, in immunofluorescent studies, both p12 and capsid proteins were lost rapidly from N-terminal p12 mutant viral cores after entry into target cells. Importantly, we determined that p12 binds directly to the MLV capsid lattice. However, we could not detect binding of an N-terminally altered p12 to capsid. Altogether, our data imply that p12 stabilises the mature MLV core, preventing premature loss of capsid, and that this is mediated by direct binding of p12 to the capsid shell. In this manner, p12 is also retained in the pre-integration complex where it facilitates tethering to mitotic chromosomes. These data also explain our previous observations that modifications to the N-terminus of p12 alter the ability of particles to abrogate restriction by TRIM5alpha and Fv1, factors that recognise viral capsid lattices.

  14. The N-Terminus of Murine Leukaemia Virus p12 Protein Is Required for Mature Core Stability

    Science.gov (United States)

    Wight, Darren J.; Boucherit, Virginie C.; Wanaguru, Madushi; Elis, Efrat; Hirst, Elizabeth M. A.; Li, Wilson; Ehrlich, Marcelo; Bacharach, Eran; Bishop, Kate N.

    2014-01-01

    The murine leukaemia virus (MLV) gag gene encodes a small protein called p12 that is essential for the early steps of viral replication. The N- and C-terminal regions of p12 are sequentially acting domains, both required for p12 function. Defects in the C-terminal domain can be overcome by introducing a chromatin binding motif into the protein. However, the function of the N-terminal domain remains unknown. Here, we undertook a detailed analysis of the effects of p12 mutation on incoming viral cores. We found that both reverse transcription complexes and isolated mature cores from N-terminal p12 mutants have altered capsid complexes compared to wild type virions. Electron microscopy revealed that mature N-terminal p12 mutant cores have different morphologies, although immature cores appear normal. Moreover, in immunofluorescent studies, both p12 and capsid proteins were lost rapidly from N-terminal p12 mutant viral cores after entry into target cells. Importantly, we determined that p12 binds directly to the MLV capsid lattice. However, we could not detect binding of an N-terminally altered p12 to capsid. Altogether, our data imply that p12 stabilises the mature MLV core, preventing premature loss of capsid, and that this is mediated by direct binding of p12 to the capsid shell. In this manner, p12 is also retained in the pre-integration complex where it facilitates tethering to mitotic chromosomes. These data also explain our previous observations that modifications to the N-terminus of p12 alter the ability of particles to abrogate restriction by TRIM5alpha and Fv1, factors that recognise viral capsid lattices. PMID:25356837

  15. Core competency requirements among extension workers in peninsular Malaysia: Use of Borich's needs assessment model.

    Science.gov (United States)

    Umar, Sulaiman; Man, Norsida; Nawi, Nolila Mohd; Latif, Ismail Abd; Samah, Bahaman Abu

    2017-06-01

    The study described the perceived importance of, and proficiency in, core agricultural extension competencies among extension workers in Peninsular Malaysia, and evaluated the resultant deficits in those competencies. The Borich Needs Assessment Model was used to achieve the objectives of the study. A sample of 298 respondents was randomly selected and interviewed using a pre-tested structured questionnaire. Thirty-three core competency items were assessed. Instrument validity and reliability were ensured. The cross-sectional data obtained were analysed using SPSS for descriptive statistics, including the mean weighted discrepancy score (MWDS). Results of the study showed that on a scale of 5, the most important core extension competency items according to respondents' perception were: "Making good use of information and communication technologies/access and use of web-based resources" (M=4.86, SD=0.23); "Conducting needs assessments" (M=4.84, SD=0.16); "Organizing extension campaigns" (M=4.82, SD=0.47); and "Managing groups and teamwork" (M=4.81, SD=0.76). In terms of proficiency, the highest competency identified by the respondents was "Conducting farm and home visits" (M=3.62, SD=0.82), followed by "Conducting meetings effectively" (M=3.19, SD=0.72); "Conducting focus group discussions" (M=3.16, SD=0.32); and "Conducting community forums" (M=3.13, SD=0.64). The discrepancies implying competency deficits were widest in "Acquiring and allocating resources" (MWDS=12.67); use of information and communication technologies (ICTs) and web-based resources in agricultural extension (MWDS=12.59); and report writing and sharing the results and impacts (MWDS=11.92). It is recommended that any intervention aimed at developing the capacity of extension workers in Peninsular Malaysia should prioritize these core competency items in accordance with the deficits established in this study. Copyright © 2017 Elsevier Ltd. All rights reserved.
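
The Borich discrepancy score used above can be stated compactly: each respondent's importance-minus-proficiency gap is weighted by the item's mean importance rating and averaged over respondents. A minimal sketch (the ratings are invented for illustration, not the study's data):

```python
def mwds(importance, proficiency):
    """Borich mean weighted discrepancy score for one competency item:
    mean over respondents of (importance - proficiency) * mean importance."""
    n = len(importance)
    mean_imp = sum(importance) / n
    return sum((i - p) * mean_imp for i, p in zip(importance, proficiency)) / n

# Invented 5-point ratings for one item across four respondents:
score = mwds(importance=[5, 4, 5, 4], proficiency=[2, 3, 2, 3])  # 9.0
```

Weighting by mean importance is what lets an item with a moderate gap but very high perceived importance outrank an item with a larger gap that respondents consider unimportant, which is how the study ranks its 33 items.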

  16. Core promoter acetylation is not required for high transcription from the phosphoenolpyruvate carboxylase promoter in maize

    Directory of Open Access Journals (Sweden)

    Horst Ina

    2009-12-01

    Full Text Available Abstract Background Acetylation of promoter nucleosomes is tightly correlated and mechanistically linked to gene activity. However, transcription is not necessary for promoter acetylation. It seems, therefore, that external and endogenous stimuli control histone acetylation and thereby contribute to gene regulation. Photosynthetic genes in plants are excellent models with which to study the connection between stimuli and chromatin modifications because these genes are strongly expressed and regulated by multiple stimuli that are easily manipulated. We have previously shown that acetylation of specific histone lysine residues on the photosynthetic phosphoenolpyruvate carboxylase (Pepc) promoter in maize is controlled by light and is independent of other stimuli or gene activity. Acetylation of upstream promoter regions responds to a set of other stimuli, which include the nutrient availability of the plant. Here, we have extended these studies by analysing histone acetylation during the diurnal and circadian rhythm of the plant. Results We show that histone acetylation of individual lysine residues is removed from the core promoter before the end of the illumination period, an indication that light is not the only factor influencing core promoter acetylation. Deacetylation is accompanied by a decrease in gene activity. Pharmacological inhibition of histone deacetylation is not sufficient to prevent transcriptional repression, indicating that deacetylation does not control diurnal gene regulation. Variation of the Pepc promoter activity during the day is controlled by the circadian oscillator, as it is maintained under constant illumination for at least 3 days. During this period, light-induced changes in histone acetylation are completely removed from the core promoter, although the light stimulus is continuously applied. However, acetylation of most sites on upstream promoter elements follows the circadian rhythm. Conclusion Our results

  17. Software Reviews.

    Science.gov (United States)

    Classroom Computer Learning, 1990

    1990-01-01

    Reviewed are three computer software packages including "Martin Luther King, Jr.: Instant Replay of History,""Weeds to Trees," and "The New Print Shop, School Edition." Discussed are hardware requirements, costs, grade levels, availability, emphasis, strengths, and weaknesses. (CW)

  18. Software Partitioning Technologies

    Science.gov (United States)

    2001-05-29

    Presentation by Tim Skutt, Smiths Aerospace, 3290 Patterson Ave. SE, Grand Rapids, MI 49512-1991, (616) 241-8645, skutt_timothy... (12 pages). Agenda: Software Partitioning Overview; Smiths Software Partitioning Technology; Software Partitioning... Recoverable slide content: Partition-Level OS, Core Module-Level OS, Timers, MMU, I/O, API Layer, Partitioning Services; "Smiths has developed..."

  19. Imprinting Community College Computer Science Education with Software Engineering Principles

    Science.gov (United States)

    Hundley, Jacqueline Holliday

    2012-01-01

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and…

  1. The core structure of a Dendrobium huoshanense polysaccharide required for the inhibition of human lens epithelial cell apoptosis.

    Science.gov (United States)

    Zha, Xue-Qiang; Deng, Yuan-Yuan; Li, Xiao-Long; Wang, Jing-Fei; Pan, Li-Hua; Luo, Jian-Ping

    2017-01-02

    The aim of this work was to investigate the core structure of a Dendrobium huoshanense polysaccharide, DHPD1, required for the inhibition of lens epithelial cell apoptosis. To obtain fragments containing the core domain, pectinase was employed to hydrolyze DHPD1. Interestingly, after a 24-h reaction the hydrolysis appeared to stop, yielding a final enzymatic fragment, DHPD1-24, with a molecular weight of about 1552 Da. Compared to DHPD1, DHPD1-24, although less bioactive, retained the ability to inhibit the H2O2-induced apoptosis of human lens epithelial (HLE) cells via suppressing the MAPK signaling pathways. These results suggested that DHPD1-24 might be the core domain required for DHPD1 to inhibit HLE cell apoptosis. Methylation analysis showed DHPD1-24 was composed of (1→5)-linked-Araf, (1→3,6)-linked-Manp, 1-linked-Glcp, (1→4)-linked-Glcp, (1→6)-linked-Glcp, (1→4,6)-linked-Glcp, (1→6)-linked-Galp and 1-linked-Xylp in a molar ratio of 1.06:1.53:2.11:2.04:0.93:0.91:0.36:1.01. Moreover, the primary structural features of DHPD1-24 were characterized by NMR spectroscopy. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Coincident activation of NMDA and dopamine D1 receptors within the nucleus accumbens core is required for appetitive instrumental learning.

    Science.gov (United States)

    Smith-Roe, S L; Kelley, A E

    2000-10-15

    The nucleus accumbens, a brain structure ideally situated to act as an interface between corticolimbic information-processing regions and motor output systems, is well known to subserve behaviors governed by natural reinforcers. In the accumbens core, glutamatergic input from its corticolimbic afferents and dopaminergic input from the ventral tegmental area converge onto common dendrites of the medium spiny neurons that populate the accumbens. We have previously found that blockade of NMDA receptors in the core with the antagonist 2-amino-5-phosphonopentanoic acid (AP-5; 5 nmol) abolishes acquisition but not performance of an appetitive instrumental learning task (Kelley et al., 1997). Because it is currently hypothesized that concurrent dopamine D(1) and glutamate receptor activation is required for long-term changes associated with plasticity, we wished to examine whether the dopamine system in the accumbens core modulates learning via NMDA receptors. Co-infusion of low doses of the D(1) receptor antagonist SCH-23390 (0.3 nmol) and AP-5 (0.5 nmol) into the accumbens core strongly impaired acquisition of instrumental learning (lever pressing for food), whereas when infused separately, these low doses had no effect. Infusion of the combined low doses had no effect on indices of feeding and motor activity, suggesting a specific effect on learning. We hypothesize that co-activation of NMDA and D(1) receptors in the nucleus accumbens core is a key process for acquisition of appetitive instrumental learning. Such an interaction is likely to promote intracellular events and gene regulation necessary for synaptic plasticity and is supported by a number of cellular models.

  3. A Requirements-Based Exploration of Open-Source Software Development Projects--Towards a Natural Language Processing Software Analysis Framework

    Science.gov (United States)

    Vlas, Radu Eduard

    2012-01-01

    Open source projects do have requirements; they are, however, mostly informal, text descriptions found in requests, forums, and other correspondence. Understanding such requirements provides insight into the nature of open source projects. Unfortunately, manual analysis of natural language requirements is time-consuming, and for large projects,…

  5. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  6. Frame rate required for speckle tracking echocardiography: A quantitative clinical study with open-source, vendor-independent software.

    Science.gov (United States)

    Negoita, Madalina; Zolgharni, Massoud; Dadkho, Elham; Pernigo, Matteo; Mielewczik, Michael; Cole, Graham D; Dhutia, Niti M; Francis, Darrel P

    2016-09-01

    To determine the optimal frame rate at which reliable heart wall velocities can be assessed by speckle tracking. Assessing left ventricular function with speckle tracking is useful in patient diagnosis but requires a temporal resolution that can follow myocardial motion. In this study we investigated the effect of different frame rates on the accuracy of speckle tracking results, highlighting the temporal resolution at which reliable results can be obtained. 27 patients were scanned at two different frame rates at their resting heart rate. From all acquired loops, lower temporal resolution image sequences were generated by dropping frames, decreasing the frame rate by up to 10-fold. Tissue velocities were estimated by automated speckle tracking. Above 40 frames/s the peak velocity was reliably measured. At lower frame rates, the inter-frame interval containing the instant of highest velocity also contained lower velocities, so the average velocity in that interval underestimated the clinically desired instantaneous maximum velocity. The higher the frame rate, the more accurately maximum velocities are identified by speckle tracking; above 40 frames/s, however, there is little further increase in measured peak velocity. We provide in an online supplement the vendor-independent software we used for automatic speckle-tracked velocity assessment to help others working in this field. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
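    The frame-dropping experiment described above is easy to reproduce numerically. The sketch below uses a hypothetical smooth velocity pulse and illustrative parameter values (not the study's data or software) to show how subsampling a sharply peaked velocity trace underestimates the measured peak:

```python
import math

def velocity(t, peak_time=0.157, sigma=0.02, v_max=8.0):
    """Hypothetical smooth systolic velocity pulse (cm/s)."""
    return v_max * math.exp(-((t - peak_time) ** 2) / (2 * sigma ** 2))

fs_high = 80                                             # acquisition frame rate
trace = [velocity(i / fs_high) for i in range(fs_high)]  # 1 s of frames

# Emulate lower frame rates by dropping frames, as the study did.
for keep_every in (1, 2, 4, 8):                          # 80, 40, 20, 10 frames/s
    fs = fs_high // keep_every
    peak = max(trace[::keep_every])
    print(f"{fs:3d} frames/s -> measured peak {peak:.2f} cm/s")
```

    Because a subsampled trace can only miss the true maximum, the measured peak never increases as frames are dropped, and it collapses once the inter-frame interval exceeds the width of the velocity pulse.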

  7. Core skills requirement and competencies expected of quantity surveyors: perspectives from quantity surveyors, allied professionals and clients in Nigeria

    Directory of Open Access Journals (Sweden)

    Joshua Oluwasuji Dada

    2015-10-01

    Full Text Available Abstract Deployment of appropriate skills and competencies is crucial and germane to the development and continuous relevance of any profession. In the built environment, the science for selecting the required skills and competencies expected of quantity surveyors, and understanding the inherent dependencies between them, remains a research issue. The purpose of this study was to determine the skill requirements and competencies expected of quantity surveyors. A structured questionnaire was administered among quantity surveyors, architects, engineers, builders and clients in Nigeria. The respondents were asked to rate, on a 5-point Likert scale, the usual skills and competencies required of quantity surveyors. A secondary objective of the study was to examine the important skills and competencies and categorize them into core skill, basic skill, core competence, optional competence and special competence. The results of the study indicate the important skills as computer literacy, building engineering, information technology, economics, measurement/quantification and knowledge of civil/heavy engineering works. The results also indicate the important competencies as cost planning and control, estimating, construction procurement systems, contract documentation, contract administration and project management. It is emphasized that the findings of the research have considerable implications for the training and practice of quantity surveying in Nigeria.

  9. cFE/CFS (Core Flight Executive/Core Flight System)

    Science.gov (United States)

    Wildermann, Charles P.

    2008-01-01

    This viewgraph presentation describes in detail the requirements and goals of the Core Flight Executive (cFE) and the Core Flight System (CFS). The Core Flight Software System is a mission-independent, platform-independent Flight Software (FSW) environment integrating a reusable core flight executive (cFE). The CFS goals include: 1) Reduce time to deploy high quality flight software; 2) Reduce project schedule and cost uncertainty; 3) Directly facilitate formalized software reuse; 4) Enable collaboration across organizations; 5) Simplify sustaining engineering (i.e., FSW maintenance); 6) Scale from small instruments to System of Systems; 7) Platform for advanced concepts and prototyping; and 8) Common standards and tools across the branch and NASA wide.

  10. Software Innovation in a Mission Critical Environment

    Science.gov (United States)

    Fredrickson, Steven

    2015-01-01

    Operating in mission-critical environments requires trusted solutions, and the preference for "tried and true" approaches presents a potential barrier to infusing innovation into mission-critical systems. This presentation explores opportunities to overcome this barrier in the software domain. It outlines specific areas of innovation in software development achieved by the Johnson Space Center (JSC) Engineering Directorate in support of NASA's major human spaceflight programs, including International Space Station, Multi-Purpose Crew Vehicle (Orion), and Commercial Crew Programs. Software engineering teams at JSC work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements for genuinely mission critical applications. The innovations described, including the use of NASA Core Flight Software and its associated software tool chain, can lead to software that is more affordable, more reliable, better modelled, more flexible, more easily maintained, better tested, and enabling of automation.

  11. An empirical analysis of the required management skills in the core employees' identification

    Directory of Open Access Journals (Sweden)

    Natalia García Carbonell

    2016-01-01

    Full Text Available The current study empirically analyses the influence of top management team human capital attributes on one of the most relevant stages in human resource management strategy formulation: the identification of core employees. Drawing on recent calls from the strategic human resource management literature, this study proposes a "process" perspective instead of the traditional "content" analysis, with the intention of going a step further into the internal dynamics of these strategic processes. Applying structural equation modeling via Partial Least Squares (PLS) to a sample of 120 Spanish firms, the results reveal that critical human resource identification processes demand mixed cognitive skills, both rational and creative, to complete the different steps of the process efficiently. Consequently, reaching a balanced combination of these skills requires collectivistic dynamics that foster cooperative and collaborative decision-making processes. In this context, HR managers participate by improving the process with their expert power and by developing technical HR activities; subsequently, the HR information is integrated into the strategic decision-making process with the rest of the team. In addition, interesting professional implications arise from the study in relation to the presence of cognitive diversity in top management teams.

  12. Discussion on Supervision Requirements for Medical Device Software Change

    Institute of Scientific and Technical Information of China (English)

    彭亮; 袁鹏

    2013-01-01

    Because of the characteristics of medical device software, a product often undergoes changes after it has been approved by the authorities, but current supervision requirements cannot cover all of these circumstances, so more scientific and reasonable supervision requirements should be established. Addressing the concerns of stakeholders, this article analyzes the principles for classifying medical device software changes and, based on the resulting change types, discusses the corresponding supervision requirements, concluding with recommendations for related work.

  13. 75 FR 80571 - Core Principles and Other Requirements for Designated Contract Markets

    Science.gov (United States)

    2010-12-22

    .... Subpart O--Dispute Resolution 15. Subpart P--Governance Fitness Standards 16. Subpart Q--Conflicts of... certain requirements and practices that are commonly accepted in the industry and have been found, based... means for identifying industry trends and DCM best practices for self-regulation. Essentially,...

  14. A Conserved GPG-Motif in the HIV-1 Nef Core Is Required for Principal Nef-Activities.

    Directory of Open Access Journals (Sweden)

    Marta Martínez-Bonet

    Full Text Available To identify new determinants required for Nef activity we performed a functional alanine scanning analysis along a discrete but highly conserved region at the core of HIV-1 Nef. We identified the GPG-motif, located in the 121-137 region of HIV-1 NL4.3 Nef, as a novel protein signature strictly required for the p56Lck-dependent Nef-induced CD4 downregulation in T-cells. Since the Nef GPG-motif was dispensable for CD4 downregulation in HeLa-CD4 cells, Nef/AP-1 interaction and Nef-dependent effects on Tf-R trafficking, the observed effects on CD4 downregulation cannot be attributed to structural constraints or to alterations in general protein trafficking. Besides, we found that the GPG-motif was also required for Nef-dependent inhibition of ring actin re-organization upon TCR triggering and for MHCI downregulation, suggesting that the GPG-motif could actively cooperate with the Nef PxxP motif in these HIV-1 Nef-related effects. Finally, we observed that the Nef GPG-motif was required for optimal infectivity of viruses produced in T-cells. According to these findings, we propose the conserved GPG-motif in HIV-1 Nef as a functional region required for HIV-1 infectivity and therefore of potential interest for interfering with Nef activity during HIV-1 infection.

  15. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
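    The idea of monitoring an executing program against prespecified requirements constraints can be sketched in miniature. The example below is a hypothetical illustration only (not SAVAnT or the SAGE specification language): constraints are attached to a function and checked while it runs.

```python
# Minimal sketch: attach requirement constraints to a function and
# verify them during execution. All names and constraints are invented.

def monitored(pre=None, post=None):
    """Wrap a function so declared constraints are checked at runtime."""
    def decorate(fn):
        def wrapper(*args, **kwargs):
            if pre and not pre(*args, **kwargs):
                raise AssertionError(f"precondition violated in {fn.__name__}")
            result = fn(*args, **kwargs)
            if post and not post(result):
                raise AssertionError(f"postcondition violated in {fn.__name__}")
            return result
        return wrapper
    return decorate

@monitored(pre=lambda level: 0 <= level <= 100,   # requirement: valid input range
           post=lambda result: result >= 0)       # requirement: non-negative output
def set_power(level):
    return level * 0.95   # placeholder computation

print(set_power(80))      # satisfies both constraints
```

    A real tool would read the constraints from a separate specification rather than inline decorators, but the runtime-check structure is the same.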

  16. Studies on Software Reuse Technology Based on Requirement Level

    Institute of Scientific and Technical Information of China (English)

    张晓燕

    2011-01-01

    At present, the success ratio and productivity of software development in our country are very low, which has seriously restricted the healthy development of our software industry. Two challenging problems in software engineering lead to this situation: inaccurate requirements analysis and a low level of software reuse. To solve these two problems, this paper proposes realizing requirement-level software reuse by constructing a software requirements management system based on domain-specific ontology. The background of this solution is elaborated, an ontology extraction model and a software development process model are put forward, and finally the innovations of the solution are pointed out.

  17. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation…

  18. Continuing engineering education for software engineering professionals

    Energy Technology Data Exchange (ETDEWEB)

    Davis, P.I.

    1992-02-19

    Designers of software for safety-critical applications are impelled to supplement their education through continuing engineering studies in the areas of requirements analysis, hazard identification, risk analysis, fault tolerance, failure modes, and psychology. Today's complex level of design is contributing to opportunities for catastrophic design errors in computer functions where failure of such functions is capable of causing injury and death. A syllabus of post-graduate, core studies within the curricula of five engineering specialties is suggested. Software Engineers are exhorted to undertake a professional, responsible role in safety-critical software design.

  20. SOFTWARE METRICS VALIDATION METHODOLOGIES IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    K.P. Srinivasan

    2014-12-01

    Full Text Available In the software measurement validations, assessing the validation of software metrics in software engineering is a very difficult task due to lack of theoretical methodology and empirical methodology [41, 44, 45]. During recent years, there have been a number of researchers addressing the issue of validating software metrics. At present, software metrics are validated theoretically using properties of measures. Further, software measurement plays an important role in understanding and controlling software development practices and products. The major requirement in software measurement is that the measures must represent accurately those attributes they purport to quantify and validation is critical to the success of software measurement. Normally, validation is a collection of analysis and testing activities across the full life cycle and complements the efforts of other quality engineering functions and validation is a critical task in any engineering project. Further, validation objective is to discover defects in a system and assess whether or not the system is useful and usable in operational situation. In the case of software engineering, validation is one of the software engineering disciplines that help build quality into software. The major objective of software validation process is to determine that the software performs its intended functions correctly and provides information about its quality and reliability. This paper discusses the validation methodology, techniques and different properties of measures that are used for software metrics validation. In most cases, theoretical and empirical validations are conducted for software metrics validations in software engineering [1-50].

  1. Multidisciplinary and Active/Collaborative Approaches in Teaching Requirements Engineering

    Science.gov (United States)

    Rosca, Daniela

    2005-01-01

    The requirements engineering course is a core component of the curriculum for the Master's in Software Engineering programme, at Monmouth University (MU). It covers the process, methods and tools specific to this area, together with the corresponding software quality issues. The need to produce software engineers with strong teamwork and…

  3. "PolyMin": software for identification of the minimum number of polymorphisms required for haplotype and genotype differentiation

    DEFF Research Database (Denmark)

    Frei, Ursala K; Wollenweber, Bernd; Lübberstedt, Thomas

    2009-01-01

    To date, software for identification of the minimum number of required markers has been optimized for human genetics and is only partly matching the needs of plant scientists and breeders. In addition, different software packages with insufficient interoperability need to be combined to extract this information from available allele sequence data, resulting in an error-prone multi-step process of data handling. Results: PolyMin, a computer program combining the detection of a minimum set of single nucleotide polymorphisms (SNPs) and/or insertions/deletions (INDELs) necessary for allele differentiation…
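    The underlying problem — finding the smallest marker subset that gives every allele a unique profile — can be illustrated with a brute-force sketch. The alleles and positions below are toy data; PolyMin's actual algorithm and input format are not reproduced here:

```python
from itertools import combinations

# Hypothetical alleles scored at five SNP positions (illustrative only).
alleles = {
    "A1": "ACGTA",
    "A2": "ACGTC",
    "A3": "TCGAA",
    "A4": "TGGAA",
}

def distinguishes(positions):
    """True if the chosen SNP positions give every allele a unique profile."""
    profiles = {tuple(seq[p] for p in positions) for seq in alleles.values()}
    return len(profiles) == len(alleles)

n = len(next(iter(alleles.values())))
for size in range(1, n + 1):            # try the smallest subsets first
    hits = [c for c in combinations(range(n), size) if distinguishes(c)]
    if hits:
        print(f"minimum of {size} SNPs, e.g. positions {hits[0]}")
        break
```

    Exhaustive search is exponential in the number of markers, so a practical tool needs heuristics (e.g. greedy selection) for large allele sets; the sketch only states the problem precisely.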

  4. A Framework for Integrating Biosimilars Into the Didactic Core Requirements of a Doctor of Pharmacy Curriculum.

    Science.gov (United States)

    Li, Edward; Liu, Jennifer; Ramchandani, Monica

    2017-04-01

    Biologic drugs approved via the abbreviated United States biosimilar approval pathway are anticipated to improve access to medications by addressing increasing health care expenditures. Surveys of health care practitioners indicate that there is inadequate knowledge and understanding about biosimilars; this must be addressed to ensure safe and effective use of this new category of products. Concepts of biosimilar development, manufacturing, regulation, naming, formulary, and inventory considerations, as well as patient and provider education should be included within the doctor of pharmacy (PharmD) curriculum as preparation for clinical practice. Based on these considerations, we propose that PharmD graduates be required to have knowledge in the following domains regarding biologics and biosimilars: legal definition, development and regulation, state pharmacy practice laws, and pharmacy practice management. We link these general biosimilar concepts to the Accreditation Council for Pharmacy Education (ACPE) Standards 2016 and Center for the Advancement of Pharmacy Education (CAPE) Outcomes 2013, and provide example classroom learning objectives, in-class activities, and assessments to guide implementation.

  5. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model. Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System. Software Models: Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept…

  6. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  7. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  9. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game management. (CW)

  10. Software Cost Estimation Review

    OpenAIRE

    Ongere, Alphonce

    2013-01-01

    Software cost estimation is the process of predicting the effort, the time and the cost required to complete a software project successfully. It involves size measurement of the software project to be produced, estimating and allocating the effort, drawing the project schedules, and finally, estimating the overall cost of the project. Accurate estimation of software project cost is an important factor for business and the welfare of a software organization in general. If cost and effort estimat...
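    One widely published instance of the size-based estimation described above is Basic COCOMO (Boehm, 1981). The mode constants below are the published organic, semi-detached and embedded values; the 32-KLOC project is just an illustrative input:

```python
# Basic COCOMO: effort = a * KLOC^b (person-months),
# schedule = 2.5 * effort^0.38 (months).
MODES = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def cocomo_effort(kloc, mode="organic"):
    """Estimated effort in person-months from size in KLOC."""
    a, b = MODES[mode]
    return a * kloc ** b

def cocomo_duration(effort_pm):
    """Estimated schedule in months from effort."""
    return 2.5 * effort_pm ** 0.38

effort = cocomo_effort(32, "organic")        # a 32-KLOC project
print(f"effort = {effort:.1f} person-months, "
      f"schedule = {cocomo_duration(effort):.1f} months")
```

    The superlinear exponent b captures the diseconomy of scale the review alludes to: doubling the size more than doubles the estimated effort, and the harder the mode, the steeper the growth.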

  11. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation…

  12. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square-feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  13. Software Requirements Specification of the IUfA's UUIS -- a Team 4 COMP5541-W10 Project Approach

    OpenAIRE

    Alhazmi, Ali; Al-Sharawi, Abdulrahman; Liu, Bing; Oliveira, Deyvisson; Sobh, Kanj; Mayantz, Max; de Bled, Robin; Zhang, Yu Ming

    2010-01-01

    This document presents the business requirements of the Unified University Inventory System (UUIS) in a technology-independent manner. All attempts have been made to use mostly business terminology and business language while describing the requirements in this document. Very minimal and commonly understood technical terminology is used. A use-case approach is used to model the business requirements in this document.

  14. LANMAS core: Update and current directions

    Energy Technology Data Exchange (ETDEWEB)

    Claborn, J. [Los Alamos National Lab., NM (United States). Safeguards Systems Group; Alvarado, A. [Sandia National Labs., Albuquerque, NM (United States)

    1994-08-01

    Local Area Network Material Accountability System (LANMAS) core software will provide the framework of a material accountability system. LANMAS is a network-based nuclear material accountability system. It tracks the movement of material throughout a site and generates the required reports on material accountability. LANMAS will run in a client/server mode. The database of material type and location will reside on the server, while the user interface runs on the client. The user interface accesses the server via a network. The LANMAS core can be used as the foundation for building required Materials Control and Accountability (MC&A) functionality at any site requiring a new MC&A system. An individual site will build on the LANMAS core by supplying site-specific software. This paper will provide an update on the current LANMAS development activities and discuss the current direction of the LANMAS project.

  15. Runtime Instrumentation of SystemC/TLM2 Interfaces for Fault Tolerance Requirements Verification in Software Cosimulation

    Directory of Open Access Journals (Sweden)

    Antonio da Silva

    2014-01-01

    Full Text Available This paper presents the design of a SystemC transaction level modelling wrapping library that can be used for the assertion of system properties, protocol compliance, or fault injection. The library uses C++ virtual table hooks as a dynamic binary instrumentation technique to inline wrappers in the TLM2 transaction path. This technique can be applied after the elaboration phase and needs neither source code modifications nor recompilation of the top level SystemC modules. The proposed technique has been successfully applied to the robustness verification of the on-board boot software of the Instrument Control Unit of the Solar Orbiter’s Energetic Particle Detector.
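    The wrapping idea — interposing on an interface after the system is built, without modifying or recompiling module source code — can be illustrated outside C++/SystemC. The Python sketch below is a loose conceptual analogue only (invented class and method names, not the TLM2 API or the paper's vtable-hook mechanism): a transparent proxy intercepts calls on a target to log transactions and optionally inject faults.

```python
class TransactionWrapper:
    """Transparent proxy: logs every call on the target and can inject faults."""
    def __init__(self, target, inject_fault=False):
        self._target = target
        self._inject_fault = inject_fault
        self.log = []                       # observed (method, args) pairs

    def __getattr__(self, name):
        attr = getattr(self._target, name)
        if not callable(attr):
            return attr                     # plain attributes pass through
        def wrapped(*args, **kwargs):
            self.log.append((name, args))   # protocol observation point
            if self._inject_fault and name == "b_transport":
                raise IOError("injected bus fault")   # fault injection point
            return attr(*args, **kwargs)
        return wrapped

class Memory:
    """Stand-in for a target module exposing a transport-style method."""
    def __init__(self):
        self.data = {}
    def b_transport(self, addr, value):
        self.data[addr] = value

mem = TransactionWrapper(Memory())
mem.b_transport(0x10, 0xAB)
print(mem.log)          # the wrapper recorded the transaction
```

    As in the paper's technique, the wrapper is attached after construction and the wrapped module is unaware of it; the C++ version achieves the same interposition by patching virtual table entries on the TLM2 transaction path.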

  16. A Software-Behavior-Oriented Requirements Model and Properties Verification

    Institute of Scientific and Technical Information of China (English)

    吴怀广; 毋国庆; 陈曙; 万黎

    2011-01-01

    Requirements modeling and its verification are important activities in software requirements engineering. This paper discusses existing requirements modeling methods and related work on software behavior, and explains why software behavior should be introduced into requirements models. A software-behavior-oriented requirements modeling language, the behavior description language (BDL), is proposed formally, and its syntax and semantics are given; the hierarchy of behavior-oriented requirements models is then presented based on BDL. To compare BDL with existing results and make full use of their related tools, the transformation relation between BDL and CCS (Calculus of Communicating Systems), the process algebra developed by Robin Milner, is considered, and a transformation function M[|-|] mapping BDL into CCS is constructed. To verify properties of behavior-oriented requirements models, four system properties — system consistency, system safety, behavioral trust, and behavioral non-termination — are specified in the μ-calculus, a logic for describing and verifying properties of labeled transition systems. Finally, an example described in BDL is analyzed and the specified properties are verified with the Concurrency WorkBench (CWB), a model-checking tool for CCS.

  17. The EARP Complex and Its Interactor EIPR-1 Are Required for Cargo Sorting to Dense-Core Vesicles.

    Science.gov (United States)

    Topalidou, Irini; Cattin-Ortolá, Jérôme; Pappas, Andrea L; Cooper, Kirsten; Merrihew, Gennifer E; MacCoss, Michael J; Ailion, Michael

    2016-05-01

    The dense-core vesicle is a secretory organelle that mediates the regulated release of peptide hormones, growth factors, and biogenic amines. Dense-core vesicles originate from the trans-Golgi of neurons and neuroendocrine cells, but it is unclear how this specialized organelle is formed and acquires its specific cargos. To identify proteins that act in dense-core vesicle biogenesis, we performed a forward genetic screen in Caenorhabditis elegans for mutants defective in dense-core vesicle function. We previously reported the identification of two conserved proteins that interact with the small GTPase RAB-2 to control normal dense-core vesicle cargo-sorting. Here we identify several additional conserved factors important for dense-core vesicle cargo sorting: the WD40 domain protein EIPR-1 and the endosome-associated recycling protein (EARP) complex. By assaying behavior and the trafficking of dense-core vesicle cargos, we show that mutants that lack EIPR-1 or EARP have defects in dense-core vesicle cargo-sorting similar to those of mutants in the RAB-2 pathway. Genetic epistasis data indicate that RAB-2, EIPR-1 and EARP function in a common pathway. In addition, using a proteomic approach in rat insulinoma cells, we show that EIPR-1 physically interacts with the EARP complex. Our data suggest that EIPR-1 is a new interactor of the EARP complex and that dense-core vesicle cargo sorting depends on the EARP-dependent trafficking of cargo through an endosomal sorting compartment.

  18. Payload software technology: Software technology development plan

    Science.gov (United States)

    1977-01-01

    Programmatic requirements for the advancement of software technology are identified for meeting the space flight requirements in the 1980 to 1990 time period. The development items are described, and software technology item derivation worksheets are presented along with the cost/time/priority assessments.

  19. Software redundancy: what, where, how

    OpenAIRE

    Mattavelli, Andrea; Pezzè, Mauro; Carzaniga, Antonio

    2017-01-01

    Software systems have become pervasive in everyday life and are the core component of many crucial activities. An inadequate level of reliability may determine the commercial failure of a software product. Still, despite the commitment and the rigorous verification processes employed by developers, software is deployed with faults. To increase the reliability of software systems, researchers have investigated the use of various form of redundancy. Informally, a software system is redunda...

  20. Development of Occupational Safety and Health Requirement Management System (OSHREMS) Software Using Adobe Dreamweaver CS5 for Building Construction Project

    National Research Council Canada - National Science Library

    Nor Haslinda Abas; Nurjeha Adman; Rafikullah Deraman

    2017-01-01

    ...) for evaluating the performance of a contractor in construction project by setting out the safety and health management and practices, however the requirement checklist provided is not comprehensive...

  1. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  2. Software Requirements Specification of the IUfA's UUIS -- a Team 3 COMP5541-W10 Project Approach

    CERN Document Server

    Daoudi, Ahmed; Hazan, Gay; Toutant, Isabelle; Diaz, Mariano; Toutant, Rene; Cook, Virginia; Nzoukou, William; Amaiche, Yassine

    2010-01-01

    The purpose of this document is to specify the requirements of the University Unified Inventory System of the IUfA. The team of analysts used a feedback-waterfall approach to collect the requirements. UML diagrams, such as use case diagrams, block diagrams, domain models, and interface prototypes, are some of the tools employed to develop the present document.

  3. Beds Simulator 1.0: a software for the modelisation of the number of beds required for a hospital department.

    Science.gov (United States)

    Nguyen, Jean-Michel; Six, Patrick; Antonioli, Daniel; Lombrail, Pierre; Le Beux, Pierre

    2003-01-01

    Determining the number of beds needed for a hospital department is a complex problem that must take into account efficiency, forecasting of needs, and appropriateness of stays. Health authorities have used methods based on ratios that do not account for local specificities and serve rather to support economic decisions. On the other hand, the models developed so far are too specific to apply to all types of hospital departments. Moreover, all the solutions depend on the LoS (Length of Stay). We have developed a non-parametric method to solve this problem. The model was successfully tested in teaching and non-teaching hospitals, for an intensive care unit, two internal medicine departments, and a surgical department. An easy-to-use software package was developed, working on Windows and available on our website www.sante.univ-nantes.fr/med/stat/.
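The abstract states only that all solutions depend on the LoS; that dependence can be illustrated with a toy occupancy simulation. Everything below (Poisson-like arrivals, geometric length of stay, the 95% service level) is an illustrative assumption, not the authors' non-parametric method:

```python
import random

def beds_needed(arrivals_per_day, mean_los_days, days=365, quantile=0.95, seed=1):
    """Estimate the bed count that covers demand on `quantile` of days.
    Toy model: hourly Bernoulli admissions, geometric length of stay."""
    rng = random.Random(seed)
    census, occupancy = [], []
    for _ in range(days):
        # each current patient is discharged with probability 1/mean_los per day
        census = [p for p in census if rng.random() > 1.0 / mean_los_days]
        # crude Poisson arrivals: one Bernoulli trial per hour
        admitted = sum(rng.random() < arrivals_per_day / 24.0 for _ in range(24))
        census.extend([1] * admitted)
        occupancy.append(len(census))
    occupancy.sort()
    return occupancy[int(quantile * (len(occupancy) - 1))]

print(beds_needed(arrivals_per_day=4, mean_los_days=5))
```

Doubling the mean LoS roughly doubles the required capacity at the same arrival rate, which is why LoS drives every such model.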

  4. Core-binding factor subunit beta is not required for non-primate lentiviral Vif-mediated APOBEC3 degradation.

    Science.gov (United States)

    Ai, Youwei; Zhu, Dantong; Wang, Cuihui; Su, Chao; Ma, Jian; Ma, Jianzhang; Wang, Xiaojun

    2014-10-01

    Viral infectivity factor (Vif) is required for lentivirus fitness and pathogenicity, except in equine infectious anemia virus (EIAV). Vif enhances viral infectivity by recruiting a Cullin5-Elongin B/C E3 complex to inactivate the host restriction factor APOBEC3. Core-binding factor subunit beta (CBF-β) is a cell factor that was recently shown to be important for primate lentiviral Vif function. Non-primate lentiviral Vif also degrades APOBEC3 through the proteasome pathway. However, it is unclear whether CBF-β is required for non-primate lentiviral Vif function. In this study, we demonstrated that the Vifs of non-primate lentiviruses, including feline immunodeficiency virus (FIV), bovine immunodeficiency virus (BIV), caprine arthritis encephalitis virus (CAEV), and maedi-visna virus (MVV), do not interact with CBF-β. In addition, CBF-β did not promote the stability of FIV, BIV, CAEV, and MVV Vifs. Furthermore, CBF-β silencing or overexpression did not affect non-primate lentiviral Vif-mediated APOBEC3 degradation. Our results suggest that non-primate lentiviral Vif induces APOBEC3 degradation through a different mechanism than primate lentiviral Vif. Importance: The APOBEC3 protein family members are host restriction factors that block retrovirus replication. Vif, an accessory protein of lentivirus, degrades APOBEC3 to rescue viral infectivity by forming a Cullin5-Elongin B/C-based E3 complex. CBF-β was shown to be a novel regulator of primate lentiviral Vif function. In this study, we found that CBF-β knockdown or overexpression did not affect FIV Vif's function, which induced polyubiquitination and degradation of APOBEC3 by recruiting the E3 complex in a manner similar to that of HIV-1 Vif. We also showed that other non-primate lentiviral Vifs did not require CBF-β to degrade APOBEC3. CBF-β did not interact with non-primate lentiviral Vifs or promote their stability. These results suggest that a different mechanism exists for the Vif-APOBEC interaction and

  5. "IBSAR" Software 4.0

    Directory of Open Access Journals (Sweden)

    2004-06-01

    Full Text Available A review of an Arabic software product entitled "IBSAR", designed to help blind users operate the computer. The software pronounces commands and the contents of screens and applications browsed by the user. This review includes a general introduction to the software, its components and commands, system requirements, and its functions with the Windows operating system and Microsoft Word.

  6. Comparative Genomic Analysis of Drechmeria coniospora Reveals Core and Specific Genetic Requirements for Fungal Endoparasitism of Nematodes.

    Directory of Open Access Journals (Sweden)

    Kevin Lebrigand

    2016-05-01

    Full Text Available Drechmeria coniospora is an obligate fungal pathogen that infects nematodes via the adhesion of specialized spores to the host cuticle. D. coniospora is frequently found associated with Caenorhabditis elegans in environmental samples. It is used in the study of the nematode's response to fungal infection. Full understanding of this bi-partite interaction requires knowledge of the pathogen's genome, analysis of its gene expression program and a capacity for genetic engineering. The acquisition of all three is reported here. A phylogenetic analysis placed D. coniospora close to the truffle parasite Tolypocladium ophioglossoides, and Hirsutella minnesotensis, another nematophagous fungus. Ascomycete nematopathogenicity is polyphyletic; D. coniospora represents a branch that has not been molecularly characterized. A detailed in silico functional analysis, comparing D. coniospora to 11 fungal species, revealed genes and gene families potentially involved in virulence and showed it to be a highly specialized pathogen. A targeted comparison with nematophagous fungi highlighted D. coniospora-specific genes and a core set of genes associated with nematode parasitism. A comparative gene expression analysis of samples from fungal spores and mycelia, and infected C. elegans, gave a molecular view of the different stages of the D. coniospora lifecycle. Transformation of D. coniospora allowed targeted gene knock-out and the production of fungus that expresses fluorescent reporter genes. It also permitted the initial characterisation of a potential fungal counter-defensive strategy, involving interference with a host antimicrobial mechanism. This high-quality annotated genome for D. coniospora gives insights into the evolution and virulence of nematode-destroying fungi. Coupled with genetic transformation, it opens the way for molecular dissection of D. coniospora physiology, and will allow both sides of the interaction between D. coniospora and C. elegans, as

  7. Comparative Genomic Analysis of Drechmeria coniospora Reveals Core and Specific Genetic Requirements for Fungal Endoparasitism of Nematodes.

    Science.gov (United States)

    Lebrigand, Kevin; He, Le D; Thakur, Nishant; Arguel, Marie-Jeanne; Polanowska, Jolanta; Henrissat, Bernard; Record, Eric; Magdelenat, Ghislaine; Barbe, Valérie; Raffaele, Sylvain; Barbry, Pascal; Ewbank, Jonathan J

    2016-05-01

    Drechmeria coniospora is an obligate fungal pathogen that infects nematodes via the adhesion of specialized spores to the host cuticle. D. coniospora is frequently found associated with Caenorhabditis elegans in environmental samples. It is used in the study of the nematode's response to fungal infection. Full understanding of this bi-partite interaction requires knowledge of the pathogen's genome, analysis of its gene expression program and a capacity for genetic engineering. The acquisition of all three is reported here. A phylogenetic analysis placed D. coniospora close to the truffle parasite Tolypocladium ophioglossoides, and Hirsutella minnesotensis, another nematophagous fungus. Ascomycete nematopathogenicity is polyphyletic; D. coniospora represents a branch that has not been molecularly characterized. A detailed in silico functional analysis, comparing D. coniospora to 11 fungal species, revealed genes and gene families potentially involved in virulence and showed it to be a highly specialized pathogen. A targeted comparison with nematophagous fungi highlighted D. coniospora-specific genes and a core set of genes associated with nematode parasitism. A comparative gene expression analysis of samples from fungal spores and mycelia, and infected C. elegans, gave a molecular view of the different stages of the D. coniospora lifecycle. Transformation of D. coniospora allowed targeted gene knock-out and the production of fungus that expresses fluorescent reporter genes. It also permitted the initial characterisation of a potential fungal counter-defensive strategy, involving interference with a host antimicrobial mechanism. This high-quality annotated genome for D. coniospora gives insights into the evolution and virulence of nematode-destroying fungi. Coupled with genetic transformation, it opens the way for molecular dissection of D. coniospora physiology, and will allow both sides of the interaction between D. coniospora and C. elegans, as well as the

  8. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  10. Differential requirements for HIV-1 Vif-mediated APOBEC3G degradation and RUNX1-mediated transcription by core binding factor beta.

    Science.gov (United States)

    Du, Juan; Zhao, Ke; Rui, Yajuan; Li, Peng; Zhou, Xiaohong; Zhang, Wenyan; Yu, Xiao-Fang

    2013-02-01

    Core binding factor beta (CBFβ), a transcription regulator through RUNX binding, was recently reported critical for Vif function. Here, we mapped the primary functional domain important for Vif function to amino acids 15 to 126 of CBFβ. We also revealed that different lengths and regions are required for CBFβ to assist Vif or RUNX. The important interaction domains that are uniquely required for Vif but not RUNX function represent novel targets for the development of HIV inhibitors.

  11. Comparison of the effectiveness of software requirements elicitation techniques: novice and expert visions

    Directory of Open Access Journals (Sweden)

    Dante Carrizo Moreno

    2012-12-01

    Full Text Available Requirements engineering can draw on a large number of techniques to elicit users' needs. However, there are currently few practical guidelines or criteria for selecting techniques in a software development project. This paper examines the view that novice requirements engineers hold of the effectiveness of elicitation techniques and compares it with the view of requirements experts. The comparison uses the repertory grid technique, which indirectly elicits the subjects' opinions of the techniques. The results show a substantial difference between the two views regarding both the effectiveness of the techniques and the elicitation context. This implies that more extensive training, and above all practice, is necessary for novice engineers to recognize differences in technique effectiveness and to decide with greater certainty which techniques are most appropriate to use in

  12. Simulation of an MSLB scenario using the 3D neutron kinetic core model DYN3D coupled with the CFD software Trio-U

    Energy Technology Data Exchange (ETDEWEB)

    Grahn, Alexander, E-mail: a.grahn@hzdr.de; Gommlich, André; Kliem, Sören; Bilodid, Yurii; Kozmenkov, Yaroslav

    2017-04-15

    Highlights: • Improved thermal-hydraulic description of nuclear reactor cores. • Providing reactor dynamics code with realistic thermal-hydraulic boundary conditions. • Possibility of three-dimensional flow phenomena in the core, such as cross flow, flow reversal. • Simulation at higher spatial resolution as compared to system codes. - Abstract: In the framework of the European project NURESAFE, the reactor dynamics code DYN3D, developed at Helmholtz-Zentrum Dresden-Rossendorf (HZDR), was coupled with the Computational Fluid Dynamics (CFD) solver Trio-U, developed at CEA France, in order to replace DYN3D’s one-dimensional hydraulic part with a full three-dimensional description of the coolant flow in the reactor core at higher spatial resolution. The present document gives an introduction into the coupling method and shows results of its application to the simulation of a Main Steamline Break (MSLB) accident of a Pressurised Water Reactor (PWR).

  13. Multicore Considerations for Legacy Flight Software Migration

    Science.gov (United States)

    Vines, Kenneth; Day, Len

    2013-01-01

    In this paper we will discuss potential benefits and pitfalls when considering a migration from an existing single core code base to a multicore processor implementation. The results of this study present options that should be considered before migrating fault managers, device handlers and tasks with time-constrained requirements to a multicore flight software environment. Possible future multicore test bed demonstrations are also discussed.

  15. Zero phase sequence impedance and tank heating model for three phase three leg core type power transformers coupling magnetic field and electric circuit equations in finite element software

    Energy Technology Data Exchange (ETDEWEB)

    Ngnegueu, T.; Mailhot, M.; Munar, A. [Jeumont Schneider Transformateurs, Lyon (France); Sacotte, M. [France-Transfo. Voie romaine, Mezieres-Les-Metz (France)

    1995-05-01

    In this paper, the authors present a finite element model for the calculation of zero phase sequence reactance for three phase three leg core type power transformers. An axisymmetrical approximation is assumed. A simplified model is used to assess the tank's hottest spot temperature.

  16. Research on Customer Relationship Management Based on Core Value in Software Enterprise

    Institute of Scientific and Technical Information of China (English)

    李晴; 朱艳阳

    2013-01-01

    Starting from the factors that enhance customer value, this paper explains the meaning of an enterprise's core competence and of the core value of customer relationship management. At the two levels of software-enterprise transformation and market competition, it argues that building a CRM system is important for raising the customer value, and thus the core value, of software enterprises, and it designs the functional modules of a CRM system for software enterprises.

  17. Software Requirements Specification of the IUfA's UUIS -- a Team 1 COMP5541-W10 Project Approach

    CERN Document Server

    Sankaran, Abirami; Attar, Maab; Parham, Mohammad; Zayikina, Olena; Rifai, Omar Jandali; Lepin, Pavel; Hassan, Rana

    2010-01-01

    Unified University Inventory System (UUIS) is an inventory system created for the Imaginary University of Arctica (IUfA) to facilitate its inventory management, covering all the faculties in one system. Team 1 elucidates the functions of the system and the characteristics of the users who have access to these functions. It shows the access restrictions on the different functionalities of the system provided to users, who are the staff and students of the university. Team 1 also emphasises the steps required to preserve the security of the system and its data.

  18. The Paradox of "Structured" Methods for Software Requirements Management: A Case Study of an e-Government Development Project

    Science.gov (United States)

    Conboy, Kieran; Lang, Michael

    This chapter outlines the alternative perspectives of "rationalism" and "improvisation" within information systems development and describes the major shortcomings of each. It then discusses how these shortcomings manifested themselves within an e-government case study where a "structured" requirements management method was employed. Although this method was very prescriptive and firmly rooted in the "rational" paradigm, it was observed that users often resorted to improvised behaviour, such as privately making decisions on how certain aspects of the method should or should not be implemented.

  19. Teaching Software Engineering through Robotics

    OpenAIRE

    Shin, Jiwon; Rusakov, Andrey; Meyer, Bertrand

    2014-01-01

    This paper presents a newly-developed robotics programming course and reports the initial results of software engineering education in robotics context. Robotics programming, as a multidisciplinary course, puts equal emphasis on software engineering and robotics. It teaches students proper software engineering -- in particular, modularity and documentation -- by having them implement four core robotics algorithms for an educational robot. To evaluate the effect of software engineering educati...

  20. Non-Functional Requirements Oriented Software Process Modeling

    Institute of Scientific and Technical Information of China (English)

    张璇; 李彤; 王旭; 代飞; 谢仲文; 于倩

    2016-01-01

    The qualities of software are synonymous with its non-functional requirements (NFRs) and depend largely on the software process. Based on this viewpoint, collecting process strategies from different software engineering processes and using aspect-oriented modeling, an approach to modeling NFR-oriented software processes is proposed. The purpose of the approach is to ensure the development or evolution of high-quality software through the whole software life cycle. First, a knowledge base of process strategies is created to store the activities that ensure software quality. Based on these strategies, corresponding aspects are defined and composed into the base software process models; the chief motivation is that the activities for NFRs and the base process models can be created separately and easily composed later. In addition, conflicts between aspects, and between aspects and base models, are detected and controlled. Second, a modeling aid, NPAT (Non-functional requirements-oriented Processes Aided Tool), was developed to support the modeling of NFR-oriented software processes. Finally, the theory, the approach, and the tool were applied in a case study, which showed the theory and approach to be feasible and the tool to be effective. The NFR-oriented software process modeling approach can help an organization enhance software quality by adding NFR activities to its software processes.
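The composition step described above (NFR activities modelled separately, then woven into a base process model) follows the general aspect-weaving pattern. A toy sketch of that pattern, not the paper's BDL/NPAT tooling; activity names are invented for illustration:

```python
def weave(base_process, aspects):
    """Insert NFR activities before/after matching base activities.
    `aspects` maps a base activity to a (before, after) pair of activity lists."""
    woven = []
    for activity in base_process:
        before, after = aspects.get(activity, ([], []))
        woven.extend(before)          # advice that runs before the join point
        woven.append(activity)        # the base activity itself is unchanged
        woven.extend(after)           # advice that runs after the join point
    return woven

base = ["design", "implement", "release"]
security_aspect = {"implement": (["threat-model"], ["security-review"])}
print(weave(base, security_aspect))
# -> ['design', 'threat-model', 'implement', 'security-review', 'release']
```

The base process never mentions the security activities, which is exactly the separation the approach relies on: the two models evolve independently and are composed on demand.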

  1. Development of clinician-friendly software for musculoskeletal modeling and control.

    Science.gov (United States)

    Davoodi, R; Urata, C; Todorov, E; Loeb, G E

    2004-01-01

    Research and development in various fields dealing with human movement has been hampered by the lack of adequate software tools. We have formed a core development team to organize a collective effort by the research community to develop musculoskeletal modeling software that satisfies the requirements of both researchers and clinicians. We have identified initial requirements and have developed some of the basic components. We are developing common standards to facilitate sharing and reuse of musculoskeletal models and their component parts. Free distribution of the software and its source code will allow users to contribute to further development of the software as new models and data become available in the future.

  2. What Will They Learn? 2014-15. A Survey of Core Requirements at Our Nation's Colleges and Universities

    Science.gov (United States)

    Kempson, Lauri; Lewin, Greg; Burt, Evan; Poliakoff, Michael

    2014-01-01

    A college education is rightly part of the American Dream. It is seen as the ticket to success in career and community, a credential that repays the investment of time and money in higher education that students, families, and taxpayers make. In "What Will They Learn?"™ the authors take as a premise that the core purpose of attending…

  3. Software Engineering Reviews and Audits

    CERN Document Server

    Summers, Boyd L

    2011-01-01

    Accurate software engineering reviews and audits have become essential to the success of software companies and military and aerospace programs. These reviews and audits define the framework and specific requirements for verifying software development efforts. Authored by an industry professional with three decades of experience, Software Engineering Reviews and Audits offers authoritative guidance for conducting and performing software first article inspections, and functional and physical configuration software audits. It prepares readers to answer common questions for conducting and perform

  4. Software architecture

    CERN Document Server

    Vogel, Oliver; Chughtai, Arif

    2011-01-01

    As a software architect you work in a wide-ranging and dynamic environment. You have to understand the needs of your customer, design architectures that satisfy both functional and non-functional requirements, and lead development teams in implementing the architecture. And it is an environment that is constantly changing: trends such as cloud computing, service orientation, and model-driven procedures open up new architectural possibilities. This book will help you to develop a holistic architectural awareness and knowledge base that extends beyond concrete methods, techniques, and technologi

  5. Impact of Collaborative ALM on Software Project Management

    Directory of Open Access Journals (Sweden)

    Tahmoor Shoukat

    2014-05-01

    Full Text Available To produce a release of software, ALM is key to streamlining a team's work. ALM consists of the core disciplines of requirements definition and management, asset management, development, build creation, testing, and release, all planned by project management and orchestrated using some form of process [1]. The assets and their relationships are stored in the development team's repository. Detailed reports and charts provide visibility into the team's progress. In this paper we describe how ALM coordinates software development activities and assets for the production and management of software applications throughout their entire life cycle.

  6. COTS software selection process.

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, William M. (Strike Wire Technologies, Louisville, CO); Lin, Han Wei; McClelland, Kelly (U.S. Security Associates, Livermore, CA); Ullrich, Rebecca Ann; Khanjenoori, Soheil; Dalton, Karen; Lai, Anh Tri; Kuca, Michal; Pacheco, Sandra; Shaffer-Gant, Jessica

    2006-05-01

    Today's need for rapid software development has generated great interest in employing Commercial-Off-The-Shelf (COTS) software products as a way of managing cost, development time, and effort. With an abundance of COTS software packages to choose from, the problem now is how to systematically evaluate, rank, and select a COTS product that best meets the software project requirements and at the same time can leverage the current corporate information technology architectural environment. This paper describes a systematic process for decision support in evaluating and ranking COTS software. Performed right after requirements analysis, this process provides evaluators with concise, structured, step-by-step activities for determining the best COTS software product with manageable risk. In addition, the process is presented in phases that are flexible enough to allow customization or tailoring to meet various projects' requirements.
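An evaluate-and-rank step of the kind described above is commonly implemented as a weighted scoring matrix. A generic sketch; the criteria, weights, product names, and scores below are made up for illustration and are not taken from the paper:

```python
def rank_cots(candidates, weights):
    """Rank COTS candidates by weighted average of per-criterion scores (0-10)."""
    total_w = sum(weights.values())
    ranked = []
    for name, scores in candidates.items():
        score = sum(weights[c] * scores[c] for c in weights) / total_w
        ranked.append((round(score, 2), name))
    return sorted(ranked, reverse=True)   # best candidate first

weights = {"fit_to_requirements": 5, "vendor_support": 3, "cost": 2}
candidates = {
    "ProductA": {"fit_to_requirements": 8, "vendor_support": 6, "cost": 5},
    "ProductB": {"fit_to_requirements": 6, "vendor_support": 9, "cost": 8},
}
for score, name in rank_cots(candidates, weights):
    print(name, score)   # ProductB 7.3, then ProductA 6.8
```

Keeping the weights separate from the scores lets the same candidate data be re-ranked when project priorities shift, which is one reason such matrices are tailored per project.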

  7. Software Maintenance Success Recipes

    CERN Document Server

    Reifer, Donald J

    2011-01-01

    Dispelling much of the folklore surrounding software maintenance, Software Maintenance Success Recipes identifies actionable formulas for success based on in-depth analysis of more than 200 real-world maintenance projects. It details the set of factors that are usually present when effective software maintenance teams do their work and instructs on the methods required to achieve success. Donald J. Reifer--an award winner for his contributions to the field of software engineering and whose experience includes managing the DoD Software Initiatives Office--provides step-by-step guidance on how t

  8. Unique carbohydrate-carbohydrate interactions are required for high affinity binding between FcgammaRIII and antibodies lacking core fucose.

    Science.gov (United States)

    Ferrara, Claudia; Grau, Sandra; Jäger, Christiane; Sondermann, Peter; Brünker, Peter; Waldhauer, Inja; Hennig, Michael; Ruf, Armin; Rufer, Arne Christian; Stihle, Martine; Umaña, Pablo; Benz, Jörg

    2011-08-02

    Antibody-dependent cellular cytotoxicity (ADCC), a key immune effector mechanism, relies on the binding of antigen-antibody complexes to Fcγ receptors expressed on immune cells. Antibodies lacking core fucosylation show a large increase in affinity for FcγRIIIa, leading to improved receptor-mediated effector function. Although afucosylated IgGs exist naturally, a next generation of recombinant therapeutic, glycoengineered antibodies is currently being developed to exploit this finding. In this study, the crystal structures of a glycosylated Fcγ receptor complexed with either afucosylated or fucosylated Fc were determined, allowing a detailed, molecular understanding of the regulatory role of Fc-oligosaccharide core fucosylation in improving ADCC. The structures reveal a unique type of interface consisting of carbohydrate-carbohydrate interactions between glycans of the receptor and the afucosylated Fc. In contrast, in the complex structure with fucosylated Fc, these contacts are weakened or nonexistent, explaining the decreased affinity for the receptor. These findings allow us to understand the higher efficacy of therapeutic antibodies lacking the core fucose and also suggest a unique mechanism by which the immune system can regulate antibody-mediated effector functions.

  9. Lipopolysaccharide (LPS) inner-core phosphates are required for complete LPS synthesis and transport to the outer membrane in Pseudomonas aeruginosa PAO1.

    Science.gov (United States)

    Delucia, Angela M; Six, David A; Caughlan, Ruth E; Gee, Patricia; Hunt, Ian; Lam, Joseph S; Dean, Charles R

    2011-01-01

    also caused gross changes in cell morphology and led to the accumulation of an aberrant LPS lacking several core sugars and all core phosphates. The aberrant LPS failed to reach the OM, suggesting that WaaP is essential in P. aeruginosa because it is required to produce the full-length LPS that is recognized by the OM transport/assembly machinery in this organism. Therefore, WaaP may constitute a good target for the development of novel antipseudomonal agents.

  10. Evolvable Neural Software System

    Science.gov (United States)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.

  11. Software Validation using Power Profiles

    OpenAIRE

    Lencevicius, Raimondas; Metz, Edu; Ran, Alexander

    2002-01-01

    The validation of modern software systems incorporates both functional and quality requirements. This paper proposes a validation approach for one software quality requirement: power consumption. The approach validates whether the software produces the desired results with a minimum expenditure of energy. We present energy requirements and an approach for their validation using a power consumption model, a test-case specification, software traces, and power measurements. Three different appro...
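
    The approach can be illustrated with a toy model: multiply an assumed per-state power figure by the time the trace spends in each state, then check the total against an energy budget. The state names, power values, and budget below are illustrative assumptions, not the paper's data.

```python
# Minimal sketch of trace-based energy validation (state names, power values,
# and the joule budget are illustrative assumptions, not the paper's data):
# integrate a per-state power model over a software execution trace and
# check the total against the energy requirement.

POWER_MODEL_W = {"idle": 0.05, "cpu": 0.60, "radio": 1.20}  # assumed watts

def energy_from_trace(trace):
    """Energy in joules for a trace of (power_state, duration_s) entries."""
    return sum(POWER_MODEL_W[state] * duration for state, duration in trace)

def validate_energy(trace, budget_joules):
    """The quality requirement passes if modeled energy stays in budget."""
    return energy_from_trace(trace) <= budget_joules

trace = [("cpu", 2.0), ("radio", 0.5), ("idle", 10.0)]
# 0.60*2.0 + 1.20*0.5 + 0.05*10.0 = 2.3 J
```

    A real validation would derive the trace from instrumented runs and the power model from hardware measurements; the comparison step stays the same.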

  12. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  13. Designing Scientific Software for Heterogeneous Computing

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig

    …algorithms and data structures must be designed to utilize the underlying parallel architecture. The architectural changes in hardware design within the last decade, from single to multi- and many-core architectures, require software developers to identify and properly implement methods that both exploit concurrency and maintain numerical efficiency. Graphical Processing Units (GPUs) have proven to be very effective units for computing the solution of scientific problems described by partial differential equations (PDEs). GPUs have today become standard devices in portable, desktop, and supercomputers, which makes parallel software design applicable, but also a challenge for scientific software developers at all levels. We have developed a generic C++ library for fast prototyping of large-scale PDE solvers based on flexible-order finite difference approximations on structured regular grids. The library…
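
    The kind of stencil computation such a finite-difference library evaluates at every interior grid point can be sketched in a few lines. This is a plain-Python illustration under assumed names, not the library's C++/GPU API.

```python
# Plain-Python illustration of the stencil such a library evaluates at every
# interior point (assumed names; the actual library is a C++/GPU code):
# second-order central finite differences for u'' on a regular 1D grid.

def second_derivative(u, dx):
    """Apply the 3-point stencil (u[i-1] - 2*u[i] + u[i+1]) / dx**2."""
    d2 = [0.0] * len(u)  # boundary entries left at zero for simplicity
    for i in range(1, len(u) - 1):
        d2[i] = (u[i - 1] - 2.0 * u[i] + u[i + 1]) / (dx * dx)
    return d2

# For u(x) = x**2 the stencil is exact: u'' = 2 at every interior point.
dx = 0.1
u = [(i * dx) ** 2 for i in range(6)]
d2 = second_derivative(u, dx)
```

    On a GPU the same per-point update is applied by thousands of threads in parallel, which is why stencils on structured regular grids map so well to many-core hardware.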

  14. Software Communication Architecture Implementation and Its Waveform Application

    Institute of Scientific and Technical Information of China (English)

    SUN Pei-gang; ZHAO Hai; WANG Ting-chang; FAN Jian-hua

    2006-01-01

    This paper presents research on the development of software-defined radio (SDR) based on the software communication architecture (SCA). First, SCA is studied and a complete reference model of the SCA 3.0 core framework (CF) is realized. Second, an application-specific FM3TR waveform is implemented on a common software platform based on the reference model. Third, from the point of view of real-time performance and software reuse, tests and validations are performed on the realized CF reference model and FM3TR waveform. The results show that the SCA-compliant SDR has favorable interoperability and software portability and can satisfy real-time performance requirements that are not too rigorous.

  15. The Design and Application of Monte Carlo Software in Simulating the Energy Spectrum of γ Radiation Deposited in a Digital Core

    Institute of Scientific and Technical Information of China (English)

    唐朝云; 吴文圣; 吴冲; 卢贵武

    2014-01-01

    In order to simulate the γ-ray response properties of porous strata, software that can build both general geometric models and digital core models was developed using the Monte Carlo method. The spectrum of photon pulses deposited in a NaI crystal was simulated with both this software and MCNP4C, and the agreement of the results demonstrates that the program is a correct and effective simulation tool. On that basis, digital core models were constructed and the energy spectrum of γ radiation in sandstone of different porosities was simulated for density logging, providing a useful foundation for density-log interpretation.
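
    The core of such a Monte Carlo photon-transport code can be illustrated with a toy slab model: sample each photon's free path from an exponential distribution and score whether it interacts before escaping. The geometry, coefficient, and function names below are illustrative assumptions, not the paper's code.

```python
# Toy Monte Carlo photon-transport sketch (geometry, coefficient, and names
# are illustrative assumptions, not the paper's code): photons enter a slab
# of thickness T; each free path is sampled from an exponential distribution
# with total attenuation coefficient MU, and we score whether the photon
# interacts inside the slab or escapes.

import math
import random

MU = 0.5  # assumed total attenuation coefficient, 1/cm
T = 4.0   # assumed slab thickness, cm

def transmitted_fraction(n_photons, seed=1):
    """Estimate the fraction of photons crossing the slab without interacting."""
    rng = random.Random(seed)
    escaped = 0
    for _ in range(n_photons):
        path = rng.expovariate(MU)  # exponential free-path sampling
        if path > T:
            escaped += 1
    return escaped / n_photons

est = transmitted_fraction(100_000)
analytic = math.exp(-MU * T)  # Beer-Lambert transmission for the same slab
```

    With 100,000 histories the estimate lands close to the analytic transmission, the same kind of benchmark-against-known-answer check the authors perform against MCNP4C before trusting the code on digital-core geometries.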

  16. Some design constraints required for the use of generic software in embedded systems: Packages which manage abstract dynamic structures without the need for garbage collection

    Science.gov (United States)

    Johnson, Charles S.

    1986-01-01

    The embedded systems running real-time applications, for which Ada was designed, require their own mechanisms for the management of dynamically allocated storage. Because of the performance implications of garbage collection by the KAPSE, there is a need for packages that manage their own internal structures and control their deallocation as well. This places a requirement upon the design of generic packages that manage generically structured private types built up from application-defined input types. These kinds of generic packages should figure greatly in the development of lower-level software such as operating systems, schedulers, controllers, and device drivers, and will manage structures such as queues, stacks, linked lists, files, and binary and multary (hierarchical) trees. A study was made of the use of limited private types in solving the problems of controlling the accumulation of anonymous, detached objects in running systems, since inadvertent de-designation of dynamic elements, which must be controlled, is implicit in the assignment operation. The use of deallocator procedures for the run-down of application-defined input types during deallocation operations is also considered.

  17. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  18. Developing high-quality educational software.

    Science.gov (United States)

    Johnson, Lynn A; Schleyer, Titus K L

    2003-11-01

    The development of effective educational software requires a systematic process executed by a skilled development team. This article describes the core skills required of the development team members for the six phases of successful educational software development. During analysis, the foundation of product development is laid including defining the audience and program goals, determining hardware and software constraints, identifying content resources, and developing management tools. The design phase creates the specifications that describe the user interface, the sequence of events, and the details of the content to be displayed. During development, the pieces of the educational program are assembled. Graphics and other media are created, video and audio scripts written and recorded, the program code created, and support documentation produced. Extensive testing by the development team (alpha testing) and with students (beta testing) is conducted. Carefully planned implementation is most likely to result in a flawless delivery of the educational software and maintenance ensures up-to-date content and software. Due to the importance of the sixth phase, evaluation, we have written a companion article on it that follows this one. The development of a CD-ROM product is described including the development team, a detailed description of the development phases, and the lessons learned from the project.

  19. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the … for specific projects. L5: Analyze assurance technologies and contribute to the development of new ones. Assured Software Development L1

  20. Selecting the Right Software.

    Science.gov (United States)

    Shearn, Joseph

    1987-01-01

    Selection of administrative software requires analyzing present needs and, to meet future needs, choosing software that will function with a more powerful computer system. Other important factors to include are a professional system demonstration, maintenance and training, and financial considerations that allow leasing or renting alternatives.…

  1. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has opened up a new alternative: using open source software. The use of open source software is spreading in step with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many conceptions of what open source software is, ranging from software that is free of charge to software without a license. Not all of these notions are accurate, so the concept of open source software needs to be properly introduced: its history, its licenses and how to choose among them, and the considerations in selecting from the open source software available. Keywords: license, open source, intellectual property rights (HAKI)

  2. A Core Language for Separate Variability Modeling

    DEFF Research Database (Denmark)

    Iosif-Lazăr, Alexandru Florin; Wasowski, Andrzej; Schaefer, Ina

    2014-01-01

    Separate variability modeling adds variability to a modeling language without requiring modifications of the language or the supporting tools. We define a core language for separate variability modeling using a single kind of variation point to define transformations of software artifacts in object… hierarchical dependencies between variation points via copying and flattening. Thus, we reduce a model with intricate dependencies to a flat executable model transformation consisting of simple unconditional local variation points. The core semantics is extremely concise: it boils down to two operational rules…
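
    One possible reading of the flattened form described above: a variant is derived by applying a flat list of independent, local variation points, each rewriting a single model element when its feature is selected, with no hierarchical dependencies left. The model, feature names, and function below are assumptions for illustration, not the paper's formal semantics.

```python
# Sketch of a flat model transformation (assumed names, not the paper's
# formal semantics): after flattening, each variation point is local and
# independent, rewriting one model element when its feature is selected.

base_model = {"engine": "basic", "doors": 2}

# Each variation point: (feature that triggers it, element, replacement value)
variation_points = [
    ("sport", "engine", "turbo"),
    ("family", "doors", 5),
]

def derive(model, points, selected_features):
    """Apply every variation point whose feature is selected; no hierarchy."""
    out = dict(model)
    for feature, element, value in points:
        if feature in selected_features:
            out[element] = value
    return out

variant = derive(base_model, variation_points, {"sport"})
```

    Because the points are independent, the derivation order does not matter, which is what makes the flattened transformation simple to execute and reason about.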

  3. Software Reviews.

    Science.gov (United States)

    Smith, Richard L., Ed.

    1985-01-01

    Reviews software packages by providing extensive descriptions and discussions of their strengths and weaknesses. Software reviewed include (1) "VISIFROG: Vertebrate Anatomy" (grade seven-adult); (2) "Fraction Bars Computer Program" (grades three to six) and (3) four telecommunications utilities. (JN)

  4. A software engineering process for safety-critical software application.

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-02-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature, describing a viable process for high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process differs significantly from a conventional process in its rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verifying software requirements and code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author).

  5. Design Principles for Interactive Software

    DEFF Research Database (Denmark)

    The book addresses the crucial intersection of human-computer interaction (HCI) and software engineering by asking both what users require from interactive systems and what developers need to produce well-engineered software. Needs are expressed as…

  6. What Will They Learn? 2015-16. A Survey of Core Requirements at Our Nation's Colleges and Universities

    Science.gov (United States)

    Kempson, Lauri; Burt, Evan; Bledsoe, Eric; Poliakoff, Michael

    2015-01-01

    At a time when 87% of employers believe that our colleges must raise the quality of students' educations in order for the United States to remain competitive globally, and four in five Americans say they believe all graduates should have to take the key courses outlined in the study, few colleges require a real liberal arts education. "What…

  7. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  8. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performances. This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. We discuss these issues along with use case scenarios, and aim to identify challenges…

  9. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  10. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  11. Evaluation & Optimization of Software Engineering

    Directory of Open Access Journals (Sweden)

    Asaduzzaman Noman

    2016-06-01

    Full Text Available The term is made of two words: software and engineering. Software is more than just program code. A program is executable code that serves some computational purpose. Software is a collection of executable programming code, associated libraries, and documentation. Software made for a specific requirement is called a software product. Engineering, on the other hand, is all about developing products using well-defined scientific principles and methods. The outcome of software engineering is an efficient and reliable software product. IEEE defines software engineering as: the application of a systematic, disciplined, quantifiable approach to the development, operation and maintenance of software; that is, the application of engineering to software.

  12. ATLAS software configuration and build tool optimisation

    Science.gov (United States)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised within several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on optimisation of CMT commands in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, and increased the efficiency of…

  13. Real-time Multitasking Macro Parallel Software Architecture Based on Single-core DSP

    Institute of Scientific and Technical Information of China (English)

    周敬东; 黄云朋; 周明刚; 李敏慧; 程钗

    2015-01-01

    Most existing single-core embedded systems execute tasks serially, so the system cannot respond quickly and effectively to other tasks while a complex task is executing. A real-time multitasking, macroscopically parallel software architecture based on a single-core DSP is designed by splitting complex tasks to bound the execution time of any single task, reducing the use of delay routines, and scheduling tasks by polling task flags. Experimental tests show that this architecture responds rapidly to tasks in a complex multitasking system, satisfies the system's real-time and stability requirements, and runs multiple tasks in parallel at the macroscopic level.
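
    The flag-polling idea can be sketched as a cooperative dispatcher: complex work is split into short steps, each step raises the ready flag of its successor, and the dispatcher runs whichever steps are ready, so no single step monopolizes the core. Task names and the three-stage pipeline below are illustrative assumptions, not the paper's code.

```python
# Hedged sketch of the flag-polling architecture (illustrative names, not
# the paper's code): complex work is split into short steps; each step
# clears its own ready flag and raises its successor's, and a dispatcher
# loop runs whatever is ready.

task_flags = {"sample": True, "filter": False, "report": False}
log = []

def sample_step():
    log.append("sample")
    task_flags["sample"] = False
    task_flags["filter"] = True   # hand off to the next pipeline stage

def filter_step():
    log.append("filter")
    task_flags["filter"] = False
    task_flags["report"] = True

def report_step():
    log.append("report")
    task_flags["report"] = False  # pipeline drained

TASKS = {"sample": sample_step, "filter": filter_step, "report": report_step}

def dispatch_once():
    """One pass of the macroscopically parallel loop: run every ready task."""
    for name, step in TASKS.items():
        if task_flags[name]:
            step()

for _ in range(3):  # a few dispatcher passes; this toy pipeline drains quickly
    dispatch_once()
```

    Because every step returns quickly to the dispatcher, a newly raised flag from an interrupt or another task is noticed within one pass, which is what gives the serial core its macroscopic parallelism.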

  14. The software invention cube: A classification scheme for software inventions

    NARCIS (Netherlands)

    Bergstra, J.A.; Klint, P.

    2008-01-01

    The patent system protects inventions. The requirement that a software invention should make ‘a technical contribution’ turns out to be untenable in practice and this raises the question, what constitutes an invention in the realm of software. The authors developed the Software Invention Cube (SWIC)

  15. The software invention cube: A classification scheme for software inventions

    NARCIS (Netherlands)

    J.A. Bergstra; P. Klint (Paul)

    2008-01-01

    The patent system protects inventions. The requirement that a software invention should make ‘a technical contribution’ turns out to be untenable in practice and this raises the question, what constitutes an invention in the realm of software. The authors developed the Software Invention

  16. Scientific Software Component Technology

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, S.; Dykman, N.; Kumfert, G.; Smolinski, B.

    2000-02-16

    We are developing new software component technology for high-performance parallel scientific computing to address issues of complexity, re-use, and interoperability for laboratory software. Component technology enables cross-project code re-use, reduces software development costs, and provides additional simulation capabilities for massively parallel laboratory application codes. The success of our approach will be measured by its impact on DOE mathematical and scientific software efforts. Thus, we are collaborating closely with library developers and application scientists in the Common Component Architecture forum, the Equation Solver Interface forum, and other DOE mathematical software groups to gather requirements, write and adopt a variety of design specifications, and develop demonstration projects to validate our approach. Numerical simulation is essential to the science mission at the laboratory. However, it is becoming increasingly difficult to manage the complexity of modern simulation software. Computational scientists develop complex, three-dimensional, massively parallel, full-physics simulations that require the integration of diverse software packages written by outside development teams. Currently, the integration of a new software package, such as a new linear solver library, can require several months of effort. Current industry component technologies such as CORBA, JavaBeans, and COM have all been used successfully in the business domain to reduce software development costs and increase software quality. However, these existing industry component infrastructures will not scale to support massively parallel applications in science and engineering. In particular, they do not address issues related to high-performance parallel computing on ASCI-class machines, such as fast in-process connections between components, language interoperability for scientific languages such as Fortran, parallel data redistribution between components, and massively

  17. Software piracy

    OpenAIRE

    Kráčmer, Stanislav

    2011-01-01

    The objective of the present thesis is to clarify the term software piracy and to determine the responsibility of individual entities in the actual realization of software piracy. First, the thesis focuses on the computer programme: the causes, realization and pitfalls of its inclusion under copyright protection. Subsequently, it examines methods of legal usage of a computer programme. This is the point of departure for the following attempt to define software piracy, accompanied by methods of actu...

  18. 48 CFR 12.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...

  19. RELAP-7 SOFTWARE VERIFICATION AND VALIDATION PLAN

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L [Idaho National Laboratory; Choi, Yong-Joon [Idaho National Laboratory; Zou, Ling [Idaho National Laboratory

    2014-09-01

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7.

  20. Location planning software of fixture based on process requirements

    Institute of Scientific and Technical Information of China (English)

    彭贺明; 吴玉光

    2013-01-01

    为提高工件加工的质量,研究面向精加工工序的夹具定位方案自动规划方法,根据工序要求和工件几何特性建立定位基准的选择和评价准则.将工件的加工工序要求转换成设计基准的约束自由度任务,根据几何特性建立基准约束能力评价方法,利用基准的几何类型和位置关系类型评价基准的约束自由度能力.根据基准之间的位置关系建立基准的组合和基准替换方法,解决基准定位能力不足和基准约束冲突问题,在定位能力评价和定位问题解决过程中确定基准的定位点数量和布局.利用UG NX 7.5开发平台和VC++语言,编制基于三维实体模型的夹具定位方案自动规划原型软件,并通过实例进行了验证.%To improve the quality of workpiece machining,the automated planning approach for fixture location oriented to finishing process was studied.According to th eprocess requirements and the workpiece geometrical features,the evaluation criterion of location datum was determined.Process requirements of workpiece was converted to the constrained Degrees Of Freedom (DOF) of design datum,the evaluation method of datum constraint ability was established,and the capability of constraint DOF of workpiece datum was evaluated according to the geometrical types and the types of position relationship.The combination and the replacement method of datum were constructed based on position relationship between datum.Thus the datum conflicts and the incapability of the datum locating were solved.The location quantity and the layout of the datum locating points were determined during the process of the locating ability evaluation and the locating problem resolution.The automated location planning prototype software of fixture for 3D solid model was developed by UG NX7.5 development platform and VCA++ Language,and it was proved to be effective by examples.

  1. A Task Scheduling and Allocation Algorithm for Asymmetric Multi-core Software-defined Radio

    Institute of Scientific and Technical Information of China (English)

    徐力; 史少波

    2014-01-01

    Aiming at the synchronous data-flow character of Software-Defined Radio (SDR) applications, this paper proposes a task scheduling and allocation algorithm for asymmetric multi-core SDR. The algorithm jointly considers inter-task communication time and fixed task pipelining, ensuring both generality and parallelism in task scheduling and allocation. Task scheduling and allocation are modeled with Integer Linear Programming (ILP), and a task-splitting method is used to optimize the scheduling and allocation results, further improving execution efficiency. Experiments implementing IEEE 802.11a frequency-offset estimation on the target SDR platform show that the proposed algorithm improves SDR throughput by 5.97% and average processor-core utilization by 3.03%, and reduces the longest idle waiting time of a processor core by 34.31%.
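
    The record does not reproduce the ILP formulation itself. As a rough illustration of the objective it describes (per-core load on an asymmetric platform plus inter-task communication penalties), the sketch below brute-forces a tiny task-to-core assignment; all task names, costs, speed factors and dependencies are invented for the example, and an ILP solver would replace the exhaustive search at scale.

```python
from itertools import product

# Invented task costs (in cycles) and per-core speed factors for an
# asymmetric two-core platform; numbers are illustrative only.
TASKS = {"fft": 120, "demod": 80, "decode": 200}
CORE_SPEED = [1.0, 0.5]  # core 1 runs at half the speed of core 0
COMM_COST = 10           # penalty when dependent tasks sit on different cores
DEPS = [("fft", "demod"), ("demod", "decode")]  # producer -> consumer pairs

def makespan(assignment):
    """Max per-core execution time plus communication penalties -- a crude
    stand-in for the ILP objective described in the abstract."""
    load = [0.0] * len(CORE_SPEED)
    for task, core in assignment.items():
        load[core] += TASKS[task] / CORE_SPEED[core]
    comm = sum(COMM_COST for a, b in DEPS if assignment[a] != assignment[b])
    return max(load) + comm

def best_assignment():
    """Exhaustively try every task-to-core mapping (fine for 3 tasks)."""
    names = list(TASKS)
    best = None
    for cores in product(range(len(CORE_SPEED)), repeat=len(names)):
        cand = dict(zip(names, cores))
        if best is None or makespan(cand) < makespan(best):
            best = cand
    return best

print(best_assignment())
```

With these invented numbers, the search places the lightly loaded producer on the slow core and keeps the heavy consumer chain on the fast one.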

  2. Terminological recommendations for software localization

    Directory of Open Access Journals (Sweden)

    Klaus-Dirk Schmitz

    2012-08-01

    Full Text Available After an explosive growth of data processing and software starting at the beginning of the 1980s, the software industry shifted toward a strong orientation in non-US markets at the beginning of the 1990s. Today we see the global marketing of software in almost all regions of the world. Since software is no longer used by IT experts only, and since European and national regulations require user interfaces, manuals and documentation to be provided in the language of the customer, the market for software translation, i.e. for software localization, is the fastest growing market in the translation business.

  4. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution. The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss one particular framework named TranSAT, which addresses the above problems of software architecture evolution. TranSAT provides a new element in the software architecture description language, called an architectural aspect, for describing new concerns and their integration into an existing architecture.

  5. Software licenses: Stay honest!

    CERN Document Server

    Computer Security Team

    2012-01-01

    Do you recall our article about copyright violation in the last issue of the CERN Bulletin, “Music, videos and the risk for CERN”? Now let’s be more precise. “Violating copyright” not only means the illegal download of music and videos, it also applies to software packages and applications.   Users must respect proprietary rights in compliance with the CERN Computing Rules (OC5). Not having legitimately obtained a program or the required licenses to run that software is not a minor offense. It violates CERN rules and puts the Organization at risk! Vendors deserve credit and compensation. Therefore, make sure that you have the right to use their software. In other words, you have bought the software via legitimate channels and use a valid and honestly obtained license. This also applies to “Shareware” and software under open licenses, which might also come with a cost. Usually, only “Freeware” is complete...

  6. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Full Text Available Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is through the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third part discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  8. Performance Flexibility Architecture of Core Service Platform for Next-Generation Network

    Institute of Scientific and Technical Information of China (English)

    YANG Menghui; YANG Weikang; WANG Xiaoge; LIAO Jianxin; CHEN Junliang

    2008-01-01

    The hardware and software architectures of core service platforms for next-generation networks were analyzed to compute the minimum-cost hardware configuration of a core service platform. This method gives a closed-form expression for the optimized hardware cost configuration based on the service requirements, the processing features of the computers running the core service platform software, and the processing capabilities of the common object request broker architecture middleware. Three simulation scenarios were used to evaluate the model. The input includes the number of servers for the protocol mapping (PM), Parlay gateway (PG), application server (AS), and communication handling (CH) functions. The simulation results show that the mean delay meets requirements; when the numbers of servers for the PM, PG, AS, and CH functions were not properly selected, the mean delay was excessive. The simulation results show that the model is valid and can be used to optimize investments in core service platforms.

  9. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  10. Formulation of a Production Strategy for a Software Product Line

    Science.gov (United States)

    2009-08-01

    individual assets. 4. Be acquisitive. Acquire as much of the software as possible [Bergey 2006]. This strategy affects the development of core... requires more up-front investment than more manual development strategies. Complex operations such as building the product can be repeated quickly and... Architecture in Practice. Addison-Wesley Professional, 2003. [Bergey 2006] Bergey, J. & Cohen, S. Product Line Acquisition in a DoD Organization

  11. libdrdc: software standards library

    Science.gov (United States)

    Erickson, David; Peng, Tie

    2008-04-01

    This paper presents the libdrdc software standards library including internal nomenclature, definitions, units of measure, coordinate reference frames, and representations for use in autonomous systems research. This library is a configurable, portable C-function wrapped C++ / Object Oriented C library developed to be independent of software middleware, system architecture, processor, or operating system. It is designed to use the automatically-tuned linear algebra suite (ATLAS) and Basic Linear Algebra Suite (BLAS) and port to firmware and software. The library goal is to unify data collection and representation for various microcontrollers and Central Processing Unit (CPU) cores and to provide a common Application Binary Interface (ABI) for research projects at all scales. The library supports multi-platform development and currently works on Windows, Unix, GNU/Linux, and Real-Time Executive for Multiprocessor Systems (RTEMS). This library is made available under LGPL version 2.1 license.

  12. Application of Improved SQUARE Model in Software Security Requirements Elicitation

    Institute of Scientific and Technical Information of China (English)

    范洁; 许盛伟; 娄嘉鹏

    2013-01-01

    The elicitation of security requirements is a key factor in ensuring software security. To obtain a software system's security requirements effectively, on the basis of an analysis of the Security Quality Requirements Engineering (SQUARE) model, the execution steps of the model were improved, a classification standard for security requirements was defined, and an XML Schema definition of the security requirements document was presented. The Light-SQUARE model was applied to a university student score management system to elicit its security requirements, and the elicited requirements were stored in XML format, making them usable across platforms.
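
    As a hedged illustration of the cross-platform XML storage the abstract describes, the fragment below builds and re-parses a minimal security-requirements document with Python's standard library. The element names, attributes and example requirement are invented, not the schema from the paper.

```python
import xml.etree.ElementTree as ET

# Build a minimal (hypothetical) security-requirements document.
doc = ET.Element("securityRequirements")
req = ET.SubElement(doc, "requirement", id="SR-01", category="confidentiality")
ET.SubElement(req, "description").text = "Student scores must be encrypted at rest."
ET.SubElement(req, "priority").text = "high"

xml_text = ET.tostring(doc, encoding="unicode")

# Any platform with an XML parser can read the document back.
parsed = ET.fromstring(xml_text)
categories = [r.get("category") for r in parsed.findall("requirement")]
print(categories)
```

In practice the document would be validated against the paper's XML Schema definition before exchange.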

  13. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  14. SLS Flight Software Testing: Using a Modified Agile Software Testing Approach

    Science.gov (United States)

    Bolton, Albanie T.

    2016-01-01

    NASA's Space Launch System (SLS) is an advanced launch vehicle for a new era of exploration beyond earth's orbit (BEO). The world's most powerful rocket, SLS will launch crews of up to four astronauts in the agency's Orion spacecraft on missions to explore multiple deep-space destinations. Boeing is developing the SLS core stage, including the avionics that will control the vehicle during flight. The core stage will be built at NASA's Michoud Assembly Facility (MAF) in New Orleans, LA using state-of-the-art manufacturing equipment. At the same time, the rocket's avionics computer software is being developed at Marshall Space Flight Center in Huntsville, AL. At Marshall, the Flight and Ground Software division provides comprehensive engineering expertise for development of flight and ground software. Within that division, the Software Systems Engineering Branch's test and verification (T&V) team uses an agile test approach in testing and verification of software. The agile software test method opens the door for regular short sprint release cycles. The basic premise of agile software development and testing is that work is iterative and delivered incrementally: requirements and solutions evolve through collaboration between cross-functional teams. With testing and development done incrementally, releases gain increased features and enhanced value. This value can be seen throughout the T&V team processes that are documented in various work instructions within the branch. The T&V team produces procedural test results at a higher rate, resolves issues found in software with designers at an earlier stage rather than in a later release, and team members gain increased knowledge of the system architecture by interfacing with designers. SLS Flight Software teams want to continue uncovering better ways of developing software in an efficient and project-beneficial manner.

  15. Biological Imaging Software Tools

    Science.gov (United States)

    Eliceiri, Kevin W.; Berthold, Michael R.; Goldberg, Ilya G.; Ibáñez, Luis; Manjunath, B.S.; Martone, Maryann E.; Murphy, Robert F.; Peng, Hanchuan; Plant, Anne L.; Roysam, Badrinath; Stuurman, Nico; Swedlow, Jason R.; Tomancak, Pavel; Carpenter, Anne E.

    2013-01-01

    Few technologies are more widespread in modern biological laboratories than imaging. Recent advances in optical technologies and instrumentation are providing hitherto unimagined capabilities. Almost all these advances have required the development of software to enable the acquisition, management, analysis, and visualization of the imaging data. We review each computational step that biologists encounter when dealing with digital images, the challenges in that domain, and the overall status of available software for bioimage informatics, focusing on open source options. PMID:22743775

  16. Design of a stateless low-latency router architecture for green software-defined networking

    DEFF Research Database (Denmark)

    Saldaña Cercos, Silvia; Ramos, Ramon M.; Eller, Ana C. Ewald;

    2015-01-01

    Expanding software defined networking (SDN) to transport networks requires new strategies to deal with the large number of flows that future core networks will have to face. New south-bound protocols within SDN have been proposed to benefit from having control plane detached from the data plane...

  17. Discussion on software aging management of nuclear power plant safety digital control system.

    Science.gov (United States)

    Liang, Huihui; Gu, Pengfei; Tang, Jianzhong; Chen, Weihua; Gao, Feng

    2016-01-01

    Managing the aging of digital control systems ensures that nuclear power plant systems retain adequate safety margins during their life cycles. Software is a core component in the execution of control logic and differs between digital and analog control systems. Hardware aging management for digital control systems is similar to that for analog systems, which has matured over decades of study; software aging management, however, is still in the exploratory stage. Software aging evaluation is critical given the high reliability and safety requirements of nuclear power plants. To ensure effective inputs for reliability assessment, this paper specifies the software aging information required during the life cycle. Moreover, a software aging management scheme for safety digital control systems is proposed on the basis of the collected aging information.

  18. Wildlife software: procedures for publication of computer software

    Science.gov (United States)

    Samuel, M.D.

    1990-01-01

    Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.

  19. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

    Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting-edge software application. Traditional forms of technical education pay little attention to creativity - often encouraging overly rationalistic ways of thinking which stifle the ability to innovate. Professional software developers are often drowned in commercial drudgery and overwhelmed by work pressure and deadlines. The topic that will both ensure success in the market and revitalize their work lives is never addressed. This book sets out the new field of software innovation. It organizes the existing scientific research into eight simple heuristics - guiding principles for organizing a system developer's work-life so that it focuses on innovation.

  20. Software Reviews.

    Science.gov (United States)

    Wulfson, Stephen, Ed.

    1987-01-01

    Reviews seven computer software programs that can be used in science education programs. Describes courseware which deals with muscles and bones, terminology, classifying animals without backbones, molecular structures, drugs, genetics, and shaping the earth's surface. (TW)

  1. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  2. Reusable Software.

    Science.gov (United States)

    1984-03-01

    overseeing reusable software, the Reusable Software Organization (RUSO). This author does not feel at this time that establishment of such a specific... [49] have not been accompanied by establishment of RUSO-like activities. There is need, however, for assurance that functions which a RUSO might be... assurance 6. establishment and maintenance of reuse archival facilities and activities. Actual establishment of a RUSO is best dictated by size of the

  3. Software Epistemology

    Science.gov (United States)

    2016-03-01

    comprehensive approach for determining software epistemology which significantly advances the state of the art in automated vulnerability discovery... straightforward. First, internet-based repositories of open source software (e.g., FreeBSD ports, GitHub, SourceForge, etc.) are mined... the fix delta, we attempted to perform the same process to determine if the firmware release present in an Internet-of-Things (IoT) streaming camera

  4. Structure of the cytoplasmic domain of TcpE, the inner membrane core protein required for assembly of the Vibrio cholerae toxin-coregulated pilus.

    Science.gov (United States)

    Kolappan, Subramaniapillai; Craig, Lisa

    2013-04-01

    Type IV pili are long thin surface-displayed polymers of the pilin subunit that are present in a diverse group of bacteria. These multifunctional filaments are critical to virulence for pathogens such as Vibrio cholerae, which use them to form microcolonies and to secrete the colonization factor TcpF. The type IV pili are assembled from pilin subunits by a complex inner membrane machinery. The core component of the type IV pilus-assembly platform is an integral inner membrane protein belonging to the GspF superfamily of secretion proteins. These proteins somehow convert chemical energy from ATP hydrolysis by an assembly ATPase on the cytoplasmic side of the inner membrane to mechanical energy for extrusion of the growing pilus filament out of the inner membrane. Most GspF-family inner membrane core proteins are predicted to have N-terminal and central cytoplasmic domains, cyto1 and cyto2, and three transmembrane segments, TM1, TM2 and TM3. Cyto2 and TM3 represent an internal repeat of cyto1 and TM1. Here, the 1.88 Å resolution crystal structure of the cyto1 domain of V. cholerae TcpE, which is required for assembly of the toxin-coregulated pilus, is reported. This domain folds as a monomeric six-helix bundle with a positively charged membrane-interaction face at one end and a hydrophobic groove at the other end that may serve as a binding site for partner proteins in the pilus-assembly complex.

  5. The Ettention software package

    Energy Technology Data Exchange (ETDEWEB)

    Dahmen, Tim, E-mail: Tim.Dahmen@dfki.de [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany); Marsalek, Lukas [Eyen SE, Na Nivách 1043/16, 141 00 Praha 4 (Czech Republic); Saarland University, 66123 Saarbrücken (Germany); Marniok, Nico [Saarland University, 66123 Saarbrücken (Germany); Turoňová, Beata [Saarland University, 66123 Saarbrücken (Germany); IMPRS-CS, Max-Planck Institute for Informatics, Campus E 1.4, 66123 Saarbrücken (Germany); Bogachev, Sviatoslav [Saarland University, 66123 Saarbrücken (Germany); Trampert, Patrick; Nickels, Stefan [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Slusallek, Philipp [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany)

    2016-02-15

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.
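
    The Kaczmarz method at the core of Ettention's block-iterative reconstruction projects the current estimate onto one row constraint of Ax = b at a time. A minimal pure-Python sketch on a dense 2x2 system (none of Ettention's electron-tomography specifics, blocking, or GPU handling) looks like this:

```python
# Toy consistent system: 2x + y = 5, x + 3y = 10 (solution x = 1, y = 3).
A = [[2.0, 1.0], [1.0, 3.0]]
b = [5.0, 10.0]

def kaczmarz(A, b, sweeps=200):
    """Cyclically project the estimate onto each hyperplane a_i . x = b_i."""
    x = [0.0] * len(A[0])
    for _ in range(sweeps):
        for row, bi in zip(A, b):
            dot = sum(r * xi for r, xi in zip(row, x))
            scale = (bi - dot) / sum(r * r for r in row)
            x = [xi + scale * r for xi, r in zip(x, row)]
    return x

x = kaczmarz(A, b)
print(x)
```

In tomography, each "row" is one projection-ray equation, so the same update runs over millions of rays per iteration and is what Ettention offloads to GPUs and Xeon Phi.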

  6. Soft Computing Approach for Software Cost Estimation

    OpenAIRE

    Iman Attarzadeh; Siew Hock Ow

    2010-01-01

    Software metrics and estimation are based on measuring software attributes, which are typically related to the product, the process and the resources of software development. One of the greatest challenges for software developers over the last decades has been predicting the development effort for a software system from such metrics. Project managers are required to have the ability to give a good estimate of software development effort. Most of the traditional techniques, such as function points, re...
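
    As one concrete example of the traditional techniques the abstract mentions alongside function points, basic COCOMO in organic mode estimates effort as a power law of code size. The coefficients below are Boehm's published organic-mode constants; the cost-driver multipliers of intermediate COCOMO are omitted in this sketch.

```python
def cocomo_organic(kloc: float) -> float:
    """Basic COCOMO effort (person-months) for organic-mode projects:
    effort = 2.4 * KLOC^1.05."""
    return 2.4 * kloc ** 1.05

# A 10 KLOC project estimated under organic-mode assumptions.
print(round(cocomo_organic(10.0), 1))
```

Soft-computing approaches such as the one in this paper typically learn such coefficients (or replace the power law entirely) from historical project data.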

  7. Clustering Methodologies for Software Engineering

    Directory of Open Access Journals (Sweden)

    Mark Shtern

    2012-01-01

    Full Text Available The size and complexity of industrial strength software systems are constantly increasing. This means that the task of managing a large software project is becoming even more challenging, especially in light of high turnover of experienced personnel. Software clustering approaches can help with the task of understanding large, complex software systems by automatically decomposing them into smaller, easier-to-manage subsystems. The main objective of this paper is to identify important research directions in the area of software clustering that require further attention in order to develop more effective and efficient clustering methodologies for software engineering. To that end, we first present the state of the art in software clustering research. We discuss the clustering methods that have received the most attention from the research community and outline their strengths and weaknesses. Our paper describes each phase of a clustering algorithm separately. We also present the most important approaches for evaluating the effectiveness of software clustering.
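
    As a toy illustration of the decomposition idea (not any specific algorithm surveyed in the paper), the sketch below agglomeratively merges the two module clusters with the strongest dependency coupling until a target subsystem count is reached; the module names and edge weights are invented.

```python
# Invented module dependency graph: edge weight = number of references.
EDGES = {("parser", "lexer"): 9, ("parser", "ast"): 7,
         ("ui", "widgets"): 8, ("ui", "parser"): 1}

def cluster(modules, edges, k):
    """Greedy agglomerative clustering: repeatedly merge the pair of
    clusters with the highest total inter-cluster edge weight."""
    clusters = [{m} for m in modules]

    def coupling(a, b):
        return sum(w for (u, v), w in edges.items()
                   if (u in a and v in b) or (u in b and v in a))

    while len(clusters) > k:
        i, j = max(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda p: coupling(clusters[p[0]], clusters[p[1]]))
        clusters[i] |= clusters[j]
        del clusters[j]
    return clusters

mods = ["parser", "lexer", "ast", "ui", "widgets"]
print(cluster(mods, EDGES, 2))
```

Real software-clustering tools differ mainly in how they define coupling and how they evaluate the resulting decomposition, which is exactly the design space the paper surveys.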

  8. Durable ideas in software engineering concepts, methods and approaches from my virtual toolbox

    CERN Document Server

    J Cusick, James

    2013-01-01

    ""Software Engineering now occupies a central place in the development of technology and in the advancement of the economy. From telecommunications to aerospace and from cash registers to medical imaging, software plays a vital and often decisive role in the successful accomplishment of a variety of projects. The creation of software requires a variety of techniques, tools, and especially, properly skilled engineers. This e-book focuses on core concepts and approaches that have proven useful to the author time and time again on many industry projects over a quarter century of research, develo

  9. Software quality assurance: in large scale and complex software-intensive systems

    NARCIS (Netherlands)

    Mistrik, I.; Soley, R.; Ali, N.; Grundy, J.; Tekinerdogan, B.

    2015-01-01

    Software Quality Assurance in Large Scale and Complex Software-intensive Systems presents novel and high-quality research related approaches that relate the quality of software architecture to system requirements, system architecture and enterprise-architecture, or software testing. Modern software

  11. A Method Based on Role and Collaboration for Capturing Software Requirements

    Institute of Scientific and Technical Information of China (English)

    郭辉; 董瑞志

    2012-01-01

    Capturing customers' desired requirements precisely is a necessary task, and many efforts have been made in requirements engineering; broadly used approaches include goal-oriented, object-oriented and team-oriented approaches. This paper presents a collaborative method and process for developing software requirements on a networked collaboration platform. Many roles are defined, and each role performs a number of requirements-writing, reviewing and commenting tasks concurrently in the network environment; different roles are responsible for different tasks. A number of role relationships are proposed based on the collaboration mode. To support collaborative work, the platform provides several collaborative services to satisfy the needs of developing the requirements of a software system, and all users complete their work through different collaboration modes. After all stakeholders finish authoring their requirements, an integrated software requirements document is generated.

  12. Advanced Software Protection Now

    CERN Document Server

    Bendersky, Diego; Notarfrancesco, Luciano; Sarraute, Carlos; Waissbein, Ariel

    2010-01-01

    Software digital rights management is a pressing need for the software development industry, and it remains unmet, as no practical solution has been acclaimed as successful by the industry. We introduce a novel software-protection method, fully implemented with today's technologies, that provides traitor tracing and license enforcement and requires no additional hardware nor inter-connectivity. Our work benefits from the use of secure triggers, a cryptographic primitive that is secure assuming the existence of an ind-cpa secure block cipher. Using our framework, developers may insert license checks and fingerprints, and obfuscate the code using secure triggers. As a result, this raises the cost for software analysis tools to detect and modify the protection mechanisms, thus raising the complexity of cracking the system.
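
    The secure-trigger idea can be sketched as follows: the protected payload is stored encrypted, and the decryption key is derived from the triggering input itself, so the payload is released only when the right input arrives and static analysis of the stored blob reveals nothing. This is an illustration only, not the authors' construction: XOR with a SHA-256 counter keystream stands in for the IND-CPA block cipher the paper assumes, and all names and inputs are invented.

```python
import hashlib

def _keystream(secret: bytes, n: int) -> bytes:
    """Derive n pseudorandom bytes from secret via SHA-256 in counter mode."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(secret + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def protect(payload: bytes, trigger_input: bytes):
    """Encrypt payload under a key derived from the trigger input; store
    only a hash of the key for the runtime check."""
    key = hashlib.sha256(trigger_input).digest()
    blob = bytes(p ^ k for p, k in zip(payload, _keystream(key, len(payload))))
    return hashlib.sha256(key).hexdigest(), blob

def fire(check: str, blob: bytes, candidate: bytes):
    """Release the payload only if the candidate input derives the right key."""
    key = hashlib.sha256(candidate).digest()
    if hashlib.sha256(key).hexdigest() != check:
        return None  # wrong input: payload stays opaque
    return bytes(c ^ k for c, k in zip(blob, _keystream(key, len(blob))))

check, blob = protect(b"licensed-feature-code", b"VALID-LICENSE-KEY")
```

The same pattern embeds license checks in shipped binaries: without the triggering input, an attacker holds only `check` and `blob`, neither of which exposes the protected code.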

  13. Software citation principles

    Directory of Open Access Journals (Sweden)

    Arfon M. Smith

    2016-09-01

    Full Text Available Software is a critical part of modern research and yet there is little support across the scholarly ecosystem for its acknowledgement and citation. Inspired by the activities of the FORCE11 working group focused on data citation, this document summarizes the recommendations of the FORCE11 Software Citation Working Group and its activities between June 2015 and April 2016. Based on a review of existing community practices, the goal of the working group was to produce a consolidated set of citation principles that may encourage broad adoption of a consistent policy for software citation across disciplines and venues. Our work is presented here as a set of software citation principles, a discussion of the motivations for developing the principles, reviews of existing community practice, and a discussion of the requirements these principles would place upon different stakeholders. Working examples and possible technical solutions for how these principles can be implemented will be discussed in a separate paper.

  14. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    This article presents MIAWARE, a software for Medical Image Analysis With Automated Reporting Engine, which was designed and developed for doctor/radiologist assistance. It allows analysis of an image stack from a computed axial tomography scan of the lungs (thorax) and, at the same time, marking of all...... pathologies on images and reporting of their characteristics. The reporting process is normalized - radiologists cannot describe pathological changes in their own words, but can only use terms from a specific vocabulary set provided by the software. Consequently, a normalized radiological report...... is automatically generated. Furthermore, the MIAWARE software is accompanied by an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through a specially developed ontology. As a result...

  15. A Requirement-Driven Software Trustworthiness Evaluation and Evolution Model

    Institute of Scientific and Technical Information of China (English)

    丁帅; 鲁付俊; 杨善林; 夏承遗

    2011-01-01

    A software trustworthiness evaluation model is built upon accurately eliciting trustworthy requirements and reasonably establishing an indicator system in the specific application domain. For software with a large architecture and complex non-functional demands, trustworthy requirements change as the software's operational state transitions, and this dynamic evolution of trustworthy requirements affects the stability of the trustworthiness evaluation indicator system. The software trustworthiness evaluation and evolution problem is attracting wide attention in the field of trustworthy software. In this paper, a novel requirement-driven software trustworthiness evaluation and evolution model is designed. First, several key technologies adopted in the process of software trustworthiness evaluation are analyzed and summarized, such as requirements analysis and indicator extraction, and trustworthy evidence acquisition and conversion; the problem of adaptively solving trustworthiness evaluation under requirements evolution is also discussed. Second, an incidence matrix is used to perform correlation analysis between trustworthy attributes, revealing the variation rule of their relative weights. On this basis, an adaptive reconstruction device, which can analyze and solve the self-reconfiguration of the software trustworthiness evaluation indicator system, is designed based on the incidence matrix. Finally, a complete framework of the trustworthiness evaluation and evolution model is proposed. Experimental results show the rationality and validity of the model.
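How an incidence matrix can drive both the correlation analysis between trustworthy attributes and their relative weights can be sketched as follows; the matrix, the attribute names, and the specific correlation and weighting formulas are invented for illustration and are not the paper's definitions.

```python
# Hypothetical incidence matrix: rows = trustworthy attributes,
# columns = evidence items; 1 means the evidence supports the attribute.
attributes = ["reliability", "safety", "security"]
M = [
    [1, 1, 0, 1],   # reliability
    [1, 0, 1, 0],   # safety
    [0, 1, 1, 1],   # security
]

def correlation(i, j):
    """Jaccard-style overlap: share of evidence supporting both attributes."""
    shared = sum(a & b for a, b in zip(M[i], M[j]))
    total = sum(a | b for a, b in zip(M[i], M[j]))
    return shared / total if total else 0.0

def relative_weights():
    """Weight each attribute by its evidence coverage, normalized to sum to 1."""
    coverage = [sum(row) for row in M]
    s = sum(coverage)
    return [c / s for c in coverage]

print(correlation(0, 2))    # overlap between reliability and security
print(relative_weights())
```

If an evidence column is added or removed as requirements evolve, re-running `relative_weights` is the "self-reconfiguration" step in miniature: the indicator weights follow the matrix.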

  16. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  17. Ten recommendations for software engineering in research.

    Science.gov (United States)

    Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph

    2014-01-01

    Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software.

  18. [Software version and medical device software supervision].

    Science.gov (United States)

    Peng, Liang; Liu, Xiaoyan

    2015-01-01

    The importance of software version in the medical device software supervision does not cause enough attention at present. First of all, the effect of software version in the medical device software supervision is discussed, and then the necessity of software version in the medical device software supervision is analyzed based on the discussion of the misunderstanding of software version. Finally the concrete suggestions on software version naming rules, software version supervision for the software in medical devices, and software version supervision scheme are proposed.

  19. Proteomics Core

    Data.gov (United States)

    Federal Laboratory Consortium — Proteomics Core is the central resource for mass spectrometry based proteomics within the NHLBI. The Core staff help collaborators design proteomics experiments in a...

  1. NASA's Approach to Software Assurance

    Science.gov (United States)

    Wetherholt, Martha

    2015-01-01

    NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security, and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. As an umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.

  2. Software archeology: a case study in software quality assurance and design

    Energy Technology Data Exchange (ETDEWEB)

    Macdonald, John M [Los Alamos National Laboratory; Lloyd, Jane A [Los Alamos National Laboratory; Turner, Cameron J [COLORADO SCHOOL OF MINES

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  3. An Effective Methodology with Automated Product Configuration for Software Product Line Development

    Directory of Open Access Journals (Sweden)

    Scott Uk-Jin Lee

    2015-01-01

    Full Text Available The wide adaptation of product line engineering in the software industry has enabled cost-effective development of high-quality software for diverse market segments. In a software product line (SPL), a family of software is specified with a set of core assets representing reusable features with their variability, dependencies, and constraints. From such core assets, valid software products are configured after thoroughly analysing the represented features and their properties. However, current implementations of SPL lack effective means to configure a valid product, as core assets specified in SPL, being high-dimensional data, are often too complex to analyse. This paper presents a time- and cost-effective methodology with associated tool support to design a SPL model, analyse features, and configure a valid product. The proposed approach uses eXtensible Markup Language (XML) to model SPL, where an adequate schema is defined to precisely specify core assets. Furthermore, it enables automated product configuration by (i) extracting all the properties of required features from a given SPL model and calculating them with Alloy Analyzer; (ii) generating a decision model with appropriate eXtensible Stylesheet Language Transformation (XSLT) instructions embedded in each resolution effect; and (iii) processing the XSLT instructions of all the selected resolution effects.
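The validity check at the heart of product configuration can be sketched without XML or Alloy; the feature names and the requires/excludes constraints below are illustrative assumptions, not the paper's model.

```python
# Hypothetical core-asset model for a small product line: each feature may
# require or exclude others (names are illustrative, not from the paper).
features = {"base", "gui", "cli", "encryption", "cloud_sync"}
requires = {"cloud_sync": {"encryption"}, "gui": {"base"}, "cli": {"base"}}
excludes = {("gui", "cli")}

def is_valid_product(selection: set) -> bool:
    """A configuration is valid if it uses only known features, satisfies
    every dependency, and co-selects no mutually exclusive pair."""
    if not selection <= features:
        return False
    for f in selection:
        if not requires.get(f, set()) <= selection:
            return False
    return not any(a in selection and b in selection for a, b in excludes)

assert is_valid_product({"base", "gui", "encryption", "cloud_sync"})
assert not is_valid_product({"base", "cloud_sync"})    # missing encryption
assert not is_valid_product({"base", "gui", "cli"})    # mutually exclusive
```

A SAT-style analyzer such as Alloy does this same satisfiability check symbolically over all configurations rather than testing one selection at a time.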

  4. Educational Software.

    Science.gov (United States)

    Northwest Regional Educational Lab., Portland, OR.

    The third session of IT@EDU98 consisted of five papers on educational software and was chaired by Tran Van Hao (University of Education, Ho Chi Minh City, Vietnam). "Courseware Engineering" (Nguyen Thanh Son, Ngo Ngoc Bao Tran, Quan Thanh Tho, Nguyen Hong Lam) briefly describes the use of courseware. "Machine Discovery Theorems in Geometry: A…

  5. Software Patents.

    Science.gov (United States)

    Burke, Edmund B.

    1994-01-01

    Outlines basic patent law information that pertains to computer software programs. Topics addressed include protection in other countries; how to obtain patents; kinds of patents; duration; classes of patentable subject matter, including machines and processes; patentability searches; experimental use prior to obtaining a patent; and patent…

  6. Software Systems

    Institute of Scientific and Technical Information of China (English)

    崔涛; 周淼

    1996-01-01

    The information used with computers is known as software and includes programs and data. Programs are sets of instructions telling the computer what operations have to be carried out and in what order they should be done. Specialised programs which enable the computer to be used for particular purposes are called applications programs. A collection of these programs kept

  7. Software Reviews.

    Science.gov (United States)

    Science and Children, 1990

    1990-01-01

    Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor"; and "Geology Search." Cost, quality, hardware, and…

  8. A software radio platform based on ARM and FPGA

    Directory of Open Access Journals (Sweden)

    Yang Xin.

    2016-01-01

    Full Text Available The rapid rise in computational performance offered by computer systems has greatly increased the number of practical software radio applications. A scheme presented in this paper is a software radio platform based on ARM and FPGA. FPGA works as the coprocessor together with the ARM, which serves as the core processor. ARM is used for digital signal processing and real-time data transmission, and FPGA is used for synchronous timing control and serial-parallel conversion. A SPI driver for real-time data transmission between ARM and FPGA under ARM-Linux system is provided. By adopting modular design, the software radio platform is capable of implementing wireless communication functions and satisfies the requirements of real-time signal processing platform for high security and broad applicability.
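The serial-parallel conversion the abstract delegates to the FPGA can be illustrated in software; a minimal sketch of MSB-first bit serialization and regrouping (the framing convention is an assumption for illustration):

```python
def serialize(data: bytes) -> list:
    """Parallel-to-serial: expand each byte into bits, MSB first."""
    return [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]

def deserialize(bits: list) -> bytes:
    """Serial-to-parallel: regroup the bit stream into bytes."""
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for bit in bits[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

frame = b"\xA5\x3C"
assert deserialize(serialize(frame)) == frame
assert serialize(b"\xA5") == [1, 0, 1, 0, 0, 1, 0, 1]
```

In the real platform the FPGA performs this conversion in hardware, clocked against the SPI link to the ARM; the round-trip property checked above is exactly what the hardware must preserve.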

  9. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine
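The nest-survival estimation being compared can be illustrated with the classical Mayfield estimator, a simple precursor of the likelihood-based approaches above; the nest records and the 21-day nesting period are invented for illustration.

```python
# Hypothetical nest records: (exposure_days, failed). The Mayfield estimator
# treats each observed nest-day as a Bernoulli trial of daily survival.
nests = [(10, False), (7, True), (12, False), (4, True), (15, False)]

exposure = sum(days for days, _ in nests)
failures = sum(1 for _, failed in nests if failed)

dsr = 1 - failures / exposure      # estimated daily survival rate
period_survival = dsr ** 21        # survival over an assumed 21-day period

print(round(dsr, 4), round(period_survival, 4))
```

The software packages compared by Rotella et al. generalize this by maximizing the same likelihood while letting the daily survival rate depend on covariates.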

  10. Benign Papillomas of the Breast Diagnosed on Large-Gauge Vacuum Biopsy compared with 14 Gauge Core Needle Biopsy - Do they require surgical excision?

    Science.gov (United States)

    Seely, Jean M; Verma, Raman; Kielar, Ania; Smyth, Karl R; Hack, Kalesha; Taljaard, Monica; Gravel, Denis; Ellison, Erin

    2017-03-01

    To evaluate whether biopsy with vacuum-assisted biopsy (VAB) devices improves histologic underestimation rates of benign papillomas when compared to smaller bore core needle biopsy (CNB) devices. Patients with biopsy-proven benign papillomas with surgical resection or minimum 12 months follow-up were selected. Two breast pathologists reviewed all pathology slides of percutaneous and excisional biopsy specimens. Histologic underestimation rates for lesions biopsied with 10-12 Gauge (G) VAB were compared to those with 14G CNB. A total of 107 benign papillomas in 107 patients from two centers were included. There were 60 patients (mean age 57 years, SD 10.3 years) diagnosed with VAB and 47 patients (mean age 57.6 years, SD 11.3 years) with 14G CNB who underwent surgical excision or imaging follow-up. The upgrade rate to ductal carcinoma in situ or invasive carcinoma was 1.6% (1/60) with VAB and 8.5% (4/47) with 14G. Upgrade to atypia was 3.3% (2/60) after VAB and 10.6% (5/47) with CNB. The total underestimation rates were 5% (3/60) with VAB and 19.1% (9/47) with CNB. The odds of an upgrade to malignancy was 5.5 times higher with a 14G needle than VAB (95% CI: 0.592-50.853, p = 0.17). We observed a lower but not statistically significant upgrade rate to malignancy and atypia with the use of the 10-12 G VAB as compared with 14G CNB. When a papilloma without atypia is diagnosed with vacuum biopsy there is a high likelihood that it is benign; however, if surgical excision is not performed, long-term follow-up is still required. © 2016 Wiley Periodicals, Inc.
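The reported odds ratio and its wide confidence interval follow directly from the counts in the abstract (4/47 upgrades with CNB, 1/60 with VAB); a minimal sketch using a Wald interval on the log odds ratio, which may or may not match the authors' exact method:

```python
import math

# 2x2 table from the reported counts: upgrade to malignancy vs not.
a, b = 4, 47 - 4    # 14G CNB: upgraded, not upgraded
c, d = 1, 60 - 1    # VAB:     upgraded, not upgraded

odds_ratio = (a / b) / (c / d)

# Wald 95% confidence interval on the log odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)

print(round(odds_ratio, 1), round(lo, 3), round(hi, 3))  # → 5.5 0.592 50.853
```

The interval spanning 1 is why the abstract reports the difference as not statistically significant despite the large point estimate.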

  11. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

    The support of the EPIQR method is a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenario and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  12. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology-Lausanne (EPFL), Solar Energy and Building Physics Laboratory (LESO-PB), Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Institute of Meteorology and Physics of Atmospheric Environment, Group Energy Conservation, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Division of Energy and Indoor Environment, Hoersholm, (Denmark)

    2000-07-01

    The support of the EPIQR method is a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (author)

  13. Fault tolerant software modules for SIFT

    Science.gov (United States)

    Hecht, M.; Hecht, H.

    1982-01-01

    The implementation of software fault tolerance is investigated for critical modules of the Software Implemented Fault Tolerance (SIFT) operating system to support the computational and reliability requirements of advanced fly by wire transport aircraft. Fault tolerant designs generated for the error reported and global executive are examined. A description of the alternate routines, implementation requirements, and software validation are included.

  14. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, and communication switchboards). Extensive knowledge and practical experience with digital long-term preservation technologies have also been acquired. This wide spectrum of activities puts us in the position to discuss an often overlooked aspect of digital preservation - the preservation of software programs. There are many resources dedicated to the digital preservation of data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - dynamic response to inputs - render computer programs rich compared to documents or linear multimedia. The article opens questions on the beginning of the way to permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ..., where and how? To illustrate the theoretic concepts given, the idea of a virtual national museum of electronic banking is also presented.

  15. The Curriculum Construction of Software Testing in Higher Vocational Colleges Based on Job Skill Requirements

    Institute of Scientific and Technical Information of China (English)

    胡双

    2016-01-01

    With the advent of the Internet information age, the once-uncommon specialty of software testing has gradually come into public view. Software testing is an electronic information technology: the process of using manual operation or automated software execution to verify whether a product satisfies specified requirements, or to clarify the difference between expected and actual results. It is a software process that helps identify the correctness, completeness, and quality of developed computer software. However, because the modern understanding of the software testing profession is still incomplete, the state and major institutions of higher learning pay insufficient attention to the design of software testing courses; although society urgently needs software testing professionals, there are few places training them. Strengthening people's understanding of software testing courses therefore brooks no delay. This article discusses specific issues in constructing a higher vocational Software Testing course based on job skill requirements.

  16. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality...... requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...

  17. Green in software engineering

    CERN Document Server

    Calero Munoz, Coral

    2015-01-01

    This is the first book that presents a comprehensive overview of sustainability aspects in software engineering. Its format follows the structure of the SWEBOK and covers the key areas involved in the incorporation of green aspects in software engineering, encompassing topics from requirement elicitation to quality assurance and maintenance, while also considering professional practices and economic aspects. The book consists of thirteen chapters, which are structured in five parts. First the "Introduction" gives an overview of the primary general concepts related to Green IT, discussing wha

  18. Software Engineering to Professionalize Software Development

    Directory of Open Access Journals (Sweden)

    Juan Miguel Alonso

    2011-12-01

    Full Text Available The increasingly important role that software plays in systems with widespread effects presents new challenges for the education of software engineers, not only because society's dependence on software is increasing, but also because the character of software development is changing and, with it, the demand for certified software developers. In this paper, some challenges and aspirations are proposed that guide Software Engineering learning processes and help to identify the need to train professionals in software development.

  19. Flight Software Design Choices Based on Criticality

    Science.gov (United States)

    Lee, Earl

    1999-01-01

    This slide presentation reviews the rationale behind flight software design as a function of criticality. The requirements of human-rated systems imply a high criticality for the flight support software: human life depends on correct operation of the software. Flexibility should be permitted when the consequences of software failure are not life threatening. This is also relevant for selecting Commercial Off the Shelf (COTS) software.

  20. Reviews in innovative software development

    DEFF Research Database (Denmark)

    Aaen, Ivan; Boelsmand, Jeppe Vestergaard; Jensen, Rasmus

    2009-01-01

    This paper proposes a new review approach for innovative software development. Innovative software development implies that requirements are rarely available as a basis for reviewing and that the purpose of a review is as much to forward additional ideas, as to validate what has been accomplished...

  1. The 3' region of the chicken hypersensitive site-4 insulator has properties similar to its core and is required for full insulator activity.

    Directory of Open Access Journals (Sweden)

    Paritha I Arumugam

    Full Text Available Chromatin insulators separate active transcriptional domains and block the spread of heterochromatin in the genome. Studies on the chicken hypersensitive site-4 (cHS4) element, a prototypic insulator, have identified CTCF and USF-1/2 motifs in the proximal 250 bp of cHS4, termed the "core", which provide enhancer blocking activity and reduce position effects. However, the core alone does not insulate viral vectors effectively. The full-length cHS4 has excellent insulating properties, but its large size severely compromises vector titers. We performed a structure-function analysis of cHS4 flanking lentivirus-vectors and analyzed transgene expression in the clonal progeny of hematopoietic stem cells and epigenetic changes in cHS4 and the transgene promoter. We found that the core only reduced the clonal variegation in expression. Unique insulator activity resided in the distal 400 bp cHS4 sequences, which, when combined with the core, restored full insulator activity and open chromatin marks over the transgene promoter and the insulator. These data consolidate the known insulating activity of the canonical 5' core with a novel 3' 400 bp element with properties similar to the core. Together, they have excellent insulating properties and viral titers. Our data have important implications in understanding the molecular basis of insulator function and the design of gene therapy vectors.

  2. Making an Ice Core.

    Science.gov (United States)

    Kopaska-Merkel, David C.

    1995-01-01

    Explains an activity in which students construct a simulated ice core. Materials required include only a freezer, food coloring, a bottle, and water. This hands-on exercise demonstrates how a glacier is formed, how ice cores are studied, and the nature of precision and accuracy in measurement. Suitable for grades three through eight. (Author/PVD)

  3. Using SCR methods to analyze requirements documentation

    Science.gov (United States)

    Callahan, John; Morrison, Jeffery

    1995-01-01

    Software Cost Reduction (SCR) methods are being utilized to analyze and verify selected parts of NASA's EOS-DIS Core System (ECS) requirements documentation. SCR is being used as a spot-inspection tool. Through this formal and systematic application of the SCR requirements methods, insights are gained as to whether the requirements are internally inconsistent or incomplete as the scenarios of intended usage evolve in the Operations Concept (OC) documentation. Thus, by modelling the scenarios and requirements as mode charts using the SCR methods, we have been able to identify problems within and between the documents.
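Two of the defect classes such mode-chart analysis surfaces, nondeterministic and incomplete transition tables, can be sketched as follows; the modes, conditions, and table are invented for illustration and are not from the ECS documentation.

```python
# Hypothetical SCR-style mode transition table: (mode, condition, next mode).
transitions = [
    ("idle",        "start_cmd", "operational"),
    ("operational", "fault",     "safe"),
    ("safe",        "reset_cmd", "idle"),
]

def check_mode_table(table, modes):
    """Flag two defect classes SCR analysis looks for: nondeterminism
    (same mode+condition, different targets) and incompleteness
    (modes with no outgoing transition)."""
    problems = []
    seen = {}
    for mode, cond, target in table:
        if seen.setdefault((mode, cond), target) != target:
            problems.append(f"nondeterministic: {mode} on {cond}")
    covered = {mode for mode, _, _ in table}
    for m in modes:
        if m not in covered:
            problems.append(f"incomplete: no transition out of {m}")
    return problems

print(check_mode_table(transitions, {"idle", "operational", "safe"}))  # → []
print(check_mode_table(transitions + [("idle", "start_cmd", "safe")],
                       {"idle", "operational", "safe", "maintenance"}))
```

Tool-supported SCR performs these checks exhaustively over the tabular specification, which is what makes it effective as a spot-inspection aid for large requirements documents.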

  4. Software Engineering Education: Some Important Dimensions

    Science.gov (United States)

    Mishra, Alok; Cagiltay, Nergiz Ercil; Kilic, Ozkan

    2007-01-01

    Software engineering education has been emerging as an independent and mature discipline. Accordingly, various studies are being done to provide guidelines for curriculum design. The main focus of these guidelines is around core and foundation courses. This paper summarizes the current problems of software engineering education programs. It also…

  5. Lessons from 30 Years of Flight Software

    Science.gov (United States)

    McComas, David C.

    2015-01-01

    This presentation takes a brief historical look at flight software over the past 30 years, extracts lessons learned and shows how many of the lessons learned are embodied in the Flight Software product line called the core Flight System (cFS). It also captures the lessons learned from developing and applying the cFS.

  7. Evolution of the ATLAS Software Framework towards Concurrency

    Science.gov (United States)

    Jones, R. W. L.; Stewart, G. A.; Leggett, C.; Wynne, B. M.

    2015-05-01

    The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from early 2000 and the software and the physics code has been written using a single threaded, serial design. This programming model has increasing difficulty in exploiting the potential of current CPUs, which offer their best performance only through taking full advantage of multiple cores and wide vector registers. Future CPU evolution will intensify this trend, with core counts increasing and memory per core falling. Maximising performance per watt will be a key metric, so all of these cores must be used as efficiently as possible. In order to address the deficiencies of the current framework, ATLAS has embarked upon two projects: first, a practical demonstration of the use of multi-threading in our reconstruction software, using the GaudiHive framework; second, an exercise to gather requirements for an updated framework, going back to the first principles of how event processing occurs. In this paper we report on both these aspects of our work. For the hive based demonstrators, we discuss what changes were necessary in order to allow the serially designed ATLAS code to run, both to the framework and to the tools and algorithms used. We report on what general lessons were learned about the code patterns that had been employed in the software and which patterns were identified as particularly problematic for multi-threading. These lessons were fed into our considerations of a new framework and we present preliminary conclusions on this work. In particular we identify areas where the framework can be simplified in order to aid the implementation of a concurrent event processing scheme. Finally, we discuss the practical difficulties involved in migrating a large established code base to a multi-threaded framework and how this can be achieved
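The per-event concurrency such a framework aims for, independent events processed in parallel while each event's algorithm chain stays serial, can be sketched as follows; the algorithm names and event contents are toy stand-ins, not ATLAS code.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy stand-in for concurrent event processing: events are independent,
# so whole events are farmed out to worker threads while the algorithms
# within one event still run serially (names are illustrative).
def calibrate(event):
    return {**event, "calibrated": True}

def reconstruct(event):
    return {**event, "tracks": event["hits"] // 2}

def process_event(event):
    for algorithm in (calibrate, reconstruct):   # serial chain per event
        event = algorithm(event)
    return event

events = [{"id": i, "hits": 10 * i} for i in range(8)]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_event, events))

assert all(e["calibrated"] for e in results)
assert results[3]["tracks"] == 15
```

The hard part the paper describes is not this scheduling skeleton but making legacy single-threaded algorithms safe to run this way, e.g. removing shared mutable state between the per-event calls.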

  8. TWRS systems engineering software configuration management plan

    Energy Technology Data Exchange (ETDEWEB)

    Porter, P.E.

    1996-10-09

    This plan delineates the requirements for control of software developed and supported by the Tank Waste Remediation System (TWRS) Technical Integration organization. The information contained in this plan shall assist employees involved with software modification and configuration control.

  9. Green software engineering for airbus avionics

    OpenAIRE

    I. Brooks

    2016-01-01

    Presentation at EU Ashley Project Public Forum 25 October 2016. Presentation examines the risks from Software Engineering without sustainability requirements and looks at the use of the UN Sustainable Development Goals to de-risk avionics software engineering.

  11. Online Rule Generation Software Process Model

    National Research Council Canada - National Science Library

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

    .... The software process model for rule generation using decision tree classifier refers to the various steps required to be executed for the development of a web based software model for decision rule generation...

  12. Astronomers as Software Developers

    Science.gov (United States)

    Pildis, Rachel A.

    2016-01-01

    Astronomers know that their research requires writing, adapting, and documenting computer software. Furthermore, they often have to learn new computer languages and figure out how existing programs work without much documentation or guidance and with extreme time pressure. These are all skills that can lead to a software development job, but recruiters and employers probably won't know that. I will discuss all the highly useful experience that astronomers may not know that they already have, and how to explain that knowledge to others when looking for non-academic software positions. I will also talk about some of the pitfalls I have run into while interviewing for jobs and working as a developer, and encourage you to embrace the curiosity employers might have about your non-standard background.

  13. SIMD studies in the LHCb reconstruction software

    Science.gov (United States)

    Cámpora Pérez, Daniel Hugo; Couturier, Ben

    2015-12-01

    During the data taking process in the LHC at CERN, millions of collisions are recorded every second by the LHCb Detector. The LHCb Online computing farm, counting around 15000 cores, is dedicated to the reconstruction of the events in real-time, in order to filter those with interesting Physics. The ones kept are later analysed Offline in a more precise fashion on the Grid. This imposes very stringent requirements on the reconstruction software, which has to be as efficient as possible. Modern CPUs support so-called vector-extensions, which extend their Instruction Sets, allowing for concurrent execution across functional units. Several libraries expose the Single Instruction Multiple Data programming paradigm to issue these instructions. The use of vectorisation in our codebase can provide performance boosts, leading ultimately to Physics reconstruction enhancements. In this paper, we present vectorisation studies of significant reconstruction algorithms. A variety of vectorisation libraries are analysed and compared in terms of design, maintainability and performance. We also present the steps taken to systematically measure the performance of the released software, to ensure the consistency of the run-time of the vectorised software.

  14. Software Engineering Improvement Activities/Plan

    Science.gov (United States)

    2003-01-01

    bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED 1 4 in accomplishing the software engineering improvement engineering activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.

  16. Verification of safety-critical software requirement based on Petri-net model checking

    Institute of Scientific and Technical Information of China (English)

    李震; 刘斌; 李小勋; 殷永峰

    2011-01-01

    Formal modeling and model checking of requirements can improve the dependability of safety-critical software, but existing approaches have limitations in model description, debugging, and explanation. This work extends Petri nets for modelling software systems: weight functions default to zero, a "Not" dotted arc describes firing when a state is false or a transition fails, and the ability to describe threshold conditions is strengthened. The extension distinguishes enumeration places from numeric places, and normal transitions from value-assigning transitions. The formal definition of the extended Petri net and its semantic mapping to a verification language are given. Finally, the approach is applied to typical airborne software: a software requirement model and part of the mapped verification code are built, and the model is checked, counterexample paths are analysed, and the requirements are refined. The process and results show that the method can effectively support requirement modelling and verification of safety-critical software in practice.

  17. DEVELOPING SOFTWARE FOR CORPUS RESEARCH

    Directory of Open Access Journals (Sweden)

    Oliver Mason

    2008-06-01

    Full Text Available Despite the central role of the computer in corpus research, programming is generally not seen as a core skill within corpus linguistics. As a consequence, limitations in software for text and corpus analysis slow down the progress of research while analysts often have to rely on third party software or even manual data analysis if no suitable software is available. Apart from software itself, data formats are also of great importance for text processing. But again, many practitioners are not very aware of the options available to them, and thus idiosyncratic text formats often make sharing of resources difficult if not impossible. This article discusses some issues relating to both data and processing which should aid researchers to become more aware of the choices available to them when it comes to using computers in linguistic research. It also describes an easy way towards automating some common text processing tasks that can easily be acquired without knowledge of actual computer programming.

  18. Space Software

    Science.gov (United States)

    1990-01-01

    Xontech, Inc.'s software package, XonVu, simulates the missions of Voyager 1 at Jupiter and Saturn, Voyager 2 at Jupiter, Saturn, Uranus and Neptune, and Giotto in close encounter with Comet Halley. With the program, the user can generate scenes of the planets, moons, stars or Halley's nucleus and tail as seen by Giotto, all graphically reproduced with high accuracy in wireframe representation. Program can be used on a wide range of computers, including PCs. User friendly and interactive, with many options, XonVu can be used by a space novice or a professional astronomer. With a companion user's manual, it sells for $79.

  19. A Cloverleaf of Software Engineering

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2005-01-01

    We shall touch upon four issues of software engineering (SE): domain engineering, formal techniques, SE sociology, and academic software architects. First, before software can be designed one must understand its requirements; but before requirements can be formulated one must understand the domain. So we assume that requirements development is based on first having established models of the (application) domain. We illustrate facets of the railway domain. Second, we touch upon all three phases: domain engineering, requirements engineering and software design also being done formally, however "lite". Third, despite 35 years of formal methods, the SE industry, maturity-wise, still lags far behind that of other engineering disciplines. So we examine why. Finally, in several areas, in health care, in architecture, and others, we see that major undertakings are primarily spearheaded

  20. The Ettention software package.

    Science.gov (United States)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

    We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Software Reliability through Theorem Proving

    Directory of Open Access Journals (Sweden)

    S.G.K. Murthy

    2009-05-01

    Full Text Available Improving software reliability of mission-critical systems is widely recognised as one of the major challenges. Early detection of errors in software requirements, designs and implementation needs rigorous verification and validation techniques. Several techniques comprising static and dynamic testing approaches are used to improve the reliability of mission-critical software; however, it is hard to balance development time and budget with software reliability. In particular, using dynamic testing techniques it is hard to ensure software reliability, as exhaustive testing is not possible. On the other hand, formal verification techniques utilise mathematical logic to prove the correctness of the software against given specifications, which in turn improves reliability. Theorem proving is a powerful formal verification technique that enhances software reliability for mission-critical aerospace applications. This paper discusses the issues related to software reliability and the use of theorem proving to enhance it through formal verification, based on experiences with the STeP tool, using the conventional and internationally accepted methodologies, models and theorem-proving techniques available in the tool, without proposing a new model. Defence Science Journal, 2009, 59(3), pp. 314-317. DOI: http://dx.doi.org/10.14429/dsj.59.1527

  2. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Commercial computer software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation...

  3. Security Quality Requirements Engineering (SQUARE) Methodology

    Science.gov (United States)

    2005-11-01

    Feature-Oriented Domain Analysis (FODA) [Kang 90], Critical Discourse Analysis (CDA) [Schiffrin 94], and the Accelerated Requirements Method (ARM) [Hubbard 99]. Table 12: Comparison of Elicitation Techniques rates Misuse Cases, SSM, QFD, CORE, IBIS, JAD, FODA, CDA, and ARM on criteria such as Adaptability (3, 1, 3, 2, 2, 3, 2, 1, 2, respectively) and CASE Tool support (1, ...). Feature-Oriented Domain Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-021, ADA235785). Pittsburgh, PA: Software Engineering Institute, Carnegie

  4. A Survey of Software Reusability

    OpenAIRE

    Rohit Patidar; Prof. Virendra Singh

    2014-01-01

    Reusability is one of the best ways to increase development productivity and maintainability of applications. One must first search for good, tested, reusable software components. Application software developed by one programmer can prove useful to others as a component. This shows that code specific to one application's requirements can also be reused in other projects with similar requirements. The main aim of this paper is to propose a way to build reusable modules. An...

  5. Ice cores

    DEFF Research Database (Denmark)

    Svensson, Anders

    2014-01-01

    Ice cores from Antarctica, from Greenland, and from a number of smaller glaciers around the world yield a wealth of information on past climates and environments. Ice cores offer unique records on past temperatures, atmospheric composition (including greenhouse gases), volcanism, solar activity, dustiness, and biomass burning, among others. In Antarctica, ice cores extend back more than 800,000 years before present (Jouzel et al. 2007), whereas Greenland ice cores cover the last 130,000 years

  7. IGCSE core mathematics

    CERN Document Server

    Wall, Terry

    2013-01-01

    Give your core level students the support and framework they require to get their best grades with this book dedicated to the core level content of the revised syllabus and written specifically to ensure a more appropriate pace. This title has been written for the Core content of the revised Cambridge IGCSE Mathematics (0580) syllabus for first teaching from 2013. • Gives students the practice they require to deepen their understanding through plenty of practice questions. • Consolidates learning with unique digital resources on the CD, included free with every book. We are working with Cambridge

  8. Software essentials design and construction

    CERN Document Server

    Dingle, Adair

    2014-01-01

    About the Cover: Although capacity may be a problem for a doghouse, other requirements are usually minimal. Unlike skyscrapers, doghouses are simple units. They do not require plumbing, electricity, fire alarms, elevators, or ventilation systems, and they do not need to be built to code or pass inspections. The range of complexity in software design is similar. Given available software tools and libraries-many of which are free-hobbyists can build small or short-lived computer apps. Yet, design for software longevity, security, and efficiency can be intricate-as is the design of large-scale sy

  9. A Survey of Software Reusability

    Directory of Open Access Journals (Sweden)

    Rohit Patidar

    2014-08-01

    Full Text Available Reusability is one of the best ways to increase development productivity and maintainability of applications. One must first search for good, tested, reusable software components. Application software developed by one programmer can prove useful to others as a component. This shows that code specific to one application's requirements can also be reused in other projects with similar requirements. The main aim of this paper is to propose a way to build reusable modules: a process that takes source code as input and helps decide which particular software artefacts should be reused or not.

  10. Preparing HEP software for concurrency

    Science.gov (United States)

    Clemencic, M.; Hegner, B.; Mato, P.; Piparo, D.

    2014-06-01

    The necessity for thread-safe experiment software has recently become very evident, largely driven by the evolution of CPU architectures towards exploiting increasing levels of parallelism. For high-energy physics this represents a real paradigm shift, as concurrent programming was previously only limited to special, well-defined domains like control software or software framework internals. This paradigm shift, however, falls into the middle of the successful LHC programme and many million lines of code have already been written without the need for parallel execution in mind. In this paper we have a closer look at the offline processing applications of the LHC experiments and their readiness for the many-core era. We review how previous design choices impact the move to concurrent programming. We present our findings on transforming parts of the LHC experiment reconstruction software to thread-safe code, and the main design patterns that have emerged during the process. A plethora of parallel-programming patterns are well known outside the HEP community, but only a few have turned out to be straightforward enough to be suited for non-expert physics programmers. Finally, we propose a potential strategy for the migration of existing HEP experiment software to the many-core era.

  11. Teams at Their Core: Implementing an “All LANDS Approach to Conservation” Requires Focusing on Relationships, Teamwork Process, and Communications

    Science.gov (United States)

    Kasey Jacobs

    2017-01-01

    The U.S. Forest Service has found itself in an era of intense human activity: a changing climate; development and loss of open space; resource consumption; problematic introduced species; and diversity in core beliefs and values. These challenges test our task-relevant maturity and our ability and willingness to meet the growing demands for services. The Forest...

  12. Requirements: The Key to Sustainability

    OpenAIRE

    2016-01-01

    Software's critical role in society demands a paradigm shift in the software engineering mind-set. This shift's focus begins in requirements engineering. This article is part of a special issue on the Future of Software Engineering.

  13. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, and compatibility. The methods vary both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification and pays particular attention to symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described. The pros and cons of each method are emphasised, and a classification of testing techniques for each method is considered. We present and analyse the characteristics and mechanisms of static dependency analysis, and the kinds of dependencies that can reduce the number of false positives in situations where the current program state combines two or more states obtained on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), sizes of heap areas, lengths of strings, and the number of initialised array elements in the code being verified with static methods. The article also pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring, and profiling are presented and analysed, along with the kinds of tools that can be applied to software when using them.
Based on this work a conclusion is drawn describing the most relevant problems of the analysis techniques, methods of solving them, and

  14. N286.7-99, A Canadian standard specifying software quality management system requirements for analytical, scientific, and design computer programs and its implementation at AECL

    Energy Technology Data Exchange (ETDEWEB)

    Abel, R. [R and M Abel Consultants Inc. (Canada)

    2000-07-01

    Analytical, scientific, and design computer programs (referred to in this paper as 'scientific computer programs') are developed for use in a large number of ways by the user-engineer to support and prove engineering calculations and assumptions. These computer programs are subject to frequent modifications inherent in their application and are often used for critical calculations and analysis relative to safety and functionality of equipment and systems. N286.7-99(4) was developed to establish appropriate quality management system requirements to deal with the development, modification, and application of scientific computer programs. N286.7-99 provides particular guidance regarding the treatment of legacy codes.

  15. Validation of a new version of software for monitoring the core of nuclear power plant of Laguna Verde Unit 2, at the end of Cycle 10

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, G.; Calleros, G.; Mata, F. [Comision Federal de Electricidad, Central Nucleoelectrica de Laguna Verde, Carretera Cardel-Nautla Km 42.5, Veracruz (Mexico)], e-mail: gabriel.hernandez05@cfe.gob.mx

    2009-10-15

    This work shows the differences observed in the thermal limits established in the technical specifications for operation between the new software, installed at the end of Cycle 10 of Unit 2 of the Laguna Verde nuclear power plant, and the old software that had been installed since the beginning of the cycle. The methodology made it possible to validate the new software during the coast-down stage, before the end of the cycle, so that it could be used as a tool during the shutdown of Unit 2 at the end of Cycle 10. (Author)

  16. Groupware requirements evolution patterns

    NARCIS (Netherlands)

    Pumareja, Dulce Trinidad

    2013-01-01

    Requirements evolution is a generally known problem in software development. Requirements are known to change all throughout a system's lifecycle. Nevertheless, requirements evolution is a poorly understood phenomenon. Most studies on requirements evolution focus on changes to written specifications

  17. SIMD studies in the LHCb reconstruction software

    CERN Document Server

    Campora Perez, D H

    2015-01-01

    During the data taking process in the LHC at CERN, millions of collisions are recorded every second by the LHCb Detector. The LHCb Online computing farm, counting around 15000 cores, is dedicated to the reconstruction of the events in real-time, in order to filter those with interesting Physics. The ones kept are later analysed Offline in a more precise fashion on the Grid. This imposes very stringent requirements on the reconstruction software, which has to be as efficient as possible. Modern CPUs support so-called vector-extensions, which extend their Instruction Sets, allowing for concurrent execution across functional units. Several libraries expose the Single Instruction Multiple Data programming paradigm to issue these instructions. The use of vectorisation in our codebase can provide performance boosts, leading ultimately to Physics reconstruction enhancements. In this paper, we present vectorisation studies of significant reconstruction algorithms. A variety of vectorisation libraries are analysed a...

  18. Computing platforms for software-defined radio

    CERN Document Server

    Nurmi, Jari; Isoaho, Jouni; Garzia, Fabio

    2017-01-01

    This book addresses Software-Defined Radio (SDR) baseband processing from the computer architecture point of view, providing a detailed exploration of different computing platforms by classifying different approaches, highlighting the common features related to SDR requirements and by showing pros and cons of the proposed solutions. Coverage includes architectures exploiting parallelism by extending single-processor environment (such as VLIW, SIMD, TTA approaches), multi-core platforms distributing the computation to either a homogeneous array or a set of specialized heterogeneous processors, and architectures exploiting fine-grained, coarse-grained, or hybrid reconfigurability. Describes a computer engineering approach to SDR baseband processing hardware; Discusses implementation of numerous compute-intensive signal processing algorithms on single and multicore platforms; Enables deep understanding of optimization techniques related to power and energy consumption of multicore platforms using several basic a...

  19. Software Reviews.

    Science.gov (United States)

    Campion, Martin C.

    1985-01-01

    Reviews TIGERS IN THE SNOW: THE BATTLE OF THE BULGE (Strategic Simulations), and the BATTLE OF GETTYSBURG (Softwride). Each narrative review describes the hardware required and provides complete ordering information. (JDH)

  20. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  1. SproutCore web application development

    CERN Document Server

    Keating, Tyler

    2013-01-01

    Written as a practical, step-by-step tutorial, Creating HTML5 Apps with SproutCore is full of engaging examples to help you learn in a practical context. This book is for anyone writing, or looking to write, software for the Web. Whether your background is in web development or in software development, Creating HTML5 Apps with SproutCore will help you expand your skills so that you will be ready to apply software development principles in the web development space.

  2. Generalized Software Security Framework

    Directory of Open Access Journals (Sweden)

    Smriti Jain

    2011-01-01

    Full Text Available Security of information has become a major concern in today's digitized world. As a result, effective techniques to secure information are required. The most effective way is to incorporate security in the development process itself, thereby resulting in a secured product. In this paper, we propose a framework that enables security to be included in the software development process. The framework consists of three layers, namely: the control layer, the aspect layer and the development layer. The control layer illustrates the managerial control of the entire software development process with the help of governance, whereas the aspect layer recognizes the security mechanisms that can be incorporated during software development to identify the various security features. The development layer helps to integrate the various security aspects, as well as the controls identified in the above layers, during the development process. The layers were further verified by a survey amongst IT professionals. The professionals concluded that the developed framework is easy to use due to its layered architecture and can be customized for various types of software.

  3. A practical application of software security in an undergraduate software engineering course

    Directory of Open Access Journals (Sweden)

    Cynthia Y. Lester

    2010-05-01

    Full Text Available Computer software is developed according to software engineering methodologies. However, as more of the economy and our social lives move online, software security has become a topic of increasing importance. Traditionally, courses in software security are offered at the graduate level or in a stand-alone course at the undergraduate level, with many undergraduate students being required to apply security principles to their software development processes as soon as they complete their degrees. Therefore, this paper posits that software security can be effectively introduced to undergraduate students in a traditionally taught software engineering course. The paper presents a modified software engineering course which introduces the secure development life cycle. Several traditional software development methodologies are presented which provide a foundation for introducing secure software principles. Additionally, the paper introduces collaborative learning and service-learning which are used in the practical application of software security concepts. Lastly, challenges and future work are presented.

  4. Teams at Their Core: Implementing an “All LANDS Approach to Conservation” Requires Focusing on Relationships, Teamwork Process, and Communications

    OpenAIRE

    Kasey R. Jacobs

    2017-01-01

    The U.S. Forest Service has found itself in an era of intense human activity: a changing climate; development and loss of open space; resource consumption; problematic introduced species; and diversity in core beliefs and values. These challenges test our task-relevant maturity and our ability and willingness to meet the growing demands for services. The Forest Service is now on a transformative campaign to improve abilities and meet these challenges. The “All-Lands Approach to Conservati...

  5. Effective Software Engineering Leadership for Development Programs

    Science.gov (United States)

    Cagle West, Marsha

    2010-01-01

    Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…

  7. Non-intrusive Instance Level Software Composition

    NARCIS (Netherlands)

    Hatun, Kardelen

    2014-01-01

A software system is composed of parts, which interact through shared interfaces. Certain qualities of integration, such as loose coupling, requiring minimal changes to the software, and fine-grained localisation of dependencies, have an impact on the overall software quality. Current general-purpose l

  8. Architectural viewpoints for global software development

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Tekinerdogan, B.

    Global Software Development (GSD) can be considered as the coordinated activity of software development that is not localized and central but geographically distributed. Designing an appropriate software architecture of a GSD system is important to meet the requirements for the communication,

  9. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  10. SWiFT Software Quality Assurance Plan.

    Energy Technology Data Exchange (ETDEWEB)

    Berg, Jonathan Charles [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

This document describes the software development practice areas and processes which contribute to the ability of SWiFT software developers to provide quality software. These processes are designed to satisfy the requirements set forth by the Sandia Software Quality Assurance Program (SSQAP). Approvals: SWiFT Software Quality Assurance Plan (SAND2016-0765), approved by Department Manager Dave Minster (6121), SWiFT Site Lead Jonathan White (6121), and SWiFT Controls Engineer Jonathan Berg (6121). Change history: Issue A, 2016/01/27, Jon Berg (06121), initial release of the SWiFT Software Quality Assurance Plan.

  11. Implementing software safety in the NASA environment

    Science.gov (United States)

    Wetherholt, Martha S.; Radley, Charles F.

    1994-05-01

Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer-related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g. mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies and sound software and assurance practices to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of

  12. Software Metrics to Estimate Software Quality using Software Component Reusability

    Directory of Open Access Journals (Sweden)

    Prakriti Trivedi

    2012-03-01

Full Text Available Today most applications are developed using existing libraries, code, open-source projects, etc. Code that a program accesses as a unit is represented as a software component; Java Beans and .NET ActiveX controls, for example, are software components. These components are ready-to-use code or controls that accelerate code development. A component-based software system embodies the concept of software reusability. When using such components, the main question that arises is whether using them is beneficial or not. In this work we present an answer to that question: a set of software metrics that measures the interconnection between a software component and the application. How strong this relation is determines the software quality after the component is used. The overall metrics return a final result in terms of the binding of the component with the application.
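A toy sketch of the kind of coupling-oriented metric the abstract describes. The function name, the call-graph representation, and the formula are illustrative assumptions, not the metrics actually proposed in the paper:

```python
# Hypothetical coupling-style metric: the fraction of an application's call
# edges that target a given reusable component. A higher score means the
# application is more tightly bound to the component.
def component_coupling(call_graph, component):
    """call_graph: list of (caller_module, callee_module) edges."""
    total = len(call_graph)
    into_component = sum(1 for _, callee in call_graph if callee == component)
    return into_component / total if total else 0.0

# Example: 2 of the application's 4 call edges hit the "beans" component.
calls = [("app", "beans"), ("app", "beans"), ("app", "util"), ("app", "io")]
score = component_coupling(calls, "beans")
```

In practice such a score would be combined with other measures (interface size, change frequency) before judging whether reusing the component is beneficial.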

  13. Software Graphics Processing Unit (sGPU) for Deep Space Applications

    Science.gov (United States)

    McCabe, Mary; Salazar, George; Steele, Glen

    2015-01-01

A graphics processing capability will be required for deep space missions and must support a range of applications, from safety-critical vehicle health status to telemedicine for crew health. However, preliminary radiation testing of commercial graphics processing cards suggests they cannot operate in the deep space radiation environment. Investigation into a Software Graphics Processing Unit (sGPU) composed of commercial-equivalent radiation hardened/tolerant single board computers, field programmable gate arrays, and safety-critical display software shows promising results. Preliminary performance of approximately 30 frames per second (FPS) has been achieved. Use of multi-core processors may provide a significant increase in performance.

  14. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically-oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...

  15. Rapid Application Development Using Software Factories

    CERN Document Server

    Stojanovski, Toni

    2012-01-01

Software development is still based on manufactory-style production, and most programming code is still hand-crafted. Software development is very far from the ultimate goal of industrialization in software production, something which was achieved long ago in other industries. The lack of software industrialization creates an inability to cope with fast and frequent changes in user requirements, and causes cost and time inefficiencies during their implementation. Analogous to what other industries did long ago, industrialization of software development has been proposed using the concept of software factories. We have accepted this vision of software factories and developed our own software factory, which produces three-layered ASP.NET web applications. In this paper we report on our experience with using this approach in the process of software development, and present comparative results on performance and deliverables in both traditional development and development usin...

  16. Investigating interoperability of the LSST data management software stack with Astropy

    Science.gov (United States)

    Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.

    2016-07-01

The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15TB per night and the requirements to both issue alerts on transient sources within 60 seconds of observing and create annual data releases mean that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful in the years since it began and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed, and it is clear that by the time LSST is fully operational in the 2020s many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.

  17. Certification of COTS Software in NASA Human Rated Flight Systems

    Science.gov (United States)

    Goforth, Andre

    2012-01-01

Adoption of commercial off-the-shelf (COTS) products in safety critical systems has been seen as a promising acquisition strategy to improve mission affordability, and yet has come with significant barriers and challenges. Attempts to integrate COTS software components into NASA human rated flight systems have been, for the most part, complicated by the verification and validation (V&V) requirements necessary for flight certification per NASA's own standards. For software that is from COTS sources, and, in general, from 3rd-party sources, whether commercial, government, modified, or open source, the expectation is that it meets the same certification criteria as those used for in-house software, and that it does so as if it were built in-house. The latter is a critical and hidden issue. This paper examines the longstanding barriers and challenges in the use of 3rd-party software in safety critical systems and covers recent efforts to use COTS software in NASA's Multi-Purpose Crew Vehicle (MPCV) project. It identifies some core artifacts without which the use of COTS and 3rd-party software is, for all practical purposes, a nonstarter for affordable and timely insertion into flight critical systems. The paper covers the first use in a flight critical system by NASA of COTS software with prior FAA certification heritage, which was shown to meet the RTCA DO-178B standard, and how this certification may, in some cases, be leveraged to allow the use of analysis in lieu of testing. Finally, the paper proposes the establishment of an open source forum for development of safety critical 3rd-party software.

  18. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  19. NASA Software Engineering Benchmarking Study

    Science.gov (United States)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

To identify best practices for the improvement of software engineering on projects, NASA's Offices of Chief Engineer (OCE) and Safety and Mission Assurance (OSMA) formed a team led by Heather Rarick and Sally Godfrey to conduct this benchmarking study. The primary goals of the study are to identify best practices that: Improve the management and technical development of software intensive systems; Have a track record of successful deployment by aerospace industries, universities [including research and development (R&D) laboratories], and defense services, as well as NASA's own component Centers; and Identify candidate solutions for NASA's software issues. Beginning in the late fall of 2010, focus topics were chosen and interview questions were developed, based on the NASA top software challenges. Between February 2011 and November 2011, the Benchmark Team interviewed a total of 18 organizations, consisting of five NASA Centers, five industry organizations, four defense services organizations, and four university or university R&D laboratory organizations. A software assurance representative also participated in each of the interviews to focus on assurance and software safety best practices. Interviewees provided a wealth of information on each topic area that included: software policy, software acquisition, software assurance, testing, training, maintaining rigor in small projects, metrics, and use of the Capability Maturity Model Integration (CMMI) framework, as well as a number of special topics that came up in the discussions. NASA's software engineering practices compared favorably with the external organizations in most benchmark areas, but in every topic, there were ways in which NASA could improve its practices. Compared to defense services organizations and some of the industry organizations, one of NASA's notable weaknesses involved communication with contractors regarding its policies and requirements for acquired software. One of NASA's strengths

  20. Flight Software Math Library

    Science.gov (United States)

    McComas, David

    2013-01-01

The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter order, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation, and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance, Navigation and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the libraries to be maintained with the same strategy used in their initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as in other environments, like the GN&C analysts' simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.
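The value of a standardized math basis is easiest to see in miniature. Below is a sketch in Python (the real library is ANSI C, and these names are illustrative, not the library's API) of the kind of convention such a library enforces: every function takes its inputs first, never mutates its arguments, and returns a new value:

```python
# Illustrative 3-vector utilities with one consistent convention:
# inputs first, no mutation, new value returned. Vectors are plain tuples.
import math

def v3_add(a, b):
    """Component-wise sum of two 3-vectors."""
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def v3_dot(a, b):
    """Dot product of two 3-vectors."""
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def v3_norm(a):
    """Euclidean length of a 3-vector."""
    return math.sqrt(v3_dot(a, a))

length = v3_norm((3.0, 4.0, 0.0))
total = v3_add((1.0, 2.0, 3.0), (4.0, 5.0, 6.0))
```

Because every mission sees the same parameter order and naming, a GN&C analyst and a flight software engineer reading `v3_add(a, b)` agree on its meaning without consulting mission-specific code.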

  1. Software Quality Assurance for Nuclear Safety Systems

    Energy Technology Data Exchange (ETDEWEB)

    Sparkman, D R; Lagdon, R

    2004-05-16

The US Department of Energy has undertaken an initiative to improve the quality of software used to design and operate their nuclear facilities across the United States. One aspect of this initiative is to revise or create new directives and guides associated with quality practices for the safety software in its nuclear facilities. Safety software includes the safety structures, systems, and components software and firmware, support software, and design and analysis software used to ensure the safety of the facility. DOE nuclear facilities are unique when compared to commercial nuclear or other industrial activities in terms of the types and quantities of hazards that must be controlled to protect workers, the public, and the environment. Because of these differences, DOE must develop an approach to software quality assurance that ensures appropriate risk mitigation by developing a framework of requirements that accomplishes the following goals: (1) ensures the software processes developed to address nuclear safety in design, operation, construction, and maintenance of its facilities are safe; (2) considers the larger system that uses the software and its impacts; and (3) ensures that software failures do not create unsafe conditions. Software designers for nuclear systems and processes must reduce risks in software applications by incorporating processes that recognize, detect, and mitigate software failure in safety related systems. They must also ensure that fail-safe modes and component testing are incorporated into software design. For nuclear facilities, the consideration of risk is not necessarily sufficient to ensure safety. Systematic evaluation, independent verification, and system safety analysis must be considered for software design, implementation, and operation. The software industry primarily uses risk analysis to determine the appropriate level of rigor applied to software practices. This risk-based approach distinguishes safety

  2. Generic Kalman Filter Software

    Science.gov (United States)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from months to weeks. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains a code for a generic Kalman filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on
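The reusable propagate/update structure the abstract describes can be sketched compactly in Python (the GKF itself is ANSI C; the matrix names F, H, Q, R here are standard Kalman notation and are not taken from the GKF source):

```python
# Minimal sketch of a "generic" linear Kalman filter: one reusable
# propagate/update pair parameterized by the model matrices, instead of
# filter code rewritten per application.
import numpy as np

def kf_propagate(x, P, F, Q):
    """State and covariance propagation: x' = F x, P' = F P F^T + Q."""
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    """Measurement update with gain K = P H^T (H P H^T + R)^-1."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Example: constant-position model estimating a scalar near 1.0.
F = np.array([[1.0]]); H = np.array([[1.0]])
Q = np.array([[1e-4]]); R = np.array([[0.25]])
x = np.array([0.0]); P = np.array([[1.0]])
for z in [1.1, 0.9, 1.05, 0.95]:
    x, P = kf_propagate(x, P, F, Q)
    x, P = kf_update(x, P, np.array([z]), H, R)
```

The application-specific parts (the model matrices and the measurement loop) are exactly what a GKF-style library leaves to user-supplied subfunctions, while the filter mathematics stays fixed and pre-validated.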

  3. Software not as a service

    Science.gov (United States)

    Teal, Tracy

    2017-01-01

With the expansion in the variety, velocity, and volume of data being produced, computing and software development have become crucial elements of astronomy research. However, while we value the research, we place less importance on the development of the software itself, viewing software as a service to research. By viewing software as a service, we underrate the effort and expertise it takes to produce, and the training required for, effective research computing. We also don’t provide support for the people doing the development, often expecting individual developers to provide systems administration, user support, and training, and to produce documentation and user interfaces. With our increased reliance on research computing, accurate and reproducible research requires that software not be separate from the act of conducting research, but an integral component - a part of, rather than a service to, research. Shifts in how we provide data skills and software development training, integrate development into research programs and academic departments, and value software as a product can have an impact on the quality, creativity, and types of research we can conduct.

  4. Software Development Standard Processes (SDSP)

    Science.gov (United States)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; Crean, Kathleen A.; Rinker, George C.; Smith, Thomas P.; Lum, Karen T.; Hanna, Robert A.; Erickson, Daniel E.; Gamble, Edward B., Jr.; Morgan, Scott C.; Kelsay, Michael G.; Newport, Brian J.; Lewicki, Scott A.; Stipanuk, Jeane G.; Cooper, Tonja M.; Meshkat, Leila

    2011-01-01

A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  5. Lean software development in action

    CERN Document Server

    Janes, Andrea

    2014-01-01

    This book illustrates how goal-oriented, automated measurement can be used to create Lean organizations and to facilitate the development of Lean software, while also demonstrating the practical implementation of Lean software development by combining tried and trusted tools. In order to be successful, a Lean orientation of software development has to go hand in hand with a company's overall business strategy. To achieve this, two interrelated aspects require special attention: measurement and experience management. In this book, Janes and Succi provide the necessary knowledge to establish "

  6. Transformer core

    NARCIS (Netherlands)

    Mehendale, A.; Hagedoorn, Wouter; Lötters, Joost Conrad

    2010-01-01

    A transformer core includes a stack of a plurality of planar core plates of a magnetically permeable material, which plates each consist of a first and a second sub-part that together enclose at least one opening. The sub-parts can be fitted together via contact faces that are located on either side

  7. Transformer core

    NARCIS (Netherlands)

    Mehendale, A.; Hagedoorn, Wouter; Lötters, Joost Conrad

    2008-01-01

    A transformer core includes a stack of a plurality of planar core plates of a magnetically permeable material, which plates each consist of a first and a second sub-part that together enclose at least one opening. The sub-parts can be fitted together via contact faces that are located on either side

  8. Crowded Cluster Cores: Algorithms for Deblending in Dark Energy Survey Images

    CERN Document Server

    Zhang, Yuanyuan; Bertin, Emmanuel; Jeltema, Tesla; Miller, Christopher J; Rykoff, Eli; Song, Jeeseon

    2014-01-01

Deep optical images are often crowded with overlapping objects. This is especially true in the cores of galaxy clusters, where images of dozens of galaxies may lie atop one another. Accurate measurements of cluster properties require deblending algorithms designed to automatically extract a list of individual objects and decide what fraction of the light in each pixel comes from each object. We present new software called the Gradient And INterpolation based deblender (GAIN), a secondary deblender to improve deblending of the images of cluster cores. This software relies on the image intensity gradient and on an image interpolation technique usually used to correct flawed terrestrial digital images. We test this software on Dark Energy Survey coadd images. GAIN helps extract unbiased photometry measurements for blended sources. It also helps improve detection completeness while introducing only a modest amount of spurious detections. For example, when applied to deep images simulated with high level o...
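The gradient-following idea behind such deblenders can be illustrated with a toy 1D profile: each pixel is assigned to the peak it reaches by walking uphill along the intensity gradient. This is only an illustration of the principle, not the GAIN algorithm itself, which operates on 2D images and adds interpolation to apportion flux in shared pixels:

```python
# Toy 1D deblending: two overlapping Gaussian profiles are separated by
# assigning each pixel to the local maximum reached by following the
# discrete intensity gradient uphill.
import math

xs = list(range(40))
profile = [math.exp(-((x - 12) ** 2) / 8.0)
           + 0.7 * math.exp(-((x - 26) ** 2) / 8.0)
           for x in xs]

def uphill_peak(i, values):
    """Walk uphill from pixel i until a local maximum is reached."""
    while True:
        left = values[i - 1] if i > 0 else -1.0
        right = values[i + 1] if i < len(values) - 1 else -1.0
        if left <= values[i] >= right:      # local maximum: stop here
            return i
        i = i - 1 if left > right else i + 1  # step toward the larger neighbor

labels = [uphill_peak(i, profile) for i in range(len(profile))]
segments = sorted(set(labels))  # one label per detected peak
```

Here the two blended sources are recovered as two segments, one per peak; a real deblender would then divide the flux of boundary pixels between the segments.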

  9. A C-terminal Hydrophobic, Solvent-protected Core and a Flexible N-terminus are Potentially Required for Human Papillomavirus 18 E7 Protein Functionality

    Energy Technology Data Exchange (ETDEWEB)

    Liu, S.; Tian, Y; Greenaway, F; Sun, M

    2010-01-01

The oncogenic potential of the high-risk human papillomavirus (HPV) relies on the expression of genes specifying the E7 and E6 proteins. To investigate further the variation in oligomeric structure that has been reported for different E7 proteins, an HPV-18 E7 cloned from a Hispanic woman with cervical intraepithelial neoplasia was purified to homogeneity most probably as a stable monomeric protein in aqueous solution. We determined that one zinc ion is present per HPV-18 E7 monomer by amino acid and inductively coupled plasma-atomic emission spectroscopy analysis. Intrinsic fluorescence and circular dichroism spectroscopic results indicate that the zinc ion is important for the correct folding and thermal stability of HPV-18 E7. Hydroxyl radical mediated protein footprinting coupled to mass spectrometry and other biochemical and biophysical data indicate that near the C-terminus, the four cysteines of the two Cys-X2-Cys motifs that are coordinated to the zinc ion form a solvent inaccessible core. The N-terminal LXCXE pRb binding motif region is hydroxyl radical accessible and conformationally flexible. Both factors, the relative flexibility of the pRb binding motif at the N-terminus and the C-terminal metal-binding hydrophobic solvent-protected core, combine together and facilitate the biological functions of HPV-18 E7.

  10. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  11. Parallel execution of chemical software on EGEE Grid

    CERN Document Server

    Sterzel, Mariusz

    2008-01-01

Constant interest among the chemical community in studying larger and larger molecules forces the parallelization of existing computational methods in chemistry and the development of new ones. These are the main reasons for frequent port updates and requests from the community for Grid ports of new packages to satisfy their computational demands. Unfortunately, some parallelization schemes used by chemical packages cannot be directly used in a Grid environment. Here we present a solution for the Gaussian package. The current state of development of Grid middleware allows easy parallel execution of software using any MPI flavour. Unfortunately, many chemical packages do not use MPI for parallelization, so special treatment is needed. Gaussian can be executed in parallel on SMP architectures or via Linda. These require reservation of a certain number of processors/cores on a given WN, or an equal number of processors/cores on each WN, respectively. The current implementation of EGEE middleware does not offer such f...

  12. LTP data analysis software and infrastructure

    Science.gov (United States)

    Nofrarias Serra, Miquel

    The LTP (LISA Technology Package) is the core part of the LISA Pathfinder mission. The main goal of the mission is to study the sources of any disturbances that perturb the motion of the freely-falling test masses from their geodesic trajectories, as well as to test various technologies needed for LISA. The LTP experiment is designed as a sequence of experimental runs in which the performance of the instrument is studied and characterised under different operating conditions. To best optimise subsequent experimental runs, each run must be promptly analysed to ensure that the following ones make the best use of the available knowledge of the instrument. This requires a robust and flexible data analysis software package. The software developed for the LTP Data Analysis is a comprehensive data analysis tool based on MATLAB. The environment provides an object-oriented approach to data analysis which allows the user to design and run data analysis pipelines, either graphically or via scripts. The output objects of the analyses contain a full history of the processing that took place; this history tree can be inspected and used to rebuild the objects. This poster introduces the analysis environment and the concepts that have gone into its design.
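
    The history-tree idea, where every output object records the operations that produced it and can be rebuilt from that record, can be sketched as follows. The real tool is MATLAB-based; the Python class and operation names here are invented for illustration.

```python
# Sketch: analysis objects that carry their full processing history,
# so a result can be inspected and replayed. Names are illustrative.

class Series:
    def __init__(self, data, history=("source",)):
        self.data = list(data)
        self.history = history       # nested tuples form the history tree

    def scale(self, k):
        return Series([k * x for x in self.data],
                      ("scale", k, self.history))

    def detrend(self):
        mean = sum(self.data) / len(self.data)
        return Series([x - mean for x in self.data],
                      ("detrend", self.history))

def rebuild(history, source_data):
    """Replay a recorded history tree against fresh source data."""
    if history[0] == "source":
        return Series(source_data)
    if history[0] == "scale":
        return rebuild(history[2], source_data).scale(history[1])
    if history[0] == "detrend":
        return rebuild(history[1], source_data).detrend()

raw = Series([1.0, 2.0, 3.0])
result = raw.scale(2.0).detrend()
print(result.data)                                    # [-2.0, 0.0, 2.0]
print(rebuild(result.history, [1.0, 2.0, 3.0]).data)  # [-2.0, 0.0, 2.0]
```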

  13. Amalgamation of Personal Software Process in Software ...

    African Journals Online (AJOL)

    evolutionary series of personal software engineering techniques that an engineer learns and ... Article History: Received : 30-04- ... began to realize that software process, plans and methodologies for ..... Executive Strategy. Addison-Wesley ...

  14. Experimental and Analytic Study on the Core Bypass Flow in a Very High Temperature Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Richard Schultz

    2012-04-01

    Core bypass flow has been one of the key issues in the very high temperature reactor (VHTR) design for securing core thermal margins and achieving target temperatures at the core exit. The bypass flow in a prismatic VHTR core occurs through the control element holes and the radial and axial gaps between the graphite blocks for manufacturing and refueling tolerances. These gaps vary over the core life cycle because of the irradiation swelling/shrinkage characteristics of the graphite blocks, such as fuel and reflector blocks, which are the main components of the core's structure. Thus, the core bypass flow occurs in a complicated multidimensional way. The accurate prediction of this bypass flow and counter-measures to minimize it are thus of major importance in assuring core thermal margins and securing higher core efficiency. Despite this importance, there has not been much effort in quantifying and accurately modeling the effect of the core bypass flow. The main objectives of this project were to generate experimental data for validating the software to be used to calculate the bypass flow in a prismatic VHTR core, to validate thermofluid analysis tools and their model improvements, and to identify and assess measures for reducing the bypass flow. To achieve these objectives, tasks were defined to (1) design and construct experiments to generate validation data for software analysis tools, (2) determine the experimental conditions and define the measurement requirements and techniques, (3) generate and analyze the experimental data, (4) validate and improve the thermofluid analysis tools, and (5) identify measures to control the bypass flow and assess their performance in the experiment.
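
    As a rough illustration of why the gaps matter, the bypass fraction can be estimated by splitting a fixed total flow among parallel paths in proportion to their flow conductances. The numbers below are invented for illustration only; they are not VHTR design data and ignore the multidimensional effects the abstract emphasizes.

```python
# Back-of-envelope illustration: a small gap path in parallel with the
# coolant channels diverts a share of the total flow. Values invented.

def split_flow(total, conductances):
    """Split a total flow among parallel paths in proportion to conductance."""
    s = sum(conductances)
    return [total * c / s for c in conductances]

coolant_c = 10.0   # hypothetical conductance of the fuel coolant channels
gap_c = 1.2        # hypothetical conductance of the block gaps
flows = split_flow(100.0, [coolant_c, gap_c])
bypass_fraction = flows[1] / 100.0
print(round(bypass_fraction, 3))   # ~0.107: about 11% bypasses the core
```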

  15. Ontologies for software engineering and software technology

    CERN Document Server

    Calero, Coral; Piattini, Mario

    2006-01-01

    Covers two applications of ontologies in software engineering and software technology: sharing knowledge of the problem domain and using a common terminology among all stakeholders; and filtering the knowledge when defining models and metamodels. This book is of benefit to software engineering researchers in both academia and industry.

  16. Proteaselike sequence in hepatitis B virus core antigen is not required for e antigen generation and may not be part of an aspartic acid-type protease.

    Science.gov (United States)

    Nassal, M; Galle, P R; Schaller, H

    1989-01-01

    The hepatitis B virus (HBV) C gene directs the synthesis of two major gene products: HBV core antigen (HBcAg[p21c]), which forms the nucleocapsid, and HBV e antigen (HBeAg [p17e]), a secreted antigen that is produced by several processing events during its maturation. These proteins contain an amino acid sequence similar to the active-site residues of aspartic acid and retroviral proteases. On the basis of this sequence similarity, which is highly conserved among mammalian hepadnaviruses, a model has been put forward according to which processing to HBeAg is due to self-cleavage of p21c involving the proteaselike sequence. Using site-directed mutagenesis in conjunction with transient expression of HBV proteins in the human hepatoma cell line HepG2, we tested this hypothesis. Our results with HBV mutants in which one or two of the conserved amino acids have been replaced by others suggest strongly that processing to HBeAg does not depend on the presence of an intact proteaselike sequence in the core protein. Attempts to detect an influence of this sequence on the processing of HBV P gene products into enzymatically active viral polymerase also gave no conclusive evidence for the existence of an HBV protease. Mutations replacing the putatively essential aspartic acid showed little effect on polymerase activity. Additional substitution of the likewise conserved threonine residue by alanine, in contrast, almost abolished the activity of the polymerase. We conclude that an HBV protease, if it exists, is functionally different from aspartic acid and retroviral proteases. PMID: 2657101

  17. JPI UML Software Modeling

    Directory of Open Access Journals (Sweden)

    Cristian Vidal Silva

    2015-12-01

    Aspect-Oriented Programming (AOP) extends object-oriented programming (OOP) with aspects to modularize crosscutting behavior on classes: aspects advise base code at join points according to pointcut definitions. However, pointcuts introduce dependencies between aspects and base code, a major obstacle to effective independent development of software modules. Join Point Interfaces (JPI) represent join points as interfaces between classes and aspects, so that these modules do not depend on each other. Nevertheless, like AOP, JPI is a programming methodology; for a complete aspect-oriented software development process, it is also necessary to define JPI requirements and JPI modeling phases. Towards this goal, this article proposes JPI UML class and sequence diagrams for modeling JPI software solutions. The purpose of these diagrams is to make the structure and behavior of JPI programs easier to understand. As an application example, the article applies the proposed JPI UML diagrams to a case study and analyzes the associated JPI code to demonstrate their consistency.
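
    The decoupling idea, where the class and the aspect both reference only a named join point interface rather than each other, can be loosely mimicked outside Java. JPI is an AspectJ-level language mechanism; the Python analogy below is only a conceptual sketch, and all names are invented.

```python
# Conceptual sketch of a Join Point Interface: base code exhibits a
# join point through a named interface, and aspects advise that
# interface. Neither Account nor the aspect names the other.

class CheckingWithdraw:
    """The 'JPI': a named contract for one kind of join point."""
    advisors = []

    @classmethod
    def exhibit(cls, func):
        """Base code marks a method as exhibiting this join point."""
        def wrapped(*args):
            call = func
            for advise in cls.advisors:   # aspects plug in via the interface
                call = advise(call)
            return call(*args)
        return wrapped

class Account:
    def __init__(self, balance):
        self.balance = balance

    @CheckingWithdraw.exhibit
    def withdraw(self, amount):
        self.balance -= amount
        return self.balance

# The aspect: advises the interface, never Account directly.
def logging_advice(proceed):
    def around(self, amount):
        print(f"withdraw {amount}")
        return proceed(self, amount)
    return around

CheckingWithdraw.advisors.append(logging_advice)
acct = Account(100)
print(acct.withdraw(30))   # logs the call, then prints 70
```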

  18. Self-organising software

    CERN Document Server

    Serugendo, Giovanna Di Marzo; Karageorgos, Anthony

    2011-01-01

    Self-organisation, self-regulation, self-repair and self-maintenance are promising conceptual approaches for dealing with complex distributed interactive software and information-handling systems. Self-organising applications dynamically change their functionality and structure without direct user intervention, responding to changes in requirements and the environment. This is the first book to offer an integrated view of self-organisation technologies applied to distributed systems, particularly focusing on multiagent systems. The editors developed this integrated book with three aims: to exp

  19. Controlling Software Piracy.

    Science.gov (United States)

    King, Albert S.

    1992-01-01

    Explains what software manufacturers are doing to combat software piracy, recommends how managers should deal with this problem, and provides a role-playing exercise to help students understand the issues in software piracy. (SR)

  20. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  1. The Earth System Documentation (ES-DOC) Software Process

    Science.gov (United States)

    Greenslade, M. A.; Murphy, S.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.

    2013-12-01

    Earth System Documentation (ES-DOC) is an international project supplying high-quality tools & services in support of earth system documentation creation, analysis and dissemination. It is nurturing a sustainable, standards-based documentation eco-system that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation eco-system and currently supports the following projects: * Coupled Model Inter-comparison Project Phase 5 (CMIP5); * Dynamical Core Model Inter-comparison Project (DCMIP); * National Climate Predictions and Projections Platforms Quantitative Evaluation of Downscaling Workshop. This talk will demonstrate that ES-DOC implements a relatively mature software development process. Taking a pragmatic Agile process as inspiration, ES-DOC: * Iteratively develops and releases working software; * Captures user requirements via a narrative-based approach; * Uses online collaboration tools (e.g. Earth System CoG) to manage progress; * Prototypes applications to validate their feasibility; * Leverages meta-programming techniques where appropriate; * Automates testing whenever sensibly feasible; * Streamlines complex deployments to a single command; * Extensively leverages GitHub and Pivotal Tracker; * Enforces strict separation of the UI from underlying APIs; * Conducts code reviews.

  2. Software Development for JSA Source Jerk Measurement

    Institute of Scientific and Technical Information of China (English)

    LUO; Huang-da; ZHANG; Tao

    2013-01-01

    We have developed a series of experimental measurement systems for the Jordan sub-critical assembly. The source jerk measurement system is used for measuring the reactivity of the sub-critical reactor. It mainly consists of a BF3 neutron detector around the reactor core, a main amplifier, and the data acquisition and processing software. The software acquires neutron pulse data by controlling the DAQ card, and displaying

  3. Ice Cores

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Records of past temperature, precipitation, atmospheric trace gases, and other aspects of climate and environment derived from ice cores drilled on glaciers and ice...

  4. Core BPEL

    DEFF Research Database (Denmark)

    Hallwyl, Tim; Højsgaard, Espen

    extensions. Combined with the fact that the language definition does not provide a formal semantics, it is an arduous task to work formally with the language (e.g. to give an implementation). In this paper we identify a core subset of the language, called Core BPEL, which has fewer and simpler constructs, does not allow omissions, and does not contain ignorable elements. We do so by identifying syntactic sugar, including default values, and ignorable elements in WS-BPEL. The analysis results in a translation from the full language to the core subset. Thus, we reduce the effort needed for working formally with WS-BPEL, as one, without loss of generality, need only consider the much simpler Core BPEL. This report may also be viewed as an addendum to the WS-BPEL standard specification, which clarifies the WS-BPEL syntax and presents the essential elements of the language in a more concise way.
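
    The translation described, from the full language to the core subset, can be sketched as a recursive normalization pass: fill in default attribute values (so no omissions remain) and drop ignorable elements. The element and attribute names below are simplified stand-ins, not the full WS-BPEL vocabulary.

```python
# Sketch of desugaring toward a core subset: expand defaults and strip
# ignorable elements from a tree of activities. Vocabulary simplified.

DEFAULTS = {"suppressJoinFailure": "no", "exitOnStandardFault": "no"}
IGNORABLE = {"documentation"}

def to_core(element):
    """Normalize one activity: fill defaults, recurse, drop ignorables."""
    attrs = {**DEFAULTS, **element.get("attrs", {})}   # explicit wins
    children = [to_core(c) for c in element.get("children", [])
                if c["tag"] not in IGNORABLE]
    return {"tag": element["tag"], "attrs": attrs, "children": children}

process = {
    "tag": "process",
    "children": [
        {"tag": "documentation"},                       # ignorable
        {"tag": "invoke", "attrs": {"suppressJoinFailure": "yes"}},
    ],
}
core = to_core(process)
print([c["tag"] for c in core["children"]])                  # ['invoke']
print(core["children"][0]["attrs"]["suppressJoinFailure"])   # yes
```

    After this pass every attribute is explicit and every remaining element is meaningful, which is what lets a formal treatment consider only the core subset without loss of generality.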

  6. Core benefits

    National Research Council Canada - National Science Library

    Keith, Brian W

    2010-01-01

    This SPEC Kit explores the core employment benefits of retirement and of life, health, and other insurance: benefits that are typically decided by the parent institution and are often subject to significant governmental regulation...

  7. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  8. Modern Tools for Modern Software

    Energy Technology Data Exchange (ETDEWEB)

    Kumfert, G; Epperly, T

    2001-10-31

    This is a proposal for a new software configure/build tool for building, maintaining, deploying, and installing software. At its completion, this new tool will replace current standard tool suites such as "autoconf", "automake", and "libtool", and the de facto standard build tool, "make". This ambitious project is born of the realization that as scientific software has grown in size and complexity over the years, the difficulty of configuring and building software has increased as well. For high-performance scientific software, additional complexity often arises from the need for portability to multiple platforms (including many one-of-a-kind platforms), multilanguage implementations, use of third-party libraries, and the need to adapt algorithms to the specific features of the hardware. Development of scientific software is being hampered by the quality of the configuration and build tools commonly available. Inordinate amounts of time and expertise are required to develop and maintain the configure and build system for even a moderately complex project. Better build and configure tools will increase developer productivity. This proposal is a first step in a process of shoring up the foundation upon which DOE software is created and used.

  9. Interface-based software integration

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-07-01

    Enterprise architecture frameworks define the goals of enterprise architecture in order to make business processes and IT operations more effective, and to reduce the risk of future investments. These frameworks offer different architecture development methods that help in building enterprise architecture. In practice, the larger organizations become, the larger their enterprise architecture and IT become, which makes enterprise architecture development and maintenance increasingly complex. Application software architecture is one type of architecture that, along with business architecture, data architecture, and technology architecture, composes enterprise architecture. From the perspective of integration, enterprise architecture can be considered a system of interaction between multiple instances of application software. Therefore, effective software integration is a very important basis for the future success of the enterprise architecture in question. This article presents interface-based integration practice in order to help simplify the process of building such a software integration system. The main goal of interface-based software integration is to solve problems that may arise with software integration requirements and with developing software integration architecture.
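
    Interface-based integration as described can be sketched as follows: both parties code only against one agreed interface, so either implementation can be swapped without touching the other. All names are illustrative, not taken from the article.

```python
# Sketch of interface-based integration: an explicit contract decouples
# the consumer from the concrete systems behind it. Names invented.

from abc import ABC, abstractmethod

class OrderService(ABC):
    """The integration contract both parties code against."""
    @abstractmethod
    def place_order(self, item: str, qty: int) -> str: ...

class LegacyErpAdapter(OrderService):
    """Adapts a hypothetical legacy ERP call to the shared interface."""
    def place_order(self, item, qty):
        return f"ERP-{item}-{qty}"

class CloudApiAdapter(OrderService):
    """Adapts a hypothetical cloud API to the same interface."""
    def place_order(self, item, qty):
        return f"CLOUD/{item}/{qty}"

def submit(service: OrderService):
    # The consumer never knows which system it is talking to.
    return service.place_order("widget", 3)

print(submit(LegacyErpAdapter()))   # ERP-widget-3
print(submit(CloudApiAdapter()))    # CLOUD/widget/3
```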

  10. Practical support for Lean Six Sigma software process definition using IEEE software engineering standards

    CERN Document Server

    Land, Susan K; Walz, John W

    2012-01-01

    Practical Support for Lean Six Sigma Software Process Definition: Using IEEE Software Engineering Standards addresses the task of meeting the specific documentation requirements in support of Lean Six Sigma. This book provides a set of templates supporting the documentation required for basic software project control and management and covers the integration of these templates for their entire product development life cycle. Find detailed documentation guidance in the form of organizational policy descriptions, integrated set of deployable document templates, artifacts required in suppo

  11. Programming Makes Software; Support Makes Users

    Science.gov (United States)

    Batcheller, A. L.

    2010-12-01

    Skilled software engineers may build fantastic software for climate modeling, yet fail to achieve their project's objectives. Software support and related activities are just as critical as writing software. This study followed three different software projects in the climate sciences, using interviews, observation, and document analysis to examine the value added by support work. Supporting the project and interacting with users was a key task for software developers, who often spent 50% of their time on it. Such support work most often involved replying to questions on an email list, but also included talking to users on teleconference calls and in person. Software support increased adoption by building the software's reputation and showing individuals how the software could meet their needs. In the process of providing support, developers often learned of new requirements as users reported features they desired and bugs they had found. As software matures and gains widespread use, support work often increases. In fact, such an increase can be one signal that the software has achieved broad acceptance. Maturing projects also find demand for instructional classes, online tutorials, and detailed examples of how to use the software. The importance of support highlights the fact that building software systems involves both social and technical aspects. Yes, we need to build the software, but we also need to "build" the users and practices that can take advantage of it.

  12. Models for composing software : an analysis of software composition and objects

    NARCIS (Netherlands)

    Bergmans, Lodewijk

    1999-01-01

    In this report, we investigate component-based software construction with a focus on composition. In particular we try to analyze the requirements and issues for components and software composition. As a means to understand this research area, we introduce a canonical model for representing software

  13. Quality Assurance in Software Development: An Exploratory Investigation in Software Project Failures and Business Performance

    Science.gov (United States)

    Ichu, Emmanuel A.

    2010-01-01

    Software quality is perhaps one of the most sought-after attributes in product development, however; this goal is unattained. Problem factors in software development and how these have affected the maintainability of the delivered software systems requires a thorough investigation. It was, therefore, very important to understand software…

  15. Maximizing ROI on software development

    CERN Document Server

    Sikka, Vijay

    2004-01-01

    A brief review of software development history. Software complexity crisis. Software development ROI. The case for global software development and testing. Software quality and test ROI. How do you implement global software development and testing. Case studies.

  16. Software for the LHCb experiment

    CERN Document Server

    Corti, Gloria; Belyaev, Ivan; Cattaneo, Marco; Charpentier, Philippe; Frank, Markus; Koppenburg, Patrick; Mato-Vila, P; Ranjard, Florence; Roiser, Stefan

    2006-01-01

    LHCb is an experiment for precision measurements of CP-violation and rare decays in B mesons at the LHC collider at CERN. The LHCb software development strategy follows an architecture-centric approach as a way of creating a resilient software framework that can withstand changes in requirements and technology over the expected long lifetime of the experiment. The software architecture, called GAUDI, supports event data processing applications that run in different processing environments ranging from the real-time high-level triggers in the online system to the final physics analysis performed by more than one hundred physicists. The major architectural design choices and the arguments that lead to these choices will be outlined. Object oriented technologies have been used throughout. Initially developed for the LHCb experiment, GAUDI has been adopted and extended by other experiments. Several iterations of the GAUDI software framework have been released and are now being used routinely by the physicists of...

  17. Multi-core Architectures and Streaming Applications

    NARCIS (Netherlands)

    Smit, Gerard J.M.; Kokkeler, André B.J.; Wolkotte, Pascal T.; Burgwal, van de Marcel D.; Mandoiu, I.; Kennings, A.

    2008-01-01

    In this paper we focus on algorithms and reconfigurable multi-core architectures for streaming digital signal processing (DSP) applications. The multi-core concept has a number of advantages: (1) depending on the requirements more or fewer cores can be switched on/off, (2) the multi-core structure f

  18. N-glycan containing a core α1,3-fucose residue is required for basipetal auxin transport and gravitropic response in rice (Oryza sativa).

    Science.gov (United States)

    Harmoko, Rikno; Yoo, Jae Yong; Ko, Ki Seong; Ramasamy, Nirmal Kumar; Hwang, Bo Young; Lee, Eun Ji; Kim, Ho Soo; Lee, Kyung Jin; Oh, Doo-Byoung; Kim, Dool-Yi; Lee, Sanghun; Li, Yang; Lee, Sang Yeol; Lee, Kyun Oh

    2016-10-01

    In plants, α1,3-fucosyltransferase (FucT) catalyzes the transfer of fucose from GDP-fucose to asparagine-linked GlcNAc of the N-glycan core in the medial Golgi. To explore the physiological significance of this processing, we isolated two Oryza sativa (rice) mutants (fuct-1 and fuct-2) with loss of FucT function. Biochemical analyses of the N-glycan structure confirmed that α1,3-fucose is missing from the N-glycans of allelic fuct-1 and fuct-2. Compared with the wild-type cv Kitaake, fuct-1 displayed a larger tiller angle, shorter internode and panicle lengths, and decreased grain filling as well as an increase in chalky grains with abnormal shape. The mutant allele fuct-2 gave rise to similar developmental abnormalities, although they were milder than those of fuct-1. Restoration of a normal tiller angle in fuct-1 by complementation demonstrated that the phenotype is caused by the loss of FucT function. Both fuct-1 and fuct-2 plants exhibited reduced gravitropic responses. Expression of the genes involved in tiller and leaf angle control was also affected in the mutants. We demonstrate that reduced basipetal auxin transport and low auxin accumulation at the base of the shoot in fuct-1 account for both the reduced gravitropic response and the increased tiller angle.

  19. Teams at Their Core: Implementing an “All LANDS Approach to Conservation” Requires Focusing on Relationships, Teamwork Process, and Communications

    Directory of Open Access Journals (Sweden)

    Kasey R. Jacobs

    2017-07-01

    The U.S. Forest Service finds itself in an era of intense human activity; a changing climate; development and loss of open space; resource consumption; problematic introduced species; and diversity in core beliefs and values. These challenges test our task-relevant maturity and the ability and willingness to meet the growing demands for services. The Forest Service is now on a transformative campaign to improve its abilities and meet these challenges. The "All-Lands Approach to Conservation" brings agencies, organizations, landowners, and stakeholders together across boundaries to decide on common goals for the landscapes they share. This approach is part of a larger transformation occurring in the American Conservation Movement, in which large-scale conservation partnerships possibly define the fourth, or contemporary, era. The intent of this communication is to present one perspective on what large-scale conservation partnerships should include, namely an emphasis on rethinking what leadership looks like in a collaborative context, relational governance, cooperative teamwork procedures, and communications.

  20. Automation of Flight Software Regression Testing

    Science.gov (United States)

    Tashakkor, Scott B.

    2016-01-01

    NASA is developing the Space Launch System (SLS) to be a heavy lift launch vehicle supporting human and scientific exploration beyond Earth orbit. SLS will have a common core stage, an upper stage, and different permutations of boosters and fairings to perform various crewed or cargo missions. Marshall Space Flight Center (MSFC) is writing the Flight Software (FSW) that will operate the SLS launch vehicle. The FSW is developed in an incremental manner based on "Agile" software techniques. As the FSW is incrementally developed, the functionality of the code needs to be tested continually to ensure that the integrity of the software is maintained. Manually testing the functionality of an ever-growing set of requirements and features is not an efficient solution and therefore testing needs to be done automatically to ensure it is comprehensive. To support test automation, a framework for a regression test harness has been developed and used on SLS FSW. The test harness provides a modular design approach that can compile or read in the required information specified by the developer of the test. The modularity provides independence between groups of tests and the ability to add and remove tests without disturbing others. This provides the SLS FSW team a time-saving feature that is essential to meeting SLS Program technical and programmatic requirements. During development of SLS FSW, this technique has proved to be a useful tool to ensure all requirements have been tested, and that desired functionality is maintained, as changes occur. It also provides a mechanism for developers to check functionality of the code that they have developed. With this system, automation of regression testing is accomplished through a scheduling tool and/or commit hooks. Key advantages of this test harness capability include execution support for multiple independent test cases, the ability for developers to specify precisely what they are testing and how, the ability to add
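
    A modular, registry-based harness of the kind described, with independent test cases that can be added or removed without disturbing the others, can be sketched as follows. This is a generic illustration, not the actual SLS FSW harness; all names are invented.

```python
# Sketch of a modular regression-test harness: cases self-register via a
# decorator, and a single entry point runs them all independently.

REGISTRY = []

def regression_test(func):
    """Decorator: register an independent test case."""
    REGISTRY.append(func)
    return func

@regression_test
def test_throttle_limits():
    # hypothetical check: a command is clamped to [0, 1]
    assert max(0.0, min(1.0, 1.7)) == 1.0

@regression_test
def test_mode_transition():
    # hypothetical check on a mode-name normalization step
    assert "ASCENT".lower() == "ascent"

def run_all():
    """Run every registered case; one failure does not stop the rest."""
    results = {}
    for case in REGISTRY:
        try:
            case()
            results[case.__name__] = "PASS"
        except AssertionError:
            results[case.__name__] = "FAIL"
    return results

print(run_all())
```

    Hooking `run_all()` into a scheduler or a commit hook is what turns this from a manual check into the continuous regression testing the abstract describes.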