WorldWideScience

Sample records for asc software quality

  1. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    Energy Technology Data Exchange (ETDEWEB)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  2. Trilinos developers SQE guide : ASC software quality engineering practices.

    Energy Technology Data Exchange (ETDEWEB)

    Willenbring, James Michael; Heroux, Michael Allen

    2013-05-01

    The Trilinos Project is an effort to develop algorithms and enabling technologies within an object-oriented software framework for the solution of large-scale, complex multi-physics engineering and scientific problems. A new software capability is introduced into Trilinos as a package. A Trilinos package is an integral unit and, although there are exceptions such as utility packages, each package is typically developed by a small team of experts in a particular algorithms area such as algebraic preconditioners, nonlinear solvers, etc. The Trilinos Developers SQE Guide is a resource for Trilinos package developers who are working under Advanced Simulation and Computing (ASC) and are therefore subject to the ASC Software Quality Engineering Practices as described in the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan: ASC Software Quality Engineering Practices Version 3.0 document [1]. The Trilinos Developer Policies webpage [2] contains detailed information that is essential for all Trilinos developers. The Trilinos Software Lifecycle Model [3] defines the default lifecycle model for Trilinos packages and provides a context for many of the practices listed in this document.

  3. Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan. Part 2, Mappings for the ASC software quality engineering practices. Version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, Molly A.; Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management's and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  4. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management's and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  5. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 001.3.2 and CPR 001.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  6. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the DOE/AL Quality Criteria (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines' (GP&G). This quality plan identifies ASC management's and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  7. Sandia National Laboratories Advanced Simulation and Computing (ASC) : appraisal method for the implementation of the ASC software quality engineering practices: Version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Turgeon, Jennifer; Minana, Molly A.

    2008-02-01

    This document provides a guide to the process of conducting software appraisals under the Sandia National Laboratories (SNL) ASC Program. The goal of this document is to describe a common methodology for planning, conducting, and reporting results of software appraisals, thereby enabling: development of an objective baseline on implementation of the software quality engineering (SQE) practices identified in the ASC Software Quality Plan across the ASC Program; feedback from project teams on SQE opportunities for improvement; identification of strengths and opportunities for improvement for individual project teams; and guidance to the ASC Program on the focus of future SQE activities. Document contents include process descriptions, templates to promote consistent conduct of appraisals, and an explanation of the relationship of this procedure to the SNL ASC software program.

  8. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  9. Computer software quality assurance

    International Nuclear Information System (INIS)

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  10. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  11. Fostering software quality assessment

    OpenAIRE

    Brandtner, Martin

    2013-01-01

    Software quality assessment shall monitor and guide the evolution of a system based on quality measurements. This continuous process should ideally involve multiple stakeholders and provide adequate information for each of them to use. We want to support an effective selection of quality measurements based on the type of software and individual information needs of the involved stakeholders. We propose an approach that brings together quality measurements and individual information needs for ...

  12. Software product quality control

    CERN Document Server

    Wagner, Stefan

    2013-01-01

    Quality is not a fixed or universal property of software; it depends on the context and goals of its stakeholders. Hence, when you want to develop a high-quality software system, the first step must be a clear and precise specification of quality. Yet even if you get it right and complete, you can be sure that it will become invalid over time. So the only solution is continuous quality control: the steady and explicit evaluation of a product's properties with respect to its updated quality goals.This book guides you in setting up and running continuous quality control in your environment. Star

  13. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  14. Software quality and agile methods

    OpenAIRE

    Huo, Ming; Verner, June; Zhu, Liming; Ali Babar, Muhammad

    2004-01-01

    Agile methods may produce software faster, but we also need to know how they meet our quality requirements. In this paper we compare the waterfall model with agile processes to show how agile methods achieve software quality under time pressure and in an unstable requirements environment, i.e. we analyze agile software quality assurance. We present a detailed waterfall model showing its software quality support processes. We then show the quality pra...

  15. SOFTWARE QUALITY MODELS: SYSTEMATIC STUDY, USES IN SOFTWARE QUALITY ENGINEERING

    OpenAIRE

    DINESH KUMAR; KAPIL KUMAR

    2012-01-01

    Software quality models are a well-accepted means to support quality management of software systems. Over the last 30 years, a multitude of quality models have been proposed and applied with varying degrees of success. Despite successes and standardization efforts, quality models are still being criticized, as their application in practice exhibits various problems. To some extent, this criticism is caused by an unclear definition of what quality models with respect to their intended mode of ...

  16. Software Development Practices in Global Software Work : Developing Quality Software

    OpenAIRE

    2005-01-01

    This thesis is about software development practices, including the project management aspects, in the context of global software outsourcing. It was focused on the issues of achieving quality product namely here: software. It is built on the premise that the global context, in which the stakeholders are geographically separated by national boundaries, poses unique and inherent challenges derived from separation of place, time and culture.

  17. ASC-1

    DEFF Research Database (Denmark)

    Jakimoski, Goce; Khajuria, Samant

    2011-01-01

    Unfortunately, the use of a block cipher as a building block limits the performance of the authenticated encryption schemes to at most one message block per block cipher evaluation. In this paper, we propose the authenticated encryption scheme ASC-1 (Authenticating Stream Cipher One). Similarly to LEX, ASC-1 uses leak extraction from different AES rounds to compute the key material that is XOR-ed with the message to compute the ciphertext. Unlike LEX, ASC-1 operates in a CFB fashion to compute an authentication tag over the encrypted message. We argue that ASC-1 is secure by reducing its (IND-CCA, INT...

  18. Software Quality Assurance Audits Guidebooks

    Science.gov (United States)

    1990-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes that are used in software development. The Software Assurance Guidebook, NASA-GB-A201, issued in September 1989, provides an overall picture of the NASA concepts and practices in software assurance. Second-level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the second-level Software Quality Assurance Audits Guidebook that describes software quality assurance audits in a way that is compatible with practices at NASA Centers.

  19. Quality assurance in computing software

    International Nuclear Information System (INIS)

    The paper concerns quality assurance in computing software as applied to the nuclear industry. The emergence of Software Quality Management in systems procurement over the last decade is discussed, as are some of the underlying reasons for its important role in modern procurement practice. Some of the typical aspects of control are highlighted and discussed. (author)

  20. Software quality engineering a practitioner's approach

    CERN Document Server

    Suryn, Witold

    2014-01-01

    Software quality stems from two distinctive, but associated, topics in software engineering: software functional quality and software structural quality. Software Quality Engineering studies the tenets of both of these notions, which focus on the efficiency and value of a design, respectively. The text addresses engineering quality on both the application and system levels with attention to Information Systems and Embedded Systems as well as recent developments. Targeted at graduate engineering students and software quality specialists, the book analyzes the relationship between functionality

  1. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  2. Quality of the Open Source Software

    OpenAIRE

    Tariq, Muhammad Tahir and Aleem

    2008-01-01

    Quality and security of software are key factors in software development. This thesis deals with the quality of open source software (OSS for short); different questions related to open source and closed source software are discussed in the thesis proposal. Open source software is a process by which we can produce cheap, quality software whose source can be reused in the development of further software. Closed source software is more expensive than open source software ...

  3. Outsourcing Software quality

    OpenAIRE

    Kaur, Amanpreet

    2013-01-01

    The key factors which have led to a growing trend of outsourcing are: lack of expert labor in some portions of the business process; availability of cheaper labor, whilst not compromising on the quality of output; and the ability and feasibility to concentrate on other crucial business processes. These factors have specifically contributed to most of the outsourced partners across different locations in the world. Expertise in communication capabilities, technical expertise and favorable financial pa...

  4. Reliable Software Development with Proposed Quality Oriented Software Testing Metrics

    OpenAIRE

    Latika Kharb; Dr. Vijay Singh Rathore

    2011-01-01

    For an effective test measurement, a software tester requires testing metrics that can measure the quality and productivity of the software development process while increasing its reusability, correctness and maintainability. Until now, the understanding of measuring software quality is not yet sophisticated enough and is still far from being standardized; in order to assess software quality, an appropriate set of software metrics needs to be identified that could express th...

  5. Improving software quality with software error prediction

    OpenAIRE

    Taipale, T. (Taneli)

    2015-01-01

    Today's agile software development can be a complicated process, especially when dealing with a large-scale project with demands for tight communication. The tools used in software development, while aiding the process itself, can also offer meaningful statistics. With the aid of machine learning, these statistics can be used for predicting the behavior patterns of the development process. The starting point of this thesis is a software project developed to be a part of a large telecommun...

  6. Software quality - how is it achieved?

    International Nuclear Information System (INIS)

    Although software quality can't be quantified, the tools and techniques to achieve high quality are available. As management stresses the need for definable software quality programs from vendors and subcontractors and provides the incentives for these programs, the quality of software will improve. EPRI could provide the leadership in establishing guidelines for a balanced software quality program and through workshops provide training to utility staff and management on the methods for evaluating the characteristics of quality software. With the more complex systems discussed at this workshop and particularly with the trend toward the use of artificial intelligence, the importance of quality software will grow dramatically

  7. Reliable Software Development with Proposed Quality Oriented Software Testing Metrics

    Directory of Open Access Journals (Sweden)

    Latika Kharb

    2011-07-01

    For an effective test measurement, a software tester requires testing metrics that can measure the quality and productivity of the software development process while increasing its reusability, correctness and maintainability. Until now, the understanding of measuring software quality is not yet sophisticated enough and is still far from being standardized; in order to assess software quality, an appropriate set of software metrics needs to be identified that could express these quality attributes. Our research objective in this paper is to construct and define a set of easy-to-measure software metrics for testing to be used as early indicators of external measures of quality. So, we have emphasized the fact that reliable software development with respect to quality can be well achieved by using our set of testing metrics, and for that we have given the practical results of evaluation.

  8. Integrated modeling of software cost and quality

    International Nuclear Information System (INIS)

    In modeling the cost and quality of software systems, the relationship between cost and quality must be considered. This explicit relationship is dictated by the criticality of the software being developed. The balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and the developers with respect to the processes being employed
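
    The cost/quality relationship described above can be sketched as a toy model. Everything below is a hypothetical illustration: the functional form, the baseline error rate, and the mapping from error rate to verification-and-validation labor are assumptions of this sketch, not equations published in the record.

```python
# Toy integrated cost/quality model (assumed form and coefficients).

def expected_errors(size_kloc: float, criticality: float,
                    process_maturity: float) -> float:
    """Expected discoverable errors for a software product.

    criticality: multiplier > 1 for safety-critical software
    process_maturity: 0..1, fraction of errors prevented by process rigor
    """
    base_rate = 6.0  # assumed baseline errors per KLOC
    return size_kloc * base_rate * criticality * (1.0 - process_maturity)

def ivv_labor_fraction(error_rate_per_kloc: float) -> float:
    """Assumed mapping from product error rate to the fraction of project
    labor allocated to independent verification and validation (IV&V)."""
    return min(0.5, 0.05 + 0.02 * error_rate_per_kloc)

errors = expected_errors(size_kloc=40, criticality=1.5, process_maturity=0.7)
print(round(errors))                              # 108 expected errors
print(round(ivv_labor_fraction(errors / 40), 3))  # 0.104
```

    The point of such a model is the trade-off the abstract describes: raising criticality or lowering process maturity drives the expected error rate up, which in turn drives up the labor fraction (cost) needed to discover those errors before release.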

  9. Framework of Software Quality Management Using Object oriented Software Agent

    Directory of Open Access Journals (Sweden)

    Anand Pandey

    2013-01-01

    Development of software is a scientific and economic problem, particularly the design of complex systems, which require evolving methods and approaches. Agent technology is currently one of the most active and vibrant areas of IT research and development. Object-oriented Software Engineering (OOSE) has become an active area of research in recent years. In this paper, we review the framework of software quality management using object-oriented methodology concepts for software agents. The software specification acts as a bridge between customers, architects, software developers and testers. Using the object-oriented concept of a software agent and its standard may offer benefits even if the system is implemented without an object-based language or framework. We propose and discuss a software agent framework, specifically to support software quality management. Although still in its initial phases, research indicates some promise in enabling software developers to meet market expectations and produce projects timeously, within budget and to users' satisfaction. However, the software quality management environment has also changed and is continuously evolving. Currently software projects are developed and deployed in distributed, pervasive and collaborative environments, and their quality should be managed by applying the best standards. From the point of view of software engineering, this framework and its standards apply to developing software projects. We discuss the standards and benefits that can be gained by using object-oriented concepts, and where the concepts require further development.

  10. Software quality research: why, what and how

    OpenAIRE

    Parnas, David Lorge

    2003-01-01

    The Professor David Lorge Parnas Inaugural Lecture, discussing why software quality research is important, what topics he will be studying and how research at the Software Quality Research Laboratory (SQRL) will be conducted.

  11. SWiFT Software Quality Assurance Plan.

    Energy Technology Data Exchange (ETDEWEB)

    Berg, Jonathan Charles [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

    This document describes the software development practice areas and processes which contribute to the ability of SWiFT software developers to provide quality software. These processes are designed to satisfy the requirements set forth by the Sandia Software Quality Assurance Program (SSQAP). APPROVALS: the SWiFT Software Quality Assurance Plan (SAND2016-0765) was approved by Dave Minster, Department Manager (6121); Jonathan White, SWiFT Site Lead (6121); and Jonathan Berg, SWiFT Controls Engineer (6121). CHANGE HISTORY: Issue A, 2016/01/27, Jon Berg (06121), initial release of the SWiFT Software Quality Assurance Plan.

  12. Software Metrics to Estimate Software Quality using Software Component Reusability

    OpenAIRE

    Prakriti Trivedi; Rajeev Kumar

    2012-01-01

    Today most applications are developed using existing libraries, code, open source components, etc. When a piece of code is accessed in a program, it is represented as a software component; Java beans and .NET ActiveX controls, for example, are software components. These components are ready-to-use programming code or controls that speed up code development. A component-based software system embodies the concept of software reusability. While using these components, the main question that arises is whether ...

  13. Quality Management Activities for Software Architecture and Software Architecture Process

    OpenAIRE

    Hämäläinen, Niina

    2008-01-01

    Architecture processes are considerably new parts of organisations’ processes. These processes have the responsibility to aim at high quality and financially successful architectures. However, the activities which promote this aim are not clearly defined yet. This study reviews literature and practitioners’ experiences on quality management activities that could be suggested to promote the achievement of high quality software architectures and a good quality software a...

  14. EFFECT OF REFACTORING ON SOFTWARE QUALITY

    Directory of Open Access Journals (Sweden)

    Noble Kumari

    2014-05-01

    Software quality is an important issue in the development of successful software applications. Many methods have been applied to improve software quality; refactoring is one of those methods. But the effect of refactoring in general on all the software quality attributes is ambiguous. The goal of this paper is to find out the effect of various refactoring methods on quality attributes and to classify them based on their measurable effect on a particular software quality attribute. The paper focuses on studying the Reusability, Complexity, Maintainability, Testability, Adaptability, Understandability, Fault Proneness, Stability and Completeness attributes of software. This, in turn, will assist the developer in determining whether to apply a certain refactoring method to improve a desirable quality attribute.
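
    The kind of refactoring whose quality effects the record studies can be shown with a minimal Extract Method example (a generic sketch, not one of the paper's own case studies): behavior is unchanged, while understandability and testability improve because each step becomes a small, separately testable unit.

```python
# Before: one function mixes parsing, validation, and aggregation.
def report_before(lines):
    total = 0
    for line in lines:
        parts = line.split(",")
        if len(parts) == 2 and parts[1].strip().isdigit():
            total += int(parts[1])
    return total

# After Extract Method: each step is its own function. Behavior is
# identical, but parse() and is_valid() can now be tested in isolation.
def parse(line):
    name, _, value = line.partition(",")
    return name.strip(), value.strip()

def is_valid(value):
    return value.isdigit()

def report_after(lines):
    return sum(int(v) for _, v in map(parse, lines) if is_valid(v))

data = ["widgets, 3", "gadgets, 4", "bad line"]
print(report_before(data))  # 7
print(report_after(data))   # 7
```

    As the abstract notes, the same transformation can cut both ways: extracting methods tends to help Understandability and Testability but adds indirection, which is one reason the effect on each quality attribute has to be measured rather than assumed.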

  15. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM XS (TXS) system is classified 1E, as defined in the Institute of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to its safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  16. The 7 Qualities of Highly Secure Software

    CERN Document Server

    Paul, Mano

    2012-01-01

    The 7 Qualities of Highly Secure Software provides a framework for designing, developing, and deploying hacker-resilient software. It uses engaging anecdotes and analogies, ranging from Aesop's fables, athletics, architecture, biology, nursery rhymes, and video games, to illustrate the qualities that are essential for the development of highly secure software. Each chapter details one of the seven qualities that can make your software highly secure and less susceptible to hacker threats. Leveraging real-world experiences and examples, the book: Explains complex security concepts in language that

  17. Software Quality Assurance for Nuclear Safety Systems

    International Nuclear Information System (INIS)

    The US Department of Energy has undertaken an initiative to improve the quality of software used to design and operate its nuclear facilities across the United States. One aspect of this initiative is to revise or create new directives and guides associated with quality practices for the safety software in its nuclear facilities. Safety software includes the safety structures, systems, and components software and firmware; support software; and design and analysis software used to ensure the safety of the facility. DOE nuclear facilities are unique when compared to commercial nuclear or other industrial activities in terms of the types and quantities of hazards that must be controlled to protect workers, the public, and the environment. Because of these differences, DOE must develop an approach to software quality assurance that ensures appropriate risk mitigation by developing a framework of requirements that accomplishes the following goals:
    - Ensures the software processes developed to address nuclear safety in design, operation, construction and maintenance of its facilities are safe
    - Considers the larger system that uses the software and its impacts
    - Ensures that software failures do not create unsafe conditions
    Software designers for nuclear systems and processes must reduce risks in software applications by incorporating processes that recognize, detect, and mitigate software failure in safety-related systems. They must also ensure that fail-safe modes and component testing are incorporated into software design. For nuclear facilities, the consideration of risk is not necessarily sufficient to ensure safety. Systematic evaluation, independent verification, and system safety analysis must be considered for software design, implementation, and operation. The software industry primarily uses risk analysis to determine the appropriate level of rigor applied to software practices. This risk-based approach distinguishes safety

  18. Proceedings of the Fifth Triennial Software Quality Forum 2000, Software for the Next Millennium, Software Quality Forum

    Energy Technology Data Exchange (ETDEWEB)

    Scientific Software Engineering Group, CIC-12

    2000-04-01

    The Software Quality Forum is a triennial conference held by the Software Quality Assurance Subcommittee for the Department of Energy's Quality Managers. The forum centers on key issues, information, and technology important in software development for the Nuclear Weapons Complex. This year it will be opened up to include local information technology companies and software vendors presenting their solutions, ideas, and lessons learned. The Software Quality Forum 2000 will take on a more hands-on, instructional tone than those previously held. There will be an emphasis on providing information, tools, and resources to assist developers in their goal of producing next generation software.

  19. Increasing Software Quality using the Provenance of Software Development Processes

    OpenAIRE

    Schreiber, Andreas

    2013-01-01

    Today’s software development processes are complex. A lot of interaction occurs between developers, the tools they use, and even automatically between different tools. Examples of such interactions are entering a new requirement into the bug tracking system, committing new source code to the repository, or an automatic code style check during a check-in. Tracing and understanding the full process is hard. To get insight into these processes and to increase the quality of the resulting software re...

  20. Quality assurance of custom software solutions

    OpenAIRE

    Herblan, Miha

    2014-01-01

    This thesis looks at the problem of software quality assurance, especially in custom-solution projects. First it examines what quality assurance is, how it is measured and, in particular, why it is done. It then covers the main principles that apply when dealing with quality assurance in general. Since quality assurance is an extensive topic, it takes a more detailed look at its most widely used part, software testing. Because we are talking about softwa...

  1. ASC Champ Orbit Model

    DEFF Research Database (Denmark)

    Riis, Troels; Jørgensen, John Leif

    1999-01-01

    This document describes a test of the implementation of the ASC orbit model for the Champ satellite.

  2. Software Quality Evaluation for Security Cots Products

    OpenAIRE

    Villalba de Benito, María Teresa; Fernández Sanz, Luis; Juan J. Cuadrado Gallego; José J. Martínez

    2010-01-01

    Increasing demand for security commercial products requires improved methods for evaluating their software quality. Existing standards offer general frameworks, but more specific models are needed that reflect the perception of experts and customers as well as the particular characteristics of this type of product. This article presents a method for generating domain-oriented software quality models for specific types of applications. It is applied to the generation of a model for s...

  3. Software quality: definitions and strategic issues

    OpenAIRE

    Fitzpatrick, Ronan

    1996-01-01

    This paper contains two sections relating to software quality issues. First, the various definitions of software quality are examined and an alternative suggested. It continues with a review of the quality model as defined by McCall, Richards and Walters in 1977 and mentions the later model of Boehm published in 1978. Each of McCall's quality factors is reviewed, and the extent to which they still apply in the late 1990s is commented on. The factors include integrity, reliability, usability, ...

  4. Factors Modulating Software Design Quality

    OpenAIRE

    S., Poornima. U.; V, Suma.

    2014-01-01

    The object-oriented approach is one of the most popular software development approaches for managing complex systems with a massive set of requirements. Unlike the procedural approach, it captures the requirements as sets of data rather than services. Further, a class is considered the key unit of the solution domain, with data and services wrapped together, representing the architectural design of a basic module. Thus, system complexity is directly related to the number of modules and the degree of inter...

  5. Quality of freeware antivirus software

    OpenAIRE

    Rasool, Muhammad Ahsan; Jamal, Abdul

    2011-01-01

    The war between malware and antimalware software started two decades ago and has adopted modern techniques with the evolution of technological development in the field of information technology. This thesis analyzes the performance of freeware antivirus programs available in the market. Several tests were performed to assess these programs against their core responsibilities: scanning for and detecting viruses, and preventing and eradicating them. Al...

  6. Improving Software Quality through Program Analysis

    International Nuclear Information System (INIS)

    In this paper, we present the Program Analysis Framework (PAF) for analyzing the software architecture and modularity of large software packages using techniques from aspect mining. The basic idea of PAF is first to record call-relationship information among the important elements, and then to apply different analysis algorithms to this recorded information to find the crosscutting concerns that could destroy the modularity of the software. We evaluate our framework by analyzing DATE, the ALICE Data-Acquisition (DAQ) software which handles the data flow from the detector electronics to the permanent storage archiving. The analysis results prove the effectiveness and efficiency of our framework. PAF has pinpointed a number of possible optimizations which could be applied and help maximize software quality. PAF could also be used for the analysis of other projects written in the C language.
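    The fan-in step of aspect mining that the abstract describes can be illustrated with a small sketch: record caller/callee pairs, then flag elements called from many distinct places as candidate crosscutting concerns. The element names and the threshold below are hypothetical, not taken from DATE or PAF.

```python
from collections import defaultdict

# Hypothetical call-relationship records (caller, callee), standing in
# for the information a framework like PAF would record from C sources.
calls = [
    ("read_event", "log_msg"), ("write_event", "log_msg"),
    ("init_daq", "log_msg"), ("read_event", "checksum"),
    ("write_event", "checksum"), ("init_daq", "load_config"),
]

def fan_in(calls):
    """Number of distinct callers per callee."""
    callers = defaultdict(set)
    for caller, callee in calls:
        callers[callee].add(caller)
    return {callee: len(cs) for callee, cs in callers.items()}

def crosscutting(calls, threshold=3):
    """Callees reached from >= threshold distinct places: candidate
    crosscutting concerns that may erode modularity."""
    return sorted(f for f, n in fan_in(calls).items() if n >= threshold)

print(crosscutting(calls))  # ['log_msg']
```

    Real aspect-mining tools combine fan-in with clone detection and naming conventions; this sketch shows only the fan-in step.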

  7. SAPHIRE 8 Software Quality Assurance Oversight

    Energy Technology Data Exchange (ETDEWEB)

    Kurt G. Vedros

    2011-09-01

    The software quality assurance oversight consists of updating and maintaining revision control of the SAPHIRE 8 quality assurance program documentation and of monitoring revision control of the SAPHIRE 8 source code. This report summarizes the oversight efforts through description of the revision control system (RCS) setup, operation and contents. Documents maintained under revision control include the Acceptance Test Plan (ATP), Configuration Management Plan, Quality Assurance Plan, Software Project Plan, Requirements Traceability Matrix (RTM), System Test Plan, SDP Interface Training Manual, and the SAPHIRE 8, 'New Features and Capabilities Overview'.

  8. A software perspective of environmental data quality

    International Nuclear Information System (INIS)

    Because of the large amount of complex data in environmental projects, particularly large decontamination and decommissioning projects, the quality of the data has a profound impact on the success and cost of the mission. In every phase of the life cycle of the project, including regulatory intervention and legal proceedings, maintaining the quality of data and presenting data in a timely and meaningful manner are critical. In this paper, a systemic view of data quality management from a software engineering perspective is presented. A method of evaluation evolves from this view. This method complements the principles of the data quality objective. When graded adequately, the method of evaluation establishes a paradigm for ensuring data quality for new and renewed projects. This paper also demonstrates that incorporating good practices of software engineering into the data management process leads to continuous improvement of data quality

  9. Monitoring Burr Type III Software Quality Using SPC

    OpenAIRE

    Smitha Chowdary. Ch; Satya Prasad R; Sobhana. K

    2015-01-01

    The ability of software in satisfying its functional requirements successfully is measured as software reliability, making it one of the most important characteristics of software quality. Improving software processes employed during the software development life cycle is essential to produce reliable software systems of assured quality. Software Reliability Growth Models (SRGMs) aid software engineers and managers in tracking and measuring the growth in reliability as software is being devel...

  10. Software Engineering Management for Productivity and Quality

    Energy Technology Data Exchange (ETDEWEB)

    Karen White

    1999-10-01

    Since the advent of electronic computers, people have struggled to develop effective software engineering processes. While these processes are similar to those used by hardware engineers, the software industry has earned a reputation for late delivery of inadequate products. Most software managers are looking for ways to deliver quality products faster, or with fewer resources. The development time and product outcome of any software project can be influenced by four variables: the product characteristics, the people involved, the processes they use, and the underlying technology. In order to have an impact on the productivity of a software development effort, the manager must focus on and balance these areas. This paper will discuss effective ways to improve productivity by using this approach.

  11. The software quality control for gamma spectrometry

    International Nuclear Information System (INIS)

    One of the major problems with which the quality control program of an environmental measurements laboratory is confronted is the evaluation of the performance of software packages for the analysis of gamma-ray spectra. A program of tests for evaluating the performance of the software package (SPECTRAN-F, Canberra Inc.) used by our laboratory is being carried out. In this first paper, the results of a preliminary study concerning the evaluation of the performance of the doublet analysis routine are presented.

  12. Quality and Knowledge in Software Engineering

    OpenAIRE

    Burton, Stu; Swanson, Kent; Leonard, Lisa

    1993-01-01

    Celite Corporation and Andersen Consulting have developed an advanced approach to traditional software development called the "application software factory" (ASF). The approach is an integration of technology and total quality management techniques that includes the use of an expert system to guide module design and perform module programming. The expert system component is called the knowledge-based design assistant, and its inclusion in the ASF methodology has significantly reduced modul...

  13. Evaluation Framework for Quality Management Software

    Directory of Open Access Journals (Sweden)

    Nadica Hrgarek

    2008-06-01

    Identifying and specifying user requirements is an integral part of information systems design and is critical for project success. More than 50% of the reasons for project failure presented in the CHAOS report [36] and in the study of a US Air Force project by Sheldon et al. [33] are related to requirements. The goal of this paper is to assess the relevant user and software requirements which are the basis for selecting an electronic quality management system in medical device companies. This paper describes the structured evaluation and selection process for different quality management software tools that shall support business processes. The purpose of this paper is to help small to medium-sized medical device companies choose the right quality management software which meets the company's business needs.

  14. How To Improve Software Quality Assurance In Developing Countries

    Directory of Open Access Journals (Sweden)

    Ali Javed

    2012-04-01

    Quality is an important factor in the software industry. Software quality depends upon customer satisfaction, which can be achieved by applying standards. In this era, achieving quality software is very important because of high customer demands. Developed countries are excelling in the software industry and improving day by day, while developing countries like Pakistan struggle with software quality and cannot maintain a reputation in the international market. Software quality lags for many reasons. This paper addresses the problems behind the lack of interest in improving software quality among higher authorities and software assurance teams, and also provides solutions to the addressed problems.

  15. Quality assurance for software important to safety

    International Nuclear Information System (INIS)

    Software applications play an increasingly relevant role in nuclear power plant systems. This is particularly true of software important to safety used in both: calculations for the design, testing and analysis of nuclear reactor systems (design, engineering and analysis software); and monitoring, control and safety functions as an integral part of the reactor systems (monitoring, control and safety system software). Computer technology is advancing at a fast pace, offering new possibilities in nuclear reactor design, construction, commissioning, operation, maintenance and decommissioning. These advances also present new issues which must be considered both by the utility and by the regulatory organization. Refurbishment of ageing instrumentation and control systems in nuclear power plants and new safety related application areas have emerged, with direct (e.g. interfaces with safety systems) and indirect (e.g. operator intervention) implications for safety. Currently, there exist several international standards and guides on quality assurance for software important to safety. However, none of the existing documents provides comprehensive guidance to the developer, manager and regulator during all phases of the software life-cycle. The present publication was developed taking into account the large amount of available documentation, the rapid development of software systems and the need for updated guidance on how to do it. It provides information and guidance for defining and implementing quality assurance programmes covering the entire life-cycle of software important to safety. Expected users are managers, performers and assessors from nuclear utilities, regulatory bodies, suppliers and technical support organizations involved with the development and use of software applied in nuclear power plants

  16. Experiences with Software Quality Metrics in the EMI Middleware

    OpenAIRE

    Alandes, Maria

    2012-01-01

    The EMI Quality Model has been created to define, and later review, the EMI (European Middleware Initiative) software product and process quality. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering – Product Quality standard to identify a set of characteristics that n...

  17. MCNP trademark Software Quality Assurance plan

    International Nuclear Information System (INIS)

    MCNP is a computer code that models the interaction of radiation with matter. MCNP is developed and maintained by the Transport Methods Group (XTM) of the Los Alamos National Laboratory (LANL). This plan describes the Software Quality Assurance (SQA) program applied to the code. The SQA program is consistent with the requirements of IEEE-730.1 and the guiding principles of ISO 900

  18. Software archeology: a case study in software quality assurance and design

    Energy Technology Data Exchange (ETDEWEB)

    Macdonald, John M [Los Alamos National Laboratory; Lloyd, Jane A [Los Alamos National Laboratory; Turner, Cameron J [COLORADO SCHOOL OF MINES

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  19. Evaluating predictive models of software quality

    International Nuclear Information System (INIS)

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to deliver only software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we conclude by suggesting directions for further studies.
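    The release-gating idea (deliver only software whose predicted risk is below an agreed threshold) can be sketched as follows. The metric names, weights, and threshold are invented for illustration; they are not the coefficients of the models evaluated in the article.

```python
import math

WEIGHTS = {"churned_lines": 0.004, "developers": 0.15, "open_bugs": 0.2}
BIAS = -3.0  # hypothetical, pre-fitted coefficients (not from real EMI data)

def defect_risk(metrics):
    """Logistic risk score in (0, 1): 1 / (1 + exp(-(bias + sum(w*x))))."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in metrics.items())
    return 1.0 / (1.0 + math.exp(-z))

def releasable(metrics, threshold=0.3):
    """Gate a release on the agreed risk threshold."""
    return defect_risk(metrics) < threshold

candidate = {"churned_lines": 250, "developers": 4, "open_bugs": 2}
print(f"risk = {defect_risk(candidate):.2f}, ship = {releasable(candidate)}")
# risk = 0.27, ship = True
```

    In practice the coefficients would be fitted on a package's defect history, and the threshold negotiated between developers and release managers.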

  20. Software Quality Improvement in the OMC Team

    CERN Document Server

    Maier, Viktor

    Physicists use self-written software as a tool to fulfill their tasks, and often the developed software is used for several years or even decades. If a software product lives for a long time, it has to be changed and adapted to external influences. This implies that the source code has to be read, understood and modified. The same applies to the software of the Optics Measurements and Corrections (OMC) team at CERN. Their task is to track, analyze and correct the beams in the LHC and other accelerators. To solve this task, they rely on a self-written software base with more than 150,000 physical lines of code. The base is subject to continuous changes as well. Their software does its job and is effective, but regrettably does not run efficiently, because some parts of the source code are in bad shape and of low quality. The implementation could be faster and more memory efficient. In addition, it is difficult to read and understand the code. Source code files and functions are too big and identifiers do not rev...

  1. Software Quality Perceptions of Stakeholders Involved in the Software Development Process

    Science.gov (United States)

    Padmanabhan, Priya

    2013-01-01

    Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and on which of its attributes are more important than others. Although software quality…

  2. Strategic drivers of software quality: beyond external and internal software quality.

    OpenAIRE

    Fitzpatrick, Ronan

    2001-01-01

    Software quality is often considered in terms of the contractual requirements between the supplier and acquirer as described in ISO/IEC 12207 and focuses on software life cycle processes. However, beyond these processes acquirer organisations need to address other issues like complying with new legislation, securing return on investment, and achieving competitive support from their new software investments. Supplier organisations also have issues that they must manage. This paper addresses al...

  3. SAPHIRE 8 Quality Assurance Software Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Kurt G. Vedros

    2011-08-01

    The purpose of this review of software metrics is to examine the quality of the metrics gathered in the 2010 IV&V and to set an outline for results of updated metrics runs to be performed. We find from the review that the maintenance of the accepted quality standards presented in the SAPHIRE 8 initial Independent Verification and Validation (IV&V) of April 2010 is most easily achieved by continuing to utilize the tools used in that effort while adding a metric for bug tracking and resolution. Recommendations from the final IV&V were to continue periodic measurable metrics, such as McCabe's complexity measure, to ensure quality is maintained. The software tools used to measure quality in the IV&V were CodeHealer, Coverage Validator, Memory Validator, Performance Validator, and Thread Validator. These are evaluated based on their capabilities. We attempted to run their latest revisions with the newer Delphi 2010 based SAPHIRE 8 code that has been developed, and were successful with all of the Validator series of tools on small tests. Another recommendation from the IV&V was to incorporate a bug tracking and resolution metric. To improve our capability of producing this metric, we integrated our current web reporting system with the SpiraTest test management software purchased earlier this year to track requirements traceability.
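    McCabe's complexity measure mentioned in the recommendations is straightforward to compute once a control-flow graph is available: M = E - N + 2P. The graph below is a hypothetical function with one branch and one loop, not SAPHIRE code.

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's measure M = E - N + 2P for a control-flow graph."""
    return len(edges) - len(nodes) + 2 * components

# Hypothetical CFG: entry -> condition -> (then | else) -> loop -> exit
nodes = ["entry", "cond", "then", "else", "loop", "exit"]
edges = [("entry", "cond"), ("cond", "then"), ("cond", "else"),
         ("then", "loop"), ("else", "loop"),
         ("loop", "loop"), ("loop", "exit")]
print(cyclomatic_complexity(edges, nodes))  # 7 - 6 + 2 = 3
```

    Tools like CodeHealer derive the graph from the source automatically; a periodic run of such a metric is what the IV&V recommendation amounts to.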

  4. Monitoring Burr Type III Software Quality Using SPC

    Directory of Open Access Journals (Sweden)

    Smitha Chowdary. Ch

    2015-08-01

    The ability of software to satisfy its functional requirements successfully is measured as software reliability, making it one of the most important characteristics of software quality. Improving the software processes employed during the software development life cycle is essential to produce reliable software systems of assured quality. Software Reliability Growth Models (SRGMs) aid software engineers and managers in tracking and measuring the growth in reliability as software is being developed for quality assurance. Software quality is improved by continuously monitoring and controlling the software process. Statistical Process Control (SPC) is the application of statistical methods to software process data, presented graphically to quickly and easily identify anomalies that enable the developer to address software failures. In this paper we propose an SPC mechanism of control charts for time domain data using Burr Type III based on a Non-Homogeneous Poisson Process (NHPP), with parameters estimated by the Maximum Likelihood Estimation (MLE) method.
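    A minimal sketch of the charting mechanism, assuming the usual form of this family of models: the Burr Type III CDF F(t) = (1 + t^-c)^-k, an NHPP mean value function m(t) = a*F(t), and control limits at the conventional 0.00135/0.5/0.99865 probability levels. The parameter values and failure times below are invented; in the paper they would come from MLE on real time-domain data.

```python
a, c, k = 40.0, 2.0, 1.5  # assumed parameter estimates (stand-ins for MLE)

def burr3_cdf(t):
    """Burr Type III CDF: F(t) = (1 + t**-c)**-k."""
    return (1.0 + t ** (-c)) ** (-k)

def mean_value(t):
    """NHPP mean value function m(t) = a * F(t), expected failures by t."""
    return a * burr3_cdf(t)

# Control limits at the 3-sigma-equivalent probability levels.
UCL, CL, LCL = 0.99865 * a, 0.5 * a, 0.00135 * a

failure_times = [5.2, 9.1, 14.7, 22.3, 31.0]  # hypothetical failure data
points = [mean_value(t2) - mean_value(t1)     # successive differences
          for t1, t2 in zip(failure_times, failure_times[1:])]
for p in points:
    print(f"{p:.4f}", "in control" if LCL <= p <= UCL else "out of control")
```

    Points falling outside the limits flag an anomalous failure process that the developer should investigate.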

  5. Software quality assurance plan for viscometer

    Energy Technology Data Exchange (ETDEWEB)

    Gimera, M.

    1994-10-18

    The in situ viscometer is a portable instrument designed to raise and lower a sphere (rheometer ball) through layers of tank waste material while recording ball position, velocity, and cable tension. In the field, the viscometer attaches to a decontamination spool piece which in turn is designed to attach to any 4-inch, 150-pound flange (typical of many available tank risers). The motion of the ball and collection of data is controlled by instrumentation and control equipment housed in a separate remote control console. This document covers the product, Viscometer Data Acquisition Software. This document provides the software quality assurance plan, verification and validation plan, and configuration management plan for developing the software for the instrumentation that will be used to obtain rheology data from Tank SY-101.

  6. Software quality assurance plan for viscometer

    International Nuclear Information System (INIS)

    The in situ viscometer is a portable instrument designed to raise and lower a sphere (rheometer ball) through layers of tank waste material while recording ball position, velocity, and cable tension. In the field, the viscometer attaches to a decontamination spool piece which in turn is designed to attach to any 4-inch, 150-pound flange (typical of many available tank risers). The motion of the ball and collection of data is controlled by instrumentation and control equipment housed in a separate remote control console. This document covers the product, Viscometer Data Acquisition Software. This document provides the software quality assurance plan, verification and validation plan, and configuration management plan for developing the software for the instrumentation that will be used to obtain rheology data from Tank SY-101

  7. Particularities of Quality Management in Software Industry

    OpenAIRE

    Liviu ILIES; Catalin AFRASINEI - ZEVOIANU

    2009-01-01

    Very often the IT domain, through its multidisciplinary orientation and its outcomes, is an essential contributor to the quality assurance of economic bodies, and not only of them. It is difficult nowadays to find an activity sector, or even a sub-sector, where software applications, regardless of their nature, have not marked out their place and contribution to its good economic and social development. In order to contribute as a tool toward the economic and qualitative increase of performance, the tool itself ...

  8. Evaluation Framework for Quality Management Software

    OpenAIRE

    Nadica Hrgarek

    2008-01-01

    Identifying and specifying user requirements is an integral part of information systems design and is critical for the project success. More than 50% of the reasons for the project failure presented in the CHAOS report [36] and study of a US Air Force project by Sheldon et al. [33] are related to requirements. The goal of this paper is to assess the relevant user and software requirements which are the basis for an electronic quality management system selection in medical device companies. Th...

  9. The dynamic of modern software development project management and the software crisis of quality. An integrated system dynamics approach towards software quality improvement

    OpenAIRE

    Nasirikaljahi, Armindokht

    2012-01-01

    The software industry is plagued by cost overruns, delays, poor customer satisfaction and quality issues that are costing clients and customers world-wide billions of dollars each year. The phenomenon is coined "The Software Crisis" and poses a huge challenge for software project management. This thesis addresses one of the core issues of the software crisis, namely software quality. The challenges of software quality are central for understanding the other symptoms of the software crisis. Th...

  10. Further Development of Tieto Software Product Quality Analysis System

    OpenAIRE

    Moisio, Teemu

    2012-01-01

    The definition of software quality, and how one experiences quality, is a multifaceted matter that usually depends entirely on the user group observing the quality from its own perspective. A common way to analyse a software product’s quality is to measure the product’s characteristics, such as usability, reliability, efficiency, expandability, testability and maintainability. Many processes have been developed for analysing a software product’s quality. Using these processes and acting ac...

  11. Towards Resolving Software Quality-in-Use Measurement Challenges

    OpenAIRE

    Atoum, Issa; Bong, Chih How; Kulathuramaiyer, Narayanan

    2015-01-01

    Software quality-in-use captures quality from the user's perspective. It has gained importance in e-learning applications, mobile service based applications and project management tools. Users' decisions on software acquisitions are often ad hoc or based on preference, due to the difficulty of quantitatively measuring software quality-in-use. But why is quality-in-use measurement difficult? Although there are many software quality models, to our knowledge no work surveys the challenges...

  12. Mining Software Quality from Software Reviews: Research Trends and Open Issues

    OpenAIRE

    Atoum, Issa; Otoom, Ahmed

    2016-01-01

    Software review text fragments contain considerably valuable information about user experience, covering a huge set of properties including software quality. Opinion mining, or sentiment analysis, is concerned with analyzing textual user judgments. Applying sentiment analysis to software reviews can yield a quantitative value that represents software quality. Although many software quality methods have been proposed, they are considered difficult to customize, and many of them are limited...

  13. Software Quality Management and Organisational Fit

    Directory of Open Access Journals (Sweden)

    S.H. Nielsen

    1995-11-01

    This paper describes some of the findings of an ongoing ethnographic study of a computer operations section in an Information Technology Centre. The study finds that after an initial period of staff acceptance of prescribed quality management procedures, certain features of organizational culture, structure and power work against continued conformance. Procedures will be modified, firstly to resolve any inconsistencies between the prescribed procedures and strongly held beliefs and values about work practices and organization, and secondly to reduce or eliminate perceived threats. The paper argues that software quality management is based on a unitarist approach to organization, which ignores the plurality of beliefs and work contexts that exist in an organization, and which assumes that organizational features can be managed and changed in predictable ways. This paper suggests that a pluralist approach to organizational analysis helps to reveal the nature and extent of changes required to the quality management system and the requirements for implementing changes.

  14. The Effects of Development Team Skill on Software Product Quality

    Science.gov (United States)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill and experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated with 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics

  15. Investigating the Role of Knowledge gap in enhancing Software Quality

    OpenAIRE

    Ahmed Mehrez

    2013-01-01

    Software quality has always been described as a poorly developed construct. Several reports and much evidence show clear problems related to software quality outputs. Software quality problems therefore constitute the phenomenon investigated in this research question: Why does quality management not achieve its anticipated outcomes in the software industry? This research empirically tests whether the existence of knowledge management gaps can be a reason behind the existence of qual...

  16. Operational excellence (six sigma) philosophy: Application to software quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Lackner, M.

    1997-11-01

    This report contains viewgraphs on the operational excellence (six sigma) philosophy applied to software quality assurance. It outlines the following: the goal of six sigma; six sigma tools; manufacturing vs. administrative processes; software quality assurance document inspections; mapping software quality assurance requirements documents; failure mode effects analysis for requirements documents; measuring the right response variables; and questions.

  17. A cooperative approach to ensure software product quality

    Directory of Open Access Journals (Sweden)

    M Sangeetha

    2011-07-01

    Full Text Available Quality assurance is the process of ensuring that developed software satisfies customer requirements. Among the many software quality attributes, such as functionality, usability, capability and maintainability, reliability is a major factor in assuring the quality of software. Reliability ensures that software is failure free. In this work, we propose to ensure quality through a behavioural model that evaluates business requirements and assigns priorities to quality attributes. The model incorporates behavioural and human perspectives in assessment and is used for the assessment of developed software.

  18. A SELF PROCESS IMPROVEMENT FOR ACHIEVING HIGH SOFTWARE QUALITY

    Directory of Open Access Journals (Sweden)

    DR. SHASHANK.D.JOSHI

    2011-05-01

    Full Text Available Quality has been one of the most important factors in judging any product. Quality means "a degree or grade of excellence or worth". Quality is a term that is usually described using adjectives. Quality has several attributes, some of which can be quantified using metrics. These attributes, such as usability, portability, security, performance and reliability, have different importance in different projects. Different software quality assurance methods and practices have been used in different software projects to attain their target values. Quality is a distinct attribute, and it differs with people's perceptions. Achieving high software quality involves measurement of software metrics and optimization based on estimated values. As software systems grow larger, the complexity of design and implementation increases, which in turn makes them more prone to defects and hence directly affects the quality of the systems. However, in any software project, high quality is always desirable, and many projects have specific quality requirements. Developing high quality software is governed by factors such as people, process, technology and cost. This paper attempts to present a novel approach towards achieving high software quality in various kinds of projects under given constraints.

  19. Software Architecture Quality Evaluation : Approaches in an Industrial Context

    OpenAIRE

    Mårtensson, Frans

    2006-01-01

    Software architecture has been identified as an increasingly important part of software development. The software architecture helps the developer of a software system to define the internal structure of the system. Several methods for evaluating software architectures have been proposed in order to assist the developer in creating a software architecture that will have the potential to fulfil the requirements on the system. Many of the evaluation methods focus on evaluation of a single quality...

  20. Software Configuration Management: The Quality Weakness

    International Nuclear Information System (INIS)

    At the moment it is very difficult to find any process in industry where software is not involved. We trust software to minimize the possibility of process failures. In parallel, the quality and safety requirements of our processes have been improved to satisfactory levels. Let's look around us. Every day, thousands of calculations are carried out by our engineers using computer programs. Hundreds of processes are controlled automatically. Safety margins, limits, operation controls..., are derived from them. The tools begin to control our processes but, who controls the tool? Once they have been installed and once they are running, are they always reliable? NO. If you think that your current systems are satisfactory, we propose you a game in this report. It is just a test. What is your score? Then we revise the concept of Configuration Management and describe an ideal machine: the ''Perpetuum Mobile'' of Configuration. We describe some rules for implementation and improvement, and we comment on the operating experience at ENUSA. (Author)

  1. SAPHIRE 8 Software Quality Assurance Plan

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith

    2010-02-01

    This Quality Assurance (QA) Plan documents the QA activities that will be managed by the INL related to JCN N6423. The NRC developed the SAPHIRE computer code for performing probabilistic risk assessments (PRAs) using a personal computer (PC) at the Idaho National Laboratory (INL) under Job Code Number (JCN) L1429. SAPHIRE started out as a feasibility study for a PRA code to be run on a desktop PC and evolved through several phases into a state-of-the-art PRA code. The developmental activity of SAPHIRE was the result of two concurrent important events: the tremendous expansion of PC software and hardware capability in the 90s and the onset of a risk-informed regulation era.

  2. A Systems Perspective on the Quality Description of Software Components

    OpenAIRE

    Otto Preiss; Alain Wegmann

    2003-01-01

    In this paper we present our rationale for proposing a conceptual model for the description of quality attributes of software artifacts, in particular suited to software components. The scientific foundations for our quality description model are derived from researching systems science for its value to software engineering. In this work we realized that software engineering is concerned with a number of interrelated conceptual as well as concrete systems. Each of them exhibits the basic syste...

  3. Optimization of Software Quality Using Management and Technical Review Techniques

    OpenAIRE

    2014-01-01

    Optimizing the quality of software is a function of the degree of review performed early in the software development process. Reviews detect errors and potential errors early in the software development process. Errors detected early in the software life cycle are the least expensive to correct. Efficient involvement in software inspections and technical reviews helps developers improve their own skills, thereby mitigating the occurrence of errors in the later stages of softwa...

  4. Role of Quality Source Code Documentation in Software Testing

    OpenAIRE

    Prem Parashar; Arvind Kalia; Rajesh Bhatia

    2014-01-01

    Software testing is performed to validate that the software under test meets all requirements. With the increase in software development platforms, developers may commit errors which, if not tested with appropriate test cases, may lead to false confidence in software testing. In this paper, we propose that building quality source code documentation can help in predicting such errors. To validate this proposal, we performed an initial study and found that if software is well documented, a t...

  5. Experiences with Software Quality Metrics in the EMI middleware

    International Nuclear Information System (INIS)

    The EMI Quality Model has been created to define, and later review, the EMI (European Middleware Initiative) software product and process quality. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering – Product Quality to identify a set of characteristics that need to be present in the EMI software. For each software characteristic, such as portability, maintainability, compliance, etc, a set of associated metrics and KPIs (Key Performance Indicators) are identified. This article presents how the EMI Quality Model and the EMI Metrics have been defined in the context of the software quality assurance activities carried out in EMI. It also describes the measurement plan and presents some of the metrics reports that have been produced for the EMI releases and updates. It also covers which tools and techniques can be used by any software project to extract “code metrics” on the status of the software products and “process metrics” related to the quality of the development and support process such as reaction time to critical bugs, requirements tracking and delays in product releases.
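    Process metrics of the kind this abstract mentions (e.g. reaction time to critical bugs) can be derived mechanically from issue-tracker timestamps. A minimal sketch in Python, assuming a hypothetical list of bug records with `severity`, `opened` and `first_response` fields; the field names are illustrative, not taken from the EMI tooling:

    ```python
    from datetime import datetime

    def mean_reaction_hours(bugs):
        """Average time (hours) between a critical bug being opened
        and the first developer response; None if no such bugs."""
        deltas = []
        for bug in bugs:
            if bug["severity"] != "critical" or bug.get("first_response") is None:
                continue  # only critical bugs that received a response count
            opened = datetime.fromisoformat(bug["opened"])
            responded = datetime.fromisoformat(bug["first_response"])
            deltas.append((responded - opened).total_seconds() / 3600.0)
        return sum(deltas) / len(deltas) if deltas else None

    bugs = [
        {"severity": "critical", "opened": "2012-03-01T09:00:00",
         "first_response": "2012-03-01T15:00:00"},   # 6 h
        {"severity": "critical", "opened": "2012-03-02T10:00:00",
         "first_response": "2012-03-02T12:00:00"},   # 2 h
        {"severity": "minor", "opened": "2012-03-03T08:00:00",
         "first_response": "2012-03-03T09:00:00"},   # ignored
    ]
    print(mean_reaction_hours(bugs))  # → 4.0
    ```

    Other process KPIs named in the abstract (requirements tracking, release delays) would follow the same pattern: extract timestamps per item, aggregate, and compare against a target.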

  6. The impact of design complexity on software cost and quality

    OpenAIRE

    Duc, Anh Nguyen

    2010-01-01

    Context: Early prediction of software cost and quality is important for better software planning and controlling. In early development phases, design complexity metrics are considered as useful indicators of software testing effort and some quality attributes. Although many studies investigate the relationship between design complexity and cost and quality, it is unclear what we have learned from these studies, because no systematic synthesis exists to date. Aim: The research presented in thi...

  7. Pragmatic quality metrics for evolutionary software development models

    Science.gov (United States)

    Royce, Walker

    1990-01-01

    Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.

  8. Quality Assurance in Software Development: An Exploratory Investigation in Software Project Failures and Business Performance

    Science.gov (United States)

    Ichu, Emmanuel A.

    2010-01-01

    Software quality is perhaps one of the most sought-after attributes in product development; however, this goal often goes unattained. Problem factors in software development, and how these have affected the maintainability of the delivered software systems, require a thorough investigation. It was, therefore, very important to understand software…

  9. Experiences with Software Quality Metrics in the EMI Middleware

    CERN Document Server

    CERN. Geneva

    2012-01-01

    The EMI Quality Model has been created to define, and later review, the EMI (European Middleware Initiative) software product and process quality. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering – Product Quality to identify a set of characteristics that need to be present in the EMI software. For each software characteristic, such as portability, maintainability, compliance, etc, a set of associated metrics and KPIs (Key Performance Indicators) are identified. This article presents how the EMI Quality Model and the EMI Metrics have been defined in the context of the software quality assurance activities carried out in EMI. It also describes the measurement plan and presents some of the metrics reports that have been produced for the EMI releases and updates. It also covers which tools and techniques can be used by any software project t...

  10. Experiences with Software Quality Metrics in the EMI middleware

    CERN Document Server

    Alandes, M; Meneses, D; Pucciani, G

    2012-01-01

    The EMI Quality Model has been created to define, and later review, the EMI (European Middleware Initiative) software product and process quality. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering – Product Quality to identify a set of characteristics that need to be present in the EMI software. For each software characteristic, such as portability, maintainability, compliance, etc, a set of associated metrics and KPIs (Key Performance Indicators) are identified. This article presents how the EMI Quality Model and the EMI Metrics have been defined in the context of the software quality assurance activities carried out in EMI. It also describes the measurement plan and presents some of the metrics reports that have been produced for the EMI releases and updates. It also covers which tools and techniques can be used by any software project to ...

  11. Systems and software quality the next step for industrialisation

    CERN Document Server

    Wieczorek, Martin; Bons, Heinz

    2014-01-01

    Software and systems quality is playing an increasingly important role in the growth of almost all - profit and non-profit - organisations. Quality is vital to the success of enterprises in their markets. Most small trade and repair businesses use software systems in their administration and marketing processes. Every doctor's surgery is managing its patients using software. Banking is no longer conceivable without software. Aircraft, trucks and cars use more and more software to handle their increasingly complex technical systems. Innovation, competition and cost pressure are always present i

  12. A Software Quality Evaluation System: JT-SQE

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    JT-SQE is a software quality evaluation and measurement system. Its design was based on the Chinese national standards for software product evaluation and quality characteristics. The JT-SQE system consists of two parts. One is the model for software quality measurement, which has a hierarchical structure. The other is the process of requirements definition, measurement and rating. The system is a feasible model for software quality evaluation and measurement, and it has the advantages of a friendly user interface, simple operation, ease of revision and maintenance, and expansible measurements.

  13. The CMS Data Quality Monitoring software experience and future improvements

    CERN Document Server

    De Guio, Federico

    2013-01-01

    The Data Quality Monitoring (DQM) Software proved to be a central tool in the CMS experiment. Its flexibility allowed its integration in several environments: Online, for real-time detector monitoring; Offline, for the final, fine-grained Data Certification; Release Validation, to constantly validate the functionality and the performance of the reconstruction software; and in Monte Carlo productions. The central tool to deliver Data Quality information is a web site for browsing data quality histograms (DQM GUI). In this contribution the usage of the DQM Software in the different environments and its integration in the CMS Reconstruction Software Framework and in all production workflows are presented.

  14. Software Quality Metrics for Geant4: An Initial Assessment

    CERN Document Server

    Ronchieri, Elisabetta; Giacomini, Francesco

    2016-01-01

    In the context of critical applications, such as shielding and radiation protection, ensuring the quality of simulation software they depend on is of utmost importance. The assessment of simulation software quality is important not only to determine its adoption in experimental applications, but also to guarantee reproducibility of outcome over time. In this study, we present initial results from an ongoing analysis of Geant4 code based on established software metrics. The analysis evaluates the current status of the code to quantify its characteristics with respect to documented quality standards; further assessments concern evolutions over a series of release distributions. We describe the selected metrics that quantify software attributes ranging from code complexity to maintainability, and highlight what metrics are most effective at evaluating radiation transport software quality. The quantitative assessment of the software is initially focused on a set of Geant4 packages, which play a key role in a wide...
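    Code metrics like the complexity measures mentioned here can be extracted mechanically from source. Geant4 is C++ and the study's actual metric suite is not reproduced here; purely as an illustration of one such metric, this sketch computes a McCabe-style cyclomatic complexity for Python source with the standard `ast` module ("1 + number of decision points" is one common convention; real metric tools differ in which constructs they count):

    ```python
    import ast

    # Node types treated as decision points (one common, simplified choice).
    BRANCH_NODES = (ast.If, ast.For, ast.While, ast.And, ast.Or,
                    ast.ExceptHandler, ast.IfExp)

    def cyclomatic_complexity(source):
        """McCabe-style complexity: 1 + number of decision points."""
        tree = ast.parse(source)
        return 1 + sum(isinstance(node, BRANCH_NODES)
                       for node in ast.walk(tree))

    src = """
    def classify(x):
        if x < 0:
            return "negative"
        for i in range(x):
            if i % 2 == 0 and i > 2:
                return "even>2"
        return "other"
    """
    # Decision points: two ifs, one for, one `and` → complexity 5.
    print(cyclomatic_complexity(__import__("textwrap").dedent(src)))  # → 5
    ```

    Tracking such a metric across a series of releases, as the study does, turns a one-off snapshot into a maintainability trend.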

  15. The 7 C's for Creating Living Software: A Research Perspective for Quality Oriented Software Engineering.

    NARCIS (Netherlands)

    Aksit, Mehmet

    2004-01-01

    This article proposes the 7 C's for realizing quality-oriented software engineering practices. All the desired qualities of this approach are expressed in short by the term living software. The 7 C's are: Concern-oriented processes, Canonical models, Composable models, Certifiable models, Constructible

  16. 2003 SNL ASCI applications software quality engineering assessment report.

    Energy Technology Data Exchange (ETDEWEB)

    Schofield, Joseph Richard, Jr.; Ellis, Molly A.; Williamson, Charles Michael; Bonano, Lora A.

    2004-02-01

    This document describes the 2003 SNL ASCI Software Quality Engineering (SQE) assessment of twenty ASCI application code teams and the results of that assessment. The purpose of this assessment was to determine code team compliance with the Sandia National Laboratories ASCI Applications Software Quality Engineering Practices, Version 2.0 as part of an overall program assessment.

  17. A pattern framework for software quality assessment and tradeoff analysis

    NARCIS (Netherlands)

    Folmer, Eelke; Boscht, Jan

    2007-01-01

    The earliest design decisions often have a significant impact on software quality and are the most costly to revoke. One of the challenges in architecture design is to reduce the frequency of retrofit problems in software designs; not being able to improve the quality of a system cost effectively, a

  18. Software quality assurance and software safety in the Biomed Control System

    International Nuclear Information System (INIS)

    The Biomed Control System is a hardware/software system used for the delivery, measurement and monitoring of heavy-ion beams in the patient treatment and biology experiment rooms in the Bevalac at the Lawrence Berkeley Laboratory (LBL). This paper describes some aspects of this system, including historical background, philosophy, configuration management, hardware features that facilitate software testing, software testing procedures, the release of new software, quality assurance, safety and operator monitoring. 3 refs

  19. Software Quality Evaluation for Evolving Systems in Distributed Development Environments

    OpenAIRE

    Jabangwe, Ronald

    2015-01-01

    Context: There is an overwhelming prevalence of companies developing software in global software development (GSD) contexts. The existing body of knowledge, however, falls short of providing comprehensive empirical evidence on the implication of GSD contexts on software quality for evolving software systems. Therefore there is limited evidence to support practitioners that need to make informed decisions about ongoing or future GSD projects. Objective: This thesis work seeks to explore change...

  20. A software quality model and metrics for risk assessment

    Science.gov (United States)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric, and their usability and applicability, are discussed.
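    Quality models of this kind are often operationalized as a weighted aggregation of normalized attribute measurements, with risks flagged where an attribute falls below a threshold. A minimal sketch, using hypothetical weights and attribute names rather than the specific model or metrics from the paper:

    ```python
    # Hypothetical weights for quality attributes (must sum to 1.0).
    # These names and values are illustrative, not taken from the paper.
    WEIGHTS = {"reliability": 0.4, "maintainability": 0.3, "efficiency": 0.3}

    def quality_score(measurements, weights=WEIGHTS):
        """Weighted aggregate of attribute measurements normalized to [0, 1]."""
        assert abs(sum(weights.values()) - 1.0) < 1e-9
        return sum(weights[attr] * measurements[attr] for attr in weights)

    def risk_flags(measurements, threshold=0.5):
        """Attributes scoring below the threshold are flagged as risks."""
        return sorted(attr for attr, v in measurements.items() if v < threshold)

    m = {"reliability": 0.9, "maintainability": 0.4, "efficiency": 0.7}
    print(round(quality_score(m), 2))  # → 0.69
    print(risk_flags(m))               # → ['maintainability']
    ```

    The interesting design question, which the abstract hints at, is which attributes are both important to the project and quantifiable; the aggregation itself is trivial once that choice is made.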

  1. Software Quality - Introduction to the Special Theme

    NARCIS (Netherlands)

    Cleve, A.; Vinju, J.J.

    2014-01-01

    The introduction of fast and cheap computer and networking hardware enables the spread of software. Software, in a nutshell, represents an unprecedented ability to channel creativity and innovation. The joyful act of simply writing computer programs for existing ICT infrastructure can change the wor

  2. INTELLIGENT MANAGEMENT OF THE QUALITY OF SYSTEMS BY SOLVING A GENERALIZED ASSIGNMENT PROBLEM WITH THE USE OF ASC-ANALYSIS AND "EIDOS-X++" SYSTEM

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2015-05-01

    Full Text Available The quality of a system is seen as an emergent property of systems, due to their composition and structure, and it reflects their functionality, reliability and cost. Therefore, when we speak about quality management, the purpose of management is the formation of pre-defined system properties in the object of management. The more strongly the object of control expresses its system properties, the more strongly the object's nonlinearity manifests itself: both the dependence of the management factors on each other, and the dependence of the results of the action of some factors on the actions of others. Therefore, the problem of quality management is that in the management process the management object itself changes qualitatively, i.e. it changes its level of consistency, its degree of determinism and its transfer function. This problem can be viewed as several tasks: the first is the system identification of the condition of the object of management; the second is making decisions about controlling influences that change the composition of the control object so that its quality increases maximally at minimum cost. To solve the second task we have proposed selecting the components of the object by function, based on the resources allocated for the implementation of different functions, the costs associated with the choice of the components, and the degree of compliance of the various components with their functional purpose. In fact, we have proposed a formulation and solution of a new generalization of a variant of the assignment problem, the "multi-knapsack" problem, which differs from the known variant in that the selection is based not only on resources and costs, but also on the degree of compliance of the components with their functional purpose.
A mathematical model, which provides a solution to the first task, and reflecting the degree of compliance of the components to their functionality, as well as the entire

  3. An Approach to Early Prediction of Software Quality

    Institute of Scientific and Technical Information of China (English)

    YAO Lan; YANG Bo

    2007-01-01

    Due to the rapid development of computers and their applications, early software quality prediction in software industry becomes more and more crucial. Software quality prediction model is very helpful for decision-makings such as the allocation of resource in module verification and validation. Nevertheless, due to the complicated situations of software development process in the early stage, the applicability and accuracy of these models are still under research. In this paper, a software quality prediction model based on a fuzzy neural network is presented, which takes into account both the internal factors and external factors of software. With hybrid-learning algorithm, the proposed model can deal with multiple forms of data as well as incomplete information, which helps identify design errors early and avoid expensive rework.

  4. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    Science.gov (United States)

    Lee, Pen-Nan

    1991-01-01

    Previously, several research tasks have been conducted, some observations were obtained, and several possible suggestions have been contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. Also, a brief discussion is given on the role of software quality assurance in software engineering along with some observations and suggestions. A brief discussion on a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors are also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.

  5. A Framework for Rapid and Systematic Software Quality Assessment

    OpenAIRE

    Brandtner, Martin

    2015-01-01

    Software quality assessment monitors and guides the evolution of a software system based on quality measurements. Continuous Integration (CI) environments can provide measurement data to feed such continuous assessments. However, in modern CI environments, data is scattered across multiple CI tools (e.g., build tool, version control system). Even small quality assessments can become extremely time-consuming, because each stakeholder has to seek out the data she needs. In this thesis, we int...

  6. A Methodology for Software Design Quality Assessment of Design Enhancements

    OpenAIRE

    Sahar Reda; Hany Ammar; Osman Hegazy

    2012-01-01

    The most important measure that must be considered in any software product is its design quality. Measuring the design quality in the early stages of software development is the key to developing and enhancing quality software. Research on object-oriented design metrics has produced a large number of metrics that can be measured to identify design problems and assess design quality attributes. However, the use of these design metrics is limited in practice due to the difficulty of measuring and using a ...

  7. Criteria and tools for scientific software quality measurements

    International Nuclear Information System (INIS)

    Not all software used in the nuclear industry needs the rigorous formal verification, reliability testing and quality assessment that are being applied to safety critical software. Recently, however, there is increasing recognition that systematic and objective quality assessment of the scientific software used in design and safety analyses of nuclear facilities is necessary to support safety and licensing decisions. Because of the complexity and large size of these programs and the resource constraints faced by the AECB reviewer, it is desirable that appropriate automated tools are used wherever practical. To objectively assess the quality of software, a set of attributes of a software product by which its quality is described and evaluated must be established. These attributes must be relevant to the application domain of software under evaluation. To effectively assess the quality of software, metrics defining quantitative scale and method appropriate to determine the value of attributes need to be applied. To cost-effectively perform the evaluation, use of suitable automated tools is desirable. In this project, criteria for evaluating the quality of scientific software are presented; metrics for which those criteria can be evaluated are identified; a survey of automated tools to measure those metrics was conducted and the most appropriate tool (QA Fortran) was acquired; and the tool usage was demonstrated on three sample programs. (author) 5 refs

  8. Software Quality Assurance-Challenges in Launch Vehicle Projects

    Directory of Open Access Journals (Sweden)

    Poofa Gopalan

    2006-01-01

    Full Text Available Launch vehicle projects now depend on software, more than ever before, to ensure safety and efficiency. Such critical software systems, whose failure can lead to injury, destruction or loss of vital equipment and human lives, and damage to the environment, must be developed and verified with a high level of quality and reliability. An overview of current quality practices pursued in launch vehicle projects is presented in this paper. These practices have played a vital role in the successful launch vehicle missions of the Indian Space Research Organisation. As the complexity of software increases, the activity that gets affected is nothing but software quality assurance (SQA). The SQA team faces many challenges in current practice. This paper brings out such challenges in the different phases of the software life cycle. A set of key pointers to techniques and tools that could contribute to meeting the software quality assurance challenges in launch vehicle projects is also discussed.

  9. NIF Projects Controls and Information Systems Software Quality Assurance Plan

    International Nuclear Information System (INIS)

    Quality achievement for the National Ignition Facility (NIF) and the National Ignition Campaign (NIC) is the responsibility of the NIF Projects line organization as described in the NIF and Photon Science Directorate Quality Assurance Plan (NIF QA Plan). This Software Quality Assurance Plan (SQAP) is subordinate to the NIF QA Plan and establishes quality assurance (QA) activities for the software subsystems within Controls and Information Systems (CIS). This SQAP implements an activity level software quality assurance plan for NIF Projects as required by the LLNL Institutional Software Quality Assurance Program (ISQAP). Planned QA activities help achieve, assess, and maintain appropriate quality of software developed and/or acquired for control systems, shot data systems, laser performance modeling systems, business applications, industrial control and safety systems, and information technology systems. The objective of this SQAP is to ensure that appropriate controls are developed and implemented for management planning, work execution, and quality assessment of the CIS organization's software activities. The CIS line organization places special QA emphasis on rigorous configuration control, change management, testing, and issue tracking to help achieve its quality goals.

  10. NIF Projects Controls and Information Systems Software Quality Assurance Plan

    Energy Technology Data Exchange (ETDEWEB)

    Fishler, B

    2011-03-18

    Quality achievement for the National Ignition Facility (NIF) and the National Ignition Campaign (NIC) is the responsibility of the NIF Projects line organization as described in the NIF and Photon Science Directorate Quality Assurance Plan (NIF QA Plan). This Software Quality Assurance Plan (SQAP) is subordinate to the NIF QA Plan and establishes quality assurance (QA) activities for the software subsystems within Controls and Information Systems (CIS). This SQAP implements an activity level software quality assurance plan for NIF Projects as required by the LLNL Institutional Software Quality Assurance Program (ISQAP). Planned QA activities help achieve, assess, and maintain appropriate quality of software developed and/or acquired for control systems, shot data systems, laser performance modeling systems, business applications, industrial control and safety systems, and information technology systems. The objective of this SQAP is to ensure that appropriate controls are developed and implemented for management planning, work execution, and quality assessment of the CIS organization's software activities. The CIS line organization places special QA emphasis on rigorous configuration control, change management, testing, and issue tracking to help achieve its quality goals.

  11. Structure of AscE and Induced Burial Regions in AscE and AscG upon Formation of the Chaperone Needle-subunit Complex of Type III Secretion System in Aeromonas Hydrophila

    Energy Technology Data Exchange (ETDEWEB)

    Tan, Y.; Yu, H; Leung, K; Sivaraman, J; Mok, Y

    2008-01-01

    In the type III secretion system (T3SS) of Aeromonas hydrophila, the putative needle complex subunit AscF requires both putative chaperones AscE and AscG for formation of a ternary complex to avoid premature assembly. Here we report the crystal structure of AscE at 2.7 Å resolution and the mapping of buried regions of AscE, AscG, and AscF in the AscEG and AscEFG complexes using limited protease digestion. The dimeric AscE is comprised of two helix-turn-helix monomers packed in an antiparallel fashion. The N-terminal 13 residues of AscE are buried only upon binding with AscG, but this region is found to be nonessential for the interaction. AscE functions as a monomer and can be coexpressed with AscG or with both AscG and AscF to form soluble complexes. The AscE binding region of AscG in the AscEG complex is identified to be within the N-terminal 61 residues of AscG. The exposed C-terminal substrate-binding region of AscG in the AscEG complex is induced to be buried only upon binding to AscF. However, the N-terminal 52 residues of AscF remain exposed even in the ternary AscEFG complex. On the other hand, the 35-residue C-terminal region of AscF in the complex is resistant to protease digestion in the AscEFG complex. Site-directed mutagenesis showed that two C-terminal hydrophobic residues, Ile83 and Leu84, of AscF are essential for chaperone binding.

  12. The effect of competition from open source software on the quality of proprietary software in the presence of network externalities

    OpenAIRE

    Mingqing Xing

    2015-01-01

    Purpose: A growing number of open source software emerges in many segments of the software market. In addition, software products usually exhibit network externalities. The purpose of this paper is to study the impact of open source software on the quality choices of proprietary software vendors when the market presents positive network externalities. Design/methodology: To analyze how open source software affects the optimal quality of proprietary software, this paper constructs two vertical...

  13. Tool Use Within NASA Software Quality Assurance

    Science.gov (United States)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  14. Improvement of Key Problems of Software Testing in Quality Assurance

    CERN Document Server

    Iqbal, Nayyar

    2012-01-01

    Quality assurance makes sure the project will be completed according to the previously approved specifications, standards and functionality, without defects and avoidable problems; it monitors and tries to improve the development process from the start of the project. Software Quality Assurance (SQA) spans the entire software development process, which includes software design, coding, source code control, code review, change management, configuration management and release management. In this paper we describe solutions for the key problems of software testing in quality assurance. Existing software practice has problems in three areas: testing practices, the attitude of users and the culture of organizations. These three problem areas produce combined symptoms such as shortcuts in testing, reduction in testing time, and poor documentation. We recommend strategies to solve the problems identified above.

  15. MEASURING THE QUALITY OF OBJECT ORIENTED SOFTWARE MODULARIZATION

    Directory of Open Access Journals (Sweden)

    Sunil L. Bangare

    2011-01-01

    Full Text Available We propose a system to measure the quality of modularization of an object-oriented software system. Our work is organized in three modules:
    Module 1: defining metrics for object-oriented software and an accompanying algorithm
    Module 2: code parser
    Module 3: code analyzer
    In this paper we focus on Module 1 of our work, that is, defining metrics for object-oriented software modularization and providing an algorithm for it.
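The abstract does not give the paper's actual metrics; as one hedged illustration of what a modularization metric can look like, the sketch below scores the fraction of class-level dependencies that stay inside their module (all names and numbers are invented).

```python
# Illustrative sketch (not the paper's metrics): quantify modularization
# quality as the share of class dependencies that are intra-module.

def modularization_quality(modules, dependencies):
    """modules: dict mapping module name -> set of class names.
    dependencies: set of (class_a, class_b) pairs, meaning a depends on b.
    Returns the fraction of dependencies internal to a module (0..1);
    higher values suggest better encapsulation."""
    owner = {cls: mod for mod, classes in modules.items() for cls in classes}
    if not dependencies:
        return 1.0
    internal = sum(1 for a, b in dependencies if owner[a] == owner[b])
    return internal / len(dependencies)

modules = {"ui": {"Window", "Button"}, "core": {"Parser", "Analyzer"}}
deps = {("Window", "Button"), ("Parser", "Analyzer"), ("Window", "Parser")}
print(round(modularization_quality(modules, deps), 2))  # → 0.67 (2 of 3 internal)
```

A real code parser and analyzer (the paper's Modules 2 and 3) would extract `modules` and `dependencies` from source code automatically.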

  16. A purely functional combinator language for software quality assessment

    OpenAIRE

    Martins, Pedro; Fernandes, João Paulo; Saraiva, João

    2012-01-01

    Quality assessment of open source software is becoming an important and active research area. One of the reasons for this recent interest is the consequence of Internet popularity. Nowadays, programming also involves looking for the large set of open source libraries and tools that may be reused when developing our software applications. In order to reuse such open source software artifacts, programmers not only need the guarantee that the reused artifact is certified, but a...

  17. In Search of Quality--Educational Software.

    Science.gov (United States)

    Lauterbach, Roland

    1989-01-01

    Based on the results of an assessment of 241 teaching-learning programs in biology, chemistry, physics, general science (primary level), and informatics, this article discusses possible criteria for the assessment of educational software. Offers suggestions for potential users, for practitioners of teacher education, and for the developers of…

  18. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    Science.gov (United States)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product is what determines its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper covers methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  19. Problems in Software Quality Assurance and Reasons

    OpenAIRE

    Mohammed Khalaf M Alshammri

    2013-01-01

    This paper is aimed at highlighting the problems faced by project managers as well as companies regarding quality assurance. It has been seen that people do not pay much attention to quality assurance issues and thus eventually end up wasting their money as well as their time. That is why it is important to make sure that the project meets the quality requirements.

  20. Development and Application of New Quality Model for Software Projects

    OpenAIRE

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of...

  1. Software Quality Assurance-Challenges in Launch Vehicle Projects

    OpenAIRE

    Poofa Gopalan; S.S. Uma Sankari; D. Mohan Kumar; R. Vikraman Nair

    2006-01-01

    Launch vehicle projects now depend on software, more than ever before, to ensure safety and efficiency. Such critical software systems, which can lead to injury, destruction or loss of vital equipment, human lives, and damage to the environment, must be developed and verified with a high level of quality and reliability. An overview of current quality practices pursued in launch vehicle projects is presented in this paper. These practices have played a vital role in the successful launch vehicle mission...

  2. Mathematical Principles in Software Quality Engineering

    CERN Document Server

    Singh, Manoranjan Kumar

    2010-01-01

    Mathematics has many useful properties for developing complex software systems. One is that it can exactly describe a physical situation of an object or the outcome of an action. Mathematics supports abstraction and is an excellent medium for modeling; since it is an exact medium, there is little possibility of ambiguity. This paper demonstrates that mathematics provides a high level of validation when it is used as a software medium. It also outlines distinguishing characteristics of structural testing, which is based on the source code of the program under test. Structural testing methods are very amenable to rigorous definition, mathematical analysis and precise measurement. Finally, it also discusses the functional versus structural testing debate to arrive at a sense of complete testing. Any program can be considered a function in the sense that program inputs form its domain and program outputs form its range. In general, discrete mathematics is more applicable to functional testing, while graph theory pertain...

  3. Quality Classifiers for Open Source Software Repositories

    OpenAIRE

    Tsatsaronis, George; Halkidi, Maria; Giakoumakis, Emmanouel A.

    2009-01-01

    Open Source Software (OSS) often relies on large repositories, like SourceForge, for initial incubation. The OSS repositories offer a large variety of meta-data providing interesting information about projects and their success. In this paper we propose a data mining approach for training classifiers on the OSS meta-data provided by such data repositories. The classifiers learn to predict the successful continuation of an OSS project. The `successfulness' of projects is defined in terms of th...
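A hypothetical sketch of the classification idea: train on repository meta-data to predict whether a project continues successfully. The feature names and numbers below are invented, and a tiny nearest-centroid rule stands in for the paper's actual data-mining models.

```python
# Toy stand-in for a classifier trained on OSS repository meta-data.
# Features and labels are invented for illustration only.

def train_centroids(samples, labels):
    """samples: list of feature vectors; labels: parallel list of 0/1.
    Returns the per-class mean feature vector (centroid)."""
    centroids = {}
    for lbl in set(labels):
        group = [s for s, l in zip(samples, labels) if l == lbl]
        centroids[lbl] = [sum(col) / len(group) for col in zip(*group)]
    return centroids

def predict(centroids, x):
    """Assign x to the class with the nearest centroid (squared distance)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda lbl: dist(centroids[lbl]))

# Hypothetical features: [monthly downloads (k), commits last year (k), contributors]
train_x = [[50, 2.0, 30], [40, 1.5, 25], [1, 0.1, 2], [2, 0.05, 1]]
train_y = [1, 1, 0, 0]  # 1 = project continued successfully
model = train_centroids(train_x, train_y)
print(predict(model, [45, 1.8, 28]))  # near the "successful" centroid
```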

  4. Quality Management Systems in Australian Software Houses: some problems of sustaining creativity in the software process

    OpenAIRE

    Liisa von Hellens

    1995-01-01

    Software houses are taking steps towards the implementation of quality management systems (QMS) and achieving certification to international quality standards. There is an increasing tendency to require quality certificates from system suppliers before business can be even considered. The QMS is seen as a way of avoiding personnel risk if product and market knowledge remains in the possession of individuals. It is also felt that quality procedures in place will improve the company's image, at...

  5. Sandia National Laboratories ASCI Applications Software Quality Engineering Practices

    Energy Technology Data Exchange (ETDEWEB)

    ZEPPER, JOHN D.; ARAGON, KATHRYN MARY; ELLIS, MOLLY A.; BYLE, KATHLEEN A.; EATON, DONNA SUE

    2003-04-01

    This document provides a guide to the deployment of the software verification activities, software engineering practices, and project management principles that guide the development of Accelerated Strategic Computing Initiative (ASCI) applications software at Sandia National Laboratories (Sandia). The goal of this document is to identify practices and activities that will foster the development of reliable and trusted products produced by the ASCI Applications program. Document contents include an explanation of the structure and purpose of the ASCI Quality Management Council, an overview of the software development lifecycle, an outline of the practices and activities that should be followed, and an assessment tool.

  6. Sandia National Laboratories ASCI Applications Software Quality Engineering Practices

    Energy Technology Data Exchange (ETDEWEB)

    ZEPPER, JOHN D.; ARAGON, KATHRYN MARY; ELLIS, MOLLY A.; BYLE, KATHLEEN A.; EATON, DONNA SUE

    2002-01-01

    This document provides a guide to the deployment of the software verification activities, software engineering practices, and project management principles that guide the development of Accelerated Strategic Computing Initiative (ASCI) applications software at Sandia National Laboratories (Sandia). The goal of this document is to identify practices and activities that will foster the development of reliable and trusted products produced by the ASCI Applications program. Document contents include an explanation of the structure and purpose of the ASCI Quality Management Council, an overview of the software development lifecycle, an outline of the practices and activities that should be followed, and an assessment tool. These sections map practices and activities at Sandia to the ASCI Software Quality Engineering: Goals, Principles, and Guidelines, a Department of Energy document.

  7. Software quality studies using analytical metric analysis

    OpenAIRE

    Rodríguez Martínez, Cecilia

    2013-01-01

    Engineering companies currently devote a large amount of resources to the detection and correction of errors in their software code. These errors are generally due to mistakes made by developers when writing the code or its specifications. No tool is capable of detecting all of these errors, and some of them go unnoticed even after the testing process. For this reason, numerous investigations have tried to find indicators in the code...

  8. ISO9126 BASED SOFTWARE QUALITY EVALUATION USING CHOQUET INTEGRAL

    Directory of Open Access Journals (Sweden)

    Abdelkareem M. Alashqar

    2015-01-01

    Full Text Available Evaluating software quality is an important and essential issue in the development process because it helps to deliver a competitive software product. A decision of selecting the best software based on quality attributes is a type of multi-criteria decision-making (MCDM) process in which interactions among criteria should be considered. This paper presents and develops quantitative evaluations that take such interactions among criteria into account. The aggregator methods Arithmetic Mean (AM) and Weighted Arithmetic Mean (WAM) are introduced, described and compared to the Choquet Integral (CI) approach, an aggregation with respect to a fuzzy measure, used as a new method for MCDM. The comparisons are shown by evaluating and ranking software alternatives based on the six main quality attributes identified by the ISO 9126-1 standard. The evaluation experiments depend on real data collected from case studies.
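The three aggregators the abstract compares can be sketched directly from their definitions. The capacity (fuzzy measure) values below are invented for illustration; in practice they would be elicited from experts.

```python
# Minimal sketch of AM, WAM and the discrete Choquet integral.
# Capacity values are invented, not taken from the paper.

def arithmetic_mean(x):
    return sum(x) / len(x)

def weighted_mean(x, w):
    return sum(xi * wi for xi, wi in zip(x, w))  # weights sum to 1

def choquet(x, mu):
    """Discrete Choquet integral of scores x (criteria indexed 0..n-1)
    w.r.t. a capacity mu: dict from frozensets of indices to [0, 1],
    with mu[empty set] = 0 and mu[all criteria] = 1."""
    order = sorted(range(len(x)), key=lambda i: x[i])  # ascending scores
    total, prev = 0.0, 0.0
    for k, i in enumerate(order):
        coalition = frozenset(order[k:])  # criteria scoring >= x[i]
        total += (x[i] - prev) * mu[coalition]
        prev = x[i]
    return total

# Two interacting criteria, e.g. functionality (0) and reliability (1).
mu = {frozenset(): 0.0, frozenset({0}): 0.3, frozenset({1}): 0.3,
      frozenset({0, 1}): 1.0}  # complementary: both are needed together
scores = [0.9, 0.5]
print(round(arithmetic_mean(scores), 3),
      round(weighted_mean(scores, [0.5, 0.5]), 3),
      round(choquet(scores, mu), 3))  # → 0.7 0.7 0.62
```

With this complementary capacity the Choquet integral penalizes the unbalanced score profile, which the plain and weighted means cannot express.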

  9. A Quality Based Method to Analyze Software Architectures

    Directory of Open Access Journals (Sweden)

    Farzaneh Hoseini Jabali

    2011-07-01

    Full Text Available In order to produce and develop a software system, it is necessary to have a method of choosing a suitable software architecture which satisfies the required quality attributes and maintains a trade-off between sometimes conflicting ones. Each software architecture includes a set of design decisions for each of which there are various alternatives, satisfying the quality attributes differently. At the same time various stakeholders with various quality goals participate in decision-making. In this paper a numerical method is proposed that based on the quality attributes selects the suitable software architecture for a certain software. In this method, for each design decision, different alternatives are compared in view of a certain quality attribute, and the other way around. Multi-criteria decision-making methods are used and, at the same time, time and cost constraints are considered in decision-making, too. The proposed method applies the stakeholders' opinions in decision-making according to the degree of their importance and helps the architect to select the best software architecture with more certainty.
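A minimal numeric sketch of the kind of stakeholder-weighted scoring the abstract describes. All attribute names, weights and scores below are invented; the paper's actual method also handles pairwise comparisons and time and cost constraints.

```python
# Invented example: rank candidate architectures by a weighted sum of
# quality-attribute scores, with attribute weights aggregated from
# stakeholders according to their importance.

attributes = ["performance", "modifiability", "security"]

# Each stakeholder states attribute weights; importance scales their vote.
stakeholders = [
    {"importance": 0.6,
     "weights": {"performance": 0.5, "modifiability": 0.2, "security": 0.3}},
    {"importance": 0.4,
     "weights": {"performance": 0.2, "modifiability": 0.6, "security": 0.2}},
]

# Candidate architectures scored per attribute on a 0..1 scale.
candidates = {
    "layered":      {"performance": 0.6, "modifiability": 0.9, "security": 0.7},
    "event-driven": {"performance": 0.9, "modifiability": 0.5, "security": 0.6},
}

# Combined attribute weights (importance-weighted average of opinions).
agg = {a: sum(s["importance"] * s["weights"][a] for s in stakeholders)
       for a in attributes}

def score(arch):
    return sum(agg[a] * candidates[arch][a] for a in attributes)

best = max(candidates, key=score)
print(best, round(score(best), 3))  # → layered 0.734
```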

  10. Aspect-Oriented Software Quality Model: The AOSQ Model

    Directory of Open Access Journals (Sweden)

    Pankaj Kumar

    2012-04-01

    Full Text Available Nowadays, software development has become more complex and dynamic; software is expected to be more flexible, scalable and reusable. Under the umbrella of aspects, Aspect-Oriented Software Development (AOSD) is a relatively modern programming paradigm for improving modularity in software development. An Aspect-Oriented Programming (AOP) language implements crosscutting concerns through the introduction of a new construct, the Aspect, which, like a Class, is defined as a modular unit of crosscutting behavior that affects multiple classes and can be factored into reusable modules. Several quality models for measuring the quality of software are available in the literature. However, as software keeps being developed and new environments (i.e., AOP) are adopted, the issue of evolvability arises. After the evolution of a system, we have to find out: How extensible does the new system need to be? What is its configurable status? Are the design patterns stable under the new environment and technology? How sustainable is the new system? The objective of this paper is to propose a new quality model for AOSD that integrates some new quality attributes into the AOSQUAMO Model, which is in turn based on the ISO/IEC 9126 Quality Model; the result is called the Aspect-Oriented Software Quality (AOSQ) Model. The Analytic Hierarchy Process (AHP) is used to evaluate an improved hierarchical quality model for AOSD.
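The AHP evaluation step mentioned at the end can be illustrated with the common geometric-mean approximation of the priority vector; the pairwise judgments below are invented and do not come from the paper.

```python
# Hedged AHP sketch: derive attribute weights from a reciprocal pairwise
# comparison matrix (Saaty 1-9 scale) via the geometric-mean approximation
# of the principal eigenvector. Matrix values are invented.

import math

def ahp_weights(matrix):
    """matrix[i][j] = how much more important criterion i is than j."""
    n = len(matrix)
    geo = [math.prod(row) ** (1.0 / n) for row in matrix]  # row geometric means
    total = sum(geo)
    return [g / total for g in geo]  # normalized priority vector

# Three hypothetical quality attributes: evolvability, modularity, reusability.
pairwise = [
    [1.0, 3.0, 5.0],   # evolvability moderately-to-strongly preferred
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])  # evolvability gets the largest weight
```

A full AHP study would also check the consistency ratio of the judgments before accepting the weights.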

  11. 2011 SAPHIRE 8 Software Quality Assurance Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Kurt G. Vedros

    2011-09-01

    The Software Quality Assurance engineer position was created in fiscal year 2011 to better maintain and improve the quality of the SAPHIRE 8 development program. This year's Software Quality Assurance tasks concentrated on developing the framework of the SQA program. This report reviews the accomplishments and recommendations for each of the subtasks set forth for JCN V6059: (1) Reviews, Tests, and Code Walkthroughs; (2) Data Dictionary; (3) Metrics; (4) Requirements Traceability Matrix; (5) Provide Oversight on SAPHIRE QA Activities; and (6) Support NRC Presentations and Meetings.

  12. A Methodology for Software Design Quality Assessment of Design Enhancements

    Directory of Open Access Journals (Sweden)

    Sahar Reda

    2012-12-01

    Full Text Available The most important measure that must be considered in any software product is its design quality. Measuring the design quality in the early stages of software development is the key to developing and enhancing quality software. Research on object-oriented design metrics has produced a large number of metrics that can be measured to identify design problems and assess design quality attributes. However, the use of these design metrics is limited in practice due to the difficulty of measuring and using a large number of metrics. This paper presents a methodology for software design quality assessment. This methodology helps the designer to measure and assess the changes in design due to design enhancements. The goal of this paper is to illustrate the methodology using practical software design examples and analyze its utility in industrial projects. Finally, we present a case study to illustrate the methodology.

  13. Spectrum analysis on quality requirements consideration in software design documents

    OpenAIRE

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-01-01

    Abstract Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called “spectrum analysi...

  14. ARCHITECTURE SOFTWARE SOLUTION TO SUPPORT AND DOCUMENT MANAGEMENT QUALITY SYSTEM

    Directory of Open Access Journals (Sweden)

    Milan Eric

    2010-12-01

    Full Text Available One of the foundations of the JUS ISO 9000 series of standards is quality system documentation. The architecture of the quality system documentation depends on the complexity of the business system. Establishing efficient management of quality system documentation is of great importance for the business system, both in the phase of introducing the quality system and in further stages of its improvement. The study describes the architecture and capability of software solutions to support and manage the quality system documentation in accordance with the requirements of standards ISO 9001:2001, ISO 14001:2005, HACCP, etc.

  15. Assessing quality in software development: An agile methodology approach

    Directory of Open Access Journals (Sweden)

    V. Rodríguez-Hernández

    2015-06-01

    Full Text Available A novel methodology, the result of 10 years of in-field testing, which makes possible the convergence of different types of models and quality standards for Engineering and Computer Science Faculties, is presented. Since most software-developing companies are small and medium sized, the projects developed must focus on SCRUM and Extreme Programming (XP), as opposed to RUP, which is quite heavy, as well as on the Personal Software Process (PSP) and Team Software Process (TSP), which provide students with competences and a structured framework. The ISO 90003:2004 standard is employed to define the processes by means of a quality system without adding new requirements or changing the existing ones. Also, the model is based on ISO/IEC 25000 (ISO/IEC 9126 – ISO/IEC 14598) to allow comparing software built using different metrics.

  16. FCE: A QUALITY METRIC FOR COTS BASED SOFTWARE DESIGN

    Directory of Open Access Journals (Sweden)

    M.V.VIJAYA SARADHI,

    2010-05-01

    Full Text Available Component-based software development aims at building large software systems by combining existing software components. Before integrating different components, one first needs to determine whether the functional and non-functional properties of the different components are suitable and whether they should be integrated to develop the new system or software. Deriving a quality measure for reusable components has proven to be a challenging task. This paper proposes a quality metric that provides benefits at both the project and process levels, namely Fault Clearance Effectiveness (FCE). The paper identifies the different characteristics that a component should have so that it can be reused again and again. Component qualification is a process of determining the fitness for use of existing components that will be used to develop a new system.
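The abstract does not state the FCE formula itself. A closely related, widely used effectiveness ratio is Defect Removal Efficiency (DRE), sketched here only to illustrate the kind of measure such metrics compute; this is not necessarily the paper's FCE.

```python
# DRE: fraction of defects cleared before release. Shown as a generic
# example of a fault-clearance effectiveness ratio, not the paper's metric.

def defect_removal_efficiency(found_before_release, found_after_release):
    total = found_before_release + found_after_release
    return found_before_release / total if total else 1.0

# e.g. 95 defects cleared in-house, 5 escaped to the field
print(defect_removal_efficiency(95, 5))  # → 0.95
```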

  17. Color Image Quality in Presentation Software

    Directory of Open Access Journals (Sweden)

    María S. Millán

    2008-11-01

    Full Text Available The color image quality of presentation programs is evaluated and measured using S-CIELAB and CIEDE2000 color difference formulae. A color digital image in its original format is compared with the same image already imported by the program and introduced as a part of a slide. Two widely used presentation programs—Microsoft PowerPoint 2004 for Mac and Apple's Keynote 3.0.2—are evaluated in this work.
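As a simpler stand-in for the S-CIELAB/CIEDE2000 pipeline used in the paper, the sketch below computes the classic CIE76 colour difference (plain Euclidean distance in CIELAB), which CIEDE2000 refines with lightness, chroma and hue corrections. The Lab values are invented.

```python
# CIE76 delta-E: Euclidean distance between two (L*, a*, b*) colours.
# A difference around 2.3 is often cited as just noticeable.

import math

def delta_e_cie76(lab1, lab2):
    return math.dist(lab1, lab2)

original = (52.0, 42.5, -20.1)   # hypothetical pixel in the source image
imported = (51.2, 40.9, -19.4)   # same pixel after import into a slide
print(round(delta_e_cie76(original, imported), 2))  # → 1.92
```

The paper's S-CIELAB step additionally applies spatial filtering that models human vision before computing the per-pixel colour difference.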

  18. Color Image Quality in Presentation Software

    OpenAIRE

    Edison Valencia; María S. Millán

    2008-01-01

    The color image quality of presentation programs is evaluated and measured using S-CIELAB and CIEDE2000 color difference formulae. A color digital image in its original format is compared with the same image already imported by the program and introduced as a part of a slide. Two widely used presentation programs—Microsoft PowerPoint 2004 for Mac and Apple's Keynote 3.0.2—are evaluated in this work.

  19. Software Industry-oriented Education with Embedded Quality Assurance Mechanisms

    Institute of Scientific and Technical Information of China (English)

    HUSSEY Matt; WU Bing

    2012-01-01

    This paper presents a broad range of suggestions on the concept of quality-assured industry-oriented higher education in software engineering, a central theme of the annual CEISIE (CEISEE this year) workshops since the first one held in Harbin, China, in 2005. It draws on the lessons of collaborative experiences involving academics and industrialists from Europe and China. These experiences make the case for a strong role for software industry-oriented higher education in the production of the software architects, developers and engineers required for the future.

  20. Evaluating the Effect of Software Quality Characteristics on Health Care Quality Indicators

    Directory of Open Access Journals (Sweden)

    Sakineh Aghazadeh

    2015-07-01

    Full Text Available Introduction: Various types of software are used in health care organizations to manage information and care processes. The quality of software has been an important concern for both health authorities and designers of Health Information Technology. Thus, assessing the effect of software quality on the performance quality of healthcare institutions is essential. Method: The most important health care quality indicators in relation to software quality characteristics are provided via an already performed literature review. The ISO 9126 standard model is used for the definition and integration of various characteristics of software quality. The effects of software quality characteristics and sub-characteristics on the healthcare indicators are evaluated through expert opinion analyses. A questionnaire comprising 126 questions on a 10-point Likert scale was used to gather the opinions of experts in the field of Medical/Health Informatics. The data was analyzed using Structural Equation Modeling. Results: Our findings showed that software Maintainability was rated as the most effective factor on user satisfaction (R2 = 0.89) and Functionality as the most important and independent variable affecting patient care quality (R2 = 0.98). Efficiency was considered the most effective factor on workflow (R2 = 0.97), and Maintainability the most important factor affecting healthcare communication (R2 = 0.95). Usability and Efficiency were rated as the most effective factors affecting patient satisfaction (R2 = 0.80, 0.81). Reliability, Maintainability, and Efficiency were considered the main factors affecting care costs (R2 = 0.87, 0.74, 0.87). Conclusion: We presented a new model based on ISO standards. The model demonstrates and weighs the relations between software quality characteristics and healthcare quality indicators.
The clear relationships between variables and the type of the metrics and measurement methods used in the model make it a reliable method to assess

  1. Assuring Quality and Reliability in Complex Avionics Systems hardware & Software

    Directory of Open Access Journals (Sweden)

    V. Haridas

    1997-01-01

    Full Text Available It is conventional wisdom in defence systems that electronic brains are where much of the present and future weapons system capability is developed. Electronic hardware advances, particularly in microprocessors, allow highly complex and sophisticated software to provide a high degree of system autonomy and customisation to the mission at hand. Since modern military systems are so dependent on the proper functioning of electronics, the quality and reliability of electronic hardware and software have a profound impact on defensive capability and readiness. At the hardware level, due to the advances in microelectronics, the functional capabilities of today's systems have increased. The advances in the hardware field have an impact on software as well: nowadays, it is possible to incorporate more and more system functions through software rather than going for a pure hardware solution. On the other hand, the complexity of these systems is increasing, their working energy levels are decreasing, and the areas of reliability and quality assurance are becoming ever wider. This paper covers major failure modes in microelectronic devices. The various techniques used to improve component and system reliability are described. The recent trends in expanding the scope of traditional quality assurance techniques are also discussed, considering both hardware and software.

  2. Software quality assurance plan for void fraction instrument

    Energy Technology Data Exchange (ETDEWEB)

    Gimera, M.

    1994-10-18

    Waste Tank SY-101 has been the focus of extensive characterization work over the past few years. The waste continually generates gases, most notably hydrogen, which are periodically released from the waste. Gas can be trapped in tank waste in three forms: as void gas (bubbles), dissolved gas, or absorbed gas. Void fraction is the volume percentage of a given sample that is comprised of void gas. The void fraction instrument (VFI) acquires the data necessary to calculate void fraction. This document covers the product, Void Fraction Data Acquisition Software. The void fraction software being developed will have the ability to control the void fraction instrument hardware and acquire data necessary to calculate the void fraction in samples. This document provides the software quality assurance plan, verification and validation plan, and configuration management plan for developing the software for the instrumentation that will be used to obtain void fraction data from Tank SY-101
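The definition in the abstract translates directly into arithmetic; a minimal sketch (the sample numbers are invented, and the actual VFI software of course does far more, including hardware control).

```python
# Void fraction as defined above: the volume percentage of a sample
# that is void gas (bubbles). Sample numbers are invented.

def void_fraction_percent(void_gas_volume_ml, sample_volume_ml):
    return 100.0 * void_gas_volume_ml / sample_volume_ml

print(void_fraction_percent(12.5, 250.0))  # → 5.0
```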

  3. QualitySpy: a framework for monitoring software development processes

    Directory of Open Access Journals (Sweden)

    Marian Jureczko

    2012-03-01

    Full Text Available The growing popularity of highly iterative, agile processes creates an increasing need for automated monitoring of the quality of software artifacts, focused on short terms (in the case of the eXtreme Programming process, an iteration can be as short as one week). This paper presents a framework that calculates software metrics and cooperates with development tools (e.g. the source version control system and the issue tracking system) to describe the current state of a software project with regard to its quality. The framework is designed to support a high level of automation of data collection and to be useful for researchers as well as for industry. The framework is still under development; hence the paper reports features that have already been implemented as well as future plans. The first release is scheduled for July.

  4. Software Quality Attribute Analysis by Architecture Reconstruction (SQUA3RE)

    NARCIS (Netherlands)

    Stormer, C.

    2007-01-01

    Software Quality Attribute Analysis by Architecture Reconstruction (SQUA3RE) is a method that fosters a goal-driven process to evaluate the impact of what-if scenarios on existing systems. The method is partitioned into SQA2 and ARE. The SQA2 part provides the analysis models that can be used for q

  5. 77 FR 25168 - Appraisal Subcommittee (ASC); ASC Rules of Operation; Amended

    Science.gov (United States)

    2012-04-27

    ... Federal Housing Finance Agency. The ASC Rules of Operation serve as corporate bylaws outlining the ASC's... amended numerous provisions in Title XI. The ASC Rules of Operation serve as corporate bylaws...

  6. Quality assessment with the AGIR software results and experience

    International Nuclear Information System (INIS)

    Purpose: To evaluate whether a new software package from the working group for interventional radiology (AGIR) is an appropriate tool for quality assurance in interventional radiology, and to present results acquired within the quality improvement process in 1999. Patients and methods: AGIR-defined parameters such as patient data, risk profile, performed interventions, and complications were registered by recently developed software. Based on monthly data analyses, possible complications were identified and discussed in morbidity and mortality conferences. Results: 1014 interventions were performed in our institution in 1999. According to criteria established by AGIR, the complication rate was 2.7%. In addition, and according to SCVIR criteria, complications were classified quantitatively into five classes and semiquantitatively into minor and major groups. The result was a minor complication rate of 1.8% and a major rate of 0.9%. There were no cases of death associated with the interventions. Further strategies were developed in order to reduce the complication rate. Conclusion: Extensive quality assurance methods can be integrated into daily routine work. These methods lead to greater transparency of treatment results and allow the implementation of continuous quality improvements. The development of the software is a first step in establishing a nation-wide quality assurance system. Nevertheless, modification and additional definition of the AGIR predefined parameters are required, for example, to avoid unnecessary procedures. (orig.)

  7. A Topic Modeling Based Solution for Confirming Software Documentation Quality

    Directory of Open Access Journals (Sweden)

    Nouh Alhindawi

    2016-02-01

    Full Text Available This paper presents an approach for evaluating and confirming the quality of external software documentation using topic modeling. Typically, the quality of the external documentation has to mirror precisely the organization of the source code. Therefore, the elements of such documentation should be well written, associated, and presented. In this paper, we use Latent Dirichlet Allocation (LDA) and the Hellinger distance to compute the similarities between fragments of source code and the external documentation topics. These similarities are used to improve and advance the existing external documentation. Furthermore, these similarities can also be used for evaluating the new documenting process during the evolution phase of the software. The results show that the new approach yields state-of-the-art performance in evaluating and confirming the quality of existing external documentation.
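
    The pairing of LDA topic distributions with the Hellinger distance described above can be sketched as follows; the topic vectors are invented toy values, not output of the paper's actual LDA runs:

    ```python
    from math import sqrt

    def hellinger(p, q):
        """Hellinger distance between two discrete probability distributions
        (0 = identical, 1 = no overlap)."""
        return sqrt(sum((sqrt(a) - sqrt(b)) ** 2 for a, b in zip(p, q))) / sqrt(2)

    # Toy topic distributions (e.g. as inferred by LDA) for a source-code
    # fragment and a section of the external documentation.
    code_topics = [0.70, 0.20, 0.10]
    doc_topics = [0.65, 0.25, 0.10]

    similarity = 1.0 - hellinger(code_topics, doc_topics)  # close to 1 = well matched
    ```

    A documentation section whose similarity to its code fragment falls below some chosen threshold would then be flagged for improvement.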

  8. The software product assurance metrics study: JPL's software systems quality and productivity

    Science.gov (United States)

    Bush, Marilyn W.

    1989-01-01

    The findings are reported of the Jet Propulsion Laboratory (JPL)/Software Product Assurance (SPA) Metrics Study, conducted as part of a larger JPL effort to improve software quality and productivity. Until recently, no comprehensive data had been assembled on how JPL manages and develops software-intensive systems. The first objective was to collect data on software development from as many projects and for as many years as possible. Results from five projects are discussed. These results reflect 15 years of JPL software development, representing over 100 data points (systems and subsystems), over a third of a billion dollars, over four million lines of code and 28,000 person months. Analysis of this data provides a benchmark for gauging the effectiveness of past, present and future software development work. In addition, the study is meant to encourage projects to record existing metrics data and to gather future data. The SPA long term goal is to integrate the collection of historical data and ongoing project data with future project estimations.

  9. Dynamic cardiac phantoms for use in computer software quality control

    International Nuclear Information System (INIS)

    A pilot study was initiated to obtain and implement a similar set of clinical dynamic cardiac studies (software phantoms) on different computer systems for the purpose of quality control of analysis software. Normal and abnormal gated blood pool studies were collected and transferred between six computer systems using serial transmission. Major impediments in attempting to analyse the transferred data files were incomplete or missing data records required for the calculations. Only the left ventricular ejection fraction (LVEF) parameter could be analysed on all six computers. The LVEF results obtained for 10 software phantoms using the commercial software were similar in some phantoms but widely divergent in others. Development of software phantoms still requires improvement in data transfer between computers in order to ensure a complete file content in the transferred study, and a solution for the differences in acquisition protocols. In the meantime users can start to obtain their own set of standard studies illustrative of various clinical disorders, and share these with other users with the same computer type and analysis software. (author). 4 refs, 1 tab

  10. Advanced Stirling Convertor (ASC) Technology Maturation

    Science.gov (United States)

    Wong, Wayne A.; Wilson, Scott; Collins, Josh; Wilson, Kyle

    2016-01-01

    The Advanced Stirling Convertor (ASC) development effort was initiated by NASA Glenn Research Center with contractor Sunpower, Inc., to develop high-efficiency thermal-to-electric power conversion technology for NASA Radioisotope Power Systems (RPSs). Early successful performance demonstrations led to the expansion of the project as well as adoption of the technology by the Department of Energy (DOE) and system integration contractor Lockheed Martin Space Systems Company as part of the Advanced Stirling Radioisotope Generator (ASRG) flight project. The ASRG integrates a pair of ASCs to convert the heat from a pair of General Purpose Heat Source (GPHS) modules into electrical power. The expanded NASA ASC effort included development of several generations of ASC prototypes or engineering units to help prepare the ASC technology and Sunpower for flight implementation. Sunpower later had two parallel contracts allowing the last of the NASA engineering units called ASC-E3 to serve as pathfinders for the ASC-F flight convertors being built for DOE. The ASC-E3 convertors utilized the ASC-F flight specifications and were built using the ASC-F design and process documentation. Shortly after the first ASC-F pair achieved initial operation, due to budget constraints, the DOE ASRG flight development contract was terminated. NASA continues to invest in the development of Stirling RPS technology including continued production of the ASC-E3 convertors, seven of which have been delivered with one additional unit in production. Starting in fiscal year 2015, Stirling Convertor Technology Maturation has been reorganized as an element of the RPS Stirling Cycle Technology Development (SCTD) Project and long-term plans for continued Stirling technology advancement are in reformulation. This paper provides a status on the ASC project, an overview of advancements made in the design and production of the ASC at Sunpower, and a summary of acceptance tests, reliability tests, and tactical

  11. Quantification frameworks and their application for evaluating the software quality factor using quality characteristic value

    International Nuclear Information System (INIS)

    Many safety-related problems frequently occur because digital instrumentation and control systems are widely used and are expanding their range of application in nuclear power plants. There is, however, no generally accepted position on how to estimate an appropriate level of software quality. Thus, the Quality Characteristic Value, a software quality factor applied through each phase of the software life cycle, is suggested in this paper. The Quality Characteristic Value is obtained by the following procedure: 1) scoring quality characteristic factors (especially correctness, traceability, completeness, and understandability) against Software Verification and Validation results; 2) deriving a diamond-shaped graph by plotting the value of each factor on its own axis and connecting the points; and 3) measuring the area of the graph, which gives the Quality Characteristic Value. In this paper, this methodology is applied to a plant control system. In addition, the series of quantification frameworks exhibits some good characteristics from the viewpoint of software quality factors. More than anything else, it is believed that the introduced framework may be applicable to regulatory guides and software approval procedures, owing to its soundness and simple characteristics. (authors)
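
    The area computation in step 3 reduces to simple geometry if the four factor scores are plotted on perpendicular axes; the 0-to-1 scoring scale below is an assumption, since the abstract does not fix one:

    ```python
    def quality_characteristic_value(correctness, traceability, completeness, understandability):
        """Area of the diamond obtained by plotting the four factor scores on
        perpendicular axes and connecting the points: four right triangles,
        each of area 0.5 * (product of the two adjacent scores)."""
        a, b, c, d = correctness, traceability, completeness, understandability
        return 0.5 * (a * b + b * c + c * d + d * a)

    # With scores normalised to [0, 1], the value ranges from 0 to 2.
    qcv = quality_characteristic_value(0.9, 0.8, 0.95, 0.85)
    ```

    A larger area then indicates better overall quality across the four characteristics.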

  12. A Framework for Analyzing Software Quality using Hierarchical Clustering

    Directory of Open Access Journals (Sweden)

    Arashdeep Kaur

    2011-02-01

    Full Text Available Fault-proneness data available early in the software life cycle, from previous releases or similar kinds of projects, will aid in improving software quality estimations. Various techniques have been proposed in the literature, including statistical methods, machine learning methods, neural network techniques, and clustering techniques, for the prediction of faulty and non-faulty modules in a project. In this study, a hierarchical clustering algorithm is trained and tested with life cycle data collected from the NASA projects CM1, PC1, and JM1 as predictive models. These predictive models contain requirement metrics and static code metrics. We have combined the requirement metric model with the static code metric model to obtain a fusion metric model. Further, we have investigated which of the three prediction models is the best on the basis of fault detection. The basic hypothesis of software quality estimation is that automatic quality prediction models enable verification experts to concentrate their attention and resources on problem areas of the system under development. The proposed approach has been implemented in MATLAB 7.4. The results show that when all the prediction techniques are evaluated, the best prediction model is found to be the fusion metric model. This proposed model is also compared with other quality models available in the literature and is found to be efficient for predicting faulty modules.
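
    A minimal stand-in for the study's approach, clustering modules by their metrics and labelling clusters from known fault data, might look like this (toy metrics and a naive single-linkage implementation, not the NASA datasets or the MATLAB code used in the paper):

    ```python
    def single_linkage_clusters(points, k):
        """Naive agglomerative (single-linkage) clustering down to k clusters,
        returning lists of point indices."""
        clusters = [[i] for i in range(len(points))]

        def dist(i, j):
            return sum((a - b) ** 2 for a, b in zip(points[i], points[j])) ** 0.5

        while len(clusters) > k:
            best = None
            for x in range(len(clusters)):
                for y in range(x + 1, len(clusters)):
                    d = min(dist(i, j) for i in clusters[x] for j in clusters[y])
                    if best is None or d < best[0]:
                        best = (d, x, y)
            _, x, y = best
            clusters[x] += clusters[y]  # merge the closest pair of clusters
            del clusters[y]
        return clusters

    # Toy static-code metrics (LOC, cyclomatic complexity) and known fault labels.
    metrics = [(120, 3), (130, 4), (125, 3), (900, 40), (950, 38)]
    faulty = [0, 0, 0, 1, 1]

    clusters = single_linkage_clusters(metrics, 2)
    # Label each cluster faulty if the majority of its members were faulty;
    # new modules falling into a "faulty" cluster would then get extra scrutiny.
    labels = {frozenset(c): int(sum(faulty[i] for i in c) > len(c) / 2) for c in clusters}
    ```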

  13. Software Tools for Electrical Quality Assurance in the LHC

    CERN Document Server

    Bednarek, Mateusz

    2011-01-01

    There are over 1600 superconducting magnet circuits in the LHC machine. Many of them consist of a large number of components electrically connected in series. This enhances the sensitivity of the whole circuit to electrical faults in individual components. Furthermore, the circuits are equipped with a large number of instrumentation wires, which are exposed to accidental damage or swapping. In order to ensure safe operation, an Electrical Quality Assurance (ELQA) campaign is needed after each thermal cycle. Due to the complexity of the circuits, as well as their distant geographical distribution (a tunnel of 27 km circumference divided into 8 sectors), suitable software and hardware platforms had to be developed. The software combines an Oracle database, LabView data acquisition applications and PHP-based web follow-up tools. This paper describes the software used for the ELQA of the LHC.

  14. First statistical analysis of Geant4 quality software metrics

    Science.gov (United States)

    Ronchieri, Elisabetta; Grazia Pia, Maria; Giacomini, Francesco

    2015-12-01

    Geant4 is a simulation system of particle transport through matter, widely used in several experimental areas from high energy physics and nuclear experiments to medical studies. Some of its applications may involve critical use cases; therefore they would benefit from an objective assessment of the software quality of Geant4. In this paper, we provide a first statistical evaluation of software metrics data related to a set of Geant4 physics packages. The analysis aims at identifying risks for Geant4 maintainability, which would benefit from being addressed at an early stage. The findings of this pilot study set the grounds for further extensions of the analysis to the whole of Geant4 and to other high energy physics software systems.

  15. Automated Theorem Proving in High-Quality Software Design

    Science.gov (United States)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e. a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the area of aviation, (nuclear) power plants or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the area of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in this area are not only subject to safety considerations; security issues are important as well. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in software engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like the illegal "cloning" of smart cards for D2 GSM handsets, or the extraction of (secret) passwords of German T-Online users, show that serious flaws can happen in this area as well. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure the high quality of the software and lead to truly safe and secure systems. In this paper, we will look at how far automated theorem proving can contribute to a more widespread application of formal methods and their tools, and at what automated theorem provers (ATPs) must provide in order to be useful.

  16. Software application for quality control protocol of mammography systems

    International Nuclear Information System (INIS)

    Considering the fact that the quality control of the technological process of the mammographic system involves testing a large number of parameters, there is clearly a need to use information technology for gathering, processing and storing all the parameters that result from this process. The main goal of this software application is the facilitation and automation of the gathering, processing, storing and presenting of the data related to the qualification of the physical and technical parameters during the quality control of the technological process of the mammographic system. The software application, along with its user interface and database, has been made with the Microsoft Access 2003 application, which is part of the Microsoft Office 2003 software package and was chosen as the development platform because it is the most commonly used office application among computer users in the country. This is important because it provides the end users with a familiar environment to work in, without the need for additional training or improvement of their existing computer skills. Most importantly, the software application is easy to use, fast in calculating the needed parameters, and an excellent way to store and display the results. There is a possibility of scaling up this software solution so that it can be used by many different users at the same time over the Internet. It is highly recommended that this system be implemented as soon as possible in the quality control process of mammographic systems due to its many advantages.(Author)

  17. Quality assurance (QA) procedures for software: Evaluation of an ADC quality system

    International Nuclear Information System (INIS)

    Image viewing and processing software in computed radiography manipulates image contrast in such a way that all relevant image features are rendered to an appropriate degree of visibility, and improves image quality using enhancement algorithms. The purpose of this study was to investigate procedures for the quality assessment of image processing software for computed radiography with the use of existing test objects, and to assess the influence that processing introduces on physical image quality characteristics. Measurements of high-contrast resolution, low-contrast resolution, spatial resolution, grey scale (characteristic curve) and geometric distortion were performed 'subjectively' by three independent observers and 'objectively' by the use of criteria based on pixel intensity values. Results show that quality assessment using digital images is possible without the need for human evaluators. It was found that the processing software evaluated in this study was able to improve some aspects of image quality without introducing geometric distortion. (authors)

  18. Data quality: Some comments on the NASA software defect datasets

    OpenAIRE

    Shepperd, M; Song, Q.; Sun, Z.; Mair, C.

    2013-01-01

    Background-Self-evidently empirical analyses rely upon the quality of their data. Likewise, replications rely upon accurate reporting and using the same rather than similar versions of datasets. In recent years, there has been much interest in using machine learners to classify software modules into defect-prone and not defect-prone categories. The publicly available NASA datasets have been extensively used as part of this research. Objective-This short note investigates the extent to which p...

  19. Model Based Development of Quality-Aware Software Services

    OpenAIRE

    Miguel Cabello, Miguel Angel de; Massonet, Philippe; Silva Gallino, Juan Pedro; Fernández Briones, Javier

    2008-01-01

    Modelling languages and development frameworks give support for functional and structural description of software architectures. But quality-aware applications require languages which allow expressing QoS as a first-class concept during architecture design and service composition, and to extend existing tools and infrastructures adding support for modelling, evaluating, managing and monitoring QoS aspects. In addition to its functional behaviour and internal structure, the developer of each s...

  20. Requirements Prioritization: Challenges and Techniques for Quality Software Development

    OpenAIRE

    Muhammad Abdullah Awais

    2016-01-01

    Every organization is aware of the consequences and importance of requirements for the development of a quality software product, whether local or global. The requirement engineering phase of development, with a focus on the prioritization of requirements, is under huge research every day because in any development methodology, all requirements cannot be implemented at the same time, so requirements are prioritized to be implemented to give a solution as early as possible in phases as scheduled in increm...

  1. An approach to software quality assurance for robotic inspection systems

    International Nuclear Information System (INIS)

    Software quality assurance (SQA) for robotic systems used in nuclear waste applications is vital to ensure that the systems operate safely and reliably and pose a minimum risk to humans and the environment. This paper describes the SQA approach for the control and data acquisition system for a robotic system being developed for remote surveillance and inspection of underground storage tanks (UST) at the Hanford Site

  2. An Empirical Study on the Procedure to Derive Software Quality Estimation Models

    OpenAIRE

    Jie Xu; Danny Ho; Luiz Fernando Capretz

    2010-01-01

    Software quality assurance has been a heated topic for several decades. If factors that influence software quality can be identified, they may provide more insight for better software development management. More precise quality assurance can be achieved by employing resources according to accurate quality estimation at the early stages of a project. In this paper, a general procedure is proposed to derive software quality estimation models and various techniques are presented to accomplish t...

  3. Software quality for 1997 - what works and what doesn't?

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Burlington, MA (United States)

    1997-11-01

    This presentation provides a view of software quality for 1997 - what works and what doesn't. For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels.

  4. PARTICULARITIES OF QUALITY EVALUATION IN A SOFTWARE COMPANY

    Directory of Open Access Journals (Sweden)

    Cătălin AFRĂSINEI

    2010-01-01

    Full Text Available Quality management is a much discussed and disputed management domain nowadays, and this is the first sign that it is a very modern, needed and present concept in theory and practice. Some see it as a solution for preparing things the way they are needed, and as the instrument which might guarantee a proper environment for keeping them in a specified and constant form. The application of quality management is a quality management system that has to be designed, developed and implemented to achieve the aim of quality. The purpose of this article is to briefly present what a quality management system should mean in a software company, why it should be periodically evaluated, and how this might be done. The second part points out the characteristics of the audit as a general evaluation instrument, and the main contribution consists of the author's endeavour to mark out the particularities of an audit process carried out in a software company, considering that such particularization increases the chances of succeeding with such an activity more easily and earlier in practice.

  5. Ambulatory Surgical Center (ASC) Payment System

    Data.gov (United States)

    U.S. Department of Health & Human Services — This file contains a summary of service utilization by ASC supplier and is derived from 2011 ASC line item level data, updated through June 2012, that is, line...

  6. Quality-driven multi-objective optimization of software architecture design: method, tool, and application

    OpenAIRE

    Etemadi Idgahi (Etemaadi), Ramin

    2014-01-01

    Software architecting is a non-trivial and demanding task for software engineers to perform. The architecture is a key enabler for software systems. Besides being crucial for user functionality, the software architecture has deep impact on software qualities such as performance, safety, and cost. In this dissertation, an automated approach for software architecture design is proposed that supports analysis and optimization of multiple quality attributes: First of all, we demonstrate an optimi...

  7. Quality factors in the life cycle of software oriented to safety systems in nuclear power plants

    International Nuclear Information System (INIS)

    The inclusion of software in safety-related systems for nuclear power plants makes it necessary to introduce the concept of software quality assurance. Software quality can be defined as the degree of conformance between the software and the specified requirements and user expectations. To guarantee a certain level of software quality, it is necessary to carry out a systematic and planned set of tasks that constitute a software quality assurance plan. The application of such a plan involves activities that should be performed all along the software life cycle and that can be evaluated through the so-called quality factors, due to the fact that quality itself cannot be measured directly, but only indirectly through some of its manifestations. In this work, a software life cycle model is proposed for nuclear power plant safety-related systems. A set of software quality factors is also proposed, with its corresponding classification according to the proposed model. (author)

  8. NARAC SOFTWARE QUALITY ASSURANCE: ADAPTING FORMALISM TO MEET VARYING NEEDS

    Energy Technology Data Exchange (ETDEWEB)

    Walker, H; Nasstrom, J S; Homann, S G

    2007-11-20

    The National Atmospheric Release Advisory Center (NARAC) provides tools and services that predict and map the spread of hazardous material accidentally or intentionally released into the atmosphere. NARAC is a full function system that can meet a wide range of needs with a particular focus on emergency response. The NARAC system relies on computer software in the form of models of the atmosphere and related physical processes supported by a framework for data acquisition and management, user interface, visualization, communications and security. All aspects of the program's operations and research efforts are predicated to varying degrees on the reliable and correct performance of this software. Consequently, software quality assurance (SQA) is an essential component of the NARAC program. The NARAC models and system span different levels of sophistication, fidelity and complexity. These different levels require related but different approaches to SQA. To illustrate this, two different levels of software complexity are considered in this paper. As a relatively simple example, the SQA procedures that are being used for HotSpot, a straight-line Gaussian model focused on radiological releases, are described. At the other extreme, the SQA issues that must be considered and balanced for the more complex NARAC system are reviewed.
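
    As an illustration of the simpler end of that spectrum, a straight-line Gaussian plume model of the kind HotSpot represents can be sketched in a few lines; the dispersion-coefficient power law below is a generic textbook-style parameterisation, not HotSpot's actual coefficients:

    ```python
    from math import exp, pi

    def gaussian_plume(q, u, x, y, z, h, a=0.22, b=0.20):
        """Ground-reflected straight-line Gaussian plume concentration at (x, y, z)
        metres downwind/crosswind/above ground, for a release rate q (g/s) in a
        wind of u (m/s) from effective height h (m). Dispersion widths sigma_y,
        sigma_z grow with downwind distance via an assumed power law."""
        sigma_y = a * x ** 0.9
        sigma_z = b * x ** 0.85
        lateral = exp(-y ** 2 / (2 * sigma_y ** 2))
        # Sum of direct term and ground-reflection image term.
        vertical = (exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
                    + exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))
        return q / (2 * pi * u * sigma_y * sigma_z) * lateral * vertical

    # Centerline ground-level concentration 1 km downwind of a 1 g/s release.
    c_1km = gaussian_plume(q=1.0, u=5.0, x=1000.0, y=0.0, z=0.0, h=10.0)
    c_2km = gaussian_plume(q=1.0, u=5.0, x=2000.0, y=0.0, z=0.0, h=10.0)
    ```

    Even for such a compact model, SQA matters: a sign error in the reflection term or a mis-typed coefficient silently changes every dose estimate downstream.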

  9. Quality-driven multi-objective optimization of software architecture design : method, tool, and application

    NARCIS (Netherlands)

    Etemadi Idgahi (Etemaadi), Ramin

    2014-01-01

    Software architecting is a non-trivial and demanding task for software engineers to perform. The architecture is a key enabler for software systems. Besides being crucial for user functionality, the software architecture has deep impact on software qualities such as performance, safety, and cost.

  10. Effective Implementation of Agile Practices - Object Oriented Metrics Tool to Improve Software Quality

    Directory of Open Access Journals (Sweden)

    K. Nageswara Rao

    2012-08-01

    Full Text Available Maintaining the quality of the software is the major challenge in the process of software development. Software inspections, which use methods like structured walkthroughs and formal code reviews, involve careful examination of each and every aspect/stage of software development. In Agile software development, refactoring helps to improve software quality. Refactoring is a technique to improve the internal structure of software without changing its behaviour. After much study regarding the ways to improve software quality, our research proposes an object oriented software metric tool called "MetricAnalyzer". This tool has been tested on different codebases and proved to be very useful.
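
    The abstract does not list the specific metrics the tool computes; as a hypothetical example of the kind of object-oriented metric such a tool might report, the following counts methods per class using Python's `ast` module:

    ```python
    import ast

    def methods_per_class(source):
        """Count methods per class -- a simple object-oriented size metric.
        A class with many methods is a common refactoring candidate."""
        tree = ast.parse(source)
        return {
            node.name: sum(isinstance(n, ast.FunctionDef) for n in node.body)
            for node in ast.walk(tree)
            if isinstance(node, ast.ClassDef)
        }

    code = """
    class Queue:
        def push(self, x): ...
        def pop(self): ...
    """
    per_class = methods_per_class(code)
    ```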

  11. Analysis of quality raw data of second generation sequencers with Quality Assessment Software

    Science.gov (United States)

    2011-01-01

    Background: Second generation technologies have advantages over Sanger sequencing; however, they have resulted in new challenges for the genome construction process, especially because of the small size of the reads, despite the high degree of coverage. Independent of the program chosen for the construction process, DNA sequences are superimposed, based on identity, to extend the reads, generating contigs; mismatches indicate a lack of homology and are not included. This process improves our confidence in the sequences that are generated. Findings: We developed Quality Assessment Software, with which one can review graphs showing the distribution of quality values from the sequencing reads. This software allows us to adopt more stringent quality standards for sequence data, based on quality-graph analysis and estimated coverage after applying the quality filter, providing acceptable sequence coverage for genome construction from short reads. Conclusions: Quality filtering is a fundamental step in the process of constructing genomes, as it reduces the frequency of incorrect alignments that are caused by measuring errors, which can occur during the construction process due to the size of the reads, provoking misassemblies. Application of quality filters to sequence data, using the software Quality Assessment, along with graphing analyses, provided greater precision in the definition of cutoff parameters, which increased the accuracy of genome construction. PMID:21501521
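
    The core of such a quality filter, dropping reads whose mean Phred score falls below a cutoff and re-estimating coverage from what survives, can be sketched as follows (toy reads and an assumed genome size, not the software's actual implementation):

    ```python
    def mean_phred(quality_string, offset=33):
        """Mean Phred score of a read's quality string (Sanger/Illumina 1.8+
        encoding: ASCII code minus 33)."""
        return sum(ord(c) - offset for c in quality_string) / len(quality_string)

    def filter_reads(reads, min_quality=20):
        """Keep only (sequence, quality) pairs whose mean quality meets the cutoff."""
        return [(seq, qual) for seq, qual in reads if mean_phred(qual) >= min_quality]

    # Two toy reads: 'I' encodes Q40 (high quality), '!' encodes Q0 (junk).
    reads = [("ACGT", "IIII"), ("ACGT", "!!!!")]
    kept = filter_reads(reads)

    # Coverage remaining after filtering, for an assumed genome size of 8 bp.
    coverage = sum(len(seq) for seq, _ in kept) / 8
    ```

    Plotting the distribution of `mean_phred` values across all reads, as the software's quality graphs do, is what guides the choice of `min_quality`.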

  12. Daily quality assurance software for a satellite radiometer system

    Science.gov (United States)

    Keegstra, P. B.; Smoot, G. F.; Bennett, C. L.; Aymon, J.; Backus, C.; Deamici, G.; Hinshaw, G.; Jackson, P. D.; Kogut, A.; Lineweaver, C.

    1992-01-01

    Six Differential Microwave Radiometers (DMR) on COBE (Cosmic Background Explorer) measure the large-angular-scale isotropy of the cosmic microwave background (CMB) at 31.5, 53, and 90 GHz. Quality assurance software analyzes the daily telemetry from the spacecraft to ensure that the instrument is operating correctly and that the data are not corrupted. Quality assurance for DMR poses challenging requirements. The data are differential, so a single bad point can affect a large region of the sky, yet the CMB isotropy requires lengthy integration times (greater than 1 year) to limit potential CMB anisotropies. Celestial sources (with the exception of the moon) are not, in general, visible in the raw differential data. A 'quicklook' software system was developed that, in addition to basic plotting and limit-checking, implements a collection of data tests as well as long-term trending. Some of the key capabilities include the following: (1) stability analysis showing how well the data RMS averages down with increased data; (2) a Fourier analysis and autocorrelation routine to plot the power spectrum and confirm the presence of the 3 mK 'cosmic' dipole signal; (3) binning of the data against basic spacecraft quantities such as orbit angle; (4) long-term trending; and (5) dipole fits to confirm the spacecraft attitude azimuth angle.
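
    The Fourier check in capability (2) can be illustrated with a naive DFT applied to a simulated once-per-rotation dipole signal; the amplitudes below are illustrative values, not real DMR telemetry:

    ```python
    from math import cos, sin, pi

    def power_spectrum(samples):
        """Naive DFT power spectrum -- adequate for a short quicklook-style check."""
        n = len(samples)
        powers = []
        for k in range(n // 2 + 1):
            re = sum(x * cos(2 * pi * k * i / n) for i, x in enumerate(samples))
            im = sum(x * sin(2 * pi * k * i / n) for i, x in enumerate(samples))
            powers.append((re * re + im * im) / n)
        return powers

    # Simulated differential signal: a dipole seen once per spacecraft rotation
    # (amplitude 3, for "3 mK"), plus a small constant offset.
    n = 64
    signal = [3.0 * cos(2 * pi * i / n) + 0.1 for i in range(n)]

    spectrum = power_spectrum(signal)
    # The dipole shows up as a peak at one cycle per rotation (k = 1).
    peak = max(range(1, len(spectrum)), key=spectrum.__getitem__)
    ```

    In the real quicklook system, the absence of this expected peak would itself be a data-quality flag.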

  13. Improving Software Quality Prediction by Noise Filtering Techniques

    Institute of Scientific and Technical Information of China (English)

    Taghi M. Khoshgoftaar; Pierre Rebours

    2007-01-01

    Accuracy of machine learners is affected by quality of the data the learners are induced on. In this paper, quality of the training dataset is improved by removing instances detected as noisy by the Partitioning Filter. The fit dataset is first split into subsets, and different base learners are induced on each of these splits. The predictions are combined in such a way that an instance is identified as noisy if it is misclassified by a certain number of base learners. Two versions of the Partitioning Filter are used: Multiple-Partitioning Filter and Iterative-Partitioning Filter. The number of instances removed by the filters is tuned by the voting scheme of the filter and the number of iterations. The primary aim of this study is to compare the predictive performances of the final models built on the filtered and the un-filtered training datasets. A case study of software measurement data of a high assurance software project is performed. It is shown that predictive performances of models built on the filtered fit datasets and evaluated on a noisy test dataset are generally better than those built on the noisy (un-filtered) fit dataset. However, predictive performance based on certain aggressive filters is affected by presence of noise in the evaluation dataset.
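
The filtering scheme described above can be sketched in miniature: split the fit data, induce one base learner per split, and flag an instance as noisy when enough learners misclassify it. This is a hedged toy illustration of the voting idea, not the paper's implementation; the trivial threshold learner stands in for the real base learners, and the vote threshold of 2 is an assumption.

```python
# Minimal sketch of a Partitioning Filter: instances misclassified by at
# least `votes` of the per-split base learners are flagged as noisy.

def train_threshold(data):
    """Trivial base learner: predict 1 when x exceeds the split's mean x."""
    cut = sum(x for x, _ in data) / len(data)
    return lambda x: 1 if x > cut else 0

def partitioning_filter(data, n_splits=3, votes=2):
    """Return (clean, noisy) lists of (x, label) instances by voting."""
    splits = [data[i::n_splits] for i in range(n_splits)]
    learners = [train_threshold(s) for s in splits]
    clean, noisy = [], []
    for x, label in data:
        errors = sum(1 for f in learners if f(x) != label)
        (noisy if errors >= votes else clean).append((x, label))
    return clean, noisy

# Labels follow "x > 5" except one mislabeled instance, (4, 1).
data = [(1, 0), (2, 0), (3, 0), (8, 1), (9, 1), (10, 1), (4, 1)]
clean, noisy = partitioning_filter(data)
print(noisy)  # [(4, 1)]
```

Lowering `votes` makes the filter more aggressive, removing more instances, which mirrors the tuning trade-off the study evaluates.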

  14. A systematic review of quality attributes and measures for software product lines

    OpenAIRE

    Montagud Gregori, Sonia; Abrahao Gonzales, Silvia Mara; Insfrán Pelozo, César Emilio

    2012-01-01

    It is widely accepted that software measures provide an appropriate mechanism for understanding, monitoring, controlling, and predicting the quality of software development projects. In software product lines (SPL), quality is even more important than in a single software product since, owing to systematic reuse, a fault or an inadequate design decision could be propagated to several products in the family. Over the last few years, a great number of quality attributes and measures for assessi...

  15. Comparison of a Graphical and a Textual Design Language Using Software Quality Metrics

    OpenAIRE

    Henry, Sallie M.; Goff, Roger

    1988-01-01

    For many years the software engineering community has been attacking the software reliability problem on two fronts. First via design methodologies, languages and tools as a precheck on quality and second by measuring the quality of produced software as a postcheck. This research attempts to unify the approach to creating reliable software by providing the ability to measure the quality of a design prior to its implementation. A comparison of a graphical and a textual design language is pres...

  16. IAEA TECDOC 1517 Quality control in mammography software

    International Nuclear Information System (INIS)

    In October 2006, the IAEA published TECDOC 1517, Quality Control in Mammography, whose main purpose was to give Latin American countries a protocol in Spanish with all the necessary QC tests. This protocol harmonizes the tests and evaluation criteria for mammography equipment and its complementary equipment; it states the responsibilities of all personnel and gives guidance on radiographic techniques. It was the joint effort of two ARCAL projects: RLA/6/043 and RLA/9/035. QC programs are needed to assure the final quality of mammography images and to optimize the radiation dose to patients. Countries where national campaigns are used to improve the early detection of breast cancer among asymptomatic women require the establishment of QC programs. Specific software has been developed based on TECDOC 1517 to help the technologist, medical physicist and physician implement it. It has a main menu bar and icons for rapid access to the different tests. The help option in each test pops up a window with the same procedure written in the TECDOC for the user's convenience. The tests are divided into the same sections as in the document: visual inspection, storage of films, dark room, image system, radiological equipment, automatic exposure control, geometry, collimation, image visualization, film rejection analysis, image quality and dosimetry. In each test, data are entered in specific colored cells and, when the user activates the calculation button, the results are compared against the tolerance levels and an indication of pass/fail is displayed. This software, available to all Member States, adds extra value to TECDOC 1517 since errors in calculations will be reduced by its use. It will harmonize the way results are presented, facilitate comparisons, reduce the time needed to evaluate the results of the tests and, finally, serve as a teaching tool for the TECDOC. (author)

  17. Quality Improvement and Infrastructure Activity Costs in Software Development: A Longitudinal Analysis

    OpenAIRE

    Donald E. Harter; Slaughter, Sandra A.

    2003-01-01

    This study draws upon theories of task interdependence and organizational inertia to analyze the effect of quality improvement on infrastructure activity costs in software development. Although increasing evidence indicates that quality improvement reduces software development costs, the impact on infrastructure activities is not known. Infrastructure activities include services like computer operations, data integration, and configuration management that support software development. Because...

  18. Software quality assurance on the Yucca Mountain Site Characterization Project

    International Nuclear Information System (INIS)

    The Yucca Mountain Site Characterization Project (YMP) has been involved over the years in the continuing struggle with establishing acceptable Software Quality Assurance (SQA) requirements for the development, modification, and acquisition of computer programs used to support the Mined Geologic Disposal System. These computer programs will be used to produce or manipulate data used directly in site characterization, design, analysis, performance assessment, and operation of repository structures, systems, and components. Scientists and engineers working on the project have claimed that the SQA requirements adopted by the project are too restrictive to allow them to perform their work. This paper will identify the source of the original SQA requirements adopted by the project. It will delineate the approach used by the project to identify concerns voiced by project engineers and scientists regarding the original SQA requirements. It will conclude with a discussion of methods used to address these problems in the rewrite of the original SQA requirements

  19. Recommendations for a Software Quality Assurance Plan for the CMR Facility at LANL

    Energy Technology Data Exchange (ETDEWEB)

    Adams, K.; Matthews, S. D.; McQueen, M. A.

    1998-10-01

    The Nuclear Materials Technology (NMT) organizations 1 and 3 within the Chemical and Metallurgical Research (CMR) facility at the Los Alamos National Laboratory are working to achieve Waste Isolation Pilot Plant (WIPP) certification to enable them to transport their TRU waste to WIPP. This document is intended to provide not only recommendations to address the necessary software quality assurance activities to enable the NMT-1 and NMT-3 organizations to be WIPP compliant but is also meant to provide a template for the final Software Quality Assurance Plan (SQAP). This document specifically addresses software quality assurance for all software used in support of waste characterization and analysis. Since NMT-1 and NMT-3 currently have several operational software products that are used for waste characterization and analysis, these software quality assurance recommendations apply to the operations, maintenance and retirement of the software and the creation and development of any new software required for waste characterization and analyses.

  20. Recommendations for a Software Quality Assurance Plan for the CMR Facility at LANL

    International Nuclear Information System (INIS)

    The Nuclear Materials Technology (NMT) organizations 1 and 3 within the Chemical and Metallurgical Research (CMR) facility at the Los Alamos National Laboratory are working to achieve Waste Isolation Pilot Plant (WIPP) certification to enable them to transport their TRU waste to WIPP. This document is intended to provide not only recommendations to address the necessary software quality assurance activities to enable the NMT-1 and NMT-3 organizations to be WIPP compliant but is also meant to provide a template for the final Software Quality Assurance Plan (SQAP). This document specifically addresses software quality assurance for all software used in support of waste characterization and analysis. Since NMT-1 and NMT-3 currently have several operational software products that are used for waste characterization and analysis, these software quality assurance recommendations apply to the operations, maintenance and retirement of the software and the creation and development of any new software required for waste characterization and analyses

  1. IMPROVED SOFTWARE QUALITY ASSURANCE TECHNIQUES USING SAFE GROWTH MODEL

    OpenAIRE

    M. Sangeetha; K.M.SenthilKumar; Dr.C.Arumugam; K. Akila

    2010-01-01

    Our lives are governed by large, complex systems with increasingly complex software, and the safety, security, and reliability of these systems have become a major concern. As the software in today's systems grows larger, it has more defects, and these defects adversely affect the safety, security, and reliability of the systems. Software engineering is the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software. Software divi...

  2. A Splice Variant of ASC Regulates IL-1β Release and Aggregates Differently from Intact ASC

    Directory of Open Access Journals (Sweden)

    Kazuhiko Matsushita

    2009-01-01

    Full Text Available The apoptosis-associated speck-like protein containing a caspase recruit domain (ASC is involved in apoptosis and innate immunity and is a major adaptor molecule responsible for procaspase-1 activation. ASC mRNA is encoded by three exons: exons 1 and 3 encode a pyrin domain (PYD and caspase recruit domain (CARD, respectively, and exon 2 encodes a proline and glycine-rich (PGR domain. Here, we identified a variant ASC protein (vASC lacking the PGR domain that was smaller than full length ASC (fASC derived from fully transcribed mRNA and searched for differences in biochemical and biological nature. Both fASC and vASC were found to activate procaspase-1 to a similar degree, but the efficiency of IL-1β excretion was significantly higher for vASC. There was also a marked structural difference observed in the fibrous aggregates formed by fASC and vASC. These results suggest that although the PGR domain is dispensable for procaspase-1 activation, it plays an important role in the regulation of the molecular structure and activity of ASC.

  3. Software quality assurance procedures for radioactive waste risk assessment codes

    International Nuclear Information System (INIS)

    This support study for the evaluation of the safety of geological disposal systems is aimed at identifying the requirements for software quality assurance procedures for radioactive waste risk assessment codes, and to recommend appropriate procedures. The research covers: (i) the analysis of existing procedures and definition of requirements; (ii) a case study of the use of some existing procedures; (iii) the definition and the implementation of procedures. The report is supported by appendices that give more detail on the procedures recommended. It is intended to provide ideas on the steps that should be taken to ensure the quality of the programs used for assessment of the safety case for radioactive waste repositories, and does not represent the introduction of wholly new ideas or techniques. The emphasis throughout is on procedures that will be easily implemented, rather than on the fully rigorous procedures that are required for some application areas. The study has concentrated on measures that will increase the confidence in repository performance assessments among the wider scientific/engineering community, and the lay public

  4. The Software Improvement Process - Tools And Rules To Encourage Quality

    CERN Document Server

    Sigerud, K

    2011-01-01

    The Applications section of the CERN accelerator Controls group has decided to apply a systematic approach to quality assurance (QA), the “Software Improvement Process”, SIP. This process focuses on three areas: the development process itself, suitable QA tools, and how to practically encourage developers to do QA. For each stage of the development process we have agreed on the recommended activities and deliverables, and identified tools to automate and support the task. For example we do more code reviews. As peer reviews are resource-intensive, we only do them for complex parts of a product. As a complement, we are using static code checking tools, like FindBugs and Checkstyle. We also encourage unit testing and have agreed on a minimum level of test coverage recommended for all products, measured using Clover. Each of these tools is well integrated with our IDE (Eclipse) and give instant feedback to the developer about the quality of their code. The major challenges of SIP have been to 1) agree on com...

  5. Lessons learned from development and quality assurance of software systems at the Halden Project

    Energy Technology Data Exchange (ETDEWEB)

    Bjorlo, T.J.; Berg, O.; Pehrsen, M.; Dahll, G.; Sivertsen, T. [OECD Halden Reactor Project (Norway)

    1996-03-01

    The OECD Halden Reactor Project has developed a number of software systems within the research programmes. These programmes have comprised a wide range of topics, like studies of software for safety-critical applications, development of different operator support systems, and software systems for building and implementing graphical user interfaces. The systems have ranged from simple prototypes to installations in process plants. In the development of these software systems, Halden has gained much experience in quality assurance of different types of software. This paper summarises the accumulated experience at the Halden Project in quality assurance of software systems. The different software systems being developed at the Halden Project may be grouped into three categories. These are plant-specific software systems (one-of-a-kind deliveries), generic software products, and safety-critical software systems. This classification has been found convenient as the categories have different requirements to the quality assurance process. In addition, the experience from use of software development tools and proprietary software systems at Halden, is addressed. The paper also focuses on the experience gained from the complete software life cycle, starting with the software planning phase and ending with software operation and maintenance.

  6. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    Science.gov (United States)

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-01

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automatized, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC . PMID:26653327
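
The collation step described above, mapping each raw QC metric onto a score and gathering the scores per Raw file, can be sketched as follows. This is an illustrative analogue only: PTXQC itself is written in R, and the metric names and thresholds here are invented for the example.

```python
# Illustrative sketch (not PTXQC code) of scoring QC metrics onto [0, 1]
# and collating them into one overview row per Raw file, the textual
# analogue of the report's overview heatmap.

def score_linear(value, worst, best):
    """Map a raw metric value onto [0, 1], clipped at both ends."""
    frac = (value - worst) / (best - worst)
    return max(0.0, min(1.0, frac))

def overview_row(metrics, scorers):
    """Collate per-metric scores for one Raw file."""
    return {name: round(scorers[name](value), 2)
            for name, value in metrics.items()}

# Hypothetical metrics: identification rate (higher is better) and mass
# error in ppm (lower is better).
scorers = {
    "ms2_id_rate": lambda v: score_linear(v, worst=0.0, best=0.5),
    "mass_error_ppm": lambda v: score_linear(v, worst=10.0, best=0.0),
}
raw_file_metrics = {"ms2_id_rate": 0.35, "mass_error_ppm": 2.0}
print(overview_row(raw_file_metrics, scorers))
# {'ms2_id_rate': 0.7, 'mass_error_ppm': 0.8}
```

Normalizing every metric to a common scale is what lets a single heatmap give nonspecialists an at-a-glance verdict across files and metrics.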

  7. Why should we care about data quality in software engineering?

    OpenAIRE

    Bachmann, A

    2010-01-01

    Software engineering tools such as bug tracking databases and version control systems store large amounts of data about the history and evolution of software projects. In the last few years, empirical software engineering researchers have paid attention to these data to provide promising research results, for example, to predict the number of future bugs, recommend bugs to fix next, and visualize the evolution of software systems. Unfortunately, such data is not well-prepared for research pur...

  8. Understanding the State of Quality of Software on the basis of Time Gap, Quality Gap and Difference with Standard Model

    Directory of Open Access Journals (Sweden)

    Ekbal Rashid

    2013-06-01

    Full Text Available This paper tries to introduce a new mathematical model to understand the state of quality of software by calculating parameters such as the time gap and quality gap in relation to some predefined standard software quality or in relation to some chalked-out software quality plan. The paper also suggests methods to calculate the difference in quality between the software being developed and the model software that has been decided upon as the criterion for comparison. These methods can be employed to better understand the state of quality as compared to other standards. In order to obtain a graphical representation of the data we have used Microsoft Office 2007 graphical charts, which facilitate easy simulation of the time and quality gaps.
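
The gap parameters the abstract mentions reduce to simple arithmetic against a planned trajectory. The sketch below is a hedged reading of the idea, assuming a linear quality plan; the paper's exact model and the specific numbers here are not from the source.

```python
# Hypothetical sketch of time-gap and quality-gap calculations against a
# planned quality trajectory. The linear plan (quality units gained per day)
# is an assumption made for illustration.

def quality_gap(measured_quality, planned_quality):
    """Positive gap means the product lags the quality plan."""
    return planned_quality - measured_quality

def time_gap(elapsed_days, planned_rate, measured_quality):
    """Days ahead (+) or behind (-) of plan, assuming a linear planned_rate
    of quality units gained per day."""
    days_earned = measured_quality / planned_rate
    return days_earned - elapsed_days

# Plan: gain 2 quality units per day. After 30 days we measure 48 units
# where the plan called for 60.
print(quality_gap(48, 60))    # 12
print(time_gap(30, 2.0, 48))  # -6.0
```

A quality gap of 12 units and a time gap of minus six days express the same shortfall in the two coordinates the model plots.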

  9. The software improvement process - tools and rules to encourage quality

    International Nuclear Information System (INIS)

    The Applications section of the CERN accelerator controls group has decided to apply a systematic approach to quality assurance (QA), the 'Software Improvement Process' - SIP. This process focuses on three areas: the development process itself, suitable QA tools, and how to practically encourage developers to do QA. For each stage of the development process we have agreed on the recommended activities and deliverables, and identified tools to automate and support the task. For example we do more code reviews. As peer reviews are resource intensive, we only do them for complex parts of a product. As a complement, we are using static code checking tools, like FindBugs and Checkstyle. We also encourage unit testing and have agreed on a minimum level of test coverage recommended for all products, measured using Clover. Each of these tools is well integrated with our IDE (Eclipse) and give instant feedback to the developer about the quality of their code. The major challenges of SIP have been to 1) agree on common standards and configurations, for example common code formatting and Javadoc documentation guidelines, and 2) how to encourage the developers to do QA. To address the second point, we have successfully implemented 'SIP days', i.e. one day dedicated to QA work to which the whole group of developers participates, and 'Top/Flop' lists, clearly indicating the best and worst products with regards to SIP guidelines and standards, for example test coverage. This paper presents the SIP initiative in more detail, summarizing our experience since two years and our future plans. (authors)
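
The 'Top/Flop' lists mentioned above amount to ranking products by a SIP metric such as measured test coverage. The sketch below illustrates that ranking; the product names and coverage figures are invented for the example and are not CERN data.

```python
# Small sketch of a "Top/Flop" ranking: sort products by a quality metric
# (here, test coverage percent) and report the best and worst performers.

def top_flop(products, n=2):
    """Return (top n, flop n) products by coverage percentage."""
    ranked = sorted(products.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[:n], ranked[-n:]

# Hypothetical per-product coverage numbers.
coverage = {"timing-lib": 92, "logging-gui": 35,
            "settings-db": 78, "alarm-core": 51}
top, flop = top_flop(coverage)
print(top)   # [('timing-lib', 92), ('settings-db', 78)]
print(flop)  # [('alarm-core', 51), ('logging-gui', 35)]
```

Publishing both ends of the ranking is the social lever: it rewards teams above the agreed minimum coverage and makes the laggards visible.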

  10. Filmes plásticos e ácido ascórbico na qualidade de araticum minimamente processado Plastic packaging film and ascorbic acid treatment on the quality of fresh cut araticum

    OpenAIRE

    Manoel Soares Soares Júnior; Marcio Caliari; Rosângela Vera; Camila Silveira Melo

    2007-01-01

    The objective of this work was to evaluate the effects of ascorbic acid and of the type of plastic film used as packaging on the quality of minimally processed araticum kept under refrigeration. Ascorbic acid did not prevent browning of the minimally processed araticum. Regardless of the packaging type, titratable acidity increased over time. Polyvinyl chloride or low-density polyethylene packaging caused a significant mass loss compared with the laminated vacuum-...

  11. Software quality assurance plan for the National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project

  12. A Study on a Software Quality Assurance of a Process and a Product

    International Nuclear Information System (INIS)

    Since nuclear plants and facilities have made increasing use of digital technology, the safety and reliability of software is a primary concern. Software errors are more difficult to detect and handle than hardware-related failures. It is crucial to consider both the process and the product of the software life cycle to increase the quality of the software. This paper discusses the quality assurance of the process and the product of a software life cycle based on two prominent standards, ISO 9001:2000 and CMMI

  13. Quality Evaluation of Software Architecture with Application to OpenH.323 Protocol

    OpenAIRE

    Hoffmann, Martin

    2006-01-01

    The requirements towards software systems usually go beyond correct functionality; the presence of certain quality attributes is also essential for a system's acceptance by the stakeholders. So quality control and management must be carried out throughout the whole development process to ensure the implementation of the required quality characteristics. This thesis focuses on the quality control of the software architecture. Several approaches for evaluating the architecture ...

  14. Research of quality control during development of NPP DCS 1E classified software

    International Nuclear Information System (INIS)

    Nuclear safety depends on the correct behavior of 1E software, which is an important part of the 1E DCS system. Nowadays, users focus on the good function of the 1E system but pay little attention to quality control of the 1E software. In fact, it is declared in IEC61513 and IEC60880 that 1E software should be under strict quality control during all stages of development. This article relates the practice of 1E DCS system quality control and explores QC surveillance for 1E software from the user's point of view. (authors)

  15. Application of software quality assurance to a specific scientific code development task

    International Nuclear Information System (INIS)

    This paper describes an application of software quality assurance to a specific scientific code development program. The software quality assurance program consists of three major components: administrative control, configuration management, and user documentation. The program attempts to be consistent with existing local traditions of scientific code development while at the same time providing a controlled process of development

  16. Contribution to the automation of software quality control of web applications

    OpenAIRE

    García Gutiérrez, Boni

    2011-01-01

    Abstract The Web has become one of the most influential instruments in the history of mankind. Therefore, web applications development is a hot topic in the Software Engineering domain. In this context, the software quality is a key concept since it determines the degree in which a system meets its requirements and meets the expectations of its customers and/or users. Quality control (also known as verification and validation) is the set of activities designed to assess a software system in o...

  17. ASC-PROBA Interface Control Document

    DEFF Research Database (Denmark)

    Betto, Maurizio; Jørgensen, John Leif; Jørgensen, Finn E;

    1999-01-01

    This document describes the Advanced Stellar Compass (ASC) and defines the interfaces between the instrument and the PROBA satellite. The ASC is a highly advanced and autonomous Stellar Reference Unit designed, developed and produced by the Space Instrumentation Group of the Department of Automation of the Technical University of Denmark. The document is structured as follows. First we present the ASC - heritage, system description, performance - then we address more specifically the environmental properties, like the EMC compatibility and thermal characteristics, and the design and...

  18. Reducing the risk of failure: Software Quality assurance standards and methods

    International Nuclear Information System (INIS)

    An effective Software Quality Assurance (SQA) program provides an overall approach to software engineering and the establishment of proven methods for the production of reliable software. And, in the authors' experience, the overall costs for the software life are diminished with the application of quality methods. In their experience, the issues for implementing quality standards and practices are many. This paper addresses those issues as well as the lessons learned from developing and implementing a number of software quality assurance programs. Their experience includes the development and implementation of their own NRC-accepted SQA program and an SQA program for an engineering software developer, as well as developing SQA procedures, standards, and methods for utilities, medical and commercial clients. Some of the issues addressed in this paper are: setting goals and defining quality; applying the software life cycle; addressing organizational issues; providing flexibility and increasing productivity; producing effective documentation; maintaining quality records; imposing software configuration management; conducting reviews, audits, and controls; verification and validation; and controlling software procurement

  19. ASC-ATDM Performance Portability Requirements for 2015-2019

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Harold C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Computing Research Center; Trott, Christian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Computing Research Center

    2015-03-01

    This report outlines the research, development, and support requirements for the Advanced Simulation and Computing (ASC) Advanced Technology, Development, and Mitigation (ATDM) Performance Portability (a.k.a., Kokkos) project for 2015-2019. The research and development (R&D) goal for Kokkos (v2) has been to create and demonstrate a thread-parallel programming model and standard C++ library-based implementation that enables performance portability across diverse manycore architectures such as multicore CPU, Intel Xeon Phi, and NVIDIA Kepler GPU. This R&D goal has been achieved for algorithms that use data parallel patterns including parallel-for, parallel-reduce, and parallel-scan. Current R&D is focusing on hierarchical parallel patterns such as a directed acyclic graph (DAG) of asynchronous tasks where each task contains nested data parallel algorithms. This five year plan includes R&D required to fully and performance portably exploit thread parallelism across current and anticipated next generation platforms (NGP). The Kokkos library is being evaluated by many projects exploring algorithms and code design for NGP. Some production libraries and applications such as Trilinos and LAMMPS have already committed to Kokkos as their foundation for manycore parallelism and performance portability. These five year requirements include support required for current and anticipated ASC projects to be effective and productive in their use of Kokkos on NGP. The greatest risk to the success of Kokkos and ASC projects relying upon Kokkos is a lack of staffing resources to support Kokkos to the degree needed by these ASC projects. This support includes up-to-date tutorials, documentation, multi-platform (hardware and software stack) testing, minor feature enhancements, thread-scalable algorithm consulting, and managing collaborative R&D.

  20. Software quality assurance for safety analysis and risk management at the Savannah River Site

    International Nuclear Information System (INIS)

    As part of its Reactor Operations Improvement Program at the Savannah River Site (SRS), Westinghouse Savannah River Company (WSRC), in cooperation with the Westinghouse Hanford Company, has developed and implemented quality assurance for safety-related software for technical programs essential to the safety and reliability of reactor operations. More specifically, the quality assurance process involved the development and implementation of quality standards and attendant procedures based on industry software quality standards. These procedures were then applied to computer codes in reactor safety and probabilistic risk assessment analyses. This paper provides a review of the major aspects of the WSRC safety-related software quality assurance. In particular, quality assurance procedures are described for the different life cycle phases of the software that include the Requirements, Software Design and Implementation, Testing and Installation, Operation and Maintenance, and Retirement Phases. For each phase, specific provisions are made to categorize the range of activities, the level of responsibilities, and the documentation needed to assure the control of the software. The software quality assurance procedures developed and implemented are evolutionary in nature, and thus, prone to further refinements. These procedures, nevertheless, represent an effective controlling tool for the development, production, and operation of safety-related software applicable to reactor safety and probabilistic risk assessment analyses

  1. Guidance and Control Software Project Data - Volume 4: Configuration Management and Quality Assurance Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.

  2. Integrating Components in Software Product Line to Build High Quality Products

    Directory of Open Access Journals (Sweden)

    Lena Khaled

    2011-01-01

    Full Text Available Problem statement: A central part of building any system is achieving a high level of quality and establishing the means by which that quality is achieved. Many organizations do not treat quality as a necessary part of building their systems; they focus mainly on budget and on reducing time to market. Approach: One important approach to achieving quality is to use components when building products, selecting the most appropriate components for the product line according to system requirements. Results: The main result of adopting a component-based approach to software product lines is the promise of high quality, in addition to reuse and reduced time to market. Conclusion: The ultimate goal of using components in a software product line is to increase software quality, such as flexibility and reliability, in addition to making the software reusable in other types of business, especially electronic commerce applications.

  3. Effective Defect Prevention Approach in Software Process for Achieving Better Quality Levels

    CERN Document Server

    Suma, V

    2010-01-01

    Defect prevention is the most vital but habitually neglected facet of software quality assurance in any project. If practiced at all stages of software development, it can reduce the time, overhead and resources required to engineer a high quality product. The key challenge of an IT industry is to engineer a software product with minimum post-deployment defects. This effort is an analysis based on data obtained for five selected projects from leading software companies of varying software production competence. The main aim of this paper is to provide information on various methods and practices supporting defect detection and prevention, leading to thriving software generation. The defect prevention technique unearths 99% of defects. Inspection is found to be an essential technique for generating ideal software in factories through enhanced methodologies of aided and unaided inspection schedules. On average, 13% to 15% of inspection and 25% to 30% of testing out of whole project effort tim...

  4. An empirical study of software architectures' effect on product quality

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Jonasson, Kristjan; Neukirchen, Helmut

    2011-01-01

    Software architectures shift the focus of developers from lines-of-code to coarser-grained components and their interconnection structure. Unlike fine-grained objects, these components typically encompass business functionality and need to be aware of the underlying business processes. Hence, the...... interface of a component should reflect relevant parts of the business process and the software architecture should emphasize the coordination among components. To shed light on these issues, we provide a framework for component-based software architectures focusing on the process perspective. The interface...

  5. ADAPTIVE SYNTHESIS OF INTELLIGENT MEASUREMENT SYSTEMS WITH THE USE OF ASC-ANALYSIS AND "EIDOS" SYSTEM. SYSTEM IDENTIFICATION IN ECONOMETRICS, BIOMETRICS, ECOLOGY, PEDAGOGY, PSYCHOLOGY AND MEDICINE

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2016-02-01

    Full Text Available The article proposes using automated system-cognitive analysis (ASC-analysis) and its software tool, the "Eidos" system, for the synthesis and application of adaptive intelligent measuring systems that measure the values of parameters of objects, and for state identification of complex multivariable nonlinear dynamic systems. The article briefly describes the mathematical method of ASC-analysis, implemented in the universal cognitive analytical system "Eidos-X++". The mathematical method of ASC-analysis is based on the system theory of information (STI), which was created in the course of implementing the program of generalizing all the concepts of mathematics, in particular information theory based on set theory, through a total replacement of the concept of "set" with the more general concept of "system" and detailed tracking of all the consequences of this replacement. Owing to its underlying mathematical method, ASC-analysis is nonparametric and makes it possible to comparably process tens to hundreds of thousands of gradations of factors and future states of the control object (classes) in incomplete (fragmented), noisy data of numeric and non-numeric nature measured in different units. We provide a detailed numerical example of the application of ASC-analysis and the "Eidos-X++" system for the synthesis of a system-cognitive model, providing multiparameter typization of the states of complex systems and system identification of their states, as well as for making decisions about control actions that change the composition of the control object so that its quality (level of consistency) is maximally increased at minimum cost. As a numerical example of a complex system we have selected the team of a company and its components: employees and applicants (staff). However, it must be noted that this example should be considered even wider

  6. Design and development of an expert system based quality assurance module for the Dynamo Model of software project management

    OpenAIRE

    Leidy, Frank H.

    1989-01-01

    Quality assurance is a function crucial to the successful development and maintenance of a software system. Because this activity has a significant impact on the cost of software development, the cost-effectiveness of quality assurance is a major concern to the software quality manager. There are tradeoffs between the economic benefits and costs of quality assurance. Using the Dynamo model of software project management, an optimal quality assurance level and its distribution throughout a pro...

  7. ISO and software quality assurance - licensing and certification of software professionals

    Energy Technology Data Exchange (ETDEWEB)

    Hare, J.; Rodin, L.

    1997-11-01

    This report contains viewgraphs on licensing and certification of software professionals. Discussed in this report are: certification programs; licensing programs; why become certified; certification as a condition of employment; certification requirements; and examination structures.

  8. Maturity model-based software process quality assessment and management

    OpenAIRE

    Шеховцов, Владимир Анатольевич; Годлевский, Михаил Дмитриевич; Брагинский, Игорь Львович

    2011-01-01

    We propose to formulate the problem of achieving the necessary level of maturity for the software process based on the selection among alternate variants of the process steps implementation under the resource constraints of the organization

  9. Measurement and Management of the Level of Quality Control Process in SoC (System on Chip Embedded Software Development

    Directory of Open Access Journals (Sweden)

    Ki-won Song

    2012-04-01

    quality control activities and it is desirable to create a quality process to integrally represent the overall level of quality control activities performed while developing the software deliverables. With the quality process, it is possible to officially evaluate whether enough quality control activities have been performed for the project and to secure the quality of the software deliverables before they are delivered to the customers.

  10. Identify new Software Quality Assurance needs for the UK e-Science community and reintroduction for the right tools to improve evolved software engineering processes

    OpenAIRE

    Chang, Victor

    2008-01-01

    Software Quality Assurance (QA) is defined as the methodology and good practices for ensuring the quality of software in development. It involves handling bug reports, bug tracking, error investigation, verification of fixed bugs, test management, test case planning and design, as well as test case execution and records. Standards such as ISO 9001 are commonly followed for software QA, which recommends using a wide range of tools to improve the existing software engineering processes (SEP) for...

  11. The Role and Quality of Software Safety in the NASA Constellation Program

    Science.gov (United States)

    Layman, Lucas; Basili, Victor R.; Zelkowitz, Marvin V.

    2010-01-01

    In this study, we examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Obtaining an accurate, program-wide picture of software safety risk is difficult across multiple, independently-developing systems. We leverage one source of safety information, hazard analysis, to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. The goal of this research is two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to quantify the level of risk presented by software in the hazard analysis. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. To quantify the importance of software, we collected metrics based on the number of software-related causes and controls of hazardous conditions. To quantify the level of risk presented by software, we created a metric scheme to measure the specificity of these software causes. We found that 49-70% of hazardous conditions in the three systems could be caused by software or that software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. Furthermore, 10-12% of all controls were software-based. There is potential for inaccuracy in these counts, however, as software causes are not consistently scoped, and the presence of software in a cause or control is not always clear. The application of our software specificity metrics also identified risks in the hazard reporting process. In particular, we found that a number of traceability risks in the hazard reports may impede verification of software and system safety.
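    The counting behind metrics like these can be illustrated with a small sketch. The record structure and category labels below are invented for illustration; they are not the schema of the actual Constellation hazard reports.

```python
# Hypothetical hazard records: each lists the categories of its causes and controls.
hazard_reports = [
    {"causes": ["software", "hardware"], "controls": ["software", "procedural"]},
    {"causes": ["hardware"], "controls": ["software"]},
    {"causes": ["software"], "controls": ["procedural"]},
]

total_causes = sum(len(r["causes"]) for r in hazard_reports)
total_controls = sum(len(r["controls"]) for r in hazard_reports)
sw_causes = sum(c == "software" for r in hazard_reports for c in r["causes"])
sw_controls = sum(c == "software" for r in hazard_reports for c in r["controls"])

# Hazardous conditions where software appears as a cause or as a control
sw_involved = sum(
    "software" in r["causes"] or "software" in r["controls"] for r in hazard_reports
)

print(f"software causes: {sw_causes}/{total_causes}")         # 2/4
print(f"software controls: {sw_controls}/{total_controls}")   # 2/4
print(f"hazards with software involvement: {sw_involved}/{len(hazard_reports)}")  # 3/3
```

The study's percentages are these ratios computed over the 154 real hazard reports rather than this toy list.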

  12. Quality factors quantification/assurance for software related to safety in nuclear power plants

    International Nuclear Information System (INIS)

    A quality assurance plan is needed to guarantee software quality. The use of such a plan involves activities that should take place all along the life cycle, and which can be evaluated using the so-called quality factors. This is due to the fact that quality itself cannot be measured, but some of its manifestations can be used for this purpose. In the present work, a methodology to quantify a set of quality factors is proposed for software-based systems to be used in safety-related areas of nuclear power plants. (author)

  13. An Integrated and Comprehensive Approach to Software Quality

    Directory of Open Access Journals (Sweden)

    Dr.S.S.Riaz Ahamed

    2010-02-01

    Full Text Available Quality is the customer's perception of the value of the supplier's work output. Quality represents the properties of products and/or services that are valued by the consumer. Quality is a momentary perception that occurs when something in the environment interacts with the human factor, in the pre-intellectual awareness that comes before rational thought takes over and begins establishing order. Judgment of the resulting order is then reported as good or bad quality value. A product or process that is reliable and performs its intended function is said to be a quality product.

  14. Influence of confirmation biases of developers on software quality: an empirical study

    OpenAIRE

    Calikli, Gul; Bener, Ayse Basar

    2013-01-01

    The thought processes of people have a significant impact on software quality, as software is designed, developed and tested by people. Cognitive biases, which are defined as patterned deviations of human thought from the laws of logic and mathematics, are a likely cause of software defects. However, there is little empirical evidence to date to substantiate this assertion. In this research, we focus on a specific cognitive bias, confirmation bias, which is defined as the tendency of people t...

  15. A Framework for Evaluating the Software Product Quality of Pregnancy Monitoring Mobile Personal Health Records.

    Science.gov (United States)

    Idri, Ali; Bachiri, Mariam; Fernández-Alemán, José Luis

    2016-03-01

    Stakeholders' needs and expectations are identified by means of software quality requirements, which have an impact on software product quality. In this paper, we present a set of requirements for mobile personal health records (mPHRs) for pregnancy monitoring, which have been extracted from literature and existing mobile apps on the market. We also use the ISO/IEC 25030 standard to suggest the requirements that should be considered during the quality evaluation of these mPHRs. We then go on to design a checklist in which we contrast the mPHRs for pregnancy monitoring requirements with software product quality characteristics and sub-characteristics in order to calculate the impact of these requirements on software product quality, using the ISO/IEC 25010 software product quality standard. The results obtained show that the requirements related to the user's actions and the app's features have the most impact on the external sub-characteristics of the software product quality model. The only sub-characteristic affected by all the requirements is Appropriateness of Functional suitability. The characteristic Operability is affected by 95% of the requirements while the lowest degrees of impact were identified for Compatibility (15%) and Transferability (6%). Lastly, the degrees of the impact of the mPHRs for pregnancy monitoring requirements are discussed in order to provide appropriate recommendations for the developers and stakeholders of mPHRs for pregnancy monitoring. PMID:26643080
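    The checklist-based impact calculation described above amounts to counting, for each quality sub-characteristic, the share of requirements that affect it. A minimal sketch, with invented requirement names and sub-characteristics standing in for the paper's actual checklist:

```python
# Hypothetical checklist: each requirement maps to the sub-characteristics it affects.
impact = {
    "record weight":       {"functional_appropriateness", "operability"},
    "remind appointments": {"functional_appropriateness", "operability"},
    "export data":         {"functional_appropriateness", "compatibility"},
}
subcharacteristics = ["functional_appropriateness", "operability", "compatibility"]

n = len(impact)
for sc in subcharacteristics:
    hits = sum(sc in affected for affected in impact.values())
    print(f"{sc}: {100 * hits / n:.0f}% of requirements")
```

With these toy values, functional appropriateness is affected by every requirement (100%), mirroring the paper's finding that it was the only sub-characteristic affected by all requirements.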

  16. Unisys' experience in software quality and productivity management of an existing system

    Science.gov (United States)

    Munson, John B.

    1988-01-01

    A summary of Quality Improvement techniques, implementation, and results in the maintenance, management, and modification of large software systems for the Space Shuttle Program's ground-based systems is provided.

  17. Development of the iridium software for quality control of iridium-192 activated wires

    International Nuclear Information System (INIS)

    In order to improve quality control of Iridium-192 wires produced by IPEN, an automatic system prototype for the measurement of Iridium-192 activated wire was developed. This work shows the development of the Iridium software for such system

  18. Handbook of software quality assurance techniques applicable to the nuclear industry

    Energy Technology Data Exchange (ETDEWEB)

    Bryant, J.L.; Wilburn, N.P.

    1987-08-01

    Pacific Northwest Laboratory is conducting a research project to recommend good engineering practices in the application of 10 CFR 50, Appendix B requirements to assure quality in the development and use of computer software for the design and operation of nuclear power plants for NRC and industry. This handbook defines the content of a software quality assurance program by enumerating the techniques applicable. Definitions, descriptions, and references where further information may be obtained are provided for each topic.

  19. Handbook of software quality assurance techniques applicable to the nuclear industry

    International Nuclear Information System (INIS)

    Pacific Northwest Laboratory is conducting a research project to recommend good engineering practices in the application of 10 CFR 50, Appendix B requirements to assure quality in the development and use of computer software for the design and operation of nuclear power plants for NRC and industry. This handbook defines the content of a software quality assurance program by enumerating the techniques applicable. Definitions, descriptions, and references where further information may be obtained are provided for each topic

  20. Implementation of a free software for quality control of IMRT

    International Nuclear Information System (INIS)

    In this paper we focus on the implementation and launch of software that allows us to compare quantitatively the two-dimensional dose distributions calculated and measured experimentally in IMRT treatment. The tool we use to make this comparison is the free software DoseLab. This is a program written in MatLab and open source, thereby allowing in some cases adaptation of the program to the needs of each user. This program is able to calculate the gamma function of these distributions, a parameter that simultaneously evaluates the difference in dose between two pixels of the image and the distance between them, giving us an objective and quantitative measure that allows us to decide whether the two distributions are compatible or not.
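    The gamma comparison described above can be sketched directly. The following is a simplified brute-force 2D gamma index in the sense of the standard dose/distance criterion, not DoseLab's actual implementation; the array contents and tolerances are illustrative.

```python
import numpy as np

def gamma_index(ref, evl, spacing=1.0, dose_tol=0.03, dist_tol=3.0):
    """Brute-force 2D gamma index (illustrative sketch).

    ref, evl : 2D dose arrays on the same grid
    spacing  : pixel size in mm
    dose_tol : dose tolerance as a fraction of the reference maximum
    dist_tol : distance-to-agreement tolerance in mm
    """
    dd = dose_tol * ref.max()                 # absolute dose criterion
    ys, xs = np.indices(ref.shape)
    gamma = np.empty(ref.shape)
    for iy, ix in np.ndindex(ref.shape):
        # Combined squared dose-difference and distance terms to every pixel
        dist2 = ((ys - iy) ** 2 + (xs - ix) ** 2) * spacing ** 2
        dose2 = (evl - ref[iy, ix]) ** 2
        gamma[iy, ix] = np.sqrt((dist2 / dist_tol ** 2 + dose2 / dd ** 2).min())
    return gamma

# Identical distributions give gamma = 0 everywhere, i.e. a 100% pass rate.
ref = np.array([[1.0, 2.0], [3.0, 4.0]])
g = gamma_index(ref, ref)
print((g <= 1).mean())  # fraction of pixels with gamma <= 1 → 1.0
```

A pixel passes when its gamma value is at most 1, which is what makes the metric a single objective accept/reject criterion for two distributions.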

  1. QARI: Quality aware software deployment for wireless sensor networks

    OpenAIRE

    Horré, Wouter; Hughes, Danny; Michiels, Sam; Joosen, Wouter

    2010-01-01

    If we are to deploy sensor applications in a realistic business context, we must provide innovative middleware services to control and enforce required system behavior; in order to correctly interpret collected temperature data, for example, sensor applications require guarantees about minimal coverage and the number of available sensors. The extreme dynamism, scale and unreliability of wireless sensor networks represent major challenges in contemporary software management. This paper pres...

  2. Fuzzy Comprehensive Evaluation Software of Teaching Quality Based on Entropy

    OpenAIRE

    Guihua Zheng; Quanlong Guan

    2013-01-01

    The present teaching evaluation models are researched, and the evaluation criteria are designed automatically from the perspectives of experts and students, yielding a scientific and reasonable set of criteria. By combining the weighted entropy approach with fuzzy comprehensive evaluation, a comprehensive teaching evaluation model is proposed. This software model solves some problems in conducting quantitative analysis of teaching quality. And at the same t...
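    The combination of entropy weighting with fuzzy comprehensive evaluation can be sketched as follows; the criteria, scores and membership values are invented for illustration and do not come from the paper.

```python
import math

def entropy_weights(scores):
    """Entropy weight method: scores[i][j] is the (positive) rating of
    alternative i on criterion j; criteria whose ratings diverge more
    across alternatives receive larger weights."""
    m, n = len(scores), len(scores[0])
    raw = []
    for j in range(n):
        col = [scores[i][j] for i in range(m)]
        total = sum(col)
        p = [x / total for x in col]
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        raw.append(1 - e)                 # degree of divergence of criterion j
    s = sum(raw)
    return [w / s for w in raw]

def fuzzy_evaluate(weights, membership):
    """Fuzzy comprehensive evaluation: membership[j][k] is the degree to
    which criterion j belongs to grade k; returns the weighted grade vector."""
    grades = len(membership[0])
    return [sum(w * row[k] for w, row in zip(weights, membership))
            for k in range(grades)]

scores = [[85, 70, 90], [80, 75, 60], [90, 65, 75]]   # 3 teachers, 3 criteria
w = entropy_weights(scores)
# Membership of each criterion in the grades (good, fair, poor), rows sum to 1
R = [[0.6, 0.3, 0.1], [0.5, 0.4, 0.1], [0.3, 0.5, 0.2]]
print([round(v, 3) for v in fuzzy_evaluate(w, R)])
```

Because each membership row sums to 1 and the weights sum to 1, the resulting grade vector also sums to 1, so it can be read as a distribution over the evaluation grades.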

  3. Can we improve software quality by re-engineering?

    OpenAIRE

    Signore, Oreste; Loffredo, Mario; Chericoni, Susanna

    1992-01-01

    The maintenance of applications constitutes a relevant issue, as a lot of effort goes into this task. CASE tools claim to be effective in producing efficient and error-free software, but in many cases the aim is not to produce new application systems, but to modify existing ones. Re-engineering appears to be a suitable way of getting the advantages of automated CASE tools without incurring the costs involved in a complete redevelopment of the existing systems, whose specifications...

  4. Software quality assurance plan for the National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  5. Ensuring Software Product Quality : An Industrial Case Study

    OpenAIRE

    Pydi, Manikanta Kumar; Nakka, Annie Sushma

    2012-01-01

    Context: This thesis verifies a method developed for alignment issues across different data points and validates the method on those data points. Finding the alignment/misalignment problems occurring among the stakeholders in a company is done through surveys using Hierarchical Cumulative Voting (HCV). This paper presents a case study to explain the importance of alignment between the stakeholders to achieve quality. Time, scope and cost are given higher priority, leaving quality as it ...

  6. A Case of Engineering Quality for Mobile Healthcare Applications Using Augmented Personal Software Process Improvement

    Directory of Open Access Journals (Sweden)

    Shahbaz Ahmed Khan Ghayyur

    2016-01-01

    Full Text Available Mobile healthcare systems are currently considered a key research area in the domain of software engineering. The adoption of modern technologies for mobile healthcare systems is a quick option for industry professionals. Software architecture is a key feature that contributes towards a software product, solution, or service. Software architecture helps in better communication, documentation of design decisions, and risk identification; it provides a basis for reusability, scalability and scheduling, reduces maintenance cost, and helps to avoid software failures. Hence, in order to solve the abovementioned issues in mobile healthcare, software architecture is integrated with the personal software process. The personal software process has been applied successfully, but it is unable to address the issues related to architectural design and evaluation capabilities. Hence, a new technique, the architecture-augmented personal process, is presented in order to enhance the quality of mobile healthcare systems through the use of architectural design integrated with the personal software process. The proposed process was validated by case studies. It was found that the proposed process helped in reducing the overall costs and effort. Moreover, an improved architectural design helped in the development of high-quality mobile healthcare systems.

  7. LLNL Site Specific ASCI Software Quality Engineering Recommended Practices Overview Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Peck, T; Sparkman, D; Storch, N

    2002-02-01

    "The LLNL Site-Specific Advanced Simulation and Computing (ASCI) Software Quality Engineering Recommended Practices V1.1" document describes a set of recommended software quality engineering (SQE) practices for ASCI code projects at Lawrence Livermore National Laboratory (LLNL). In this context, SQE is defined as the process of building quality into software products by applying the appropriate guiding principles and management practices. Continual code improvement and ongoing process improvement are expected benefits. Certain practices are recommended, although projects may select the specific activities they wish to improve, and the appropriate timelines for such actions. Additionally, projects can rely on the guidance of this document when generating ASCI Verification and Validation (V&V) deliverables. ASCI program managers will gather information about their software engineering practices and improvement. This information can be shared to leverage the best SQE practices among development organizations. It will further be used to ensure the currency and vitality of the recommended practices. This Overview is intended to provide basic information to the LLNL ASCI software management and development staff from the "LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices V1.1" document, and additionally provides steps to using that document. For definitions of terminology and acronyms, refer to its Glossary and Acronyms sections.

  8. C++ Software Quality in the ATLAS Experiment: Tools and Experience

    CERN Document Server

    Kluth, Stefan; The ATLAS collaboration; Obreshkov, Emil; Roe, Shaun; Seuster, Rolf; Snyder, Scott; Stewart, Graeme

    2016-01-01

    The ATLAS experiment at CERN uses about six million lines of code and currently has about 420 developers whose background is largely from physics. In this paper we explain how the C++ code quality is managed using a range of tools from compile-time through to run time testing and reflect on the great progress made in the last year largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other tools including cppcheck, Include-What-You-Use and run-time 'sanitizers' are also discussed.

  9. A pioneering application of NQA-1 quality assurance standards in the development of software

    International Nuclear Information System (INIS)

    One reason for systematically applying quality assurance to computer software is the extensive use of results from computer programs to characterize potential sites for nuclear waste repositories, leading ultimately to important policy-making decisions. Because data from these programs characterize the likely radioactivity profile for many hundreds of years, experimental validation is not feasible. The Sensitivity and Uncertainty Analysis Methods Development Project (SUAMDP) was developed to formulate and utilize efficient and comprehensive methods for determining sensitivities of calculated results with respect to changes in all input parameters. The computerized methodology was embodied in the Gradient Enhanced Software System (GRESS). Because GRESS was to be used in site characterization for waste storage, stringent NQA-1 requirements were imposed by the sponsor. A working relationship between the Oak Ridge National Laboratory (ORNL) Quality Department and the research scientists developing GRESS was essential in achieving understanding and acceptance of the quality assurance requirements as applied to the SUAMDP. The relationship resulted in the SUAMDP becoming the first software project at ORNL to develop a comprehensive NQA-1 Quality Assurance Plan; this plan now serves as a model for software quality assurance at ORNL. This paper describes the evolution of this plan and its impact on the application of quality assurance procedures to software. 2 refs

  10. An efficient high-quality hierarchical clustering algorithm for automatic inference of software architecture from the source code of a software system

    CERN Document Server

    Rogatch, Sarge

    2012-01-01

    This is a high-quality algorithm for hierarchical clustering of large software source code. It effectively breaks down the complexity of tens of millions of lines of source code, so that a human software engineer can comprehend a software system at a high level by looking at its architectural diagram, reconstructed automatically from the source code. The architectural diagram shows a tree of subsystems with OOP classes in its leaves (in other words, a nested software decomposition). The tool reconstructs the missing (inconsistent/incomplete/nonexistent) architectural documentation for a software system from its source code. This facilitates software maintenance: change requests can be performed substantially faster. Simply speaking, this unique tool makes it possible to lift the comprehensible grain of object-oriented software systems from the OOP class level to the subsystem level. It is estimated that a commercial tool, developed on the basis of this work, will reduce software mainte...

  11. Software quality and process improvement in scientific simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, J.; Webster, R. [Los Alamos National Lab., NM (United States)

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. The study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. The study is descriptive rather than scientific.

  12. SaaSQuality - A Method for Quality Evaluation of Software as a Service (SaaS)

    Directory of Open Access Journals (Sweden)

    Nemesio Freitas Duarte Filho

    2013-07-01

    Full Text Available The market for software products offered as a service (SaaS) is growing steadily and has attracted suppliers from different segments of the global IT market. However, the use of SaaS products brings a range of challenges, in the organizational, cultural and technological areas alike. A difficulty that exists today is the lack of methods and models for assessing the quality of these products. This document presents a method to assess the quality of a software product offered as a service, named SaaSQuality. The proposed method has a quality model appropriate to the SaaS context, based on standards and models of software quality (ISO 9126) and models for IT management (ITIL and COBIT). The experimental results obtained through a case study show that the method offers suitable assessment practices for Software as a Service.

  13. Application of software quality assurance methods in validation and maintenance of reactor analysis computer codes

    International Nuclear Information System (INIS)

    Various computer codes employed at the Israel Electricity Company for preliminary reactor design analysis and fuel cycle scoping calculations have often been subject to program source modifications. Although most changes were due to computer or operating system compatibility problems, a number of significant modifications were due to model improvements and enhancements of algorithm efficiency and accuracy. With growing acceptance of software quality assurance requirements and methods, a program of extensive testing of modified software has been adopted within the regular maintenance activities. In this work, a survey has been performed of various software quality assurance methods of software testing, which belong mainly to the two major categories of implementation-based ('white box') and specification-based ('black box') testing. The results of this survey exhibit a clear preference for specification-based testing. In particular, the equivalence class partitioning method and the boundary value method have been selected as functional methods especially suitable for testing reactor analysis codes. A separate study of software quality assurance methods and techniques has been performed with the objective of establishing appropriate pre-test software specification methods. Two methods of software analysis and specification have been selected as the most suitable for this purpose: the data flow diagram method has been shown to be particularly valuable for functional/procedural software specification, while entity-relationship diagrams have proven efficient for specifying the software data/information domain. The feasibility of these two methods has been analyzed, in particular for software uncertainty analysis and overall code accuracy estimation. (author). 14 refs
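    The two selected techniques, equivalence class partitioning and boundary value analysis, can be illustrated on a hypothetical validity check; the real input domains of the reactor codes are not described in the record, so the range below is invented.

```python
def valid_enrichment(pct):
    """Hypothetical input check: accept fuel enrichment percentages in [0.7, 5.0]."""
    return 0.7 <= pct <= 5.0

# Equivalence class partitioning: one representative value per class
assert valid_enrichment(3.0)       # valid class
assert not valid_enrichment(-1.0)  # invalid class: below the range
assert not valid_enrichment(9.0)   # invalid class: above the range

# Boundary value analysis: probe on and just outside each boundary
for pct, expected in [(0.69, False), (0.7, True), (5.0, True), (5.01, False)]:
    assert valid_enrichment(pct) is expected

print("all partition and boundary cases pass")
```

The point of the two techniques is that a handful of such values exercises the specification as thoroughly as many arbitrary inputs would, which is why they suit codes whose full input space is too large to test exhaustively.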

  14. Defect Prevention Technique in Test Case of Software Process for Quality Improvement

    Directory of Open Access Journals (Sweden)

    Abhiraja Sharma

    2012-01-01

    Full Text Available A test case in software engineering is a set of conditions or variables under which a tester will determine whether an application or software system is working correctly or not. The mechanism for determining whether a software program or system has passed or failed such a test is known as a test oracle. Defect prevention is the most vital but habitually neglected facet of software quality assurance in any project. If practiced at all stages of software development, it can reduce the time, overhead and resources required to engineer a high quality product. The key challenge of an IT industry is to engineer a software product with minimum post-deployment defects. This paper focuses on finding the total number of defects that have occurred in the software development process when test cases show that the software process is not working properly. For three similar projects, it aims at classifying various defects using the first level of Orthogonal Defect Classification (ODC), finding the base causes of the defects, and using the lessons of the projects as preventive ideas. The paper also showcases how the preventive ideas are implemented in a new set of projects, resulting in the reduction of the number of similar

  15. Investigating the Practical Impact of Agile Practices on the Quality of Software Projects in Continuous Delivery

    Directory of Open Access Journals (Sweden)

    Olumide Akerele

    2014-07-01

    Full Text Available Various factors affect the impact of agile practices on the continuous delivery of software projects. This is a major reason why projects perform differently (some failing and some succeeding) when they implement some agile practices in various environments. This is not helped by the fact that many projects work within a limited budget while project plans also change, making them fall under pressure to meet deadlines when they fall behind in their planned work. This study investigates the impact of pair programming, customer involvement, QA ability, pair testing and test-driven development on the pre-release and post-release quality of software projects, using system dynamics within an environment blighted by schedule pressure. The model is validated using results from a completed medium-sized software project. Statistical results suggest that the impact of pair programming is insignificant on the pre-release quality of the software, while test-driven development and customer involvement both have significant effects on pre-release quality. Results also showed that both pair testing and QA ability had a significant impact on the post-release quality of the software.

  16. The AscSimulationMode command

    DEFF Research Database (Denmark)

    Jørgensen, John Leif

    1998-01-01

    Complex instruments like the ASC may be quite difficult to test in closed loops. This problem is augmented by the fact, that no direct stimulation of the CHU is possible that will render the full performance, noise-spectrum and real-timeliness with high fidelity. In order to circumvent this impasse...

  17. Strengthening Web Based Learning through Software Quality Analysis

    Science.gov (United States)

    Montero, Juan Manuel; San Segundo, Ruben; de Cordoba, Ricardo; de La Barcena, Amparo Marin; Zlotnik, Alexander

    The Web is changing the way people access & exchange information. Specifically in the teaching & learning environment, we are witnessing that the traditional model of presence-based magisterial classes is shifting towards Web Based Learning. This new model draws on remote access systems, knowledge sharing, and student mobility. In this context, pedagogical strategies are also changing, and for instance, Project-Based Learning (PBL) is seen as a potential driver for growth and development in this arena. This study is focused on a PBL-oriented course with a Distributed Remote ACcess (DRAC) system. The objective is to analyze how quantitative methods can be leveraged to design and evaluate automatic diagnosis and feedback tools to assist students on quality-related pedagogical issues in DRAC-enabled PBL courses. The main conclusions derived from this study are correlation-based and reveal that the development of automatic quality assessment and feedback requires further research.

  18. A MODEL OF TRANSITION FROM QUALITY MANAGEMENT SYSTEMS TO KNOWLEDGE MANAGEMENT SYSTEMS IN SOFTWARE DEVELOPING ORGANIZATIONS

    OpenAIRE

    Chrabański, Karol

    2013-01-01

    The paper is aimed at presenting a model of transition from quality management systems to knowledge management systems in software developing organizations. The methodology focuses on presenting components of the model of transition from quality management systems to knowledge management systems. The paper defines the model of transition from the quality management systems conformable with series 9000 ISO international standards supplemented with ISO/IEC 90003:2004 to knowledge management sys...

  19. Quality control in diagnostic radiology: software (Visual Basic 6) and database applications

    International Nuclear Information System (INIS)

    A Quality Assurance programme in diagnostic radiology is being implemented by the Ministry of Health (MoH) in Malaysia. Under this programme the performance of an x-ray machine used for diagnostic purposes is tested by using the approved procedure, which is commonly known as quality control in diagnostic radiology. The quality control or performance tests are carried out by a class H licence holder issued under the Atomic Energy Licensing Act 1984. There are a few computer applications (software) available on the market which can be used for this purpose. A computer application using Visual Basic 6 and Microsoft Access is being developed to expedite data handling, analysis and storage as well as report writing for the quality control tests. In this paper important features of the software for quality control tests are explained in brief. A simple database is being established for this purpose, which is linked to the software. Problems encountered in the preparation of the database are discussed in this paper. A few examples of practical usage of the software and database applications are presented in brief. (Author)

  20. Data and Cost handling Techniques for Software Quality Prediction Through Clustering

    Directory of Open Access Journals (Sweden)

    Saifi Bawahir , Mohsin Sheikh

    2012-12-01

    Full Text Available Analysis of data quality is an important issue which has been addressed in data warehousing, data mining and information systems. It has been agreed that poor data quality will impact the quality of the results of analyses, and that it will therefore impact decisions made on the basis of those results. An attempt to improve classification accuracy by pre-clustering did not succeed. However, error rates within clusters from training sets were strongly correlated with error rates within the same clusters on the test sets. This phenomenon could perhaps be used to develop confidence levels for predictions. The main and most common problem that the software industry has to face is the maintenance cost of industrial software systems. One of the main reasons for the high cost of maintenance is the inherent difficulty of understanding software systems that are large, complex, inconsistent and integrated, with differing sizes and levels of arrangement. Decomposing a software system into smaller, more manageable subsystems can aid the process of understanding it significantly. Different algorithms construct different decompositions; therefore, it is important to have methods that evaluate the quality of such automatic decompositions. In this paper we present a brief survey of software quality prediction through clustering.

  1. Quality control of the software in the JT-60 computer control system

    International Nuclear Information System (INIS)

    Improvements of the JT-60 control system are constantly required as the experiments go on. In order to keep the integrity of the whole system in case of modifying the control functions, the idea of quality control has been introduced into the software development. The objective of quality control in the JT-60 control system is to accelerate the software development. The QC activities lay emphasis on making the present status of the software clear and establishing the standard procedure of the software development. The support tools for grasping the present status of the control system have been developed in the general purpose large computer, where the database on the structure of programs and the relations among programs and tables are installed. Document control is also very important. This paper reports these QC activities and their problems for the JT-60 control system. (author)

  2. NEMA NU-1 2007 based and independent quality control software for gamma cameras and SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Vickery, A [Department of Clinical Physiology and Nuclear Medicine, Glostrup Hospital (Denmark); Joergensen, T [Department of Clinical Physiology and Nuclear Medicine, Naestved Hospital (Denmark); De Nijs, R, E-mail: anette@vickery.dk [Department of Clinical Physiology, Nuclear Medicine and PET, Rigshospitalet, Copenhagen University Hospital (Denmark)

    2011-09-23

    A thorough quality assurance of gamma and SPECT cameras requires careful handling of the measured quality control (QC) data. Most gamma camera manufacturers provide users with camera-specific QC software. This QC software is indeed a useful tool for following the day-to-day performance of a single camera. However, when it comes to objective performance comparison of different gamma cameras and a deeper understanding of the calculated numbers, camera-specific QC software without access to the source code is best avoided: calculations and definitions might differ, and manufacturer-independent, standardized results are preferred. Based upon the NEMA Standards Publication NU 1-2007, we have developed a suite of easy-to-use data handling software for processing acquired QC data, providing the user with instructive images and text files with the results.
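As an illustration of the kind of NU 1-style calculation such software standardizes, integral uniformity over a flood-field region is defined as 100 × (max − min)/(max + min). A minimal sketch — the flood matrix is toy data, and the standard's pixel rebinning and 9-point smoothing steps are deliberately omitted:

```python
def integral_uniformity(counts):
    """Integral uniformity in the NEMA NU-1 sense:
    100 * (max - min) / (max + min) over the pixels of a flood-image region.
    Full NU-1 processing also rebins to ~6.4 mm pixels and applies a
    9-point smoothing filter before taking max/min; omitted here."""
    flat = [c for row in counts for c in row]
    hi, lo = max(flat), min(flat)
    return 100.0 * (hi - lo) / (hi + lo)

# Toy 3x3 region of a flood acquisition (counts per pixel):
flood = [[980, 1010, 1005],
         [995, 1000,  990],
         [1012, 985, 1002]]
ui = integral_uniformity(flood)  # about 1.6 %
```

Having the definition in open code is precisely what makes results from different cameras comparable, which is the point the abstract makes against closed vendor software.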

  3. Software for creating quality control database in diagnostic radiology

    International Nuclear Information System (INIS)

    The paper describes a PC-based program with a database for quality control (QC). It keeps information about all surveyed equipment and measured parameters. The first function of the program is to extract information from old (existing) MS Excel spreadsheets with QC surveys. The second function is used for input of measurements, which are automatically organized in MS Excel spreadsheets and built into the database. The spreadsheets are based on the protocols described in the EMERALD Training Scheme. In addition, the program can produce statistics of all measured parameters, both in absolute terms and over time.

  4. Contribution to Quality-driven Evolutionary Software Development process for Service-Oriented Architectures

    OpenAIRE

    Arciniegas Herrera, Jose Luis

    2006-01-01

    The quality of software is a key element for the success of a system. Currently, with the advance of technology, consumers demand more and better services. Models for the development process also have to be adapted to new requirements. This is particularly true in the case of service-oriented systems (the domain of this thesis), where an unpredictable number of users can access one or several services. This work proposes an improvement in the models for the software development proces...

  5. Space Shuttle Program Primary Avionics Software System (PASS) Success Legacy - Quality and Reliability Data

    Science.gov (United States)

    Orr, James K.; Peltier, Daryl

    2010-01-01

    This slide presentation reviews the avionics software system on board the space shuttle, with particular emphasis on its quality and reliability. The Primary Avionics Software System (PASS) provides automatic and fly-by-wire control of critical shuttle systems and executes in redundant computers. Charts show the number of space shuttle flights over time and PASS's development history, and other charts point to the reliability of the system's development. The reliability of the system is also compared to predicted reliability.

  6. Development of NEMA-based Software for Gamma Camera Quality Control

    OpenAIRE

    Rova, Andrew; Celler, Anna; Hamarneh, Ghassan

    2007-01-01

    We have developed a cross-platform software application that implements all of the basic standardized nuclear medicine scintillation camera quality control analyses, thus serving as an independent complement to camera manufacturers’ software. Our application allows direct comparison of data and statistics from different cameras through its ability to uniformly analyze a range of file types. The program has been tested using multiple gamma cameras, and its results agree with comparable analysi...

  7. Measuring the impact of computer resource quality on the software development process and product

    Science.gov (United States)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process was speculated to have measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.

  8. Inter-comparison and Quality Assurance of acquisition and processing software for MUGA studies in Cuba

    International Nuclear Information System (INIS)

    With the purpose of creating the basis for quality control and quality assurance of the acquisition and processing programs for gated cardiac blood-pool (MUGA) studies, we used the VENSTRA cardiac function phantom on 7 cameras (4 SOPHA DSX-1000, 2 GE IMAGAMMA-2001 and 1 SIEMENS HERMES) and made 3 acquisitions for each global left ventricular ejection fraction (LVEF 30%, 60% and 80%) and for each heart rate (HR 40, 80 and 160 beats/min). The planar resolution and the planar uniformity were acceptable for all the equipment. Differences of less than 5% were found between the acquisition and processing programs. To evaluate the processing program without the influence of the acquisition parameters, we used one group of these images as a software phantom and tested the semi-automatic software on all cameras. The semi-automatic protocol showed differences of less than 3% between software packages. The automatic processing software for gated cardiac studies was checked with the COST-B2 software phantom; the difference between the left ventricle ejection fraction calculated by these programs was less than 5%, and the regional wall motion analysis was completely coincident in 93% of the cases. The use of the VENSTRA and COST-B2 phantoms confirms the correct functioning of the acquisition and LVEF calculation software for MUGA studies in 83% of Cuban nuclear medicine centers.
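For reference, the counts-based LVEF that such MUGA processing software computes from end-diastolic (ED) and end-systolic (ES) regions of interest can be sketched as follows; all counts and ROI sizes below are illustrative, not from the study:

```python
def lvef(ed_counts, es_counts, bkg_per_pixel, ed_pixels, es_pixels):
    """Counts-based left ventricular ejection fraction from a MUGA study:
    LVEF (%) = 100 * (ED_net - ES_net) / ED_net,
    where *_net are background-corrected ROI counts."""
    ed_net = ed_counts - bkg_per_pixel * ed_pixels
    es_net = es_counts - bkg_per_pixel * es_pixels
    return 100.0 * (ed_net - es_net) / ed_net

# Hypothetical ROI counts and background estimate:
ef = lvef(ed_counts=48000, es_counts=30000, bkg_per_pixel=20,
          ed_pixels=400, es_pixels=300)   # 40.0 %
```

Because the result is sensitive to the background correction and ROI definition, two programs given identical images can disagree — exactly the discrepancy the phantom-based inter-comparison quantifies.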

  9. Improving Code Quality of the Compact Muon Solenoid Electromagnetic Calorimeter Control Software to Increase System Maintainability

    CERN Multimedia

    Holme, Oliver; Dissertori, Günther; Djambazov, Lubomir; Lustermann, Werner; Zelepoukine, Serguei

    2013-01-01

    The Detector Control System (DCS) software of the Electromagnetic Calorimeter (ECAL) of the Compact Muon Solenoid (CMS) experiment at CERN is designed primarily to enable safe and efficient operation of the detector during Large Hadron Collider (LHC) data-taking periods. Through a manual analysis of the code and the adoption of ConQAT [1], a software quality assessment toolkit, the CMS ECAL DCS team has made significant progress in reducing complexity and improving code quality, with observable results in terms of a reduction in the effort dedicated to software maintenance. This paper explains the methodology followed, including the motivation to adopt ConQAT, the specific details of how this toolkit was used and the outcomes that have been achieved. [1] ConQAT, Continuous Quality Assessment Toolkit; https://www.conqat.org/

  10. Software Quality Validation for Web Applications Developed Using Geographically Distributed Human Resources

    Directory of Open Access Journals (Sweden)

    Mihai GHEORGHE

    2015-01-01

    Full Text Available Developing web applications using geographically distributed team members has seen increased popularity during the last years, mainly because of the rise of Open Source technologies, the fast penetration of the Internet in emerging economies, and the continuous quest for reduced costs, as well as the fast adoption of online platforms and services which successfully address project planning, coordination and other development tasks. This paper identifies general software process stages for both collocated and distributed development and analyses the impact that the use of planning, management and testing online services has on the duration, cost and quality of each stage. Given that quality assurance is one of the most important concerns in Geographically Distributed Software Development (GDSD), the focus is on software quality validation.

  11. An Empirical Study on the Procedure to Derive Software Quality Estimation Models

    Directory of Open Access Journals (Sweden)

    Jie Xu

    2010-09-01

    Full Text Available Software quality assurance has been a heated topic for several decades. If factors that influence software quality can be identified, they may provide more insight for better software development management. More precise quality assurance can be achieved by employing resources according to accurate quality estimation at the early stages of a project. In this paper, a general procedure is proposed to derive software quality estimation models and various techniques are presented to accomplish the tasks in respective steps. Several statistical techniques together with machine learning methods are utilized to verify the effectiveness of software metrics. Moreover, a neuro-fuzzy approach is adopted to improve the accuracy of the estimation model. This procedure is carried out based on data from the ISBSG repository to present its empirical value.

  12. Manual on quality assurance for computer software related to the safety of nuclear power plants

    International Nuclear Information System (INIS)

    The objective of the Manual is to provide guidance in the assurance of quality of specification, design, maintenance and use of computer software related to items and activities important to safety (hereinafter referred to as safety related) in nuclear power plants. This guidance is consistent with, and supplements, the requirements and recommendations of Quality Assurance for Safety in Nuclear Power Plants: A Code of Practice, 50-C-QA, and related Safety Guides on quality assurance for nuclear power plants. Annex A identifies the IAEA documents referenced in the Manual. The Manual is intended to be of use to all those who, in any way, are involved with software for safety related applications for nuclear power plants, including auditors who may be called upon to audit management systems and product software. Figs

  13. Comparative Analysis of Models and Quality Standards of the Software Product

    Directory of Open Access Journals (Sweden)

    Alena González Reyes

    2015-12-01

    Full Text Available Despite the rapid advance of the software industry, there are still deficiencies in the quality of the products developed. There are various models and standards that can provide a basis for assessing the quality of software products. In this sense it is difficult for an organization to identify which is the most appropriate according to its characteristics, since most studies analyze very few models using only a few criteria. This work aims to analyze a set of models and standards aimed at assessing the quality of software products, in order to identify those that have been used or referenced, and the aspects that characterize each. From the literature a set of criteria was derived that formed the basis for the comparative analysis, among which are: quality characteristics and sub-characteristics, structure, purpose, separation of internal and external elements, relationships between quality characteristics, relationships between quality metrics and quality characteristics, type of project to which it applies, classification type and type of quality assessed. As a result, the selection of ISO/IEC 9126 and ISO/IEC 25010 is substantiated as the most comprehensive standards, taking into account the various approaches used in the comparison.

  14. Lightweight and Continuous Architectural Software Quality Assurance Using the aSQA Technique

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Lindstrøm, Bo

    2010-01-01

    In this paper, we present a novel technique for assessing and prioritizing architectural quality in large-scale software development projects. The technique can be applied with relatively little effort by software architects and is thus suited for agile development, in which quality attributes can be assessed and prioritized, e.g., within each development sprint. We outline the processes and metrics embodied in the technique, and report initial experiences on the benefits and liabilities. In conclusion, the technique is considered valuable and a viable tool, and has benefits in an architectural...
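A hypothetical sketch of the kind of lightweight, per-sprint prioritization the aSQA technique enables. The scoring formula below is an assumption made for illustration, not the published aSQA metric:

```python
# Each quality attribute gets a current level, a target level and an
# importance weight (all on small ordinal scales). This scoring rule —
# focus = importance * (target - current) — is invented for illustration.
attributes = {
    "performance":     {"current": 2, "target": 3, "importance": 3},
    "availability":    {"current": 3, "target": 3, "importance": 2},
    "maintainability": {"current": 1, "target": 3, "importance": 2},
}

focus = {name: a["importance"] * (a["target"] - a["current"])
         for name, a in attributes.items()}

# The attribute with the highest focus score is addressed in the next sprint.
next_focus = max(focus, key=focus.get)
```

The appeal of such a scheme is its cost: a handful of ordinal judgments per sprint yields a defensible ordering of architectural work, which matches the "lightweight and continuous" claim in the title.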

  15. Application of Domain Knowledge to Software Quality Assurance

    Science.gov (United States)

    Wild, Christian W.

    1997-01-01

    This work focused on capturing, using, and evolving a qualitative decision support structure across the life cycle of a project. The particular application of this study was towards business process reengineering and the representation of the business process in a set of Business Rules (BR). In this work, we defined a decision model which captured the qualitative decision deliberation process. It represented arguments both for and against proposed alternatives to a problem. It was felt that the subjective nature of many critical business policy decisions required a qualitative modeling approach similar to that of Lee and Mylopoulos. While previous work was limited almost exclusively to the decision capture phase, which occurs early in the project life cycle, we investigated the use of such a model during the later stages as well. One of our significant developments was the use of the decision model during the operational phase of a project. By operational phase, we mean the phase in which the system or set of policies earlier decided upon are deployed and put into practice. By making the decision model available to operational decision makers, they have access to the arguments pro and con for a variety of actions and can thus make a more informed decision which balances the often conflicting criteria by which the value of an action is measured. We also developed the concept of a 'monitored decision', in which metrics of performance were identified during the decision making process and used to evaluate the quality of that decision. It is important to monitor those decisions which seem at highest risk of not meeting their stated objectives. Operational decisions are also potentially high risk decisions. Finally, we investigated the use of performance metrics for monitored decisions and audit logs of operational decisions in order to feed an evolutionary phase of the life cycle. During evolution, decisions are revisited, assumptions verified or refuted

  16. When Spreadsheets Become Software - Quality Control Challenges and Approaches - 13360

    International Nuclear Information System (INIS)

    As part of a preliminary waste acceptance criteria (PWAC) development, several commercial models were employed, including the Hydrologic Evaluation of Landfill Performance model (HELP) [1], the Disposal Unit Source Term - Multiple Species model (DUSTMS) [2], and the Analytical Transient One, Two, and Three-Dimensional model (AT123D) [3]. The results of these models were post-processed in MS Excel spreadsheets to convert the model results to alternate units, compare the groundwater concentrations to the groundwater concentration thresholds, and then to adjust the waste contaminant masses (based on average concentration over the waste volume) as needed in an attempt to achieve groundwater concentrations at the limiting point of assessment that would meet the compliance concentrations while maximizing the potential use of the landfill (i.e., maximizing the volume of projected waste being generated that could be placed in the landfill). During the course of the PWAC calculation development, one of the Microsoft (MS) Excel spreadsheets used to post-process the results of the commercial model packages grew to include more than 575,000 formulas across 18 worksheets. This spreadsheet was used to assess six base scenarios as well as nine uncertainty/sensitivity scenarios. The complexity of the spreadsheet resulted in the need for a rigorous quality control (QC) procedure to verify data entry and confirm the accuracy of formulas. (authors)
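One practical QC tactic for formula-heavy spreadsheets like the one described is independent recomputation: re-derive a sample of cell values outside Excel and compare. The unit conversion below is illustrative (1 pCi = 0.037 Bq, 1000 g = 1 kg); the cell value is a made-up example, not from the PWAC calculation:

```python
# Independently recompute a spreadsheet formula and compare against the
# value read back from the worksheet. Conversion factors: 1 pCi = 0.037 Bq,
# 1000 g per kg. The input and cell value are hypothetical.
def pci_per_g_to_bq_per_kg(pci_per_g):
    return pci_per_g * 0.037 * 1000.0

spreadsheet_cell = 370.0  # value read back from the Excel cell
recomputed = pci_per_g_to_bq_per_kg(10.0)

# Compare with a relative tolerance rather than exact equality,
# since Excel and Python may round differently.
assert abs(recomputed - spreadsheet_cell) < 1e-6 * spreadsheet_cell
```

Spot-checking even a small random sample of the 575,000 formulas this way catches systematic errors (wrong factor, wrong cell reference propagated by fill-down) far more cheaply than inspecting every formula by eye.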

  17. Release Process on Quality Improvement in Open Source Software Project Management

    Directory of Open Access Journals (Sweden)

    S. Chandra Kumar Mangalam

    2012-01-01

    Full Text Available Problem statement: The software industry has changed and developed as a consequence of the impact of Open Source Software (OSS) since the 1990s. Over a period of time, OSS has evolved in an integrated manner and most of the participants in OSS activity are volunteers. Approach: This coordinated form of development has produced a considerable quantity of software, and often the development method has been viewed as an unorganized and unstructured method of development. Few existing studies deal with the Open Source Software phenomenon from a quality perception point of view or examine where enhancements are possible in the development process. Results: The release process in OSS plays a key role in most OSS projects. As this process is related to the evolution of quality software from the community of OSS developers, this research attempts to explore the process practices which are employed by OSS developers and examines the problems associated with the development process. The scope of the study is mainly confined to process management in OSS. "Prototype development and iterative development process" approaches were adopted as a methodology. Conclusion/Recommendations: The major finding and conclusion drawn is a 'lack of coordination among developers' who are geographically isolated. Hence, the study suggests the need for coordination among developers to align their development process with the goal of the software release process.

  18. Relationships among Service Quality, Customer Satisfaction and Customer Perceived Value: Evidence from Iran's Software Industry

    OpenAIRE

    Seyed Mostafa Razavi; Hossein Safari; Hessam Shafie; Kobra khoram

    2012-01-01

    This study sets out to investigate the relationships among service quality, customer perceived value and customer satisfaction in six large software companies in Iran. To this end, after reviewing the related literature, the factors affecting service quality, customer perceived value and customer satisfaction were identified. Then, questionnaires were distributed among the customers of the companies. Next, Factor Analysis and Structural Equation Modelling were used to find the relationships;...

  19. The ASC Sequoia Programming Model

    Energy Technology Data Exchange (ETDEWEB)

    Seager, M

    2008-08-06

    In the late 1980s and early 1990s, Lawrence Livermore National Laboratory was deeply engrossed in determining the next-generation programming model for the Integrated Design Codes (IDC) beyond vectorization for the Cray 1s series of computers. The vector model, developed in the mid-1970s first for the CDC 7600 and later extended from stack-based vector operations to memory-to-memory operations for the Cray 1s, lasted approximately 20 years (see Slide 5). The Cray vector era was deemed an extremely long-lived era, as it allowed vector codes to be developed over time (the Cray 1s were faster in scalar mode than the CDC 7600) with vector unit utilization increasing incrementally. The other attributes of the Cray vector era at LLNL were that we developed, supported and maintained the operating system (LTSS and later NLTSS), communications protocols (LINCS), compilers (Civic Fortran77 and Model), operating system tools (e.g., batch system, job control scripting, loaders, debuggers, editors, graphics utilities, you name it) and math and highly machine-optimized libraries (e.g., SLATEC and STACKLIB). Although LTSS was adopted by Cray for early system generations, they later developed the COS and UNICOS operating systems and environments on their own. In the late 1970s and early 1980s two trends appeared that made the Cray vector programming model (described above, including both the hardware and system software aspects) seem potentially dated and slated for major revision. These trends were the appearance of low-cost CMOS microprocessors and their attendant departmental and mini-computers and, later, workstations and personal computers. With the widespread adoption of Unix in the early 1980s, it appeared that LLNL (and the other DOE labs) would be left out of the mainstream of computing without a rapid transition to these 'Killer Micros' and modern OS and tools environments. The other interesting advance in the period is that systems were being

  20. Reliability of adaptive multivariate software sensors for sewer water quality monitoring

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen

    2015-01-01

    This study investigates the use of a multivariate approach, based on Principal Component Analysis (PCA), as a software sensor for fault detection and reconstruction of missing measurements in on-line monitoring of sewer water quality. The analysis was carried out on a 16-month dataset of five... validation. However, the study also showed a dramatic drop in the predictive capability of the software sensor when used for reconstructing missing values, with performance quickly deteriorating from one week after parameter estimation. The software sensor provided better results when used to estimate pollutants mainly originating from wastewater sources (such as ammonia) than when used for pollutants affected by several processes (such as TSS). Overall, this study provides a first insight into the application of multivariate methods for software sensors, highlighting drawbacks and potential development areas. A...
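The PCA software-sensor idea can be sketched in pure Python on two synthetic, correlated signals; the data and the single-component reconstruction below are illustrative only, while the study applies a full multivariate model to on-line sewer monitoring data:

```python
import math
import random

random.seed(1)
# Synthetic correlated "water quality" signals: sensor2 tracks sensor1.
sensor1 = [random.gauss(10, 2) for _ in range(500)]
sensor2 = [2.0 * s + random.gauss(0, 0.3) for s in sensor1]

n = len(sensor1)
m1 = sum(sensor1) / n
m2 = sum(sensor2) / n
# 2x2 covariance matrix entries.
a = sum((x - m1) ** 2 for x in sensor1) / n
c = sum((y - m2) ** 2 for y in sensor2) / n
b = sum((x - m1) * (y - m2) for x, y in zip(sensor1, sensor2)) / n

# Leading eigenvector of [[a, b], [b, c]] = first principal component.
lam = (a + c) / 2 + math.sqrt(((a - c) / 2) ** 2 + b ** 2)
v1, v2 = b, lam - a  # (unnormalized) PC direction

def reconstruct_sensor2(x1):
    """Software-sensor style reconstruction of a missing sensor2 reading,
    by sliding along the first principal component from the known sensor1."""
    return m2 + (v2 / v1) * (x1 - m1)

estimate = reconstruct_sensor2(12.0)  # expected near 2 * 12 = 24
```

The study's caveat applies directly to this sketch: `m1`, `m2` and the PC direction are frozen at estimation time, so if the underlying correlations drift (as they did within a week in the study), the reconstruction degrades until the model is re-estimated.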

  1. Guidelines to Minimize the Cost of Software Quality in Agile Scrum Process

    OpenAIRE

    Deepa Vijay; Gopinath Ganapathy

    2014-01-01

    This paper presents a case study of Agile Scrum process followed in Retail Domain project. This paper also reveals the impacts of Cost of Software Quality, when agile scrum process is not followed efficiently. While analyzing the case study, the gaps were found and guidelines for process improvements were also suggested in this paper.

  2. Guidelines to minimize cost of software quality in agile scrum process

    OpenAIRE

    Vijay, Deepa; Ganapathy, Gopinath

    2014-01-01

    This paper presents a case study of Agile Scrum process followed in Retail Domain project. This paper also reveals the impacts of Cost of Software Quality, when agile scrum process is not followed efficiently. While analyzing the case study, the gaps were found and guidelines for process improvements were also suggested in this paper.

  3. Software architects’ experiences of quality requirements: what we know and what we do not know?

    NARCIS (Netherlands)

    Daneva, Maya; Buglione, Luigi; Herrmann, Andrea; Doerr, J.; Opdahl, A.

    2013-01-01

    [Context/motivation] Quality requirements (QRs) are a concern of both requirement engineering (RE) specialists and software architects (SAs). However, the majority of empirical studies on QRs take the RE analysts’/clients’ perspectives, and only recently very few included the SAs’ perspective. As a

  4. Guidelines to Minimize the Cost of Software Quality in Agile Scrum Process

    Directory of Open Access Journals (Sweden)

    Deepa Vijay

    2014-06-01

    Full Text Available This paper presents a case study of the Agile Scrum process followed in a Retail Domain project. It also reveals the impacts on the Cost of Software Quality when the Agile Scrum process is not followed efficiently. While analyzing the case study, gaps were found, and guidelines for process improvement are also suggested in this paper.

  5. Software quality assurance and information management, October 1986 to October 1992

    International Nuclear Information System (INIS)

    This report describes the work carried out by Cedar Design Systems Limited under contract PECD 7/9/384. The brief for the contract was initially to provide advice on Software Quality Assurance (SQA) as part of the CEC PACOMA project. This was later extended to include further SQA and information management tasks specific to the HMIP Radioactive Waste Disposal Assessments Research Programme. (Author)

  6. Quality Control in Software Documentation Based on Measurement of Text Comprehension and Text Comprehensibility.

    Science.gov (United States)

    Lehner, Franz

    1993-01-01

    Discusses methods of textual documentation that can be used for software documentation. Highlights include measurement of text comprehensibility; methods for the measurement of documentation quality, including readability and the Cloze Procedure; tools for the measurement of text readability; and the development of the Reading Measurability…

  7. Evaluation of features to support safety and quality in general practice clinical software

    Directory of Open Access Journals (Sweden)

    Schattner Peter

    2011-05-01

    Full Text Available Abstract Background Electronic prescribing is now the norm in many countries. We wished to find out if clinical software systems used by general practitioners in Australia include features (functional capabilities and other characteristics) that facilitate improved patient safety and care, with a focus on quality use of medicines. Methods Seven clinical software systems used in general practice were evaluated. Fifty software features that were previously rated as likely to have a high impact on safety and/or quality of care in general practice were tested and are reported here. Results The range of results for the implementation of 50 features across the 7 clinical software systems was as follows: 17-31 features (34-62%) were fully implemented, 9-13 (18-26%) partially implemented, and 9-20 (18-40%) not implemented. Key findings included: Access to evidence based drug and therapeutic information was limited. Decision support for prescribing was available but varied markedly between systems. During prescribing there was potential for medicine mis-selection in some systems, and linking a medicine with its indication was optional. The definition of 'current medicines' versus 'past medicines' was not always clear. There were limited resources for patients, and some medicines lists for patients were suboptimal. Results were provided to the software vendors, who were keen to improve their systems. Conclusions The clinical systems tested lack some of the features expected to support patient safety and quality of care. Standards and certification for clinical software would ensure that safety features are present and that there is a minimum level of clinical functionality that clinicians could expect to find in any system.

  8. Evaluation of features to support safety and quality in general practice clinical software

    Science.gov (United States)

    2011-01-01

    Background Electronic prescribing is now the norm in many countries. We wished to find out if clinical software systems used by general practitioners in Australia include features (functional capabilities and other characteristics) that facilitate improved patient safety and care, with a focus on quality use of medicines. Methods Seven clinical software systems used in general practice were evaluated. Fifty software features that were previously rated as likely to have a high impact on safety and/or quality of care in general practice were tested and are reported here. Results The range of results for the implementation of 50 features across the 7 clinical software systems was as follows: 17-31 features (34-62%) were fully implemented, 9-13 (18-26%) partially implemented, and 9-20 (18-40%) not implemented. Key findings included: Access to evidence based drug and therapeutic information was limited. Decision support for prescribing was available but varied markedly between systems. During prescribing there was potential for medicine mis-selection in some systems, and linking a medicine with its indication was optional. The definition of 'current medicines' versus 'past medicines' was not always clear. There were limited resources for patients, and some medicines lists for patients were suboptimal. Results were provided to the software vendors, who were keen to improve their systems. Conclusions The clinical systems tested lack some of the features expected to support patient safety and quality of care. Standards and certification for clinical software would ensure that safety features are present and that there is a minimum level of clinical functionality that clinicians could expect to find in any system.

  9. An Introduction to Software Architecture Quality Evaluation

    Institute of Scientific and Technical Information of China (English)

    周欣; 黄璜; 孙家骕; 燕小荣

    2003-01-01

    Software quality is one of the most important characteristics of a software system and impacts the system's effectiveness, cost and efficiency. As is well known, it is better to improve quality as early as possible, which reduces the cost of subsequent development and maintenance. Software architecture is the first activity leading from the problem space to the software solution space; therefore, the decisions made during this process are significant to software quality. Software architecture quality evaluation analyzes and predicts quality at the architecture level, helping make proper architectural decisions and detect deviations during subsequent development. This paper summarizes the research in this area, introducing its purpose, content, state of the art and application status, analyzing the difficulties and discussing future directions.

  10. Quality control of the software in the JT-60 computer control system

    International Nuclear Information System (INIS)

    The JT-60 Control System must be improved continually to meet the experimental requirements. In order to preserve the integrity of the system even under modification, the concept of quality control (QC) was introduced into the software development. Our QC activities were to (1) establish standard procedures for software development, (2) develop support tools for grasping the present status of the program structure, and (3) develop a documentation system and a source program management system. This paper reports these QC activities and their problems for the JT-60 control system. (author)

  11. PROCESS QUALITY ANALYSIS OF PERFECTIVE MAINTAINABILITY FOR COMPONENT-BASED SOFTWARE SYSTEMS USING ASPECT-ORIENTED PROGRAMMING TECHNIQUES

    OpenAIRE

    Jyothi R; Dr. V.K. Agrawal

    2011-01-01

    Maintainability plays a major role in the software development life cycle (SDLC). Once the software product comes out of the SDLC, major cost and effort go into the modification and enhancement of the different components of a component-based software system. This research presents modeling work and prototyping techniques that highlight the importance of process quality analysis for perfective maintainability. This analysis comprises the time, quality and efficiency of the derived ...

  12. A General Approach of Quality Cost Management Suitable for Effective Implementation in Software Systems

    Directory of Open Access Journals (Sweden)

    Stelian BRAD

    2010-01-01

    Full Text Available Investments in quality are best quantified by implementing and managing quality cost systems. A review of various opinions coming from practitioners and researchers about existing quality cost models reveals a set of drawbacks (e.g. too theoretical and too close to ideal cases; too academic, with little practical impact; too personalized to particular business processes, with difficulties in extrapolating to other cases; not comprising all dimensions of a business system). Using concepts and tools from quality management theory and practice and algorithms of innovative problem solving, this paper formulates a novel approach to improve the practical usability, comprehensiveness, flexibility and customizability of a quality cost management system (QCMS) when implementing it in a specific software application. Conclusions arising from the implementation in real industrial cases are also highlighted.

  13. A Framework to Analyze Object-Oriented Software and Quality Assurance

    Directory of Open Access Journals (Sweden)

    Devendrasingh Thakore,

    2013-04-01

    Full Text Available Software quality cannot be improved simply by following industry standards, which require adapting or upgrading standards and models very frequently. Quality Assurance (QA) at the design phase, based on typical design artifacts, reduces the effort needed to fix vulnerabilities, which affects the cost of the product. Different design metrics are available; based on their results, design artifacts can be modified. Modifying or making changes in artifacts is not an easy task, as these artifacts are designed through rigorous study of requirements. The purpose of this research work is to automatically derive software artifacts for the system, from natural language requirement specifications as forward engineering and from source code as re-engineering, and to generate formal model specifications in an exportable form that can be used by a UML-compliant tool to visually represent the model of the system. This research work also assesses these design model artifacts for quality assurance and suggests alternative design options based on the primary constraints given in the requirement specification. Analyzing, extracting and transforming the hidden facts in natural language into a formal model presents many challenges and obstacles. To overcome some of these obstacles in software analysis, there should be some means or technique that aims to generate software artifacts to build formal models such as UML class diagrams. Initially, the proposed technique converts the NL business requirements into a formal intermediate representation to increase the accuracy of the generated artifacts and models. Next, it focuses on identifying the various software artifacts to generate the analysis-phase models. Finally, it provides output in a format understood by a model-visualizing tool. The re-engineering process to recover design-level artifacts and model information about a previous version of a software system from available source code is a very difficult task. Performing this task manually

  14. Users manual for SERI QC software assessing the quality of solar radiation data

    Energy Technology Data Exchange (ETDEWEB)

    None

    1993-12-01

    This manual describes the procedures and software for assessing the quality of solar radiation data. This does not constitute quality control, because quality control must take place during the preparations for data collection (selection, calibration, and installation of instruments), during the measurement process, and during the transmission (if any) and recording of the numerical values. Once the data are recorded, only quality assessment can be performed. If quality assessment is performed in real time or soon after the measurement process is completed, it can provide input to control the quality of future measurements. Furthermore, quality assessment can be used for quality control if data judged to be bad are deleted and/or modified. We do not subscribe to these actions, because the deletion or modification of data destroys information that might be useful to the user. For example, if an instrument has gone through a gradual failure and all of the data that fail quality assessment criteria are deleted or modified, the user of the data may not be able to detect what was happening and will not question the accuracy of other data collected before the instrument completely failed. Therefore, the SERI QC procedures and software do not delete or modify data. Instead, flags are set to inform the user of any departure of the data from expected values. These flags indicate the magnitude and direction of such departures. For the flags to communicate as much information as possible, this manual attempts to identify and explain the probable causes of various flags. However, we cannot overemphasize the following: flags only indicate that data do or do not fall within expected ranges. This does not mean that the data are or are not valid.
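The flag-only philosophy described above, where departures are annotated with magnitude and direction but the data are never altered, can be illustrated with a minimal sketch. The function name, range limits, and readings here are hypothetical, not the actual SERI QC flag encoding:

```python
def flag_value(value, lo, hi):
    """Flag-only quality assessment: annotate departures, never alter the data."""
    if value < lo:
        return (value, "LOW", lo - value)    # magnitude and direction of departure
    if value > hi:
        return (value, "HIGH", value - hi)
    return (value, "OK", 0.0)

# Hypothetical hourly global irradiance readings (W/m^2) and an expected range
readings = [640.0, 1510.0, -12.0, 815.0]
flags = [flag_value(v, 0.0, 1400.0) for v in readings]
```

Note that the original values survive unchanged in every tuple; downstream users decide what to do with flagged data.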

  15. The Data Quality Monitoring Software for the CMS experiment at the LHC

    CERN Document Server

    Borrello, Laura

    2014-01-01

    The Data Quality Monitoring (DQM) Software is a central tool in the CMS experiment. Its flexibility allows for integration in several key environments: Online, for real-time detector monitoring; Offline, for the final, fine-grained Data Certification; Release-Validation, to constantly validate the functionality and performance of the reconstruction software; and in Monte Carlo productions. Since the end of data taking at a center-of-mass energy of 8 TeV, the environment in which the DQM lives has undergone fundamental changes. In turn, the DQM system has made significant upgrades in many areas to respond not only to the changes in infrastructure, but also to the growing specialized needs of the collaboration, with an emphasis on more sophisticated methods for evaluating data quality, as well as advancing the DQM system to provide quality assessments of various Monte Carlo simulations versus data distributions, monitoring changes in physical effects due to modifications of algorithms or framework, and enabling reg...

  16. Specialized software for optimization of the quality control of the mammography units

    International Nuclear Information System (INIS)

    Quality control is essential to ensure the equipment used is reliable and consistent, in order to maintain radiation dose as low as reasonably achievable whilst optimizing image quality and performance in mammography. The effectiveness of mammographic screening is highly dependent on the consistent production of high-quality diagnostic images. Mammography is highly dependent on the equipment status, which requires an effective Quality Control (QC) program to provide tools for continuous assessment of equipment performance as well as storage and analysis of the protocols' data. The objective of this paper is to present specialized software for quality control of mammography units, as a tool providing additional functionality for the optimization of mammography QC data storage and management. The PC program was developed according to the requirements stated in the European protocol for quality control of mammography screening and the data collected as a result of its application in several Bulgarian hospitals. The Structured Analysis method was used to perform a case study, which resulted in the development of specialized software with a database module providing the following functionality: data storage, preliminary data processing and post-processing, manual data entry, data import from XLS format, data export to XLS format, printing, data filters, automated calculation, automated graphical representation, and archiving. The development of specialized QC software with a database for mammography units facilitates QC data storage and handling and minimizes errors. The electronic format for data storage is especially useful for long-term storage and periodic data analysis/access. The integrated data processing functionality and the automated import/export features based on a standard platform increase the compatibility of the data. (authors)

  17. ASC Trilab L2 Codesign Milestone 2015

    Energy Technology Data Exchange (ETDEWEB)

    Trott, Christian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hammond, Simon David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinge, Dennis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lin, Paul T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vaughan, Courtenay T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cook, Jeanine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Edwards, Harold C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rajan, Mahesh [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoekstra, Robert J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    For the FY15 ASC L2 Trilab Codesign milestone, Sandia National Laboratories performed two main studies. The first study investigated three topics (performance, cross-platform portability and programmer productivity) when using OpenMP directives and the RAJA and Kokkos programming models, available from LLNL and SNL respectively. The focus of this first study was the LULESH mini-application developed and maintained by LLNL. In the coming sections of the report the reader will find performance comparisons (and a demonstration of portability) for a variety of mini-application implementations produced during this study with varying levels of optimization. Of note is that the implementations included optimizations across a number of programming models, to help ensure that claims that Kokkos can provide native-class application performance are valid. The second study performed during FY15 is a performance assessment of the MiniAero mini-application developed by Sandia. This mini-application was developed by the SIERRA Thermal-Fluid team at Sandia for the purpose of learning the Kokkos programming model and so is available in only a single implementation. For this report we studied its performance and scaling on a number of machines, with the intent of providing insight into potential performance issues that may be experienced when similar algorithms are deployed on the forthcoming Trinity ASC ATS platform.

  18. STATIC CODE ANALYSIS FOR SOFTWARE QUALITY IMPROVEMENT: A CASE STUDY IN BCI FRAMEWORK DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Indar Sugiarto

    2008-01-01

    Full Text Available This paper shows how a systematic approach to software testing using static code analysis can be used to improve the software quality of a BCI framework. The method is best performed during the development phase of framework programs. In the proposed approach, we evaluate several software metrics based on the principles of object-oriented design. Since such a method depends on the underlying programming language, we describe it in terms of C++, with the Qt platform also currently being used. One of the most important metrics is the so-called software complexity. Applying software complexity calculations using both the McCabe and Halstead methods to the BCI framework, which supports two important types of BCI (SSVEP and P300), we found two classes in the framework that are very complex and prone to violating the cohesion principle of OOP. The other metrics fit the criteria of the proposed framework aspects, such as: MPC less than 20; average complexity around 5; and maximum depth below 10 blocks. Such variables are considered very important when further developing the BCI framework in the future.
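As a rough illustration of the McCabe metric mentioned above, cyclomatic complexity can be approximated as one plus the number of decision points in the code. The sketch below uses Python's `ast` module and a simplified decision-node list; it is not the C++ tooling used in the paper:

```python
import ast

# Simplified set of decision-point node types (an approximation, not the full
# McCabe definition used by production analyzers)
DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source):
    """Rough McCabe number: 1 + count of decision points in the parsed source."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(n, DECISION_NODES) for n in ast.walk(tree))

sample = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    for i in range(3):
        if x > 10:
            break
    return "positive"
"""
cc = cyclomatic_complexity(sample)  # if + elif + for + inner if -> 1 + 4 = 5
```

A quality gate would then compare `cc` for each function against a threshold (the paper's criterion is an average complexity of around 5).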

  19. Demonstrating resilient quality of service in software defined networking

    OpenAIRE

    Sharma, Sachin; Staessens, Dimitri; Colle, Didier; D. De Palma; Goncalves, J; Pickavet, Mario; CORDEIRO, L.; Demeester, Piet

    2014-01-01

    Software Defined Networking (SDN) technologies such as OpenFlow decouple the control plane from forwarding devices and embed it into one or more external entities called controllers. We implemented a framework in OpenFlow through which business customers receive higher Quality of Service (QoS) than best-effort customers in all conditions (e.g. failure conditions). In the demonstration, we stream video clips (business and best-effort customers' traffic) through an emulated OpenFlow topology. During the...

  20. New Software Developments for Quality Mesh Generation and Optimization from Biomedical Imaging Data

    OpenAIRE

    Yu, Zeyun; Wang, Jun; Gao, Zhanheng; Xu, Ming; Hoshijima, Masahiko

    2013-01-01

    In this paper we present a new software toolkit for generating and optimizing surface and volumetric meshes from three-dimensional (3D) biomedical imaging data, targeted at image-based finite element analysis of some biomedical activities in a single material domain. Our toolkit includes a series of geometric processing algorithms including surface re-meshing and quality-guaranteed tetrahedral mesh generation and optimization. All methods described have been encapsulated into a user-friendly ...

  1. Introduction to Software Quality Prediction Technology

    Institute of Scientific and Technical Information of China (English)

    高岩; 杨春晖; 熊婧

    2015-01-01

    Software quality prediction is a method for predicting and controlling software quality at an early stage. Its principle is to build software quality models, using machine learning or statistical methods, from metrics data related to software quality gathered early in software development, and then to obtain quality predictions through analysis and calculation, so as to forecast and warn of potential errors in the software system. This article systematically surveys software quality prediction, covering its concept, model framework, applications, prospects and challenges.
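The prediction workflow this overview describes, building a model from early metrics data and using it to flag likely-defective modules, can be sketched with a hand-rolled logistic regression on synthetic data. All metrics, coefficients, and labels below are fabricated for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400

# Hypothetical per-module metrics: lines of code and cyclomatic complexity
X = np.column_stack([rng.uniform(50, 2000, n),   # LOC
                     rng.uniform(1, 30, n)])     # complexity

# Synthetic "ground truth": defect probability rises with both metrics
logit = 0.002 * X[:, 0] + 0.15 * X[:, 1] - 4.0
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(float)

# Standardize features and fit logistic regression by gradient descent
Xs = (X - X.mean(0)) / X.std(0)
Xb = np.column_stack([np.ones(n), Xs])           # add intercept column
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-Xb @ w))
    w -= 0.1 * Xb.T @ (p - y) / n                # gradient step on log-loss

pred = (Xb @ w) > 0.0                            # predict "defect-prone"
accuracy = (pred == (y == 1)).mean()
```

In practice the model would be trained on historical releases and applied to new modules; the survey's point is that such early warnings let testing effort be focused where errors are most likely.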

  2. Quality in cytopathology: an analysis of the internal quality monitoring indicators of the Instituto Nacional de Câncer

    Directory of Open Access Journals (Sweden)

    Mario Lucio C. Araujo Jr

    2015-04-01

    Full Text Available Introduction: Quality control programs are required to ensure the effectiveness of the Pap smear, which remains a key strategy for the control of cervical cancer worldwide. Objective: This study was based on a retrospective, quantitative analysis of the post-analytical phase indicators from the internal quality monitoring (IQM) program for cytopathology laboratories, namely: positivity rate, atypical squamous cell (ASC)/satisfactory exams ratio, ASC/abnormal test results ratio, ASC/squamous intraepithelial lesion (SIL) ratio, percentage of tests compatible with high-grade squamous intraepithelial lesion (HSIL), and total false negatives. Materials and methods: The information was extracted from the computerized system of the Section for Integrated Technology in Cytopathology (Seção Integrada de Tecnologia em Citopatologia [SITEC]), a reference institution for cancer cytopathology, from July 2013 to June 2014. From a total of 156,888 Pap smears, 157,454 were considered satisfactory for indicator analysis and 566 were excluded because they were considered unsatisfactory and/or rejected for analysis. The data were organized in tables using Microsoft Excel 2010 software and categorized as indicators. Results: The averages for the indicators were: 7.2% for the positivity rate, 56.9 for the ASC/abnormal tests ratio, 4.1 for the ASC/satisfactory tests ratio, 1.4 for the ASC/SIL ratio, 0.6% of tests compatible with HSIL, and a 2.1% false-negative rate. Conclusion: The results show that an internal quality monitoring program is essential to ensure quality in cytopathology laboratories, together with a randomized review of at least 10% of the negative exams, as recommended by the Brazilian Ministry of Health/Instituto Nacional de Câncer (INCA), since it is an effective method, especially for large laboratories.
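The post-analytical indicators listed above are simple ratios over exam counts. A sketch with hypothetical monthly counts (not SITEC's actual data) shows how a laboratory information system might compute them:

```python
# Hypothetical monthly counts from a cytopathology lab information system
satisfactory_exams = 12_000
abnormal_results = 860      # all epithelial abnormalities
asc = 490                   # atypical squamous cells
sil = 350                   # squamous intraepithelial lesions
hsil = 70                   # tests compatible with high-grade SIL

positivity_rate = 100 * abnormal_results / satisfactory_exams   # % of satisfactory exams
asc_satisfactory = 100 * asc / satisfactory_exams               # %
asc_abnormal = 100 * asc / abnormal_results                     # %
asc_sil_ratio = asc / sil                                       # plain ratio
hsil_rate = 100 * hsil / satisfactory_exams                     # %
```

Tracking these values month over month, and comparing them against benchmark ranges like those reported in the study, is what turns raw counts into a quality-monitoring signal.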

  3. Quality assurance applied to mammographic equipments using phantoms and software for its evaluation

    International Nuclear Information System (INIS)

    Image quality assessment in radiographic equipment is a very important item for complete quality control of the radiographic image chain. Periodic evaluation of radiographic image quality must guarantee the constancy of this quality so that a suitable diagnosis can be made. Mammographic phantom images are usually used to study the quality of images obtained by a given mammographic unit. Digital image processing techniques allow an automatic analysis of the phantom image. In this work we apply digital image processing techniques to automatically analyze the image quality of two mammographic phantoms, namely the CIRS SP01 and RACON, under varying conditions of the mammographic equipment. The CIRS SP01 phantom is usually used in analogue mammographic equipment, while the RACON phantom has been specifically developed by the authors for acceptance and constancy tests of image quality in digital radiographic equipment, following the recommendations of international associations. The purpose of this work is to analyze the image quality for both phantoms by means of an automatic software utility. This analysis allows us to study the functioning of the image chain of the mammographic system in an objective way, so that abnormal functioning of the radiographic equipment can be detected.

  4. Increasing quality and managing complexity in neuroinformatics software development with continuous integration

    Directory of Open Access Journals (Sweden)

    Yury V. Zaytsev

    2013-01-01

    Full Text Available High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffers, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.

  5. Specific developed phantoms and software to assess radiological equipment image quality

    Energy Technology Data Exchange (ETDEWEB)

    Verdu, G., E-mail: gverdu@iqn.upv.es [Universidad Politecnica de Valencia (Spain). Dept. de Ingenieria Quimica y Nuclear; Mayo, P., E-mail: p.mayo@titaniast.com [TITANIA Servicios Teconologicos, Valencia (Spain); Rodenas, F., E-mail: frodenas@mat.upv.es [Universidad Politecnica de Valencia (Spain). Dept. de Matematica Aplicada; Campayo, J.M., E-mail: j.campayo@lainsa.com [Logistica y Acondicionamientos Industriales S.A.U (LAINSA), Valencia (Spain)

    2011-07-01

    The use of radiographic phantoms specifically designed to evaluate the operation of radiographic equipment enables objective study of the image quality obtained with that equipment. In digital radiographic equipment, image quality analysis can be automated because image acquisition is possible with different technologies, namely computerized radiography (phosphor plate) and direct radiography (detector). In this work we present an application to automatically assess image quality constancy in the image chain of radiographic equipment. The application comprises radiographic phantoms designed for conventional and dental equipment, together with software specifically developed for the automatic evaluation of phantom image quality. The software is based on digital image processing techniques that allow automatic detection of the different phantom tests using edge detectors, morphological operators, histogram thresholding, etc. The utility developed is sufficiently sensitive to the operating conditions of the radiographic equipment, namely voltage (kV) and charge (mAs). It is a user-friendly programme connected to a database of the hospital or clinic where it is used. After phantom image processing, the user can obtain a report summarizing the state of the imaging system, with acceptance and constancy results. (author)
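As a toy version of the threshold-based detection step mentioned above, the sketch below builds a synthetic phantom image and detects a bright test object by histogram thresholding. The image size, contrast, and threshold rule are arbitrary assumptions, not the actual algorithms in the described software:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic 64x64 "phantom" image: noisy uniform background plus a bright disc insert
yy, xx = np.mgrid[:64, :64]
img = 100.0 + 5.0 * rng.normal(size=(64, 64))
disc = (yy - 32) ** 2 + (xx - 32) ** 2 < 100      # disc of radius 10 at the center
img[disc] += 50.0

# Histogram-threshold detection of the test object: pixels well above the
# image's own statistics are taken as the insert
threshold = img.mean() + 2.0 * img.std()
detected = img > threshold
area_px = int(detected.sum())
```

A constancy check would then compare `area_px` (and similar measurements for each phantom test) against reference values recorded at acceptance, flagging drifts in kV or mAs behaviour.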

  6. A software tool for quality assurance of computed/digital radiography (CR/DR) systems

    Science.gov (United States)

    Desai, Nikunj; Valentino, Daniel J.

    2011-03-01

    The recommended methods to test the performance of computed radiography (CR) systems have been established by the American Association of Physicists in Medicine, Report No. 93, "Acceptance Testing and Quality Control of Photostimulable Storage Phosphor Imaging Systems". The quality assurance tests are categorized by how frequently they need to be performed. Quality assurance of CR systems is the responsibility of the facility that performs the exam and is governed by the state in which the facility is located. For example, the New York State Department of Health has established a guide which lists the tests that a CR facility must perform for quality assurance. This study aims at educating the reader about the new quality assurance requirements defined by the state. It further demonstrates an easy-to-use software tool, henceforth referred to as the Digital Physicist, developed to aid a radiologic facility in conforming to state guidelines and monitoring quality assurance of CR/DR imaging systems. The Digital Physicist provides a vendor-independent procedure for quality assurance of CR/DR systems. Furthermore, it generates a PDF report with a brief description of these tests and the obtained results.

  7. Development of Automatic Quality Check Software in Mailbox Declaration For Nuclear Fuel Fabrication Plants

    International Nuclear Information System (INIS)

    Short Notice Random Inspection (SNRI) is a new IAEA safeguards inspection regime for bulk handling facilities, which utilizes random inspections through a mailbox system. Its main objective is to verify 100% of the flow components of the safeguarded nuclear material at such a facility. To achieve the SNRI objective, the facility's operator is required to provide daily mailbox declarations to the IAEA with information such as the receipt and shipment of nuclear materials. Mailbox declarations are later compared with accounting records so as to examine the accuracy and consistency of the facility operator's declarations at the time of the SNRI. The IAEA has emphasized the importance of accurate mailbox declarations and recommended that the ROK initiate its own independent quality control system in order to improve and maintain its mailbox declarations as part of the SSAC activities. In an effort to improve the transparency of operational activities at fuel fabrication plants and to satisfy the IAEA recommendation, an automatic quality check software application has been developed to improve mailbox declarations at fabrication plants in Korea. The ROK and the IAEA have recognized the importance of providing good-quality mailbox declarations for an effective and efficient SNRI at fuel fabrication plants in Korea. The SRA developed an automatic quality check software program in order to provide an independent QC system for mailbox declarations, as well as to improve their quality. Once the automatic QC system is implemented, it will improve the quality of an operator's mailbox declaration by examining data before it is sent to the IAEA. The QC system will be applied to fuel fabrication plants in the first half of 2014.
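A mailbox-declaration quality check of the kind described, comparing declared flows against accounting records before submission, might look like the following sketch. The field names, tolerance, and values are hypothetical:

```python
# Hypothetical daily mailbox declarations vs. accounting records (kg of uranium)
declarations = {
    "2014-03-01": {"receipt_kgU": 120.0, "shipment_kgU": 80.0},
    "2014-03-02": {"receipt_kgU": 0.0,   "shipment_kgU": 45.5},
}
records = {
    "2014-03-01": {"receipt_kgU": 120.0, "shipment_kgU": 80.0},
    "2014-03-02": {"receipt_kgU": 0.0,   "shipment_kgU": 44.5},
}

def qc_check(decl, rec, tol=0.1):
    """Return (date, field) pairs whose declared flows disagree with the records."""
    issues = []
    for day, flows in decl.items():
        for field, value in flows.items():
            if abs(value - rec.get(day, {}).get(field, 0.0)) > tol:
                issues.append((day, field))
    return issues

problems = qc_check(declarations, records)
```

Running such a check before each daily submission catches inconsistencies on the operator's side, which is exactly the independent-QC role the article describes for the software.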

  8. Quality of kiwis minimally processed and treated with ascorbic acid, citric acid and calcium chloride

    Directory of Open Access Journals (Sweden)

    Ana Vânia Carvalho

    2002-05-01

    Full Text Available Minimally processed fruits and vegetables should present the convenience and quality attributes of fresh products. In this work, the influence of minimal processing and treatment with 1% solutions of ascorbic acid, citric acid and calcium chloride on the quality of kiwi (Actinidia deliciosa cv. Hayward) during refrigerated storage was investigated. Mass loss was minimal over the storage period. Ascorbic acid furnished by the treatment was effectively absorbed by the tissues, keeping vitamin C levels about 25% higher in those fruits than in the other treatments. Microbiological analysis detected molds, yeasts and psychrotrophs only in the citric acid treatment, at 8 and 10 days, respectively. Total and faecal coliforms and mesophiles were not found, indicating that processing was performed under good hygienic conditions. Minimally processed kiwis treated with calcium chloride presented a shelf life of ten days; in the other treatments and in the control, shelf life was six days.

  9. Test Hardware Design for Flightlike Operation of Advanced Stirling Convertors (ASC-E3)

    Science.gov (United States)

    Oriti, Salvatore M.

    2012-01-01

    NASA Glenn Research Center (GRC) has been supporting development of the Advanced Stirling Radioisotope Generator (ASRG) since 2006. A key element of the ASRG project is providing life, reliability, and performance testing of the Advanced Stirling Convertor (ASC). For this purpose, the Thermal Energy Conversion branch at GRC has been conducting extended operation of a multitude of free-piston Stirling convertors. The goal of this effort is to generate long-term performance data (tens of thousands of hours) simultaneously on multiple units to build a life and reliability database. The test hardware for operation of these convertors was designed to permit in-air investigative testing, such as performance mapping over a range of environmental conditions. With this, there was no requirement to accurately emulate the flight hardware. For the upcoming ASC-E3 units, the decision has been made to assemble the convertors into a flight-like configuration. This means the convertors will be arranged in the dual-opposed configuration in a housing that represents the fit, form, and thermal function of the ASRG. The goal of this effort is to enable system level tests that could not be performed with the traditional test hardware at GRC. This offers the opportunity to perform these system-level tests much earlier in the ASRG flight development, as they would normally not be performed until fabrication of the qualification unit. This paper discusses the requirements, process, and results of this flight-like hardware design activity.

  10. Flexible Self-Managing Pipe-line Framework Reducing Development Risk to Improve Software Quality

    Directory of Open Access Journals (Sweden)

    Nitin Deepak

    2015-06-01

    Full Text Available Risk identification and assessment play a vital role in today's software/web application development industry. Many process models describe the development life cycle, but risk assessment at an early stage is still an issue and a significant subject for research. In this paper, an approach based on the MVC architecture embedding the spiral process, verified and validated by the V-shaped model, is proposed. Using this approach, development efficiency increases because working teams are less burdened and stressful maintenance effort is reduced, which lowers risk factors through a well-distributed human effort and improves software quality. The efficiency of the approach is demonstrated by a preliminary experiment.

  11. Software quality assurance in the 1996 performance assessment for the Waste Isolation Pilot Plant

    International Nuclear Information System (INIS)

    The US Department of Energy (DOE) Waste Isolation Pilot Plant (WIPP), located in southeast New Mexico, is a deep geologic repository for the permanent disposal of transuranic waste generated by DOE defense-related activities. Sandia National Laboratories (SNL), in its role as scientific advisor to the DOE, is responsible for evaluating the long-term performance of the WIPP. This risk-based Performance Assessment (PA) is accomplished in part through the use of numerous scientific modeling codes, which rely for some of their inputs on data gathered during characterization of the site. The PA is subject to formal requirements set forth in federal regulations. In particular, the components of the calculation fall under the configuration management and software quality assurance aegis of the American Society of Mechanical Engineers (ASME) Nuclear Quality Assurance (NQA) requirements. This paper describes SNL's implementation of the NQA requirements regarding software quality assurance (SQA). The description of the implementation of SQA for a PA calculation addresses not only the interpretation of the NQA requirements but also roles, deliverables, and the resources necessary for effective implementation. Finally, examples are given which illustrate the effectiveness of SNL's SQA program, followed by a detailed discussion of lessons learned.

  12. SU-E-T-103: Development and Implementation of Web Based Quality Control Software

    Energy Technology Data Exchange (ETDEWEB)

    Studinski, R; Taylor, R; Angers, C; La Russa, D; Clark, B [The Ottawa Hospital Regional Cancer Ctr., Ottawa, ON (Canada)

    2014-06-01

    Purpose: Historically many radiation medicine programs have maintained their Quality Control (QC) test results in paper records or Microsoft Excel worksheets. Both these approaches represent significant logistical challenges, and are not predisposed to data review and approval. It has been our group's aim to develop and implement web based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment quality control program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QATrack+ has 180 tests defined in its database and has collected ∼22 000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and Cyberknife units. Currently at least 5 other centres are known to be running QATrack+ clinically, forming the start of an international user community. Conclusion: QATrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.

  13. SU-E-T-103: Development and Implementation of Web Based Quality Control Software

    International Nuclear Information System (INIS)

    Purpose: Historically many radiation medicine programs have maintained their Quality Control (QC) test results in paper records or Microsoft Excel worksheets. Both these approaches represent significant logistical challenges, and are not predisposed to data review and approval. It has been our group's aim to develop and implement web based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment quality control program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QATrack+ has 180 tests defined in its database and has collected ∼22 000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and Cyberknife units. Currently at least 5 other centres are known to be running QATrack+ clinically, forming the start of an international user community. Conclusion: QATrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/
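
Among the review features described above is automated trending and process-control analysis of test results. The following sketch shows one common process-control rule, flagging results that fall outside ±3σ of a baseline period; it illustrates the concept only and is not the tool's actual algorithm.

```python
# Sketch of a simple process-control check of the kind a QC review
# tool might apply: flag test results outside +/-3 sigma of a
# baseline period. Illustrative only; not QATrack+'s implementation.
import statistics

def out_of_control(baseline, new_results, n_sigma=3.0):
    """Return the subset of new_results outside the control limits."""
    mean = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    lo, hi = mean - n_sigma * sigma, mean + n_sigma * sigma
    return [x for x in new_results if not (lo <= x <= hi)]

# Hypothetical daily output-constancy results (e.g. in cGy).
baseline = [100.1, 99.8, 100.0, 100.2, 99.9, 100.0]
flagged = out_of_control(baseline, [100.1, 99.9, 101.5])
```

A physicist reviewing flagged results, as in the workflow above, can then decide whether the trend warrants machine service before tolerances are exceeded.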

  14. Waste Management facilities cost information: System Cost Model Software Quality Assurance Plan. Revision 2

    International Nuclear Information System (INIS)

    In May of 1994, Lockheed Idaho Technologies Company (LITCO) in Idaho Falls, Idaho and subcontractors developed the System Cost Model (SCM) application. The SCM estimates life-cycle costs of the entire US Department of Energy (DOE) complex for designing; constructing; operating; and decommissioning treatment, storage, and disposal (TSD) facilities for mixed low-level, low-level, transuranic, and mixed transuranic waste. The SCM uses parametric cost functions to estimate life-cycle costs for various treatment, storage, and disposal modules which reflect planned and existing facilities at DOE installations. In addition, the SCM can model new facilities based on capacity needs over the program life cycle. The SCM also provides transportation costs for truck and rail, which include transport of contact-handled, remote-handled, and alpha (transuranic) wastes. The user can provide input data (default data is included in the SCM), including the volume and nature of waste to be managed, the time period over which the waste is to be managed, and the configuration of the waste management complex (i.e., where each installation's generated waste will be treated, stored, and disposed). The SCM then uses parametric cost equations to estimate the costs of pre-operations (design), construction, operations management, and decommissioning of these waste management facilities. For the product to be effective and useful, SCM users must have a high level of confidence in the data generated by the software model. The SCM Software Quality Assurance Plan is part of the overall SCM project management effort to ensure that the SCM is maintained as a quality product and can be relied on to produce viable planning data. This document defines tasks and deliverables to ensure continued product integrity, provide increased confidence in the accuracy of the data generated, and meet LITCO's quality standards during the software maintenance phase. 8 refs., 1 tab
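
A parametric cost function of the kind the SCM relies on typically scales a reference facility cost to a new capacity through a power law (economy of scale), then sums the life-cycle phases. The sketch below shows the general shape of such a calculation; the coefficients, exponent, and phase costs are invented for illustration and are not the SCM's calibrated equations.

```python
# Illustrative parametric life-cycle cost estimate. All numbers are
# hypothetical; the real SCM equations are calibrated to DOE data.

def module_life_cycle_cost(capacity_m3, base_cost, exponent=0.6,
                           reference_capacity_m3=1000.0):
    """Scale a reference facility cost to a new capacity (power law)."""
    return base_cost * (capacity_m3 / reference_capacity_m3) ** exponent

# Life-cycle phases for one hypothetical treatment module (costs in $M
# at the 1000 m3 reference capacity), scaled to a 2000 m3 facility.
phases = {"pre-operations": 5.0, "construction": 40.0,
          "operations": 120.0, "decommissioning": 15.0}
total = sum(module_life_cycle_cost(2000.0, c) for c in phases.values())
```

Because the exponent is below 1, doubling capacity raises total cost by well under a factor of two, which is the economy-of-scale behavior parametric models of this type are built to capture.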

  15. Development of a software tool for the management of quality control in a helical tomotherapy unit

    International Nuclear Information System (INIS)

    The large amount of data and information managed in the quality control testing of external radiotherapy units makes it necessary to use tools that facilitate, on the one hand, the management of measurements and results in real time and, on the other, the tasks of managing, archiving, querying and reporting the stored data. This paper presents an in-house software application used for the integral management of the helical TomoTherapy unit in the aspects related to the roles and responsibilities of hospital Radiophysics. (Author)

  16. Action of exogenous ascorbic acid on the physiological quality of artificially aged cowpea seeds

    Directory of Open Access Journals (Sweden)

    Jean Carlos de Araújo Brilhante

    2013-06-01

    Full Text Available The aim of this study was to evaluate the effects of pre- and post-treatment with ascorbic acid in reducing the damage caused by aging in cowpea seeds. Seeds were aged in an artificial accelerated aging chamber (45°C, 99% relative humidity, in the dark, for 72 h) and subjected to four treatments: T1 – non-aged seeds not treated with ascorbic acid (AsA); T2 – aged seeds not treated with AsA; T3 – seeds treated with 0.85 mM AsA before aging; T4 – seeds treated with 0.85 mM AsA after aging. Aging the seeds for 72 h (T2) caused significant damage to the cell membranes of cowpea seeds, as evidenced by an increase in electrolyte leakage, a higher level of lipid peroxidation and a lower germination percentage compared with the control (T1). T4 results were similar to the control treatment, with lower electrolyte leakage and lipid peroxidation as well as a higher germination percentage compared with T2. The exogenous application of 0.85 mM ascorbic acid to cowpea seeds after artificial aging can mitigate the detrimental effects of aging on membrane integrity and seed physiological quality.

  17. Practical experience with software tools to assess and improve the quality of existing nuclear analysis and safety codes

    International Nuclear Information System (INIS)

    Within the constraints of schedule and budget, software tools and techniques were applied to existing FORTRAN codes determining software quality metrics and improving the code quality. Specifically discussed are INEL experiences in applying pretty printers, cross-reference analyzers, and computer aided software engineering (CASE) tools and techniques. These have provided management with measures of the risk potential for individual program modules so that rational decisions can be made on resource allocation. Selected program modules have been modified to reduce the complexity, achieve higher functional independence, and improve the code vectorization. (orig.)
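
Tools of the kind described above report per-module quality metrics so that managers can rank modules by risk. One of the most common such metrics is cyclomatic complexity. The sketch below estimates it with a crude keyword count over source text (shown for Python-style keywords); real CASE tools parse the code properly, and this is not the INEL tooling.

```python
# Crude cyclomatic-complexity estimate: 1 + number of decision points,
# counted as branching keywords. Illustration only; real metric tools
# build a control-flow graph instead of matching tokens.
import re

DECISION_KEYWORDS = r"\b(if|elif|for|while|and|or|except|case)\b"

def cyclomatic_complexity(source):
    """Return an approximate cyclomatic complexity for source text."""
    return 1 + len(re.findall(DECISION_KEYWORDS, source))

sample = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    for _ in range(3):
        pass
    return "positive"
"""
score = cyclomatic_complexity(sample)
```

A per-module table of such scores is exactly the kind of "risk potential" measure the abstract says management used to allocate resources.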

  18. Development and case study of a science-based software platform to support policy making on air quality

    Institute of Scientific and Technical Information of China (English)

    Yun Zhu; Yahweh Lao; Carey Jang; Chen-Jen Lin; Jia Xing; Shuxiao Wang; Joshua S.Fu

    2015-01-01

    This article describes the development and implementation of a novel software platform that supports real-time, science-based policy making on air quality through a user-friendly interface. The software, RSM-VAT, uses a response surface modeling (RSM) methodology and serves as a visualization and analysis tool (VAT) for three-dimensional air quality data obtained by atmospheric models. The software features a number of powerful and intuitive data visualization functions for illustrating the complex nonlinear relationship between emission reductions and air quality benefits. The case study of the contiguous U.S. demonstrates that the enhanced RSM-VAT is capable of reproducing the air quality model results with Normalized Mean Bias <2% and assisting in air quality policy making in near real time.
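
The Normalized Mean Bias (NMB) reported above is a standard model-evaluation metric: the sum of the prediction errors divided by the sum of the observations. The concentration values below are invented for illustration.

```python
# Normalized Mean Bias: NMB = sum(predicted - observed) / sum(observed).
# A value near zero means the surrogate reproduces the full model well.

def normalized_mean_bias(predicted, observed):
    if len(predicted) != len(observed):
        raise ValueError("series must be the same length")
    return sum(p - o for p, o in zip(predicted, observed)) / sum(observed)

model = [35.0, 42.0, 51.0, 38.0]   # hypothetical full-model ozone, ppb
rsm = [34.5, 43.0, 51.5, 38.5]     # hypothetical response-surface values
nmb = normalized_mean_bias(rsm, model)
```

Here the surrogate's NMB is below the 2% threshold the case study reports, which is the sense in which a response surface can stand in for the full atmospheric model in near real time.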

  19. Quality control of software in dissimilar systems using a common clinical data base

    International Nuclear Information System (INIS)

    For a long time there has been widespread interest in the quality control of diagnostic instrumentation. The increasing dependence on computational systems for clinical results makes it imperative that methods for quality control of diagnostic software be developed. This paper proposes a method based on the use of a collection of patient studies for which the results have been corroborated by independent methods. The data set will be distributed in a format requiring no special handling by the system being tested and will appear identical to studies actually collected by the host system. An example of the use of a preliminary version of the data set for comparison of two systems is shown. The comparison shows that analyses performed on the two systems agree very well and can be reliably compared for follow-up studies of a patient

  20. The role of metrics and measurements in a software intensive total quality management environment

    Science.gov (United States)

    Daniels, Charles B.

    1992-01-01

    Paramax Space Systems began its mission as a member of the Rockwell Space Operations Company (RSOC) team which was the successful bidder on a massive operations consolidation contract for the Mission Operations Directorate (MOD) at JSC. The contract awarded to the team was the Space Transportation System Operations Contract (STSOC). Our initial challenge was to accept responsibility for a very large, highly complex and fragmented collection of software from eleven different contractors and transform it into a coherent, operational baseline. Concurrently, we had to integrate a diverse group of people from eleven different companies into a single, cohesive team. Paramax executives recognized the absolute necessity to develop a business culture based on the concept of employee involvement to execute and improve the complex process of our new environment. Our executives clearly understood that management needed to set the example and lead the way to quality improvement. The total quality management policy and the metrics used in this endeavor are presented.

  1. SU-E-T-27: A Tool for Routine Quality Assurance of Radiotherapy Dose Calculation Software

    International Nuclear Information System (INIS)

    Purpose: Dose calculation software is thoroughly evaluated when it is commissioned; however, evaluation of periodic software updates is typically limited in scope due to staffing constraints and the need to quickly return the treatment planning system to clinical service. We developed a tool for quickly and comprehensively testing and documenting dose calculation software against measured data. Methods: A tool was developed using MatLab (The MathWorks, Natick, MA) for evaluation of dose calculation algorithms against measured data. Inputs to the tool are measured data, reference DICOM RT PLAN files describing the measurements, and dose calculations in DICOM format. The tool consists of a collection of extensible modules that can perform analysis of point dose, depth dose curves, and profiles using dose difference, distance-to-agreement, and the gamma-index. Each module generates a report subsection that is incorporated into a master template, which is converted to final form in portable document format (PDF). Results: After each change to the treatment planning system, a report can be generated in approximately 90 minutes. The tool has been in use for more than 5 years, spanning 5 versions of the eMC and 4 versions of the AAA. We have detected changes to the algorithms that affected clinical practice once during this period. Conclusion: Our tool provides an efficient method for quality assurance of dose calculation software, providing a complete set of tests for an update. Future work includes the addition of plan level tests, allowing incorporation of, for example, the TG-119 test suite for IMRT, and integration with the treatment planning system via an application programming interface. Integration with the planning system will permit fully-automated testing and reporting at scheduled intervals

  2. SU-E-T-27: A Tool for Routine Quality Assurance of Radiotherapy Dose Calculation Software

    Energy Technology Data Exchange (ETDEWEB)

    Popple, R; Cardan, R; Duan, J; Wu, X; Shen, S; Brezovich, I [The University of Alabama at Birmingham, Birmingham, AL (United States)

    2014-06-01

    Purpose: Dose calculation software is thoroughly evaluated when it is commissioned; however, evaluation of periodic software updates is typically limited in scope due to staffing constraints and the need to quickly return the treatment planning system to clinical service. We developed a tool for quickly and comprehensively testing and documenting dose calculation software against measured data. Methods: A tool was developed using MatLab (The MathWorks, Natick, MA) for evaluation of dose calculation algorithms against measured data. Inputs to the tool are measured data, reference DICOM RT PLAN files describing the measurements, and dose calculations in DICOM format. The tool consists of a collection of extensible modules that can perform analysis of point dose, depth dose curves, and profiles using dose difference, distance-to-agreement, and the gamma-index. Each module generates a report subsection that is incorporated into a master template, which is converted to final form in portable document format (PDF). Results: After each change to the treatment planning system, a report can be generated in approximately 90 minutes. The tool has been in use for more than 5 years, spanning 5 versions of the eMC and 4 versions of the AAA. We have detected changes to the algorithms that affected clinical practice once during this period. Conclusion: Our tool provides an efficient method for quality assurance of dose calculation software, providing a complete set of tests for an update. Future work includes the addition of plan level tests, allowing incorporation of, for example, the TG-119 test suite for IMRT, and integration with the treatment planning system via an application programming interface. Integration with the planning system will permit fully-automated testing and reporting at scheduled intervals.
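
The gamma index used by the analysis modules above combines a dose-difference and a distance-to-agreement criterion into a single pass/fail value per point (gamma ≤ 1 is a pass). The sketch below shows a 1-D version with 3%/3 mm criteria; it is a simplification for illustration (clinical tools interpolate and work in 2-D/3-D) and is not the authors' implementation.

```python
# 1-D gamma-index evaluation with dose-difference (3%) and
# distance-to-agreement (3 mm) criteria. Simplified illustration.
import math

def gamma_1d(positions, measured, calculated,
             dose_crit=0.03, dist_crit_mm=3.0):
    """Return the gamma value for each measured point."""
    gammas = []
    for xm, dm in zip(positions, measured):
        best = min(
            math.sqrt(((xc - xm) / dist_crit_mm) ** 2
                      + ((dc - dm) / (dose_crit * dm)) ** 2)
            for xc, dc in zip(positions, calculated))
        gammas.append(best)
    return gammas

pos = [0.0, 1.0, 2.0, 3.0]            # mm
measured = [1.00, 0.98, 0.95, 0.90]   # relative dose (hypothetical)
calculated = [1.00, 0.97, 0.95, 0.91]
g = gamma_1d(pos, measured, calculated)
pass_rate = sum(1 for v in g if v <= 1.0) / len(g)
```

A report module like those described above would typically summarize the per-point gamma values as a pass rate per profile.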

  3. Knowledge work productivity effect on quality of knowledge work in software development process in SME

    Science.gov (United States)

    Yusoff, Mohd Zairol; Mahmuddin, Massudi; Ahmad, Mazida

    2016-08-01

    Knowledge and skill are necessary to develop the capability of knowledge workers. However, there is very little understanding of what the necessary knowledge work (KW) is, and how it influences the quality of knowledge work or knowledge work productivity (KWP) in the software development process, including in small and medium-sized enterprises (SMEs). SMEs constitute a major part of the economy and have been relatively unsuccessful in developing KWP. Accordingly, this paper seeks to explore the dimensions of KWP that affect the quality of KW in the SME environment. First, based on an analysis of the existing literature, the key characteristics of KWP are defined. Second, a conceptual model is proposed, which explores the dimensions of KWP and its quality. This study analyses data collected from 150 respondents (based on [1]) who are involved in SMEs in Malaysia and validates the models using structural equation modeling (SEM). The results provide an analysis of the effect of KWP on the quality of KW and business success, and have significant relevance for both research and practice in SMEs.

  4. Software quality assurance documentation for the release of NUFT 2.0 for HP platforms

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez, Michael W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Johnson, Gary L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Preckshot, Gary G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    1998-08-28

    This document is the Individual Software Plan (ISP) for version 2.0 of the Non-isothermal Unsaturated-saturated Flow and Transport (NUFT) analysis computer program. This document addresses the applicable requirements of LLNL YMP procedure 033-YMP-QP 3.2, Section 4.2.1.1. The purpose of this ISP is to plan and organize the activities required to certify the NUFT code for quality-affecting work involving problems that include cross drift analysis of the Yucca Mountain Repository facility. NUFT is software for the solution of a class of coupled mass and heat transport problems in porous geologic media, including the Yucca Mountain Repository Cross Drift Problem (YMRCDP, also known as the Enhanced Characterization of the Repository Block (ECRB)). Solution of this class of problems requires a suite of multiphase, multi-component models for numerical solution of non-isothermal flow and transport in porous media. NUFT is such a suite, with application to subsurface contaminant transport problems and, in particular, to the hydrology in and about the Yucca Mountain Repository Site. NUFT is acquired software, as defined by 033-YMP-QP 3.2, and a preliminary baseline of source code, electronic documentation, and paper documentation has been established as required by 033-YMP-QP 3.2, Section 4.1. NUFT runs on Sun Unix platforms (Solaris operating system version 5.5) and on HP-UX with operating system version 10.20. The product to be qualified under this ISP is the version running on HP-UX. The HP version will be labeled Version 2.0h; the "h" is included to distinguish the HP version from possible future versions qualified for Sun or other platforms. The scope of the plans and procedures outlined in this ISP is limited to the effort required to qualify NUFT for the class of problems identified in

  5. Artificial Loading of ASC Specks with Cytosolic Antigens.

    Directory of Open Access Journals (Sweden)

    Ali Can Sahillioğlu

    Full Text Available Inflammasome complexes form upon interaction of Nod-Like Receptor (NLR) proteins with pathogen-associated molecular patterns (PAMPs) inside the cytosol. Stimulation of a subset of inflammasome receptors, including NLRP3, NLRC4 and AIM2, triggers formation of the micrometer-sized spherical supramolecular complex called the ASC speck. The ASC speck is thought to be the platform of inflammasome activity, but the reason why a supramolecular complex is preferred over oligomeric platforms remains elusive. We observed that a set of cytosolic proteins, including the model antigen ovalbumin, tend to co-aggregate on the ASC speck. We suggest that co-aggregation of antigenic proteins on the ASC speck during intracellular infection might be instrumental in antigen presentation.

  6. USDA-ASCS 1936-1939 Air Photos

    Data.gov (United States)

    Minnesota Department of Natural Resources — This data set is a digital version of aerial photographs taken during the 1936-1939 time frame for the USDA-ASCS. These photos were originally recorded at a scale...

  7. PROCESS QUALITY ANALYSIS OF PERFECTIVE MAINTAINABILITY FOR COMPONENT-BASED SOFTWARE SYSTEMS USING ASPECT-ORIENTED PROGRAMMING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Jyothi R

    2011-12-01

    Full Text Available Maintainability occupies a major role in the software development life cycle (SDLC). Once the software product comes out of the SDLC, major cost and effort go to the modification/enhancement of the different components in component-based software systems. This research presents modeling work and prototyping techniques that highlight the importance of process quality analysis for perfective maintainability. The analysis comprises the time, quality and efficiency of the derived solution, and is based on the type of modification/enhancement request from the customers of the software product. We propose a mathematical approach for time evaluation, in which the total time is the sum of the response time and the solution production time. Quality analysis is based on qualitative and quantitative approaches. Efficiency analysis requires potential operational performance for component-based software systems, which needs high execution speed for handling complex algorithms and huge volumes of data. For this we propose aspect-oriented programming techniques, which increase development speed and modularity, and provide the best software efficiency and design quality.

  8. MATHEMATICAL MODEL FOR THE SIMULATION OF WATER QUALITY IN RIVERS USING THE VENSIM PLE® SOFTWARE

    Directory of Open Access Journals (Sweden)

    Julio Cesar de S. I. Gonçalves

    2013-06-01

    Full Text Available Mathematical modeling of water quality in rivers is an important tool for the planning and management of water resources. Nevertheless, the available models frequently show structural and functional limitations. With the objective of reducing these drawbacks, a new model has been developed to simulate water quality in rivers under unsteady conditions; this model runs on the Vensim PLE® software and can also be operated under steady-state conditions. The following eighteen water quality variables can be simulated: DO, BODc, organic nitrogen (No), ammonia nitrogen (Na), nitrite (Ni), nitrate (Nn), organic and inorganic phosphorus (Fo and Fi, respectively), inorganic solids (Si), phytoplankton (F), zooplankton (Z), bottom algae (A), detritus (D), total coliforms (TC), alkalinity (Al), total inorganic carbon (TIC), pH, and temperature (T). Methane, as well as the nitrogen and phosphorus compounds present in the aerobic and anaerobic layers of the sediment, can also be simulated. Several scenarios were simulated with the new model and compared against the QUAL2K program and, when possible, analytical solutions. The results obtained with the new model agreed closely with those from the QUAL family and the analytical solutions.
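
At the core of river water-quality models of this kind is the classical DO-BOD coupling (the Streeter-Phelps system), which the eighteen-variable model above extends. The sketch below integrates it with explicit Euler steps; the rate constants and initial values are illustrative, not taken from the paper.

```python
# Minimal Streeter-Phelps sketch: BOD decays at rate kd, the oxygen
# deficit grows with BOD decay and shrinks with reaeration at rate ka.
# Explicit Euler integration; constants are illustrative only.

def simulate_do_sag(bod0=10.0, deficit0=0.0, kd=0.3, ka=0.5,
                    do_sat=9.0, dt=0.01, t_end=10.0):
    """Return lists of time (d), BOD (mg/L) and dissolved oxygen (mg/L)."""
    times, bods, dos = [0.0], [bod0], [do_sat - deficit0]
    bod, deficit, t = bod0, deficit0, 0.0
    while t < t_end:
        dbod = -kd * bod                   # BOD decay
        ddef = kd * bod - ka * deficit     # deoxygenation - reaeration
        bod += dbod * dt
        deficit += ddef * dt
        t += dt
        times.append(t); bods.append(bod); dos.append(do_sat - deficit)
    return times, bods, dos

times, bods, dos = simulate_do_sag()
min_do = min(dos)  # the bottom of the classical DO "sag" curve
```

The minimum of the DO curve (the sag point) is the quantity of most regulatory interest, since it identifies where downstream oxygen stress is worst.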

  9. Importance of Requirements Analysis & Traceability to Improve Software Quality and Reduce Cost and Risk

    Science.gov (United States)

    Kapoor, Manju M.; Mehta, Manju

    2010-01-01

    The goal of this paper is to emphasize the importance of developing complete and unambiguous requirements early in the project cycle (prior to the Preliminary Design Phase). Having a complete set of requirements early in the project cycle allows sufficient time to generate a traceability matrix. Requirements traceability and analysis are the key elements in improving the verification and validation process, and thus overall software quality. Traceability is most beneficial when the system changes: if changes are made to high-level requirements, the corresponding low-level requirements need to be modified. Traceability ensures that requirements are appropriately and efficiently verified at various levels, whereas analysis ensures that a correctly interpreted set of requirements is produced.
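A traceability matrix of the kind the paper advocates can be represented very simply as a mapping from high-level to low-level requirements, with a check that flags gaps in both directions. The requirement identifiers below are hypothetical:

```python
# Minimal sketch (assumed structure) of a requirements traceability check:
# each high-level requirement maps to the low-level requirements that
# implement it; untraced items in either direction indicate a gap.
def check_traceability(trace, high_level, low_level):
    """Return (high-level reqs with no trace, orphan low-level reqs)."""
    covered = {ll for lls in trace.values() for ll in lls}
    untraced_high = sorted(set(high_level) - set(trace))
    orphan_low = sorted(set(low_level) - covered)
    return untraced_high, orphan_low

trace = {"HLR-1": ["LLR-1a", "LLR-1b"], "HLR-2": ["LLR-2a"]}
high = ["HLR-1", "HLR-2", "HLR-3"]
low = ["LLR-1a", "LLR-1b", "LLR-2a", "LLR-9x"]
print(check_traceability(trace, high, low))  # (['HLR-3'], ['LLR-9x'])
```

When a high-level requirement changes, the same mapping gives the set of low-level requirements that must be re-verified.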

  10. A modeling approach to design a software sensor and analyze agronomical features - Application to sap flow and grape quality relationship

    OpenAIRE

    Thébaut, Aurélie; Scholash, Thibault; Charnomordic, Brigitte; Hilgert, Nadine

    2013-01-01

    This work proposes a framework using temporal data and domain knowledge in order to analyze complex agronomical features. The expertise is first formalized in an ontology, under the form of concepts and relationships between them, and then used in conjunction with raw data and mathematical models to design a software sensor. Next the software sensor outputs are put in relation to product quality, assessed by quantitative measurements. This requires the use of advanced data analysis methods, s...

  11. Software project estimation the fundamentals for providing high quality information to decision makers

    CERN Document Server

    Abran, Alain

    2015-01-01

    Software projects are often late and over budget, and this leads to major problems for software customers. Clearly, there is a serious issue in estimating a realistic software project budget. Furthermore, generic estimation models cannot be trusted to provide credible estimates for projects as complex as software projects. This book presents a number of examples using data collected over the years from various organizations building software. It also presents an overview of the not-for-profit organization which collects data on software projects, the International Software Benchmarking Stan

  12. Software development for ACR-approved phantom-based nuclear medicine tomographic image quality control with cross-platform compatibility

    Science.gov (United States)

    Oh, Jungsu S.; Choi, Jae Min; Nam, Ki Pyo; Chae, Sun Young; Ryu, Jin-Sook; Moon, Dae Hyuk; Kim, Jae Seung

    2015-07-01

    Quality control and quality assurance (QC/QA) are two of the most important issues in modern nuclear medicine (NM) imaging, for both clinical practice and academic research. Whereas quantitative QC analysis software is common on modern positron emission tomography (PET) scanners, the QC of gamma cameras and/or single-photon-emission computed tomography (SPECT) scanners has not been sufficiently addressed. Although a thorough standard operating process (SOP) for mechanical and software maintenance may help the QC/QA of a gamma camera and SPECT-computed tomography (CT), no previous study has addressed a unified platform or process to decipher or analyze SPECT phantom images acquired from various scanners, and few approaches have established cross-platform software that enables technologists and physicists to assess SPECT scanners from different manufacturers. To resolve these issues, we have developed Interactive Data Language (IDL)-based in-house software for cross-platform (in terms of both operating systems (OS) and manufacturers) analysis of QC data on an ACR SPECT phantom, which is essential for assessing and assuring the tomographic image quality of SPECT. We applied the software to our routine quarterly QC of ACR SPECT phantom images acquired from a number of platforms (OS/manufacturers). Based on our experience, we suggest that the software can offer a unified platform that allows images acquired from various types of scanners to be analyzed with great precision and accuracy.

  13. Total Quality Maintenance (TQMain) A predictive and proactive maintenance concept for software

    OpenAIRE

    Williamsson, Ia

    2006-01-01

    This thesis describes an investigation of the possibility of applying a maintenance concept originally developed for industry to software maintenance. Today a large number of software development models exist, but few of them treat maintenance as part of the software life cycle. In most cases maintenance is depicted as an activity towards the end of the software life cycle. The high cost ascribed to software maintenance motivates improvements. The maintenance concept TQMain propos...

  14. Evaluation of a software package for automated quality assessment of contrast detail images—comparison with subjective visual assessment

    Science.gov (United States)

    Pascoal, A.; Lawinski, C. P.; Honey, I.; Blake, P.

    2005-12-01

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality such as automated scoring have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance to assess absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and KERMA_detector, which indicates the potential to use the software CDRAD analyser for assessment of relative IQ.

  15. Evaluation of a software package for automated quality assessment of contrast detail images-comparison with subjective visual assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pascoal, A [Medical Engineering and Physics, King's College London, Faraday Building, Denmark Hill, London SE5 8RX (United Kingdom)]; Lawinski, C P [KCARE - King's Centre for Assessment of Radiological Equipment, King's College Hospital, Faraday Building, Denmark Hill, London SE5 8RX (United Kingdom)]; Honey, I [KCARE - King's Centre for Assessment of Radiological Equipment, King's College Hospital, Faraday Building, Denmark Hill, London SE5 8RX (United Kingdom)]; Blake, P [KCARE - King's Centre for Assessment of Radiological Equipment, King's College Hospital, Faraday Building, Denmark Hill, London SE5 8RX (United Kingdom)]

    2005-12-07

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality such as automated scoring have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance to assess absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and KERMA_detector, which indicates the potential to use the software CDRAD analyser for assessment of relative IQ.

  16. NNSA ASC Exascale Environment Planning, Applications Working Group, Report February 2011

    Energy Technology Data Exchange (ETDEWEB)

    Still, C H; Arsenlis, A; Bond, R B; Steinkamp, M J; Swaminarayan, S; Womble, D E; Koniges, A E; Harrison, J R; Chen, J H

    2011-02-25

    The scope of the Apps WG covers three areas of interest: Physics and Engineering Models (PEM), multi-physics Integrated Codes (IC), and Verification and Validation (V&V). Each places different demands on the exascale environment. The exascale challenge will be to provide environments that optimize all three. PEM serve as a test bed for both model development and 'best practices' for IC code development, as well as their use as standalone codes to improve scientific understanding. Rapidly achieving reasonable performance for a small team is the key to maintaining PEM innovation. Thus, the environment must provide the ability to develop portable code at a higher level of abstraction, which can then be tuned, as needed. PEM concentrate their computational footprint in one or a few kernels that must perform efficiently. Their comparative simplicity permits extreme optimization, so the environment must provide the ability to exercise significant control over the lower software and hardware levels. IC serve as the underlying software tools employed for most ASC problems of interest. Often coupling dozens of physics models into very large, very complex applications, ICs are usually the product of hundreds of staff-years of development, with lifetimes measured in decades. Thus, emphasis is placed on portability, maintainability and overall performance, with optimization done on the whole rather than on individual parts. The exascale environment must provide a high-level standardized programming model with effective tools and mechanisms for fault detection and remediation. Finally, V&V addresses the infrastructure and methods to facilitate the assessment of code and model suitability for applications, and uncertainty quantification (UQ) methods for assessment and quantification of margins of uncertainty (QMU). V&V employs both PEM and IC, with somewhat differing goals, i.e., parameter studies and error assessments to determine both the quality of the calculation

  17. Qualidade de kiwis minimamente processados e submetidos a tratamento com ácido ascórbico, ácido cítrico e cloreto de cálcio Quality of kiwis minimally processed and treated with ascorbic acid, citric acid and calcium chloride

    OpenAIRE

    Ana Vânia Carvalho; Luiz Carlos Oliveira Lima

    2002-01-01

    Minimally processed fruits and vegetables must offer the convenience attributes and the quality of the fresh product. The objective of this work was to study the effect of minimal processing of fruits treated with 1% solutions of ascorbic acid, citric acid and calcium chloride, during refrigerated storage, on the quality of kiwi (Actinidia deliciosa cv. Hayward). Mass loss was minimal during the storage period. The ascorbic acid supplied by the treatment was efficiently...

  18. An Evaluation of Output Quality of Machine Translation (Padideh Software vs. Google Translate)

    Directory of Open Access Journals (Sweden)

    Haniyeh Sadeghi Azer

    2015-08-01

    Full Text Available This study aims to evaluate the translation quality of two machine translation systems in translating six different text types from English to Persian. The evaluation was based on criteria proposed by Van Slype (1979); the proposed model is a black-box, comparative and adequacy-oriented evaluation. To conduct the evaluation, a questionnaire was given to end-users to examine whether the machine-generated translations are intelligible and acceptable from their point of view, and which of the translations produced by Padideh software and Google Translate is more acceptable and useful. The findings indicate that the machine-generated translations are intelligible and acceptable for certain text types, and that Google Translate is more acceptable from the end-users' point of view. Keywords: Machine Translation, Machine Translation Evaluation, Translation Quality

  19. Ribosome-associated Asc1/RACK1 is required for endonucleolytic cleavage induced by stalled ribosome at the 3′ end of nonstop mRNA

    Science.gov (United States)

    Ikeuchi, Ken; Inada, Toshifumi

    2016-01-01

    Dom34-Hbs1 stimulates degradation of aberrant mRNAs lacking termination codons by dissociating ribosomes stalled at the 3′ ends, and plays crucial roles in Nonstop Decay (NSD) and No-Go Decay (NGD). In the dom34Δ mutant, nonstop mRNA is degraded by sequential endonucleolytic cleavages induced by a stalled ribosome at the 3′ end. Here, we report that ribosome-associated Asc1/RACK1 is required for the endonucleolytic cleavage of nonstop mRNA by stalled ribosome at the 3′ end of mRNA in dom34Δ mutant cells. Asc1/RACK1 facilitates degradation of truncated GFP-Rz mRNA in the absence of Dom34 and exosome-dependent decay. Asc1/RACK1 is required for the sequential endonucleolytic cleavages by the stalled ribosome in the dom34Δ mutant, depending on its ribosome-binding activity. The levels of peptidyl-tRNA derived from nonstop mRNA were elevated in dom34Δasc1Δ mutant cells, and overproduction of nonstop mRNA inhibited growth of mutant cells. E3 ubiquitin ligase Ltn1 degrades the arrest products from truncated GFP-Rz mRNA in dom34Δ and dom34Δasc1Δ mutant cells, and Asc1/RACK1 represses the levels of substrates for Ltn1-dependent degradation. These indicate that ribosome-associated Asc1/RACK1 facilitates endonucleolytic cleavage of nonstop mRNA by stalled ribosomes and represses the levels of aberrant products even in the absence of Dom34. We propose that Asc1/RACK1 acts as a fail-safe in quality control for nonstop mRNA. PMID:27312062

  20. Development of the quality assessment model of EHR software in family medicine practices: research based on user satisfaction

    Directory of Open Access Journals (Sweden)

    Damir Kralj

    2015-09-01

    Full Text Available Background: Family medicine practices (FMPs) form the basis of the Croatian health care system. Use of electronic health record (EHR) software is mandatory and plays an important role in running these practices, but important functional features remain uneven and largely left to the will of the software developers. Objective: The objective of this study was to develop a novel and comprehensive model for functional evaluation of EHR software in FMPs, based on current world standards, models and projects, as well as on actual user satisfaction and requirements. Methods: Based on previous theoretical and experimental research in this area, we made an initial framework model consisting of six basic categories as the basis for an online survey questionnaire. Family doctors assessed perceived software quality using a five-point Likert-type scale. Using exploratory factor analysis and appropriate statistical methods on the collected data, the final optimal structure of the novel model was formed. Special attention was focused on the validity and quality of the novel model. Results: The online survey collected a total of 384 cases. The obtained results indicate both the quality of the assessed software and the quality in use of the novel model. The intense ergonomic orientation of the novel measurement model was particularly emphasised. Conclusions: The resulting novel model is multiply validated, comprehensive and universal. It could be used to assess the user-perceived quality of almost all forms of ambulatory EHR software and is therefore useful to all stakeholders in this area of health care informatisation.

  1. High-Quality Random Number Generation Software for High-Performance Computing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Random number (RN) generation is the key software component that permits random sampling. Software for parallel RN generation (RNG) should be based on RNGs that are...

  2. Psychophysical evaluation of Catphan®600 CT image quality findings using human observers and software analysis

    International Nuclear Information System (INIS)

    Purpose: A comparison of image quality obtained from human observers and software analysis of CT phantom images. Methods and materials: A Catphan®600 CT QA phantom was scanned for posterior fossa, cerebrum, abdomen and chest protocols on three CT models, as part of a dose optimisation strategy. CT image data sets (n = 24) obtained pre and post optimisation were blindly evaluated by radiographers (n = 8), who identified the number of distinct line pairs and contrast discs for each of the three supra-slice sets within the phantom's high- and low-contrast resolution modules. The same images were also reviewed using the web-based service Image Owl for automatic analysis of Catphan®600 images. Results: Inter-observer reliability measured using Cronbach's α between human observers, and again including software analysis as the 9th observer, gave α = 0.97 in both instances, indicating comparable internal consistency with and without software analysis. Results of a paired-sample t-test showed no significant difference (p ≥ 0.05) between human observers and software analysis in 37.5% of observations for line pairs, and in 37.5%, 12.5% and 50% of observations for the sets of contrast discs representing nominal contrast of 1.0%, 0.5% and 0.3%, respectively. Software analysis findings improved relative to observer readings as contrast levels were reduced. Conclusion: Combined use of human observers and software analysis for evaluation of image quality in CT using phantoms is recommended. However, the sole use of software analysis may provide more detail than that obtained by human observers. Further research to investigate the clinical relevance of such image quality findings is recommended.
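The inter-observer consistency statistic reported here, Cronbach's α, can be computed directly from a cases-by-observers rating matrix. A minimal sketch with made-up ratings (not the study's data), using population variances throughout:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a ratings matrix: rows = cases (images),
    columns = observers. alpha = k/(k-1) * (1 - sum(var_j)/var_total),
    where var_j is each observer's variance across cases and var_total
    is the variance of the per-case total scores."""
    k = len(scores[0])  # number of observers
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Illustrative ratings: 4 images scored by 3 observers.
scores = [[3, 3, 4], [2, 2, 2], [5, 4, 5], [1, 2, 1]]
print(f"alpha = {cronbach_alpha(scores):.3f}")  # alpha = 0.953
```

Values near 1 indicate the observers (or the software treated as an extra observer) rank the images consistently, as in the α = 0.97 reported above.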

  3. Quality Assurance Testing of Version 1.3 of U.S. EPA Benchmark Dose Software (Presentation)

    Science.gov (United States)

    EPA benchmark dose software (BMDS) is used to evaluate chemical dose-response data in support of Agency risk assessments, and must therefore be dependable. Quality assurance testing methods developed for BMDS were designed to assess model dependability with respect to curve-fitt...

  4. Relating Communications Mode Choice and Teamwork Quality: Conversational versus Textual Communication in IT System and Software Development Teams

    Science.gov (United States)

    Smith, James Robert

    2012-01-01

    This cross-sectional study explored how IT system and software development team members communicated in the workplace and whether teams that used more verbal communication (and less text-based communication) experienced higher levels of collaboration as measured using the Teamwork Quality (TWQ) scale. Although computer-mediated communication tools…

  5. Can Web 2.0 and Social Software Help Transform How We Measure Quality in Teaching, Learning, and Research?

    Science.gov (United States)

    Attwell, Graham

    This paper focuses on the issue of quality in teaching, learning, and research. In the second section, the paper looks at the different ways technology is being used to learn and at the changing expectations of learners, leading to pressures for transformations in both pedagogy and institutional structures. The third section proposes a new rhizomatic model of learning. The following section "Quality Frameworks: Perception and Reality" suggests that traditional measures of the quality of teaching, learning, and research have been hijacked by the commodification of education. This is explored further in Section "The Commodification of Education and Its Impact on How We Measure Quality". Section "How will Web 2.0 and Social Software Change our Understandings and Measurement of Quality?" looks at how Web 2.0 and social software can provide opportunities for new ways of measuring the quality of learning, by embedding quality measures within the processes of teaching, learning, and knowledge development. Sections "What is the Purpose of Traditional Assessment Measures?" and "Critiques of Assessment Processes" provide a critique of traditional assessment processes and suggest the need to move from the assessment of learning to assessment for learning. Section "Personal Learning Environments and Assessment for Learning Through Authentic Learning Tasks" looks at how personal learning environments can be used to support authentic learning and assessment for learning. The conclusion suggests that the development of new quality processes will require fundamental rethinking of the purpose and role of universities.

  6. Establishing column batch repeatability according to Quality by Design (QbD) principles using modeling software.

    Science.gov (United States)

    Rácz, Norbert; Kormány, Róbert; Fekete, Jenő; Molnár, Imre

    2015-04-10

    Column technology still needs improvement. To obtain information on batch-to-batch repeatability, intelligent modeling software was applied. Twelve columns from the same production process, but from different batches, were compared in this work. In this paper, the retention parameters of these columns were studied with real-life sample solutes. The following parameters were selected for the measurements: gradient time, temperature and pH. Based on the calculated results, the batch-to-batch repeatability of BEH columns was evaluated. Two parallel measurements on two columns from the same batch were performed to obtain information about the quality of packing. Calculating the average of the individual working points at the highest critical resolution (Rs,crit), it was found that the robustness, calculated with a newly released robustness module, had a success rate >98% among the predicted 3^6 = 729 experiments for all 12 columns. With the help of retention modeling, all substances could be separated, independently of the batch and/or packing, using the same conditions, with high robustness of the experiments. PMID:25703234
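The 3^6 = 729 robustness experiments correspond to a full-factorial screen of six method parameters at low/nominal/high levels. The sketch below enumerates such a grid and counts the runs meeting a resolution criterion; the parameter names, levels, and the surrogate resolution model are invented stand-ins for the software's fitted retention model:

```python
from itertools import product

# Six assumed method parameters, each at (low, nominal, high).
factors = {
    "gradient_time": (18.0, 20.0, 22.0),   # min
    "temperature":   (28.0, 30.0, 32.0),   # deg C
    "pH":            (2.9, 3.0, 3.1),
    "flow_rate":     (0.45, 0.50, 0.55),   # mL/min
    "start_B":       (9.0, 10.0, 11.0),    # %
    "end_B":         (88.0, 90.0, 92.0),   # %
}

def critical_resolution(run):
    # Toy surrogate: Rs,crit degrades with distance from the nominal point.
    penalty = sum(abs(v - levels[1]) / (levels[2] - levels[0])
                  for v, levels in zip(run, factors.values()))
    return 2.2 - 0.3 * penalty

# Full factorial: 3 levels ** 6 factors = 729 virtual experiments.
runs = list(product(*factors.values()))
passed = sum(critical_resolution(r) >= 1.5 for r in runs)
print(len(runs), f"success rate = {100 * passed / len(runs):.1f}%")
```

In the real software the 729 resolutions come from a fitted retention model rather than a surrogate, but the enumeration and pass/fail accounting are the same.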

  7. The Quality Model of Application Software Products for Mobile Phones

    Institute of Scientific and Technical Information of China (English)

    郭文胜

    2014-01-01

    The quality issues of application software products for smartphones have become increasingly prominent in the market and have seriously hampered the development of the mobile internet industry, yet there is as yet no quality definition based on the characteristics of mobile apps. By tailoring and amending the general software quality model in the existing national standards, a new product quality model is proposed that guides application software providers in improving product quality and provides third-party testing laboratories with an inspection standard, thereby effectively protecting end consumers' legitimate rights and interests. The model presented in this paper has been used successfully in a local standard.

  8. Image processing software for industrial integration: Flexible Image Processing System (FIPS) and Integrated Quality Inspection System (IQIS)

    Science.gov (United States)

    Ahlers, Rolf-Juergen; Hou, Marc

    1994-10-01

    FIPS (Flexible Image Processing System) and IQIS (Integrated Quality Inspection System) are two components of a new software package, that enable the user to integrate image processing into the manufacturing environment. FIPS is a user friendly software tool. It consists of different user levels that allow for linkage to CAD and CAM systems as well as the adaptation to sensor and robot environments. IQIS delivers the interfaces for the manufacturing environment such as industrial robots, manufacturing cells, etc. By way of example the different features are described.

  9. 42 CFR 416.164 - Scope of ASC services.

    Science.gov (United States)

    2010-10-01

    ... surgical procedure under § 416.166 include, but are not limited to— (1) Nursing, technician, and related services; (2) Use of the facility where the surgical procedures are performed; (3) Any laboratory testing... (CONTINUED) MEDICARE PROGRAM AMBULATORY SURGICAL SERVICES Coverage, Scope of ASC Services, and...

  10. Evaluating the Performance of Albanian Savings and Credit (ASC Union

    Directory of Open Access Journals (Sweden)

    Jonida Bou Dib (Lekocaj

    2013-04-01

    Full Text Available This research paper aimed to evaluate the role of the ASC Union through three main poles: its performance in relation to outreach, its financial sustainability, and its welfare impact. It was based mainly on a descriptive study and focused on an accurate event, trying to answer questions such as what, where, how, who and when, through the use of different information and already existing theories. Moreover, a triangulated methodology combining interviews, questionnaires and observations was applied in order to analyze the microcredit impacts. From the outreach angle, it was found that the ASC Union's outreach showed an increment over the period of study, with different rates of growth from 2003 to 2010, on average by 14.7%. On the other hand, the operational sustainability measured by return on assets and return on equity showed instability over the period of the study, making the ASC Union's financial sustainability doubtful. In summary, the members confirmed that the ASC Union helped them to improve their activities and income: 87 out of 100 farmers confirmed that their income increased in the last 3 years; 31 out of 100 farmers (members for 8, 9 and 10 years, respectively) stated that using the micro-loan helped them to expand their activities; and 56 farmers confirmed that microcredit helped them not only to improve their income and activity, but also to increase production and expand their activities.

  11. Overview of ASC Capability Computing System Governance Model

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott W. [Los Alamos National Laboratory

    2012-07-11

    This document contains a description of the Advanced Simulation and Computing Program's Capability Computing System Governance Model. Objectives of the Governance Model are to ensure that the capability system resources are allocated on a priority-driven basis according to the Program requirements; and to utilize ASC Capability Systems for the large capability jobs for which they were designed and procured.

  12. Supporting Early Math--Rationales and Requirements for High Quality Software

    Science.gov (United States)

    Haake, Magnus; Husain, Layla; Gulz, Agneta

    2015-01-01

    There is substantial evidence that preschoolers' performance in early math is highly correlated with math performance throughout school, as well as with academic skills in general. One way to help children attain early math skills is by using targeted educational software, and the paper discusses potential gains of using such software to support early math…

  13. Organizational Stresses and Practices Impeding Quality Software Development in Government Procurements

    Science.gov (United States)

    Holcomb, Glenda S.

    2010-01-01

    This qualitative, phenomenological doctoral dissertation research study explored software project team members' perceptions of changing organizational cultures based on management decisions made at project deviation points. The research study provided a view into challenged or failing government software projects through the lived experiences…

  14. Improving the quality of numerical software through user-centered design

    Energy Technology Data Exchange (ETDEWEB)

    Pancake, C. M., Oregon State University

    1998-06-01

    The software interface - whether graphical, command-oriented, menu-driven, or in the form of subroutine calls - shapes the user's perception of what software can do. It also establishes upper bounds on software usability. Numerical software interfaces typically are based on the designer's understanding of how the software should be used. That is a poor foundation for usability, since the features that are "instinctively right" from the developer's perspective are often the very ones that technical programmers find most objectionable or most difficult to learn. This paper discusses how numerical software interfaces can be improved by involving users more actively in design, a process known as user-centered design (UCD). While UCD requires extra organization and effort, it results in much higher levels of usability and can actually reduce software costs. This is true not just for graphical user interfaces, but for all software interfaces. Examples show how UCD improved the usability of a subroutine library, a command language, and an invocation interface.

  15. Influência do uso simultâneo de ácido ascórbico e azodicarbonamida na qualidade do pão francês The influence of simultaneous use of ascorbic acid and azodicarbonamide in the quality of french bread

    OpenAIRE

    Alessandra Santos Lopes; Rita de Cássia Salvucci Celeste Ormenese; Flávio Martins Montenegro; Patrocínio Gonçalves Ferreira Júnior

    2007-01-01

    This work aimed to evaluate the simultaneous use of ascorbic acid and azodicarbonamide in a bakery product, using response surface methodology. The responses of the experimental design (2²) were the specific volume and the total score of the external and internal characteristics of French bread. The action of ascorbic acid in increasing the specific volume of French bread had significant (p < 0.05 or near) linear and quadratic effects. For the azodicarbonam...

  16. Parameter-based estimation of CT dose index and image quality using an in-house android™-based software

    Science.gov (United States)

    Mubarok, S.; Lubis, L. E.; Pawiro, S. A.

    2016-03-01

    Compromise between radiation dose and image quality is essential in the use of CT imaging. The CT dose index (CTDI) is currently the primary dosimetric formalism in CT, while low- and high-contrast resolution are aspects indicating image quality. This study aimed to estimate CTDIvol and image quality measures over a range of exposure parameter variations. CTDI measurements were performed using a PMMA (polymethyl methacrylate) phantom of 16 cm diameter, while the image quality test was conducted using a Catphan® 600 phantom. CTDI measurements were carried out according to the IAEA TRS 457 protocol using axial scan mode, under varied tube voltage, collimation or slice thickness, and tube current. The image quality test was conducted under the same exposure parameters as the CTDI measurements. An Android™-based software application was also a result of this study. The software was designed to estimate CTDIvol with a maximum difference of 8.97% from the actual CTDIvol measurement. Image quality can also be estimated through the CNR parameter, with a maximum difference of 21.65% from the actual CNR measurement.
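The CTDIvol figure that such software estimates follows the standard weighted-CTDI formalism used in IAEA TRS 457: the weighted CTDI combines center and peripheral chamber readings 1:2, and CTDIvol divides by pitch (taken as 1 for axial scans). A minimal sketch with illustrative readings, not the study's data:

```python
def ctdi_w(ctdi100_center, ctdi100_periphery):
    """Weighted CTDI (mGy) from center and peripheral chamber readings
    in a PMMA phantom: CTDIw = (1/3)*center + (2/3)*periphery."""
    return ctdi100_center / 3.0 + 2.0 * ctdi100_periphery / 3.0

def ctdi_vol(ctdi100_center, ctdi100_periphery, pitch=1.0):
    """Volume CTDI: CTDIvol = CTDIw / pitch (pitch = 1 for axial mode)."""
    return ctdi_w(ctdi100_center, ctdi100_periphery) / pitch

# Illustrative chamber readings (mGy), not the study's measurements:
print(round(ctdi_vol(30.0, 33.0), 2))  # 32.0
```

A parameter-based estimator like the one described would replace the measured chamber readings with values predicted from tube voltage, collimation, and tube current, then report the percent difference from measurement.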

  17. Software Testing Techniques and Strategies

    OpenAIRE

    Isha; Sunita Sangwan

    2014-01-01

    Software testing provides a means to reduce errors and cut maintenance and overall software costs. Numerous software development and testing methodologies, tools, and techniques have emerged over the last few decades, promising to enhance software quality. This paper describes software testing, the need for software testing, and software testing goals and principles. Further, it describes different software testing techniques and strategies.

  18. Qualifying codes under software quality assurance: Two examples as guidelines for codes that are existing or under development

    International Nuclear Information System (INIS)

    Software quality assurance is an area of concern for DOE, EPA, and other agencies due to the poor quality of software and documentation they have received in the past. This report briefly summarizes the software development concepts and terminology increasingly employed by these agencies and provides a workable approach to scientific programming under the new requirements. This is followed by a practical description of how to qualify a simulation code, based on a software QA plan that has been reviewed and officially accepted by DOE/OCRWM. Two codes have recently been baselined and qualified, so that they can be officially used for QA Level 1 work under the DOE/OCRWM QA requirements; one of them was baselined and qualified within one week. The first was the multi-phase, multi-component flow code TOUGH version 1, an already existing code; the other was STATEQ, a geochemistry transport code that was under development. The way to accomplish qualification for both types of codes is summarized in an easy-to-follow, step-by-step fashion to illustrate how to baseline and qualify such codes through a relatively painless procedure.

  19. ORGANIZATIONAL LEARNING AND VENDOR SUPPORT QUALITY BY THE USAGE OF APPLICATION SOFTWARE PACKAGES: A STUDY OF ASIAN ENTREPRENEURS

    Institute of Scientific and Technical Information of China (English)

    Nelson Oly NDUBISI; Omprakash K.GUPTA; Samia MASSOUD

    2003-01-01

    In this paper we study how organizational learning impacts organizational behavior, and how vendor support quality enhances product adoption and usage behavior. These constructs were verified using Application Software Packages (ASP) - prewritten, precoded, commercially available sets of programs that eliminate the need for individuals or organizations to write their own software for certain functions. The relationships between ASP usage, usage outcomes, and use processes were also investigated. Two hundred and ninety-five Chinese, Indian, and Malay entrepreneurships were studied. It was found that usage outcome strongly determines usage, while use process has only an indirect relationship (via outcome) with usage. The impact of organizational learning and vendor service quality on usage, usage outcome, and use process was robust. Theoretical and practical implications of the research are discussed.

  20. Validation of quality control tests of a multi leaf collimator using electronic portal image devices and commercial software

    International Nuclear Information System (INIS)

    We describe a daily quality control procedure for the multi-leaf collimator (MLC) based on electronic portal image devices and commercial software. We designed tests that compare portal images of a set of static and dynamic MLC configurations to a set of reference images using commercial portal dosimetry software. Reference images were acquired using the same set of MLC configurations after calibration of the MLC. To assess the sensitivity for detecting MLC underperformance, we modified the MLC configurations by inserting a range of leaf position and speed errors. Distance measurements on portal images correlated with leaf position errors down to 0.1 mm in static MLC configurations. Dose differences between portal images correlated both with speed errors down to 0.5% of the nominal leaf velocities and with leaf position errors down to 0.1 mm in dynamic MLC configurations. The proposed quality control procedure can assess static and dynamic MLC configurations with high sensitivity and reliability. (Author)
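The image-comparison step of such a procedure can be sketched as follows. This is a simplified illustration, not the commercial software's algorithm: portal images are reduced to flat lists of pixel dose values, and the test fails when the mean relative dose difference against the reference exceeds a tolerance (the 1% tolerance here is a hypothetical choice):

```python
def mean_dose_difference(reference, measured):
    """Mean relative dose difference (%) between two portal images,
    given as equal-length lists of pixel dose values."""
    if len(reference) != len(measured):
        raise ValueError("images must have the same size")
    diffs = [abs(m - r) / r for r, m in zip(reference, measured) if r > 0]
    return 100.0 * sum(diffs) / len(diffs)

def mlc_test_passes(reference, measured, tolerance_percent=1.0):
    """Flag the daily MLC test as failed when the mean dose
    difference exceeds the tolerance."""
    return mean_dose_difference(reference, measured) <= tolerance_percent
```

A real implementation would work on 2D images and add distance-based checks for static fields, but the pass/fail logic follows the same pattern.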

  1. Core Community Specifications for Electron Microprobe Operating Systems: Software, Quality Control, and Data Management Issues

    Science.gov (United States)

    Fournelle, John; Carpenter, Paul

    2006-01-01

    Modern electron microprobe systems have become increasingly sophisticated. These systems utilize either UNIX or PC computer systems for measurement, automation, and data reduction. These systems have undergone major improvements in processing, storage, display, and communications, due to increased capabilities of hardware and software. Instrument specifications are typically utilized at the time of purchase and concentrate on hardware performance. The microanalysis community includes analysts, researchers, software developers, and manufacturers, who could benefit from exchange of ideas and the ultimate development of core community specifications (CCS) for hardware and software components of microprobe instrumentation and operating systems.

  2. Climate and product quality in software development teams: assessing the mediating role of problem solving and learning

    OpenAIRE

    Açıkgöz, Atif; Günsel, Ayşe; Kuzey, Cemil

    2015-01-01

    The popularity of new product development has been increasing in knowledge-intensive organizations as a means to manage aggressive competition. Given the criticality of product development to the performance of many organizations, it is important to unveil the mechanisms that support problem solving. In line with the relevant literature, this study examined the relationships among team climate, team problem solving, team learning, and software quality. As well, this study exp...

  3. Assessing software quality at each step of its life-cycle to enhance reliability of control systems

    International Nuclear Information System (INIS)

    A distributed software control system aims to enhance upgradability and reliability by sharing responsibility between several components. The disadvantage is that it becomes harder to detect problems across a significant number of modules. With Kaizen in mind, we have chosen to continuously invest in automation to obtain a complete overview of software quality despite the growth of legacy code. The development process has already been mastered by staging each life-cycle step through a continuous integration server based on JENKINS and MAVEN. We enhanced this process, focusing on 3 objectives: Automatic Test, Static Code Analysis, and Post-Mortem Supervision. Now the build process automatically includes a test section to detect regressions, incorrect behaviour, and integration incompatibility. The in-house TANGOUNIT project addresses the difficulties of testing distributed components such as Tango Devices. In the next step, the programming code has to pass a complete code quality check-up. The SONAR quality server has been integrated into the process to collect each static code analysis and display the hot topics on summary web pages. Finally, the integration of Google BREAKPAD into every TANGO Device gives us essential statistics from crash reports and enables us to replay crash scenarios at any time. We have already gained greater visibility on current developments. Some concrete results will be presented, including reliability enhancement, better management of subcontracted software development, quicker adoption of coding standards by new developers, and understanding of impacts when moving to a new technology. (authors)
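The build-gate idea described here (fail the build on test regressions or excess static-analysis findings) can be sketched generically. This is not the facility's actual JENKINS/SONAR configuration; the result fields and thresholds are hypothetical:

```python
def quality_gate(test_failures, blocker_issues, max_blockers=0):
    """Return (passed, reasons): fail the build on any failing test
    or when static-analysis blocker issues exceed the allowed maximum."""
    reasons = []
    if test_failures > 0:
        reasons.append(f"{test_failures} failing test(s)")
    if blocker_issues > max_blockers:
        reasons.append(f"{blocker_issues} blocker issue(s) (max {max_blockers})")
    return (not reasons, reasons)

# A CI job would call this with the parsed test and analysis reports:
passed, why = quality_gate(test_failures=0, blocker_issues=0)
```

In a real pipeline the two inputs would come from the test runner's report and the static-analysis server's API, but the gating decision reduces to a check like this.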

  4. Quality assurance for CORAL and COOL within the LCG software stack for the LHC experiments

    CERN Document Server

    CERN. Geneva

    2015-01-01

    CORAL and COOL are software packages used by the LHC experiments for managing different categories of physics data using a variety of relational database technologies. The core components are written in C++, but Python bindings are also provided. CORAL is a generic relational access layer, while COOL includes the implementation of a specific relational data model and optimization of SQL queries for "conditions data". The software is the result of more than 10 years of development in collaboration between the IT department and the LHC experiments. The packages are built and released within the LCG software stack, for which automatic nightly builds and release installations are provided by PH-SFT (cmake, jenkins, cdash) for many different platforms, compilers and software version configurations. Test-driven development and functional tests of both C++ and Python components (CppUnit, unittest) have been key elements in the success of the projects. Dedicated test suites have also been prepared to commission and ma...

  5. Developing free software for automatic registration for the quality control of IMRT with movies

    International Nuclear Information System (INIS)

    In this work, as part of the commissioning of e-JMRT, a Monte Carlo calculation network for IMRT planning, software has been developed for the automatic registration of film images against the results of the planning system.

  6. Improving Reliability of Spectrum Analysis for Software Quality Requirements Using TCM

    OpenAIRE

    KAIYA, Haruhiko; Tanigawa, Masaaki; Suzuki, Shunichi; Sato, Tomonori; Osada, Akira; Kaijiri, Kenji

    2010-01-01

    Quality requirements are scattered over a requirements specification, thus it is hard to measure and trace such quality requirements to validate the specification against stakeholders' needs. We proposed a technique called "spectrum analysis for quality requirements", which enables analysts to sort a requirements specification to measure and track quality requirements in the specification. In the same way as a spectrum in optics, a quality spectrum of a specification shows a quantitative feature...

  7. Advanced Stirling Convertor (ASC)--From Technology Development to Future Flight Product

    Science.gov (United States)

    Wong, Wayne A.; Wood, J. Gary; Wilson, Kyle

    2008-01-01

    The Advanced Stirling Convertor (ASC) is being developed by Sunpower Inc. under contract to NASA's Glenn Research Center (GRC), with critical technology support tasks led by GRC. The ASC development, funded by NASA's Science Mission Directorate, started in 2003 as one of 10 competitively awarded contracts that were intended to address the power conversion needs of future Radioisotope Power Systems (RPS). The ASC technology has since evolved through progressive convertor builds and successful testing to demonstrate high conversion efficiency (38 percent), low mass (1.3 kg), hermetic sealing, launch vibration simulation, and EMI characterization, and is undergoing extended operation. The GRC and Sunpower team recently delivered two ASC-E convertors to the Department of Energy (DOE) and Lockheed Martin Space Systems Company for integration onto the Advanced Stirling Radioisotope Generator Engineering Unit (ASRG EU), plus one spare. The design of the next build, called the ASC-E2, has recently been initiated and is based on the heritage ASC-E, with design refinements to increase reliability margins, offer higher temperature operation, and improve performance. The ASC enables an RPS system specific power of about 7 to 8 W/kg. This paper provides a chronology of ASC development to date and summarizes technical achievements, including advancements toward flight implementation of the technology on ASRG by as early as 2013.

  8. Solidification of Spent Ion Exchange Resin Using ASC Cement

    Institute of Scientific and Technical Information of China (English)

    周耀中; 云桂春; 叶裕才

    2002-01-01

    Ion exchange resins (IERs) have been widely used in nuclear facilities. However, the spent radioactive IERs result in major quantities of low and intermediate level radioactive wastes. This article describes a laboratory experimental study on solidification processing of IERs using a new type of cement named ASC cement. The strength of the cementation matrix is in the range of 18-20 MPa (28 d); the loading of the spent IER in the cement-resin matrix is over 45% and leaching rates of 137Cs, 90Sr and 60Co are 7.92×10-5, 5.7×10-6, and 1.19×10-8 cm/d. The results show that ASC cement can be a preferable cementation material for immobilization of radioactive spent IER.

  9. Towards Applying Text Mining Techniques on Software Quality Standards and Models

    OpenAIRE

    Kelemen, Zádor Dániel; Kusters, Rob; Trienekens, Jos; Balla, Katalin

    2013-01-01

    Many quality approaches are described in hundreds of pages of text, and manual processing of this information consumes considerable resources. In this report we present a text mining approach applied to CMMI, a well-known and widely used quality approach. The text mining analysis can provide a quick overview of the scope of a quality approach, and its results could accelerate the understanding and selection of quality approaches.

  10. Models, methods and software tools to evaluate the quality of informational and educational resources

    International Nuclear Information System (INIS)

    The paper studies modern methods and tools for evaluating the quality of data systems, which allow determining the specificity of informational and educational resources (IER). The author has developed a model of IER quality management at all stages of the life cycle and an integrated multi-level hierarchical system of IER quality assessment, taking into account both information properties and targeted resource assignment. The author presents a mathematical and algorithmic justification for solving the problem of IER quality management, and offers a data system to assess IER quality.

  11. High Technology Systems with Low Technology Failures: Some Experiences with Rockets on Software Quality and Integration

    Science.gov (United States)

    Craig, Larry G.

    2010-01-01

    This slide presentation reviews three software failures and how each contributed to or caused the failure of a launch or of a payload's insertion into orbit. To avoid such systematic failures in the future, failure mitigation strategies are suggested.

  12. 76 FR 54800 - International Business Machines (IBM), Software Group Business Unit, Quality Assurance Group, San...

    Science.gov (United States)

    2011-09-02

    ... Employment and Training Administration International Business Machines (IBM), Software Group Business Unit... Application for Reconsideration for the workers and former workers of International Business Machines (IBM... Department's Notice was published in the Federal Register on February 2, 2011 (76 FR 5832). The...

  13. A General Approach of Quality Cost Management Suitable for Effective Implementation in Software Systems

    OpenAIRE

    Stelian BRAD

    2010-01-01

    Investments in quality are best quantified by implementing and managing quality cost systems. A review of various opinions from practitioners and researchers about existing quality cost models reveals a set of drawbacks (e.g. too theoretical and too close to ideal cases; too academic, with little practical impact; too personalized to particular business processes, with difficulties in extrapolating to other cases; not comprising all dimensions of a business system). Using concep...

  14. Advanced Stirling Convertor (ASC) - From Technology Development to Future Flight Product

    Science.gov (United States)

    Wong, Wayne A.; Wood, J. Gary; Wilson, Kyle

    2008-01-01

    The Advanced Stirling Convertor (ASC) is being developed by Sunpower, Inc. under contract to NASA's Glenn Research Center (GRC), with critical technology support tasks led by GRC. The ASC development, funded by NASA's Science Mission Directorate, started in 2003 as one of 10 competitively awarded contracts that were to address future Radioisotope Power System (RPS) advanced power conversion needs. The ASC technology has since evolved through progressive convertor builds and successful testing to demonstrate high conversion efficiency (38%), low mass (1.3 kg), hermetic sealing, launch vibration simulation, and EMI characterization, and is undergoing extended operation. The GRC and Sunpower team recently delivered three ASC-E machines to the Department of Energy (DOE) and Lockheed Martin Space Systems Company: two units for integration onto the Advanced Stirling Radioisotope Generator Engineering Unit (ASRG EU), plus one spare. Design has recently been initiated for the ASC-E2, an evolution of the ASC-E that substitutes higher temperature materials, enabling improved performance and higher reliability margins. This paper summarizes the history and status of the ASC project and discusses plans for this technology, which enables an RPS specific power of 8 W/kg for future NASA missions.

  15. WE-D-9A-06: Open Source Monitor Calibration and Quality Control Software for Enterprise Display Management

    Energy Technology Data Exchange (ETDEWEB)

    Bevins, N; Vanderhoek, M; Lang, S; Flynn, M [Henry Ford Health System, Detroit, MI (United States)

    2014-06-15

    Purpose: Medical display monitor calibration and quality control present challenges to medical physicists. The purpose of this work is to demonstrate and share experiences with an open source package that allows for both initial monitor setup and routine performance evaluation. Methods: A software package, pacsDisplay, has been developed over the last decade to aid in the calibration of all monitors within the radiology group in our health system. The software is used to calibrate monitors to follow the DICOM Grayscale Standard Display Function (GSDF) via lookup tables installed on the workstation. Additional functionality facilitates periodic evaluations of both primary and secondary medical monitors to ensure satisfactory performance. This software is installed on all radiology workstations, and can also be run as a stand-alone tool from a USB disk. Recently, a database has been developed to store and centralize the monitor performance data and to provide long-term trends for compliance with internal standards and various accrediting organizations. Results: Implementation and utilization of pacsDisplay has resulted in improved monitor performance across the health system. Monitor testing is now performed at regular intervals and the software is being used across multiple imaging modalities. Monitor performance characteristics such as maximum and minimum luminance, ambient luminance and illuminance, color tracking, and GSDF conformity are loaded into a centralized database for system performance comparisons. Compliance reports for organizations such as MQSA, ACR, and TJC are generated automatically and stored in the same database. Conclusion: An open source software solution has simplified and improved the standardization of displays within our health system. This work serves as an example method for calibrating and testing monitors within an enterprise health system.
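The DICOM Grayscale Standard Display Function that the calibration lookup tables target is defined in DICOM PS3.14 as a rational polynomial giving luminance as a function of just-noticeable-difference (JND) index. A sketch of that function, using the coefficients as published in PS3.14 (this is the standard's model, not pacsDisplay's implementation):

```python
import math

# DICOM PS3.14 GSDF: luminance (cd/m^2) as a function of
# JND index j, valid for 1 <= j <= 1023.
_A, _B, _C = -1.3011877, -2.5840191e-2, 8.0242636e-2
_D, _E, _F = -1.0320229e-1, 1.3646699e-1, 2.8745620e-2
_G, _H = -2.5468404e-2, -3.1978977e-3
_K, _M = 1.2992634e-4, 1.3635334e-3

def gsdf_luminance(j):
    """Luminance in cd/m^2 at JND index j per the GSDF fit."""
    x = math.log(j)
    num = _A + _C * x + _E * x**2 + _G * x**3 + _M * x**4
    den = 1.0 + _B * x + _D * x**2 + _F * x**3 + _H * x**4 + _K * x**5
    return 10.0 ** (num / den)
```

Calibration software measures a monitor's actual luminance at each driving level and builds a lookup table so that equal steps in driving level produce equal steps in JND index under this function, which is what "following the GSDF" means in the abstract.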

  16. WE-D-9A-06: Open Source Monitor Calibration and Quality Control Software for Enterprise Display Management

    International Nuclear Information System (INIS)

    Purpose: Medical display monitor calibration and quality control present challenges to medical physicists. The purpose of this work is to demonstrate and share experiences with an open source package that allows for both initial monitor setup and routine performance evaluation. Methods: A software package, pacsDisplay, has been developed over the last decade to aid in the calibration of all monitors within the radiology group in our health system. The software is used to calibrate monitors to follow the DICOM Grayscale Standard Display Function (GSDF) via lookup tables installed on the workstation. Additional functionality facilitates periodic evaluations of both primary and secondary medical monitors to ensure satisfactory performance. This software is installed on all radiology workstations, and can also be run as a stand-alone tool from a USB disk. Recently, a database has been developed to store and centralize the monitor performance data and to provide long-term trends for compliance with internal standards and various accrediting organizations. Results: Implementation and utilization of pacsDisplay has resulted in improved monitor performance across the health system. Monitor testing is now performed at regular intervals and the software is being used across multiple imaging modalities. Monitor performance characteristics such as maximum and minimum luminance, ambient luminance and illuminance, color tracking, and GSDF conformity are loaded into a centralized database for system performance comparisons. Compliance reports for organizations such as MQSA, ACR, and TJC are generated automatically and stored in the same database. Conclusion: An open source software solution has simplified and improved the standardization of displays within our health system. This work serves as an example method for calibrating and testing monitors within an enterprise health system

  17. Influência do uso simultâneo de ácido ascórbico e azodicarbonamida na qualidade do pão francês The influence of simultaneous use of ascorbic acid and azodicarbonamide in the quality of french bread

    Directory of Open Access Journals (Sweden)

    Alessandra Santos Lopes

    2007-06-01

    Full Text Available The aim of this work was to evaluate the simultaneous use of ascorbic acid and azodicarbonamide in a bakery product using response surface methodology. The responses of the experimental design (2²) were the specific volume and the total score of the external and internal characteristics of French bread. The action of ascorbic acid on the increase of the specific volume of French bread had significant linear and quadratic effects (p < 0.05 or near values). For azodicarbonamide, the quadratic effect was observed, and there was no interaction effect between the two oxidant agents studied. The application of ascorbic acid and azodicarbonamide at concentrations above 75 mg.kg-1 and 30 mg.kg-1 of wheat flour, respectively, within the levels studied, provided French bread production with a higher specific volume.

  18. Quality processing assurance extension for the MRS quantitation software jMRUI

    Czech Academy of Sciences Publication Activity Database

    Jablonski, Michal; Starčuková, Jana; Starčuk jr., Zenon

    2015-01-01

    Roč. 28, S1 (2015), S519. ISSN 0968-5243. [ESMRMB 2015. Annual Scientific Meeting /32./. 01.09.2015-03.09.2015, Edinburgh] R&D Projects: GA MŠk(CZ) LO1212; GA MŠk ED0017/01/01 Institutional support: RVO:68081731 Keywords : MR Spectroscopy * processing history tracking * software development * jMRUI * SQL Subject RIV: FS - Medical Facilities ; Equipment

  19. Quality Research by Using Performance Evaluation Metrics for Software Systems and Components

    OpenAIRE

    Ion BULIGIU; Georgeta SOAVA

    2006-01-01

    Software performance testing and evaluation have four basic needs: (1) a well-defined performance testing strategy, requirements, and focus; (2) correct and effective performance evaluation models; (3) well-defined performance metrics; and (4) cost-effective performance testing and evaluation tools and techniques. This chapter first introduces a performance test process and discusses performance testing objectives and focus areas. It then summarizes the basic challenges and issues on performance...

  20. An Industrial Study on Building Consensus Around Software Architectures and Quality Attributes

    OpenAIRE

    Svahnberg, Mikael

    2004-01-01

    When creating an architecture for a software system it is important to consider many aspects, and different sides of these aspects, at an early stage, lest they be misunderstood and cause problems at later stages of development. In this paper, we report on an industry study on understanding and selecting between different architecture candidates. The company uses a method that focuses discussions of architecture candidates on where there are disagreements between the participating domain exper...

  1. Application of Regression Analysis in Software Quality Control

    Institute of Scientific and Technical Information of China (English)

    倪德强

    2015-01-01

    Regression analysis is a widely used statistical analysis method; software quality control comprises the reviews, testing, and other activities carried out during software development to ensure the quality of the final software product. Bringing regression analysis into software quality control can effectively uncover the significant factors influencing quality control activities. Attending to and controlling these significant factors can improve the planning and implementation of software quality control activities, thereby ensuring the quality of intermediate and final software deliverables.
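The regression idea can be sketched with a simple ordinary least-squares fit that relates a candidate process factor to a quality outcome; the data below are hypothetical (review effort per module vs. escaped defects), not from the paper:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = a + b*x; returns (a, b, r_squared)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical data: review hours per module vs. escaped defects
hours = [2, 4, 6, 8, 10]
defects = [9, 8, 6, 4, 3]
a, b, r2 = linear_fit(hours, defects)
# b < 0 with high r2 suggests review effort is a significant factor
```

A high coefficient of determination (r²) for a factor is the kind of evidence the abstract describes for deciding which quality control activities deserve attention and adjustment.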

  2. Guidelines for software inspections

    Science.gov (United States)

    1983-01-01

    Quality control inspections are software problem-finding procedures that provide defect removal as well as improvements in software functionality, maintenance, quality, and development and testing methodology. The many side benefits include education, documentation, training, and scheduling.

  3. A feasibility study on HTS SMES applications for power quality enhancement through both software simulations and hardware-based experiments

    Science.gov (United States)

    Kim, A. R.; Kim, J. G.; Kim, S.; Park, M.; Yu, I. K.; Seong, K. C.; Watanabe, K.

    2011-11-01

    Superconducting magnetic energy storage (SMES), which promises an efficiency of more than 95% and fast response, has become a competitive energy storage device. Because of these advantages, SMES can provide benefits as a power quality enhancement device to a utility, especially in connection with renewable energy sources. In this paper, a software simulation and an experiment aimed at power quality enhancement are reported. The reference utility was Ulleung Island in Korea, which had one wind power generation system. The simulation was performed using the Power System Computer Aided Design/Electromagnetic Transients including DC (PSCAD/EMTDC) package, and a power-hardware-in-the-loop simulation (PHILS) was implemented to monitor the operational characteristics of the SMES when connected to the utility. This study provides highly reliable simulation results, and the feasibility of a SMES application is discussed.

  4. GCE Data Toolbox for MATLAB - a software framework for automating environmental data processing, quality control and documentation

    Science.gov (United States)

    Sheldon, W.; Chamblee, J.; Cary, R. H.

    2013-12-01

    Environmental scientists are under increasing pressure from funding agencies and journal publishers to release quality-controlled data in a timely manner, as well as to produce comprehensive metadata for submitting data to long-term archives (e.g. DataONE, Dryad and BCO-DMO). At the same time, the volume of digital data that researchers collect and manage is increasing rapidly due to advances in high frequency electronic data collection from flux towers, instrumented moorings and sensor networks. However, few pre-built software tools are available to meet these data management needs, and those tools that do exist typically focus on part of the data management lifecycle or one class of data. The GCE Data Toolbox has proven to be both a generalized and effective software solution for environmental data management in the Long Term Ecological Research Network (LTER). This open source MATLAB software library, developed by the Georgia Coastal Ecosystems LTER program, integrates metadata capture, creation and management with data processing, quality control and analysis to support the entire data lifecycle. Raw data can be imported directly from common data logger formats (e.g. SeaBird, Campbell Scientific, YSI, Hobo), as well as delimited text files, MATLAB files and relational database queries. Basic metadata are derived from the data source itself (e.g. parsed from file headers) and by value inspection, and then augmented using editable metadata templates containing boilerplate documentation, attribute descriptors, code definitions and quality control rules. Data and metadata content, quality control rules and qualifier flags are then managed together in a robust data structure that supports database functionality and ensures data validity throughout processing. A growing suite of metadata-aware editing, quality control, analysis and synthesis tools are provided with the software to support managing data using graphical forms and command-line functions, as well as
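The metadata-driven quality control described here (editable rules that assign qualifier flags to data values) can be sketched in a few lines. The rule format and flag codes below are illustrative only; the GCE Data Toolbox itself is MATLAB and uses its own rule syntax:

```python
def apply_qc_rules(values, rules):
    """Assign a qualifier flag to each value from a list of
    (predicate, flag) rules; the first matching rule wins, and
    unmatched values pass with an empty flag."""
    flags = []
    for v in values:
        flag = ""
        for predicate, code in rules:
            if predicate(v):
                flag = code
                break
        flags.append(flag)
    return flags

# Illustrative rules for a water-temperature series (deg C)
rules = [
    (lambda v: v is None, "M"),           # missing value
    (lambda v: v < -5 or v > 45, "I"),    # invalid: outside sensor range
    (lambda v: v < 0 or v > 35, "Q"),     # questionable: outside expected range
]
flags = apply_qc_rules([12.3, 60.0, None, 38.5], rules)
# -> ["", "I", "M", "Q"]
```

Keeping the flags alongside the data, as the Toolbox does, lets downstream synthesis steps filter or annotate values without discarding the raw record.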

  5. Simulation of gradient-drift striations on the ASC

    International Nuclear Information System (INIS)

    The evolution of many artificial ionospheric plasma clouds is governed by a simple two-dimensional model consisting of a continuity equation and a variable-coefficient elliptic equation. This type of model applies also to some nonplasma fluid flows. Despite the simplicity of the model, state-of-the-art methods are required to maintain integrity of the solution. These are explained in moderate detail. The one-level striation code used is highly vectorized and achieves in excess of 80% execution efficiency on the ASC. Typical results and timings for the code are given, and areas of current investigations are mentioned. 4 figures, 2 tables

  6. A Software for soil quality conservation at organic waste disposal areas: The case of olive mill and pistachio wastes.

    Science.gov (United States)

    Doula, Maria; Sarris, Apostolos; Papadopoulos, Nikos; Hliaoutakis, Aggelos; Kydonakis, Aris; Argyriou, Lemonia; Theocharopoulos, Sid; Kolovos, Chronis

    2016-04-01

    For the sustainable reuse of organic wastes on agricultural land, apart from extensive evaluation of waste properties and characteristics, it is of significant importance for protecting soil quality to evaluate land suitability and estimate correct application doses prior to waste landspreading. In light of this precondition, software was developed that integrates GIS maps of land suitability for waste reuse (wastewater and solid waste) with an algorithm for estimating waste doses in relation to soil analysis and, in the case of reuse for fertilization, irrigation water quality and plant needs. The EU legislative framework and those of European Member States are also considered for assessing waste suitability for landspreading and for estimating correct doses that will not cause adverse effects on soil or on underground water (e.g. the Nitrates Directive). Two examples of the software's functionality are presented in this study using data collected during two LIFE projects, i.e. Prosodol for landspreading of olive mill wastes and AgroStrat for pistachio wastes.
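The dose-estimation step can be sketched as a simple nitrogen balance. This is an illustrative simplification, not the software's actual algorithm: the 170 kg N/ha/yr ceiling comes from the EU Nitrates Directive, while the waste composition figures are hypothetical:

```python
def max_application_dose(waste_n_fraction, other_n_inputs_kg_ha,
                         n_limit_kg_ha=170.0):
    """Maximum waste dose (t/ha/yr) such that total nitrogen input
    stays within the regulatory ceiling (Nitrates Directive limit
    of 170 kg N/ha/yr by default)."""
    allowed_n = n_limit_kg_ha - other_n_inputs_kg_ha
    if allowed_n <= 0:
        return 0.0
    # waste_n_fraction is kg N per kg waste; 1 t = 1000 kg
    return allowed_n / (waste_n_fraction * 1000.0)

# Hypothetical olive-mill waste with 0.5% N, on a field already
# receiving 50 kg N/ha from fertilizer and irrigation water:
dose = max_application_dose(0.005, 50.0)  # -> 24.0 t/ha/yr
```

The real algorithm additionally accounts for soil analysis and crop demand, but each constraint reduces to a ceiling of this form, and the lowest ceiling governs the recommended dose.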

  7. SOFTWARE METRICS VALIDATION METHODOLOGIES IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    K.P. Srinivasan

    2014-12-01

    Full Text Available In software measurement, assessing the validity of software metrics is a difficult task owing to the lack of both theoretical and empirical methodology [41, 44, 45]. In recent years, a number of researchers have addressed the issue of validating software metrics. At present, software metrics are validated theoretically using properties of measures. Software measurement plays an important role in understanding and controlling software development practices and products; its major requirement is that the measures accurately represent the attributes they purport to quantify, so validation is critical to the success of software measurement. Validation is a collection of analysis and testing activities across the full life cycle that complements the efforts of other quality engineering functions, and it is a critical task in any engineering project. Its objective is to discover defects in a system and to assess whether or not the system is useful and usable in an operational situation. In software engineering, validation is one of the disciplines that help build quality into software; the major objective of the software validation process is to determine that the software performs its intended functions correctly and to provide information about its quality and reliability. This paper discusses the validation methodology, techniques and properties of measures that are used for software metrics validation. In most cases, both theoretical and empirical validations are conducted for software metrics in software engineering [1-50].

  8. A data quality monitoring software framework for the BES Ⅲ experiment

    International Nuclear Information System (INIS)

    Data quality monitoring (DQM) plays an important role in data taking at the BES Ⅲ experiment. DQM is used to monitor detector status and data quality. A DQM framework (DQMF) has been developed to make it possible to reuse the BES Ⅲ offline reconstruction system in the online environment. Within this framework, the DQMF can also simulate a virtual data taking environment, transfer events to the event display, publish histograms to a histogram presenter at a fixed interval, and dump histograms into a ROOT file. The DQMF has been running stably throughout BES Ⅲ data taking. (authors)

  9. Software library of meteorological routines for air quality models; Libreria de software de procedimientos meteorologicos para modelos de dispersion de contaminantes

    Energy Technology Data Exchange (ETDEWEB)

    Galindo Garcia, Ivan Francisco

    1999-04-01

    Air quality models are an essential tool for most air pollution studies. The models require, however, certain meteorological information about the model domain. Some of the required meteorological parameters can be measured directly, but others must be estimated from available measured data. Therefore, a set of procedures, routines and computational programs to obtain all the meteorological and micrometeorological input data is required. The objective of this study is the identification and implementation of several relationships and methods for determining all the meteorological parameters required as input data by US-EPA recommended air pollution models. To accomplish this, a study of air pollution models was conducted, focusing particularly on the models' meteorological input data. The meteorological stations of the Servicio Meteorologico Nacional (SMN) were also analyzed, and the type and quality of the meteorological data they produce were established. The routines and methods developed were based, in particular, on the data produced by SMN stations. The routines were organized in a software library, which allows one to build the specific meteorological processor needed, independently of the model used. The methods were validated against data obtained from an advanced meteorological station owned and operated by the Electrical Research Institute (Instituto de Investigaciones Electricas, IIE). The validation results show that estimating the parameters required by air pollution models from routinely available data from Mexican meteorological stations is feasible and therefore lets us take full advantage of air pollution models. As an application example of the software library developed, the building of a meteorological processor for a specific air pollution model (CALPUFF) is described. The big advantage the library represents is evident from this example.
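One routine such a meteorological library might provide, sketched below, is a daytime Pasquill-Gifford atmospheric stability class estimated from 10 m wind speed and insolation, following the widely used Turner-style table. This is an assumed example of the library's scope, not code from the paper, and the night-time branch of the table (which depends on cloud cover) is omitted for brevity; where the textbook table gives split classes (e.g. A-B), the sketch picks one.

```python
# Daytime Pasquill-Gifford stability class from wind speed (m/s at 10 m)
# and qualitative insolation. Boundaries follow the classic Turner table;
# split classes are collapsed to a single letter for simplicity.

def pasquill_class_daytime(wind_ms, insolation):
    """insolation: 'strong', 'moderate' or 'slight'. Returns 'A'..'D'."""
    table = {
        # (upper wind-speed bound, class) pairs, checked in order
        'strong':   [(2, 'A'), (3, 'A'), (5, 'B'), (6, 'C'), (float('inf'), 'C')],
        'moderate': [(2, 'A'), (3, 'B'), (5, 'B'), (6, 'C'), (float('inf'), 'D')],
        'slight':   [(2, 'B'), (3, 'C'), (5, 'C'), (6, 'D'), (float('inf'), 'D')],
    }
    for upper, cls in table[insolation]:
        if wind_ms < upper:
            return cls
    return 'D'  # near-neutral fallback

print(pasquill_class_daytime(1.0, 'strong'))  # A (light wind, strong sun)
```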

  10. The FRISBEE tool, a software for optimising the trade-off between food quality, energy use, and global warming impact of cold chains

    NARCIS (Netherlands)

    Gwanpua, S.G.; Verboven, P.; Leducq, D.; Brown, T.; Verlinden, B.E.; Bekele, E.; Aregawi, W.; Evans, J.; Foster, A.; Duret, S.; Hoang, H.M.; Sluis, S. van der; Wissink, E.; Hendriksen, L.J.A.M.; Taoukis, P.; Gogou, E.; Stahl, V.; El Jabri, M.; Le Page, J.F.; Claussen, I.; Indergård, E.; Nicolai, B.M.; Alvarez, G.; Geeraerd, A.H.

    2015-01-01

    Food quality (including safety) along the cold chain, energy use and global warming impact of refrigeration systems are three key aspects in assessing cold chain sustainability. In this paper, we present the framework of a dedicated software, the FRISBEE tool, for optimising quality of refrigerated

  11. Record of Assessment Moderation Practice (RAMP): Survey Software as a Mechanism of Continuous Quality Improvement

    Science.gov (United States)

    Johnson, Genevieve Marie

    2015-01-01

    In higher education, assessment integrity is pivotal to student learning and satisfaction, and, therefore, a particularly important target of continuous quality improvement. This paper reports on the preliminary development and application of a process of recording and analysing current assessment moderation practices, with the aim of identifying…

  12. Hardware-software system for the automatic quality evaluation of a welded joint

    International Nuclear Information System (INIS)

    The automatic digital processing of radiographic images has advantages over conventional methods employing x-ray television or x-ray films and an operator as the decision-maker. These advantages are primarily the elimination of subjectivity in the operator evaluation of flaws (so that the result of expert examination is presented to the customer in terms that preclude arbitrary interpretation), the possibilities of using morphological analysis, and the production of a large active data base of flaw images. The authors describe a system that incorporates an optical scanner, a videographic adapter with a graphic monitor, a personal computer, and a software package implementing the functions of automatic analysis of a static-test radiographic image of a welded joint, viz., input and storage of the radiographic image, optimal linear and nonlinear filtering of the primary image to produce a reliable flaw map, and the assignment of weld flaws to four classes: voids, cracks, nonbond flaws, and inclusions. 7 refs
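The last stage described above, turning a flaw map into flaw classes, can be illustrated with a very simplified sketch: label connected components in a binary flaw map, then apply a crude elongation rule to separate crack-like from void-like blobs. The four-class morphological analysis in the paper is far richer; the thresholds and the two-class rule here are invented for illustration.

```python
# Illustrative sketch: connected-component labelling of a binary flaw
# map, followed by a naive shape rule (elongated -> crack-like,
# compact -> void-like). Not the paper's actual classifier.

def label_blobs(mask):
    """mask: 2D list of 0/1 pixels. Returns blobs as lists of (row, col)."""
    h, w = len(mask), len(mask[0])
    seen, blobs = set(), []
    for r in range(h):
        for c in range(w):
            if mask[r][c] and (r, c) not in seen:
                stack, blob = [(r, c)], []
                seen.add((r, c))
                while stack:                      # 4-connected flood fill
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and \
                           mask[ny][nx] and (ny, nx) not in seen:
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                blobs.append(blob)
    return blobs

def classify(blob):
    """Crude rule: bounding box at least 3x longer than wide -> crack-like."""
    rs = [r for r, _ in blob]
    cs = [c for _, c in blob]
    height = max(rs) - min(rs) + 1
    width = max(cs) - min(cs) + 1
    return 'crack-like' if max(height, width) >= 3 * min(height, width) \
           else 'void-like'
```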

  13. Combining a sensor software with statistical analysis for modeling vine water deficit impact on grape quality

    OpenAIRE

    Thébaut, Aurélie; Scholash, Thibault; Charnomordic, Brigitte; Hilgert, Nadine

    2014-01-01

    This work proposes a methodology using temporal data and domain knowledge in ord er to analyze a complex agronomical feature, namely the influence of vine water deficit on grape quality. Raw temporal data are available but they are not direc tly usable to estimate vine water deficit. The methodology associates advanced t echniques in computer science and statistics. A preliminary step is required to determine if the amount of water effectively used by the vine is sufficient or n ot. This step...

  14. A feasibility study on HTS SMES applications for power quality enhancement through both software simulations and hardware-based experiments

    International Nuclear Information System (INIS)

    An SMES system was simulated to improve power quality. The utility has one wind power generator, and the wind speed changes continuously, so the utility frequency fluctuated due to wind speed variation. We built a 10 kJ toroid-type HTS SMES to stabilize the utility frequency and can monitor the operational characteristics of the HTS SMES in a power application. Superconducting magnetic energy storage (SMES), which promises an efficiency of more than 95% and fast response, is becoming a competitive energy storage device. Because of these advantages, SMES can benefit a utility as a power quality enhancement device, especially in connection with renewable energy sources. In this paper, a software simulation and an experiment aimed at power quality enhancement are reported. The reference utility was Ulleung Island in Korea, which had one wind power generation system. The simulation was performed using the power system computer aided design/electromagnetic transients including DC (PSCAD/EMTDC) tool, and a power-hardware-in-the-loop simulation (PHILS) was implemented to monitor the operational characteristics of the SMES when connected to the utility. This study provides highly reliable simulation results, and the feasibility of an SMES application is discussed.
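The 10 kJ rating quoted above follows directly from the basic coil relation E = ½LI². The sketch below works that arithmetic both ways; the 2 H coil inductance is an illustrative assumption, not a value from the paper.

```python
# Stored energy of a superconducting coil: E = 0.5 * L * I^2.
# The inductance used in the example is invented for illustration.
import math

def smes_energy_kj(L_henry, I_amp):
    """Stored magnetic energy in kJ."""
    return 0.5 * L_henry * I_amp ** 2 / 1e3

def required_current(L_henry, E_kj):
    """Operating current (A) needed to store E_kj in a coil of L_henry."""
    return math.sqrt(2 * E_kj * 1e3 / L_henry)

I = required_current(2.0, 10.0)          # 100 A for a hypothetical 2 H coil
print(round(smes_energy_kj(2.0, I), 3))  # 10.0 (kJ), recovering the rating
```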

  15. A feasibility study on HTS SMES applications for power quality enhancement through both software simulations and hardware-based experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kim, A.R.; Kim, J.G.; Kim, S. [Changwon National University, 9 Sarim-Dong, Changwon 641-773 (Korea, Republic of); Park, M., E-mail: paku@changwon.ac.kr [Changwon National University, 9 Sarim-Dong, Changwon 641-773 (Korea, Republic of); Yu, I.K. [Changwon National University, 9 Sarim-Dong, Changwon 641-773 (Korea, Republic of); Seong, K.C. [Superconducting Device and Cryogenics Group, Korea Electrotechnology Research Institute, Changwon 641-120 (Korea, Republic of); Watanabe, K. [HFLSM, Institute for Materials Research, Tohoku University, Sendai 980-8577 (Japan)

    2011-11-15

    An SMES system was simulated to improve power quality. The utility has one wind power generator, and the wind speed changes continuously, so the utility frequency fluctuated due to wind speed variation. We built a 10 kJ toroid-type HTS SMES to stabilize the utility frequency and can monitor the operational characteristics of the HTS SMES in a power application. Superconducting magnetic energy storage (SMES), which promises an efficiency of more than 95% and fast response, is becoming a competitive energy storage device. Because of these advantages, SMES can benefit a utility as a power quality enhancement device, especially in connection with renewable energy sources. In this paper, a software simulation and an experiment aimed at power quality enhancement are reported. The reference utility was Ulleung Island in Korea, which had one wind power generation system. The simulation was performed using the power system computer aided design/electromagnetic transients including DC (PSCAD/EMTDC) tool, and a power-hardware-in-the-loop simulation (PHILS) was implemented to monitor the operational characteristics of the SMES when connected to the utility. This study provides highly reliable simulation results, and the feasibility of an SMES application is discussed.

  16. Report of experiments and evidence for ASC L2 milestone 4467 : demonstration of a legacy application's path to exascale.

    Energy Technology Data Exchange (ETDEWEB)

    Curry, Matthew L.; Ferreira, Kurt Brian; Pedretti, Kevin Thomas Tauke; Leung, Vitus Joseph; Moreland, Kenneth D.; Lofstead, Gerald Fredrick, II; Gentile, Ann C. (Sandia National Laboratories, Livermore, CA); Klundt, Ruth Ann; Ward, H. Lee; Laros, James H., III; Hemmert, Karl Scott; Fabian, Nathan D.; Levenhagen, Michael J.; Barrett, Brian W.; Brightwell, Ronald Brian; Barrett, Richard; Wheeler, Kyle Bruce; Kelly, Suzanne Marie; Rodrigues, Arun F.; Brandt, James M. (Sandia National Laboratories, Livermore, CA); Thompson, David (Sandia National Laboratories, Livermore, CA); VanDyke, John P.; Oldfield, Ron A.; Tucker, Thomas (Open Grid Computing, Inc., Austin, TX); Vaughan, Courtenay Thomas

    2012-03-01

    This report documents thirteen of Sandia's contributions to the Computational Systems and Software Environment (CSSE) within the Advanced Simulation and Computing (ASC) program between fiscal years 2009 and 2012, and describes their impact on ASC applications. Most contributions are implemented at lower software levels, allowing application improvement without source code changes. Improvements are identified in areas such as reduced run time, characterizing power usage, and Input/Output (I/O). Other experiments are more forward looking, demonstrating potential bottlenecks using mini-application versions of the legacy codes and simulating their network activity on Exascale-class hardware. The purpose of this report is to prove that the team has completed milestone 4467, Demonstration of a Legacy Application's Path to Exascale. Cielo is expected to be the last capability system on which existing ASC codes can run without significant modifications. This assertion will be tested to determine where the breaking point is for an existing highly scalable application. The goal is to stretch the performance boundaries of the application by applying recent CSSE R&D in areas such as resilience, power, I/O, visualization services, SMARTMAP, lightweight kernels (LWKs), virtualization, simulation, and feedback loops. Dedicated system time reservations and/or CCC allocations will be used to quantify the impact of system-level changes to extend the life and performance of the ASC code base. Finally, a simulation of anticipated exascale-class hardware will be performed using SST to supplement the calculations. Determine where the breaking point is for an existing highly scalable application: Chapter 15 presented the CSSE work that sought to identify the breaking point in two ASC legacy applications, Charon and CTH. Their mini-app versions were also employed to complete the task. There is no single breaking point, as more than one issue was found with the two codes. The results were

  17. QRev—Software for computation and quality assurance of acoustic doppler current profiler moving-boat streamflow measurements—Technical manual for version 2.8

    Science.gov (United States)

    Mueller, David S.

    2016-01-01

    The software program QRev applies common and consistent computational algorithms, combined with automated filtering and quality assessment of the data, to improve the quality and efficiency of streamflow measurements and to help ensure that U.S. Geological Survey streamflow measurements are consistent, accurate, and independent of the manufacturer of the instrument used to make the measurement. Software from different manufacturers uses different algorithms for various aspects of the data processing and discharge computation. The algorithms used by QRev to filter data, interpolate data, and compute discharge are documented and compared to the algorithms used in the manufacturers’ software. QRev applies consistent algorithms and creates a data structure that is independent of the data source. QRev saves an extensible markup language (XML) file that can be imported into databases or electronic field notes software. This report is the technical manual for version 2.8 of QRev.
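The discharge computation that moving-boat ADCP software such as QRev builds on can be sketched in miniature: per ensemble, the flow crossing the boat track is the cross product of water velocity and boat velocity, integrated over depth and time. Real software works per depth cell with edge, top and bottom extrapolation; this is only the core idea, and the demo numbers are synthetic.

```python
# Minimal moving-boat discharge sketch: Q = sum over ensembles of
# (water velocity x boat velocity) * depth * dt. Velocities are
# depth-averaged here; production code integrates over depth cells.

def transect_discharge(ensembles):
    """ensembles: iterable of (wx, wy, bx, by, depth_m, dt_s) tuples,
    velocities in m/s, east/north components. Returns m^3 crossed."""
    q = 0.0
    for wx, wy, bx, by, depth, dt in ensembles:
        q += (wx * by - wy * bx) * depth * dt  # z-component of cross product
    return q

# Water flowing east at 1 m/s, boat moving north at 1 m/s, 4 m deep,
# five 2-second ensembles:
demo = [(1.0, 0.0, 0.0, 1.0, 4.0, 2.0)] * 5
print(transect_discharge(demo))  # 40.0 m^3 over the transect
```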

  18. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    Science.gov (United States)

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
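The ROS idea can be illustrated with a stripped-down, single-detection-limit sketch: assign plotting positions to all observations, regress the log of the detected values against normal quantiles, and impute each censored value from the fitted line at its plotting position. The paper's R library handles multiple detection limits with Helsel-style plotting positions, which this simplified version does not attempt.

```python
# Simplified regression on order statistics (ROS) for one detection
# limit: detects anchor a lognormal fit, censored values are imputed
# from the fitted line. Illustrative only; not the R package's algorithm.
import math
from statistics import NormalDist

def simple_ros(values, censored):
    """values: observations (censored ones stored at the detection limit);
    censored[i]: True if values[i] is a '<DL' result.
    Returns the dataset with censored entries replaced by ROS estimates."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    # Weibull plotting positions over all ranks
    pp = {idx: (rank + 1) / (n + 1) for rank, idx in enumerate(order)}
    nd = NormalDist()
    # Fit log10(detect) = a + b * z on the uncensored points
    xs = [nd.inv_cdf(pp[i]) for i in range(n) if not censored[i]]
    ys = [math.log10(values[i]) for i in range(n) if not censored[i]]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return [values[i] if not censored[i]
            else 10 ** (a + b * nd.inv_cdf(pp[i])) for i in range(n)]
```

Detected values pass through unchanged; only the censored entries are replaced, so summary statistics computed from the output reflect both the detects and the modeled lower tail.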

  19. New options for hardware and software ensure a better quality of pictures obtained by MRT

    International Nuclear Information System (INIS)

    Artefacts from physiological movements may cause major problems in the interpretation of pictures obtained by MRT, particularly those of the epigastric and cervical regions, which are steadily gaining in diagnostic importance. To make these regions accessible to clinical diagnosis, user-friendly compensatory techniques were developed and have now been made available to the medical world. Provided that the options of flow compensation (FLAG), presaturation (REST), selective elimination of fat (SPIR) and suppression of artefacts from refolding are used adequately, the quality of pictures will be considerably improved. (orig.)

  20. Specification of the ASC to be used on the PRC satellite (HITSAT)

    DEFF Research Database (Denmark)

    Jørgensen, Finn E; Thuesen, Gøsta; Kilsgaard, Søren;

    1999-01-01

    The document describes the technical specifications of the ASC Star Tracker and the requirements for the equipment onboard the satellite HITSAT.

  1. Team Building Based on Total Quality Management of Software

    Institute of Scientific and Technical Information of China (English)

    张沐辰

    2014-01-01

    Combining the philosophy of total quality management with the characteristics of information system development, this paper proposes the concept of total quality management of software, and reconsiders the relationships among software quality attributes and between quality goals and business goals. On the premise that the ultimate goal of software total quality management is to expand market share, information system quality management is divided into three stages, namely pre-process, in-process and post-process quality management, so that quality is managed comprehensively. Finally, based on the quality management system constructed, the paper presents a scheme for ensuring software quality management through people-centered team building.

  2. Teaching Experience Sharing on the Course "Software Quality Assurance and Testing"

    Institute of Scientific and Technical Information of China (English)

    刘曙; 尹胜君

    2007-01-01

    Based on teaching experience accumulated over the past five years, the author shares the teaching process and approaches for the course "Software Quality Assurance and Testing", including the course syllabus, teaching approaches, and evaluation methods. We hope that this paper provides positive input both to those teaching software engineering related courses and to those teaching practice-oriented bilingual courses.

  3. Research on a Secure Software Development Process to Improve Software Quality

    Institute of Scientific and Technical Information of China (English)

    冯晓媛

    2012-01-01

    Current software development follows a set of rigorous and mature processes, but the security quality of the software receives comparatively little attention during development, which inevitably leads to security crises in software systems. Taking the existing software development process as a basis, and strengthening security measures at three levels, namely institutional, managerial and technical, this paper plans out a secure software development process in which security deficiencies and vulnerabilities can be flagged stage by stage from the early phases of development, ensuring that the secure development process continuously strengthens the security of the software system.

  4. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  5. Physicians on board: an examination of physician financial interests in ASCs using longitudinal data.

    Science.gov (United States)

    Yee, Christine A

    2011-09-01

    This paper investigates physician financial interests in ambulatory surgery centers (ASCs) using novel, longitudinal data that identify board members (directors) of ASCs in Florida. Improving on prior research, the estimated models in this paper disentangle physician director selection effects from the causal impact of these financial interests. The data suggest that even prior to their financial interest, physician directors had larger procedure volumes than non-directors. Physician directors also referred more lower-risk patients. On average, ASC board membership led to a 27% increase in a physician's procedure volume and a 16% increase in a physician's colonoscopy volume. Simulations suggest that 5% of the colonoscopies performed in Florida between 1997 and 2004 may have been due to physician ASC board membership. The evidence also suggests that physician directors steered patients from hospitals to their affiliate ASCs. In addition, they referred and/or treated more lower-risk patients as a result of board membership. PMID:21855155

  6. Software scripts for quality checking of high-throughput nucleic acid sequencers.

    Science.gov (United States)

    Lazo, G R; Tong, J; Miller, R; Hsia, C; Rausch, C; Kang, Y; Anderson, O D

    2001-06-01

    We have developed a graphical interface that allows the researcher to view and assess the quality of sequencing results, using a series of program scripts developed to process data generated by automated sequencers. The scripts are written in the Perl programming language and are executable from the cgi-bin directory of a Web server environment. The scripts direct nucleic acid sequencing trace file data output from automated sequencers to be analyzed by the phred molecular biology program, and results are displayed as graphical hypertext mark-up language (HTML) pages. The scripts are mainly designed to handle 96-well microtiter dish samples, but they can also read data from 384-well microtiter dishes 96 samples at a time. The scripts may be customized for different laboratory environments and computer configurations. Web links to the sources and discussion page are provided. PMID:11414222
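The kind of quality check such scripts perform can be sketched briefly: read a phred-style .qual file (FASTA-like headers followed by whitespace-separated per-base quality scores) and flag reads whose mean quality falls below a threshold. The threshold of 20 is a common convention, not a value taken from the paper, and the well names are invented.

```python
# Parse phred-style .qual text and flag low-quality reads.
# Threshold and sample data are illustrative assumptions.

def parse_qual(text):
    """Return {read_name: [quality scores]} from .qual-formatted text."""
    reads, name = {}, None
    for line in text.splitlines():
        if line.startswith('>'):
            name = line[1:].split()[0]   # header: first token after '>'
            reads[name] = []
        elif name and line.strip():
            reads[name].extend(int(tok) for tok in line.split())
    return reads

def flag_low_quality(reads, min_mean=20.0):
    """Return {read_name: mean_quality} for reads below the threshold."""
    return {n: sum(q) / len(q) for n, q in reads.items()
            if q and sum(q) / len(q) < min_mean}

sample = ">well_A01\n30 32 28 31\n>well_A02\n10 12 9 11\n"
print(flag_low_quality(parse_qual(sample)))  # {'well_A02': 10.5}
```

A reporting layer, as in the paper, would then render these per-well results as an HTML grid for visual inspection.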

  7. Differential splicing of the apoptosis-associated speck like protein containing a caspase recruitment domain (ASC) regulates inflammasomes

    Directory of Open Access Journals (Sweden)

    Rojanasakul Yon

    2010-05-01

    Full Text Available Abstract Background The apoptotic speck-like protein containing a caspase recruitment domain (ASC) is the essential adaptor protein for caspase 1 mediated interleukin (IL)-1β and IL-18 processing in inflammasomes. It bridges activated Nod-like receptors (NLRs), which are a family of cytosolic pattern recognition receptors of the innate immune system, with caspase 1, resulting in caspase 1 activation and subsequent processing of caspase 1 substrates. Hence, macrophages from ASC-deficient mice are impaired in their ability to produce bioactive IL-1β. Furthermore, we recently showed that ASC translocates from the nucleus to the cytosol in response to inflammatory stimulation in order to promote an inflammasome response, which triggers IL-1β processing and secretion. However, the precise regulation of inflammasomes at the level of ASC is still not completely understood. In this study we identified and characterized three novel ASC isoforms for their ability to function as an inflammasome adaptor. Methods To establish the ability of ASC and ASC isoforms to act as functional inflammasome adaptors, IL-1β processing and secretion were investigated by ELISA in inflammasome reconstitution assays, by stable expression in THP-1 and J774A1 cells, and by restoring the lack of endogenous ASC in mouse RAW264.7 macrophages. In addition, the localization of ASC and ASC isoforms was determined by immunofluorescence staining. Results The three novel ASC isoforms, ASC-b, ASC-c and ASC-d, display capabilities that are unique and distinct from each other and from full-length ASC with respect to their function as an inflammasome adaptor, with one of the isoforms even showing an inhibitory effect. Consistently, only the activating isoforms of ASC, ASC and ASC-b, co-localized with NLRP3 and caspase 1, while the inhibitory isoform ASC-c co-localized only with caspase 1, but not with NLRP3. ASC-d did not co-localize with NLRP3 or with caspase 1 and consistently lacked the ability to function as an

  8. Application of QC_DR Software for Acceptance Testing and Routine Quality Control of Direct Digital Radiography Systems: Initial Experiences using the Italian Association of Physicist in Medicine Quality Control Protocol

    OpenAIRE

    Nitrosi, Andrea; Bertolini, Marco; Borasi, Giovanni; Botti, Andrea; Barani, Adriana; Rivetti, Stefano; Pierotti, Luisa

    2008-01-01

    Ideally, medical x-ray imaging systems should be designed to deliver maximum image quality at an acceptable radiation risk to the patient. Quality assurance procedures are employed to ensure that these standards are maintained. A quality control protocol for direct digital radiography (DDR) systems is described and discussed. Software to automatically process and analyze the required images was developed. In this paper, the initial results obtained on equipment of different DDR manufacturers ...

  9. Visualized Development of GPS Data Quality Inspection Software

    Institute of Scientific and Technical Information of China (English)

    罗伏军; 李程; 岳国栋

    2014-01-01

    Based on an analysis of the factors that influence the quality of GPS observation data, this paper introduces the principle of GPS data quality inspection with the TEQC software. An interface for TEQC was developed using Visual C# 2010, implementing parameter setting, quality evaluation, report output and other functions. The software has practical value for GPS observation data quality inspection work.

  10. Q-BIM. Quality and Assurance control for construction projects on a BIM environment. An analysis of the strengths and weaknesses of the current software

    OpenAIRE

    Solar Serrano, Patricia del; Andrés Ortega, Silvia; Peña González, Aránzazu de la; Vivas Urías, María Dolores

    2015-01-01

    Software for collaborative working such as Building Information Modelling (BIM), which enhances coordination between multidisciplinary teams throughout the building design and execution process, has recently appeared in the construction industry. This methodology covers and manages the total building lifecycle information, simulating and updating digital representations for all construction stages, operation, demolition and recycling. Quality Management is one of the deeply involved di

  11. Solidification of low-level-radioactive resins in ASC-zeolite blends

    International Nuclear Information System (INIS)

    Solidification of low-level-radioactive (LLR) resins was studied in ASC (a new kind of cement)-zeolite blends. The effect of adding zeolite on the strength of the cementation matrix was investigated. A superior mixture was obtained with 35 wt.% ASC and 7 wt.% zeolite combined with 42 wt.% resins and 16 wt.% water; the moisture content of the resins was about 50%. Simulated leaching tests showed that including zeolite in ASC reduced the leaching rates of radionuclides significantly. In 200 L tests, the center temperature curve was measured, and no early thermal cracks were found

  12. Role of ASC in the Mouse Model of Helicobacter pylori Infection

    OpenAIRE

    Benoit, Bekale N.; Kobayashi, Motohiro; Kawakubo, Masatomo; Takeoka, Michiko; Sano, Kenji; Zou, Jian; Itano, Naoki; Tsutsui, Hiroko; Noda, Tetsuo; Fukuda, Minoru; Nakayama, Jun; Taniguchi, Shun'ichiro

    2009-01-01

    Apoptosis-associated speck-like protein containing a C-terminal caspase recruitment domain (ASC) is an adaptor molecule activating caspase-1 that stimulates pro-interleukin-1β (pro-IL-1β) and pro-IL-18, two pro-inflammatory cytokines with critical functions in host defense against a variety of pathogens. In this study, we investigated the role of ASC in the host defense against Helicobacter pylori utilizing ASC-deficient mice. Mice were orally inoculated with H. pylori; bacterial load, degree...

  13. IDENTIFICATION OF TYPES AND MODELS OF AIRCRAFT USING ASC-ANALYSIS OF THEIR SILHOUETTES (CONTOURS): GENERALIZATION, ABSTRACTION, CLASSIFICATION AND IDENTIFICATION

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2015-12-01

    Full Text Available The article discusses the application of automated system-cognitive analysis (ASC-analysis), its mathematical model, which is the system theory of information, and its software tool, the intellectual system called "Eidos", to problems of identifying the types and models of aircraft by their silhouettes on the ground, or more precisely by their external contours: 1) digitization of scanned images of aircraft and creation of their mathematical models; 2) formation of mathematical models of specific aircraft with the use of information theory; 3) modeling of the generalized images of various aircraft types and models and their graphic visualization; 4) comparing an image of a particular plane with the generalized images of various aircraft types and models and quantifying the degree of similarity and difference between them, i.e., identification of the type and model of an airplane by its silhouette (contour) on the ground; 5) quantification of the similarities and differences of the generalized images of the planes with each other, i.e., cluster-constructive analysis of generalized images of various aircraft types and models. The article gives a new approach to digitizing images of aircraft, based on the use of the polar coordinate system, the center of gravity of the image and its external contour. Before digitizing images, we may transform them to standardize their position, their sizes (resolution, distance) and the angle of rotation (angle in three dimensions). Therefore, the results of digitization and ASC-analysis of the images can be invariant (independent) with respect to their position, dimensions and turns. The shape of the contour of a particular aircraft is considered as noisy information on the type and model of aircraft, comprising information about the true shape of the aircraft type and its model (clean signal) and noise, which distorts the real shape due to noise influences, both of the means of
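The polar-coordinate digitization idea described above can be sketched compactly: sample the outer contour as radius versus angle around the shape's centre of gravity, then normalize the radii so the signature is independent of image scale. This is an illustrative reconstruction of the idea, not the Eidos system's code; rotation invariance would additionally require aligning the angular origin, which is left out here.

```python
# Scale-normalized polar contour signature: for each angular sector
# around the centroid, keep the radius of the outermost contour point,
# then divide by the maximum radius.
import math

def polar_signature(points, n_bins=36):
    """points: list of (x, y) contour pixels. Returns n_bins radii in [0, 1]."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    bins = [0.0] * n_bins
    for x, y in points:
        ang = math.atan2(y - cy, x - cx) % (2 * math.pi)
        b = min(int(ang / (2 * math.pi) * n_bins), n_bins - 1)
        r = math.hypot(x - cx, y - cy)
        bins[b] = max(bins[b], r)      # outermost point wins in each sector
    rmax = max(bins) or 1.0
    return [r / rmax for r in bins]    # scale-invariant radii
```

Two silhouettes can then be compared by a simple distance between their signatures, which is the quantitative similarity step the abstract describes.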

  14. ASC Predictive Science Academic Alliance Program Verification and Validation Whitepaper

    Energy Technology Data Exchange (ETDEWEB)

    Klein, R; Graziani, F; Trucano, T

    2006-03-31

    The purpose of this whitepaper is to provide a framework for understanding the role that verification and validation (V&V) are expected to play in successful ASC Predictive Science Academic Alliance (PSAA) Centers and projects. V&V have been emphasized in the recent specification of the PSAA (NNSA, 2006): (1) The resulting simulation models lend themselves to practical verification and validation methodologies and strategies that should include the integrated use of experimental and/or observational data as a key part of model and sub-model validation, as well as demonstrations of numerical convergence and accuracy for code verification. (2) Verification, validation and prediction methodologies and results must be much more strongly emphasized as research topics and demonstrated via the proposed simulations. (3) It is mandatory that proposals address the following two topics: (a) Predictability in science & engineering; and (b) Verification & validation strategies for large-scale simulations, including quantification of uncertainty and numerical convergence. We especially call attention to the explicit coupling of computational predictability and V&V in the third bullet above. In this whitepaper we emphasize this coupling, and provide concentrated guidance for addressing item 2. The whitepaper has two main components. First, we provide a brief and high-level tutorial on V&V that emphasizes critical elements of the program. Second, we state a set of V&V-related requirements that successful PSAA proposals must address.

  15. Quality assurance with TL 9000 in agile software development of set-top boxes : The case of Motorola and the use of Scrum

    OpenAIRE

    Gustafsson, Kristofer; Jacobsson, Johan

    2009-01-01

    In today’s fast-paced world, there is a constant demand for better and more efficient ways of doing business. Motorola in Linköping uses the agile development framework Scrum in its software development. A certain level of quality must also be assured for the delivered goods and services. Is it possible to use Scrum and still meet the quality requirements? This Master's Thesis investigates whether it is possible to achieve a quality certificate from TL 9000, the telecom industr...

  16. How can Software Packages Certification Improve Software Process

    OpenAIRE

    Pivka, Marjan; Potočan, Vojko

    1997-01-01

    Popular software assessment models such as CMM, BOOTSTRAP, SPICE or ISO 9000 ignore the impact of software product certification on software quality. The first standard for software product quality was the German DIN 66285. Based on this standard, ISO developed an international standard for quality requirements and testing procedures for software packages: ISO/IEC 12119. This paper presents our experience with classical testing models based on ISO/IEC 12119 and DIN 66285 and with our improved ...

  17. Evaluation of replacement protocols and modifications to TCP to enhance ASC Wide Area Network performance.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Randy L. Jr.

    2004-09-01

    Historically, TCP/IP has been the protocol suite used to transfer data throughout the Advanced Simulation and Computing (ASC) community. However, TCP was developed many years ago for an environment very different from the ASC Wide Area Network (WAN) of today. Numerous publications hint at better performance if modifications were made to the TCP algorithms or a different protocol were used to transfer data across a high-bandwidth, high-delay WAN. Since Sandia National Laboratories wants to maximize ASC WAN performance to support the Thor's Hammer supercomputer, there is strong interest in evaluating modifications to the TCP protocol and alternatives to TCP, such as SCTP, to determine whether they provide improved performance. Therefore, the goal of this project is to test, evaluate, compare, and report protocol technologies that enhance the performance of the ASC WAN.
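One standard TCP modification for high bandwidth-delay-product paths is sizing the socket buffers to the bandwidth-delay product, which can be sketched as follows (the link figures are hypothetical, not measurements of the ASC WAN):

```python
import socket

# Bandwidth-delay product: the in-flight data needed to keep a "long fat
# pipe" full. Hypothetical figures: a 10 Gbit/s link with 40 ms RTT.
bandwidth_bps = 10 * 10**9
rtt_s = 0.040
bdp_bytes = int(bandwidth_bps / 8 * rtt_s)   # 50 MB

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# Request send/receive buffers sized to the BDP; the kernel may clamp
# these to its configured maxima (net.core.wmem_max / rmem_max on Linux),
# so large transfers also require tuning those sysctls.
sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, bdp_bytes)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, bdp_bytes)
```

A buffer smaller than the BDP caps throughput at roughly buffer_size / RTT regardless of link capacity, which is one reason stock TCP settings underperform on high-delay WANs.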

  18. Research on the Software Quality Assurance Process

    Institute of Scientific and Technical Information of China (English)

    胡庆林; 王一苇

    2016-01-01

    In order to regularize the software quality assurance (SQA) process, many industries have formulated related standards. By carrying out these standards, an enterprise can draw on their strengths to form an effective SQA method and improve its SQA capability. This paper compares and analyzes the SQA process requirements of GJB 2786, GJB 439A, GJB 5000A and DO-178C, and proposes centering on the process and product evaluation activities of GJB 5000A and GJB 439A, combined with the software life-cycle process objectives and SQA objectives defined in DO-178C: release a software quality assurance plan (SQAP), perform the SQA activities, and audit whether the required activities were carried out and whether the corresponding software life-cycle data evidence exists, so as to make the SQA process effective.

  19. SU-E-J-199: A Software Tool for Quality Assurance of Online Replanning with MR-Linac

    International Nuclear Information System (INIS)

    Purpose: To develop a quality assurance software tool, ArtQA, capable of automatically checking radiation treatment plan parameters, verifying plan data transfer from treatment planning system (TPS) to record and verify (R&V) system, performing a secondary MU calculation considering the effect of magnetic field from MR-Linac, and verifying the delivery and plan consistency, for online replanning. Methods: ArtQA was developed by creating interfaces to TPS (e.g., Monaco, Elekta), R&V system (Mosaiq, Elekta), and secondary MU calculation system. The tool obtains plan parameters from the TPS via direct file reading, and retrieves plan data both transferred from TPS and recorded during the actual delivery in the R&V system database via open database connectivity and structured query language. By comparing beam/plan datasets in different systems, ArtQA detects and outputs discrepancies between TPS, R&V system and secondary MU calculation system, and delivery. To consider the effect of 1.5T transverse magnetic field from MR-Linac in the secondary MU calculation, a method based on modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA is capable of automatically checking plan integrity and logic consistency, detecting plan data transfer errors, performing secondary MU calculations with or without a transverse magnetic field, and verifying treatment delivery. The tool is efficient and effective for pre- and post-treatment QA checks of all available treatment parameters that may be impractical with the commonly-used visual inspection. Conclusion: The software tool ArtQA can be used for quick and automatic pre- and post-treatment QA check, eliminating human error associated with visual inspection. 
While this tool is developed for online replanning to be used on MR-Linac, where the QA needs to be performed rapidly as the patient is lying on the table waiting for the treatment, ArtQA can be used as a general QA tool
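The cross-system consistency check that ArtQA performs can be sketched as a tolerance-based comparison of beam parameters (field names and tolerances below are illustrative, not ArtQA's actual schema):

```python
def compare_plan_parameters(tps, rv, tolerances):
    """Report discrepancies between parameter sets exported from a
    treatment planning system (TPS) and a record-and-verify (R&V)
    database. Numeric fields are compared within a tolerance; other
    fields must match exactly."""
    discrepancies = []
    for field, tol in tolerances.items():
        a, b = tps.get(field), rv.get(field)
        if a is None or b is None:
            discrepancies.append((field, "missing", a, b))
        elif isinstance(a, (int, float)) and abs(a - b) > tol:
            discrepancies.append((field, "out of tolerance", a, b))
        elif not isinstance(a, (int, float)) and a != b:
            discrepancies.append((field, "mismatch", a, b))
    return discrepancies

# Hypothetical beam records from the two systems.
tps_beam = {"mu": 182.4, "gantry_deg": 90.0, "energy": "7MV FFF"}
rv_beam  = {"mu": 182.6, "gantry_deg": 90.0, "energy": "7MV FFF"}
tols     = {"mu": 0.5, "gantry_deg": 0.1, "energy": 0}
issues = compare_plan_parameters(tps_beam, rv_beam, tols)  # → []
```

Automating this comparison across every transferred parameter is what replaces the error-prone visual inspection the abstract mentions.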

  20. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis, shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  1. Constructing the Software Factory of Sun Yat-sen University to Comprehensively Improve the Quality of Software Talent Training

    Institute of Scientific and Technical Information of China (English)

    胡赟; 常会友; 朝红阳

    2007-01-01

    The Software School of SYSU (Sun Yat-sen University) is engaged in active school-enterprise cooperation and has established Sysusoft (the SYSU Software Factory). Sysusoft is directly market-oriented, operates under modern enterprise systems of operation and management, vigorously carries out software training, software outsourcing, and independent research and development, and works to build up the 'Sysusoft' brand. Internally, Sysusoft hosts the practice teaching of the school's undergraduate students, promoting their all-round development and improving the quality of software talent training. Sysusoft has innovated the talent training mode, cultivating, according to the needs of society, large numbers of high-level, international, application-oriented, project-oriented, multi-skilled software talent, and it encourages students to engage in virtual entrepreneurship. All of this has become a distinctive feature of the school.

  2. Analysis of the assignment scheduling capability for Unmanned Aerial Vehicles (ASC-U) simulation tool

    OpenAIRE

    Nannini, Christopher J.

    2006-01-01

    The U.S. Army Training and Doctrine Command (TRADOC) Analysis Center (TRAC) and the Modeling, Virtual Environments, and Simulations Institute (MOVES) at the Naval Postgraduate School, Monterey, California developed the Assignment Scheduling Capability for UAVs (ASC-U) simulation to assist in the analysis of unmanned aerial vehicle (UAV) requirements for the current and future force. ASC-U employs a discrete event simulation coupled with the optimization of a linear objective function. At ...

  3. Engineering a bilayered hydrogel to control ASC differentiation.

    Science.gov (United States)

    Natesan, Shanmugasundaram; Zamora, David O; Suggs, Laura J; Christy, Robert J

    2012-01-01

    Natural polymers over the years have gained more importance because of their host biocompatibility and ability to interact with cells in vitro and in vivo. An area of research that holds promise in regenerative medicine is the combinatorial use of novel biomaterials and stem cells. A fundamental strategy in the field of tissue engineering is the use of three-dimensional scaffolds (e.g., decellularized extracellular matrix, hydrogels, micro/nano particles) for directing cell function. This technology has evolved from the discovery that cells need a substrate upon which they can adhere, proliferate, and express their differentiated cellular phenotype and function. More recently, it has also been determined that cells not only use these substrates for adherence, but also interact and take cues from the matrix substrate (e.g., extracellular matrix, ECM). Therefore, the cells and scaffolds have a reciprocal connection that serves to control tissue development, organization, and ultimate function. Adipose-derived stem cells (ASCs) are mesenchymal, non-hematopoietic stem cells present in adipose tissue that can exhibit multi-lineage differentiation and serve as a readily available source of cells (i.e. pre-vascular endothelia and pericytes). Our hypothesis is that adipose-derived stem cells can be directed toward differing phenotypes simultaneously by simply co-culturing them in bilayered matrices. Our laboratory is focused on dermal wound healing. To this end, we created a single composite matrix from the natural biomaterials, fibrin, collagen, and chitosan that can mimic the characteristics and functions of a dermal-specific wound healing ECM environment. PMID:22664758

  4. Power laws in software systems

    OpenAIRE

    Tonelli, Roberto

    2012-01-01

    The main topic of my PhD has been the study of power laws in software systems within the perspective of describing software quality. My PhD research contributes to a recent stream of studies in software engineering, where the investigation of power laws in software systems has become widely popular in recent years, since they appear on an incredible variety of different software quantities and properties, like, for example, software metrics, software faults, refactoring, Java byte-code,...

  5. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

    The support of the EPIQR method is a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  6. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  7. QRev—Software for computation and quality assurance of acoustic doppler current profiler moving-boat streamflow measurements—User’s manual for version 2.8

    Science.gov (United States)

    Mueller, David S.

    2016-01-01

    The software program QRev computes the discharge from moving-boat acoustic Doppler current profiler (ADCP) measurements using data collected with any of the Teledyne RD Instrument or SonTek bottom-tracking ADCPs. The computation of discharge is independent of the manufacturer of the ADCP because QRev applies consistent algorithms independent of the data source. In addition, QRev automates filtering and quality checking of the collected data and provides feedback to the user on potential quality issues with the measurement. Various statistics and characteristics of the measurement, in addition to a simple uncertainty assessment, are provided to assist the user in properly rating the measurement. QRev saves an extensible markup language file that can be imported into databases or electronic field notes software. The user interacts with QRev through a tablet-friendly graphical user interface. This report is the manual for version 2.8 of QRev.
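The core of the moving-boat discharge computation can be sketched as follows (a simplification of QRev's bin-by-bin algorithm; depth-averaged velocities per ensemble are assumed):

```python
import numpy as np

def moving_boat_discharge(u_water, v_water, u_boat, v_boat, depth, dt):
    """Discharge through a moving-boat ADCP transect: the integral over
    the track of the cross product of water velocity and boat velocity,
    which measures flow normal to the boat's path. Inputs are per-ensemble
    depth-averaged values (real computations work bin by bin over depth)."""
    cross = u_water * v_boat - v_water * u_boat  # flow normal to track
    return float(np.sum(cross * depth * dt))

# Uniform 1 m/s flow in +x, boat crossing in +y at 0.5 m/s,
# 2 m depth, 100 one-second ensembles (boat travels 50 m).
n = 100
q = moving_boat_discharge(
    u_water=np.full(n, 1.0), v_water=np.zeros(n),
    u_boat=np.zeros(n), v_boat=np.full(n, 0.5),
    depth=np.full(n, 2.0), dt=np.full(n, 1.0))
# Q = 1 m/s * 2 m * 50 m = 100 m³/s
```

Because the cross product already isolates the velocity component normal to the track, the result is independent of how straight the boat's path is, which is what makes moving-boat measurements practical.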

  8. Ultrasound -Assisted Gene Transfer to Adipose Tissue-Derived Stem/Progenitor Cells (ASCs)

    Science.gov (United States)

    Miyamoto, Yoshitaka; Ueno, Hitomi; Hokari, Rei; Yuan, Wenji; Kuno, Shuichi; Kakimoto, Takashi; Enosawa, Shin; Negishi, Yoichi; Yoshinaka, Kiyoshi; Matsumoto, Yoichiro; Chiba, Toshio; Hayashi, Shuji

    2011-09-01

    In recent years, multilineage adipose tissue-derived stem cells (ASCs) have become increasingly attractive as a promising source for cell transplantation and regenerative medicine. Particular interest has been expressed in the potential to make tissue stem cells, such as ASCs and marrow stromal cells (MSCs), differentiate by gene transfection. Gene transfection using highly efficient viral vectors such as adeno- and sendai viruses have been developed for this purpose. Sonoporation, or ultrasound (US)-assisted gene transfer, is an alternative gene manipulation technique which employs the creation of a jet stream by ultrasonic microbubble cavitation. Sonoporation using non-viral vectors is expected to be a much safer, although less efficient, tool for prospective clinical gene therapy. In this report, we assessed the efficacy of the sonoporation technique for gene transfer to ASCs. We isolated and cultured adipocytes from mouse adipose tissue. ASCs that have the potential to differentiate with transformation into adipocytes or osteoblasts were obtained. Using the US-assisted system, plasmid DNA containing beta-galactosidase (beta-Gal) and green fluorescent protein (GFP) genes were transferred to the ASCs. For this purpose, a Sonopore 4000 (NEPAGENE Co.) and a Sonazoid (Daiichi Sankyo Co.) instrument were used in combination. ASCs were subjected to US (3.1 MHz, 50% duty cycle, burst rate 2.0 Hz, intensity 1.2 W/cm2, exposure time 30 sec). We observed that the gene was more efficiently transferred with increased concentrations of plasmid DNA (5-150 μg/mL). However, further optimization of the US parameters is required, as the gene transfer efficiency was still relatively low. In conclusion, we herein demonstrate that a gene can be transferred to ASCs using our US-assisted system. In regenerative medicine, this system might resolve the current issues surrounding the use of viral vectors for gene transfer.

  9. Development of a software tool for the management of quality control in a helical tomotherapy unit; Desarrollo de una herramienta de software para la gestion integral del control de calidad en una unidad de tomoterapia helicoidal

    Energy Technology Data Exchange (ETDEWEB)

    Garcia Repiso, S.; Hernandez Rodriguez, J.; Martin Rincon, C.; Ramos Pacho, J. A.; Verde Velasco, J. M.; Delgado Aparacio, J. M.; Perez Alvarez, M. e.; Gomez Gonzalez, N.; Cons Perez, V.; Saez Beltran, M.

    2013-07-01

    The large amount of data and information managed in the quality control tests of an external radiotherapy unit makes it necessary to use tools that facilitate, on the one hand, the management of measurements and results in real time, and on the other, the tasks of managing, filing, querying and reporting the stored data. This paper presents an in-house software application used for the integral management of the helical TomoTherapy unit in the aspects related to the roles and responsibilities of the hospital's radiophysics department. (Author)

  10. Visualizing software structure understandability

    OpenAIRE

    Dugerdil, Philippe; Niculescu, Mihnea

    2014-01-01

    Software architecture design is known to be driven by the quality attributes we may want to satisfy. Among them, modifiability plays an important role, since software maintenance takes the lion's share of software development costs. However, to successfully maintain a legacy system, the latter must be sufficiently understood so that the maintenance team will not introduce new bugs when correcting others. We then present a software metric that we call the Autonomy Ratio (AR). We show this...

  11. Software Upgrades under Monopoly

    OpenAIRE

    Jiri Strelicky; Kresimir Zigic

    2013-01-01

    We study price discrimination in a monopolistic software market. The monopolist charges different prices for the upgrade version and for the full version. Consumers are heterogeneous in taste for infinitely durable software and there is no resale. We show that price discrimination leads to a higher software quality but raises both absolute price and price per quality. This price discrimination does not increase sales and it decreases the total number of consumers compared to no discrimination...

  12. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    In order to achieve high quality in the SMART MMIS software, a well-constructed software testing concept is required. This paper establishes the software testing concept to be applied to the SMART MMIS software, in terms of software testing organization, documentation, procedure, and methods. The software testing methods are classified into source code static analysis and dynamic testing. The software dynamic testing methods are discussed from two aspects: white-box and black-box testing. As the software testing concept introduced in this paper is applied to the SMART MMIS software, high-quality software will be produced. In the future, software failure data will be collected through the construction of the SMART MMIS prototyping facility to which the software testing concept of this paper is applied.
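The white-box/black-box distinction discussed above can be illustrated with a minimal example (the unit under test and both tests are hypothetical, not taken from the SMART MMIS software):

```python
def saturating_add(a, b, limit=255):
    """Example unit under test: addition clamped to an upper limit."""
    return min(a + b, limit)

def test_black_box_spec():
    # Black-box: cases derived purely from the specification
    # ("the result never exceeds the limit"), without reading the code.
    assert saturating_add(200, 100) == 255
    assert saturating_add(1, 2) == 3

def test_white_box_branches():
    # White-box: cases written with knowledge of the implementation,
    # exercising each branch of the min() decision once.
    assert saturating_add(0, 0) == 0        # non-clamping branch
    assert saturating_add(255, 1) == 255    # clamping branch

test_black_box_spec()
test_white_box_branches()
```

Black-box tests survive a reimplementation unchanged, while white-box tests guarantee branch coverage of one particular implementation; a testing concept such as the one described typically requires both.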

  13. Research on a Software Quality Metric Method in the Agile Management Mode

    Institute of Scientific and Technical Information of China (English)

    吴刚

    2016-01-01

    Combining the characteristics of the agile software development management mode with the general definition and inference of quality metrics, this article proposes a software quality measurement and tracking method based on a team-attribute factor and the defect value per unit of product size: the agile software quality metric method. This quality measurement model is then applied to process data produced in agile management practice for computation and quality-measurement trials; the results are studied and analyzed, and general methods for improving software product quality are summarized.
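A defect-density measure scaled by a team-attribute factor, as the abstract describes, might look like the following sketch (the paper's actual formula is not given in the abstract; all names and figures here are illustrative):

```python
def agile_quality_score(defects, size_kloc, team_factor):
    """Hypothetical rendering of a density-based quality metric:
    defects per unit of product size (KLOC), scaled by a factor
    capturing team attributes (experience, stability, etc.)."""
    density = defects / size_kloc      # defects per KLOC
    return density * team_factor

# Illustrative sprint: 12 defects in 8 KLOC, team factor 1.2.
score = agile_quality_score(defects=12, size_kloc=8.0, team_factor=1.2)
# density = 1.5 defects/KLOC, scaled score = 1.8
```

Tracking such a score per sprint is what turns a one-off measurement into the quality-tracking method the abstract claims.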

  14. Software Validation in ATLAS

    International Nuclear Information System (INIS)

    The ATLAS collaboration operates an extensive set of protocols to validate the quality of the offline software in a timely manner. This is essential in order to process the large amounts of data being collected by the ATLAS detector in 2011 without complications on the offline software side. We will discuss a number of different strategies used to validate the ATLAS offline software: running the ATLAS framework software, Athena, in a variety of configurations daily on each nightly build via the ATLAS Nightly System (ATN) and Run Time Tester (RTT) systems; the monitoring of these tests and checking the compilation of the software via distributed teams of rotating shifters; monitoring of and follow up on bug reports by the shifter teams and periodic software cleaning weeks to improve the quality of the offline software further.

  15. MicroASC instrument onboard Juno spacecraft utilizing inertially controlled imaging

    Science.gov (United States)

    Pedersen, David Arge Klevang; Jørgensen, Andreas Härstedt; Benn, Mathias; Denver, Troelz; Jørgensen, Peter Siegbjørn; Bjarnø, Jonas Bækby; Massaro, Alessandro; Jørgensen, John Leif

    2016-01-01

    This contribution describes the post-processing of the raw image data acquired by the microASC instrument during the Earth fly-by of the Juno spacecraft. The images show a unique view of the Earth and Moon system as seen from afar. The procedure utilizes attitude measurements and inter-calibration of the Camera Head Units of the microASC system to trigger the image capturing. The triggering is synchronized with the inertial attitude and rotational phase of the sensor acquiring the images. This essentially works as inertially controlled imaging, facilitating image acquisition from unexplored perspectives of moons, asteroids, icy rocks and planetary rings.

  16. Fundamentals of civil engineering an introduction to the ASCE body of knowledge

    CERN Document Server

    McCuen, Richard H; Wong, Melanie K

    2011-01-01

    While the ASCE Body of Knowledge (BOK2) is the codified source for all technical and non-technical information necessary for those seeking to attain licensure in civil engineering, recent graduates have notoriously been lacking in the non-technical aspects even as they excel in the technical. Fundamentals of Civil Engineering: An Introduction to the ASCE Body of Knowledge addresses this shortfall and helps budding engineers develop the knowledge, skills, and attitudes suggested and implied by the BOK2. Written as a resource for all of the non-technical outcomes not specifically covered in the

  17. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

    Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test management. This book examines the principles, concepts, and processes that are fundamental to the software testing function. This book is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c

  18. Software productivity improvement through software engineering technology

    Science.gov (United States)

    Mcgarry, F. E.

    1985-01-01

    It has been estimated that NASA expends anywhere from 6 to 10 percent of its annual budget on the acquisition, implementation and maintenance of computer software. Although researchers have produced numerous software engineering approaches over the past 5-10 years, each claiming to be more effective than the other, there is very limited quantitative information verifying the measurable impact that any of these technologies may have in a production environment. At NASA/GSFC, an extended research effort aimed at identifying and measuring software techniques that favorably impact software development productivity has been active over the past 8 years. Specific, measurable software development technologies have been applied and measured in a production environment. The resulting software development approaches have been shown to be effective in improving both quality and productivity in this one environment.

  19. 42 CFR 413.118 - Payment for facility services related to covered ASC surgical procedures performed in hospitals...

    Science.gov (United States)

    2010-10-01

    ... ASC surgical procedures performed in hospitals on an outpatient basis. 413.118 Section 413.118 Public... PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Specific Categories of Costs § 413.118 Payment for facility services related to covered ASC surgical procedures performed in hospitals on...

  20. Structure and assembly of the mouse ASC inflammasome by combined NMR spectroscopy and cryo-electron microscopy

    Science.gov (United States)

    Sborgi, Lorenzo; Ravotti, Francesco; Dandey, Venkata P.; Dick, Mathias S.; Mazur, Adam; Reckel, Sina; Chami, Mohamed; Scherer, Sebastian; Huber, Matthias; Böckmann, Anja; Egelman, Edward H.; Stahlberg, Henning; Broz, Petr; Meier, Beat H.; Hiller, Sebastian

    2015-01-01

    Inflammasomes are multiprotein complexes that control the innate immune response by activating caspase-1, thus promoting the secretion of cytokines in response to invading pathogens and endogenous triggers. Assembly of inflammasomes is induced by activation of a receptor protein. Many inflammasome receptors require the adapter protein ASC [apoptosis-associated speck-like protein containing a caspase-recruitment domain (CARD)], which consists of two domains, the N-terminal pyrin domain (PYD) and the C-terminal CARD. Upon activation, ASC forms large oligomeric filaments, which facilitate procaspase-1 recruitment. Here, we characterize the structure and filament formation of mouse ASC in vitro at atomic resolution. Information from cryo-electron microscopy and solid-state NMR spectroscopy is combined in a single structure calculation to obtain the atomic-resolution structure of the ASC filament. Perturbations of NMR resonances upon filament formation monitor the specific binding interfaces of ASC-PYD association. Importantly, NMR experiments show the rigidity of the PYD forming the core of the filament as well as the high mobility of the CARD relative to this core. The findings are validated by structure-based mutagenesis experiments in cultured macrophages. The 3D structure of the mouse ASC-PYD filament is highly similar to the recently determined human ASC-PYD filament, suggesting evolutionary conservation of ASC-dependent inflammasome mechanisms. PMID:26464513

  1. First experiences with the implementation of the European standard EN 62304 on medical device software for the quality assurance of a radiotherapy unit

    International Nuclear Information System (INIS)

    According to the latest amendment of the Medical Device Directive standalone software qualifies as a medical device when intended by the manufacturer to be used for medical purposes. In this context, the EN 62304 standard is applicable which defines the life-cycle requirements for the development and maintenance of medical device software. A pilot project was launched to acquire skills in implementing this standard in a hospital-based environment (in-house manufacture). The EN 62304 standard outlines minimum requirements for each stage of the software life-cycle, defines the activities and tasks to be performed and scales documentation and testing according to its criticality. The required processes were established for the pre-existent decision-support software FlashDumpComparator (FDC) used during the quality assurance of treatment-relevant beam parameters. As the EN 62304 standard implicates compliance with the EN ISO 14971 standard on the application of risk management to medical devices, a risk analysis was carried out to identify potential hazards and reduce the associated risks to acceptable levels. The EN 62304 standard is difficult to implement without proper tools, thus open-source software was selected and integrated into a dedicated development platform. The control measures yielded by the risk analysis were independently implemented and verified, and a script-based test automation was retrofitted to reduce the associated test effort. After all documents facilitating the traceability of the specified requirements to the corresponding tests and of the control measures to the proof of execution were generated, the FDC was released as an accessory to the HIT facility. The implementation of the EN 62304 standard was time-consuming, and a learning curve had to be overcome during the first iterations of the associated processes, but many process descriptions and all software tools can be re-utilized in follow-up projects. It has been demonstrated that a

  2. Service Quality Evaluation Method for a Community-Based Software Outsourcing Process

    Institute of Scientific and Technical Information of China (English)

    姜慧敏; 刘英; 王忠杰; 刘曙

    2009-01-01

    Outsourcing software development to community developers is a promising model for reducing software development cost and improving development efficiency. In this paper, we present a method to evaluate the quality of service in managing such a community-based software outsourcing process. In the community-based software outsourcing service, a customer (e.g., a software company) first releases the requirement and design specifications of a software system to the community; the community then helps to decompose the whole development effort into a set of fine-grained tasks (including programming, designing test cases, testing, etc.) and allocates them to community developers (programmers, testers, project managers, etc.). These service providers work to fulfill the tasks and submit results to the community. In this service, quality is critically important, and it is necessary to evaluate the quality of both the final submitted software entities and the various development activities, to ensure that all the initial requirements have been completely and correctly accomplished. In our quality evaluation method, there are three types of objects whose service quality needs to be evaluated: products, behaviors and people. Specifically, these are the deliverables (code, test cases, test records) submitted by each service provider, the development process, and the various community developers, respectively. For each type of object, we designed five dimensions of quality indicators: time and efficiency, place and cost, quality of service content, resources and conditions, and reputation and risk. A set of refined quality indicators is designed for each of the five dimensions.
For each quality indicator, we put forward a corresponding measurement method, i.e., quantitatively calculating the value of each indicator based on the original data automatically collected from the community platform and on subjective evaluation opinions from customers. Then, the traditional AHP method is adopted to calculate the
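The final weighting step mentioned in the abstract uses the classical AHP method. As a hedged sketch (not taken from the paper; the comparison matrix values are hypothetical), priority weights can be derived from a pairwise comparison matrix via its principal eigenvector:

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    """Derive AHP priority weights from a pairwise comparison matrix
    using the principal-eigenvector method."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    principal = np.argmax(eigvals.real)          # largest eigenvalue
    w = np.abs(eigvecs[:, principal].real)       # its eigenvector
    return w / w.sum()                           # normalize to sum to 1

# Hypothetical 1-9 scale comparisons of three quality dimensions,
# e.g. "time and efficiency" vs "place and cost" vs "reputation and risk"
m = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])
weights = ahp_weights(m)
```

The resulting weights would then scale the refined quality indicators within each dimension before aggregation.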

  3. Software development concept for SMART MMIS design

    International Nuclear Information System (INIS)

    Based on the design concept of SMART MMIS, which is being developed as a fully digitalized system, a software development concept should be considered to achieve high quality in the digitalized SMART MMIS. In this paper, the nuclear regulatory position on software common mode failure, software safety classes, codes and standards for software development, the software life cycle and major techniques for software development are discussed

  4. MicroASC instrument onboard Juno spacecraft utilizing inertially controlled imaging

    DEFF Research Database (Denmark)

    Pedersen, David Arge Klevang; Jørgensen, Andreas Härstedt; Benn, Mathias;

    2016-01-01

    This contribution describes the post-processing of the raw image data acquired by the microASC instrument during the Earth fly-by of the Juno spacecraft. The images show a unique view of the Earth and Moon system as seen from afar. The procedure utilizes attitude measurements and inter-calibratio...

  5. 76 FR 27668 - ASC Machine Tools, Inc., Spokane Valley, WA; Notice of Negative Determination on Reconsideration

    Science.gov (United States)

    2011-05-12

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF LABOR Employment and Training Administration ASC Machine Tools, Inc., Spokane Valley, WA; Notice of Negative Determination on Reconsideration On October 7, 2010, the Department of Labor issued an Affirmative Determination Regarding Application for...

  6. 75 FR 65516 - ASC Machine Tools, Inc., Spokane Valley, WA; Notice of Affirmative Determination Regarding...

    Science.gov (United States)

    2010-10-25

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF LABOR Employment and Training Administration ASC Machine Tools, Inc., Spokane Valley, WA; Notice of Affirmative Determination Regarding Application for Reconsideration By application dated September 21, 2010, a representative of the International Association...

  7. ASC Computational Environment (ACE) requirements version 8.0 final report.

    Energy Technology Data Exchange (ETDEWEB)

    Larzelere, Alex R. (Exagrid Engineering, Alexandria, VA); Sturtevant, Judith E.

    2006-11-01

    A decision was made early in the Tri-Lab Usage Model process, that the collection of the user requirements be separated from the document describing capabilities of the user environment. The purpose in developing the requirements as a separate document was to allow the requirements to take on a higher-level view of user requirements for ASC platforms in general. In other words, a separate ASC user requirement document could capture requirements in a way that was not focused on 'how' the requirements would be fulfilled. The intent of doing this was to create a set of user requirements that were not linked to any particular computational platform. The idea was that user requirements would endure from one ASC platform user environment to another. The hope was that capturing the requirements in this way would assist in creating stable user environments even though the particular platforms would be evolving and changing. In order to clearly make the separation, the Tri-lab S&CS program decided to create a new title for the requirements. The user requirements became known as the ASC Computational Environment (ACE) Requirements.

  8. Probability-Based Design Criteria of the ASCE 7 Tsunami Loads and Effects Provisions (Invited)

    Science.gov (United States)

    Chock, G.

    2013-12-01

    Mitigation of tsunami risk requires a combination of emergency preparedness for evacuation and structural resilience of critical facilities, infrastructure, and key resources necessary for immediate response and economic and social recovery. Critical facilities would include emergency response, medical, tsunami refuges and shelters, ports and harbors, lifelines, transportation, telecommunications, power, financial institutions, and major industrial/commercial facilities. The Tsunami Loads and Effects Subcommittee of the ASCE/SEI 7 Standards Committee is developing a proposed new Chapter 6 - Tsunami Loads and Effects for the 2016 edition of the ASCE 7 Standard. ASCE 7 provides the minimum design loads and requirements for structures subject to building codes such as the International Building Code utilized in the USA. In this paper we provide a review emphasizing the intent of these new code provisions and explain the design methodology. The ASCE 7 provisions for Tsunami Loads and Effects enable a set of analysis and design methodologies that are consistent with performance-based engineering based on probabilistic criteria. The ASCE 7 Tsunami Loads and Effects chapter will initially be applicable only to the states of Alaska, Washington, Oregon, California, and Hawaii. Ground shaking effects and subsidence from a preceding local offshore Maximum Considered Earthquake will also be considered prior to tsunami arrival for Alaska and states in the Pacific Northwest regions governed by nearby offshore subduction earthquakes. For national tsunami design provisions to achieve a consistent reliability standard of structural performance for community resilience, a new generation of tsunami inundation hazard maps for design is required. The lesson of recent tsunamis is that historical records alone do not provide a sufficient measure of the potential heights of future tsunamis.
Engineering design must consider the occurrence of events greater than

  9. Secure software practices among Malaysian software practitioners: An exploratory study

    Science.gov (United States)

    Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina

    2016-08-01

    Secure software practices are increasingly gaining importance among software practitioners and researchers due to the rise of computer crimes in the software industry. They have become one of the determinant factors for producing high quality software. Even though their importance has been recognized, their current practice in the software industry is still scarce, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was utilized for data collection, whilst statistical methods such as frequency, mean, and cross tabulation were used for data analysis. Outcomes from this study reveal that software practitioners are becoming increasingly aware of the importance of secure software practices; however, they lack appropriate implementation, which could affect the quality of the produced software.

  10. ON APPLYING LOGISTIC CURVE IN SOFTWARE DEVELOPMENT QUALITY PREDICTION

    Institute of Scientific and Technical Information of China (English)

    晏明

    2014-01-01

    The factors affecting software quality go beyond the diversity of development modes. For software development projects with multiple phases, continuous development and continual testing, tracking the overall test quality of the project is significant for project quality control. Our study found that the time curves of cumulative defect counts measured in software development projects basically accord with the Logistic and Gompertz function curves. By adopting VBA programming to traverse all three-point combinations of the measured data, the three curve parameters (L, b, a) giving the best fit (by least squares) between the measured data and each of the two function curves can be computed. Among them, the L value (the saturation value) of the Logistic curve can be used to predict the cumulative defect count of the software development project once the system is stable. By analysing the measured cumulative defect counts and the values predicted by the function curves (determined by the three parameters) during project development and after system release, we found that these function curves can be used to predict and monitor software quality both during development and after release.
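The abstract describes fitting the logistic form via VBA; a functionally similar sketch in Python (with illustrative data, not the paper's measurements) fits L / (1 + b*exp(-a*t)) to cumulative defect counts by least squares and reads off L as the predicted saturation value:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, b, a):
    # Cumulative defect count: saturates at L as t grows
    return L / (1.0 + b * np.exp(-a * t))

# Hypothetical weekly cumulative defect counts from a test phase
weeks = np.arange(1, 13, dtype=float)
defects = np.array([5, 12, 25, 48, 80, 115, 148, 172, 188, 197, 202, 205],
                   dtype=float)

# Least-squares fit; p0 is a rough initial guess for (L, b, a)
(L, b, a), _ = curve_fit(logistic, weeks, defects,
                         p0=[210.0, 100.0, 1.0], maxfev=10000)
```

Here L estimates the total defect count the project will approach once the system stabilizes, which is the quantity the paper uses for prediction and monitoring.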

  11. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  12. Software Architecture in Depth

    OpenAIRE

    Lars Heinemann; Christian Neumann; Birgit Penzenstadler; Wassiou Sitou

    2016-01-01

    The quality of software architecture is one of the crucial success factors for the development of large and/or complex systems. Therefore, a good software architect plays a key role in every demanding project: She or he has the overview of the overall system and sets the framework for the implementation. In order to be successful in this task, software architects need well-founded and encompassing knowledge about design, which exceeds pure programming and specific specialization a...

  13. Next generation software process improvement

    OpenAIRE

    Turnas, Daniel

    2003-01-01

    Approved for public release; distribution is unlimited. Software is often developed under a process that can at best be described as ad hoc. While it is possible to develop quality software under an ad hoc process, formal processes can be developed to help increase the overall quality of the software under development. The application of these processes allows for an organization to mature. The software maturity level, and process improvement, of an organization can be measured with the Cap...

  14. Software Marketing Considerations.

    Science.gov (United States)

    Fuchs, Victor E.

    Seven factors that currently affect the potential for marketing and publishing computer software for education are discussed: (1) computers as an in-place technology in education, (2) marketing and distribution patterns for software, (3) consumer demand, (4) quality, (5) timelessness, (6) basic skills, and (7) the future. The proliferation of…

  15. Software testing and software fault injection

    OpenAIRE

    Kooli, Maha; Bosio, Alberto; Benoit, Pascal; Torres, Lionel

    2015-01-01

    Reliability is one of the most important characteristics of the system quality. It is defined as the probability of failure-free operation of system for a specified period of time in a specified environment. For microprocessor based systems, reliability includes both software and hardware reliability. Many methods and techniques have been proposed in the literature so far to evaluate and test both software faults (e.g., Mutation Testing, Control Flow Testing, Data Flow Testing) and hardware f...

  16. Adipogenic differentiation of scaffold-bound human adipose tissue-derived stem cells (hASC) for soft tissue engineering

    International Nuclear Information System (INIS)

    Adipose tissue engineering, instead of tissue substitution, often uses autologous adipose tissue-derived stem cells (hASC). These cells are known to improve graft integration and to support neovascularization of scaffolds when seeded onto biomaterials. In this study we sought to engineer adipose tissue using scaffold-bound hASC, since they can be differentiated into the adipocyte cell lineage and used for soft tissue regeneration. We show here by microscopy and gene expression of the peroxisome proliferator-activated receptor gene (PPARγ2) that hASC growing on polypropylene fibrous scaffolds as well as on three-dimensional nonwoven scaffolds can be turned into adipose tissue within 19 days. Freshly isolated hASC displayed a higher differentiation potential than hASC cultured for eight passages. In addition, we proved a modified alginate microcapsule to directly induce adipogenic differentiation of incorporated hASC. The results may help to improve long-term success of adipose tissue regeneration, especially for large-scale soft tissue defects, and support the development of cell–scaffold combinations which can be shaped individually and directly induce the adipogenic differentiation of incorporated hASC at the site of implantation. (paper)

  17. Sintering process online quality analysis system based on Minitab software

    Institute of Scientific and Technical Information of China (English)

    王娟

    2012-01-01

    To improve the accuracy of sintering process quality analysis and shorten the analysis cycle time, the analysis functions of Minitab software were embedded into the production process control system through the COM component of Minitab, forming an online quality analysis system for the sintering process based on Minitab; automatic acquisition and statistical functions for process data were also built. The architecture, functions and implementation of the online quality analysis system are explained. Application practice shows that the system runs stably and reliably, and provides a credible and sufficient data basis for process quality control.

  18. An independent monitor unit calculation by commercial software as a part of a radiotherapy treatment planning system quality control

    International Nuclear Information System (INIS)

    For the independent calculation of monitor units (MU), the commercial software RadCalc (Lifeline Software Inc., Tyler TX) was used, chosen from among several similar available programs. The program was configured and used to verify the doses calculated by the commercially available planning system Eclipse version 8.6.17 (Varian Medical System Inc., Palo Alto). This system is used clinically for the creation of treatment plans. The results of each plan were compared to phantom dose measurements made with an ionization chamber at the same point at which the calculations (Eclipse, RadCalc) were done: the isocentre. The TPS is configured with beam data (PDD and OAR). These beam data were exported and afterwards imported into RadCalc, yielding consistent and independent data between the TPS and RadCalc. The reference conditions in RadCalc were set identical to those in the TPS, so consistency between the TPS and RadCalc output factors was achieved (Collimator Scatter Factor: Sc, Phantom Scatter Factor: Sp). These output factors were also measured with the ionizing chamber in a water phantom and compared with the TPS. Based on clinical data of the response to doses, ICRU recommends ensuring the ability of dosimetric systems to deliver doses with an accuracy of at least 5%. Many factors, such as the layout of anatomic structures, patient positioning, and factors related to the accelerator (dose calibration and mechanical parameters), cause random and systematic errors in dose delivery. Problems can also be caused by the system databases and related information transfer, and by the TPS containing, among other things, different dose calculation algorithms. (authors)
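The 5% tolerance cited above suggests a simple acceptance check when comparing the TPS dose with the independent recalculation. This is an illustrative sketch only, with hypothetical function names; it is not part of RadCalc or Eclipse:

```python
def dose_deviation(tps_dose: float, independent_dose: float) -> float:
    """Percent deviation of the independently recalculated dose
    from the TPS-calculated dose at the comparison point."""
    return 100.0 * (independent_dose - tps_dose) / tps_dose

def within_tolerance(tps_dose: float, independent_dose: float,
                     tol_pct: float = 5.0) -> bool:
    """True if the independent check agrees with the TPS within
    the stated tolerance (5% per the ICRU recommendation above)."""
    return abs(dose_deviation(tps_dose, independent_dose)) <= tol_pct

# Example: 2.00 Gy planned vs 2.04 Gy recalculated -> 2% deviation, passes
ok = within_tolerance(2.00, 2.04)
```

Plans failing such a check would be flagged for investigation of the contributing factors (calibration, positioning, algorithm differences) listed in the abstract.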

  19. Software Testing Platform Development and Implementation

    OpenAIRE

    Burian, Vojtěch

    2012-01-01

    Quality is probably the most significant property of a successful software product. As experience with many software projects has shown, leaving testing and quality management out of the software development process can result in vast and critical customer issues, which usually incur additional expenses for the software production company. In the course of time, software testing as a discipline has therefore secured an important position among other software development activities. ...

  20. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  1. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  2. Design and Evaluation of Software Architecture

    OpenAIRE

    Bengtsson, PerOlof

    1999-01-01

    The challenge in software development is to develop software with the right quality levels. The main problem is not to know if a project is technically feasible concerning functionality, but if a solution exists that meet the software quality requirements. It is therefore desired to get an early indication of the qualities of the resulting software. Software architecture is concerned with what modules are used to compose a system and how these modules are related to each other, i.e. the struc...

  3. THE APPLICATION OF ASC-ANALYSIS AND "AIDOS" INTELLIGENT SYSTEM TO SOLVE, IN GENERAL, THE PROBLEM OF IDENTIFYING THE SOURCES AND AUTHORS OF THE STANDARD, NON-STANDARD AND INCORRECT BIBLIOGRAPHIC DESCRIPTIONS

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2014-11-01

    Full Text Available The problem of identifying authors and literary sources from bibliographic descriptions has in recent years acquired increasing scientific and practical value. This is, in particular, due to the policy of the Ministry of Education and Science of the Russian Federation in the field of quality assessment of the results of scientific activity, which involves the use of the number of references to an author's publications and the Hirsch index. In Russia, analytical tools for evaluating the results of scientific activity have appeared that are functionally similar to well-known foreign bibliographic databases such as Scopus and Web of Science. Currently, the best-known such Russian service is the Russian Science Citation Index (RSCI: http://elibrary.ru/. However, as experience shows, references in the bibliography lists of publications are often made in violation of the GOST 7.1-2003 rules, and with erroneous output data, for example, incorrectly specified page numbers, publisher names, etc. In practice, this leads to situations where the software of a bibliographic database cannot determine which article a reference points to and who the authors of that article were. As a result, these authors lose citations, which leads to an underestimation of their Hirsch indexes and of the evaluation of the results of their research activities and leadership. Clearly, these negative consequences should be overcome. This article is devoted to the presentation of an approach which allows this problem to be solved by applying ASC-analysis and the intelligent system named "Aidos", a modern innovative smart technology ready for implementation

  4. THE SOLUTION OF PROBLEMS OF AMPELOGRAPHY BY USING ASC-ANALYSIS OF IMAGES OF LEAVES IN THEIR EXTERNAL CONTOURS (GENERALIZATION, ABSTRACTION, CLASSIFICATION AND IDENTIFICATION

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2015-10-01

    Full Text Available The article discusses the use of automatic systemic-cognitive analysis (ASC-analysis), whose mathematical model is a system theory of information and whose software toolkit is an intellectual system called "Eidos", for the solution of some problems of ampelography: (1) digitization of scanned images of the leaves and creation of their mathematical models; (2) the formation of mathematical models of specific leaves using the system theory of information; (3) the formation of models of generalized images of leaves of various sorts; (4) comparing the image of a specific leaf with the generalized images of leaves of different varieties and finding a quantitative degree of similarity and difference between them, i.e., identification of the variety from the leaf; (5) quantification of the similarities and differences of the varieties, i.e., cluster-constructive analysis of the generalized images of the leaves of different varieties. We propose a new approach to digitizing images of leaves, based on the polar coordinate system, the center of gravity of the image and its external contour. Before scanning, images may be transformed to standardize their position, size and rotation angle. Therefore, the results of digitization and ASC-analysis of the images can be invariant (independent with respect to their position, size and rotation). The specific shape of the contour of a leaf is regarded as noisy information on the variety, comprising information about the true shape of the leaf of the class (the clean signal) and noise which distorts this true form, originating in a random environment. The software toolkit of ASC-analysis, the intellectual "Eidos" system, ensures noise reduction and the extraction of the signal about the true shape of the leaf of each variety on the basis of a number of noisy concrete examples of leaves of this variety. In this way a generalized form of the leaf of each class is created, free from its concrete implementations, i.e., the
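The polar-coordinate digitization idea described above can be sketched as follows: take the contour's center of gravity as origin, express each contour point as a radius at an angle, and normalize the radii for scale invariance. This is a simplified illustration, not the actual "Eidos" implementation:

```python
import numpy as np

def polar_signature(contour: np.ndarray, n_bins: int = 36) -> np.ndarray:
    """Radius-vs-angle signature of a closed contour, taken about its
    center of gravity and normalized for scale invariance.
    `contour` is an (N, 2) array of (x, y) points."""
    c = contour.mean(axis=0)                     # center of gravity
    d = contour - c
    r = np.hypot(d[:, 0], d[:, 1])               # radius of each point
    theta = np.mod(np.arctan2(d[:, 1], d[:, 0]), 2 * np.pi)
    bins = (theta / (2 * np.pi) * n_bins).astype(int) % n_bins
    sig = np.zeros(n_bins)
    for k in range(n_bins):                      # mean radius per angular bin
        sel = r[bins == k]
        sig[k] = sel.mean() if sel.size else 0.0
    return sig / sig.max()                       # scale invariance

# Sanity check: a circle should give a flat (constant) signature
angles = np.linspace(0, 2 * np.pi, 400, endpoint=False)
circle = np.column_stack([np.cos(angles), np.sin(angles)])
sig = polar_signature(circle)
```

Signatures like this could then be compared between a specific leaf and the generalized leaf image of each variety to quantify similarity.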

  5. ASC Tri-lab Co-design Level 2 Milestone Report 2015

    Energy Technology Data Exchange (ETDEWEB)

    Hornung, Rich [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jones, Holger [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Keasler, Jeff [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Neely, Rob [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pearce, Olga [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hammond, Si [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Trott, Christian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lin, Paul [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vaughan, Courtenay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cook, Jeanine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoekstra, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bergen, Ben [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Payne, Josh [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Womeldorff, Geoff [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-09-23

    In 2015, the three Department of Energy (DOE) national laboratories that make up the Advanced Simulation and Computing (ASC) Program (Sandia, Lawrence Livermore, and Los Alamos) collaboratively explored performance-portability programming environments in the context of several ASC co-design proxy applications, as part of a tri-lab L2 milestone executed by the co-design teams at each laboratory. The programming environments that were studied included Kokkos (developed at Sandia), RAJA (LLNL), and Legion (Stanford University). The proxy apps studied included: miniAero, LULESH, CoMD, Kripke, and SNAP. These programming models and proxy apps are described herein. Each lab focused on a particular combination of abstractions and proxy apps, with the goal of assessing performance portability using those. Performance portability was determined by: a) the ability to run a single application source code on multiple advanced architectures, b) comparing runtime performance between

  6. Software: our quest for excellence. Honoring 50 years of software history, progress, and process

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    The Software Quality Forum was established by the Software Quality Assurance (SQA) Subcommittee, which serves as a technical advisory group on software engineering and quality initiatives and issues for DOE's quality managers. The forum serves as an opportunity for all those involved in implementing SQA programs to meet and share ideas and concerns. Participation from managers, quality engineers, and software professionals provides an ideal environment for identifying and discussing issues and concerns. The interaction provided by the forum contributes to the realization of a shared goal--high quality software product. Topics include: testing, software measurement, software surety, software reliability, SQA practices, assessments, software process improvement, certification and licensing of software professionals, CASE tools, software project management, inspections, and management's role in ensuring SQA. The bulk of this document consists of vugraphs. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  7. Preliminary results from a field survey conducted with the Automated Subsurface Characterization System (ASCS)

    Energy Technology Data Exchange (ETDEWEB)

    Cave, S.P.; Creager, J.D. [EG and G Energy Measurements, Albuquerque, NM (United States). Kirtland Operations; Baumgart, C.W. [EG and G Energy Measurements, Los Alamos, NM (United States). Los Alamos Operations

    1995-12-31

    The Automated Subsurface Characterization System (ASCS) is a portable system that uses a ground conductivity sensor and a ground penetrating radar sensor to survey and characterize subsurface objects, conditions, and geology. The objective is to demonstrate the system's capabilities by performing an actual field survey and reducing and reporting the results. The concepts and technologies employed are applicable to a wide range of hazardous waste, ordnance detection, and site survey and monitoring tasks.

  8. A DOMAIN ONTLOGY FOR SOFTWARE PROCESS REUSING

    OpenAIRE

    Aoussat, Fadila; Oussalah, Mourad Chabane; Ahmed-Nacer, Mohamed

    2014-01-01

    Reusing the best practices and know-how capitalized from existing Software Process Models is a promising approach to modeling high-quality Software Processes. This paper presents part of an approach for software process reuse based on software architectures. The contribution is based on exploiting Software Process know-how and the solution proposed after a study of existing work in the software process reuse field; our study focuses on approaches for reuse based on software architectures and ...

  9. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  10. Aeroacoustics research in Europe: The CEAS-ASC report on 2013 highlights

    Science.gov (United States)

    Bennett, G. J.; Kennedy, J.; Meskell, C.; Carley, M.; Jordan, P.; Rice, H.

    2015-03-01

    The Council of European Aerospace Societies (CEAS) Aeroacoustics Specialists Committee (ASC) supports and promotes the interests of the scientific and industrial aeroacoustics community on a European scale and European aeronautics activities internationally. In this context, "aeroacoustics" encompasses all aerospace acoustics and related areas. Each year the committee highlights some of the research and development projects in Europe. This paper is a report on highlights of aeroacoustics research in Europe in 2013, compiled from information provided to the ASC of the CEAS. During 2013, a number of research programmes involving aeroacoustics were funded by the European Commission. Some of the highlights from these programmes are summarised in this paper, as well as highlights from other programmes funded by national programmes or by industry. Furthermore, a concise summary of the CEAS-ASC workshop "Atmospheric and Ground Effects on Aircraft Noise" held in Seville, Spain in September 2013 is included in this report. Enquiries concerning all contributions should be addressed to the authors who are given at the end of each subsection. This issue of the "highlights" paper is dedicated to the memory of Prof. John A. Fitzpatrick, Professor of Mechanical Engineering, Trinity College Dublin, and a valued member of the Aeroacoustics Specialists Committee. John passed away in September 2012 and is fondly missed across the globe by the friends he made in the Aeroacoustics Community. This paper is edited by PhD graduates and colleagues of John's who conduct research in aeroacoustics, inspired by his thirst for knowledge.

  11. Research on Software-Cell-Based Software System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The aim of research on software architecture is to improve the quality attributes of software systems, such as security, reliability, maintainability, testability, reassembility and evolvability. However, a single running system can hardly achieve all these goals. In this paper, the software-cell is introduced as the basic unit throughout the development process. It is then further advanced that a robust, safe and high-quality software system is composed of a running system and four supportive systems. This paper especially discusses the structure of the software-cell, the construction of the five systems, and the relations between them.

  12. Mathematical software production

    Energy Technology Data Exchange (ETDEWEB)

    Cowell, W. R.; Fosdick, L. D.

    1977-01-01

    Locally constructed collections of mathematical routines are gradually being replaced by mathematical software that has been produced for broad dissemination and use. The process of producing such software begins with algorithmic analysis, and proceeds through software construction and documentation to extensive testing and, finally, to distribution and support of the software products. These are demanding and costly activities which require such a range of skills that they are carried out in collaborative projects. The costs and effort are justified by the utility of high-quality software, the efficiency of producing it for general distribution, and the benefits of providing a conduit from research to applications. This paper first reviews certain of the early developments in the field of mathematical software. Then it examines the technical problems that distinguish software production as an intellectual activity, problems whose descriptions also serve to characterize ideal mathematical software. Next, three mathematical software projects are sketched with attention to their emphasis, accomplishments, organization, and costs. Finally, comments are offered on possible future directions for mathematical software production, as extrapolations of the present involvement of universities, government laboratories, and private industry. 48 references.

  13. Are You Breaking Software Laws?

    Science.gov (United States)

    Jobe, Holly

    1985-01-01

    Solutions to concern over computer software copyright infringement include seeking a licensing agreement with the software publisher or negotiating with a supplier for discounts on multiple copies of a program. The best sourcebook for guidance is "Software Quality and Copyright: Issues in Computer-Assisted Instruction," by Virginia Helm. (DCS)

  14. Designing Educational Software for Tomorrow.

    Science.gov (United States)

    Harvey, Wayne

    Designed to address the management and use of computer software in education and training, this paper explores both good and poor software design, calling for improvements in the quality of educational software by attending to design considerations that are based on general principles of learning rather than specific educational objectives. This…

  15. k0-NAA quality assessment by analysis of different certified reference materials using the KAYZERO/SOLCOI software

    International Nuclear Information System (INIS)

    A suite of natural matrix reference materials (RMs) were used to assess the quality of analytical results obtained by k0-instrumental neutron activation analysis (k0-INAA) at the Jozef Stefan Institute (IJS). Five certified reference materials (CRMs) from the Institute for Reference Materials and Measurements (IRMM), two standard reference materials (SRMs) from the National Institute of Standards and Technology (NIST), three RMs from the International Atomic Energy Agency (IAEA) and one RM from IJS were analyzed. Altogether, results for twenty-four elements in inorganic matrices and twenty-nine elements in organic matrices, obtained by k0-INAA, were compared to certified values. Results obtained show good agreement with certified or assigned values except for Fe, La, Nd, Sm and U in inorganic matrices, and Ag, Al and Cr in organic matrices. (author)
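    Quality assessments of this kind typically score each measured element against its certified value using a z-score or a similar statistic. A minimal sketch of that comparison (an illustration of the general approach, not the exact evaluation protocol used at IJS; the uncertainty treatment here is an assumption):

```python
def z_score(measured, certified, u_combined):
    """z = (x_measured - x_certified) / u_combined.

    u_combined is the combined standard uncertainty of the
    measurement and the certified value (assumed supplied).
    """
    return (measured - certified) / u_combined

def satisfactory(measured, certified, u_combined, limit=2.0):
    """|z| <= 2 is a common acceptance criterion in proficiency testing."""
    return abs(z_score(measured, certified, u_combined)) <= limit
```

    With such a helper, each analyte's result can be flagged as in or out of agreement with its certified value, mirroring the per-element agreement statements in the abstract.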

  16. Development of a software for the control of the quality management system of the TRIGA-Mark III reactor

    International Nuclear Information System (INIS)

    Quality has not only become one of the essential requirements of a product; at present it is a key strategic factor on which a large part of organizations depend, not only to maintain their position in the market but also to ensure their survival. Good organizations have processes, procedures and standards in place to confront these challenges. Large organizations require certification of their management systems, and once an organization has obtained this certification, the next step is to maintain it. The implementation and certification of a management system requires an appropriate operative organization that achieves continuous improvements in its operation. This is the case of the TRIGA Mark III reactor, for which a computer program was developed that updates, controls and schedules the activities to be carried out in the facility, making this operative organization available to all of its personnel and avoiding untimely activities. (Author)

  17. Characteristics for Software Optimization Projects

    Directory of Open Access Journals (Sweden)

    Iulian NITESCU

    2008-01-01

    Full Text Available The increasing complexity of software systems requires the identification and implementation of methods and techniques to manage it. The software optimization project is one way in which software complexity is controlled. The software optimization project must answer to the organization's need to earn profit. It is an integral part of the application life cycle because it shares the same resources as, depends on, and influences the other stages. The optimization project has some particularities because it works on a finished product, centering on its quality. The process is quality- and performance-oriented and assumes that the product life cycle is almost finished.

  18. Prediction of Defective Software Modules Using Class Imbalance Learning

    OpenAIRE

    Divya Tomar; Sonali Agarwal

    2016-01-01

    Software defect predictors are useful to maintain the high quality of software products effectively. The early prediction of defective software modules can help the software developers to allocate the available resources to deliver high quality software products. The objective of software defect prediction system is to find as many defective software modules as possible without affecting the overall performance. The learning process of a software defect predictor is difficult due to the imbal...
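    The class-imbalance problem named in the abstract — far fewer defective than non-defective modules — is commonly addressed by rebalancing the training set before learning. A minimal sketch of random oversampling, one generic remedy and not necessarily the technique used in this particular paper:

```python
import random

def oversample_minority(samples, labels, seed=0):
    """Randomly duplicate minority-class samples until all classes
    reach the size of the largest class.

    A generic class-imbalance remedy; more refined schemes (e.g.
    synthetic sampling) interpolate new samples instead.
    """
    rng = random.Random(seed)
    by_class = {}
    for x, y in zip(samples, labels):
        by_class.setdefault(y, []).append(x)
    target = max(len(xs) for xs in by_class.values())
    out_x, out_y = [], []
    for y, xs in by_class.items():
        picks = list(xs)
        while len(picks) < target:
            picks.append(rng.choice(xs))  # duplicate a random sample
        out_x.extend(picks)
        out_y.extend([y] * len(picks))
    return out_x, out_y
```

    After rebalancing, an ordinary classifier is no longer rewarded for simply predicting the majority (non-defective) class everywhere.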

  19. Grid-less imaging with antiscatter correction software in 2D mammography: the effects on image quality and MGD under a partial virtual clinical validation study

    Science.gov (United States)

    Van Peteghem, Nelis; Bemelmans, Frédéric; Bramaje Adversalo, Xenia; Salvagnini, Elena; Marshall, Nicholas; Bosmans, Hilde; Van Ongeval, Chantal

    2016-03-01

    This work investigated the effect of the grid-less acquisition mode with scatter correction software developed by Siemens Healthcare (PRIME mode) on image quality and mean glandular dose (MGD) in a comparative study against a standard mammography system with grid. Image quality was technically quantified with contrast-detail (c-d) analysis and by calculating detectability indices (d') using a non-prewhitening with eye filter model observer (NPWE). MGD was estimated technically using slabs of PMMA and clinically on a set of 11439 patient images. The c-d analysis gave similar results for all mammographic systems examined, although the d' values were slightly lower for the system with PRIME mode when compared to the same system in standard mode (-2.8% to -5.7%, depending on the PMMA thickness). The MGD values corresponding to the PMMA measurements with automatic exposure control indicated a dose reduction from 11.0% to 20.8% for the system with PRIME mode compared to the same system without PRIME mode. The largest dose reductions corresponded to the thinnest PMMA thicknesses. The results from the clinical dosimetry study showed an overall population-averaged dose reduction of 11.6% (up to 27.7% for thinner breasts) for PRIME mode compared to standard mode for breast thicknesses from 20 to 69 mm. These technical image quality measures were then supported using a clinically oriented study whereby simulated clusters of microcalcifications and masses were inserted into patient images and read by radiologists in an AFROC study to quantify their detectability. In line with the technical investigation, no significant difference was found between the two imaging modes (p-value 0.95).

  20. A Survey Of Software Dependability

    OpenAIRE

    Sarma, VVS

    1987-01-01

    This paper presents an overview of the issues in precisely defining, specifying and evaluating the dependability of software, particularly in the context of computer-controlled process systems. Dependability is intended to be a generic term embodying various quality factors and is useful for both software and hardware. While the developments in quality assurance and reliability theories have proceeded mostly in independent directions for hardware and software systems, we present here the ca...

  1. Software architecture analysis of usability

    OpenAIRE

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their software. However, practice shows that product quality (which includes usability, among others) is not as high as it could be. Organizations spend a relatively large amount of money and effort on fixing u...

  2. Reliable software

    OpenAIRE

    Arenas Solà, Concepción; Mestres i Naval, Francesc

    2015-01-01

    In biomedicine, biodiversity and other fields of research, large databases are used. Assuming that a proper statistical procedure has been chosen, a crucial point is the selection of the right software to compute the data. The available software has to be sufficiently proven, with a guarantee that it is reliable. Currently, it is easy to obtain free software for most statistical procedures. We agree that free software is especially useful because a large number of researchers can ...

  3. Software Program: Software Management Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this NASA Software Management Guidebook is twofold. First, this document defines the core products and activities required of NASA software projects. It defines life-cycle models and activity-related methods but acknowledges that no single life-cycle model is appropriate for all NASA software projects. It also acknowledges that the appropriate method for accomplishing a required activity depends on characteristics of the software project. Second, this guidebook provides specific guidance to software project managers and team leaders in selecting appropriate life cycles and methods to develop a tailored plan for a software engineering project.

  4. Quality

    International Nuclear Information System (INIS)

    What is quality? How do you achieve it? How do you keep it once you have got it? The answer for industry at large is the three-step hierarchy of quality control, quality assurance and Total Quality Management. An overview is given of the history of the quality movement, illustrated with examples from Schlumberger operations, as well as the oil industry's approach to quality. An introduction to Schlumberger's quality-associated ClientLink program is presented. 15 figs., 4 ills., 16 refs

  5. SCAM – Software Component Assessment Model

    OpenAIRE

    Hasan Tahir; Aasia Khannum; Ruhma Tahir

    2011-01-01

    It is widely understood that component based development is different from conventional development because components offer accelerated growth. In the absence of an effective component assessment strategy, the developers of a software project have no way of assessing the quality of the software component they are about to incorporate into the project. We present two laws that link software components, software projects and their quality. We further propose a simple software component assessme...

  6. Knowledge Management in Software Process Improvement

    OpenAIRE

    Bjørnson, Finn Olav

    2007-01-01

    Reports of software development projects that miss schedules, exceed budgets and deliver products of poor quality are abundant in the literature. Both researchers and the industry are seeking methods to counter these trends and improve software quality. Software Process Improvement is a systematic approach to improving the capabilities and performance of software organizations. One basic idea is to assess the organizations' current practice and improve their software process on the basis of t...

  7. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what...... of professionalism and systematization of software development to draw a map of strengths and weaknesses. To this end, we conducted as a first step an exploratory survey in Germany, presented in this paper. In this survey, we focused on the perceived importance of the two general software engineering process areas...... project- and quality management and their implementation in practice. So far, our results suggest that the necessity for a systematic software development is well recognized, while software development still follows an ad-hoc rather than a systematized style. Our results provide initial findings, which we...

  8. What Is Software Engineering?

    OpenAIRE

    Dzerzhinskiy, Fedor; Raykov, Leonid D.

    2015-01-01

    A later translation (2015) of the article in Russian published in 1990. The article proposes an approach to defining a set of basic notions for subject area of software engineering discipline. The set of notions is intended to serve as a basis for detection and correction of some widespread conceptual mistakes in the efforts aimed at improving the quality and work productivity in creation and operation of software.

  9. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo
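    Many of the reliability growth models surveyed in texts like this one are non-homogeneous Poisson process (NHPP) models; the classic Goel-Okumoto model is a compact illustration of how such a model predicts cumulative failures over test time. A sketch (parameter values in the usage are purely illustrative, not taken from the book):

```python
import math

def go_mean_failures(a, b, t):
    """Goel-Okumoto NHPP mean value function m(t) = a * (1 - e^{-b t}).

    a: expected total number of faults; b: per-fault detection rate.
    """
    return a * (1.0 - math.exp(-b * t))

def go_intensity(a, b, t):
    """Failure intensity lambda(t) = a * b * e^{-b t}, which decays
    as faults are found and removed (reliability growth)."""
    return a * b * math.exp(-b * t)
```

    The expected number of faults remaining after t units of testing is `a - go_mean_failures(a, b, t)`, which is the quantity that release-policy analyses trade off against further testing cost.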

  10. Advanced fingerprint verification software

    Science.gov (United States)

    Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.

    2016-05-01

    We have developed a fingerprint software package that can be used in a wide range of applications from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door-locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine1. Solid and robust performance is achieved in the presence of misplaced and low quality fingerprints.

  11. Software Engineering for Tagging Software

    OpenAIRE

    Karan Gupta; Anita Goel

    2013-01-01

    Tagging is integrated into web application to ease maintenance of large amount of information stored in a web application. With no mention of requirement specification or design document for tagging software, academically or otherwise, integrating tagging software in a web application is a tedious task. In this paper, a framework has been created for integration of tagging software in a web application. The framework follows the software development life cycle paradigms and is to be used during i...

  12. Software Complexity Methodologies & Software Security

    OpenAIRE

    Masoud Rafighi; Nasser Modiri

    2011-01-01

    It is broadly clear that complexity is one of software's natural features. Software's natural complexity and software requirement functionality are two inseparable parts, each with its own range. Complexity measurement is explained using the McCabe and Halstead models, and an example of software complexity is discussed in this paper. The Henry and Kafura information flow metric, the Agresti-Card-Glass system complexity metric, and item-level design metrics are compared and examined, then cate...
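    McCabe's cyclomatic complexity, one of the models named above, counts linearly independent paths through a routine; for a single function it equals one plus the number of decision points. A rough sketch using Python's `ast` module (an approximation: each boolean-operator chain is counted once, and the node set below is an illustrative choice, not the only defensible one):

```python
import ast

# Node types treated as decision points; each one adds a branch.
_DECISIONS = (ast.If, ast.For, ast.While, ast.IfExp,
              ast.ExceptHandler, ast.And, ast.Or)

def cyclomatic_complexity(source):
    """Approximate McCabe cyclomatic complexity of a source snippet:
    1 + the number of decision-point nodes in its AST."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, _DECISIONS) for node in ast.walk(tree))
```

    For example, a function containing a single `if` scores 2, while a straight-line function scores 1; thresholds on such scores are commonly used to flag routines for refactoring or extra testing.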

  13. LDUA software custodian's notebook

    International Nuclear Information System (INIS)

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements

  14. Clay hydration and crystal growth in expansive anhydritic claystone. The Ascó Power Plant case

    Science.gov (United States)

    Alonso, Eduardo; Ramon, Anna

    2015-04-01

    A large power plant directly founded on a hard claystone experienced, soon after the construction of the foundation slabs, a continuous heave developing at decreasing rate, which has been active for the last 35 years. When undisturbed (i.e., at some depth, in the range of several meters) Ascó claystone exhibits high unconfined compressive strengths (30-40 MPa). In high quality cores the rock has a massive aspect and discontinuities are difficult to observe. The rock has a Tertiary origin and horizontal layers at spacing of 1-4 m could be identified. Whitish seams of gypsum, bassanite or anhydrite are also observed within the reddish rock matrix. Minerals identified in deep cores are quartz (10%), calcite and dolomite (50-70%), clay minerals (10-20%) and gypsum and anhydrite (2-20%). Among the clay minerals, illite dominates (10%). Smectite or smectite-interbedded minerals do not amount in general to more than 5%. The undisturbed rock has a low porosity (6-11%) and low water content (2-5%). Because of the presence of hydrated sulphates, water content and degree of saturation (Sr=0.8-0.9 was found) are somewhat uncertain. However, high suctions were found in recovered cores. This rock changes into a weathered material at shallow depths. Mineralogy is not much affected but porosity increases to 22-29% and water content increases to 10-19%. Strength drops to small values (soil like) and a lower "in situ" suction has been measured (0.4-7.1 MPa). The added pore volume of the weathered material, if compared with the deep rock, is filled with water. The heave of the station was attributed to the hydration of undisturbed rock under the building slabs of the power plant. In fact, large excavations preceded the layout of foundations. As a result, atmospheric water had an easy access to the intact rock. The installation of a compacted soil fill around the buildings allowed the presence of a permanent water table which could infiltrate into the rock. Piezometric data provided

  15. APPLICATION OF INFORMATION THEORY AND A.S.C. ANALYSIS FOR EXPERIMENTAL RESEARCH IN NUMBER THEORY

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2014-03-01

    Full Text Available Is it possible to automate the study of the properties of numbers and their relationships so that the results of this study can be formulated in the form of statements indicating the specific quantity of information stored in them? To answer this question, it is proposed to apply to the properties of numbers in number theory the same method that has been widely tested and proven in studies of real objects and their relations in various fields, namely automated system-cognitive analysis (A.S.C. analysis), based on information theory.

  16. Solidification of radioactive resins by using ASC cement and zeolite blends

    International Nuclear Information System (INIS)

    The solidification of simulated spent radioactive resins is investigated using ASC cement and zeolite blends. The compressive strengths and leaching rates of the solidified objects with various added amounts of zeolite were compared. The microstructures of the matrix were investigated by SEM in order to explain the effect of the zeolite amount on the performance of the solidified object. The experimental results indicate that the addition of zeolite causes a structural shift of the solidified object from pinhead crystal to layer crystal, and that addition of 10%-20% zeolite greatly decreases the leaching rate of Cs; however, it has little influence on the compressive strengths. (authors)

  17. Software Reliability Growth Model with Logistic-Exponential Test-Effort Function and Analysis of Software Release Policy

    OpenAIRE

    Shaik. Mohammad Rafi; Dr.K.Nageswara Rao; Shaheda Akthar

    2010-01-01

    Software reliability is one of the important factors of software quality. Before software is delivered to market it is thoroughly checked and errors are removed. Every software industry wants to develop software that is error free. Software reliability growth models are helping the software industries to develop software which is error free and reliable. In this paper an analysis is done based on incorporating the logistic-exponential testing-effort into NHPP software reliability growt...

  18. Developing free software for automatic registration for the quality control of IMRT with movies; Desarrollo de un software gratuito para el registro automatico durante el control de calidad de la IMRT con peliculas

    Energy Technology Data Exchange (ETDEWEB)

    Moral, F. del; Meilan, E.; Pereira, L.; Salvador, F.; Munoz, V.; Salgado, M.

    2011-07-01

    In this work, as part of the commissioning of e-JMRT, a Monte Carlo calculation network for IMRT planning, software was developed for the automatic registration of the film image with the results of the planning system.

  19. Six Sigma software development

    CERN Document Server

    Tayntor, Christine B

    2002-01-01

    Since Six Sigma has had marked success in improving quality in other settings, and since the quality of software remains poor, it seems a natural evolution to apply the concepts and tools of Six Sigma to system development and the IT department. Until now however, there were no books available that applied these concepts to the system development process. Six Sigma Software Development fills this void and illustrates how Six Sigma concepts can be applied to all aspects of the evolving system development process. It includes the traditional waterfall model and in the support of legacy systems,

  20. The role of colposcopy and typization of human papillomavirus in further diagnostic proceedings in patients with ASC-US cytological finding of the uterine cervix

    Directory of Open Access Journals (Sweden)

    Živadinović Radomir

    2009-01-01

    Full Text Available Background/Aim. In 2001 the Bethesda system for classification of cytological findings introduced two subcategories within the category of atypical squamous cells (ASC): ASC of undetermined significance (ASC-US) and ASC which cannot exclude high-grade intraepithelial lesions (ASC-H). The aim of our study was to assess a possible association of these two subcategories with pathologic biopsy findings and to find out the best further diagnostic proceedings. Methods. At the Clinic of Gynecology and Obstetrics, Niš, 130 patients with ASC findings were analyzed. Colposcopy was performed in all study participants. Patients with pathological colposcopic findings underwent cervical biopsy. In 10 patients with pathologic histologic and 15 with benign findings, human papilloma virus (HPV) typization was done using the Hybrid Capture method. Results. Patients with an ASC-H finding had significantly more pathologic biopsies than patients with an ASC-US finding (57.84 : 20.72). Conclusion. Colposcopy exhibited somewhat higher sensitivity than HPV typization (94.7 : 90), but lower specificity (79.27 : 86.6). The use of HPV typization in the triage of patients with an ASC cytologic smear induces a statistically significant reduction in the percentage of unnecessary cervical biopsies.

  1. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performances. This......, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. These issues we discuss along with use case scenarios. Here in this paper we aim to identify challenges for...... new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper...

  2. Software Transparency

    OpenAIRE

    Leite, Julio Cesar Sampaio do Prado

    2009-01-01

    Software transparency is a new concern that software developers must deal with. As society moves towards the digitalization of day to day processes, the transparency of these digital processes becomes of fundamental importance if citizens would like to exercise their right to know. Informed discourse is only possible if processes that affect the public are open to evaluation. Achieving software transparency to this level of openness brings up several roadblocks. Thi...

  3. Software Engineering in the Academy

    OpenAIRE

    Lokesh Khurana

    2011-01-01

    There is no universally accepted definition of software engineering. For some, software engineering is just a glorified name for programming. If you are a programmer, you might put "software engineer" on your business card but never "programmer." Others have higher expectations. A textbook definition of the term might read something like this: "the body of methods, tools, and techniques intended to produce quality software." Rather than just emphasizing quality, we could distinguish ...

  4. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Saeki, Motoshi; Sunyé, Gerson; Broek, van den Pim; Hruby, Pavel; Frohner, A´ kos

    2002-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and for supporting quality. As such numerous object-oriented software develo

  5. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Saeki, Motoshi; Sunyé, Gerson; Broek, van den Pim; Hruby, Pavel

    2001-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and for supporting quality. As such numerous object-oriented software devel

  6. A Comparative Study of Software Testing Techniques

    OpenAIRE

    Anju Bansal

    2014-01-01

    Software testing is the process used to measure the quality of developed computer software. It exhibits all mistakes, errors and flaws in the developed software. In this paper, the three most prevalent and commonly used software testing techniques for detecting errors are described and compared, they are: white box testing, black box testing and grey box testing.
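    The contrast between the black-box and white-box techniques can be made concrete with a toy function under test: black-box cases are derived from the specification alone (inputs and expected outputs), while white-box cases are chosen to drive execution down each internal branch. A minimal illustration (the triangle classifier is a hypothetical example, not from the paper):

```python
def classify_triangle(a, b, c):
    """Toy function under test: classify a triangle by side lengths."""
    if a + b <= c or a + c <= b or b + c <= a:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

def black_box_tests():
    # Derived only from the stated specification, ignoring the code.
    assert classify_triangle(3, 3, 3) == "equilateral"
    assert classify_triangle(3, 4, 5) == "scalene"

def white_box_tests():
    # Chosen by inspecting the code so that every branch executes.
    assert classify_triangle(1, 1, 5) == "invalid"    # inequality guard
    assert classify_triangle(2, 2, 3) == "isosceles"  # equality branch
```

    Grey-box testing sits between the two: test cases are still specified at the interface, but their selection is informed by partial knowledge of the internal structure.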

  7. University of Utah ASC site review. August 24-25, 2006

    Energy Technology Data Exchange (ETDEWEB)

    Hertel, Eugene S., Jr. (.,; .)

    2007-02-01

    This report is a review of progress made by the Center for the Simulation of Accidental Fires and Explosions (C-SAFE) at the University of Utah, during the ninth year (Fiscal 2006) of its existence as an activity funded by the Department of Energy's Advanced Simulation and Computing Program (ASC). The ten-member Review Team composed of the TST and AST spent two days (August 24-25, 2006) at the University, reviewing formal presentations and demonstrations by the C-SAFE researchers and conferring privately. The Review Team found that the C-SAFE project administrators and staff had prepared well for the review. C-SAFE management and staff openly shared extensive answers to unexpected questions and the advance materials were well prepared and very informative. We believe that the time devoted to the review was used effectively and hope that the recommendations included in this 2006 report will provide helpful guidance to C-SAFE personnel and ASC managers.

  8. Comparison of lysimeter based and calculated ASCE reference evapotranspiration in a subhumid climate

    Science.gov (United States)

    Nolz, Reinhard; Cepuder, Peter; Eitzinger, Josef

    2016-04-01

    The standardized form of the well-known FAO Penman-Monteith equation, published by the Environmental and Water Resources Institute of the American Society of Civil Engineers (ASCE-EWRI), is recommended as a standard procedure for calculating reference evapotranspiration (ETref) and subsequently plant water requirements. Applied and validated under different climatic conditions it generally achieved good results compared to other methods. However, several studies documented deviations between measured and calculated reference evapotranspiration depending on environmental and weather conditions. Therefore, it seems generally advisable to evaluate the model under local environmental conditions. In this study, reference evapotranspiration was determined at a subhumid site in northeastern Austria from 2005 to 2010 using a large weighing lysimeter (ETlys). The measured data were compared with ETref calculations. Daily values differed slightly during a year, at which ETref was generally overestimated at small values, whereas it was rather underestimated when ET was large, which is supported also by other studies. In our case, advection of sensible heat proved to have an impact, but it could not explain the differences exclusively. Obviously, there were also other influences, such as seasonal varying surface resistance or albedo. Generally, the ASCE-EWRI equation for daily time steps performed best at average weather conditions. The outcomes should help to correctly interpret ETref data in the region and in similar environments and improve knowledge on the dynamics of influencing factors causing deviations.
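    The standardized ASCE-EWRI Penman-Monteith equation evaluated in this study has a closed form; a sketch for daily time steps with the short-reference constants Cn = 900 and Cd = 0.34 (deriving the meteorological inputs — slope of the vapour pressure curve, psychrometric constant, vapour pressures — is the caller's responsibility per the standard, and the numbers in the usage note are illustrative only):

```python
def asce_etref_daily(delta, rn, g, gamma, t_mean, u2, es, ea,
                     cn=900.0, cd=0.34):
    """ASCE-EWRI standardized reference ET for daily steps (mm/day).

    delta:  slope of the saturation vapour pressure curve (kPa/degC)
    rn, g:  net radiation and soil heat flux (MJ m-2 day-1)
    gamma:  psychrometric constant (kPa/degC)
    t_mean: mean daily air temperature (degC)
    u2:     wind speed at 2 m height (m/s)
    es, ea: saturation and actual vapour pressure (kPa)
    cn, cd: numerator/denominator constants (short reference, daily)
    """
    num = (0.408 * delta * (rn - g)
           + gamma * cn * u2 * (es - ea) / (t_mean + 273.0))
    den = delta + gamma * (1.0 + cd * u2)
    return num / den
```

    With no available energy (rn == g) and a saturated atmosphere (es == ea) the equation returns zero, a quick sanity check; typical mid-season inputs yield a few mm/day.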

  9. Optimizing the ASC WAN: evaluating network performance tools for comparing transport protocols.

    Energy Technology Data Exchange (ETDEWEB)

    Lydick, Christopher L.

    2007-07-01

    The Advanced Simulation & Computing Wide Area Network (ASC WAN), which is a high delay-bandwidth network connection between US Department of Energy National Laboratories, is constantly being examined and evaluated for efficiency. One of the current transport-layer protocols which is used, TCP, was developed for traffic demands which are different from that on the ASC WAN. The Stream Control Transport Protocol (SCTP), on the other hand, has shown characteristics which make it more appealing to networks such as these. Most important, before considering a replacement for TCP on any network, a testing tool that performs well against certain criteria needs to be found. In order to try to find such a tool, two popular networking tools (Netperf v.2.4.3 & v.2.4.6 (OpenSS7 STREAMS), and Iperf v.2.0.6) were tested. These tools implement both TCP and SCTP and were evaluated using four metrics: (1) How effectively can the tool reach a throughput near the bandwidth? (2) How much of the CPU does the tool utilize during operation? (3) Is the tool freely and widely available? And, (4) Is the tool actively developed? Following the analysis of those tools, this paper goes further into explaining some recommendations and ideas for future work.
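    Metric (1) — how close a tool's reported throughput gets to the available bandwidth — reduces to timing a bulk transfer. A toy loopback analogue of what netperf and iperf measure (real tools additionally control socket buffer sizes and timing windows, support SCTP, and report far more detail):

```python
import socket
import threading
import time

def measure_tcp_throughput(payload_mb=8):
    """Push payload_mb of data over a loopback TCP connection and
    return the achieved goodput in Mbit/s."""
    chunk = b"x" * 65536
    total = payload_mb * 1024 * 1024
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))   # ephemeral port
    srv.listen(1)
    port = srv.getsockname()[1]
    received = []

    def sink():
        # Server side: drain bytes until the full payload arrives.
        conn, _ = srv.accept()
        n = 0
        while n < total:
            data = conn.recv(65536)
            if not data:
                break
            n += len(data)
        received.append(n)
        conn.close()

    t = threading.Thread(target=sink)
    t.start()
    cli = socket.create_connection(("127.0.0.1", port))
    start = time.perf_counter()
    sent = 0
    while sent < total:
        cli.sendall(chunk)
        sent += len(chunk)
    cli.close()
    t.join()
    elapsed = time.perf_counter() - start
    srv.close()
    return (received[0] * 8) / (elapsed * 1e6)  # bits over microseconds
```

    On a high delay-bandwidth WAN path the same measurement becomes sensitive to window sizes and congestion control, which is exactly why tool behaviour (and CPU cost, metric 2) must be evaluated before trusting its numbers.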

  10. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  11. Assessing software quality by micro patterns detection

    OpenAIRE

    Destefanis, Giuseppe

    2013-01-01

    One of the goals of Software Engineering is to reduce, or at least to try to control, the defectiveness of software systems during the development phase. Software engineers need empirical evidence that software metrics are related to software quality. Unfortunately, software quality is quite an elusive concept, software being an immaterial entity that cannot be physically measured in traditional ways. In general, software quality means many things. In software, the narrowest s...

  12. Scientific Software Component Technology

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, S.; Dykman, N.; Kumfert, G.; Smolinski, B.

    2000-02-16

    We are developing new software component technology for high-performance parallel scientific computing to address issues of complexity, re-use, and interoperability for laboratory software. Component technology enables cross-project code re-use, reduces software development costs, and provides additional simulation capabilities for massively parallel laboratory application codes. The success of our approach will be measured by its impact on DOE mathematical and scientific software efforts. Thus, we are collaborating closely with library developers and application scientists in the Common Component Architecture forum, the Equation Solver Interface forum, and other DOE mathematical software groups to gather requirements, write and adopt a variety of design specifications, and develop demonstration projects to validate our approach. Numerical simulation is essential to the science mission at the laboratory. However, it is becoming increasingly difficult to manage the complexity of modern simulation software. Computational scientists develop complex, three-dimensional, massively parallel, full-physics simulations that require the integration of diverse software packages written by outside development teams. Currently, the integration of a new software package, such as a new linear solver library, can require several months of effort. Current industry component technologies such as CORBA, JavaBeans, and COM have all been used successfully in the business domain to reduce software development costs and increase software quality. However, these existing industry component infrastructures will not scale to support massively parallel applications in science and engineering. In particular, they do not address issues related to high-performance parallel computing on ASCI-class machines, such as fast in-process connections between components, language interoperability for scientific languages such as Fortran, parallel data redistribution between components, and massively
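    The integration cost the abstract describes is what a common component interface is meant to eliminate: the application codes against an agreed contract, and a new solver library plugs in behind it. The sketch below is a hypothetical, much-simplified illustration in the spirit of the Equation Solver Interface effort mentioned above; the class names and the plain-list matrix representation are my own, not part of any actual forum specification.

    ```python
    from abc import ABC, abstractmethod

    class LinearSolver(ABC):
        """Hypothetical component contract: any conforming solver can replace
        another without changes to the application code."""

        @abstractmethod
        def solve(self, matrix, rhs):
            """Return x such that matrix @ x is approximately rhs."""

    class GaussSeidelSolver(LinearSolver):
        """One interchangeable implementation: Gauss-Seidel iteration,
        suitable for diagonally dominant systems."""

        def __init__(self, iters=200):
            self.iters = iters

        def solve(self, matrix, rhs):
            n = len(rhs)
            x = [0.0] * n
            for _ in range(self.iters):
                for i in range(n):
                    # Update x[i] using the latest values of the other unknowns
                    s = sum(matrix[i][j] * x[j] for j in range(n) if j != i)
                    x[i] = (rhs[i] - s) / matrix[i][i]
            return x
    ```

    An application holding a `LinearSolver` reference never names the concrete library, so swapping in a different solver package becomes a one-line change rather than a months-long integration.
    
    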

  13. Atypical squamous cells, cannot exclude high grade squamous intraepithelial lesion (ASC-H) in HIV-positive women

    Directory of Open Access Journals (Sweden)

    Michelow Pam

    2010-01-01

    Objective: South Africa has very high rates of both HIV infection and cervical pathology. The management of ASC-H is colposcopy and directed biopsy, but with so many women diagnosed with HSIL and a dearth of colposcopy centres in South Africa, women with a cytologic diagnosis of ASC-H may not be prioritized for colposcopy. The aim of this study was to determine whether HIV-positive women with a cytologic diagnosis of ASC-H should undergo immediate colposcopy or whether colposcopy can be delayed, within the context of an underfunded health care setting with many competing health needs. Materials and Methods: A computer database search was performed on the archives of an NGO-administered clinic that offers comprehensive HIV care. All women with a cytologic diagnosis of ASC-H on cervical smears from September 2005 until August 2009 were identified. Histologic follow-up was sought for all patients. Results: A total of 2111 cervical smears were performed, and 41 were diagnosed as ASC-H (1.94%). No histologic follow-up data were available in 15 cases. Follow-up histologic results were as follows: three negative (11.5%), five koilocytosis and/or CIN1 (19.2%), ten CIN2 (38.5%), and eight CIN3 (30.8%). There were no cases of invasive carcinoma on follow-up. Conclusion: The current appropriate management of HIV-positive women in low-resource settings with a diagnosis of ASC-H on cervical smear is colposcopy, despite the costs involved. In the future, and if cost-effective in developing nations, the use of novel markers may help select which HIV-positive women can be managed conservatively and which should be referred for more active treatment. More research in this regard is warranted.

  14. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Reviews two software packages for the Macintosh series. "Course Builder 2.0," a courseware authoring system, allows the user to create programs which stand alone and may be used independently in the classroom. "World Builder," an artificial intelligence software package, allows creative thinking, problem-solving, and decision-making. (YP)

  15. Software RISCO

    OpenAIRE

    Silva, J.M.G.; F. S. Barbosa

    2008-01-01

    For the processes used in the development of the RISCO software (software for embroiderers) to be effective, they had to take into account the environmental factors (human and material) to be replicated in a digital application. To that end, the observation of data and behaviours was essential, in order to deliver positive changes resulting from the use of the new work tool.

  16. AJL Software

    OpenAIRE

    2012-01-01

    AJL Software is a company dedicated to custom software development. The segment the business will focus on is currently booming in Colombia, thanks to the advantages that computing tools offer through the modernization and automation that technology brings to all companies.

  17. Reuse-based software production technology

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Software reuse is viewed as a key technology to improve software product quality and productivity. This paper discusses a series of technologies related with software reuse and software component technology: component model, which describes component's essential characteristics; component acquisition technology, of which domain engineering is the main approach; component management technology, of which component library is the kernel; application integration and composition technology, of which application engineering is the main approach; software evolution technology, of which software reengineering is the main approach, etc. This paper introduces the software development environment: JadeBird Software Production Line System, which effectively integrates the above-mentioned technologies.
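    The component-library idea named above (the kernel of component management technology) amounts to registering reusable components with metadata and retrieving them by query. The toy sketch below illustrates that mechanism only; the class, the facet scheme, and the registered components are all hypothetical and are not taken from the JadeBird system.

    ```python
    class ComponentLibrary:
        """Toy component library: components are registered with facet
        metadata and later retrieved by matching facet values."""

        def __init__(self):
            self._entries = []

        def register(self, name, factory, **facets):
            # factory is a callable producing the component; facets classify it
            self._entries.append({"name": name, "factory": factory, "facets": facets})

        def find(self, **facets):
            # Return the names of components whose facets match every query key
            return [e["name"] for e in self._entries
                    if all(e["facets"].get(k) == v for k, v in facets.items())]

    lib = ComponentLibrary()
    lib.register("csv_reader", lambda: None, domain="io", format="csv")
    lib.register("json_reader", lambda: None, domain="io", format="json")
    print(lib.find(domain="io", format="csv"))   # -> ['csv_reader']
    ```

    Real component libraries add versioning, interface descriptions, and retrieval by faceted classification schemes, but the register/query cycle is the core of the reuse workflow.
    
    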

  18. NASA PC software evaluation project

    Science.gov (United States)

    Dominick, Wayne D. (Editor); Kuan, Julie C.

    1986-01-01

    The USL NASA PC software evaluation project is intended to provide a structured framework for facilitating the development of quality NASA PC software products. The project will assist NASA PC development staff to understand the characteristics and functions of NASA PC software products. Based on the results of the project teams' evaluations and recommendations, users can judge the reliability, usability, acceptability, maintainability and customizability of all the PC software products. The objective here is to provide initial, high-level specifications and guidelines for NASA PC software evaluation. The primary tasks to be addressed in this project are as follows: to gain a strong understanding of what software evaluation entails and how to organize a structured software evaluation process; to define a structured methodology for conducting the software evaluation process; to develop a set of PC software evaluation criteria and evaluation rating scales; and to conduct PC software evaluations in accordance with the identified methodology. The PC software categories evaluated are: Communication Packages, Network System Software, Graphics Support Software, Environment Management Software, and General Utilities. This report represents one of the 72 attachment reports to the University of Southwestern Louisiana's Final Report on NASA Grant NGT-19-010-900. Accordingly, appropriate care should be taken in using this report out of context of the full Final Report.

  19. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  20. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.