WorldWideScience

Sample records for implementing software quality

  1. An Approach for the Implementation of Software Quality Models Adopting CERTICS and CMMI-DEV

    Directory of Open Access Journals (Sweden)

    GARCIA, F.W.

    2015-12-01

    Full Text Available This paper proposes a mapping between two software product quality and process models used in industry: the CERTICS national model and the CMMI-DEV international model. The stages of the mapping are presented step by step, along with a review of the mapping carried out in cooperation with a specialist in the CERTICS and CMMI-DEV models. The aim is to correlate the structures of the two models in order to reduce implementation time and costs and to encourage multi-model implementations in software development companies.

  2. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. During the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.
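
    As a concrete illustration of the kind of metric described above, the sketch below computes defect density (defects per thousand lines of code) and ranks defect causes by frequency. It is a minimal, hypothetical Python example; the module names, defect categories, and figures are invented for illustration and are not taken from the NASA project discussed in the abstract.

```python
from collections import Counter

def defect_density(defects: int, loc: int) -> float:
    """Defects per thousand lines of code (KLOC)."""
    return defects / (loc / 1000.0)

# Hypothetical per-module data: (module, lines of code, list of defect causes).
modules = [
    ("telemetry", 12_400, ["requirements", "logic", "logic", "interface"]),
    ("scheduler",  8_900, ["logic", "data-handling"]),
    ("ui",         4_300, ["interface", "interface", "requirements"]),
]

for name, loc, causes in modules:
    print(f"{name:10s} defect density = {defect_density(len(causes), loc):.2f} per KLOC")

# Rank defect causes across all modules to find the ones with the greatest effect.
all_causes = Counter(c for _, _, cs in modules for c in cs)
print("Most common causes:", all_causes.most_common(3))
```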

  3. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. * Includes supplementary website with an instructor's guide and solutions * Applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI) * Illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors

  4. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  5. SU-E-T-103: Development and Implementation of Web Based Quality Control Software

    International Nuclear Information System (INIS)

    Studinski, R; Taylor, R; Angers, C; La Russa, D; Clark, B

    2014-01-01

    Purpose: Historically, many radiation medicine programs have maintained their Quality Control (QC) test results in paper records or Microsoft Excel worksheets. Both approaches present significant logistical challenges and do not lend themselves to data review and approval. Our group's aim has been to develop and implement web-based software designed not just to record and store QC data in a centralized database, but also to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment quality control program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres, the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QATrack+ has 180 tests defined in its database and has collected ∼22 000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, and TomoTherapy and CyberKnife units. Currently at least 5 other centres are known to be running QATrack+ clinically, forming the start of an international user community. Conclusion: QATrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/
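
    The "process control analysis of test results" mentioned above is commonly done with statistical control charts. The sketch below is a minimal, generic Python example of that idea, not code from QATrack+ itself: it computes upper and lower control limits (mean ± 3 standard deviations) from a QC test history and flags out-of-control points. The sample values and the output-factor scenario are invented.

```python
import statistics

def control_limits(values, n_sigma=3.0):
    """Return (mean, lower, upper) control limits for a QC test history."""
    mean = statistics.fmean(values)
    sigma = statistics.stdev(values)
    return mean, mean - n_sigma * sigma, mean + n_sigma * sigma

# Hypothetical daily output-factor measurements for one linac QC test.
history = [1.002, 0.998, 1.001, 0.999, 1.000, 1.003, 0.997, 1.001]
new_result = 1.021

mean, lower, upper = control_limits(history)
status = "OK" if lower <= new_result <= upper else "OUT OF CONTROL - review required"
print(f"baseline mean={mean:.4f}, limits=({lower:.4f}, {upper:.4f}), new={new_result} -> {status}")
```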

  6. SU-E-T-103: Development and Implementation of Web Based Quality Control Software

    Energy Technology Data Exchange (ETDEWEB)

    Studinski, R; Taylor, R; Angers, C; La Russa, D; Clark, B [The Ottawa Hospital Regional Cancer Ctr., Ottawa, ON (Canada)

    2014-06-01

    Purpose: Historically, many radiation medicine programs have maintained their Quality Control (QC) test results in paper records or Microsoft Excel worksheets. Both approaches present significant logistical challenges and do not lend themselves to data review and approval. Our group's aim has been to develop and implement web-based software designed not just to record and store QC data in a centralized database, but also to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment quality control program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres, the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QATrack+ has 180 tests defined in its database and has collected ∼22 000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, and TomoTherapy and CyberKnife units. Currently at least 5 other centres are known to be running QATrack+ clinically, forming the start of an international user community. Conclusion: QATrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.

  7. Software quality challenges.

    OpenAIRE

    Fitzpatrick, Ronan; Smith, Peter; O'Shea, Brendan

    2004-01-01

    This paper sets out a number of challenges facing the software quality community. These challenges relate to the broader view of quality and the consequences for software quality definitions. These definitions are related to eight perspectives of software quality in an end-to-end product life cycle. Research and study of software quality has traditionally focused on product quality for management information systems and this paper considers the challenge of defining additional quality factors...

  8. A prolog implementation of pattern search to optimize software quality assurance

    OpenAIRE

    Buzzard, Raymond Karl

    1990-01-01

    Approved for public release; distribution is unlimited. Quality Assurance (QA) is a critical factor in the development of successful software systems. Through the use of various QA tools, project managers can ensure that a desired level of performance and reliability is built into the system. However, these tools are not without cost. Project managers must weigh all QA costs and benefits for each development environment before ...
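
    The thesis title refers to a Prolog implementation of pattern search; as a generic illustration of what a pattern search is (this is not the thesis code, and the toy QA cost function and step sizes are invented), the Python sketch below performs a simple Hooke-Jeeves-style coordinate search that probes each variable in both directions, keeps improvements, and shrinks the step when none is found.

```python
def pattern_search(f, x0, step=1.0, shrink=0.5, tol=1e-3):
    """Minimal Hooke-Jeeves-style pattern search: probe +/- step along each
    coordinate, keep any improvement, shrink the step when none is found."""
    x = list(x0)
    best = f(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x[:]
                trial[i] += delta
                val = f(trial)
                if val < best:
                    x, best, improved = trial, val, True
        if not improved:
            step *= shrink
    return x, best

# Toy QA trade-off: x[0] = review hours, x[1] = test hours (illustrative only).
cost = lambda x: (x[0] - 40) ** 2 + (x[1] - 60) ** 2 + 0.1 * (x[0] + x[1])
print(pattern_search(cost, [0.0, 0.0]))
```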

  9. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  10. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: Choosing the right architecture for the market – laboratory, military, or commercial Hardware platforms – FPGAs, GPPs, specialized and hybrid devices Standardization efforts to ens...

  11. Implementing Quality Assurance for the Numerical Research Software Dune / PDELab / DuMux

    Science.gov (United States)

    Flemisch, B.; Bastian, P.; Kempf, D.; Koch, T.; Helmig, R.

    2015-12-01

    Quality assurance, and in particular automated testing, should be one of the key elements of modern software development. However, applying common techniques from software engineering to numerical frameworks such as Dune may be challenging, since the requirements for a test may be very different from those of standard software. This talk gives an overview of our work in describing system tests for numerical software and developing test tools to ensure that qualitative and quantitative properties of PDE discretizations are preserved. The developed tools are employed in the Dune discretization module Dune-PDELab and the porous-media simulator DuMux. The newly developed module dune-testtools provides the following components: a domain-specific language for feature modelling that is naturally integrated into the workflow of numerical simulation; tools to test whether a given PDE discretization still yields the correct result without performance (or scalability) regressions; integration of the above tools into a CMake-based build system; and extensions to the Dune core modules to support the development of system tests.
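
    One common way to check that the quantitative properties of a PDE discretization are preserved is to pin the observed order of convergence in a regression test. The Python sketch below is a generic illustration of that idea rather than dune-testtools code: it estimates the convergence order from errors measured on successively refined grids (the error values are fabricated for a nominally second-order scheme) and fails if the order drops below an expected threshold.

```python
import math

def observed_order(errors, ratio=2.0):
    """Estimate convergence orders from errors on grids refined by `ratio`."""
    return [math.log(e0 / e1) / math.log(ratio) for e0, e1 in zip(errors, errors[1:])]

def test_discretization_order():
    # Hypothetical L2 errors from runs at h, h/2, h/4, h/8 (second-order scheme).
    errors = [1.6e-2, 4.1e-3, 1.0e-3, 2.6e-4]
    orders = observed_order(errors)
    # Regression check: the asymptotic order must not fall below 2 minus a tolerance.
    assert min(orders[-2:]) > 2.0 - 0.1, f"convergence order regressed: {orders}"

test_discretization_order()
print("convergence regression test passed")
```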

  12. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  13. Software to implement the IAEA-TECDOC 1517: quality control in mammography

    International Nuclear Information System (INIS)

    Mora Rodriguez, Patricia

    2011-01-01

    Mammography quality control is presented, together with guidelines for establishing a quality control program: the tests to be performed, the equipment required, frequencies and tolerances. The work is intended for radiologists, medical physicists and technologists. Examples of data collection sheets, a minimum equipment list and an overview of a quality control program are also included. Two projects (RLA/9/057 and RLA/9/067) set the goal of developing software that provides a simple and effective way to automate the tests contained in the TECDOC. Version 1 was presented at IRPA 12 (2008) and was tested in several countries to detect problems in its use and to identify possible improvements. The final revised version for distribution to member countries was completed in 2010. The way the project was carried out and the people who worked on it are described, as well as the complete program, its importance, and the plans of the Universidad de Costa Rica to continue working with it. (author) [es

  14. Software product quality measurement

    OpenAIRE

    Godliauskas, Eimantas

    2016-01-01

    This paper analyses Ruby product quality measures, suggesting three new measures for the Ruby product quality measurement tool RuboCop to measure the Ruby product quality characteristics defined in the ISO 2502n standard series. This paper consists of four main chapters. The first chapter gives a brief view of software product quality and software product quality measurement. The second chapter analyses object oriented quality measures. The third chapter gives a brief view of the most popular Ruby qualit...

  15. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  16. [Implementation of a web based software for documentation and control of quality of an acute pain service].

    Science.gov (United States)

    Pawlik, Michael T; Abel, Reinhard; Abt, Gregor; Kieninger, Martin; Graf, Bernhard Martin; Taeger, Kai; Ittner, Karl Peter

    2009-07-01

    Providing an acute pain service means accumulating a large amount of data. Easing data collection and improving data quality and data analysis play a pivotal role. The electronic medical record (EMR) is gaining more and more importance in this context and is continuously spreading in clinical practice. To date, only a few commercial software products are available that specifically fit the needs of an acute pain service. Here we report the development and implementation of such a program (Schmerzvisite, Medlinq, Hamburg, Germany) in the acute pain service of a university hospital.

  17. Software product quality control

    CERN Document Server

    Wagner, Stefan

    2013-01-01

    Quality is not a fixed or universal property of software; it depends on the context and goals of its stakeholders. Hence, when you want to develop a high-quality software system, the first step must be a clear and precise specification of quality. Yet even if you get it right and complete, you can be sure that it will become invalid over time. So the only solution is continuous quality control: the steady and explicit evaluation of a product's properties with respect to its updated quality goals.This book guides you in setting up and running continuous quality control in your environment. Star

  18. Design and implementation of software for automated quality control and data analysis for a complex LC/MS/MS assay for urine opiates and metabolites.

    Science.gov (United States)

    Dickerson, Jane A; Schmeling, Michael; Hoofnagle, Andrew N; Hoffman, Noah G

    2013-01-16

    Mass spectrometry provides a powerful platform for performing quantitative, multiplexed assays in the clinical laboratory, but at the cost of increased complexity of analysis and quality assurance calculations compared to other methodologies. Here we describe the design and implementation of a software application that performs quality control calculations for a complex, multiplexed, mass spectrometric analysis of opioids and opioid metabolites. The development and implementation of this application improved our data analysis and quality assurance processes in several ways. First, use of the software significantly improved the procedural consistency for performing quality control calculations. Second, it reduced the amount of time technologists spent preparing and reviewing the data, saving on average over four hours per run, and in some cases improving turnaround time by a day. Third, it provides a mechanism for coupling procedural and software changes with the results of each analysis. We describe several key details of the implementation including the use of version control software and automated unit tests. These generally useful software engineering principles should be considered for any software development project in the clinical lab. Copyright © 2012 Elsevier B.V. All rights reserved.
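
    The abstract highlights version control and automated unit tests as broadly applicable software engineering practices. As a generic illustration (not code from the application described), the Python sketch below unit-tests a simple QC calculation of the kind such software automates: checking that a qualifier/quantifier ion-ratio falls within an acceptance window. The function name, tolerance, and values are hypothetical.

```python
import unittest

def ion_ratio_within_tolerance(quant_area: float, qual_area: float,
                               expected_ratio: float, tolerance: float = 0.20) -> bool:
    """Return True if the qual/quant ion ratio is within +/- tolerance of the expected ratio."""
    if quant_area <= 0:
        raise ValueError("quantifier peak area must be positive")
    ratio = qual_area / quant_area
    return abs(ratio - expected_ratio) <= tolerance * expected_ratio

class IonRatioTest(unittest.TestCase):
    def test_ratio_in_window(self):
        self.assertTrue(ion_ratio_within_tolerance(10000, 4900, expected_ratio=0.5))

    def test_ratio_out_of_window(self):
        self.assertFalse(ion_ratio_within_tolerance(10000, 2000, expected_ratio=0.5))

    def test_invalid_peak_area(self):
        with self.assertRaises(ValueError):
            ion_ratio_within_tolerance(0, 100, expected_ratio=0.5)

if __name__ == "__main__":
    unittest.main()
```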

  19. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

    This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  20. Roadmap for Peridynamic Software Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Littlewood, David John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    The application of peridynamics for engineering analysis requires an efficient and robust software implementation. Key elements include processing of the discretization, the proximity search for identification of pairwise interactions, evaluation of the constitutive model, application of a bond-damage law, and contact modeling. Additional requirements may arise from the choice of time integration scheme, for example estimation of the maximum stable time step for explicit schemes, and construction of the tangent stiffness matrix for many implicit approaches. This report summarizes progress to date on the software implementation of the peridynamic theory of solid mechanics. Discussion is focused on parallel implementation of the meshfree discretization scheme of Silling and Askari [33] in three dimensions, although much of the discussion applies to computational peridynamics in general.
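
    As an illustration of the "proximity search for identification of pairwise interactions" mentioned above, the Python sketch below builds a neighbor list of all node pairs closer than the peridynamic horizon using a brute-force search. This is a didactic sketch only; production codes (including the parallel implementation the report describes) use spatial binning or tree structures instead of the O(n²) loop shown here, and the node coordinates and horizon value are invented.

```python
import itertools
import math

def build_neighbor_list(nodes, horizon):
    """Return the list of index pairs (i, j), i < j, with |x_i - x_j| <= horizon."""
    pairs = []
    for (i, xi), (j, xj) in itertools.combinations(enumerate(nodes), 2):
        dist = math.dist(xi, xj)          # Euclidean distance in 3D
        if dist <= horizon:
            pairs.append((i, j))
    return pairs

# Hypothetical regular 3x3x3 lattice of nodes with spacing 1.0 and horizon 1.5.
spacing, horizon = 1.0, 1.5
nodes = [(x * spacing, y * spacing, z * spacing)
         for x in range(3) for y in range(3) for z in range(3)]
bonds = build_neighbor_list(nodes, horizon)
print(f"{len(nodes)} nodes, {len(bonds)} bonds within horizon {horizon}")
```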

  1. Modernization of software quality assurance

    Science.gov (United States)

    Bhaumik, Gokul

    1988-01-01

    Customer satisfaction depends not only on functional performance but also on the quality characteristics of the software products. An examination of this quality aspect of software products will provide a clear, well defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following aspects which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition for a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree of an identifiable set of quality attributes present in all software systems and their products. This paper is an explanation of how this definition was developed and how it is used.

  2. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property that indicates what standards software should meet. In a software project, quality is a key factor in the success or decline of a software organization. Much research has been done regarding software quality. Software organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good-quality software. Quality is divided into three main layers which are Software Quality Assurance (SQA), Software Qu...

  3. Methods of Software Quality Assurance under a Nuclear Quality Assurance Program

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Lee, Young Jun; Cha, Kyung Ho; Cheon, Se Woo; Lee, Jang Soo; Kwon, Kee Choon

    2005-01-01

    This paper addresses a substantial implementation of software quality assurance under a nuclear quality assurance program. The relationship of the responsibility between a top-level nuclear quality assurance program such as ASME/NQA-1 and its lower-level software quality assurance is described. Software quality assurance activities and software quality assurance procedures during the software development life cycle are also described.

  4. Research and implementation of software automatic test

    Science.gov (United States)

    Li-hong, LIAN

    2017-06-01

    With the fast development of IT technology, software is increasingly large and complex. Development teams of hundreds of people, thousands of modules and interfaces, and users spread across geographies and systems are no longer unusual. All of this places higher requirements on software testing. Due to its low implementation cost and the advantage of effectively inheriting and accumulating test assets, automated software testing has gradually become one of the important means by which IT enterprises ensure software quality. This paper analyzes the advantages of automated testing and common misconceptions about it; identifies unsuitable application scenarios and the best time to introduce automation; focuses on analyzing the feasibility of interface automation testing; and describes the functions and elements an interface automation test tool should have, providing a reference for tool selection or custom development for interface test automation in large-scale projects.
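
    As a minimal illustration of interface-level test automation of the kind discussed above (interpreted here as HTTP API testing; the endpoint URL, fields, and expected values are hypothetical and not from the paper), the Python sketch below exercises a JSON endpoint with the widely used requests library and asserts on the status code and payload.

```python
import requests

BASE_URL = "https://api.example.com"   # hypothetical service under test

def test_get_user_returns_expected_fields():
    """Interface-level check: status code, content type, and required JSON fields."""
    resp = requests.get(f"{BASE_URL}/users/42", timeout=5)
    assert resp.status_code == 200
    assert resp.headers["Content-Type"].startswith("application/json")
    body = resp.json()
    for field in ("id", "name", "email"):
        assert field in body, f"missing field: {field}"
    assert body["id"] == 42

if __name__ == "__main__":
    test_get_user_returns_expected_fields()
    print("interface test passed")
```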

  5. Software as quality product

    International Nuclear Information System (INIS)

    Enders, A.

    1975-01-01

    In many discussions on the reliability of computer systems, software is presented as the weak link in the chain. This contribution attempts to identify the reasons for this situation as seen from a software development perspective. The concepts of correctness and reliability of programmes are explained as they are understood in the specialist discussion of today. Measures and methods are discussed that are particularly relevant to obtaining fault-free and reliable programmes. Conclusions are drawn for the user of software so that he is in a position to judge for himself what can justly be expected from software as a product compared to other products. (orig./LH) [de

  6. Software Quality Assurance Audits Guidebooks

    Science.gov (United States)

    1990-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes that are used in software development. The Software Assurance Guidebook, NASA-GB-A201, issued in September, 1989, provides an overall picture of the NASA concepts and practices in software assurance. Second level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the second level Software Quality Assurance Audits Guidebook that describes software quality assurance audits in a way that is compatible with practices at NASA Centers.

  7. Software Quality Assurance for Nuclear Safety Systems

    International Nuclear Information System (INIS)

    Sparkman, D R; Lagdon, R

    2004-01-01

    The US Department of Energy has undertaken an initiative to improve the quality of software used to design and operate their nuclear facilities across the United States. One aspect of this initiative is to revise or create new directives and guides associated with quality practices for the safety software in its nuclear facilities. Safety software includes the safety structures, systems, and components software and firmware, support software and design and analysis software used to ensure the safety of the facility. DOE nuclear facilities are unique when compared to commercial nuclear or other industrial activities in terms of the types and quantities of hazards that must be controlled to protect workers, public and the environment. Because of these differences, DOE must develop an approach to software quality assurance that ensures appropriate risk mitigation by developing a framework of requirements that accomplishes the following goals: • Ensures the software processes developed to address nuclear safety in design, operation, construction and maintenance of its facilities are safe; • Considers the larger system that uses the software and its impacts; • Ensures that the software failures do not create unsafe conditions. Software designers for nuclear systems and processes must reduce risks in software applications by incorporating processes that recognize, detect, and mitigate software failure in safety related systems. It must also ensure that fail safe modes and component testing are incorporated into software design. For nuclear facilities, the consideration of risk is not necessarily sufficient to ensure safety. Systematic evaluation, independent verification and system safety analysis must be considered for software design, implementation, and operation. The software industry primarily uses risk analysis to determine the appropriate level of rigor applied to software practices. This risk-based approach distinguishes safety

  8. Software quality: Process or people

    Science.gov (United States)

    Palmer, Regina; Labaugh, Modenna

    1993-01-01

    This paper will present data related to software development processes and personnel involvement from the perspective of software quality assurance. We examine eight years of data collected from six projects. Data collected varied by project but usually included defect and fault density with limited use of code metrics, schedule adherence, and budget growth information. The data are a blend of AFSCP 800-14 and suggested productivity measures in Software Metrics: A Practitioner's Guide to Improved Product Development. A software quality assurance database tool, SQUID, was used to store and tabulate the data.

  9. First experiences with the implementation of the European standard EN 62304 on medical device software for the quality assurance of a radiotherapy unit.

    Science.gov (United States)

    Höss, Angelika; Lampe, Christian; Panse, Ralf; Ackermann, Benjamin; Naumann, Jakob; Jäkel, Oliver

    2014-03-21

    According to the latest amendment of the Medical Device Directive, standalone software qualifies as a medical device when intended by the manufacturer to be used for medical purposes. In this context, the EN 62304 standard is applicable, which defines the life-cycle requirements for the development and maintenance of medical device software. A pilot project was launched to acquire skills in implementing this standard in a hospital-based environment (in-house manufacture). The EN 62304 standard outlines minimum requirements for each stage of the software life-cycle, defines the activities and tasks to be performed, and scales documentation and testing according to the criticality of the software. The required processes were established for the pre-existing decision-support software FlashDumpComparator (FDC) used during the quality assurance of treatment-relevant beam parameters. As the EN 62304 standard entails compliance with the EN ISO 14971 standard on the application of risk management to medical devices, a risk analysis was carried out to identify potential hazards and reduce the associated risks to acceptable levels. The EN 62304 standard is difficult to implement without proper tools, so open-source software was selected and integrated into a dedicated development platform. The control measures yielded by the risk analysis were independently implemented and verified, and script-based test automation was retrofitted to reduce the associated test effort. After all documents facilitating the traceability of the specified requirements to the corresponding tests, and of the control measures to the proof of execution, were generated, the FDC was released as an accessory to the HIT facility. The implementation of the EN 62304 standard was time-consuming, and a learning curve had to be overcome during the first iterations of the associated processes, but many process descriptions and all software tools can be re-utilized in follow-up projects. It has been demonstrated that a

  10. First experiences with the implementation of the European standard EN 62304 on medical device software for the quality assurance of a radiotherapy unit

    International Nuclear Information System (INIS)

    Höss, Angelika; Lampe, Christian; Panse, Ralf; Ackermann, Benjamin; Naumann, Jakob; Jäkel, Oliver

    2014-01-01

    According to the latest amendment of the Medical Device Directive, standalone software qualifies as a medical device when intended by the manufacturer to be used for medical purposes. In this context, the EN 62304 standard is applicable, which defines the life-cycle requirements for the development and maintenance of medical device software. A pilot project was launched to acquire skills in implementing this standard in a hospital-based environment (in-house manufacture). The EN 62304 standard outlines minimum requirements for each stage of the software life-cycle, defines the activities and tasks to be performed, and scales documentation and testing according to the criticality of the software. The required processes were established for the pre-existing decision-support software FlashDumpComparator (FDC) used during the quality assurance of treatment-relevant beam parameters. As the EN 62304 standard entails compliance with the EN ISO 14971 standard on the application of risk management to medical devices, a risk analysis was carried out to identify potential hazards and reduce the associated risks to acceptable levels. The EN 62304 standard is difficult to implement without proper tools, so open-source software was selected and integrated into a dedicated development platform. The control measures yielded by the risk analysis were independently implemented and verified, and script-based test automation was retrofitted to reduce the associated test effort. After all documents facilitating the traceability of the specified requirements to the corresponding tests, and of the control measures to the proof of execution, were generated, the FDC was released as an accessory to the HIT facility. The implementation of the EN 62304 standard was time-consuming, and a learning curve had to be overcome during the first iterations of the associated processes, but many process descriptions and all software tools can be re-utilized in follow-up projects. It has been demonstrated that a

  11. R&D software quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Hood, F.C.

    1991-10-01

    Research software quality assurance (QA) requirements must be adequate to strengthen development or modification objectives, but flexible enough not to restrict creativity. Application guidelines are needed for the different kinds of research and development (R&D) software activities to assure project objectives are achieved.

  12. Survey on Impact of Software Metrics on Software Quality

    OpenAIRE

    Mrinal Singh Rawat; Arpita Mittal; Sanjay Kumar Dubey

    2012-01-01

    Software metrics provide a quantitative basis for planning and predicting software development processes. Therefore, the quality of software can be controlled and improved easily. Quality in fact aids higher productivity, which has brought software metrics to the forefront. This research paper focuses on different views on software quality. Moreover, many metrics and models have been developed, promoted, and utilized, resulting in remarkable successes. This paper examines the realm of software e...

  13. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.
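
    To make the idea of allocating system quality objectives to hardware and software concrete, the short Python sketch below works a hypothetical reliability-allocation example, not taken from the paper: if hardware and software failures are assumed independent, the system reliability is the product of the two, so a system target and a fixed hardware figure imply the reliability the software must deliver. The numbers are invented.

```python
# Hypothetical series-system allocation: R_system = R_hardware * R_software
# (assumes independent hardware and software failures).
r_system_target = 0.990    # required mission reliability
r_hardware = 0.997         # allocated/estimated hardware reliability

r_software_required = r_system_target / r_hardware
print(f"software reliability must be >= {r_software_required:.4f}")
# -> software reliability must be >= 0.9930
```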

  14. Software quality engineering a practitioner's approach

    CERN Document Server

    Suryn, Witold

    2014-01-01

    Software quality stems from two distinctive, but associated, topics in software engineering: software functional quality and software structural quality. Software Quality Engineering studies the tenets of both of these notions, which focus on the efficiency and value of a design, respectively. The text addresses engineering quality on both the application and system levels with attention to Information Systems and Embedded Systems as well as recent developments. Targeted at graduate engineering students and software quality specialists, the book analyzes the relationship between functionality

  15. PPM Receiver Implemented in Software

    Science.gov (United States)

    Gray, Andrew; Kang, Edward; Lay, Norman; Vilnrotter, Victor; Srinivasan, Meera; Lee, Clement

    2010-01-01

    A computer program has been written as a tool for developing optical pulse-position- modulation (PPM) receivers in which photodetector outputs are fed to analog-to-digital converters (ADCs) and all subsequent signal processing is performed digitally. The program can be used, for example, to simulate an all-digital version of the PPM receiver described in Parallel Processing of Broad-Band PPM Signals (NPO-40711), which appears elsewhere in this issue of NASA Tech Briefs. The program can also be translated into a design for digital PPM receiver hardware. The most notable innovation embodied in the software and the underlying PPM-reception concept is a digital processing subsystem that performs synchronization of PPM time slots, even though the digital processing is, itself, asynchronous in the sense that no attempt is made to synchronize it with the incoming optical signal a priori and there is no feedback to analog signal processing subsystems or ADCs. Functions performed by the software receiver include time-slot synchronization, symbol synchronization, coding preprocessing, and diagnostic functions. The program is written in the MATLAB and Simulink software system. The software receiver is highly parameterized and, hence, programmable: for example, slot- and symbol-synchronization filters have programmable bandwidths.
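
    As a simplified illustration of pulse-position-modulation reception (this is not the NASA receiver's algorithm; the PPM order, samples per slot, and signal values below are invented), the Python/NumPy sketch integrates detector samples over each time slot of a symbol and decides the transmitted symbol as the slot with the largest energy.

```python
import numpy as np

SLOTS_PER_SYMBOL = 16        # hypothetical PPM order
SAMPLES_PER_SLOT = 4         # hypothetical ADC samples per slot

def demodulate_ppm(samples):
    """Decide each PPM symbol as the index of the slot with maximum integrated energy."""
    samples = samples.reshape(-1, SLOTS_PER_SYMBOL, SAMPLES_PER_SLOT)
    slot_energy = samples.sum(axis=2)            # integrate within each slot
    return slot_energy.argmax(axis=1)            # one symbol decision per row

# Build one noisy test symbol with the pulse in slot 9.
rng = np.random.default_rng(0)
signal = rng.normal(0.0, 0.1, SLOTS_PER_SYMBOL * SAMPLES_PER_SLOT)
signal[9 * SAMPLES_PER_SLOT:10 * SAMPLES_PER_SLOT] += 1.0   # the optical pulse
print(demodulate_ppm(signal))   # expected: [9]
```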

  16. Set up and programming of an ALICE Time-Of-Flight trigger facility and software implementation for its Quality Assurance (QA) during LHC Run 2

    CERN Document Server

    Toschi, Francesco

    2016-01-01

    The Cosmic and Topology Trigger Module (CTTM) is the main component of a trigger based on the ALICE TOF detector. Taking advantage of the TOF fast response, this VME board implements the trigger logic and delivers several L0 trigger outputs, used since Run 1, to provide cosmic triggers and rare triggers in pp, p+Pb and Pb+Pb data taking. Due to the TOF DCS architectural change of the PCs controlling the CTTM (from 32-bit to 64-bit), it is mandatory to upgrade the software related to the CTTM, including the code programming the FPGA firmware. A dedicated CTTM board will be installed in a CERN lab (Meyrin site), with the aim of recreating the electronics chain of the TOF trigger and allowing a comfortable porting of the code to the 64-bit environment. The project proposed to the summer student is the setting up of the CTTM and the porting of the software. Moreover, in order to monitor the CTTM trigger board during real data taking, the implementation of new Quality Assurance (QA) code is also crucial, together wit...

  17. The Effects of Development Team Skill on Software Product Quality

    Science.gov (United States)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, and was derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics

  18. Implementing Software Safety in the NASA Environment

    Science.gov (United States)

    Wetherholt, Martha S.; Radley, Charles F.

    1994-01-01

    Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g. mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies, and sound software and assurance practices, to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of

  19. Software Quality Certification: identifying the real obstacles

    Directory of Open Access Journals (Sweden)

    Megan Baker

    1996-05-01

    Full Text Available A case study of software certification reveals the real difficulty of certifying quality beyond superficial assessment - readers are invited to form their own conclusions. AS 3563 Software Quality Management System is the Australian version of ISO 9001, developed specifically for the software industry. For many Australian software houses, gaining certification with AS 3563 is a priority, since certification has become a prerequisite to doing business with government departments and major corporations. However, achieving registration with this standard is a lengthy and resource-intensive process, and may have little impact on actual software quality. This case study recounts the experience of the consulting arm of one of Australia's accounting firms in its quest for certification. By using a number of specific management strategies this company was able to successfully implement AS 3563 in less than half the time usually taken to achieve certification - a feat for which its management should be congratulated. However, because the focus of the project was on gaining certification, few internal benefits have been realised despite the successful implementation of the standard.

  20. Implementing an open-access CASA software for the assessment of stallion sperm motility: Relationship with other sperm quality parameters.

    Science.gov (United States)

    Giaretta, Elisa; Munerato, Mauro; Yeste, Marc; Galeati, Giovanna; Spinaci, Marcella; Tamanini, Carlo; Mari, Gaetano; Bucci, Diego

    2017-01-01

    Setting up an open-access computer-assisted sperm analysis (CASA) system may benefit the evaluation of motility in mammalian sperm, especially when economic constraints do not allow the use of a commercial system. There have been successful attempts to develop such a device for zebrafish sperm, but the system has been used in very few studies on mammalian spermatozoa. Against this background, the present study aimed at developing an open-access CASA system for mammalian sperm using the horse as a model, based upon the ImageJ software previously established for zebrafish sperm. Along with determining sperm progressive motility and other kinetic parameters (such as amplitude of lateral head displacement), the "results" window was adjusted to simplify subsequent statistical analyses. The path window was enriched with sperm trajectories colored on the basis of the subpopulation they belong to and a number that allows the sperm track to be associated with the sperm motility data shown in the "results" window. Data obtained from the novel plugin (named CASA_bgm) were compared with those of the commercial CASA Hamilton-Thorne IVOS Vers. 12 through Bland-Altman plots. While the percentages of total and progressive motile sperm and VCL, VAP, VSL, LIN, STR and ALH were in agreement with those obtained with the commercial system, BCF significantly differed between the two systems, probably due to their settings. Interestingly, a positive and significant correlation between the percentages of total motile sperm evaluated through CASA_bgm and those showing high mitochondrial membrane potential evaluated by JC-1 staining was found. In conclusion, the CASA_bgm ImageJ plugin could be useful and reliable for stallion sperm motility analysis, and it is our aim to apply this system to other mammalian species. Copyright © 2016 Elsevier B.V. All rights reserved.
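
    For readers unfamiliar with the kinetic parameters listed above, the Python sketch below shows, on an invented 2D head trajectory, how three of them are commonly defined: curvilinear velocity (VCL, path length per unit time), straight-line velocity (VSL, net displacement per unit time) and linearity (LIN = VSL/VCL). This is a generic didactic example, not code from the CASA_bgm plugin, and the coordinates and frame rate are hypothetical.

```python
import math

def kinematics(track, frame_interval_s):
    """track: list of (x, y) head positions in micrometres, one per video frame."""
    duration = (len(track) - 1) * frame_interval_s
    path_len = sum(math.dist(a, b) for a, b in zip(track, track[1:]))
    net_disp = math.dist(track[0], track[-1])
    vcl = path_len / duration            # curvilinear velocity, um/s
    vsl = net_disp / duration            # straight-line velocity, um/s
    lin = vsl / vcl if vcl else 0.0      # linearity (0..1)
    return vcl, vsl, lin

# Hypothetical zig-zag trajectory sampled at 60 frames per second.
track = [(0, 0), (2, 1.5), (4, -1), (6, 1.5), (8, -1), (10, 0.5)]
vcl, vsl, lin = kinematics(track, frame_interval_s=1 / 60)
print(f"VCL={vcl:.1f} um/s  VSL={vsl:.1f} um/s  LIN={lin:.2f}")
```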

  1. lessons and challenges from software quality assessment

    African Journals Online (AJOL)

    DJFLEX

    We discussed these lessons and challenges across two measurable characteristics, namely quality of design (life cycle ... KEYWORDS: Software, Software Quality, Quality Standard, Characteristics, Assessment, Challenges, Lessons. ... F. Bakpo, Department of Computer Science, University of Nigeria, Nsukka, Nigeria ...

  2. Software quality assurance plan for PORFLOW-3D

    International Nuclear Information System (INIS)

    Maheras, S.J.

    1993-03-01

    This plan describes the steps taken by the Idaho National Engineering Laboratory Subsurface and Environmental Modeling Unit personnel to implement software quality assurance procedures for the PORFLOW-3D computer code. PORFLOW-3D was used to conduct radiological performance assessments at the Savannah River Site. Software quality assurance procedures for PORFLOW-3D include software acquisition, installation, testing, operation, maintenance, and retirement. Configuration control and quality assurance procedures are also included or referenced in this plan.

  3. Results of agile project management implementation in software engineering companies

    Directory of Open Access Journals (Sweden)

    Suetin Sergei

    2016-01-01

    Full Text Available Agile project management methodologies, tools and techniques have been becoming more and more popular among Russian and foreign software companies. Though agile software project management methodologies appear to be a more flexible and sound alternative to traditional project management approaches, the practice of agile project management implementation needs more research. The article presents research on the practical results of agile project management implementation in Russian software engineering companies. The survey-based study covers 8 companies and 35 of their projects managed with the help of agile methodologies. In contrast to some optimistic researchers of agile practices, the research findings show that in the investigated projects agile project management led to a deterioration of cost and schedule performance. However, the quality, both as perceived by clients and as assessed by internal technical analysts, improved after the implementation of agile software project management practices.

  4. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  5. Software Quality Assurance activities of ITER CODAC

    International Nuclear Information System (INIS)

    Pande, Sopan; DiMaio, Franck; Kim, Changseung; Kim, Joohan; Klotz, Wolf-Dieter; Makijarvi, Petri; Stepanov, Denis; Wallander, Anders

    2013-01-01

    Highlights: ► Comprehensive and consistent software engineering and quality assurance of CODAC. ► Applicable to all CODAC software projects executed by ITER DAs and contractors. ► Configurable plans for cost effective application of SQA processes. ► CODAC software plans SQAP, SVVP, SDP, and SCMP. ► CODAC software processes based on IEEE 12207-2008. -- Abstract: Software, as an integral part of the plant system I&C, is crucial in the manufacturing and integrated operation of ITER plant systems. Software Quality Assurance is necessary to ensure the development and maintenance of consistently high-quality I&C software throughout the lifetime of ITER. CODAC decided to follow the IEEE 12207-2008 software lifecycle processes for Software Engineering and Software Quality Assurance. The Software Development Plan, Software Configuration Management Plan and Software Verification and Validation Plan are the mainstay of Software Quality Assurance, which is documented in the Software Quality Assurance Plan. This paper describes the Software Quality Assurance (SQA) activities performed by CODAC. The SQA includes development and maintenance of the above plans, processes and resources. With the help of Verification and Validation Teams, they gather evidence of process and product conformance, record process data for quality audits, and perform process improvements

  6. Software Quality Improvement in the OMC Team

    CERN Document Server

    Maier, Viktor

    Physicists use self-written software as a tool to fulfill their tasks, and often the developed software is used for several years or even decades. If a software product lives for a long time, it has to be changed and adapted to external influences. This implies that the source code has to be read, understood and modified. The same applies to the software of the Optics Measurements and Corrections (OMC) team at CERN. Their task is to track, analyze and correct the beams in the LHC and other accelerators. To solve this task, they rely on a self-written software base with more than 150,000 physical lines of code. The base is subject to continuous changes as well. Their software does its job and is effective, but regrettably it does not run efficiently, because some parts of the source code are in bad shape and of low quality. The implementation could be faster and more memory efficient. In addition it is difficult to read and understand the code. Source code files and functions are too big and identifiers do not rev...

  7. Static and Dynamic Software Quality Metric Tools

    OpenAIRE

    Mayo, Kevin A.; Wake, Steven A.; Henry, Sallie M.

    1990-01-01

    The ability to detect and predict poor software quality is of major importance to software engineers, managers, and quality assurance organizations. Poor software quality leads to increased development costs and expensive maintenance. With so much attention on exacerbated budgetary constraints, a viable alternative is necessary. Software quality metrics are designed for this purpose. Metrics measure aspects of code or PDL representations, and can be collected and used throughout the life ...

  8. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    Energy Technology Data Exchange (ETDEWEB)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  9. Evaluating predictive models of software quality

    International Nuclear Information System (INIS)

    Ciaschini, V; Canaparo, M; Ronchieri, E; Salomoni, D

    2014-01-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies considerable churn in the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to estimate the risk before releasing a program, so as to deliver only software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we conclude by suggesting directions for further studies.
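
    The release policy described above (ship only when the predicted risk is below an agreed threshold) can be illustrated with a toy example. The Python sketch below scores a package from a few code-history metrics using a logistic function; the metric names, weights, and threshold are made up for illustration and are not the models evaluated in the article.

```python
import math

# Hypothetical weights for a logistic defect-risk model (illustrative only).
WEIGHTS = {"churned_loc_per_kloc": 0.004, "developers": 0.15, "open_defects": 0.30}
BIAS = -3.0
RISK_THRESHOLD = 0.40          # agreed maximum acceptable release risk

def defect_risk(metrics: dict) -> float:
    """Probability-like risk score in (0, 1) from code-history metrics."""
    z = BIAS + sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

package = {"churned_loc_per_kloc": 250, "developers": 6, "open_defects": 2}
risk = defect_risk(package)
decision = "release" if risk < RISK_THRESHOLD else "hold for further testing"
print(f"predicted risk = {risk:.2f} -> {decision}")
```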

  10. The medical software quality deployment method.

    Science.gov (United States)

    Hallberg, N; Timpka, T; Eriksson, H

    1999-03-01

    The objective of this study was to develop a Quality Function Deployment (QFD) model for the design of information systems in health-care environments. Consecutive blocked-subject case studies were conducted, based on action research methods. Starting with a QFD model for software development, a model for information system design, the Medical Software Quality Deployment (MSQD) model, was developed. The MSQD model was divided into the pre-study phase, in which the customer categories and their power to influence the design are determined; the data collection phase, in which the voice of the customer (VoC) is identified by observations and interviews and quantified by Critical Incident questionnaires; the need specification phase, where the VoC is specified into ranked customer needs; and the design phase, where the customer needs are transformed stepwise into technical requirements and design attributes. QFD proved to be useful for integrating the values of different customer categories in software development for health-care settings. In the later design phases, other quality methods should be used for software implementation and testing.

  11. SWiFT Software Quality Assurance Plan.

    Energy Technology Data Exchange (ETDEWEB)

    Berg, Jonathan Charles [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

    This document describes the software development practice areas and processes which contribute to the ability of SWiFT software developers to provide quality software. These processes are designed to satisfy the requirements set forth by the Sandia Software Quality Assurance Program (SSQAP). The plan (SAND2016-0765, Issue A, released 2016/01/27 by Jon Berg, 06121) was approved by the Department Manager Dave Minster (6121), the SWiFT Site Lead Jonathan White (6121), and the SWiFT Controls Engineer Jonathan Berg (6121).

  12. Swarming Robot Design, Construction and Software Implementation

    Science.gov (United States)

    Stolleis, Karl A.

    2014-01-01

    This paper presents an overview of the hardware design, construction, software design, and software implementation of a small, low-cost robot to be used for swarming robot development. In addition to the work done on the robot, a full simulation of the robotic system was developed using the Robot Operating System (ROS) and its associated simulation tools. The eventual use of the robots will be the exploration of evolving behaviors via genetic algorithms, building on the work done at the University of New Mexico Biological Computation Lab.

  13. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management's and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  14. Small PACS implementation using publicly available software

    Science.gov (United States)

    Passadore, Diego J.; Isoardi, Roberto A.; Gonzalez Nicolini, Federico J.; Ariza, P. P.; Novas, C. V.; Omati, S. A.

    1998-07-01

    Building cost-effective PACS solutions is a main concern in developing countries. Hardware and software components are generally much more expensive than in developed countries, and tighter financial constraints are the main reasons contributing to a slow rate of PACS implementation. The extensive use of the Internet for sharing resources and information has brought a broad number of freely available software packages to an ever-increasing number of users. In the field of medical imaging it is possible to find image format conversion packages, DICOM-compliant servers for all kinds of service classes, databases, web servers, and tools for image visualization, manipulation, and analysis. This paper describes a PACS implementation for review and storage built on freely available software. It currently integrates four diagnostic modalities (PET, CT, MR and NM), a Radiotherapy Treatment Planning workstation, and several computers in a local area network for image storage, database management, and image review, processing, and analysis. It also includes a web-based application that allows remote users to query the archive for studies from any workstation and to view the corresponding images and reports. We conclude that the advantage of using this approach is twofold: it allows a full understanding of all the issues involved in the implementation of a PACS, and it contributes to keeping costs down while enabling the development of a functional system for storage, distribution, and review that can prove helpful for radiologists and referring physicians.

  15. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Liddle, P.

    2006-01-01

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM R XS (TXS) system is classified 1E, as defined in the Inst. of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  16. Engineering high quality medical software

    CERN Document Server

    Coronato, Antonio

    2018-01-01

    This book focuses on high-confidence medical software in the growing field of e-health, telecare services and health technology. It covers the development of methodologies and engineering tasks together with standards and regulations for medical software.

  17. Implementation of free software for quality control of IMRT

    Energy Technology Data Exchange (ETDEWEB)

    Chinillace, N.; Alonso, S.; Cortina, T.; Reinado, D.; Ricos, B.; Diaz, S.; Campayo, J. M.

    2011-07-01

    In this paper we focus on the implementation and launch of software that allows us to compare quantitatively the two-dimensional dose distributions calculated and measured experimentally in IMRT treatments. The tool we use to make this comparison is the freely distributed software DoseLab. This is an open-source program written in MATLAB, which in some cases allows users to adapt the program to their needs. The program is able to calculate the gamma function of these distributions, a parameter that simultaneously evaluates the difference in dose between two pixels of the image and the distance between them, giving us an objective, quantitative measure that allows us to decide whether the two distributions are compatible or not.
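    The gamma comparison mentioned above combines a dose-difference criterion with a distance-to-agreement criterion for every pixel. As a rough, hedged illustration of the idea (not DoseLab's actual algorithm), the Python sketch below computes a simplified global 2D gamma index for two dose planes; the 3%/3 mm criteria, the grid spacing, and the synthetic dose arrays are assumptions made for the example.

```python
import numpy as np

def gamma_index(ref, evl, spacing_mm=1.0, dose_tol=0.03, dist_tol_mm=3.0):
    """Simplified global 2D gamma: brute-force search in a small neighbourhood.

    ref, evl : 2D numpy arrays of dose on the same grid.
    dose_tol : dose-difference criterion as a fraction of the reference maximum.
    dist_tol_mm : distance-to-agreement criterion in millimetres.
    """
    dd = dose_tol * ref.max()                       # absolute dose criterion
    search = int(np.ceil(dist_tol_mm / spacing_mm)) # neighbourhood radius in pixels
    gamma = np.full(ref.shape, np.inf)
    rows, cols = ref.shape
    for i in range(rows):
        for j in range(cols):
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ii, jj = i + di, j + dj
                    if not (0 <= ii < rows and 0 <= jj < cols):
                        continue
                    dist2 = ((di * spacing_mm) ** 2 + (dj * spacing_mm) ** 2) / dist_tol_mm ** 2
                    dose2 = (evl[ii, jj] - ref[i, j]) ** 2 / dd ** 2
                    gamma[i, j] = min(gamma[i, j], np.sqrt(dist2 + dose2))
    return gamma

# Example: two nearly identical synthetic dose planes; the pass rate is the
# fraction of pixels with gamma <= 1.
ref = np.random.default_rng(0).random((40, 40)) * 2.0   # Gy, synthetic
evl = ref + np.random.default_rng(1).normal(0, 0.02, ref.shape)
g = gamma_index(ref, evl)
print(f"gamma pass rate: {(g <= 1).mean():.1%}")
```

    A typical acceptance rule in practice is that a stated fraction of pixels (often above 95%) must satisfy gamma <= 1, although the exact threshold is a clinical choice.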

  18. A SYSTEMATIC STUDY OF SOFTWARE QUALITY MODELS

    OpenAIRE

    Dr.Vilas. M. Thakare; Ashwin B. Tomar

    2011-01-01

    This paper aims to provide a basis for software quality model research through a systematic study of papers. It identifies nearly seventy software quality research papers from journals and classifies each paper by research topic, estimation approach, study context and data set. The paper's results, combined with other knowledge, provide support for recommendations in future software quality model research, to increase the area of search for relevant studies, carefully select the papers within a set ...

  19. NIF Projects Controls and Information Systems Software Quality Assurance Plan

    Energy Technology Data Exchange (ETDEWEB)

    Fishler, B

    2011-03-18

    Quality achievement for the National Ignition Facility (NIF) and the National Ignition Campaign (NIC) is the responsibility of the NIF Projects line organization as described in the NIF and Photon Science Directorate Quality Assurance Plan (NIF QA Plan). This Software Quality Assurance Plan (SQAP) is subordinate to the NIF QA Plan and establishes quality assurance (QA) activities for the software subsystems within Controls and Information Systems (CIS). This SQAP implements an activity level software quality assurance plan for NIF Projects as required by the LLNL Institutional Software Quality Assurance Program (ISQAP). Planned QA activities help achieve, assess, and maintain appropriate quality of software developed and/or acquired for control systems, shot data systems, laser performance modeling systems, business applications, industrial control and safety systems, and information technology systems. The objective of this SQAP is to ensure that appropriate controls are developed and implemented for management planning, work execution, and quality assessment of the CIS organization's software activities. The CIS line organization places special QA emphasis on rigorous configuration control, change management, testing, and issue tracking to help achieve its quality goals.

  20. Software Quality in the Objectory Software Development Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Demeyer, S.; Bosch, J.

    In this paper we discuss how software quality assurance is realized in Rational Objectory. Although much support is given through guidelines and checkpoints, the tool fails to provide clear goals and metrics for quality assessments and it only partially supports the phases in a measurement program.

  1. Requirements engineering: foundation for software quality

    NARCIS (Netherlands)

    Daneva, Maia; Pastor, Oscar

    2016-01-01

    Welcome to the proceedings of the 22nd edition of REFSQ: the International Working Conference on Requirements Engineering – Foundation for Software Quality! Requirements engineering (RE) has been recognized as a critical factor that impacts the quality of software, systems, and services. Since the

  2. Assessing Software Quality Through Visualised Cohesion Metrics

    Directory of Open Access Journals (Sweden)

    Timothy Shih

    2001-05-01

    Full Text Available Cohesion is one of the most important factors for software quality as well as maintainability, reliability and reusability. Module cohesion is defined as a quality attribute that seeks to measure the singleness of purpose of a module. A module of poor quality can be a serious obstacle to system quality. In order to design software of good quality, software managers and engineers need to introduce cohesion metrics to measure and produce desirable software. Highly cohesive software is thought to be a desirable construction. In this paper, we propose a function-oriented cohesion metric based on the analysis of live variables, live span and the visualization of the processing element dependency graph. We give six typical cohesion examples to be measured as our experiments and justification. A well-defined, well-normalized, well-visualized and well-experimented cohesion metric is therefore proposed to indicate and thus enhance software cohesion strength. Furthermore, this cohesion metric can easily be incorporated into a software CASE tool to help software engineers improve software quality.
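    The metric above is built from live-variable and live-span analysis, which is not reproduced in the abstract. Purely as a hedged sketch of a variable-sharing cohesion measure in the same spirit (not the authors' formula), the code below computes how densely the statements of a module share variables; the statement and variable sets are invented examples.

```python
from itertools import combinations

def cohesion(statement_vars):
    """Crude variable-sharing cohesion: mean Jaccard overlap between all pairs
    of statements in a module (1.0 = every statement touches the same
    variables, 0.0 = no variable is shared between any two statements)."""
    pairs = list(combinations(statement_vars, 2))
    if not pairs:
        return 1.0
    overlaps = []
    for a, b in pairs:
        union = a | b
        overlaps.append(len(a & b) / len(union) if union else 0.0)
    return sum(overlaps) / len(overlaps)

# Hypothetical modules: each statement is the set of variables it reads/writes.
cohesive_module = [{"total", "x"}, {"total", "x"}, {"total"}]
scattered_module = [{"a"}, {"b"}, {"c", "d"}]
print(cohesion(cohesive_module))   # high: the statements work on shared data
print(cohesion(scattered_module))  # low: the statements are unrelated
```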

  3. Proceedings of the Fifth Triennial Software Quality Forum 2000, Software for the Next Millennium, Software Quality Forum

    Energy Technology Data Exchange (ETDEWEB)

    Scientific Software Engineering Group, CIC-12

    2000-04-01

    The Software Quality Forum is a triennial conference held by the Software Quality Assurance Subcommittee for the Department of Energy's Quality Managers. The forum centers on key issues, information, and technology important in software development for the Nuclear Weapons Complex. This year it will be opened up to include local information technology companies and software vendors presenting their solutions, ideas, and lessons learned. The Software Quality Forum 2000 will take on a more hands-on, instructional tone than those previously held. There will be an emphasis on providing information, tools, and resources to assist developers in their goal of producing next generation software.

  4. Development and application of new quality model for software projects.

    Science.gov (United States)

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.
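    The abstract does not spell out how the STDCM estimates remaining residual defects. As a generic, hedged illustration of residual-defect estimation (explicitly not the STDCM itself), the sketch below fits an exponential defect-discovery curve to hypothetical cumulative weekly defect counts and projects how many defects remain.

```python
import math

# Hypothetical cumulative defects found at the end of each test week.
cumulative_defects = [12, 22, 30, 36, 41, 44, 47, 49]

def fit_exponential(cum):
    """Fit N(t) = a * (1 - exp(-b t)) by a coarse grid search (no SciPy needed)."""
    weeks = range(1, len(cum) + 1)
    best = (float("inf"), None, None)
    for a in range(cum[-1], cum[-1] * 3):            # candidate total defect counts
        for b in (i / 100 for i in range(1, 200)):   # candidate decay rates
            sse = sum((a * (1 - math.exp(-b * t)) - y) ** 2
                      for t, y in zip(weeks, cum))
            if sse < best[0]:
                best = (sse, a, b)
    return best[1], best[2]

total_expected, decay = fit_exponential(cumulative_defects)
residual = total_expected - cumulative_defects[-1]
print(f"estimated total defects ~ {total_expected}, "
      f"residual after week {len(cumulative_defects)} ~ {residual}")
```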

  5. Improving system quality through software evaluation.

    Science.gov (United States)

    McDaniel, James G

    2002-05-01

    The role of evaluation is examined with respect to quality of software in healthcare. Of particular note is the failure of the Therac-25 radiation therapy machine. This example provides evidence of several types of defect which could have been detected and corrected using appropriate evaluation procedures. The field of software engineering has developed metrics and guidelines to assist in software evaluation but this example indicates that software evaluation must be extended beyond the formally defined interfaces of the software to its real-life operating context.

  6. The Testing Strategy for the Embedded Software implemented in I/O module of KNICS PLC

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Gyun; Park, Won Man; Lee, Dong Young [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2006-07-01

    The safety-grade PLC (POSAFE) is being developed in the Korea Nuclear Instrumentation and Control System (KNICS) R and D project. The PLC is being designed to satisfy Safety Class 1E, Quality Class 1, and Seismic Category I. The embedded software for implementation in I/O modules such as the pIAOS and pOAOS is being developed according to the safety-critical software life cycle. The software developed according to this life cycle is tested for verification and validation by an independent software testing team. This paper describes the software testing strategy used to effectively find the faults that may exist in the software design and code.

  7. Design and implementation of embedded Bluetooth software system

    Science.gov (United States)

    Zhou, Zhijian; Zhou, Shujie; Xu, Huimin

    2001-10-01

    This thesis introduces the background and characteristics of Bluetooth technology, and then summarizes the architecture and working principle of Bluetooth software. After carefully studying the characteristics of embedded operating systems and Bluetooth software, the thesis describes two sets of Bluetooth software modules. Corresponding to these modules' characteristics, it presents the design and implementation of LAN Access and a Bluetooth headset. The headset part introduces a development method suited to the particularities of Bluetooth control software. Although the control components are application entities, the control signaling exchanged between them follows previously defined rules, and they function through the interaction of data and control information. These data and control information form protocol data units (PDUs), and the prior definitions can in fact be seen as a protocol. Taking the modern development flow for communication protocols as a reference, the thesis uses a formal method, SDL (Specification and Description Language), for describing and validating the design, with manual coding to C. This method not only preserves the efficiency of hand-written code but also ensures code quality.

  8. Implementing Large Projects in Software Engineering Courses

    Science.gov (United States)

    Coppit, David

    2006-01-01

    In software engineering education, large projects are widely recognized as a useful way of exposing students to the real-world difficulties of team software development. But large projects are difficult to put into practice. First, educators rarely have additional time to manage software projects. Second, classrooms have inherent limitations that…

  9. Software Power Metric Model: An Implementation | Akwukwuma ...

    African Journals Online (AJOL)

    ... and the execution time (TIME) in each case was recorded. We then obtain the application functions point count. Our result shows that the proposed metric is computable, consistent in its use of unit, and is programming language independent. Keywords: Software attributes, Software power, measurement, Software metric, ...

  10. Experiences with Software Quality Metrics in the EMI middlewate

    OpenAIRE

    Alandes, M; Kenny, E M; Meneses, D; Pucciani, G

    2012-01-01

    The EMI Quality Model has been created to define, and later review, the EMI (European Middleware Initiative) software product and process quality. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering – Product Quality to identify a set of characteristics that need to be present in the EMI software. For each software characteristi...

  11. Software quality: definitions and strategic issues

    OpenAIRE

    Fitzpatrick, Ronan

    1996-01-01

    This paper contains two sections relating to software quality issues. First, the various definitions of software quality are examined and an alternative suggested. It continues with a review of the quality model as defined by McCall, Richards and Walters in 1977 and mentions the later model of Boëhm published in 1978. Each of McCall's quality factors is reviewed and the extent to which they still apply in the late 1990s is commented on. The factors include, integrity, reliability, usability, ...

  12. Software quality assessment for health care systems.

    Science.gov (United States)

    Braccini, G; Fabbrini, F; Fusani, M

    1997-01-01

    The problem of defining a quality model to be used in the evaluation of the software components of a Health Care System (HCS) is addressed. The model, based on the ISO/IEC 9126 standard, has been interpreted to fit the requirements of some classes of applications representative of Health Care Systems, on the basis of the experience gained both in the field of medical Informatics and assessment of software products. The values resulting from weighing the quality characteristics according to their criticality outline a set of quality profiles that can be used both for evaluation and certification.

  13. A software perspective of environmental data quality

    International Nuclear Information System (INIS)

    Banerjee, B.

    1995-01-01

    Because of the large amount of complex data in environmental projects, particularly large decontamination and decommissioning projects, the quality of the data has a profound impact on the success and cost of the mission. In every phase of the life cycle of the project, including regulatory intervention and legal proceedings, maintaining the quality of data and presenting data in a timely and meaningful manner are critical. In this paper, a systemic view of data quality management from a software engineering perspective is presented. A method of evaluation evolves from this view. This method complements the principles of the data quality objective. When graded adequately, the method of evaluation establishes a paradigm for ensuring data quality for new and renewed projects. This paper also demonstrates that incorporating good practices of software engineering into the data management process leads to continuous improvement of data quality

  14. Software quality assurance: in large scale and complex software-intensive systems

    NARCIS (Netherlands)

    Mistrik, I.; Soley, R.; Ali, N.; Grundy, J.; Tekinerdogan, B.

    2015-01-01

    Software Quality Assurance in Large Scale and Complex Software-intensive Systems presents novel and high-quality research related approaches that relate the quality of software architecture to system requirements, system architecture and enterprise-architecture, or software testing. Modern software

  15. The software quality control for gamma spectrometry

    International Nuclear Information System (INIS)

    Monte, L.

    1986-01-01

    One of the major problems with which the quality control program of an environmental measurements laboratory is confronted is the evaluation of the performance of software packages for the analysis of gamma-ray spectra. A program of tests for evaluating the performance of the software package (SPECTRAN-F, Canberra Inc.) used by our laboratory is being carried out. In this first paper, the results of a preliminary study concerning the evaluation of the performance of the doublet analysis routine are presented

  16. Spectrum analysis on quality requirements consideration in software design documents.

    Science.gov (United States)

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
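    The spectrum technique itself is not detailed in the abstract. As a hedged approximation of the general idea, a document's "spectrum" can be represented as a vector of quality-requirement indicator-term counts, and the requirements and design documents compared by a similarity measure; the characteristic names, keyword lists, and sample texts below are invented for illustration.

```python
import math
import re

# Hypothetical quality characteristics and indicator terms (placeholders).
QUALITY_TERMS = {
    "performance": ["response time", "throughput", "latency"],
    "security": ["authentication", "encryption", "access control"],
    "usability": ["usability", "user interface", "accessibility"],
}

def spectrum(text):
    """Count indicator-term occurrences per quality characteristic."""
    text = text.lower()
    return {q: sum(len(re.findall(t, text)) for t in terms)
            for q, terms in QUALITY_TERMS.items()}

def cosine(a, b):
    """Cosine similarity between two spectra with the same keys."""
    keys = a.keys()
    dot = sum(a[k] * b[k] for k in keys)
    na = math.sqrt(sum(a[k] ** 2 for k in keys))
    nb = math.sqrt(sum(b[k] ** 2 for k in keys))
    return dot / (na * nb) if na and nb else 0.0

requirements = "The system shall use encryption for stored data and keep response time under 1 s."
design = "The design uses TLS encryption; authentication is delegated to SSO."
print(spectrum(requirements), spectrum(design))
print(f"spectrum similarity: {cosine(spectrum(requirements), spectrum(design)):.2f}")
```

    A low similarity between the two spectra would suggest that quality requirements emphasised in the requirements document are under-represented in the design document.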

  17. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, ''ASCI Software Quality Engineering: Goals, Principles, and Guidelines''. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  18. Designing, developing, and implementing software ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos; Hämäläinen, Mervi; Tyrväinen, Pasi

    2017-01-01

    The notion of software ecosystems has been popular both in research and industry for more than a decade, but how software ecosystems are created still remains unclear. This becomes more of a challenge if one examines the "creation" of ecosystems that have a high probability of surviving in the future, i.e. with respect to ecosystem health. In this paper, we focus on the creation of software ecosystems and propose a process for designing, developing, and establishing software ecosystems based on three basic steps and a set of activities for each step. We note that software ecosystem research identifies that ecosystems typically emerge either from a company deciding to allow development on their product platform or from a successful open source project. In our study we add to this knowledge by demonstrating, through two case studies, that ecosystems can emerge from more than a technological ... The three steps can be seen as aspects of a single re-iterating phase, and we thus propose the view of design, development, and establishment as a continuous process, running in parallel with and interrelated to the monitoring of the ecosystem evolution.

  19. Architecture-Centric Software Quality Management

    Science.gov (United States)

    Maciaszek, Leszek A.

    Software quality is a multi-faceted concept defined using different attributes and models. Of all the various quality requirements, adaptiveness is by far the most critical. Based on this assumption, this paper offers an architecture-centric approach to the production of measurably adaptive systems. The paper uses the PCBMER (Presentation, Controller, Bean, Mediator, Entity, and Resource) meta-architecture to demonstrate how the complexity of a software solution can be measured and kept under control in standalone applications. Meta-architectural extensions aimed at managing quality in integration development projects are also introduced. The DSM (Design Structure Matrix) method is used to explain our approach to measuring quality. The discussion is conducted against the background of the holonic approach to science (as the middle ground between holism and reductionism).
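    The DSM method referenced above records inter-module dependencies in a matrix whose entries can be aggregated into complexity indicators. The sketch below builds a small dependency matrix for hypothetical PCBMER-style layers and derives direct and cumulative dependency counts via transitive closure; it illustrates the general DSM technique, not the paper's specific metrics.

```python
import numpy as np

# Hypothetical modules of a layered system and their direct dependencies.
modules = ["Presentation", "Controller", "Bean", "Mediator", "Entity", "Resource"]
deps = {
    "Presentation": ["Controller", "Bean"],
    "Controller": ["Bean", "Mediator"],
    "Mediator": ["Entity", "Resource"],
    "Bean": [],
    "Entity": [],
    "Resource": [],
}

n = len(modules)
idx = {m: i for i, m in enumerate(modules)}
dsm = np.zeros((n, n), dtype=int)
for src, targets in deps.items():
    for dst in targets:
        dsm[idx[src], idx[dst]] = 1          # row depends on column

# Cumulative dependency: reachability via transitive closure (Warshall's algorithm).
reach = dsm.copy()
for k in range(n):
    reach = reach | (reach[:, [k]] & reach[[k], :])

print("direct dependencies per module :", dsm.sum(axis=1))
print("cumulative dependencies        :", reach.sum(axis=1))
```

    Keeping the cumulative-dependency counts low for higher layers is one way a meta-architecture such as PCBMER tries to keep complexity, and hence adaptiveness, under control.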

  20. System Quality Management in Software Testing Laboratory that Chooses Accreditation

    Directory of Open Access Journals (Sweden)

    Yanet Brito R.

    2013-12-01

    Full Text Available The evaluation of software products will reach full maturity when it is executed by a scheme that provides third-party certification. For the certification to be valid, the independent laboratory must be accredited for that function, using internationally recognized standards. This presents a challenge for the Industrial Software Testing Laboratory (LIPS), responsible for testing the products developed by the Cuban software industry: to define strategies that will permit it to offer services with a high level of quality. It is therefore necessary to establish a quality management system according to NC-ISO/IEC 17025:2006 to continuously improve the operational capacity and technical competence of the laboratory, with a view to future accreditation of the tests performed. This article discusses the process defined in the LIPS for the implementation of a quality management system, based on current standards and trends, as a necessary step in opting for accreditation of the tests performed.

  1. Software archeology: a case study in software quality assurance and design

    Energy Technology Data Exchange (ETDEWEB)

    Macdonald, John M [Los Alamos National Laboratory; Lloyd, Jane A [Los Alamos National Laboratory; Turner, Cameron J [COLORADO SCHOOL OF MINES

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  2. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management's and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  3. Strategic quality: a software engineering approach

    OpenAIRE

    2009-01-01

    M.Ing. Software engineering organizations face a struggle for daily survival in an extremely volatile climate. Numerous times it has been shown that the quality of a service or product could make the difference between an organization existing or closing down. The way in which quality is approached in any organization is part of a strategy; unbeknown to the managers and employees in many instances. Even though there are numerous books, articles, internet sites and other sources devoted to ...

  4. MCNP trademark Software Quality Assurance plan

    International Nuclear Information System (INIS)

    Abhold, H.M.; Hendricks, J.S.

    1996-04-01

    MCNP is a computer code that models the interaction of radiation with matter. MCNP is developed and maintained by the Transport Methods Group (XTM) of the Los Alamos National Laboratory (LANL). This plan describes the Software Quality Assurance (SQA) program applied to the code. The SQA program is consistent with the requirements of IEEE-730.1 and the guiding principles of ISO 900

  5. Software metrics: The key to quality software on the NCC project

    Science.gov (United States)

    Burns, Patricia J.

    1993-01-01

    Network Control Center (NCC) Project metrics are captured during the implementation and testing phases of the NCCDS software development lifecycle. The metrics data collection and reporting function has interfaces with all elements of the NCC project. Close collaboration with all project elements has resulted in the development of a defined and repeatable set of metrics processes. The resulting data are used to plan and monitor release activities on a weekly basis. The use of graphical outputs facilitates the interpretation of progress and status. The successful application of metrics throughout the NCC project has been instrumental in the delivery of quality software. The use of metrics on the NCC Project supports the needs of the technical and managerial staff. This paper describes the project, the functions supported by metrics, the data that are collected and reported, how the data are used, and the improvements in the quality of deliverable software since the metrics processes and products have been in use.

  6. Experiences with Software Quality Metrics in the EMI Middleware

    OpenAIRE

    Alandes, Maria

    2012-01-01

    The EMI Quality Model has been created to define, and later review, the EMI (European Middleware Initiative) software product and process quality. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering – Product Quality standard to identify a set of characteris...

  7. Software Quality Perceptions of Stakeholders Involved in the Software Development Process

    Science.gov (United States)

    Padmanabhan, Priya

    2013-01-01

    Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and which of its attributes are more important than others. Although, software quality…

  8. Prototype implementation of segment assembling software

    Directory of Open Access Journals (Sweden)

    Pešić Đorđe

    2018-01-01

    Full Text Available IT education is very important, and a lot of effort is put into the development of tools that help students acquire programming knowledge and help teachers automate the examination process. This paper describes a prototype of program segment assembling software used in the context of making tests in the field of algorithmic complexity. The proposed program segment assembling model uses rules and templates. A template is a simple program segment. A rule defines the combining method and data dependencies, if they exist. An example of program segment assembling by the proposed system is given. A graphical user interface is also described.
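    The rules and templates of the prototype are not reproduced in the abstract. As a hedged, minimal sketch of the general approach (not the actual prototype), the code below treats templates as code fragments with placeholders and a rule as the ordered list of fragments to combine, then assembles and runs a small summation segment; all names and fragments are invented.

```python
# Hypothetical templates: simple program fragments with placeholders.
TEMPLATES = {
    "init": "total = 0",
    "loop": "for {var} in range({n}):\n{body}",
    "accumulate": "    total += {var}",
}

def assemble(rule, bindings):
    """Combine templates in the order given by the rule, filling placeholders."""
    segments = []
    for name in rule:
        segments.append(TEMPLATES[name].format(**bindings))
    return "\n".join(segments)

# A rule for building a summation segment, e.g. for a complexity exercise.
program = assemble(
    rule=["init", "loop"],
    bindings={"var": "i", "n": 10, "body": TEMPLATES["accumulate"].format(var="i")},
)
print(program)
exec(program)          # the assembled segment is itself runnable Python
```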

  9. Software Quality Control at Belle II

    Science.gov (United States)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle Software Group, II

    2017-10-01

    Over the last seven years the software stack of the next-generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping a coherent software stack of high quality, so that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment, is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.
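    The Belle II machinery itself is buildbot-driven and specific to the experiment's infrastructure. As a loose, hedged illustration of one of the named checks (not Belle II code), the script below runs cppcheck over a source tree and fails the step when findings are reported, which is the kind of job a buildbot worker would execute; the source directory path is a placeholder and cppcheck is assumed to be installed.

```python
import subprocess
import sys

def run_cppcheck(source_dir="src"):
    """Run cppcheck over a source tree and return (exit code, issue report).

    Assumes cppcheck is on PATH; the source directory is a placeholder.
    """
    result = subprocess.run(
        ["cppcheck", "--enable=warning,style", "--error-exitcode=1",
         "--quiet", source_dir],
        capture_output=True, text=True,
    )
    return result.returncode, result.stderr

if __name__ == "__main__":
    code, report = run_cppcheck()
    if code != 0:
        print("cppcheck findings:\n" + report)
        sys.exit(1)        # fail the build step so developers get fast feedback
    print("cppcheck: no findings")
```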

  10. A Botanical Garden Data Base Implemented in SAS Software.

    Science.gov (United States)

    Greenberg, Edward A.; Lewis, Bruce R.

    1985-01-01

    A database had been developed for the Phoenix Desert Botanical Garden's Living Plant Collection using Statistical Analysis System software. Implementation procedures, data dictionary maintenance, data entry, updating, and reporting are described. (JN)

  11. Software Defined Radio Datalink Implementation Using PC-Type Computers

    National Research Council Canada - National Science Library

    Zafeiropoulos, Georgios

    2003-01-01

    The objective of this thesis was to examine the feasibility of implementation and the performance of a Software Defined Radio datalink, using a common PC type host computer and a high level programming language...

  12. Measuring health care process quality with software quality measures.

    Science.gov (United States)

    Yildiz, Ozkan; Demirörs, Onur

    2012-01-01

    Existing quality models focus on specific diseases, clinics or clinical areas. Although they contain structure, process, or output type measures, there is no model which measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of processes internally and externally. To address these problems, a new model was developed from software quality measures. We have adopted the ISO/IEC 9126 software quality standard for health care processes. Then, JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements were added to the model scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After the application, it was concluded that the model determines weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of multiple organizations.
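    The adapted measures are described only at a high level in the abstract. As a hedged sketch under the assumption that most ISO/IEC 9126-style measures reduce to simple ratios, the code below computes a few process measures of that kind from hypothetical counts; the field names and values are illustrative, not the paper's definitions.

```python
from dataclasses import dataclass

@dataclass
class ProcessSnapshot:
    # Hypothetical counts collected for one healthcare process (e.g. diagnosing).
    required_activities: int
    performed_activities: int
    detected_faults: int
    corrected_faults: int
    throughput_minutes: float
    cases_handled: int

def measures(p: ProcessSnapshot) -> dict:
    """ISO/IEC 9126-style ratio measures, reinterpreted for a care process."""
    return {
        "functional_completeness": p.performed_activities / p.required_activities,
        "fault_removal": p.corrected_faults / p.detected_faults if p.detected_faults else 1.0,
        "throughput_time_per_case": p.throughput_minutes / p.cases_handled,
    }

snapshot = ProcessSnapshot(required_activities=40, performed_activities=36,
                           detected_faults=12, corrected_faults=9,
                           throughput_minutes=1500, cases_handled=50)
print(measures(snapshot))
```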

  13. IMPROVING (SOFTWARE) PATENT QUALITY THROUGH THE ADMINISTRATIVE PROCESS.

    Science.gov (United States)

    Rai, Arti K

    2013-11-24

    The available evidence indicates that patent quality, particularly in the area of software, needs improvement. This Article argues that even an agency as institutionally constrained as the U.S. Patent and Trademark Office ("PTO") could implement a portfolio of pragmatic, cost-effective quality improvement strategies. The argument in favor of these strategies draws upon not only legal theory and doctrine but also new data from a PTO software examination unit with relatively strict practices. Strategies that revolve around Section 112 of the patent statute could usefully be deployed at the initial examination stage. Other strategies could be deployed within the new post-issuance procedures available to the agency under the America Invents Act. Notably, although the strategies the Article discusses have the virtue of being neutral as to technology, they are likely to have a very significant practical impact in the area of software.

  14. Proposed Department of Defense software metrics implementation plan

    Science.gov (United States)

    Fife, Dennis W.; Popelas, Judy M.; Springsteen, Beth

    1994-09-01

    This report presents a proposed plan for DOD implementation of software metrics in support of acquisition reform and improved software risk management. The report documents the concluding phase of a one-year task to assess software metrics and to determine how to improve their use in software acquisition, particularly of weapon system software. An earlier effort surveyed the state of development and use of software metrics in industry and defense organizations. This survey found (1) that industry is strongly committed to using software metrics in a corporate-wide approach to gain marketplace advantages, and (2) that defense contractors exhibit a state of evolution in metrics use that is comparable to that of strictly commercial companies, if not more mature. The overall vision of the proposed plan is to establish a department-wide, bottom-to-top corporate approach for collecting and using software metrics to improve analysis and decision-making in software acquisition and risk management. Acquisition reform and the need for improved risk management should provide more specific goals to address under this vision, comparable to industry's goals of marketplace benefits from using software metrics. The plan is framed as eight specific recommendations, including a list of future research and development work that would support their implementation.

  15. Quality in Software Development: a pragmatic approach using metrics

    Directory of Open Access Journals (Sweden)

    Daniel Acton

    2014-06-01

    Full Text Available As long as software has been produced, there have been efforts to strive for quality in software products. In order to understand quality in software products, researchers have built models of software quality that rely on metrics in an attempt to provide a quantitative view of software quality. The aim of these models is to provide software producers with the capability to define and evaluate metrics related to quality and use these metrics to improve the quality of the software they produce over time. The main disadvantage of these models is that they require effort and resources to define and evaluate metrics from software projects. This article briefly describes some prominent models of software quality in the literature and continues to describe a new approach to gaining insight into quality in software development projects. A case study based on this new approach is described and results from the case study are discussed.

  16. SAPHIRE 8 Quality Assurance Software Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Kurt G. Vedros

    2011-08-01

    The purpose of this review of software metrics is to examine the quality of the metrics gathered in the 2010 IV&V and to set an outline for the results of updated metrics runs to be performed. We find from the review that the maintenance of the accepted quality standards presented in the SAPHIRE 8 initial Independent Verification and Validation (IV&V) of April 2010 is most easily achieved by continuing to utilize the tools used in that effort while adding a metric of bug tracking and resolution. Recommendations from the final IV&V were to continue periodic measurable metrics such as McCabe's complexity measure to ensure quality is maintained. The software tools used to measure quality in the IV&V were CodeHealer, Coverage Validator, Memory Validator, Performance Validator, and Thread Validator. These are evaluated based on their capabilities. We attempted to run their latest revisions with the newer Delphi 2010 based SAPHIRE 8 code that has been developed and were successful with all of the Validator series of tools on small tests. Another recommendation from the IV&V was to incorporate a bug tracking and resolution metric. To improve our capability of producing this metric, we integrated our current web reporting system with the SpiraTest test management software purchased earlier this year to track requirements traceability.

  17. Implementation of Digital Watermarking Using MATLAB Software

    OpenAIRE

    Karnpriya Vyas; Kirti Sethiya; Sonu Jain

    2012-01-01

    Digital watermarking holds significant promise as one of the keys to protecting proprietary digital content in the coming years. It focuses on embedding information inside a digital object such that the embedded information is inseparably bound to the object. The proposed scheme has been implemented in MATLAB, as it is a high-level technical computing language and interactive environment for algorithm development, data visualization, data analysis, and numerical computation. We w...

  18. The dynamic of modern software development project management and the software crisis of quality. An integrated system dynamics approach towards software quality improvement

    OpenAIRE

    Nasirikaljahi, Armindokht

    2012-01-01

    The software industry is plagued by cost overruns, delays, poor customer satisfaction and quality issues that are costing clients and customers worldwide billions of dollars each year. The phenomenon is coined "The Software Crisis", and poses a huge challenge for software project management. This thesis addresses one of the core issues of the software crisis, namely software quality. The challenges of software quality are central for understanding the other symptoms of the software crisis. Th...

  19. Design and implementation of Skype USB user gateway software

    Science.gov (United States)

    Qi, Yang

    2017-08-01

    With the widespread application of VoIP, clients with private protocols have become more and more popular, and Skype is one representative. How to connect Skype with the PSTN using only the Skype client has gradually become a hot topic. This paper describes the design and implementation of software based on a kind of USB user gateway. With this software, Skype users can communicate freely with PSTN phones. An FSM is designed as the core of the software, and Skype control is separated from the USB gateway control. In this way, the communication becomes more flexible and efficient. In actual user testing, the software obtained good results.
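    The paper's FSM is not published in the abstract; the sketch below shows the sort of table-driven finite state machine that could sit at the core of such gateway software. The states, events, and transitions are assumptions modelled on a generic call-gateway scenario, not the actual Skype USB gateway design.

```python
# A minimal table-driven FSM: (state, event) -> next state.
# States and events are assumptions for a call-gateway scenario.
TRANSITIONS = {
    ("idle", "incoming_pstn_call"): "ringing",
    ("idle", "skype_call_request"): "dialing",
    ("ringing", "skype_answer"): "connected",
    ("dialing", "pstn_answer"): "connected",
    ("connected", "hangup"): "idle",
    ("ringing", "hangup"): "idle",
    ("dialing", "hangup"): "idle",
}

class GatewayFSM:
    def __init__(self, state="idle"):
        self.state = state

    def handle(self, event):
        """Advance the machine; unknown events leave the state unchanged."""
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

fsm = GatewayFSM()
for event in ["incoming_pstn_call", "skype_answer", "hangup"]:
    print(event, "->", fsm.handle(event))
```

    Keeping the transition table separate from the event-handling code is what makes this style easy to extend when new call scenarios are added.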

  20. Quality assurance of nuclear medicine computer software

    International Nuclear Information System (INIS)

    Cradduck, T.D.

    1986-01-01

    Although quality assurance activities have become well established for the hardware found in nuclear medicine, little attention has been paid to computer software. This paper outlines some of the problems that exist and indicates some of the solutions presently under development. The major thrust has been towards the establishment of programming standards and comprehensive documentation. Some manufacturers have developed installation verification procedures which programmers are urged to use as models for their own programs. Items that tend to cause erroneous results are discussed, with the emphasis for error detection and correction being placed on proper education and training of the computer operator. The concept of interchangeable data files or 'software phantoms' for purposes of quality assurance is discussed. (Author)

  1. lessons and challenges from software quality assessment

    African Journals Online (AJOL)

    DJFLEX

    ANSI/ASQC A3/1978) gave the definition of quality as “The totality of features and characteristics of a product or service that bear on its ability to satisfy stated or implied needs.” Software is a critical core industry that is essential to national interests in science, technology, etc. It is in several places at the same time in today's ...

  2. Colonoscopy quality: metrics and implementation.

    Science.gov (United States)

    Calderwood, Audrey H; Jacobson, Brian C

    2013-09-01

    Colonoscopy is an excellent area for quality improvement because it is high volume, has significant associated risk and expense, and there is evidence that variability in its performance affects outcomes. The best end point for validation of quality metrics in colonoscopy is colorectal cancer incidence and mortality, but a more readily accessible metric is the adenoma detection rate. Fourteen quality metrics were proposed in 2006, and these are described in this article. Implementation of quality improvement initiatives involves rapid assessments and changes on an iterative basis, and can be done at the individual, group, or facility level. Copyright © 2013 Elsevier Inc. All rights reserved.

  3. Software quality assurance plan for the National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project

  4. Master Pump Shutdown MPS Software Quality Assurance Plan (SQAP)

    International Nuclear Information System (INIS)

    BEVINS, R.R.

    2000-01-01

    The MPSS Software Quality Assurance Plan (SQAP) describes the tools and strategy used in the development of the MPSS software. The document also describes the methodology for controlling and managing changes to the software

  5. N286.7-99, A Canadian standard specifying software quality management system requirements for analytical, scientific, and design computer programs and its implementation at AECL

    International Nuclear Information System (INIS)

    Abel, R.

    2000-01-01

    Analytical, scientific, and design computer programs (referred to in this paper as 'scientific computer programs') are developed for use in a large number of ways by the user-engineer to support and prove engineering calculations and assumptions. These computer programs are subject to frequent modifications inherent in their application and are often used for critical calculations and analysis relative to safety and functionality of equipment and systems. N286.7-99(4) was developed to establish appropriate quality management system requirements to deal with the development, modification, and application of scientific computer programs. N286.7-99 provides particular guidance regarding the treatment of legacy codes

  6. Applying Quality Costs in a Software Development Environment

    Directory of Open Access Journals (Sweden)

    I.P. Hollingsworth

    1999-05-01

    Full Text Available This paper shows how Quality Costs can be a measure of software quality. The relationship between Quality Costs and other software quality metrics is briefly explained, and software development oriented versions of the two principal Quality Cost models are described. Finally the paper discusses the major issues involved in setting up a software Quality Cost programme. The concepts are based on previous research on Quality Costs in manufacturing, coupled with work on software metrics and the work currently being undertaken by the authors in a number of industries.
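    The two principal Quality Cost models referred to are commonly taken to be the prevention-appraisal-failure (PAF) model and the conformance/non-conformance model; under that assumption, the hedged sketch below totals cost-of-quality categories for a software release from hypothetical cost items.

```python
from collections import defaultdict

# Hypothetical cost items for one release, tagged with PAF categories.
COST_ITEMS = [
    ("prevention", "coding standards training", 4_000),
    ("prevention", "design reviews", 6_500),
    ("appraisal", "system testing", 18_000),
    ("appraisal", "code inspections", 7_200),
    ("internal_failure", "pre-release defect rework", 12_500),
    ("external_failure", "field defect fixes and support", 21_000),
]

def cost_of_quality(items):
    """Sum cost items per category and add an overall total."""
    totals = defaultdict(float)
    for category, _description, amount in items:
        totals[category] += amount
    totals["total"] = sum(totals.values())
    return dict(totals)

print(cost_of_quality(COST_ITEMS))
# The usual Quality Cost argument: increased prevention and appraisal spending
# should drive down the (typically larger) failure costs over successive releases.
```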

  7. Adopting software quality measures for healthcare processes.

    Science.gov (United States)

    Yildiz, Ozkan; Demirörs, Onur

    2009-01-01

    In this study, we investigated the adoptability of software quality measures for healthcare process measurement. Quality measures of ISO/IEC 9126 are redefined from a process perspective to build a generic healthcare process quality measurement model. The case study research method is used, and the model is applied to a public hospital's Entry to Care process. After the application, weak and strong aspects of the process can be easily observed. Access auditability, fault removal, completeness of documentation, and machine utilization are weak aspects, and these aspects are candidates for process improvement. On the other hand, functional completeness, fault ratio, input validity checking, response time, and throughput time are strong aspects of the process.

  8. On the Role of Software Quality Management in Software Process Improvement

    DEFF Research Database (Denmark)

    Wiedemann Jacobsen, Jan; Kuhrmann, Marco; Münch, Jürgen

    2016-01-01

    Software Process Improvement (SPI) programs have been implemented, inter alia, to improve the quality and speed of software development. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises, for instance, the optimization of specific activities. In this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were classified as addressing SPI in the context of SQM (including testing). From the main study's result set, 92 papers were selected for an in-depth systematic review to study the contributions and to develop an initial picture of how these topics are addressed in SPI. Our findings show a fairly pragmatic contribution set in which different solutions are proposed, discussed, and evaluated. Among others, our findings indicate a certain reluctance towards standard quality or (test) maturity models.

  9. Design and Implementation of Modular Software for Programming Mobile Robots

    Directory of Open Access Journals (Sweden)

    Alessandro Farinelli

    2006-03-01

    Full Text Available This article describes a software development toolkit for programming mobile robots that has been used on different platforms and for different robotic applications. We address design choices, implementation issues and results in the realization of our robot programming environment, which has been devised and built by many people since 1998. We believe that the proposed framework is extremely useful not only for experienced robotic software developers, but also for students approaching robotic research projects.

  10. Implementation and Testing of VLBI Software Correlation at the USNO

    Science.gov (United States)

    Fey, Alan; Ojha, Roopesh; Boboltz, Dave; Geiger, Nicole; Kingham, Kerry; Hall, David; Gaume, Ralph; Johnston, Ken

    2010-01-01

    The Washington Correlator (WACO) at the U.S. Naval Observatory (USNO) is a dedicated VLBI processor based on dedicated hardware of ASIC design. The WACO is currently over 10 years old and is nearing the end of its expected lifetime. Plans for implementation and testing of software correlation at the USNO are currently being considered. The VLBI correlation process is, by its very nature, well suited to a parallelized computing environment. Commercial off-the-shelf computer hardware has advanced in processing power to the point where software correlation is now both economically and technologically feasible. The advantages of software correlation are manifold but include flexibility, scalability, and easy adaptability to changing environments and requirements. We discuss our experience with and plans for use of software correlation at USNO with emphasis on the use of the DiFX software correlator.

  11. Effective organizational solutions for implementation of DBMS software packages

    Science.gov (United States)

    Jones, D.

    1984-01-01

    The space telescope management information system development effort is a guideline for discussing effective organizational solutions used in implementing DBMS software. Focus is on the importance of strategic planning. The value of constructing an information system architecture to conform to the organization's managerial needs, the need for a senior decision maker, dealing with shifting user requirements, and the establishment of a reliable working relationship with the DBMS vendor are examined. Requirements for a schedule to demonstrate progress against a defined timeline and the importance of continued monitoring for production software control, production data control, and software enhancements are also discussed.

  12. Operational excellence (six sigma) philosophy: Application to software quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Lackner, M.

    1997-11-01

    This report contains viewgraphs on the operational excellence (Six Sigma) philosophy applied to software quality assurance. The report outlines the following: the goal of Six Sigma; Six Sigma tools; manufacturing vs. administrative processes; software quality assurance document inspections; mapping the software quality assurance requirements document; failure mode effects analysis for the requirements document; measuring the right response variables; and questions.

  13. Software quality assurance for safety analysis and risk management at the Savannah River Site

    International Nuclear Information System (INIS)

    Ades, M.J.; Toffer, H.; Crowe, R.D.

    1991-01-01

    As part of its Reactor Operations Improvement Program at the Savannah River Site (SRS), Westinghouse Savannah River Company (WSRC), in cooperation with the Westinghouse Hanford Company, has developed and implemented quality assurance for safety-related software for technical programs essential to the safety and reliability of reactor operations. More specifically, the quality assurance process involved the development and implementation of quality standards and attendant procedures based on industry software quality standards. These procedures were then applied to computer codes in reactor safety and probabilistic risk assessment analyses. This paper provides a review of the major aspects of the WSRC safety-related software quality assurance. In particular, quality assurance procedures are described for the different life cycle phases of the software that include the Requirements, Software Design and Implementation, Testing and Installation, Operation and Maintenance, and Retirement Phases. For each phase, specific provisions are made to categorize the range of activities, the level of responsibilities, and the documentation needed to assure the control of the software. The software quality assurance procedures developed and implemented are evolutionary in nature, and thus, prone to further refinements. These procedures, nevertheless, represent an effective controlling tool for the development, production, and operation of safety-related software applicable to reactor safety and probabilistic risk assessment analyses

  14. Factors to Consider When Implementing Automated Software Testing

    Science.gov (United States)

    2016-11-10

    Therefore, many businesses are automating their software testing in order to save money and improve quality. When considering whether automation...is a viable option, businesses must take several factors into account. The purpose of this document is to illuminate these factors. Software...programming, e.g., Java or Visual Basic.  Subject Matter Experts (SME) with firm grasp of application being automated. 2. Additional costs for setup (e.g

  15. Application of multistage process control methodology for software quality management

    Directory of Open Access Journals (Sweden)

    Boby John

    2016-12-01

    Full Text Available As the need for software increased, the number of software firms and the competition among them also increased. Software companies in developing countries like India can no longer survive on cost advantage alone; they need to deliver competitively priced, quality software products on time. This can be achieved by quantitatively managing the different phases, or sub processes, of the software development process. However, quantitative management of a process consisting of a set of interlinked sub processes or stages, with the output of one sub process influencing that of subsequent stages and the final output, is not easy. The process performance models developed for quantitative management of the software development process often model the final outcome in terms of factors from various stages taken together, or focus only on quantitatively managing a particular sub process independently. In manufacturing and other engineering industries, processes with multiple sub processes are monitored and controlled using multistage process control methodology. This paper is an application of multistage statistical process control to managing the software development process. The suggested methodology is a combination of process performance models and control charts. It can be easily implemented for controlling various types of software projects, such as development projects, incremental development projects, and testing projects. The methodology also gives the project manager the opportunity to tighten or relax the control at various sub processes based on the project team’s strengths and still achieve the goal on the final outcome.
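
    To make the control-chart side of the proposed methodology concrete, the following Python sketch (not code from the paper) computes individuals-chart limits for a single sub process metric such as review defect density; the data values and the usual three-sigma limits are assumptions for illustration only.

      # Individuals (X) control chart for one sub-process metric, e.g. review defect
      # density per build. Data are hypothetical; sigma is estimated from the mean
      # moving range (limits at X-bar +/- 2.66 * MR-bar, the standard constant).
      defect_density = [0.8, 1.1, 0.9, 1.3, 1.0, 0.7, 1.2, 0.95, 1.05, 0.85]

      n = len(defect_density)
      mean = sum(defect_density) / n
      moving_ranges = [abs(a - b) for a, b in zip(defect_density[1:], defect_density[:-1])]
      mr_bar = sum(moving_ranges) / len(moving_ranges)

      ucl = mean + 2.66 * mr_bar                 # upper control limit
      lcl = max(0.0, mean - 2.66 * mr_bar)       # lower control limit (density cannot go negative)

      for i, x in enumerate(defect_density, start=1):
          status = "out of control" if x > ucl or x < lcl else "in control"
          print(f"build {i:2d}: {x:.2f}  ({status})")
      print(f"centre line {mean:.2f}, LCL {lcl:.2f}, UCL {ucl:.2f}")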

  16. Unisys' experience in software quality and productivity management of an existing system

    Science.gov (United States)

    Munson, John B.

    1988-01-01

    A summary of Quality Improvement techniques, implementation, and results in the maintenance, management, and modification of large software systems for the Space Shuttle Program's ground-based systems is provided.

  17. Reducing the risk of failure: Software Quality assurance standards and methods

    International Nuclear Information System (INIS)

    Elphick, J.; Cope, H.

    1992-01-01

    An effective Software Quality Assurance (SQA) program provides an overall approach to software engineering and the establishment of proven methods for the production of reliable software. In the authors' experience, the overall costs over the software life cycle are diminished by the application of quality methods, and the issues involved in implementing quality standards and practices are many. This paper addresses those issues as well as the lessons learned from developing and implementing a number of software quality assurance programs. The authors' experience includes the development and implementation of their own NRC-accepted SQA program and an SQA program for an engineering software developer, as well as the development of SQA procedures, standards, and methods for utilities and for medical and commercial clients. Some of the issues addressed in this paper are: setting goals and defining quality; applying the software life cycle; addressing organizational issues; providing flexibility and increasing productivity; producing effective documentation; maintaining quality records; imposing software configuration management; conducting reviews, audits, and controls; verification and validation; and controlling software procurement.

  18. The Implementation of Computer Data Processing Software for EAST NBI

    Science.gov (United States)

    Zhang, Xiaodan; Hu, Chundong; Sheng, Peng; Zhao, Yuanzhe; Wu, Deyun; Cui, Qinglong

    2014-10-01

    One of the most important project missions of the neutral beam injectors is the implementation of 100 s neutral beam injection (NBI) at high power into the plasma of the EAST superconducting tokamak. Correspondingly, it is necessary to construct a high-speed and reliable computer data processing system for handling experimental data, covering data acquisition, data compression and storage, data decompression and query, as well as data analysis. The implementation of the computer data processing application software (CDPS) for EAST NBI is presented in this paper in terms of its functional structure and system realization. The software is programmed in C, runs on the Linux operating system, and is based on the TCP network protocol and multi-threading technology. The hardware mainly includes an industrial control computer (IPC), a data server, and PXI DAQ cards. This software has now been applied to the EAST NBI system, and experimental results show that the CDPS serves EAST NBI very well.

  19. Design and Implementation of Software for Resistance Welding Process Simulations

    DEFF Research Database (Denmark)

    Zhang, Wenqi

    2003-01-01

    Based on long time engineering research and dedicated collaborations with industry, a new welding software, SORPAS, has been developed for simulation of resistance projection and spot welding processes applying the powerful finite element method (FEM). In order to make the software directly usable...... by engineers and technicians in industry, all of the important parameters in resistance welding are considered and automatically implemented into the software. With the specially designed graphic user interface for Windows, engineers (even without prior knowledge of FEM) can quickly learn and easily operate...... of work pieces and electrodes as well as process parameter settings similar to real machine settings, the software has been readily applied in industry for supporting product development and process optimization. After simulation, the dynamic process parameters are graphically displayed. The distributions...

  20. Lessons learned from development and quality assurance of software systems at the Halden Project

    International Nuclear Information System (INIS)

    Bjorlo, T.J.; Berg, O.; Pehrsen, M.; Dahll, G.; Sivertsen, T.

    1996-01-01

    The OECD Halden Reactor Project has developed a number of software systems within the research programmes. These programmes have comprised a wide range of topics, like studies of software for safety-critical applications, development of different operator support systems, and software systems for building and implementing graphical user interfaces. The systems have ranged from simple prototypes to installations in process plants. In the development of these software systems, Halden has gained much experience in quality assurance of different types of software. This paper summarises the accumulated experience at the Halden Project in quality assurance of software systems. The different software systems being developed at the Halden Project may be grouped into three categories. These are plant-specific software systems (one-of-a-kind deliveries), generic software products, and safety-critical software systems. This classification has been found convenient as the categories have different requirements to the quality assurance process. In addition, the experience from use of software development tools and proprietary software systems at Halden, is addressed. The paper also focuses on the experience gained from the complete software life cycle, starting with the software planning phase and ending with software operation and maintenance

  1. Process model for building quality software on internet time ...

    African Journals Online (AJOL)

    The competitive nature of the software construction market and the inherently exhilarating nature of software itself have hinged the success of any software development project on four major pillars: time to market, product quality, innovation and documentation. Unfortunately, however, existing software development models ...

  2. Software application for quality control protocol of mammography systems

    International Nuclear Information System (INIS)

    Kjosevski, Vladimir; Gershan, Vesna; Ginovska, Margarita; Spasevska, Hristina

    2010-01-01

    Since quality control of a mammography system involves testing a large number of parameters, there is a clear need to use information technology for gathering, processing and storing all the parameters that result from this process. The main goal of this software application is to facilitate and automate the gathering, processing, storage and presentation of the data related to the qualification of the physical and technical parameters during quality control of the mammography system. The software application, along with its user interface and database, has been built with Microsoft Access 2003, part of the Microsoft Office 2003 software package, which was chosen as the development platform because it is the most commonly used office application among computer users in the country. This is important because it provides end users with a familiar working environment, without the need for additional training or upgrading of the computer skills they already possess. Most importantly, the software application is easy to use, fast in calculating the required parameters, and an effective way to store and display the results. The solution can also be scaled up so that many different users can work with it at the same time over the Internet. Given its many advantages, it is highly recommended that this system be implemented in the quality control process of mammography systems as soon as possible. (Author)

  3. Software Quality Assessment Tool Based on Meta-Models

    OpenAIRE

    Doneva Rositsa; Gaftandzhieva Silvia; Doneva Zhelyana; Staevsky Nevena

    2015-01-01

    In the software industry it is indisputably essential to control the quality of produced software systems in terms of capabilities for easy maintenance, reuse, portability and others in order to ensure reliability in software development. But it is also clear that such control is very difficult to achieve through ‘manual’ quality management. There are a number of approaches to software quality assurance based typically on software quality models (e.g. ISO 9126, McCall’s, Boehm’s...

  4. SAPHIRE 8 Software Quality Assurance Plan

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith

    2010-02-01

    This Quality Assurance (QA) Plan documents the QA activities that will be managed by the INL related to JCN N6423. The NRC developed the SAPHIRE computer code for performing probabilistic risk assessments (PRAs) using a personal computer (PC) at the Idaho National Laboratory (INL) under Job Code Number (JCN) L1429. SAPHIRE started out as a feasibility study for a PRA code to be run on a desktop PC and evolved through several phases into a state-of-the-art PRA code. The development of SAPHIRE was the result of two concurrent important events: the tremendous expansion of PC software and hardware capability in the 90s and the onset of a risk-informed regulation era.

  5. Improving Performance of Software Implemented Floating Point Addition

    DEFF Research Database (Denmark)

    Hindborg, Andreas Erik; Karlsson, Sven

    2011-01-01

    We outline and evaluate hardware extensions to an integer processor pipeline which allow IEEE 754 floating point (FP) addition to be efficiently implemented in software. With a very moderate increase in hardware resources, our performance evaluation shows that, for a benchmark that executes 12.5% FP...... addition instructions, our approach exhibits a relative slowdown of 3.38 to 15.15 as compared to dedicated hardware. This is a significant improvement over pure software emulation, which leads to relative slowdowns up to 45.33....
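
    To make the idea of software-implemented FP addition concrete, here is a minimal Python sketch (not the authors' code, which targets an integer processor pipeline) that adds two positive, normal IEEE 754 single-precision values using only integer operations; rounding, zeros, infinities, NaNs and subnormals are deliberately omitted.

      import struct

      def f32_bits(x):
          # Reinterpret a Python float as IEEE 754 single-precision bits.
          return struct.unpack('<I', struct.pack('<f', x))[0]

      def bits_f32(b):
          return struct.unpack('<f', struct.pack('<I', b))[0]

      def soft_add(a, b):
          # Add two positive, normal float32 values with integer operations only.
          ba, bb = f32_bits(a), f32_bits(b)
          ea, eb = (ba >> 23) & 0xFF, (bb >> 23) & 0xFF
          ma, mb = (ba & 0x7FFFFF) | 0x800000, (bb & 0x7FFFFF) | 0x800000   # restore hidden bit
          if ea < eb:                          # operand a gets the larger exponent
              ea, eb, ma, mb = eb, ea, mb, ma
          mb >>= (ea - eb)                     # align the smaller mantissa (truncating)
          m, e = ma + mb, ea
          if m & 0x1000000:                    # carry out of the hidden-bit position
              m >>= 1
              e += 1
          return bits_f32((e << 23) | (m & 0x7FFFFF))

      print(soft_add(1.5, 2.25))               # 3.75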

  6. Requirement analysis of the safety-critical software implementation for the nuclear power plant

    International Nuclear Information System (INIS)

    Chang, Hoon Seon; Jung, Jae Cheon; Kim, Jae Hack; Nam, Sang Ku; Kim, Hang Bae

    2005-01-01

    Safety-critical software shall be implemented under strict regulation and standards, along with hardware qualification. In real projects, safety-critical software has generally been implemented using functional block language (FBL) and a structured language such as C. The software design shall comply with characteristics such as modularity, simplicity, minimal use of subroutines, and exclusion of interrupt logic. To meet these prerequisites, we used a computer-aided software engineering (CASE) tool to substantiate the requirements traceability matrix that had been developed manually using word processors or spreadsheets. A coding standard and manual have also been developed to confirm the quality of the software development process, covering readability, consistency, and maintainability in compliance with NUREG/CR-6463. A system-level preliminary hazard analysis (PHA) is performed by analyzing the preliminary safety analysis report (PSAR) and the FMEA document. The modularity concept is effectively implemented for the overall module configurations and functions using the RTP software development tool. The response time imposed on the basis of the deterministic structure of the safety-critical software was measured.
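
    The requirements traceability matrix mentioned above can be pictured as a mapping from requirement identifiers to the design, code and test artefacts that cover them. The Python sketch below is a generic illustration with hypothetical identifiers, not the project's actual CASE-tool output.

      # Hypothetical traceability matrix: requirement id -> covering artefacts per phase.
      trace = {
          "SRS-001": {"design": ["SDD-3.1"], "code": ["trip_logic.fbl"], "test": ["TC-17"]},
          "SRS-002": {"design": ["SDD-3.4"], "code": ["comparator.c"],   "test": []},
          "SRS-003": {"design": [],          "code": [],                 "test": []},
      }

      # Flag requirements that are not yet traced to every life-cycle artefact.
      for req, links in trace.items():
          missing = [phase for phase, items in links.items() if not items]
          if missing:
              print(f"{req}: missing coverage in {', '.join(missing)}")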

  7. Lessons and challenges from software quality assessment: The ...

    African Journals Online (AJOL)

    ... human resource, result computation etc. We discussed these lessons and challenges across two measurable characteristics namely quality of design (life cycle stages) and quality of conformance. Finally, we also recommended the lessons and challenges from software quality management for space system software.

  8. Software Quality and Copyright: Issues in Computer-Assisted Instruction.

    Science.gov (United States)

    Helm, Virginia

    The two interconnected problems of educational quality and piracy are described and analyzed in this book, which begins with an investigation of the accusations regarding the alleged dismal quality of educational software. The reality behind accusations of rampant piracy and the effect of piracy on the quality of educational software is examined…

  9. Embedding Quality Function Deployment In Software Development ...

    African Journals Online (AJOL)

    Software development differs widely in concept, requirements and framework. The software engineer therefore has an enormous task in engineering functional software that works and is delivered on time. This paper focuses on how the customer's voice can be heard in order to reduce development and manufacturing costs, ...

  10. On Quality and Measures in Software Engineering

    Science.gov (United States)

    Bucur, Ion I.

    2006-01-01

    Complexity measures are mainly used to estimate vital information about reliability and maintainability of software systems from regular analysis of the source code. Such measures also provide constant feedback during a software project to assist the control of the development procedure. There exist several models to classify a software product's…

  11. Experiences with Software Quality Metrics in the EMI middleware

    International Nuclear Information System (INIS)

    Alandes, M; Meneses, D; Pucciani, G; Kenny, E M

    2012-01-01

    The EMI Quality Model has been created to define, and later review, the EMI (European Middleware Initiative) software product and process quality. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering – Product Quality to identify a set of characteristics that need to be present in the EMI software. For each software characteristic, such as portability, maintainability, compliance, etc, a set of associated metrics and KPIs (Key Performance Indicators) are identified. This article presents how the EMI Quality Model and the EMI Metrics have been defined in the context of the software quality assurance activities carried out in EMI. It also describes the measurement plan and presents some of the metrics reports that have been produced for the EMI releases and updates. It also covers which tools and techniques can be used by any software project to extract “code metrics” on the status of the software products and “process metrics” related to the quality of the development and support process such as reaction time to critical bugs, requirements tracking and delays in product releases.
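
    As a rough illustration of how per-characteristic KPIs can be rolled up from individual metrics, the Python sketch below aggregates normalized metric values into one score per ISO/IEC 9126-style characteristic. The metric names, weights and threshold are hypothetical and are not the actual EMI definitions.

      # Hypothetical roll-up of normalized metrics (values in [0, 1]) into per-characteristic KPIs.
      metrics = {
          "maintainability": {"cyclomatic_complexity_ok": 0.82, "comment_ratio_ok": 0.64},
          "reliability":     {"unit_test_pass_rate": 0.97, "open_critical_bugs_ok": 0.75},
          "portability":     {"platforms_supported_ratio": 0.67},
      }

      def kpi(values, weights=None):
          # Weighted mean of the normalized metric values; equal weights by default.
          weights = weights or {name: 1.0 for name in values}
          total = sum(weights[name] for name in values)
          return sum(values[name] * weights[name] for name in values) / total

      for characteristic, values in metrics.items():
          score = kpi(values)
          status = "meets goal" if score >= 0.7 else "below goal"   # illustrative threshold
          print(f"{characteristic:15s} KPI = {score:.2f} ({status})")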

  12. Experiences with Software Quality Metrics in the EMI middleware

    Science.gov (United States)

    Alandes, M.; Kenny, E. M.; Meneses, D.; Pucciani, G.

    2012-12-01

    The EMI Quality Model has been created to define, and later review, the EMI (European Middleware Initiative) software product and process quality. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering - Product Quality to identify a set of characteristics that need to be present in the EMI software. For each software characteristic, such as portability, maintainability, compliance, etc, a set of associated metrics and KPIs (Key Performance Indicators) are identified. This article presents how the EMI Quality Model and the EMI Metrics have been defined in the context of the software quality assurance activities carried out in EMI. It also describes the measurement plan and presents some of the metrics reports that have been produced for the EMI releases and updates. It also covers which tools and techniques can be used by any software project to extract “code metrics” on the status of the software products and “process metrics” related to the quality of the development and support process such as reaction time to critical bugs, requirements tracking and delays in product releases.

  13. Implementing Resource-aware Multicast Forwarding in Software Defined Networks

    DEFF Research Database (Denmark)

    Poderys, Justas; Sunny, Anjusha; Soler, José

    2018-01-01

    Using multicast data transmissions, data can be efficiently distributed to a high number of network users. However, in order to efficiently stream multimedia using multicast communication, multicast routing protocols must have knowledge of all network links and their available bandwidth. In Software......-Karp algorithm, by taking into account network topology and links load information. This paper presents the algorithm, implementation details, and an analysis of the testing results....

  14. Quality Assurance in Software Development: An Exploratory Investigation in Software Project Failures and Business Performance

    Science.gov (United States)

    Ichu, Emmanuel A.

    2010-01-01

    Software quality is perhaps one of the most sought-after attributes in product development; however, this goal is unattained. Problem factors in software development, and how these have affected the maintainability of the delivered software systems, require a thorough investigation. It was, therefore, very important to understand software…

  15. Pragmatic quality metrics for evolutionary software development models

    Science.gov (United States)

    Royce, Walker

    1990-01-01

    Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.

  16. Building quality into performance and safety assessment software

    International Nuclear Information System (INIS)

    Wojciechowski, L.C.

    2011-01-01

    Quality assurance is integrated throughout the development lifecycle for performance and safety assessment software. The software used in the performance and safety assessment of a Canadian deep geological repository (DGR) follows the CSA quality assurance standard CSA-N286.7 [1], Quality Assurance of Analytical, Scientific and Design Computer Programs for Nuclear Power Plants. Quality assurance activities in this standard include tasks such as verification and inspection; however, much more is involved in producing a quality software computer program. The types of errors found with different verification methods are described. The integrated quality process ensures that defects are found and corrected as early as possible. (author)

  17. A Model for Quality Optimization in Software Design Processes

    NARCIS (Netherlands)

    Noppen, J.A.R.; van den Broek, P.M.; Aksit, Mehmet

    The main objective of software engineers is to design and implement systems that satisfy all functional and non-functional requirements. Unfortunately, it is very difficult or even generally impossible to deliver a software system that satisfies all the requirements. Even more seriously, failures

  18. The 7 C's for Creating Living Software: A Research Perspective for Quality-Oriented Software Engineering

    NARCIS (Netherlands)

    Aksit, Mehmet

    2004-01-01

    This article proposes the 7 C's for realizing quality-oriented software engineering practices. All the desired qualities of this approach are expressed in short by the term living software. The 7 C's are: Concern-oriented processes, Canonical models, Composable models, Certifiable models,

  19. International Liability Issues for Software Quality

    National Research Council Canada - National Science Library

    Mead, Nancy

    2003-01-01

    This report focuses on international law related to cybercrime, international information security standards, and software liability issues as they relate to information security for critical infrastructure applications...

  20. Continuous software quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The software for the ATLAS experiment on the Large Hadron Collider at CERN has evolved over many years to meet the demands of Monte Carlo simulation, particle detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by an active worldwide developer community. In order to run the experiment software efficiently at hundreds of computing centres it is essential to maintain high software quality standards. Methods are proposed to improve software quality practices by incorporating checks into the new ATLAS software build infrastructure.

  1. Systems and software quality the next step for industrialisation

    CERN Document Server

    Wieczorek, Martin; Bons, Heinz

    2014-01-01

    Software and systems quality is playing an increasingly important role in the growth of almost all - profit and non-profit - organisations. Quality is vital to the success of enterprises in their markets. Most small trade and repair businesses use software systems in their administration and marketing processes. Every doctor's surgery is managing its patients using software. Banking is no longer conceivable without software. Aircraft, trucks and cars use more and more software to handle their increasingly complex technical systems. Innovation, competition and cost pressure are always present i

  2. Experiences with Software Quality Metrics in the EMI Middleware

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The EMI Quality Model has been created to define, and later review, the EMI (European Middleware Initiative) software product and process quality. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering – Product Quality to identify a set of characteristics that need to be present in the EMI software. For each software characteristic, such as portability, maintainability, compliance, etc, a set of associated metrics and KPIs (Key Performance Indicators) are identified. This article presents how the EMI Quality Model and the EMI Metrics have been defined in the context of the software quality assurance activities carried out in EMI. It also describes the measurement plan and presents some of the metrics reports that have been produced for the EMI releases and updates. It also covers which tools and techniques can be used by any software project t...

  3. Experiences with Software Quality Metrics in the EMI middleware

    CERN Document Server

    Alandes, M; Meneses, D; Pucciani, G

    2012-01-01

    The EMI Quality Model has been created to define, and later review, the EMI (European Middleware Initiative) software product and process quality. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering – Product Quality to identify a set of characteristics that need to be present in the EMI software. For each software characteristic, such as portability, maintainability, compliance, etc, a set of associated metrics and KPIs (Key Performance Indicators) are identified. This article presents how the EMI Quality Model and the EMI Metrics have been defined in the context of the software quality assurance activities carried out in EMI. It also describes the measurement plan and presents some of the metrics reports that have been produced for the EMI releases and updates. It also covers which tools and techniques can be used by any software project to ...

  4. Automated software quality visualisation using fuzzy logic techniques

    OpenAIRE

    Senior, J; Allison, I; Tepper, J

    2007-01-01

    In the past decade there has been a concerted effort by the software industry to improve the quality of its products. This has led to the inception of various techniques with which to control and measure the process involved in software development. Methods like the Capability Maturity Model have introduced processes and strategies that require measurement in the form of software metrics. With the ever increasing number of software metrics being introduced by capability based processes, softw...

  5. Software Quality - Introduction to the Special Theme

    NARCIS (Netherlands)

    A. Cleve (Anthony); J.J. Vinju (Jurgen)

    2014-01-01

    The introduction of fast and cheap computer and networking hardware enables the spread of software. Software, in a nutshell, represents an unprecedented ability to channel creativity and innovation. The joyful act of simply writing computer programs for existing ICT infrastructure can

  6. A multiobjective module-order model for software quality enhancement

    NARCIS (Netherlands)

    Khoshgoftaar, TM; Liu, Y; Seliya, N

    2004-01-01

    The knowledge, prior to system operations, of which program modules are problematic is valuable to a software quality assurance team, especially when there is a constraint on software quality enhancement resources. A cost-effective approach for allocating such resources is to obtain a prediction in

  7. A pattern framework for software quality assessment and tradeoff analysis

    NARCIS (Netherlands)

    Folmer, Eelke; Boscht, Jan

    The earliest design decisions often have a significant impact on software quality and are the most costly to revoke. One of the challenges in architecture design is to reduce the frequency of retrofit problems in software designs; not being able to improve the quality of a system cost effectively, a

  8. Faster issue resolution with higher technical quality of software

    NARCIS (Netherlands)

    Bijlsma, D.; Ferreira, M.A.; Luijten, B.; Visser, J.

    2011-01-01

    We performed an empirical study of the relation between technical quality of software products and the issue resolution performance of their maintainers. In particular, we tested the hypothesis that ratings for source code maintainability, as employed by the Software Improvement Group (SIG) quality

  9. A software quality model and metrics for risk assessment

    Science.gov (United States)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the model for the basis for a discussion on risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  10. SOFTWARE TOOLS FOR COMPUTING EXPERIMENT AIMED AT MULTIVARIATE ANALYSIS IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    A. V. Tyurin

    2015-09-01

    Full Text Available A concept for the organization and planning of a computational experiment aimed at the implementation of multivariate analysis of complex multifactor models is proposed. It is based on the generation of a calculations tree. The logical and structural schemes of the tree are given, together with software tools for automating work with it: calculation generation, carrying out the calculations, and analysis of the obtained results. Computer modeling systems and such special-purpose systems as RACS and PRADIS do not solve the problems connected with effectively carrying out a computational experiment, which consists of its organization, planning, execution and analysis of the results. For the organization of a computational experiment, calculation data storage is proposed in the form of an input and output data tree, where each tree node holds a reference to the model calculation step performed earlier. The calculations tree is stored in a specially organized directory structure. A software tool is proposed for creating and modifying a design scheme that stores the structure of one branch of the calculation tree, with a view to effective planning of multivariate calculations. A set of special-purpose software tools allows quick generation and modification of the tree and the addition of calculations with step-by-step changes in the model factors. To perform the calculations, a software environment in the form of a graphical user interface for creating and modifying calculation scripts has been developed. This environment makes it possible to traverse the calculation tree in a certain order and to initiate computational modules serially or in parallel. To analyze the results, a software tool operating on a tag tree has been developed. This is a special tree that stores the input and output data of the calculations in the form of sets of changes of the corresponding model factors. The tool enables selection of the factors and responses of the model at various steps
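
    A minimal Python sketch of the calculation-tree idea described above: each node stores the factor values changed at that step and a reference to its parent calculation, so that one branch of the tree reconstructs a complete multivariate variant. The class and field names are illustrative and are not taken from the paper.

      # Illustrative calculation tree: each node records the factors changed at this
      # step and points back to the calculation it was derived from.
      class CalcNode:
          def __init__(self, factors, parent=None):
              self.factors = factors          # factors changed at this step, e.g. {"load": 1.2}
              self.parent = parent
              self.children = []
              if parent is not None:
                  parent.children.append(self)

          def effective_factors(self):
              # Merge factor values along the branch from the root down to this node.
              merged = {} if self.parent is None else dict(self.parent.effective_factors())
              merged.update(self.factors)
              return merged

      root = CalcNode({"load": 1.0, "temperature": 300})
      step1 = CalcNode({"load": 1.2}, parent=root)
      step2 = CalcNode({"temperature": 320}, parent=step1)
      print(step2.effective_factors())        # {'load': 1.2, 'temperature': 320}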

  11. Guidance and Control Software Project Data - Volume 4: Configuration Management and Quality Assurance Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.

  12. Software quality assurance plan for the National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  13. An Approach to Measuring Software Quality Perception

    Science.gov (United States)

    Hofman, Radoslaw

    Perception measurement and perception management are an emerging approach in the area of product management. Cognitive, psychological, behavioral and neurological theories, tools and methods are being employed to better understand the mechanisms of consumers' attitudes and decision processes. Software is also defined as a product; however, this kind of product is significantly different from all others. Software products are intangible, and it is difficult to trace their characteristics, which are strongly dependent on a dynamic context of use.

  14. Multistage switching hardware and software implementations for student experiment purpose

    Science.gov (United States)

    Sani, A.; Suherman

    2018-02-01

    Current communication and internet networks are underpinned by the switching technologies that interconnect one network to another. Students’ understanding of networks relies on how they cover the theories, and studying the theory without touching real hardware may leave gaps in their overall knowledge. This paper reports the progress of a multistage switching design and implementation for student laboratory activities. The hardware and software designs are based on a three-stage Clos switching architecture built from modular 2x2 switches and controlled by an Arduino microcontroller. The designed modules can also be extended to Batcher and banyan switches, and can operate in both circuit- and packet-switching systems. The circuit analysis and simulation show that the blocking probability for each switch combination can be obtained by generating random or patterned traffic. The mathematical model and simulation analysis show a 16.4% difference in blocking probability when the generated traffic is uniform. The circuit design components and interfacing solution have been identified to allow the next implementation step.
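
    For comparison with the measured results mentioned above, an analytic estimate of the blocking probability of a three-stage Clos network can be obtained from Lee's graph approximation. The Python sketch below uses that standard formula; it is an assumption for illustration, since the paper itself measures blocking by generating traffic.

      # Lee's approximation for a three-stage Clos network: n inputs per first-stage
      # switch, k middle-stage switches, p = probability that an inlet is busy.
      def clos_blocking(n, k, p):
          p_link = p * n / k                           # occupancy of an inter-stage link
          return (1.0 - (1.0 - p_link) ** 2) ** k

      # Example: a small fabric built from 2x2 modules (n = 2) with k = 2 middle switches.
      for p in (0.1, 0.3, 0.5, 0.7):
          print(f"inlet load {p:.1f}: blocking ~ {clos_blocking(2, 2, p):.4f}")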

  15. Design, Implementation, and Performance of CREAM Data Acquisition Software

    International Nuclear Information System (INIS)

    Zinn, S.-Y.; Ahn, H.S.; Bagliesi, M.G.; Beatty, J.J.; Childers, J.T.; Coutu, S.; DuVernois, M.A.; Ganel, O.; Kim, H.J.; Lee, M.H.; Lutz, L.; Malinine, A.; Maestro, P.; Marrocchesi, P.S.; Park, I.H.; Seo, E.S.; Song, C.; Swordy, S.; Wu, J.

    2006-01-01

    Cosmic Ray Energetics and Mass (CREAM) is a balloon-borne experiment scheduled for launching from Antarctica in late 2004. Its aim is to measure the energy spectrum and composition of cosmic rays from proton to iron nuclei at ultra high energies from 1 to 1,000 TeV. Ultra long duration balloons are expected to fly about 100 days. One special feature of the CREAM data acquisition software (CDAQ) is the telemetric operation of the instrument using satellites. During a flight the science event and housekeeping data are sent from the instrument to a ground facility. Likewise, commands for controlling both the hardware and the software are uploaded from the ground facility. This requires a robust, reliable, and fast software system. CDAQ has been developed and tested during three beam tests at CERN in July, September, and November 2003. Recently the interfaces to the transition radiation detector (TRD) and to the timing-based charge detector (TCD) have been added. These new additions to CDAQ will be checked at a thermal/vacuum test of the instrument at NASA. The design, implementation, and performance of CDAQ are reported

  16. Criteria and tools for scientific software quality measurements

    International Nuclear Information System (INIS)

    Tseng, M.Y.

    1995-12-01

    Not all software used in the nuclear industry needs the rigorous formal verification, reliability testing and quality assessment that are being applied to safety critical software. Recently, however, there is increasing recognition that systematic and objective quality assessment of the scientific software used in design and safety analyses of nuclear facilities is necessary to support safety and licensing decisions. Because of the complexity and large size of these programs and the resource constraints faced by the AECB reviewer, it is desirable that appropriate automated tools are used wherever practical. To objectively assess the quality of software, a set of attributes of a software product by which its quality is described and evaluated must be established. These attributes must be relevant to the application domain of software under evaluation. To effectively assess the quality of software, metrics defining quantitative scale and method appropriate to determine the value of attributes need to be applied. To cost-effectively perform the evaluation, use of suitable automated tools is desirable. In this project, criteria for evaluating the quality of scientific software are presented; metrics for which those criteria can be evaluated are identified; a survey of automated tools to measure those metrics was conducted and the most appropriate tool (QA Fortran) was acquired; and the tool usage was demonstrated on three sample programs. (author) 5 refs

  17. Tool Use Within NASA Software Quality Assurance

    Science.gov (United States)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  18. Software quality assurance procedures for radioactive waste risk assessment codes

    International Nuclear Information System (INIS)

    Hill, I.; Mayer, J.

    1990-01-01

    This support study for the evaluation of the safety of geological disposal systems is aimed at identifying the requirements for software quality assurance procedures for radioactive waste risk assessment codes and at recommending appropriate procedures. The research covers: (i) the analysis of existing procedures and definition of requirements; (ii) a case study of the use of some existing procedures; (iii) the definition and the implementation of procedures. The report is supported by appendices that give more detail on the procedures recommended. It is intended to provide ideas on the steps that should be taken to ensure the quality of the programs used for assessment of the safety case for radioactive waste repositories, and does not represent the introduction of wholly new ideas or techniques. The emphasis throughout is on procedures that will be easily implemented, rather than on the fully rigorous procedures that are required for some application areas. The study has concentrated on measures that will increase the confidence in repository performance assessments among the wider scientific/engineering community, and the lay public.

  19. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    Science.gov (United States)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure free operation of a computer program for a specified time and environment.

  20. Cluster implementation for parallel computation within MATLAB software environment

    International Nuclear Information System (INIS)

    Santana, Antonio O. de; Dantas, Carlos C.; Charamba, Luiz G. da R.; Souza Neto, Wilson F. de; Melo, Silvio B. Melo; Lima, Emerson A. de O.

    2013-01-01

    A cluster for parallel computation with the MATLAB software, the COCGT - Cluster for Optimizing Computing in Gamma ray Transmission methods - is implemented. The implementation corresponds to the creation of a local network of computers, facilities and software configurations, as well as the execution of cluster tests to determine and optimize data-processing performance. The COCGT implementation was required for the computation of data from gamma transmission measurements applied to fluid dynamics and tomography reconstruction in an FCC (Fluid Catalytic Cracking) cold pilot unit, as well as simulation data. As an initial test, the determination of the SVD - Singular Value Decomposition - of a random matrix of dimension (n, n), n=1000, using the modified Girco's law, revealed that COCGT was faster in comparison to the cluster of the literature [1], which is similar and operates under the same conditions. The solution of a system of linear equations provided a further test of COCGT performance: processing a square matrix with n=10000 took 27 s, and a square matrix with n=12000 took 45 s. To determine the cluster behavior with respect to 'parfor' (parallel for-loop) and 'spmd' (single program multiple data), two codes were used containing these two commands and the same problem: determination of the SVD of a square matrix with n=1000. The execution of the codes by means of COCGT showed: 1) for the code with 'parfor', the performance improved with the number of labs from 1 to 8; 2) for the code with 'spmd', just 1 lab (core) was enough to process and give results in less than 1 s. A similar situation was then examined, with the difference that the SVD was now determined from a square matrix with n=1500 for the code with 'parfor', and n=7000 for the code with 'spmd'. These results lead to the conclusions: 1) for the code with 'parfor', the behavior was the same as already described above; 2) for the code with 'spmd', the same, besides having produced a larger performance, it supports a
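
    For readers without MATLAB, the 'parfor'-style timing test can be approximated with a rough Python analogue (this is not the COCGT code): distribute SVDs of random n x n matrices over a pool of worker processes and time the whole batch. The worker count and matrix sizes below are arbitrary.

      import time
      import numpy as np
      from multiprocessing import Pool

      def svd_job(n):
          # Singular values only; return the largest one as a cheap sanity check.
          return np.linalg.svd(np.random.rand(n, n), compute_uv=False)[0]

      if __name__ == "__main__":
          start = time.time()
          with Pool(processes=4) as pool:              # 4 workers play the role of 4 "labs"
              largest = pool.map(svd_job, [1000] * 8)
          print(f"8 SVDs of 1000x1000 matrices in {time.time() - start:.1f} s")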

  1. Total quality management implementation guidelines

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-01

    These Guidelines were designed by the Energy Quality Council to help managers and supervisors in the Department of Energy Complex bring Total Quality Management to their organizations. Because the Department is composed of a rich mixture of diverse organizations, each with its own distinctive culture and quality history, these Guidelines are intended to be adapted by users to meet the particular needs of their organizations. For example, for organizations that are well along on their quality journeys and may already have achieved quality results, these Guidelines will provide a consistent methodology and terminology reference to foster their alignment with the overall Energy quality initiative. For organizations that are just beginning their quality journeys, these Guidelines will serve as a startup manual on quality principles applied in the Energy context.

  2. Implementation of an OAIS Repository Using Free, Open Source Software

    Science.gov (United States)

    Flathers, E.; Gessler, P. E.; Seamon, E.

    2015-12-01

    The Northwest Knowledge Network (NKN) is a regional data repository located at the University of Idaho that focuses on the collection, curation, and distribution of research data. To support our home institution and others in the region, we offer services to researchers at all stages of the data lifecycle—from grant application and data management planning to data distribution and archive. In this role, we recognize the need to work closely with other data management efforts at partner institutions and agencies, as well as with larger aggregation efforts such as our state geospatial data clearinghouses, data.gov, DataONE, and others. In the past, one of our challenges with monolithic, prepackaged data management solutions is that customization can be difficult to implement and maintain, especially as new versions of the software are released that are incompatible with our local codebase. Our solution is to break the monolith up into its constituent parts, which offers us several advantages. First, any customizations that we make are likely to fall into areas that can be accessed through Application Program Interfaces (API) that are likely to remain stable over time, so our code stays compatible. Second, as components become obsolete or insufficient to meet new demands that arise, we can replace the individual components with minimal effect on the rest of the infrastructure, causing less disruption to operations. Other advantages include increased system reliability, staggered rollout of new features, enhanced compatibility with legacy systems, reduced dependence on a single software company as a point of failure, and the separation of development into manageable tasks. In this presentation, we describe our application of the Service Oriented Architecture (SOA) design paradigm to assemble a data repository that conforms to the Open Archival Information System (OAIS) Reference Model primarily using a collection of free and open-source software. We detail the design

  3. Continuous Software Quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The regular application of software quality tools in large collaborative projects is required to reduce code defects to an acceptable level. If left unchecked, the accumulation of defects invariably results in performance degradation at scale and problems with the long-term maintainability of the code. Although software quality tools are effective for identification, there remains a non-trivial sociological challenge to resolve defects in a timely manner. This is an ongoing concern for the ATLAS software, which has evolved over many years to meet the demands of Monte Carlo simulation, detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by a community of hundreds of developers worldwide. It is therefore preferable to address code defects before they are introduced into a widely used software release. Recent wholesale changes to the ATLAS software infrastructure have provided an ideal opportunity to apply software quali...

  4. Bottlenecks in Software Defect Prediction Implementation in Industrial Projects

    OpenAIRE

    Hryszko Jarosław; Madeyski Lech

    2015-01-01

    Case studies focused on software defect prediction in real, industrial software development projects are extremely rare. We report on a dedicated R&D project established in cooperation between Wroclaw University of Technology and one of the leading automotive software development companies to investigate the possibilities of introducing software defect prediction using an open-source, extensible software measurement and defect prediction framework called DePress (Defect Prediction in Software Syst...

  5. Implementation of Software Configuration Management Process by Models: Practical Experiments and Learned Lessons

    Directory of Open Access Journals (Sweden)

    Bartusevics Arturs

    2014-12-01

    Full Text Available Nowadays the software configuration management process is no longer only a question of which system should be used for version control or how to merge changes from one source code branch to another. There are multiple tasks, such as version control, build management, deployment management, status accounting, bug tracking and many others, that must be handled to support the full configuration management process according to the most popular quality standards. The main scope of the process is to include only valid and tested software items in the final version of the product and to prepare a new version as soon as possible. To implement the different tasks of the software configuration management process, a set of different tools, scripts and utilities must be used. The current paper provides a new model-based approach to the implementation of configuration management. Using different models, the new approach helps to organize existing solutions and develop new ones in a parameterized way, thus increasing the reuse of solutions. The study provides a general description of the new model-based conception and definitions of all the models needed to implement the new approach. The second part of the paper contains an overview of criteria, practical experiments and lessons learned from using the new models in software configuration management. Finally, further work is defined based on the results of the practical experiments and lessons learned.

  6. An Initial Quality Analysis of the Ohloh Software Evolution Data

    NARCIS (Netherlands)

    Bruntink, M.

    2014-01-01

    Large public data sets on software evolution promise great value to both researchers and practitioners, in particular for software (development) analytics. To realise this value, the data quality of such data sets needs to be studied and improved. Despite these data sets being of a secondary nature,

  7. A survey of Canadian medical physicists: software quality assurance of in-house software.

    Science.gov (United States)

    Salomons, Greg J; Kelly, Diane

    2015-01-05

    This paper reports on a survey of medical physicists who write and use in-house written software as part of their professional work. The goal of the survey was to assess the extent of in-house software usage and the desire or need for related software quality guidelines. The survey contained eight multiple-choice questions, a ranking question, and seven free text questions. The survey was sent to medical physicists associated with cancer centers across Canada. The respondents to the survey expressed interest in having guidelines to help them in their software-related work, but also demonstrated extensive skills in the area of testing, safety, and communication. These existing skills form a basis for medical physicists to establish a set of software quality guidelines.

  8. Quality Market: Design and Field Study of Prediction Market for Software Quality Control

    Science.gov (United States)

    Krishnamurthy, Janaki

    2010-01-01

    Given the increasing competition in the software industry and the critical consequences of software errors, it has become important for companies to achieve high levels of software quality. While cost reduction and timeliness of projects continue to be important measures, software companies are placing increasing attention on identifying the user…

  9. Evolvability as a Quality Attribute of Software Architectures

    NARCIS (Netherlands)

    Ciraci, S.; van den Broek, P.M.; Duchien, Laurence; D'Hondt, Maja; Mens, Tom

    We review the definition of evolvability as it appears in the literature. In particular, the concept of software evolvability is compared with other system quality attributes, such as adaptability, maintainability and modifiability.

  10. Human Factors in Software Development Processes: Measuring System Quality

    DEFF Research Database (Denmark)

    Abrahão, Silvia; Baldassarre, Maria Teresa; Caivano, Danilo

    2016-01-01

    Software Engineering and Human-Computer Interaction look at the development process from different perspectives. They apparently use very different approaches, are inspired by different principles and address different needs. But, they definitively have the same goal: develop high quality software...... in the most effective way. The second edition of the workshop puts particular attention on efforts of the two communities in enhancing system quality. The research question discussed is: who, what, where, when, why, and how should we evaluate?...

  11. A New Software Quality Model for Evaluating COTS Components

    OpenAIRE

    Adnan Rawashdeh; Bassem Matalkah

    2006-01-01

    Studies show that COTS-based (commercial off-the-shelf) systems built in recent years exceed 40% of all developed software systems. Therefore, a model that ensures the quality characteristics of such systems becomes a necessity. Among the most critical processes in COTS-based development are the evaluation and selection of the COTS components. There are several existing quality models used to evaluate software systems in general; however, none of them is dedicated to COTS-based s...

  12. Quality Management Systems Implementation Compared With Organizational Maturity in Hospital.

    Science.gov (United States)

    Moradi, Tayebeh; Jafari, Mehdi; Maleki, Mohammad Reza; Naghdi, Seyran; Ghiasvand, Hesam

    2015-07-27

    A quality management system can provide a framework for continuous improvement in order to increase the probability of customers' and other stakeholders' satisfaction. The test maturity model helps organizations to assess the degree of maturity in implementing effective and sustained quality management systems, plan based on the current realities of the organization and prioritize their improvement programs. We aim to investigate and compare the level of organizational maturity in hospitals with the status of quality management systems implementation. This analytical cross sectional study was conducted among hospital administrators and quality experts working in hospitals with over 200 beds located in Tehran. In the first step, 32 hospitals were selected and then 96 employees working in the selected hospitals were studied. The data were gathered using the implementation checklist of quality management systems and the organization maturity questionnaire derived from ISO 10014. The content validity was calculated using the Lawshe method and the reliability was estimated using the test-retest method and calculation of Cronbach's alpha coefficient. The descriptive and inferential statistics were used to analyze the data using SPSS 18 software. According to the results, the mean score of organizational maturity among hospitals in the first stage of quality management systems implementation was equal to those in the third stage and the hypothesis was rejected (p-value = 0.093). In general, there is no significant difference in the organizational maturity between the first and third level hospitals (in terms of implementation of quality management systems). Overall, the findings of the study show that there is no significant difference in the organizational maturity between the hospitals in different levels of the quality management systems implementation and in fact, the maturity of the organizations cannot be attributed to the implementation of such systems. As a result, hospitals

  13. Sandia National Laboratories ASCI Applications Software Quality Engineering Practices; TOPICAL

    International Nuclear Information System (INIS)

    ZEPPER, JOHN D.; ARAGON, KATHRYN MARY; ELLIS, MOLLY A.; BYLE, KATHLEEN A.; EATON, DONNA SUE

    2002-01-01

    This document provides a guide to the deployment of the software verification activities, software engineering practices, and project management principles that guide the development of Accelerated Strategic Computing Initiative (ASCI) applications software at Sandia National Laboratories (Sandia). The goal of this document is to identify practices and activities that will foster the development of reliable and trusted products produced by the ASCI Applications program. Document contents include an explanation of the structure and purpose of the ASCI Quality Management Council, an overview of the software development lifecycle, an outline of the practices and activities that should be followed, and an assessment tool. These sections map practices and activities at Sandia to the ASCI Software Quality Engineering: Goals, Principles, and Guidelines, a Department of Energy document

  14. [Software for illustrating a cost-quality balance carried out by clinical laboratory practice].

    Science.gov (United States)

    Nishibori, Masahiro; Asayama, Hitoshi; Kimura, Satoshi; Takagi, Yasushi; Hagihara, Michio; Fujiwara, Mutsunori; Yoneyama, Akiko; Watanabe, Takashi

    2010-09-01

    We have no proper reference indicating the quality of clinical laboratory practice, one which would clearly illustrate that better medical tests require more expense. The Japanese Society of Laboratory Medicine, concerned about the recent difficult medical economy, issued a committee report proposing a guideline to evaluate good laboratory practice. According to the guideline, we developed software that illustrates the cost-quality balance achieved by clinical laboratory practice. We encountered a number of controversial problems, for example how to measure and weight each quality-related factor, how to calculate the costs of a laboratory test, and how to take the characteristics of a clinical laboratory into account. Consequently we finished only prototype software within the given period and budget. In this paper, the software implementation of the guideline and the above-mentioned problems are summarized. Aiming to stimulate these discussions, the operative software will be put on the Society's homepage for trial.

  15. Measuring Software Product Quality: The ISO 25000 Series and CMMI

    Science.gov (United States)

    2004-06-14

    [Slide extract; only fragments survive indexing] The presentation relates CMMI to the ISO 9126/25000 series: CMMI takes a total life cycle view and is inclusive in its approach to requirements development; product quality requirements are a key point in the relationship between CMMI and ISO 9126/25000; and the ISO/IEC 9126 quality-in-use model covers effectiveness, safety and productivity (Carnegie Mellon University, Software Engineering Institute slides).

  16. MCNP software quality : then and now /

    Energy Technology Data Exchange (ETDEWEB)

    Giesler, G. C. (Gregg Carl)

    2001-01-01

    MCNP is the Monte Carlo N-Particle radiation transport code whose history dates back more than half a century to the early days of computing. From a simple beginning, its uses have grown to include fields such as criticality safety, radiation shielding, oil well logging, and medical imaging and diagnostics, and it has gained an international user community of over 3000 users. Such a large user community could only come about through the maintenance of software quality throughout the code's history. This paper will describe how the quality was maintained in the past, how the process is being improved today, and directions for future efforts.

  17. C++ Software Quality in the ATLAS experiment

    CERN Document Server

    Roe, Shaun; The ATLAS collaboration; Kluth, Stefan; Seuster, Rolf; Snyder, Scott; Obreshkov, Emil; Sherwood, Peter; Stewart, Graeme

    2016-01-01

    In this paper we explain how the C++ code quality is managed in ATLAS using a range of tools from compile-time through to run time testing and reflect on the substantial progress made in the last two years largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing with an example of how the googlemock framework can be applied to our codebase.
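
    The record mentions applying the googlemock framework (a C++ mocking library) to the ATLAS codebase. As a language-neutral illustration of the underlying idea only - replacing a collaborator with a mock and asserting on the interaction - the sketch below uses Python's standard-library unittest.mock; the class and method names are invented and do not come from ATLAS code.

    ```python
    import unittest
    from unittest.mock import Mock

    # Hypothetical production class: names are illustrative, not from the ATLAS codebase.
    class TrackFitter:
        def __init__(self, geometry_service):
            self.geometry = geometry_service

        def fit(self, hits):
            # Ask the collaborator for the detector alignment, then "fit" trivially.
            alignment = self.geometry.alignment_for(hits[0])
            return {"n_hits": len(hits), "alignment": alignment}

    class TrackFitterTest(unittest.TestCase):
        def test_fit_queries_geometry_once(self):
            geometry = Mock()
            geometry.alignment_for.return_value = "nominal"
            fitter = TrackFitter(geometry)

            result = fitter.fit(["hit0", "hit1"])

            self.assertEqual(result["n_hits"], 2)
            self.assertEqual(result["alignment"], "nominal")
            geometry.alignment_for.assert_called_once_with("hit0")

    if __name__ == "__main__":
        unittest.main()
    ```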

  18. Coal Quality Expert: Status and software specifications

    International Nuclear Information System (INIS)

    Harrison, C.D.

    1992-01-01

    Under the Clean Coal Technology Program (Clean Coal Round 1), the US Department of Energy (DOE) and the Electric Power Research Institute (EPRI) are funding the development and demonstration of a computer program called the Coal Quality Expert (CQE trademark). When finished, the CQE will be a comprehensive PC-based program which can be used to evaluate several potential coal cleaning, blending, and switching options to reduce power plant emissions while minimizing generation costs. The CQE will be flexible in nature and capable of evaluating various qualities of coal, available transportation options, performance issues, and alternative emissions control strategies. This allows the CQE to determine the most cost-effective coal and the least expensive emissions control strategy for a given plant. To accomplish this, the CQE will be composed of technical models to evaluate performance issues; environmental models to evaluate environmental and regulatory issues; and cost estimating models to predict costs for installations of new and retrofit coal cleaning processes, power production equipment, and emissions control systems, as well as other production costs such as consumables (fuel, scrubber additive, etc.), waste disposal, operating and maintenance, and replacement energy costs. These technical, environmental, and economic models, as well as a graphical user interface, will be developed for the CQE. In addition, to take advantage of already existing capability, the CQE will rely on seamless integration of already proven and extensively used computer programs such as the EPRI Coal Quality Information Systems, Coal Quality Impact Model (CQIM trademark), and NOx Pert. 2 figs

  19. A survey of Canadian medical physicists: software quality assurance of in‐house software

    Science.gov (United States)

    Kelly, Diane

    2015-01-01

    This paper reports on a survey of medical physicists who write and use in‐house written software as part of their professional work. The goal of the survey was to assess the extent of in‐house software usage and the desire or need for related software quality guidelines. The survey contained eight multiple‐choice questions, a ranking question, and seven free text questions. The survey was sent to medical physicists associated with cancer centers across Canada. The respondents to the survey expressed interest in having guidelines to help them in their software‐related work, but also demonstrated extensive skills in the area of testing, safety, and communication. These existing skills form a basis for medical physicists to establish a set of software quality guidelines. PACS number: 87.55.Qr PMID:25679168

  20. Designing and implementing a state quality award

    Science.gov (United States)

    Dobson, E. N.

    1993-02-01

    To remain competitive in today's global economy, businesses need to ensure customer satisfaction by offering high-quality products and services. Governors and state governments can play a critical role in ensuring the economic health of the business in their state by encouraging the adoption of quality practices and recognizing successful efforts by firms to improve quality and productivity. The manual is intended to help state government officials and other individuals implement a state quality award program.

  1. Software quality assurance (SQA) for Savannah River reactors

    Energy Technology Data Exchange (ETDEWEB)

    Schaumann, C.M.

    1990-01-01

    Over the last 25 years, the Savannah River Site (SRS) has developed a strong Software Quality Assurance (SQA) program. It provides the information and management controls required of a high quality, auditable system. The SRS SQA program provides the framework for meeting the requirements of increasing regulation.

  2. Faster Defect Resolution with Higher Technical Quality of Software

    NARCIS (Netherlands)

    Luijten, B.; Visser, J.

    2010-01-01

    We performed an empirical study of the relation between technical quality of software products and the defect resolution performance of their maintainers. In particular, we tested the hypothesis that ratings for source code maintainability, as employed by the SIG quality model, are correlated with

  3. Framework for implementing product portfolio management in software business

    NARCIS (Netherlands)

    Jagroep, Erik; Van De Weerd, Inge; Brinkkemper, Sjaak; Dobbe, Ton

    2014-01-01

    Whether a software product company takes up a project depends on the strategic decisions that are made with regard to an organization's products. A software project needs to fit strategic goals and enable an organization to realize a vision through its software products. Making decisions on a

  4. Metric-based Evaluation of Implemented Software Architectures

    NARCIS (Netherlands)

    Bouwers, E.M.

    2013-01-01

    Software systems make up an important part of our daily lives. Just like all man-made objects, the possibilities of a software system are constrained by the choices made during its creation. The complete set of these choices can be referred to as the software architecture of a system. Since the

  5. Application of software quality assurance methods in validation and maintenance of reactor analysis computer codes

    International Nuclear Information System (INIS)

    Reznik, L.

    1994-01-01

    Various computer codes employed at the Israel Electricity Company for preliminary reactor design analysis and fuel cycle scoping calculations have often been subject to program source modifications. Although most changes were due to computer or operating system compatibility problems, a number of significant modifications were due to model improvements and enhancements of algorithm efficiency and accuracy. With the growing acceptance of software quality assurance requirements and methods, a program of extensive testing of modified software has been adopted within the regular maintenance activities. In this work, a survey has been performed of various software quality assurance methods of software testing, which belong mainly to the two major categories of implementation-based ('white box') and specification-based ('black box') testing. The results of this survey exhibit a clear preference for specification-based testing. In particular, the equivalence class partitioning method and the boundary value method have been selected as especially suitable functional methods for testing reactor analysis codes. A separate study of software quality assurance methods and techniques has been performed in this work with the objective of establishing appropriate pre-test software specification methods. Two methods of software analysis and specification have been selected as the most suitable for this purpose: the data flow diagram method has been shown to be particularly valuable for performing the functional/procedural software specification, while entity-relationship diagrams have proved to be efficient for specifying the software data/information domain. The feasibility of these two methods has been analyzed in particular for software uncertainty analysis and overall code accuracy estimation. (author). 14 refs
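
    The two specification-based techniques singled out here lend themselves to a compact illustration. The sketch below is a hypothetical example, not taken from the cited work: it partitions an input range into equivalence classes and derives boundary-value test cases for a made-up validation routine.

    ```python
    # Illustrative only: the routine and its valid range are hypothetical.
    def fuel_enrichment_ok(enrichment_percent: float) -> bool:
        """Accepts enrichment values in the closed interval [0.7, 5.0]."""
        return 0.7 <= enrichment_percent <= 5.0

    # Equivalence classes: below range (invalid), inside range (valid), above range (invalid).
    # Boundary value analysis adds cases exactly at and just beyond each boundary.
    test_cases = [
        (0.69, False),  # just below lower bound -> invalid class
        (0.70, True),   # lower bound itself     -> valid class
        (2.50, True),   # representative interior value
        (5.00, True),   # upper bound itself
        (5.01, False),  # just above upper bound -> invalid class
    ]

    for value, expected in test_cases:
        assert fuel_enrichment_ok(value) == expected, f"failed for {value}"
    print("all equivalence-class / boundary-value cases passed")
    ```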

  6. Color Image Quality in Presentation Software

    Directory of Open Access Journals (Sweden)

    María S. Millán

    2008-11-01

    Full Text Available The color image quality of presentation programs is evaluated and measured using S-CIELAB and CIEDE2000 color difference formulae. A color digital image in its original format is compared with the same image already imported by the program and introduced as a part of a slide. Two widely used presentation programs—Microsoft PowerPoint 2004 for Mac and Apple's Keynote 3.0.2—are evaluated in this work.
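
    The abstract names the S-CIELAB and CIEDE2000 colour-difference measures without defining them. For orientation only, the basic CIELAB (CIE76) colour difference, which CIEDE2000 refines with additional weighting and rotation terms not reproduced here, is:

    ```latex
    \Delta E^{*}_{ab} = \sqrt{(\Delta L^{*})^{2} + (\Delta a^{*})^{2} + (\Delta b^{*})^{2}}
    ```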

  7. IEEE [Institute of Electrical and Electronics Engineers] standards and nuclear software quality engineering

    International Nuclear Information System (INIS)

    Daughtrey, T.

    1988-01-01

    Significant new nuclear-specific software standards have recently been adopted under the sponsorship of the American Nuclear Society and the American Society of Mechanical Engineers. The interest of the US Nuclear Regulatory Commission has also been expressed through their issuance of NUREG/CR-4640. These efforts all indicate a growing awareness of the need for thorough, referenceable expressions of the way to build in and evaluate quality in nuclear software. A broader professional perspective can be seen in the growing number of software engineering standards sponsored by the Institute of Electrical and Electronics Engineers (IEEE) Computer Society. This family of standards represents a systematic effort to capture professional consensus on quality practices throughout the software development life cycle. The only omission, the implementation phase, is treated by accepted American National Standards Institute or de facto standards for programming languages.

  8. ARCHITECTURE SOFTWARE SOLUTION TO SUPPORT AND DOCUMENT MANAGEMENT QUALITY SYSTEM

    Directory of Open Access Journals (Sweden)

    Milan Eric

    2010-12-01

    Full Text Available One of the bases of the JUS ISO 9000 series of standards is quality system documentation. The architecture of the quality system documentation depends on the complexity of the business system. Establishing efficient management of the quality system documentation is of great importance for the business system, both in the phase of introducing the quality system and in the further stages of its improvement. The study describes the architecture and capability of software solutions to support and manage the quality system documentation in accordance with the requirements of standards such as ISO 9001:2001, ISO 14001:2005, HACCP etc.

  9. Quality Systems Implementation in the Pharmaceutical Industry

    African Journals Online (AJOL)

    Nafiisah

    [Abstract fragment; only extraction fragments survive] The case-study company, Waypharma, supplies the overseas market; it is currently in the implementation phase of the ISO 9001:2000 Quality Management System and already complies with WHO GMP requirements. The paper presents a case study analysis of quality systems for the manufacture of a pharmaceutical product.

  10. Requirements for guidelines systems: implementation challenges and lessons from existing software-engineering efforts.

    Science.gov (United States)

    Shah, Hemant; Allard, Raymond D; Enberg, Robert; Krishnan, Ganesh; Williams, Patricia; Nadkarni, Prakash M

    2012-03-09

    A large body of work in the clinical guidelines field has identified requirements for guideline systems, but there are formidable challenges in translating such requirements into production-quality systems that can be used in routine patient care. Detailed analysis of requirements from an implementation perspective can be useful in helping define sub-requirements to the point where they are implementable. Further, additional requirements emerge as a result of such analysis. During such an analysis, study of examples of existing software-engineering efforts in non-biomedical fields can provide useful signposts to the implementer of a clinical guideline system. In addition to requirements described by guideline-system authors, comparative reviews of such systems, and publications discussing information needs for guideline systems and clinical decision support systems in general, we have incorporated requirements related to production-system robustness and functionality from publications in the business workflow domain, and we have drawn on our own experience in the development of the Proteus guideline system (http://proteme.org). The sub-requirements are discussed by conveniently grouping them into the categories used in the review of Isern and Moreno 2008. We cite previous work under each category, then provide sub-requirements under each category, and give examples of similar work in software-engineering efforts that have addressed a similar problem in a non-biomedical context. When analyzing requirements from the implementation viewpoint, knowledge of successes and failures in related software-engineering efforts can guide implementers in the choice of effective design and development strategies.

  11. EMMC guidance on quality assurance for academic materials modelling software engineering

    OpenAIRE

    European Materials Modelling Council

    2015-01-01

    Proposed recommendations for software development in LEIT projects. This document presents the advice of software owners, commercial and academic, on what academic software could do to generate better quality software, ready to be used by third parties.

  12. Cost Analysis of Poor Quality Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Jana Fabianová

    2017-02-01

    Full Text Available The issues of quality, the cost of poor quality and the factors affecting quality are crucial to maintaining competitiveness in business activities. The use of software applications and computer simulation enables more effective quality management. Simulation tools make it possible to incorporate the variability of several variables in experiments and to evaluate their combined impact on the final output. The article presents a case study focused on the possibility of using Monte Carlo computer simulation in the field of quality management. Two approaches for determining the cost of poor quality are introduced here. One takes a retrospective view, in which the cost of poor quality and the production process are calculated from historical data. The second approach uses the probabilistic characteristics of the input variables by means of simulation and provides a prospective view of the cost of poor quality. Simulation outputs in the form of tornado and sensitivity charts complement the risk analysis.
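
    As a rough sketch of the prospective, simulation-based approach described above - with invented distributions and cost figures, since the article's actual inputs are not given in the abstract - a Monte Carlo estimate of the cost of poor quality might look like this:

    ```python
    import random

    random.seed(42)  # reproducible illustration

    N = 100_000                 # number of simulated production runs
    results = []
    for _ in range(N):
        units_produced = random.gauss(10_000, 500)            # hypothetical output variability
        defect_rate    = random.triangular(0.01, 0.05, 0.02)  # min, max, mode
        rework_cost    = random.uniform(8.0, 12.0)            # cost per reworked unit
        scrap_share    = 0.3                                  # share of defects that cannot be reworked
        scrap_cost     = 25.0                                 # cost per scrapped unit

        defects = units_produced * defect_rate
        cost_of_poor_quality = (defects * (1 - scrap_share) * rework_cost
                                + defects * scrap_share * scrap_cost)
        results.append(cost_of_poor_quality)

    results.sort()
    mean = sum(results) / N
    p05, p95 = results[int(0.05 * N)], results[int(0.95 * N)]
    print(f"mean cost of poor quality: {mean:,.0f}")
    print(f"90% interval: {p05:,.0f} .. {p95:,.0f}")
    ```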

  13. The software improvement process - tools and rules to encourage quality

    International Nuclear Information System (INIS)

    Sigerud, K.; Baggiolini, V.

    2012-01-01

    The Applications section of the CERN accelerator controls group has decided to apply a systematic approach to quality assurance (QA), the 'Software Improvement Process' - SIP. This process focuses on three areas: the development process itself, suitable QA tools, and how to practically encourage developers to do QA. For each stage of the development process we have agreed on the recommended activities and deliverables, and identified tools to automate and support the task. For example, we do more code reviews. As peer reviews are resource intensive, we only do them for complex parts of a product. As a complement, we are using static code checking tools, like FindBugs and Checkstyle. We also encourage unit testing and have agreed on a minimum level of test coverage recommended for all products, measured using Clover. Each of these tools is well integrated with our IDE (Eclipse) and gives instant feedback to the developer about the quality of their code. The major challenges of SIP have been 1) agreeing on common standards and configurations, for example common code formatting and Javadoc documentation guidelines, and 2) encouraging the developers to do QA. To address the second point, we have successfully implemented 'SIP days', i.e. one day dedicated to QA work in which the whole group of developers participates, and 'Top/Flop' lists, clearly indicating the best and worst products with regard to SIP guidelines and standards, for example test coverage. This paper presents the SIP initiative in more detail, summarizing our experience over the past two years and our future plans. (authors)

  14. Quality of Service Attributes for Software as a Service

    Directory of Open Access Journals (Sweden)

    Lukas Burkon

    2013-07-01

    Full Text Available Software as a Service (SaaS) has been developing for over ten years and is reaching a mature level, where quality and its monitoring and management become significant. Although SaaS is derived from the ASP model, the SaaS background and architecture are different, and therefore SaaS quality management is also based on different concepts. This paper focuses on the difference between traditional IT outsourcing and SaaS and proposes a set of quality attributes appropriate for the management of SaaS quality.

  15. Software quality assurance plan for void fraction instrument

    International Nuclear Information System (INIS)

    Gimera, M.

    1994-01-01

    Waste Tank SY-101 has been the focus of extensive characterization work over the past few years. The waste continually generates gases, most notably hydrogen, which are periodically released from the waste. Gas can be trapped in tank waste in three forms: as void gas (bubbles), dissolved gas, or absorbed gas. Void fraction is the volume percentage of a given sample that is comprised of void gas. The void fraction instrument (VFI) acquires the data necessary to calculate void fraction. This document covers the product, Void Fraction Data Acquisition Software. The void fraction software being developed will have the ability to control the void fraction instrument hardware and acquire data necessary to calculate the void fraction in samples. This document provides the software quality assurance plan, verification and validation plan, and configuration management plan for developing the software for the instrumentation that will be used to obtain void fraction data from Tank SY-101

  16. A service oriented architecture for the implementation of the personal software process

    Directory of Open Access Journals (Sweden)

    Erick Salinas

    2011-06-01

    Full Text Available This work describes a service oriented architecture for a software application that facilitates the implementation of the Personal Software Process by a development team or an organization. Among the characteristics of this software that are important to mention are extensibility and technical environment independence. These characteristics facilitate the process of adding new tools to the software development process, integrating them into the Personal Software Process independently of the operating systems and programming languages being used. The implemented software undertakes the data collection necessary for the Personal Software Process almost automatically, since the administrator must only classify the errors that may occur when a particular programming language is used, among other small tasks. This ease of use helps to make the implementation of

  17. Increasing Responsibility to Customers through a Dynamic Quality Assurance System in Software Development

    Directory of Open Access Journals (Sweden)

    Cătălin Afrăsinei-Zevoianu

    2014-02-01

    Full Text Available The information explosion has inevitably led to the need to design and implement software solutions for the new information environments faced by any person, group, company or nation. Measuring software product quality was neglected for a long time, but now both producers and researchers recognize the importance of field testing and assessing software. In this context, the aim of the article is to propose, validate and demonstrate the importance of using dynamic indicators of software quality that reflect how quality shortcomings lead to gaps at each stage of the process of producing application software. The proposed method belongs to the field of applied research; it is intended as a specific instrument for the domain studied and as the means by which the required effectiveness of the assumed objectives is reached: increasing responsibility to customers through quality improvement. Moreover, by applying measures to improve product quality, a company aligns itself with the current trend of increasing responsibility to the client, which in turn is an important component of CSR.

  18. Evaluation of features to support safety and quality in general practice clinical software

    Directory of Open Access Journals (Sweden)

    Schattner Peter

    2011-05-01

    Full Text Available Abstract Background Electronic prescribing is now the norm in many countries. We wished to find out if clinical software systems used by general practitioners in Australia include features (functional capabilities and other characteristics) that facilitate improved patient safety and care, with a focus on quality use of medicines. Methods Seven clinical software systems used in general practice were evaluated. Fifty software features that were previously rated as likely to have a high impact on safety and/or quality of care in general practice were tested and are reported here. Results The range of results for the implementation of 50 features across the 7 clinical software systems was as follows: 17-31 features (34-62%) were fully implemented, 9-13 (18-26%) partially implemented, and 9-20 (18-40%) not implemented. Key findings included: Access to evidence based drug and therapeutic information was limited. Decision support for prescribing was available but varied markedly between systems. During prescribing there was potential for medicine mis-selection in some systems, and linking a medicine with its indication was optional. The definition of 'current medicines' versus 'past medicines' was not always clear. There were limited resources for patients, and some medicines lists for patients were suboptimal. Results were provided to the software vendors, who were keen to improve their systems. Conclusions The clinical systems tested lack some of the features expected to support patient safety and quality of care. Standards and certification for clinical software would ensure that safety features are present and that there is a minimum level of clinical functionality that clinicians could expect to find in any system.

  19. Evaluation of features to support safety and quality in general practice clinical software

    Science.gov (United States)

    2011-01-01

    Background Electronic prescribing is now the norm in many countries. We wished to find out if clinical software systems used by general practitioners in Australia include features (functional capabilities and other characteristics) that facilitate improved patient safety and care, with a focus on quality use of medicines. Methods Seven clinical software systems used in general practice were evaluated. Fifty software features that were previously rated as likely to have a high impact on safety and/or quality of care in general practice were tested and are reported here. Results The range of results for the implementation of 50 features across the 7 clinical software systems was as follows: 17-31 features (34-62%) were fully implemented, 9-13 (18-26%) partially implemented, and 9-20 (18-40%) not implemented. Key findings included: Access to evidence based drug and therapeutic information was limited. Decision support for prescribing was available but varied markedly between systems. During prescribing there was potential for medicine mis-selection in some systems, and linking a medicine with its indication was optional. The definition of 'current medicines' versus 'past medicines' was not always clear. There were limited resources for patients, and some medicines lists for patients were suboptimal. Results were provided to the software vendors, who were keen to improve their systems. Conclusions The clinical systems tested lack some of the features expected to support patient safety and quality of care. Standards and certification for clinical software would ensure that safety features are present and that there is a minimum level of clinical functionality that clinicians could expect to find in any system.

  20. The software product assurance metrics study: JPL's software systems quality and productivity

    Science.gov (United States)

    Bush, Marilyn W.

    1989-01-01

    The findings are reported of the Jet Propulsion Laboratory (JPL)/Software Product Assurance (SPA) Metrics Study, conducted as part of a larger JPL effort to improve software quality and productivity. Until recently, no comprehensive data had been assembled on how JPL manages and develops software-intensive systems. The first objective was to collect data on software development from as many projects and for as many years as possible. Results from five projects are discussed. These results reflect 15 years of JPL software development, representing over 100 data points (systems and subsystems), over a third of a billion dollars, over four million lines of code and 28,000 person months. Analysis of this data provides a benchmark for gauging the effectiveness of past, present and future software development work. In addition, the study is meant to encourage projects to record existing metrics data and to gather future data. The SPA long term goal is to integrate the collection of historical data and ongoing project data with future project estimations.

  1. Framework to measure and maintain the quality of software using the concept of Code Readability

    OpenAIRE

    Rambabu P; Kumar J; Praneeth S; Jyothi B

    2011-01-01

    The present-day software industry uses software metrics to estimate the complexity of software systems for purposes such as software cost estimation, software development control, software testing, software assurance and software maintenance. The relationship between a simple set of local code features and the human concept of readability can be derived from data collected from 120 human annotators. This paper presents the concept of code readability and investigates its relation to software quality and a...

  2. Real-time Kernel Implementation Practice Program for Embedded Software Engineers' Education and its Evaluation

    Science.gov (United States)

    Yoshida, Toshio; Matsumoto, Masahide; Seo, Katsuhiko; Chino, Shinichiro; Sugino, Eiji; Sawamoto, Jun; Koizumi, Hisao

    A real-time kernel (henceforth RTK) is at the center of embedded software technology, and an understanding of the RTK is indispensable for embedded system design. To implement an RTK, it is necessary to understand the languages in which the RTK program code is written, system programming conventions, software development tools, the CPU on which the RTK runs and the interface between software and hardware, etc., in addition to understanding the RTK itself. This means that the RTK implementation process largely covers the embedded software implementation process. Therefore, an RTK implementation practice program is thought to be very effective as a means of acquiring common embedded software skills, in addition to a deeper understanding of the RTK itself. In this paper, we propose applying an RTK implementation practice program to the education of embedded software engineers. We newly developed a very small, step-up type RTK named μK for educational use, and held a seminar that used μK as teaching material for students of information science and engineers from a software house. As a result, we confirmed that the RTK implementation practice program is very effective for the acquisition of common embedded software skills.

  3. Quality of drug interaction alerts in prescribing and dispensing software.

    Science.gov (United States)

    Sweidan, Michelle; Reeve, James F; Brien, Jo-anne E; Jayasuriya, Pradeep; Martin, Jennifer H; Vernon, Graeme M

    2009-03-02

    To investigate the quality of drug interaction decision support in selected prescribing and dispensing software systems, and to compare this information with that found in a range of reference sources. A comparative study, conducted between June 2006 and February 2007, of the support provided for making decisions about 20 major and 20 minor drug interactions in six prescribing and three dispensing software systems used in primary care in Australia. Five electronic reference sources were evaluated for comparison. Sensitivity, specificity and quality of information; for major interactions: whether information on clinical effects, timeframe and pharmacological mechanism was included, whether management advice was helpful, and succinctness. Six of the nine software systems had a sensitivity rate > or = 90%, detecting most of the major interactions. Only 3/9 systems had a specificity rate of > or = 80%, with other systems providing inappropriate or unhelpful alerts for many minor interactions. Only 2/9 systems provided adequate information about clinical effects for more than half the major drug interactions, and 1/9 provided useful management advice for more than half of these. The reference sources had high sensitivity and in general provided more comprehensive clinical information than the software systems. Drug interaction decision support in commonly used prescribing and dispensing software has significant shortcomings.
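
    The sensitivity and specificity rates quoted above follow their usual definitions; the short illustration below uses made-up alert counts, not the study's data.

    ```python
    def sensitivity_specificity(true_pos, false_neg, true_neg, false_pos):
        """Sensitivity: share of real (major) interactions the system alerted on.
        Specificity: share of minor/irrelevant pairs for which no alert was raised."""
        sensitivity = true_pos / (true_pos + false_neg)
        specificity = true_neg / (true_neg + false_pos)
        return sensitivity, specificity

    # Hypothetical example: a system alerts on 18 of 20 major interactions
    # and stays silent on 15 of 20 minor ones.
    sens, spec = sensitivity_specificity(true_pos=18, false_neg=2, true_neg=15, false_pos=5)
    print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")  # 90%, 75%
    ```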

  4. Approved Air Quality Implementation Plans in Region 10

    Science.gov (United States)

    Landing page for information about EPA-approved air quality State Implementation Plans (SIPs), Tribal Implementation Plans (TIPs), and Federal Implementation Plans (FIPs) in Alaska, Idaho, Oregon, Washington.

  5. Quality of Service Attributes for Software as a Service

    OpenAIRE

    Lukas Burkon

    2013-01-01

    Software as a Service (SaaS) has been developing for over ten years and is reaching a mature level, where quality and its monitoring and management become significant. Although SaaS is derived from the ASP model, the SaaS background and architecture are different, and therefore SaaS quality management is also based on different concepts. This paper is focused on the difference between traditional IT outsourcing and SaaS and proposes a set of quality attributes appropriate for the management of th...

  6. UrQt: an efficient software for the Unsupervised Quality trimming of NGS data.

    Science.gov (United States)

    Modolo, Laurent; Lerat, Emmanuelle

    2015-04-29

    Quality control is a necessary step of any Next Generation Sequencing analysis. Although customary, this step still requires manual interventions to empirically choose tuning parameters according to various quality statistics. Moreover, current quality control procedures that provide a "good quality" data set, are not optimal and discard many informative nucleotides. To address these drawbacks, we present a new quality control method, implemented in UrQt software, for Unsupervised Quality trimming of Next Generation Sequencing reads. Our trimming procedure relies on a well-defined probabilistic framework to detect the best segmentation between two segments of unreliable nucleotides, framing a segment of informative nucleotides. Our software only requires one user-friendly parameter to define the minimal quality threshold (phred score) to consider a nucleotide to be informative, which is independent of both the experiment and the quality of the data. This procedure is implemented in C++ in an efficient and parallelized software with a low memory footprint. We tested the performances of UrQt compared to the best-known trimming programs, on seven RNA and DNA sequencing experiments and demonstrated its optimality in the resulting tradeoff between the number of trimmed nucleotides and the quality objective. By finding the best segmentation to delimit a segment of good quality nucleotides, UrQt greatly increases the number of reads and of nucleotides that can be retained for a given quality objective. UrQt source files, binary executables for different operating systems and documentation are freely available (under the GPLv3) at the following address: https://lbbe.univ-lyon1.fr/-UrQt-.html .
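
    UrQt's actual method is a probabilistic segmentation whose details are not given in the abstract, so the sketch below is only a simplified stand-in: it keeps the longest contiguous run of bases at or above a phred threshold, conveying the idea of one informative segment framed by unreliable ends.

    ```python
    def trim_longest_good_segment(phred_scores, threshold=20):
        """Return (start, end) of the longest run of scores >= threshold.

        Simplified illustration only -- UrQt itself uses a probabilistic
        segmentation criterion, not a plain threshold run.
        """
        best = (0, 0)
        start = None
        for i, q in enumerate(phred_scores):
            if q >= threshold:
                if start is None:
                    start = i
            else:
                if start is not None and i - start > best[1] - best[0]:
                    best = (start, i)
                start = None
        if start is not None and len(phred_scores) - start > best[1] - best[0]:
            best = (start, len(phred_scores))
        return best

    scores = [2, 11, 35, 38, 36, 40, 37, 12, 9, 3]
    s, e = trim_longest_good_segment(scores)
    print(f"keep positions {s}..{e - 1}")  # keep positions 2..6
    ```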

  7. Recommendations for a software quality assurance policy for the nuclear waste disposal risk assessment programme

    International Nuclear Information System (INIS)

    Hill, I.E.

    1985-05-01

    This study reviewed a number of published standards for software quality assurance, and included a series of interviews with software developers aimed at exploring their attitudes to software quality assurance. Recommendations for software quality assurance policy are made based on the above investigations. This document provides a summary of the recommendations made in the full report on project, reference MR-CDS-4. (author)

  8. A direct implementation for influence lines in finite element software

    DEFF Research Database (Denmark)

    Jepsen, Michael S.; Damkilde, Lars

    2014-01-01

    The use of influence lines is a recognized method for determining the critical design load conditions and this paper shows a direct method for applying influence lines in any structural finite element software. The main idea is to equate displacement or angular discontinuities with nodal forces...
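
    The abstract is cut off before the method is spelled out, so the following is background only: the classical Müller-Breslau principle behind influence lines states that the influence line of a force quantity is the deflected shape obtained by releasing that quantity and imposing a corresponding unit discontinuity. A hedged finite element reading of "equating discontinuities with nodal forces" - not necessarily the paper's exact formulation - is:

    ```latex
    % Hedged sketch, not the paper's exact formulation:
    % K        - global stiffness matrix of the model with the relevant restraint released
    % f_Delta  - consistent nodal load vector equivalent to a unit displacement or
    %            angular discontinuity at the section of interest
    % u        - resulting nodal displacements; interpolation by the element
    %            shape functions N(x) gives the influence line eta(x)
    K\,u = f_{\Delta}, \qquad \eta(x) = N(x)\,u
    ```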

  9. Development and implementation of own software for dosimetry multichannel film

    International Nuclear Information System (INIS)

    Jimenez Feltstrom, D.; Reyes Garcia, R.; Luis Simon, F. J.; Carrasco Herrera, M.; Sanchez Carmona, G.; Herrador Cordoba, M.

    2013-01-01

    The objective of this work is to develop in-house software for multichannel dosimetry with Radiochromic EBT2 film; to compare the results obtained with its use in multichannel and single-channel dosimetry; and to check that multichannel dosimetry eliminates much of the artifacts caused by dirt, fingerprints, scratches, etc. on the radiochromic film and scanner devices. (Author)

  10. Software Tools for Electrical Quality Assurance in the LHC

    CERN Document Server

    Bednarek, Mateusz

    2011-01-01

    There are over 1600 superconducting magnet circuits in the LHC machine. Many of them consist of a large number of components electrically connected in series. This enhances the sensitivity of the whole circuits to electrical faults of individual components. Furthermore, circuits are equipped with a large number of instrumentation wires, which are exposed to accidental damage or swapping. In order to ensure safe operation, an Electrical Quality Assurance (ELQA) campaign is needed after each thermal cycle. Due to the complexity of the circuits, as well as their distant geographical distribution (tunnel of 27km circumference divided in 8 sectors), suitable software and hardware platforms had to be developed. The software combines an Oracle database, LabView data acquisition applications and PHP-based web follow-up tools. This paper describes the software used for the ELQA of the LHC.

  11. First statistical analysis of Geant4 quality software metrics

    Science.gov (United States)

    Ronchieri, Elisabetta; Grazia Pia, Maria; Giacomini, Francesco

    2015-12-01

    Geant4 is a simulation system of particle transport through matter, widely used in several experimental areas from high energy physics and nuclear experiments to medical studies. Some of its applications may involve critical use cases; therefore they would benefit from an objective assessment of the software quality of Geant4. In this paper, we provide a first statistical evaluation of software metrics data related to a set of Geant4 physics packages. The analysis aims at identifying risks for Geant4 maintainability, which would benefit from being addressed at an early stage. The findings of this pilot study set the grounds for further extensions of the analysis to the whole of Geant4 and to other high energy physics software systems.

  12. Automated Theorem Proving in High-Quality Software Design

    Science.gov (United States)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e. a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the area of aviation, (nuclear) power plants or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the area of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in this area are not only subject to safety considerations, but also security issues are important. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in Software Engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like the illegal "cloning" of smart cards of D2 GSM mobile phones, or the extraction of (secret) passwords from German T-online users, show that serious flaws can also happen in this area. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high quality of the software and lead to truly safe and secure systems. In this paper, we will look at how far automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  13. Implementing Extreme Programming in Distributed Software Project Teams: Strategies and Challenges

    Science.gov (United States)

    Maruping, Likoebe M.

    Agile software development methods and distributed forms of organizing teamwork are two team process innovations that are gaining prominence in today's demanding software development environment. Individually, each of these innovations has yielded gains in the practice of software development. Agile methods have enabled software project teams to meet the challenges of an ever turbulent business environment through enhanced flexibility and responsiveness to emergent customer needs. Distributed software project teams have enabled organizations to access highly specialized expertise across geographic locations. Although much progress has been made in understanding how to more effectively manage agile development teams and how to manage distributed software development teams, managers have little guidance on how to leverage these two potent innovations in combination. In this chapter, I outline some of the strategies and challenges associated with implementing agile methods in distributed software project teams. These are discussed in the context of a study of a large-scale software project in the United States that lasted four months.

  14. Quantification frameworks and their application for evaluating the software quality factor using quality characteristic value

    International Nuclear Information System (INIS)

    Kim, C.; Chung, C.H.; Won-Ahn, K.

    2004-01-01

    Many problems related to safety frequently occur because Digital Instrumentation and Control Systems are widely used and are expanding their range of applications in Nuclear Power Plants. However, there is no generally accepted way to estimate an appropriate level of software quality. Thus, the Quality Characteristic Value, a software quality factor evaluated through each phase of the software life cycle, is suggested in this paper. The Quality Characteristic Value is obtained by the following procedure: 1) scoring the Quality Characteristic Factors (especially correctness, traceability, completeness, and understandability) against the Software Verification and Validation results, 2) deriving a diamond-shaped graph by plotting the value of each Factor on its own axis and connecting the points, and lastly 3) measuring the area of the graph to obtain the Quality Characteristic Value. In this paper, this methodology is applied to a Plant Control System. In addition, the series of quantification frameworks exhibits some good characteristics from the viewpoint of the software quality factor. Above all, it is believed that the introduced framework may be applicable to regulatory guides and software approval procedures, due to its soundness and simple characteristics. (authors)
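
    Read literally, step 3 measures the area of a four-axis (diamond) chart whose axes carry the four factor scores. Assuming orthogonal axes and normalised scores - the abstract does not state the actual scales - the area can be computed as below.

    ```python
    def quality_characteristic_value(correctness, traceability, completeness, understandability):
        """Area of a 4-axis radar ('diamond') chart with axes 90 degrees apart.

        Scores are assumed to be normalised to [0, 1]; the scales used in the
        paper are not stated in the abstract.
        """
        v = [correctness, traceability, completeness, understandability]
        # Sum of the four right triangles between adjacent axes:
        # 1/2 * v_i * v_{i+1} * sin(90 degrees) = 1/2 * v_i * v_{i+1}.
        return 0.5 * sum(v[i] * v[(i + 1) % 4] for i in range(4))

    qcv = quality_characteristic_value(0.9, 0.8, 0.7, 0.85)
    print(f"Quality Characteristic Value (area): {qcv:.3f}")
    ```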

  15. Software support for environmental measurement in quality at educational institutions

    Directory of Open Access Journals (Sweden)

    Alena Pauliková

    2016-03-01

    Full Text Available The theme analysed in this article is the training of environmental measurement for workplaces, which is very important for sustainable quality at technical educational institutions. The software applications taught at technical educational institutions have to provide professional and methodical knowledge about the conditions of the working environment to students of selected technical specialisations, so that graduates, after entering professional practice, are able to participate in solving actual problems related to environmental protection by means of software support. Nowadays it is also obligatory to introduce technical science into the training process. Taking the above-mentioned facts into consideration, information technology support for environmental study subjects is a relevant aspect which should be integrated into the university educational process. There is effective progress that further highlights the focus on the quality of university education, not only for environmental engineers. Current trends require an increasing number of software/hardware-educated engineers who can participate in high-quality university preparation, i.e. IT environmentalists. The Department of Environmental Engineering at the Faculty of Mechanical Engineering, Technical University in Košice, Slovakia is an institution specified and intended for quality objectivisation. This institution introduced into its study programmes ("Environmental Management" and "Technology of Environmental Protection") study subjects with software support oriented towards outdoor and indoor environments, and in this way the Department of Process and Environmental Engineering is integrated effectively and intensively into the area of measurement training with regard to the requirement of quality educational processes.

  16. Gamification Can It Increase the Quantity and Quality of Software

    Science.gov (United States)

    2012-04-25

    [Presentation slides; only fragments survive indexing] Paul Flanagan, "Gamification? Can it increase the quantity and quality of software?", presented Wednesday, April 25, 2012, 11:00-11:45 AM. The surviving fragments list areas where gamification techniques could prove useful, including education, IT services (help desks and network administrators), and businesses and marketing, and cite Business Data, retrieved from http://www.technologyreview.com/video/?vid=664 on April 12, 2011.

  17. An approach to software quality assurance for robotic inspection systems

    International Nuclear Information System (INIS)

    Kiebel, G.R.

    1993-10-01

    Software quality assurance (SQA) for robotic systems used in nuclear waste applications is vital to ensure that the systems operate safely and reliably and pose a minimum risk to humans and the environment. This paper describes the SQA approach for the control and data acquisition system for a robotic system being developed for remote surveillance and inspection of underground storage tanks (UST) at the Hanford Site

  18. Quality system implementation for nuclear analytical techniques

    International Nuclear Information System (INIS)

    2004-01-01

    The international effort (UNIDO, ILAC, BIPM, etc.) to establish a functional infrastructure for metrology and accreditation in many developing countries needs to be complemented by assistance in implementing high quality practices and high quality output by service providers and producers in the respective countries. Knowledge of how to approach QA systems that justify formal accreditation is available in only a few countries, and the dissemination of know-how and the development of skills are needed bottom-up from the working level of laboratories and institutes. Awareness building, convincing of management, introduction of good management practices, technical expertise and good documentation will lead to the creation of a quality culture that assures sustainability and the inherent development of quality practices as a prerequisite of economic success. Quality assurance and quality control can be used as a valuable management tool and are a prerequisite for international trade and information exchange. This publication tries to assist quality managers, laboratory managers and staff involved in setting up a QA/QC system in a nuclear analytical laboratory to take appropriate action to start and complete the necessary steps for a successful quality system for ultimate national accreditation. This guidebook contributes to a better understanding of the basic ideas behind ISO/IEC 17025, the international standard for 'General requirements for the competence of testing and calibration laboratories'. It provides basic information and detailed explanation about the establishment of a QC system in analytical and nuclear analytical laboratories. It is suitable training material for the training of trainers and familiarizes managers with QC management and implementation. This training material aims to facilitate the implementation of internationally accepted quality principles and to promote attempts by Member States' laboratories to obtain accreditation for nuclear analytical

  19. Design, Implementation, and Performance of CREAM Data Acquisition Software

    CERN Document Server

    Zinn, S Y; Bagliesi, M G; Beatty, J J; Childers, J T; Coutu, S; Duvernois, M A; Ganel, O; Kim, H J; Lee, M H; Lutz, L; Malinine, A; Maestro, P; Marrocchesi, P S; Park, I H; Seo, E S; Song, C; Swordy, S; Wu, J

    2005-01-01

    Cosmic Ray Energetics and Mass (CREAM) is a balloon-borne experiment scheduled for launching from Antarctica in late 2004. Its aim is to measure the energy spectrum and composition of cosmic rays from proton to iron nuclei at ultra high energies from 1 to 1,000 TeV. Ultra long duration balloons are expected to fly about 100 days. One special feature of the CREAM data acquisition software (CDAQ) is the telemetric operation of the instrument using satellites. During a flight the science event and housekeeping data are sent from the instrument to a ground facility. Likewise, commands for controlling both the hardware and the software are uploaded from the ground facility. This requires a robust, reliable, and fast software system. CDAQ has been developed and tested during three beam tests at CERN in July, September, and November 2003. Recently the interfaces to the transition radiation detector (TRD) and to the timing-based charge detector (TCD) have been added. These new additions to CDAQ will be checked at a t...

  20. Hanford Tanks Initiative quality assurance implementation plan

    International Nuclear Information System (INIS)

    Huston, J.J.

    1998-01-01

    Hanford Tanks Initiative (HTI) Quality Assurance Implementation Plan for Nuclear Facilities defines the controls for the products and activities developed by HTI. Project Hanford Management Contract (PHMC) Quality Assurance Program Description (QAPD)(HNF-PRO599) is the document that defines the quality requirements for Nuclear Facilities. The QAPD provides direction for compliance to 10 CFR 830.120 Nuclear Safety Management, Quality Assurance Requirements. Hanford Tanks Initiative (HTI) is a five-year activity resulting from the technical and financial partnership of the US Department of Energy's Office of Waste Management (EM-30), and Office of Science and Technology Development (EM-50). HTI will develop and demonstrate technologies and processes for characterization and retrieval of single shell tank waste. Activities and products associated with HTI consist of engineering, construction, procurement, closure, retrieval, characterization, and safety and licensing

  1. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes from the SERVQUAL method the five dimensions of requirements and three dimensions of characteristics, and from the QFD method the application methodology. The originality of the SQ model consists in computing a global index that reflects the level to which the quality characteristics fulfil the customers' requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.

  2. The PARETO RATING Software System for the Pareto-Approximation Quality Assessment in Multi-criteria Optimization Problem

    Directory of Open Access Journals (Sweden)

    S. V. Groshev

    2014-01-01

    Full Text Available We consider the task of assessing the quality of a numerical approximation of the Pareto set (front) in a multi-criteria optimization (MOC) problem, where the Pareto approximation is obtained by some population-based algorithm, e.g. a genetic algorithm. The ultimate purpose of the work is a comparative assessment of the efficiency of population-based Pareto-approximation algorithms. A large number of characteristics (indicators) of Pareto-approximation quality have been developed, so the assessment of Pareto-approximation quality is itself a multi-criteria (multi-indicator) problem. A number of well-known software systems solve this assessment problem to varying degrees; a common drawback of these systems is the lack of both a web interface and support for multi-indicator assessment of Pareto-approximation quality (although they can compute the values of a large number of individual indicators). The PARETO RATING software system is intended to eliminate these shortcomings. Since population-based Pareto-approximation algorithms are, as a rule, stochastic, we also consider statistical methods for assessing the quality of two or more Pareto approximations (and thereby the algorithms used to obtain them): methods based on ranking the approximations, methods based on quality indicators, and methods based on so-called empirical attainment functions. We give a formal statement of the MOC problem and a general scheme of the population-based algorithms for its solution, review known indicators of Pareto-approximation quality and statistical methods for their assessment, describe the system architecture and the main features of its software implementation, and illustrate the efficiency of the algorithmic and software solutions.
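
    As an illustration of the indicator-based methods mentioned above, the sketch below computes the two-set coverage indicator C(A, B), i.e. the fraction of solutions in approximation B that are weakly dominated by at least one solution in A (assuming minimization of all objectives). This is a generic, minimal example and is not taken from the PARETO RATING implementation; the function and variable names are illustrative only.

```python
# Minimal sketch of the two-set coverage indicator C(A, B) for minimization
# problems. Not taken from the PARETO RATING system; names are illustrative.
from typing import Sequence

def weakly_dominates(a: Sequence[float], b: Sequence[float]) -> bool:
    """True if objective vector a is no worse than b in every objective."""
    return all(ai <= bi for ai, bi in zip(a, b))

def coverage(A: Sequence[Sequence[float]], B: Sequence[Sequence[float]]) -> float:
    """Fraction of points in B weakly dominated by at least one point in A."""
    if not B:
        return 0.0
    dominated = sum(1 for b in B if any(weakly_dominates(a, b) for a in A))
    return dominated / len(B)

# Example: compare two Pareto-front approximations of a bi-objective problem.
run_A = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
run_B = [(1.5, 4.5), (2.5, 2.5), (5.0, 1.5)]
print(coverage(run_A, run_B))  # 1.0 -> every point of B is covered by A
print(coverage(run_B, run_A))  # 0.0 -> A is not covered by B
```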

  3. Software quality for 1997 - what works and what doesn't?

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Burlington, MA (United States)]

    1997-11-01

    This presentation provides a view of software quality for 1997 - what works and what doesn't. For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels.

  4. Freight Advanced Traveler Information System (FRATIS) Dallas-Fort Worth : software architecture design and implementation options.

    Science.gov (United States)

    2013-05-01

    This document describes the Software Architecture Design and Implementation Options for the FRATIS system. The demonstration component of this task will serve to test the technical feasibility of the FRATIS prototype while also facilitating the colle...

  5. Software development to implement the TxDOT culvert rating guide.

    Science.gov (United States)

    2013-05-01

    This implementation project created CULVLR: Culvert Load Rating, Version 1.0.0, a Windows-based desktop application software package that automates the process by which Texas Department of Transportation (TxDOT) engineers and their consultants ...

  6. Opening Up Architectures of Software-Intensive Systems: A First Prototype Implementation

    National Research Council Canada - National Science Library

    Charland, Philippe; Dessureault, Dany; Ouellet, David; Lizotte, Michel

    2007-01-01

    .... One approach to deal with this problem is information hiding. This technical memorandum presents a prototype which implements this technique to reverse engineer dynamic models from Java software systems...

  7. NARAC SOFTWARE QUALITY ASSURANCE: ADAPTING FORMALISM TO MEET VARYING NEEDS

    Energy Technology Data Exchange (ETDEWEB)

    Walker, H; Nasstrom, J S; Homann, S G

    2007-11-20

    The National Atmospheric Release Advisory Center (NARAC) provides tools and services that predict and map the spread of hazardous material accidentally or intentionally released into the atmosphere. NARAC is a full function system that can meet a wide range of needs with a particular focus on emergency response. The NARAC system relies on computer software in the form of models of the atmosphere and related physical processes supported by a framework for data acquisition and management, user interface, visualization, communications and security. All aspects of the program's operations and research efforts are predicated to varying degrees on the reliable and correct performance of this software. Consequently, software quality assurance (SQA) is an essential component of the NARAC program. The NARAC models and system span different levels of sophistication, fidelity and complexity. These different levels require related but different approaches to SQA. To illustrate this, two different levels of software complexity are considered in this paper. As a relatively simple example, the SQA procedures that are being used for HotSpot, a straight-line Gaussian model focused on radiological releases, are described. At the other extreme, the SQA issues that must be considered and balanced for the more complex NARAC system are reviewed.
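
    For context on the simpler end of the spectrum described above, the sketch below shows the textbook straight-line Gaussian plume concentration formula of the kind that underlies models such as HotSpot. It is a generic illustration, not NARAC or HotSpot code; the dispersion coefficients sigma_y and sigma_z are passed in rather than derived from stability classes, and all names are illustrative.

```python
import math

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Steady-state straight-line Gaussian plume concentration (mass per m^3).

    Q        -- source strength (mass per second)
    u        -- mean wind speed (m/s) along the plume axis
    y, z     -- crosswind and vertical receptor coordinates (m)
    H        -- effective release height (m)
    sigma_y, sigma_z -- dispersion coefficients (m) at the downwind distance
                        of interest (normally taken from stability-class curves)
    Includes the usual ground-reflection term.
    """
    lateral = math.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2.0 * sigma_z**2)) +
                math.exp(-(z + H)**2 / (2.0 * sigma_z**2)))
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Example: 1 g/s release at 10 m height, receptor on the plume centreline at
# ground level, with sigma_y = 30 m and sigma_z = 15 m.
print(gaussian_plume(Q=1.0, u=5.0, y=0.0, z=0.0, H=10.0,
                     sigma_y=30.0, sigma_z=15.0))
```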

  8. Quality assurance (QA) procedures for software: Evaluation of an ADC quality system

    International Nuclear Information System (INIS)

    Efstathopoulos, E. P.; Benekos, O.; Molfetas, M.; Charou, E.; Kottou, S.; Argentos, S.; Kelekis, N. L.

    2005-01-01

    Image viewing and processing software in computed radiography manipulates image contrast in such a way that all relevant image features are rendered to an appropriate degree of visibility, and improves image quality using enhancement algorithms. The purpose of this study was to investigate procedures for the quality assessment of image processing software for computed radiography with the use of existing test objects and to assess the influence that processing introduces on physical image quality characteristics. Measurements of high-contrast resolution, low-contrast resolution, spatial resolution, grey scale (characteristic curve) and geometric distortion were performed 'subjectively' by three independent observers and 'objectively' by the use of criteria based on pixel intensity values. Results show quality assessment is possible without the need for human evaluators, using digital images. It was discovered that the processing software evaluated in this study was able to improve some aspects of image quality, without introducing geometric distortion. (authors)

  9. Quality factors in the life cycle of software oriented to safety systems in nuclear power plants

    International Nuclear Information System (INIS)

    Nunez McLeod, J.E.; Rivera, S.S.

    1997-01-01

    The inclusion of software in safety-related systems for nuclear power plants makes it necessary to introduce the concept of software quality assurance. Software quality can be defined as the degree of conformance between the software and the specified requirements and user expectations. To guarantee a certain level of software quality it is necessary to carry out a systematic and planned set of tasks, which constitute a software quality assurance plan. The application of such a plan involves activities that should be performed all along the software life cycle, and that can be evaluated through the so-called quality factors, since quality itself cannot be measured directly, but only indirectly through some of its manifestations. In this work, a software life cycle model is proposed for nuclear power plant safety-related systems. A set of software quality factors is also proposed, with its corresponding classification according to the proposed model. (author) [es

  10. Implementing Resource-aware Multicast Forwarding in Software Defined Networks

    DEFF Research Database (Denmark)

    Poderys, Justas; Sunny, Anjusha; Soler, José

    2018-01-01

    Defined Networks (SDN), all this information is available in a centralized entity - the SDN network. This work proposes to utilize the SDN paradigm to perform network-resources aware multicast data routing in the SDN controller. In a prototype implementation, multicast data is routed using a modified Edmonds...

  11. Shear beams in finite element modelling : Software implementation and validation

    NARCIS (Netherlands)

    Schreppers, G.J.; Hendriks, M.A.N.; Boer, A.; Ferreira, D.; Kikstra, W.P.

    2015-01-01

    Fiber models for beam and shell elements allow for relatively rapid finite element analysis of concrete structures and structural elements. This project aims at the development of the formulation of such elements and a pilot implementation. The reduction of calculation time and degrees of freedom

  12. The influence of software filtering in digital mammography image quality

    International Nuclear Information System (INIS)

    Michail, C; Spyropoulou, V; Valais, I; Panayiotakis, G; Kalyvas, N; Fountos, G; Kandarakis, I; Dimitropoulos, N

    2009-01-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage in digital mammography is that images can be manipulated as simple computer image files. Thus non-dedicated commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software usually incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving an image quality parameter may result in degradation of another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.

  13. The influence of software filtering in digital mammography image quality

    Science.gov (United States)

    Michail, C.; Spyropoulou, V.; Kalyvas, N.; Valais, I.; Dimitropoulos, N.; Fountos, G.; Kandarakis, I.; Panayiotakis, G.

    2009-05-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage in digital mammography is that images can be manipulated as simple computer image files. Thus non-dedicated commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software usually incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving an image quality parameter may result in degradation of another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.
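
    The two records above assess resolution via the Modulation Transfer Function. As a generic illustration of how an MTF can be estimated from a measured edge profile (it is not the procedure used in these studies), the sketch below differentiates the edge spread function into a line spread function and Fourier-transforms it; the names and the synthetic edge are purely illustrative.

```python
import numpy as np

def mtf_from_edge(edge_profile, pixel_pitch_mm):
    """Estimate the MTF from a 1-D edge spread function (ESF).

    edge_profile   -- intensity values across a sharp edge (1-D array)
    pixel_pitch_mm -- detector pixel pitch in millimetres
    Returns (spatial frequencies in cycles/mm, normalized MTF values).
    """
    esf = np.asarray(edge_profile, dtype=float)
    lsf = np.gradient(esf)                      # line spread function
    lsf /= lsf.sum()                            # normalize area to 1
    mtf = np.abs(np.fft.rfft(lsf))              # modulus of the Fourier transform
    mtf /= mtf[0]                               # MTF(0) = 1 by definition
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)
    return freqs, mtf

# Example with a synthetic, slightly blurred edge sampled at 0.1 mm pixels.
x = np.linspace(-2, 2, 64)
edge = 1.0 / (1.0 + np.exp(-x / 0.15))          # smooth step ~ blurred edge
freqs, mtf = mtf_from_edge(edge, pixel_pitch_mm=0.1)
print(freqs[:5], mtf[:5])
```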

  14. Maximize Your Investment 10 Key Strategies for Effective Packaged Software Implementations

    CERN Document Server

    Beaubouef, Grady Brett

    2009-01-01

    This is a handbook covering ten principles for packaged software implementations that project managers, business owners, and IT developers should pay attention to. The book also has practical real-world coverage including a sample agenda for conducting business solution modeling, customer case studies, and a road map to implement guiding principles. This book is aimed at enterprise architects, development leads, project managers, business systems analysts, business systems owners, and anyone who wants to implement packaged software effectively. If you are a customer looking to implement COTS s

  15. Implementing a composition architecture for an online game software

    Directory of Open Access Journals (Sweden)

    Jaime Alberto Guzmán Luna

    2011-01-01

    Full Text Available This article presents an architecture for Web service composition in which the composition plan is built by a planning agent and can be executed concurrently with its own construction, so that the next action to execute is estimated on the fly instead of preparing a complete plan that would frequently be invalidated. This feature is especially valuable when addressing real-time problems. As a test domain, an online game software system called ENVIRO is proposed.

  16. Barriers integrating dedicated software for quality management in pain centres

    NARCIS (Netherlands)

    Ten Hoopen, A.J.; Zanstra, P.E.; Van der Haring, E.J.; Tuil, W.S.; van Wijhe, M.

    2006-01-01

    OBJECTIVES: Explore the feasibility of integrating a dedicated pain centre information system as part of a quality management network with a number of different Hospital Information Systems. MATERIAL & METHODS: A systematic approach integrating and implementing the system in 15 selected hospital

  17. Software for improving the quality of project management, a case study: international manufacture of electrical equipment

    Science.gov (United States)

    Preradović, D. M.; Mićić, Lj S.; Barz, C.

    2017-05-01

    Production conditions in today's world require software support at every stage of production and development of new products, for quality assurance and compliance with ISO standards. In addition to ISO standards as the usual quality metrics, companies today also focus on other optional standards, such as CMMI (Capability Maturity Model Integration), or prescribe their own standards. However, while intensive progress is being made in project management (PM), a significant number of projects worldwide still fail, missing their goals, budget or timeframe. This paper examines the role of software tools in the success rate of projects implemented in the case of an international manufacturer of electrical equipment. The results of this research show how much the project management software used to manage and develop new products contributes to improving PM processes and PM functions, and how the selection of software tools affects the quality of PM processes and the share of successfully completed projects.

  18. Intelligent surgical laser system configuration and software implementation

    Science.gov (United States)

    Hsueh, Chi-Fu T.; Bille, Josef F.

    1992-06-01

    An intelligent surgical laser system, which can help the ophthalmologist achieve higher precision and control during procedures, has been developed by ISL as model CLS 4001. In addition to the laser and laser delivery system, the system is equipped with a vision system (IPU), robotic motion control (MCU), and a closed-loop tracking system (ETS) that tracks the eye in three dimensions (X, Y and Z). The initial patient setup is computer controlled with guidance from the vision system, and the tracking system is automatically engaged when the target is in position. A multi-level tracking system, developed by integrating the vision and tracking systems, keeps the laser beam precisely on target. The capabilities of automatic eye setup and tracking in three dimensions provide improved accuracy and measurement repeatability. The system is operated through the Surgical Control Unit (SCU), which communicates with the IPU and the MCU through both Ethernet and RS232. Various scanning patterns (line, curve, circle, spiral, etc.) can be selected with given parameters. When a warning is activated, a voice message is played that normally requires a panel touch acknowledgement. The reliability of the system is ensured at three levels: (1) hardware, (2) real-time software monitoring, and (3) the user. The system is currently under clinical validation.

  19. Sci-Thur AM: Planning - 04: Evaluation of the fluence complexity, solution quality, and run efficiency produced by five fluence parameterizations implemented in PARETO multiobjective radiotherapy treatment planning software.

    Science.gov (United States)

    Champion, H; Fiege, J; McCurdy, B; Potrebko, P; Cull, A

    2012-07-01

    PARETO (Pareto-Aware Radiotherapy Evolutionary Treatment Optimization) is a novel multiobjective treatment planning system that performs beam orientation and fluence optimization simultaneously using an advanced evolutionary algorithm. In order to reduce the number of parameters involved in this enormous search space, we present several methods for modeling the beam fluence. The parameterizations are compared using innovative tools that evaluate fluence complexity, solution quality, and run efficiency. A PARETO run is performed using the basic weight (BW), linear gradient (LG), cosine transform (CT), beam group (BG), and isodose-projection (IP) methods for applying fluence modulation over the projection of the Planning Target Volume in the beam's-eye-view plane. The solutions of each run are non-dominated with respect to other trial solutions encountered during the run. However, to compare the solution quality of independent runs, each run competes against every other run in a round robin fashion. Score is assigned based on the fraction of solutions that survive when a tournament selection operator is applied to the solutions of the two competitors. To compare fluence complexity, a modulation index, fractal dimension, and image gradient entropy are calculated for the fluence maps of each optimal plan. We have found that the LG method results in superior solution quality for a spine phantom, lung patient, and cauda equina patient. The BG method produces solutions with the highest degree of fluence complexity. Most methods result in comparable run times. The LG method produces superior solution quality using a moderate degree of fluence modulation. © 2012 American Association of Physicists in Medicine.

  20. Using iKidTools™ Software Support Systems to Develop and Implement Self-Monitoring Interventions

    Science.gov (United States)

    Patti, Angela L.; Miller, Kevin J.

    2011-01-01

    Educational teams often are faced with the task of developing and implementing Behavioral Intervention Plans (BIPs) for students who present challenging and/or disruptive behaviors. This article describes the steps used to develop and implement a self-monitoring BIP that incorporated an innovative software system, iKidTools™. An authentic case…

  1. Development and case study of a science-based software platform to support policy making on air quality.

    Science.gov (United States)

    Zhu, Yun; Lao, Yanwen; Jang, Carey; Lin, Chen-Jen; Xing, Jia; Wang, Shuxiao; Fu, Joshua S; Deng, Shuang; Xie, Junping; Long, Shicheng

    2015-01-01

    This article describes the development and implementation of a novel software platform that supports real-time, science-based policy making on air quality through a user-friendly interface. The software, RSM-VAT, uses a response surface modeling (RSM) methodology and serves as a visualization and analysis tool (VAT) for three-dimensional air quality data obtained from atmospheric models. The software features a number of powerful and intuitive data visualization functions for illustrating the complex nonlinear relationship between emission reductions and air quality benefits. A case study of the contiguous U.S. demonstrates that the enhanced RSM-VAT is capable of reproducing the air quality model results with a Normalized Mean Bias <2% and of assisting air quality policy making in near real time. Copyright © 2014. Published by Elsevier B.V.
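
    The record above quotes a Normalized Mean Bias below 2% between the response-surface predictions and the full air quality model. As a reminder of how that statistic is commonly defined (this is the standard definition, not code from RSM-VAT; names and values are illustrative), NMB is the sum of the prediction-observation differences divided by the sum of the observations:

```python
import numpy as np

def normalized_mean_bias(predicted, reference):
    """Normalized Mean Bias: sum(predicted - reference) / sum(reference).

    Expressed here as a fraction; multiply by 100 for a percentage.
    """
    predicted = np.asarray(predicted, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return (predicted - reference).sum() / reference.sum()

# Example: surrogate-model ozone predictions vs. full-model values (ppb).
surrogate = [61.2, 55.8, 70.1, 48.9]
full_model = [60.0, 56.5, 69.0, 50.0]
print(f"NMB = {100 * normalized_mean_bias(surrogate, full_model):.2f}%")
```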

  2. Development and implementation of a 'Mental Health Finder' software tool within an electronic medical record system.

    Science.gov (United States)

    Swan, D; Hannigan, A; Higgins, S; McDonnell, R; Meagher, D; Cullen, W

    2017-02-01

    In Ireland, as in many other healthcare systems, mental health service provision is being reconfigured with a move toward more care in the community, and particularly primary care. Recording and surveillance systems for mental health information and activities in primary care are needed for service planning and quality improvement. We describe the development and initial implementation of a software tool ('mental health finder') within a widely used primary care electronic medical record system (EMR) in Ireland to enable large-scale data collection on the epidemiology and management of mental health and substance use problems among patients attending general practice. In collaboration with the Irish Primary Care Research Network (IPCRN), we developed the 'Mental Health Finder' as a software plug-in to a commonly used primary care EMR system to facilitate data collection on mental health diagnoses and pharmacological treatments among patients. The finder searches for and identifies patients based on diagnostic coding and/or prescribed medicines. It was initially implemented among a convenience sample of six GP practices. Prevalence of mental health and substance use problems across the six practices, as identified by the finder, was 9.4% (range 6.9-12.7%). 61.9% of identified patients were female; 25.8% were private patients. One-third (33.4%) of identified patients were prescribed more than one class of psychotropic medication. Of the patients identified by the finder, 89.9% were identifiable via prescribing data, 23.7% via diagnostic coding. The finder is a feasible and promising methodology for large-scale data collection on mental health problems in primary care.
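
    The finder described above identifies patients from diagnostic codes and/or prescribed medicines. The sketch below shows one generic way such a case-finding filter could be expressed; the record layout, field names and code prefixes are hypothetical and are not taken from the actual EMR plug-in.

```python
# Hypothetical sketch of a case-finding filter over EMR records.
# Field names and code prefixes are invented for illustration only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class PatientRecord:
    patient_id: str
    diagnosis_codes: List[str] = field(default_factory=list)   # e.g. ICD-style codes
    medications: List[str] = field(default_factory=list)       # e.g. ATC-style codes

def mental_health_finder(records, diagnosis_prefixes, medication_prefixes):
    """Return patients matching any diagnosis prefix OR any medication prefix."""
    matches = []
    for rec in records:
        by_diagnosis = any(code.startswith(p)
                           for code in rec.diagnosis_codes
                           for p in diagnosis_prefixes)
        by_medication = any(drug.startswith(p)
                            for drug in rec.medications
                            for p in medication_prefixes)
        if by_diagnosis or by_medication:
            matches.append(rec.patient_id)
    return matches

# Example with invented codes: "F" block diagnoses, "N05"/"N06" psychotropics.
records = [
    PatientRecord("p1", diagnosis_codes=["F32.1"]),
    PatientRecord("p2", medications=["N06AB06"]),
    PatientRecord("p3", diagnosis_codes=["I10"], medications=["C09AA02"]),
]
print(mental_health_finder(records, ["F"], ["N05", "N06"]))  # ['p1', 'p2']
```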

  3. Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN

    Science.gov (United States)

    Frederick, J. M.; Hammond, G. E.

    2017-12-01

    Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and changing funding sources (or a total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing their negative side effects? This presentation discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continue to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we will share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed will include how we've leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, as well as the unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. We will also be prepared to discuss the major challenges faced by most open-source software teams, such
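
    The automated testing mentioned above typically includes regression tests that compare new simulation output against stored "gold" results within a tolerance. The sketch below shows that general pattern only; it is not PFLOTRAN's actual test harness, and the file paths, format and tolerances are invented for illustration.

```python
# Generic sketch of a gold-standard regression test for a simulator.
# Not PFLOTRAN's test harness; file layout and tolerances are illustrative.
import numpy as np

def load_observations(path):
    """Load a two-column (time, value) whitespace-delimited output file."""
    return np.loadtxt(path)

def check_against_gold(new_path, gold_path, rel_tol=1e-6, abs_tol=1e-12):
    """Return True if the new run matches the stored gold output."""
    new = load_observations(new_path)
    gold = load_observations(gold_path)
    if new.shape != gold.shape:
        return False
    return np.allclose(new, gold, rtol=rel_tol, atol=abs_tol)

if __name__ == "__main__":
    ok = check_against_gold("run/breakthrough.dat", "gold/breakthrough.dat")
    print("PASS" if ok else "FAIL")
```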

  4. Effectiveness of Software Quality Assurance in Offshore Development Enterprises in Sri Lanka

    OpenAIRE

    Malinda G. Sirisena

    2014-01-01

    The aim of this research is to evaluate the effectiveness of the software quality assurance approaches of Sri Lankan offshore software development organizations, and to propose a framework which could be used across all offshore software development organizations. An empirical study was conducted using a framework derived from popular software quality evaluation models. The research instrument employed was a questionnaire survey among thirty-seven Sri Lankan registered offshore software develop...

  5. A survey of quality assurance practices in biomedical open source software projects.

    Science.gov (United States)

    Koru, Günes; El Emam, Khaled; Neisa, Angelica; Umarji, Medha

    2007-05-07

    Open source (OS) software is continuously gaining recognition and use in the biomedical domain, for example, in health informatics and bioinformatics. Given the mission critical nature of applications in this domain and their potential impact on patient safety, it is important to understand to what degree and how effectively biomedical OS developers perform standard quality assurance (QA) activities such as peer reviews and testing. This would allow the users of biomedical OS software to better understand the quality risks, if any, and the developers to identify process improvement opportunities to produce higher quality software. A survey of developers working on biomedical OS projects was conducted to examine the QA activities that are performed. We took a descriptive approach to summarize the implementation of QA activities and then examined some of the factors that may be related to the implementation of such practices. Our descriptive results show that 63% (95% CI, 54-72) of projects did not include peer reviews in their development process, while 82% (95% CI, 75-89) did include testing. Approximately 74% (95% CI, 67-81) of developers did not have a background in computing, 80% (95% CI, 74-87) were paid for their contributions to the project, and 52% (95% CI, 43-60) had PhDs. A multivariate logistic regression model to predict the implementation of peer reviews was not significant (likelihood ratio test = 16.86, 9 df, P = .051) and neither was a model to predict the implementation of testing (likelihood ratio test = 3.34, 9 df, P = .95). Less attention is paid to peer review than testing. However, the former is a complementary, and necessary, QA practice rather than an alternative. Therefore, one can argue that there are quality risks, at least at this point in time, in transitioning biomedical OS software into any critical settings that may have operational, financial, or safety implications. Developers of biomedical OS applications should invest more effort

  6. Contribution of quality and maturity models to software process improvement

    Directory of Open Access Journals (Sweden)

    Antonio Carlos Tonini

    2008-01-01

    Full Text Available Many software development companies have created their own work processes. Due to the rapid expansion of the software market, competition is based much more on cost than on differentiation. To gain competitive advantage, companies must continually update their technology, pursue process maturity and eliminate operational inefficiency. This requires the involvement of people, processes and the organization as a whole. The article discusses the implementation of software process improvement according to the main quality and maturity models. Based on a multiple case study, it is verified that software process improvement requires that improvement first occur for each individual developer, then involve the development teams and, finally, the organization as a whole. The research concludes that quality and maturity models serve as drivers of the improvement process.

  7. Near-Real time analysis of seismic data of active volcanoes: Software implementations of time sequence data analysis

    Directory of Open Access Journals (Sweden)

    J. Vila

    2008-07-01

    Full Text Available This paper presents the development and applications of a software-based quality control system that monitors volcano activity in near-real time. On the premise that external seismic manifestations provide information directly related to the internal status of a volcano, we analyzed variations in background seismic noise. By continuously analysing variations in seismic waveforms, we detected clear indications of changes in the internal status. The application of this method to data recorded at the Villarrica (Chile) and Tungurahua (Ecuador) volcanoes demonstrates that it is suitable for use as a forecasting tool. A recent application of this software-based quality control to the real-time monitoring of the Teide-Pico Viejo volcanic complex (Spain) anticipated external episodes of volcanic activity, corroborating the advantages and capacity of the methodology when implemented as an automatic real-time procedure.
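
    As a simple illustration of the kind of background-noise tracking described above (a generic sketch, not the authors' algorithm), the code below computes the RMS amplitude of a continuous seismic signal in fixed windows and flags windows that exceed a multiple of a reference level; the window length, threshold and names are assumptions.

```python
import numpy as np

def windowed_rms(signal, samples_per_window):
    """RMS amplitude of consecutive, non-overlapping windows of a 1-D signal."""
    signal = np.asarray(signal, dtype=float)
    n_windows = signal.size // samples_per_window
    trimmed = signal[: n_windows * samples_per_window]
    windows = trimmed.reshape(n_windows, samples_per_window)
    return np.sqrt((windows ** 2).mean(axis=1))

def flag_anomalies(rms_values, baseline, factor=3.0):
    """Indices of windows whose RMS exceeds `factor` times the baseline level."""
    return np.flatnonzero(rms_values > factor * baseline)

# Example: 100 Hz data, 60 s windows, with an artificial noise increase.
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, 100 * 3600)      # one hour of synthetic noise
trace[100 * 1800:] *= 4.0                      # noise level jumps mid-record
rms = windowed_rms(trace, samples_per_window=100 * 60)
print(flag_anomalies(rms, baseline=np.median(rms[:10])))
```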

  8. Improving the quality of care of patients with rheumatic disease using patient-centric electronic redesign software.

    Science.gov (United States)

    Newman, Eric D; Lerch, Virginia; Billet, Jon; Berger, Andrea; Kirchner, H Lester

    2015-04-01

    Electronic health records (EHRs) are not optimized for chronic disease management. To improve the quality of care for patients with rheumatic disease, we developed electronic data capture, aggregation, display, and documentation software. The software integrated and reassembled information from the patient (via a touchscreen questionnaire), nurse, physician, and EHR into a series of actionable views. Core functions included trends over time, rheumatology-related demographics, and documentation for patient and provider. Quality measures collected included patient-reported outcomes, disease activity, and function. The software was tested and implemented in 3 rheumatology departments, and integrated into routine care delivery. Post-implementation evaluation measured adoption, efficiency, productivity, and patient perception. Over 2 years, 6,725 patients completed 19,786 touchscreen questionnaires. The software was adopted for use by 86% of patients and rheumatologists. Chart review and documentation time trended downward, and productivity increased by 26%. Patient satisfaction, activation, and adherence remained unchanged, although pre-implementation values were high. A strong correlation was seen between use of the software and disease control (weighted Pearson's correlation coefficient 0.5927, P = 0.0095), and a relative increase in patients with low disease activity of 3% per quarter was noted. We describe innovative software that aggregates, stores, and displays information vital to improving the quality of care for patients with chronic rheumatic disease. The software was well-adopted by patients and providers. Post-implementation, significant improvements in quality of care, efficiency of care, and productivity were demonstrated. Copyright © 2015 by the American College of Rheumatology.

  9. Development of Innovative Computer Software to Facilitate the Setup and Computation of Water Quality Index

    OpenAIRE

    Samira Yousefzadeh; Amir Hossein Mahvi; Mahmood Alimohammadi; Kazem Naddafi; Maryam Valadi Amin; Ramin Nabizadeh

    2013-01-01

    Background: Developing a water quality index, which is used to convert a water quality dataset into a single number, is the most important task of most water quality monitoring programmes. Because the setup of a water quality index depends on different local constraints, it is not feasible to introduce one definitive water quality index to reveal the water quality level. Findings: In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to faci...
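
    Most water quality indices reduce several measured parameters to a single score by combining per-parameter sub-index ratings with weights. The sketch below shows a generic weighted-arithmetic index of that form; it is not the formula implemented in IWQIS, and the parameters, weights and ratings are invented for illustration.

```python
# Generic weighted-arithmetic water quality index: WQI = sum(w_i * q_i) / sum(w_i),
# where q_i is a 0-100 sub-index rating for parameter i and w_i its weight.
# Parameters, weights and ratings below are illustrative, not the IWQIS formula.

def water_quality_index(sub_indices, weights):
    """Combine per-parameter ratings (0-100) into a single weighted score."""
    if set(sub_indices) != set(weights):
        raise ValueError("sub-indices and weights must cover the same parameters")
    total_weight = sum(weights.values())
    return sum(weights[p] * sub_indices[p] for p in sub_indices) / total_weight

sub_indices = {"dissolved_oxygen": 85.0, "pH": 90.0, "turbidity": 60.0, "nitrate": 70.0}
weights = {"dissolved_oxygen": 4.0, "pH": 1.0, "turbidity": 2.0, "nitrate": 3.0}
print(round(water_quality_index(sub_indices, weights), 1))  # 76.0
```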

  10. SOFTWARE TRAINING AIDS DEVELOPMENT AND IMPLEMENTATION IN PROFESSIONAL PREPARATION PRACTICE OF TECHNOLOGICAL EDUCATION TEACHER

    Directory of Open Access Journals (Sweden)

    Anatoliy G. Gritchenko

    2013-03-01

    Full Text Available The article outlines the theoretical and practical aspects of the development and implementation of software training aids in the professional preparation of technological education teachers. The myriad opportunities of new information technologies are described; the characteristic features of a modern software training tool (STT) are revealed; the main algorithmic structures of training program construction (linear, cyclic, with hyperlinks, with labels), which enable variety and functionality in STT development, are given; and the methodology of STT creation is described, based on an analysis of the content of technology teacher preparation in HEE, the didactic functions of MITE, and the selection criteria of educational software for this area of specialist preparation.

  11. A methodology to assess the impact of design patterns on software quality

    NARCIS (Netherlands)

    Ampatzoglou, Apostolos; Frantzeskou, Georgia; Stamelos, Ioannis

    Context Software quality is considered to be one of the most important concerns of software production teams. Additionally, design patterns are documented solutions to common design problems that are expected to enhance software quality. Until now, the results on the effect of design patterns on

  12. Supporting custom quality models to analyse and compare open-source software

    NARCIS (Netherlands)

    D. Di Ruscio (Davide); D.S. Kolovos (Dimitrios); I. Korkontzelos (Ioannis); N. Matragkas (Nicholas); J.J. Vinju (Jurgen)

    2017-01-01

    The analysis and comparison of open source software can be improved by means of quality models supporting the evaluation of the software systems being compared and the final decision about which of them has to be adopted. Since software quality can mean different things in different

  13. Comparison of quality control software tools for diffusion tensor imaging.

    Science.gov (United States)

    Liu, Bilan; Zhu, Tong; Zhong, Jianhui

    2015-04-01

    Image quality in diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed and are widely used, each has different tradeoffs; there is still no general agreement on an image quality control routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each of the QC tools will help users make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools: DTI studio (Johns Hopkins University), DTIprep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institutes of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation, as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool was discussed, and the ways in which particular techniques affect the output of each of the tools were analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and integrating with other popular image processing tools were also discussed. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Software quality assurance on the Yucca Mountain Site Characterization Project

    International Nuclear Information System (INIS)

    Matras, J.R.

    1993-01-01

    The Yucca Mountain Site Characterization Project (YMP) has been involved over the years in the continuing struggle with establishing acceptable Software Quality Assurance (SQA) requirements for the development, modification, and acquisition of computer programs used to support the Mined Geologic Disposal System. These computer programs will be used to produce or manipulate data used directly in site characterization, design, analysis, performance assessment, and operation of repository structures, systems, and components. Scientists and engineers working on the project have claimed that the SQA requirements adopted by the project are too restrictive to allow them to perform their work. This paper will identify the source of the original SQA requirements adopted by the project. It will delineate the approach used by the project to identify concerns voiced by project engineers and scientists regarding the original SQA requirements. It will conclude with a discussion of methods used to address these problems in the rewrite of the original SQA requirements

  15. IMPROVED SOFTWARE QUALITY ASSURANCE TECHNIQUES USING SAFE GROWTH MODEL

    OpenAIRE

    M.Sangeetha; K.M.SenthilKumar; Dr.C.Arumugam; K. Akila

    2010-01-01

    Our lives are governed by large, complex systems with increasingly complex software, and the safety, security, and reliability of these systems has become a major concern. As the software in today's systems grows larger, it has more defects, and these defects adversely affect the safety, security, and reliability of the systems. Software engineering is the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software. Software divi...

  16. An empirical evaluation of software quality assurance practices and challenges in a developing country: a comparison of Nigeria and Turkey

    OpenAIRE

    Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    Background The importance of quality assurance in the software development process cannot be overemphasized because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization and improvement amongst others. The aim of this work is to further investigate the software quality assu...

  17. Development and implementation of pharmaceutical care planning software for nursing homes based on the Fleetwood Model.

    Science.gov (United States)

    Lapane, Kate L; Hiris, Jeffrey; Hughes, Carmel M; Feinberg, Janice

    2006-12-15

    The effectiveness of pharmaceutical care planning software for nursing homes and the extent to which the software assisted in the implementation of the Fleetwood Model are described. During the study, one long-term-care pharmacy identified 13 nursing homes to participate in the intervention group of a study evaluating the effectiveness of the Fleetwood Model. To successfully implement the Fleetwood Model, which demands prospective drug regimen review and collaborative practices between dispensing and consultant pharmacists, a software system that exchanged information between these pharmacists was deemed necessary. Pharmacists' self-reported assessments of the use of the software and the technical difficulties reported with its use were collected. The number of interventions performed by pharmacist type, the proportion of residents receiving interventions by multiple pharmacists, and the extent to which the interventions were prospective and performed before the mandated 30-day review were estimated from data documented in the software. The consistency of software use by the pharmacists was also estimated. Seventy-one percent of dispensing pharmacists and 40% of consultant pharmacists reported using the software most or all of the time. Fourteen percent of dispensing pharmacists and 40% of consultant pharmacists reported technical difficulties with the software. Over half of newly admitted or readmitted residents received a Fleetwood intervention within 3 days of admittance into the nursing home; 71.2% occurred in less than 30 days of admission. The use of information technology to increase communication among health care professionals and assist in providing prospective drug regimen review in long-term-care facilities is feasible. Collaboration and extensive field testing with end users, realistic expectations, appropriate training, and technical support are necessary when implementing new technology.

  18. Recommendations for a Software Quality Assurance Plan for the CMR Facility at LANL

    International Nuclear Information System (INIS)

    Adams, K.; Matthews, S. D.; McQueen, M. A.

    1998-01-01

    The Nuclear Materials Technology (NMT) organizations 1 and 3 within the Chemical and Metallurgical Research (CMR) facility at the Los Alamos National Laboratory are working to achieve Waste Isolation Pilot Plant (WIPP) certification to enable them to transport their TRU waste to WIPP. This document is intended to provide not only recommendations to address the necessary software quality assurance activities to enable the NMT-1 and NMT-3 organizations to be WIPP compliant but is also meant to provide a template for the final Software Quality Assurance Plan (SQAP). This document specifically addresses software quality assurance for all software used in support of waste characterization and analysis. Since NMT-1 and NMT-3 currently have several operational software products that are used for waste characterization and analysis, these software quality assurance recommendations apply to the operations, maintenance and retirement of the software and the creation and development of any new software required for waste characterization and analyses

  19. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    Science.gov (United States)

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve use outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  20. SU-F-J-72: A Clinical Usable Integrated Contouring Quality Evaluation Software for Radiotherapy

    International Nuclear Information System (INIS)

    Jiang, S; Dolly, S; Cai, B; Mutic, S; Li, H

    2016-01-01

    Purpose: To introduce the Auto Contour Evaluation (ACE) software, a clinically usable, user-friendly, efficient, all-in-one toolbox for automatically identifying common contouring errors in radiotherapy treatment planning using supervised machine learning techniques. Methods: ACE is developed in C# using the Microsoft .Net framework and Windows Presentation Foundation (WPF) for elegant GUI design and smooth GUI transition animations, through the integration of graphics engines and high dots-per-inch (DPI) settings on modern high-resolution monitors. The industry-standard Model-View-ViewModel (MVVM) design pattern was chosen as the major architecture of ACE for a neat coding structure, deep modularization, easy maintainability and seamless communication with other clinical software. ACE consists of 1) a patient data importing module integrated with the clinical patient database server, 2) a module that simultaneously displays 2D DICOM images and RT structures, 3) a 3D RT structure visualization module using the Visualization Toolkit (VTK) library, and 4) a contour evaluation module using supervised pattern recognition algorithms to detect contouring errors and display the detection results. ACE relies on supervised learning algorithms to handle all image and data processing jobs; implementations of the related algorithms are powered by the Accord.Net scientific computing library for better efficiency and effectiveness. Results: ACE can take a patient's CT images and RT structures from commercial treatment planning software via direct user input or from the patient database. All functionalities, including 2D and 3D image visualization and RT contour error detection, have been demonstrated with real clinical patient cases. Conclusion: ACE implements supervised learning algorithms and combines image processing and graphical visualization modules for RT contour verification. ACE has great potential for automated radiotherapy contouring quality verification

  1. SU-F-J-72: A Clinical Usable Integrated Contouring Quality Evaluation Software for Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, S; Dolly, S; Cai, B; Mutic, S; Li, H [Washington University School of Medicine, Saint Louis, MO (United States)

    2016-06-15

    Purpose: To introduce the Auto Contour Evaluation (ACE) software, a clinically usable, user-friendly, efficient, all-in-one toolbox for automatically identifying common contouring errors in radiotherapy treatment planning using supervised machine learning techniques. Methods: ACE is developed in C# using the Microsoft .Net framework and Windows Presentation Foundation (WPF) for elegant GUI design and smooth GUI transition animations, through the integration of graphics engines and high dots-per-inch (DPI) settings on modern high-resolution monitors. The industry-standard Model-View-ViewModel (MVVM) design pattern was chosen as the major architecture of ACE for a neat coding structure, deep modularization, easy maintainability and seamless communication with other clinical software. ACE consists of 1) a patient data importing module integrated with the clinical patient database server, 2) a module that simultaneously displays 2D DICOM images and RT structures, 3) a 3D RT structure visualization module using the Visualization Toolkit (VTK) library, and 4) a contour evaluation module using supervised pattern recognition algorithms to detect contouring errors and display the detection results. ACE relies on supervised learning algorithms to handle all image and data processing jobs; implementations of the related algorithms are powered by the Accord.Net scientific computing library for better efficiency and effectiveness. Results: ACE can take a patient's CT images and RT structures from commercial treatment planning software via direct user input or from the patient database. All functionalities, including 2D and 3D image visualization and RT contour error detection, have been demonstrated with real clinical patient cases. Conclusion: ACE implements supervised learning algorithms and combines image processing and graphical visualization modules for RT contour verification. ACE has great potential for automated radiotherapy contouring quality verification
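
    The two records above describe detecting contouring errors with supervised pattern recognition (implemented there in C# with Accord.Net). The sketch below illustrates the general idea only, in Python with scikit-learn and entirely invented geometric features and labels; it is not ACE code and does not reflect its actual feature set.

```python
# Illustrative sketch of supervised contour-error detection.
# Features, labels and values are invented; this is not ACE code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Each row: [volume_cc, centroid_shift_mm, surface_roughness, slice_gap_count]
# Label: 1 = likely contouring error, 0 = acceptable contour.
X_train = np.array([
    [12.0, 0.5, 0.10, 0],
    [11.5, 0.7, 0.12, 0],
    [30.2, 6.0, 0.45, 3],
    [28.7, 5.5, 0.50, 2],
])
y_train = np.array([0, 0, 1, 1])

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

# Flag a new structure whose geometry deviates from the acceptable examples.
new_contour_features = np.array([[29.0, 5.8, 0.48, 2]])
print("error suspected" if clf.predict(new_contour_features)[0] else "looks fine")
```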

  2. Caliko: An Inverse Kinematics Software Library Implementation of the FABRIK Algorithm

    OpenAIRE

    Lansley, Alastair; Vamplew, Peter; Smith, Philip; Foale, Cameron

    2016-01-01

    The Caliko library is an implementation of the FABRIK (Forward And Backward Reaching Inverse Kinematics) algorithm written in Java. The inverse kinematics (IK) algorithm is implemented in both 2D and 3D, and incorporates a variety of joint constraints as well as the ability to connect multiple IK chains together in a hierarchy. The library allows for the simple creation and solving of multiple IK chains as well as visualisation of these solutions. It is licensed under the MIT software license...
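
    The core FABRIK iteration is short enough to sketch. The following is a minimal, unconstrained 2D version in Python for illustration; it is not the Caliko API, which is written in Java and additionally supports joint constraints, connected chains and visualisation.

```python
# Minimal unconstrained 2D FABRIK solver (illustrative; not the Caliko API).
import numpy as np

def fabrik(joints, target, tol=1e-3, max_iter=100):
    """Solve IK for a chain of joint positions (n x 2 array) reaching `target`."""
    joints = np.asarray(joints, dtype=float).copy()
    target = np.asarray(target, dtype=float)
    lengths = np.linalg.norm(np.diff(joints, axis=0), axis=1)   # bone lengths
    base = joints[0].copy()

    if np.linalg.norm(target - base) > lengths.sum():
        # Target unreachable: stretch the chain straight towards it.
        direction = (target - base) / np.linalg.norm(target - base)
        for i in range(1, len(joints)):
            joints[i] = joints[i - 1] + direction * lengths[i - 1]
        return joints

    for _ in range(max_iter):
        # Forward pass: put the end effector on the target, work back to the base.
        joints[-1] = target
        for i in range(len(joints) - 2, -1, -1):
            d = joints[i] - joints[i + 1]
            joints[i] = joints[i + 1] + d / np.linalg.norm(d) * lengths[i]
        # Backward pass: re-anchor the base, work out to the end effector.
        joints[0] = base
        for i in range(1, len(joints)):
            d = joints[i] - joints[i - 1]
            joints[i] = joints[i - 1] + d / np.linalg.norm(d) * lengths[i - 1]
        if np.linalg.norm(joints[-1] - target) < tol:
            break
    return joints

# Three-bone chain along the x-axis reaching for a point above it.
chain = [(0, 0), (1, 0), (2, 0), (3, 0)]
print(fabrik(chain, target=(2.0, 1.5)).round(3))
```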

  3. The Software Improvement Process - Tools And Rules To Encourage Quality

    CERN Document Server

    Sigerud, K

    2011-01-01

    The Applications section of the CERN accelerator Controls group has decided to apply a systematic approach to quality assurance (QA), the “Software Improvement Process”, SIP. This process focuses on three areas: the development process itself, suitable QA tools, and how to practically encourage developers to do QA. For each stage of the development process we have agreed on the recommended activities and deliverables, and identified tools to automate and support the task. For example, we do more code reviews. As peer reviews are resource-intensive, we only do them for complex parts of a product. As a complement, we are using static code checking tools, like FindBugs and Checkstyle. We also encourage unit testing and have agreed on a minimum level of test coverage recommended for all products, measured using Clover. Each of these tools is well integrated with our IDE (Eclipse) and gives instant feedback to the developer about the quality of their code. The major challenges of SIP have been to 1) agree on com...

  4. Integrated software environment dedicated for implementation of control systems based on PLC controllers

    Directory of Open Access Journals (Sweden)

    Szymon SURMA

    2007-01-01

    Full Text Available Industrial process control systems based on PLC controllers play a very important role today in all fields of transport, including sea transport. The construction of control systems is a field of engineering that has been continuously evolving towards maximum simplification of the system design path. Until now, the time needed for system construction, from design to commissioning, had to be divided into several stages, and a mistake made in an early stage meant that in most cases the subsequent stages had to be restarted. Available debugging systems allow defect detection at an early stage of project implementation. The paper presents the general characteristics of integrated software for the implementation of complex control systems. The issues related to using the software for programming the visualisation environment and the control computer, selecting the transmission medium and transmission protocol, as well as configuring, programming and controlling PLC controllers, have been analysed.

  5. Results of the EC research project REQUEST on software quality and reliability

    International Nuclear Information System (INIS)

    Kersken, M.; Saglietti, F.

    1990-01-01

    GRS work in software safety was mainly concerned with the qualitative assessment of software reliability and quality. As a supplement to these activities the work within the REQUEST project emphasized the quantitative determination of the respective parameters. The three-level quality model COQUAMO serves for the computation - and partly for the prediction - of quality factors during the software life cycle. PERFIDE controls the application of software reliability models during the test phase and in early operational life. Specific attention was paid to the assessment of fault-tolerant diverse software systems. (orig.) [de

  6. arrayQCplot: software for checking the quality of microarray data.

    Science.gov (United States)

    Lee, Eun-Kyung; Yi, Sung-Gon; Park, Taesung

    2006-09-15

    arrayQCplot is software for the exploratory analysis of microarray data. The software focuses on quality control and generates newly developed plots for quality and reproducibility checks. It is developed in R and provides a user-friendly graphical interface for graphics and statistical analysis. Therefore, novice users will find arrayQCplot an easy-to-use tool for checking the quality of their data with a simple mouse click. The arrayQCplot software is available from Bioconductor at http://www.bioconductor.org. A more detailed manual is available at http://bibs.snu.ac.kr/software/arrayQCplot. Contact: tspark@stats.snu.ac.kr.

  7. Implementation of LTE SC-FDMA on the USRP2 Software Defined Radio Platform

    DEFF Research Database (Denmark)

    Jørgensen, Peter Bjørn; Hansen, Thomas Lundgaard; Sørensen, Troels Bundgaard

    2011-01-01

    In this paper we discuss the implementation of a Single Carrier Frequency Division Multiple Access (SC-FDMA) transceiver running over the Universal Software Radio Peripheral 2 (USRP2). SC-FDMA is the air interface which has been selected for the uplink in the latest Long Term Evolution (LTE...
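
    SC-FDMA is essentially DFT-spread OFDM: the modulated data symbols are precoded with an M-point DFT, mapped onto a subset of the subcarriers of an N-point IFFT, and a cyclic prefix is prepended. The sketch below shows that baseband transmit chain in NumPy as a generic illustration; the sizes, localized mapping and names are assumptions, and it is not the USRP2 implementation described in the paper.

```python
# Generic SC-FDMA (DFT-spread OFDM) transmit symbol, for illustration only.
import numpy as np

def sc_fdma_symbol(data_symbols, n_fft=512, first_subcarrier=0, cp_len=36):
    """Build one baseband SC-FDMA symbol from M modulated data symbols.

    DFT-precode the M data symbols, map them to M contiguous subcarriers
    (localized mapping), take an N-point IFFT and prepend a cyclic prefix.
    """
    d = np.asarray(data_symbols, dtype=complex)
    m = d.size
    precoded = np.fft.fft(d) / np.sqrt(m)                    # M-point DFT precoding
    grid = np.zeros(n_fft, dtype=complex)
    grid[first_subcarrier:first_subcarrier + m] = precoded   # subcarrier mapping
    time_signal = np.fft.ifft(grid) * np.sqrt(n_fft)         # N-point IFFT
    return np.concatenate([time_signal[-cp_len:], time_signal])  # add CP

# Example: 64 QPSK symbols spread over a 512-point FFT grid.
rng = np.random.default_rng(1)
bits = rng.integers(0, 2, size=(64, 2))
qpsk = (1 - 2 * bits[:, 0] + 1j * (1 - 2 * bits[:, 1])) / np.sqrt(2)
tx = sc_fdma_symbol(qpsk, n_fft=512, first_subcarrier=100, cp_len=36)
print(tx.shape)  # (548,)
```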

  8. Comparison of Software Quality Metrics for Object-Oriented System

    OpenAIRE

    Amit Sharma; Sanjay Kumar Dubey

    2012-01-01

    According to the IEEE standard glossary of software engineering, object-oriented design is becoming more important in the software development environment, and software metrics are essential in software engineering for measuring software complexity and estimating size, quality and project effort. There are various approaches through which we can derive software cost estimates and predictions for various kinds of deliverable items. The tools used for measuring the estimations are lines of code, func...

  9. MultiRefactor: Automated Refactoring To Improve Software Quality

    OpenAIRE

    Mohan, Michael; Greer, Des

    2017-01-01

    In this paper, a new approach is proposed for automated software maintenance. The tool is able to perform 26 different refactorings. It also contains a large selection of metrics to measure the impact of the refactorings on the software and six different search based optimization algorithms to improve the software. This tool contains both mono-objective and multi-objective search techniques for software improvement and is fully automated. The paper describes the various capabilities of the to...

  10. Portal for Families Overcoming Neurodevelopmental Disorders (PFOND): Implementation of a Software Framework for Facilitated Community Website Creation by Nontechnical Volunteers.

    Science.gov (United States)

    Ye, Xin Cynthia; Ng, Isaiah; Seid-Karbasi, Puya; Imam, Tuhina; Lee, Cheryl E; Chen, Shirley Yu; Herman, Adam; Sharma, Balraj; Johal, Gurinder; Gu, Bobby; Wasserman, Wyeth W

    2013-08-06

    The Portal for Families Overcoming Neurodevelopmental Disorders (PFOND) provides a structured Internet interface for the sharing of information with individuals struggling with the consequences of rare developmental disorders. Large disease-impacted communities can support fundraising organizations that disseminate Web-based information through elegant websites run by professional staff. Such quality resources for families challenged by rare disorders are infrequently produced and, when available, are often dependent upon the continued efforts of a single individual. The project endeavors to create an intuitive Web-based software system that allows a volunteer with limited technical computer skills to produce a useful rare disease website in a short time period. Such a system should provide access to emerging news and research findings, facilitate community participation, present summary information about the disorder, and allow for transient management by volunteers who are likely to change periodically. The prototype portal was implemented using the WordPress software system with both existing and customized supplementary plug-in software modules. Gamification scoring features were implemented in a module, allowing editors to measure progress. The system was installed on a Linux-based computer server, accessible across the Internet through standard Web browsers. A prototype PFOND system was implemented and tested. The prototype system features a structured organization with distinct partitions for background information, recent publications, and community discussions. The software design allows volunteer editors to create a themed website, implement a limited set of topic pages, and connect the software to dynamic RSS feeds providing information about recent news or advances. The prototype was assessed by a fraction of the disease sites developed (8 out of 27), including Aarskog-Scott syndrome, Aniridia, Adams-Oliver syndrome, Cat Eye syndrome, Kabuki syndrome

  11. Software engineering and Ada (Trademark) training: An implementation model for NASA

    Science.gov (United States)

    Legrand, Sue; Freedman, Glenn

    1988-01-01

    The choice of Ada for software engineering for projects such as the Space Station has resulted in government and industrial groups considering training programs that help workers become familiar with both a software culture and the intricacies of a new computer language. The questions of how much time it takes to learn software engineering with Ada, how much an organization should invest in such training, and how the training should be structured are considered. Software engineering is an emerging, dynamic discipline. It is defined by the author as the establishment and application of sound engineering environments, tools, methods, models, principles, and concepts combined with appropriate standards, guidelines, and practices to support computing which is correct, modifiable, reliable and safe, efficient, and understandable throughout the life cycle of the application. Neither the training programs needed, nor the content of such programs, have been well established. This study addresses the requirements for training for NASA personnel and recommends an implementation plan. A curriculum and a means of delivery are recommended. It is further suggested that a knowledgeable programmer may be able to learn Ada in 5 days, but that it takes 6 to 9 months to evolve into a software engineer who uses the language correctly and effectively. The curriculum and implementation plan can be adapted for each NASA Center according to the needs dictated by each project.

  12. 78 FR 53270 - Revision of Air Quality Implementation Plan; California; Sacramento Metropolitan Air Quality...

    Science.gov (United States)

    2013-08-29

    ... Quality Implementation Plan; California; Sacramento Metropolitan Air Quality Management District... to the Sacramento Metropolitan Air Quality Management District (SMAQMD or District) portion of the..., Sacramento Metropolitan Air Quality Management District, Rule 214 (Federal New Source Review), Rule 203...

  13. Concerning the problem of polygraphic wire calculation: theoretical aspects, software, practical implementation

    Directory of Open Access Journals (Sweden)

    Selkina A. V.

    2016-05-01

    Full Text Available The article analyzes the problems arising when organizing the workflow in printing companies. We suggest addressing these problems by implementing computer-based accounting systems. Online and offline calculators used by printing enterprises for accounting are discussed. The author outlines the functional and other specified requirements for such software, which were taken into account in the calculation module for the polygraphic wire used for block bonding. The software increases the speed of the calculation process, reduces the number of calculation errors and decreases the labour intensity of the accounting process.

  14. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    Science.gov (United States)

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-04

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automatized, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC .

  15. [Design and implementation of virtual reality software with psychological treatment for drug-dependent patients].

    Science.gov (United States)

    Yang, Bo; Zhao, Xu; Ou, Yalin; Zhang, Jingyu; Li, Qing; Liu, Zhihong

    2012-12-01

    The high relapse rate among drug-dependent patients is a serious problem. The present article describes how to design and implement virtual reality technology for the psychological treatment of drug-dependent patients, with the aim of supporting addiction withdrawal. The software was developed on an open-source game engine for 2D models. In the form of a game, it simulates the day-to-day living environment of drug-dependent patients and the temptation to use drugs. The software helps patients work through different scenarios and event handling, prompts their own reflection, and trains their response to temptation from high-risk environments and from other drug-dependent patients. The functionality of the software is close to the real life of drug-dependent patients, and it has the prospect of becoming a new treatment to reduce the relapse rate of drug dependence.

  16. How do Quality Requirements Contribute to Software Sustainability?

    NARCIS (Netherlands)

    Condori-Fernandez, O.N.; Lago, P.; Calero, Coral

    2016-01-01

    The concept of sustainable development has become an important objective of policy makers in the software industry. The most used definition of sustainability refers to dimensions of economic sustainability to ensure that software services can create economic value; technical sustainability that

  17. Image quality dependence on image processing software in ...

    African Journals Online (AJOL)

    Background. Image post-processing gives computed radiography (CR) a considerable advantage over film-screen systems. After digitisation of information from CR plates, data are routinely processed using manufacturer-specific software. Agfa CR readers use MUSICA software, and an upgrade with significantly different ...

  18. Research of quality control during development of NPP DCS 1E classified software

    International Nuclear Information System (INIS)

    Shi Weihua; Lu Zhenguo; Xie Qi

    2012-01-01

    Nuclear safety depends on the correct behavior of 1E software, which is an important part of the 1E DCS system. Nowadays, users focus on the functionality of the 1E system but pay little attention to quality control of the 1E software. In fact, IEC 61513 and IEC 60880 declare that 1E software should be under strict quality control during all stages of development. This article relates to the practice of 1E DCS system quality control and explores QC surveillance of 1E software from the user's point of view. (authors)

  19. ISO and software quality assurance - licensing and certification of software professionals

    Energy Technology Data Exchange (ETDEWEB)

    Hare, J.; Rodin, L.

    1997-11-01

    This report contains viewgraphs on the licensing and certification of software professionals. Discussed in this report are: certification programs; licensing programs; why to become certified; certification as a condition of employment; certification requirements; and examination structures.

  20. Application of software quality assurance to a specific scientific code development task

    International Nuclear Information System (INIS)

    Dronkers, J.J.

    1986-03-01

    This paper describes an application of software quality assurance to a specific scientific code development program. The software quality assurance program consists of three major components: administrative control, configuration management, and user documentation. The program attempts to be consistent with existing local traditions of scientific code development while at the same time providing a controlled process of development

  1. SQuAVisiT : A Software Quality Assessment and Visualisation Toolset

    NARCIS (Netherlands)

    Roubtsov, Serguei; Telea, Alexandru; Holten, Danny

    2007-01-01

    Software quality assessment of large COBOL industrial legacy systems, whether for maintenance or migration purposes, poses a serious challenge. We present the Software Quality Assessment and Visualisation Toolset (SQuAVisiT), which assists users in performing the above task. First, it allows a fully

  2. Teamwork quality and project success in software development: A survey of agile development teams

    OpenAIRE

    Lindsjørn, Yngve; Sjøberg, Dag I.K.; Dingsøyr, Torgeir; Bergersen, Gunnar R.; Dybå, Tore

    2016-01-01

    Small, self-directed teams are central in agile development. This article investigates the effect of teamwork quality on team performance, learning and work satisfaction in agile software teams, and whether this effect differs from that of traditional software teams. A survey was administered to 477 respondents from 71 agile software teams in 26 companies and analyzed using structural equation modeling. A positive effect of teamwork quality on team performance was found when team members ...

  3. Evaluation of mass spectral library search algorithms implemented in commercial software.

    Science.gov (United States)

    Samokhin, Andrey; Sotnezova, Ksenia; Lashin, Vitaly; Revelsky, Igor

    2015-06-01

    Performance of several library search algorithms (against EI mass spectral databases) implemented in commercial software products (ACD/SpecDB, ChemStation, GC/MS Solution and MS Search) was estimated. The test set contained 1000 mass spectra, which were randomly selected from the NIST'08 (RepLib) mass spectral database. It was shown that the composite (also known as identity) algorithm implemented in the MS Search (NIST) software gives statistically the best results: the correct compound occupied the first position in the list of possible candidates in 81% of cases; the correct compound was within the list of top ten candidates in 98% of cases. It was found that use of the presearch option can lead to rejection of the correct answer from the list of possible candidates (therefore the presearch option should not be used, if possible). Overall performance of library search algorithms was estimated using receiver operating characteristic curves. Copyright © 2015 John Wiley & Sons, Ltd.
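
    For readers unfamiliar with how such library searches rank candidates, the sketch below shows a generic weighted dot-product spectral similarity of the kind identity searches build on; the exponents, function name and example spectra are illustrative assumptions and do not reproduce the composite algorithm of any specific product evaluated in the paper.

    ```python
    import numpy as np

    def weighted_dot_product(spectrum_a, spectrum_b, mass_power=1.3, intensity_power=0.53):
        """Generic weighted dot-product similarity between two EI mass spectra.

        Each spectrum maps integer m/z values to intensities. The weighting
        exponents are illustrative defaults, not values taken from the paper.
        """
        mz_values = sorted(set(spectrum_a) | set(spectrum_b))
        a = np.array([(spectrum_a.get(mz, 0.0) ** intensity_power) * mz ** mass_power
                      for mz in mz_values])
        b = np.array([(spectrum_b.get(mz, 0.0) ** intensity_power) * mz ** mass_power
                      for mz in mz_values])
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(np.dot(a, b) / denom) if denom else 0.0

    # Rank hypothetical library entries against an unknown spectrum.
    unknown = {43: 999, 58: 430, 71: 120}
    library = {"acetone": {43: 999, 58: 410, 42: 80},
               "butanal": {44: 999, 72: 350, 43: 600}}
    hits = sorted(library, key=lambda name: weighted_dot_product(unknown, library[name]),
                  reverse=True)
    print(hits)
    ```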

  4. Complete modeling and software implementation of a virtual solar hydrogen hybrid system

    International Nuclear Information System (INIS)

    Pedrazzi, S.; Zini, G.; Tartarini, P.

    2010-01-01

    A complete mathematical model and software implementation of a solar hydrogen hybrid system has been developed and applied to real data. The mathematical model has been derived from sub-models taken from literature with appropriate modifications and improvements. The model has been implemented as a stand-alone virtual energy system in a model-based, multi-domain software environment. A test run has then been performed on typical residential user data-sets over a year-long period. Results show that the virtual hybrid system can bring about complete grid independence; in particular, hydrogen production balance is positive (+1.25 kg) after a year's operation with a system efficiency of 7%.
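
    The paper's sub-models are not reproduced here, but the core yearly hydrogen bookkeeping that such a virtual system performs can be sketched as follows; the efficiencies, the hydrogen energy content and the function name are illustrative assumptions, not values from the study.

    ```python
    # Minimal sketch, not the authors' model: hourly hydrogen bookkeeping of a
    # PV/electrolyser/fuel-cell system with assumed round-trip parameters.
    HHV_H2_KWH_PER_KG = 39.4      # higher heating value of hydrogen (assumed)
    ETA_ELECTROLYSER = 0.65       # electricity-to-hydrogen efficiency (assumed)
    ETA_FUEL_CELL = 0.50          # hydrogen-to-electricity efficiency (assumed)

    def yearly_hydrogen_balance(pv_kwh, load_kwh):
        """Net hydrogen produced (kg) over paired hourly PV and load series."""
        h2_kg = 0.0
        for pv, load in zip(pv_kwh, load_kwh):
            surplus = pv - load
            if surplus > 0:
                # Surplus electricity feeds the electrolyser and stores hydrogen.
                h2_kg += surplus * ETA_ELECTROLYSER / HHV_H2_KWH_PER_KG
            else:
                # Deficits are covered by the fuel cell, consuming stored hydrogen.
                h2_kg -= -surplus / (ETA_FUEL_CELL * HHV_H2_KWH_PER_KG)
        return h2_kg

    # Example: a flat 0.5 kW load against a crude day/night PV profile.
    pv = [1.2 if 8 <= h % 24 <= 16 else 0.0 for h in range(8760)]
    load = [0.5] * 8760
    print(yearly_hydrogen_balance(pv, load))
    ```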

  5. Design and implementation of a compliant robot with force feedback and strategy planning software

    Science.gov (United States)

    Premack, T.; Strempek, F. M.; Solis, L. A.; Brodd, S. S.; Cutler, E. P.; Purves, L. R.

    1984-01-01

    Force-feedback robotics techniques are being developed for automated precision assembly and servicing of NASA space flight equipment. Design and implementation of a prototype robot which provides compliance and monitors forces is in progress. Computer software to specify assembly steps and makes force feedback adjustments during assembly are coded and tested for three generically different precision mating problems. A model program demonstrates that a suitably autonomous robot can plan its own strategy.

  6. Implementation of the INSPECT software package for statistical calculation in nuclear material accountability

    International Nuclear Information System (INIS)

    Marzo, M.A.S.

    1986-01-01

    The INSPECT software package was developed in the Pacific Northwest Laboratory for statistical calculations in nuclear material accountability. The programs apply the inspection and evaluation methodology described in Part of the Safeguards Technical Manual. In this paper the implementation of INSPECT at the Safeguards Division of CNEN, and the main characteristics of INSPECT are described. The potential applications of INSPECT to the nuclear material accountability is presented. (Author) [pt

  7. Software Implementation of a Secure Firmware Update Solution in an IOT Context

    Directory of Open Access Journals (Sweden)

    Lukas Kvarda

    2016-01-01

    Full Text Available The present paper is concerned with the secure delivery of firmware updates to Internet of Things (IoT devices. Additionally, it deals with the design of a safe and secure bootloader for a UHF RFID reader. A software implementation of a secure firmware update solution is performed. The results show there is space to integrate even more security features into existing devices.

  8. Caliko: An Inverse Kinematics Software Library Implementation of the FABRIK Algorithm

    Directory of Open Access Journals (Sweden)

    Alastair Lansley

    2016-09-01

    Full Text Available The Caliko library is an implementation of the FABRIK (Forward And Backward Reaching Inverse Kinematics algorithm written in Java. The inverse kinematics (IK algorithm is implemented in both 2D and 3D, and incorporates a variety of joint constraints as well as the ability to connect multiple IK chains together in a hierarchy. The library allows for the simple creation and solving of multiple IK chains as well as visualisation of these solutions. It is licensed under the MIT software license and the source code is freely available for use and modification at: https://github.com/feduni/caliko
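
    To illustrate the algorithm the library implements, a minimal unconstrained 2D FABRIK pass is sketched below in Python; it mirrors the forward/backward reaching idea only and is not the Caliko (Java) source, which additionally supports 3D chains, joint constraints and connected chain hierarchies.

    ```python
    import numpy as np

    def fabrik_2d(joints, target, tol=1e-3, max_iter=50):
        """Unconstrained 2D FABRIK solver sketch (not the Caliko implementation).

        `joints` is an (n+1) x 2 sequence of joint positions forming a chain whose
        base stays fixed; bone lengths are taken from the initial configuration.
        """
        p = np.asarray(joints, dtype=float).copy()
        target = np.asarray(target, dtype=float)
        lengths = np.linalg.norm(np.diff(p, axis=0), axis=1)
        base = p[0].copy()

        # Target unreachable: stretch the chain straight towards it.
        if np.linalg.norm(target - base) > lengths.sum():
            direction = (target - base) / np.linalg.norm(target - base)
            for i, d in enumerate(lengths):
                p[i + 1] = p[i] + direction * d
            return p

        for _ in range(max_iter):
            # Backward pass: drag the chain from the end effector to the target.
            p[-1] = target
            for i in range(len(lengths) - 1, -1, -1):
                direction = (p[i] - p[i + 1]) / np.linalg.norm(p[i] - p[i + 1])
                p[i] = p[i + 1] + direction * lengths[i]
            # Forward pass: re-anchor the base and propagate outwards.
            p[0] = base
            for i in range(len(lengths)):
                direction = (p[i + 1] - p[i]) / np.linalg.norm(p[i + 1] - p[i])
                p[i + 1] = p[i] + direction * lengths[i]
            if np.linalg.norm(p[-1] - target) < tol:
                break
        return p

    # Example: a three-bone planar arm reaching for a nearby point.
    print(fabrik_2d([[0, 0], [1, 0], [2, 0], [3, 0]], [2.0, 1.5]))
    ```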

  9. Design and implementation of a software tool intended for simulation and test of real time codes

    International Nuclear Information System (INIS)

    Le Louarn, C.

    1986-09-01

    The objective of real-time software testing is to reveal processing errors and unmet functional requirements or timing constraints in a code. From the perspective of the safety analysis of nuclear power plant equipment, testing should be carried out independently of the physical process (which is generally not available), also because accidental hardware failures must be considered. We propose here a simulation and test tool, implemented entirely in software, with extensive interactive possibilities for testing assembly code running on a microprocessor. The OST (outil d'aide a la simulation et au Test de logiciels temps reel) simulates code execution and the behaviour of the hardware or software environment. Test execution is closely monitored and much useful information is automatically saved. After presenting methods and tools dedicated to real-time software, the present thesis details the OST system. We show the internal mechanisms and objects of the system, particularly ''events'' (which describe the evolution of the system under test) and mnemonics (which describe the variables). Then we detail the interactive means available to the user for constructing the test data and the environment of the tested software. Finally, a prototype implementation is presented along with the results of the tests carried out. This demonstrates the many advantages of using an automatic tool over a manual investigation. In conclusion, further developments necessary to complete the final tool are reviewed [fr

  10. Software requirements elicitation to support internal monitoring of quality assurance system for higher education in Indonesia

    Science.gov (United States)

    Amalia, A.; Gunawan, D.; Hardi, S. M.; Rachmawati, D.

    2018-02-01

    The Internal Quality Assurance System (in Indonesian: SPMI, Sistem Penjaminan Mutu Internal) is a systemic quality assurance activity in Indonesian higher education. SPMI must be carried out by all higher education institutions in Indonesia based on the Regulation of the Minister of Research, Technology and Higher Education of the Republic of Indonesia Number 62 of 2016. Implementation of SPMI must follow the SPMI principles of being independent, standardized, accurate, well planned and sustainable, documented and systematic. To support the SPMI cycle properly, universities need supporting software to monitor all SPMI activities. In reality, however, many universities are not optimal in building such an SPMI monitoring system. One of the obstacles is that determining system requirements that support the SPMI principles is difficult. In this paper, we examine the initial phase of requirements engineering, elicitation. Unlike other methods that collect system requirements from users and stakeholders, we derive the system requirements for the SPMI principles from the SPMI guideline book. The result of this paper can be used as an option when determining SPMI software requirements. It can also help developers and users understand the SPMI scenario and thus overcome problems of mutual understanding between these two parties.

  11. Characterizing the contribution of quality requirements to software sustainability

    NARCIS (Netherlands)

    Condori-Fernandez, Nelly; Lago, Patricia

    2018-01-01

    Most respondents considered modifiability as relevant for addressing both technical and environmental sustainability. Functional correctness, availability, modifiability, interoperability and recoverability contribute positively to the endurability of software systems. This study has also identified

  12. 77 FR 73320 - Approval of Air Quality Implementation Plans; California; South Coast Air Quality Management...

    Science.gov (United States)

    2012-12-10

    ... Quality Implementation Plans; California; South Coast Air Quality Management District; Prevention of... Implementation Plan (SIP) revision for the South Coast Air Quality Management District (SCAQMD or District... in a August 15, 2012 letter from the South Coast Air Quality Management District regarding specific...

  13. Round table discussion: Quality control and standardization of nuclear medicine software

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    In summary the round table came to the following important conclusions: Nuclear medicine software systems need better documentation, especially regarding details of algorithms and limitations, and user friendliness could be considerably improved. Quality control of software is an integral part of quality assurance in nuclear medicine and should be performed at all levels of the software. Quality control of applications software should preferably be performed with assistance of generally accepted software phantoms. A basic form of standardization was welcomed and partly regarded as essential by all participants. Some areas such as patient study files could be standardized in the near future, whereas other areas such as the standardization of clinical applications programs or acquisition protocols still present major difficulties. An international cooperation in the field of standardization of software and other topics has already been started on the European level and should be continued and supported. (orig.)

  14. Implementation of the qualities of radiodiagnostic: mammography

    International Nuclear Information System (INIS)

    Pacífico, Leonardo; Magalhães, Luís A.G.; Fernandes, Elisabeth; Peixoto, José Guilherme P.

    2017-01-01

    The objective of the present study was to evaluate the expanded uncertainty and to present the result of the internal audit performed at the Laboratory of Radiological Sciences (LCR). The mammographic beam qualities used as references in LCR calibrations had their uncertainties and their conformity with the standard evaluated. The expanded uncertainty was 1.40%, and the result of the internal audit was satisfactory. We conclude that the LCR can perform calibrations in mammography qualities for end users. (author)
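
    The abstract reports the expanded uncertainty but not the coverage factor; as a reminder of the standard GUM relationship (an assumption added here, not a detail taken from the paper), the expanded uncertainty U follows from the combined standard uncertainty u_c:

    ```latex
    U = k \, u_c(y), \qquad
    u_c(y) = \sqrt{\sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^{2} u^{2}(x_i)},
    ```

    where a coverage factor of k = 2 is commonly chosen to give a coverage probability of approximately 95 %.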

  15. Software design and implementation concepts for an interoperable medical communication framework.

    Science.gov (United States)

    Besting, Andreas; Bürger, Sebastian; Kasparick, Martin; Strathen, Benjamin; Portheine, Frank

    2018-02-23

    The new IEEE 11073 service-oriented device connectivity (SDC) standard proposals for networked point-of-care and surgical devices constitute the basis for improved interoperability due to their vendor independence. To accelerate the distribution of the standard, a reference implementation is indispensable. However, the implementation of such a framework has to overcome several non-trivial challenges. First, the high level of complexity of the underlying standard must be reflected in the software design. Second, an efficient implementation has to consider the limited resources of the underlying hardware. Moreover, the framework's purpose of realizing a distributed system demands a high degree of reliability of the framework itself and its internal mechanisms. Additionally, a framework must provide an easy-to-use and fail-safe application programming interface (API). In this work, we address these challenges by discussing suitable software engineering principles and practical coding guidelines. A descriptive model is developed that identifies key strategies. General feasibility is shown by outlining environments in which our implementation has been utilized.

  16. Strategies of Assessing and Implementing Quality Assurance in ...

    African Journals Online (AJOL)

    The main strategy used for the implementation of quality assurance was the integration of the library services' quality assurance agenda into the university structures, and the quality assurance mechanisms available and used in Nigerian university libraries were programme accreditation and benchmarking of library systems.

  17. Internal Quality Assurance System and Its Implementation in Kaunas College

    Science.gov (United States)

    Misiunas, Mindaugas

    2007-01-01

    The article discusses the internal system of quality assurance and its implementation methods in Kaunas College. The issues of quality assurance are reviewed in the context of the European higher education area covering the three levels: European, national and institutional. The importance of quality assurance and its links with external…

  18. The Role and Quality of Software Safety in the NASA Constellation Program

    Science.gov (United States)

    Layman, Lucas; Basili, Victor R.; Zelkowitz, Marvin V.

    2010-01-01

    In this study, we examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Obtaining an accurate, program-wide picture of software safety risk is difficult across multiple, independently-developing systems. We leverage one source of safety information, hazard analysis, to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. The goal of this research is two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to quantify the level of risk presented by software in the hazard analysis. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. To quantify the importance of software, we collected metrics based on the number of software-related causes and controls of hazardous conditions. To quantify the level of risk presented by software, we created a metric scheme to measure the specificity of these software causes. We found that from 49-70% of hazardous conditions in the three systems could be caused by software or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. Furthermore, 10-12% of all controls were software-based. There is potential for inaccuracy in these counts, however, as software causes are not consistently scoped, and the presence of software in a cause or control is not always clear. The application of our software specificity metrics also identified risks in the hazard reporting process. In particular, we found that a number of traceability risks in the hazard reports may impede verification of software and system safety.

  19. DISEÑO DEL PROCESO DE IMPLEMENTACIÓN DE SOFTWARE EN DESOFT HABANA / DESIGN OF THE SOFTWARE IMPLEMENTATION PROCESS AT DESOFT HAVANA

    Directory of Open Access Journals (Sweden)

    Alina Isasi-Genix

    2012-01-01

    Full Text Available In Cuba, the process of computerization of society has led to increased use of software tools for management and direction in government bodies, public administration and enterprises. The DESOFT company plays a leading role in this regard, implementing several products from the company's portfolio for clients in all of the country's municipalities, which makes the implementation service one of the most important for this entity. Nowadays, the DESOFT Havana division has a methodology for the implementation of software tools; however, the service is not managed efficiently, so it is necessary to design an implementation process adapted to the actual conditions of the projects in the province. The process approach was used for the design, which makes it possible to provide an excellent service and to guarantee the quality of the processes.

  20. Implementing quality initiatives in healthcare organizations: drivers and challenges.

    Science.gov (United States)

    Abdallah, Abdallah

    2014-01-01

    Various quality initiatives seem to be implemented successfully in some healthcare organizations yet fail in others. This paper sets out to study the literature in order to understand the drivers and challenges facing the implementation of quality initiatives in healthcare organizations, and then compares findings from the literature with those of a structured questionnaire answered by 60 representatives from 18 hospitals. Finally, it proposes a framework that mitigates challenges and utilizes drivers to ensure the best implementation results. Literature regarding the implementation of various quality initiatives in the healthcare sector was reviewed. Representatives from several healthcare organizations were surveyed. Results from both approaches are compared to highlight the key challenges and drivers facing implementers. This research reveals that internal factors related to leadership and employees greatly affect quality initiative success or failure. Design and relevance also play a major role in successful implementation. Practical implications: this research helps healthcare professionals achieve greater success when implementing quality initiatives by taking success/failure factors into consideration. A general framework for successful implementation in the healthcare sector is provided. This article uncovers reasons behind success or failure in a comprehensive and practical way. It also explores how the most popular quality initiatives are applied in hospitals.

  1. Implementing Kanban for agile process management within the ALMA Software Operations Group

    Science.gov (United States)

    Reveco, Johnny; Mora, Matias; Shen, Tzu-Chiang; Soto, Ruben; Sepulveda, Jorge; Ibsen, Jorge

    2014-07-01

    After the inauguration of the Atacama Large Millimeter/submillimeter Array (ALMA), the Software Operations Group in Chile has refocused its objectives to: (1) providing software support to tasks related to System Integration, Scientific Commissioning and Verification, as well as Early Science observations; (2) testing the remaining software features, still under development by the Integrated Computing Team across the world; and (3) designing and developing processes to optimize and increase the level of automation of operational tasks. Due to their different stakeholders, each of these tasks presents a wide diversity in importance, lifespan and complexity. Aiming to provide the proper priority and traceability for every task without stressing our engineers, we introduced the Kanban methodology in our processes in order to balance the demand on the team against the throughput of the delivered work. The aim of this paper is to share experiences gained during the implementation of Kanban in our processes, describing the difficulties we have found, solutions and adaptations that led us to our current but still evolving implementation, which has greatly improved our throughput, prioritization and problem traceability.

  2. Design and implementation air quality monitoring robot

    Science.gov (United States)

    Chen, Yuanhua; Li, Jie; Qi, Chunxue

    2017-01-01

    Robots applied in environmental protection can break through the limitations in working environment, scope and mode of existing environmental monitoring and pollution abatement equipment, bringing innovation and improvement to basin, atmosphere, emergency and pollution treatment facilities. At present, however, the relevant technology lags behind, with limited research and investment. Although device companies have achieved some results in water quality monitoring, pipeline monitoring and sewage disposal, overall technological progress is still slow and no mature product has yet been formed. As a result, the market demands a new type of device, better suited to environmental protection, built on robots that have been applied successfully in other fields. This paper designs and realizes a tracked mobile robot for air quality monitoring, which can be used to monitor air quality after pollution accidents in industrial parks and for routine management.

  3. Implementation of the qualities of radiodiagnostic: mammography

    Science.gov (United States)

    Pacífico, L. C.; Magalhães, L. A. G.; Peixoto, J. G. P.; Fernandes, E.

    2018-03-01

    The objective of the present study was to evaluate the expanded uncertainty of the mammographic calibration process and to present the result of the internal audit performed at the Laboratory of Radiological Sciences (LCR). The mammographic beam qualities used as references at the LCR comprise two irradiation conditions: non-attenuated beam and attenuated beam. Both had satisfactory results, with an expanded uncertainty equal to 2.1%. The internal audit was performed and the degree of conformity with ISO/IEC 17025 was evaluated; the result was satisfactory. We conclude that the LCR can perform calibrations in mammography qualities for end users.

  4. A SaTScan™ macro accessory for cartography (SMAC) package implemented with SAS® software

    Directory of Open Access Journals (Sweden)

    Kleinman Ken P

    2007-03-01

    Full Text Available Abstract Background: SaTScan is a software program written to implement the scan statistic; it can be used to find clusters in space and/or time. It must often be run multiple times per day when doing disease surveillance. Running SaTScan frequently via its graphical user interface can be cumbersome, and the output can be difficult to visualize. Results: The SaTScan Macro Accessory for Cartography (SMAC) package consists of four SAS macros and was designed as an easier way to run SaTScan multiple times and add graphical output. The package contains individual macros which allow the user to make the necessary input files for SaTScan, run SaTScan, and create graphical output all from within SAS software. The macros can also be combined to do this all in one step. Conclusion: The SMAC package can make SaTScan easier to use and can make the output more informative.

  5. A SaTScan macro accessory for cartography (SMAC) package implemented with SAS software.

    Science.gov (United States)

    Abrams, Allyson M; Kleinman, Ken P

    2007-03-06

    SaTScan is a software program written to implement the scan statistic; it can be used to find clusters in space and/or time. It must often be run multiple times per day when doing disease surveillance. Running SaTScan frequently via its graphical user interface can be cumbersome, and the output can be difficult to visualize. The SaTScan Macro Accessory for Cartography (SMAC) package consists of four SAS macros and was designed as an easier way to run SaTScan multiple times and add graphical output. The package contains individual macros which allow the user to make the necessary input files for SaTScan, run SaTScan, and create graphical output all from within SAS software. The macros can also be combined to do this all in one step. The SMAC package can make SaTScan easier to use and can make the output more informative.

  6. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.
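
    As a conceptual sketch of why contingency analysis parallelizes so well (the paper's own implementation relies on dedicated high performance computing techniques rather than the toy approach shown here), each outage case can be solved independently and mapped onto separate cores; `solve_contingency` is a hypothetical placeholder for the real power flow solver.

    ```python
    from multiprocessing import Pool

    def solve_contingency(outage_id):
        """Hypothetical placeholder: drop one element, re-solve the power flow,
        and return any thermal or voltage limit violations found."""
        violations = []  # e.g. violations = run_power_flow(grid_without(outage_id))
        return outage_id, violations

    def screen_contingencies(outage_ids, workers=8):
        # Each N-1 case is independent, so cases map cleanly onto separate cores.
        with Pool(processes=workers) as pool:
            return dict(pool.map(solve_contingency, outage_ids))

    if __name__ == "__main__":
        results = screen_contingencies(range(1000))
        print(len(results), "contingencies screened")
    ```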

  7. Advanced quality prediction model for software architectural knowledge sharing

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris; Tang, Antony; Xu, Lai

    In the field of software architecture, a paradigm shift is occurring from describing the outcome of architecting process to describing the Architectural Knowledge (AK) created and used during architecting. Many AK models have been defined to represent domain concepts and their relationships, and

  8. An empirical study of software architectures' effect on product quality

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Jonasson, Kristjan; Neukirchen, Helmut

    2011-01-01

    Software architectures shift the focus of developers from lines-of-code to coarser-grained components and their interconnection structure. Unlike fine-grained objects, these components typically encompass business functionality and need to be aware of the underlying business processes. Hence...

  9. Dealing with Imprecise Quality Factors in Software Design

    NARCIS (Netherlands)

    Noppen, J.A.R.; van den Broek, P.M.; Aksit, Mehmet; Boehm, B.; Chulani, S.; Verner, J.; Wong, B.

    2005-01-01

    During the design of a software system, impreciseness can manifest itself in, for instance, the requirements or performance estimations. While it is common to eliminate the impreciseness with information that cannot be justified, it is better to model the impreciseness since it is the most accurate

  10. ThermoData Engine (TDE) software implementation of the dynamic data evaluation concept. 7. Ternary mixtures.

    Science.gov (United States)

    Diky, Vladimir; Chirico, Robert D; Muzny, Chris D; Kazakov, Andrei F; Kroenlein, Kenneth; Magee, Joseph W; Abdulagatov, Ilmutdin; Kang, Jeong Won; Frenkel, Michael

    2012-01-23

    ThermoData Engine (TDE) is the first full-scale software implementation of the dynamic data evaluation concept, as reported in this journal. The present paper describes the first application of this concept to the evaluation of thermophysical properties for ternary chemical systems. The method involves construction of Redlich-Kister type equations for individual properties (excess volume, thermal conductivity, viscosity, surface tension, and excess enthalpy) and activity coefficient models for phase equilibrium properties (vapor-liquid and liquid-liquid equilibrium). Constructed ternary models are based on those for the three pure component and three binary subsystems evaluated on demand through the TDE software algorithms. All models are described in detail, and extensions to the class structure of the program are provided. Reliable evaluation of properties for the binary subsystems is essential for successful property evaluations for ternary systems, and algorithms are described to aid appropriate parameter selection and fitting for the implemented activity coefficient models (NRTL, Wilson, Van Laar, Redlich-Kister, and UNIQUAC). Two activity coefficient models based on group contributions (original UNIFAC and NIST-KT-UNIFAC) are also implemented. Novel features of the user interface are shown, and directions for future enhancements are outlined.
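
    As a rough illustration of the kind of model the abstract refers to (the exact parameterization and parameter selection used by TDE are described in the paper itself), a binary excess property is commonly expanded in Redlich-Kister form and the ternary value assembled from the binary contributions plus an optional ternary correction term:

    ```latex
    X^{E}_{ij} = x_i x_j \sum_{k=0}^{m} A^{(ij)}_k \,(x_i - x_j)^k ,
    \qquad
    X^{E}_{123} = X^{E}_{12} + X^{E}_{13} + X^{E}_{23} + x_1 x_2 x_3\, C_{123}
    ```

    where the A and C coefficients are fitted to the evaluated binary and ternary data, respectively.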

  11. Implementation of audio computer-assisted interviewing software in HIV/AIDS research.

    Science.gov (United States)

    Pluhar, Erika; McDonnell Holstad, Marcia; Yeager, Katherine A; Denzmore-Nwagbara, Pamela; Corkran, Carol; Fielder, Bridget; McCarty, Frances; Diiorio, Colleen

    2007-01-01

    Computer-assisted interviewing (CAI) has begun to play a more prominent role in HIV/AIDS prevention research. Despite the increased popularity of CAI, particularly audio computer-assisted self-interviewing (ACASI), some research teams are still reluctant to implement ACASI technology because of lack of familiarity with the practical issues related to using these software packages. The purpose of this report is to describe the implementation of one particular ACASI software package, the Questionnaire Development System (QDS; Nova Research Company, Bethesda, MD), in several nursing and HIV/AIDS prevention research settings. The authors present acceptability and satisfaction data from two large-scale public health studies in which they have used QDS with diverse populations. They also address issues related to developing and programming a questionnaire; discuss practical strategies related to planning for and implementing ACASI in the field, including selecting equipment, training staff, and collecting and transferring data; and summarize advantages and disadvantages of computer-assisted research methods.

  12. Software Design and Implementation of a Next Generation GNSS Data Logger for Radio Frequency Interference Detection

    Science.gov (United States)

    Rivera, Sean

    The aim of this thesis is to implement a new GNSS interference detection system that extends the functionality of previous systems. The work is evaluated on the effectiveness of data collection in GPS L1 and GPS L2C and covers all software changes required to implement it. Additionally, extending the technology to log GLONASS L1 and GLONASS L2 data is discussed. The thesis is broken up into three sections. First, it describes the current state of GPS ASIC systems, using the SiGe receiver. It covers the reasons that this ASIC requires modernization as well as the behavior the system seeks to emulate. Next, it describes the new hardware that forms the base of the new system, what each component has been added for, and its differences from the previous version. The focus is on the new ability of the receiver to receive L2C simultaneously with L1, as well as the increased data rate, which allows for better positioning. Finally, the thesis describes the new software development effort centered around the new system, including the three test modes used: continuous logging, triggered logging and periodic logging. Extra attention is paid to the implementation of a real-time networked version of the code. The final goal of the thesis is to build a low-cost distributed system that is capable of detecting and localizing GPS and GLONASS interference on both the L1 and L2 bands.
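
    A minimal sketch of the three logging modes mentioned above is given below; `read_block`, `write_block` and the trigger callback are hypothetical placeholders for the receiver's sample source, storage sink and interference detector, and the real logger operates on raw IF samples in hardware-assisted code rather than a Python loop.

    ```python
    import time

    def run_logger(read_block, write_block, mode="continuous",
                   trigger=lambda block: False, period_s=60.0):
        """Toy event loop illustrating continuous, triggered and periodic logging."""
        last_snapshot = -float("inf")
        while True:
            block = read_block()                    # one buffer of raw samples
            now = time.monotonic()
            if mode == "continuous":
                write_block(block)                  # log every buffer
            elif mode == "triggered" and trigger(block):
                write_block(block)                  # log only when interference is suspected
            elif mode == "periodic" and now - last_snapshot >= period_s:
                write_block(block)                  # short snapshot every period_s seconds
                last_snapshot = now
    ```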

  13. Quality factors quantification/assurance for software related to safety in nuclear power plants

    International Nuclear Information System (INIS)

    Nunez McLeod, J.E.; Rivera, S.S.

    1997-01-01

    A quality assurance plan is needed to guarantee software quality. The use of such a plan involves activities that should take place throughout the life cycle and that can be evaluated using so-called quality factors. This is because quality itself cannot be measured, but some of its manifestations can be used for this purpose. In the present work, a methodology to quantify a set of quality factors is proposed for software-based systems to be used in safety-related areas of nuclear power plants. (author) [es

  14. Total Quality Management Implementation: Selected Readings

    Science.gov (United States)

    1989-04-01

    such as Crosby and Juran, estimate that up to 85 percent of quality improvement is under direct control of management and cannot be remedied by the ... has led to a misguided policy of attempting to increase competitiveness by increasing the budget for research. You do not cure constipation by forced ... diagnose and remedy chronic errors; usually they are rewarded for developing new ... just the target value. For example, financial institutions cannot ...

  15. Pre-Hardware Optimization of Spacecraft Image Processing Software Algorithms and Hardware Implementation

    Science.gov (United States)

    Kizhner, Semion; Flatley, Thomas P.; Hestnes, Phyllis; Jentoft-Nilsen, Marit; Petrick, David J.; Day, John H. (Technical Monitor)

    2001-01-01

    Spacecraft telemetry rates have steadily increased over the last decade presenting a problem for real-time processing by ground facilities. This paper proposes a solution to a related problem for the Geostationary Operational Environmental Spacecraft (GOES-8) image processing application. Although large super-computer facilities are the obvious heritage solution, they are very costly, making it imperative to seek a feasible alternative engineering solution at a fraction of the cost. The solution is based on a Personal Computer (PC) platform and synergy of optimized software algorithms and re-configurable computing hardware technologies, such as Field Programmable Gate Arrays (FPGA) and Digital Signal Processing (DSP). It has been shown in [1] and [2] that this configuration can provide superior inexpensive performance for a chosen application on the ground station or on-board a spacecraft. However, since this technology is still maturing, intensive pre-hardware steps are necessary to achieve the benefits of hardware implementation. This paper describes these steps for the GOES-8 application, a software project developed using Interactive Data Language (IDL) (Trademark of Research Systems, Inc.) on a Workstation/UNIX platform. The solution involves converting the application to a PC/Windows/RC platform, selected mainly by the availability of low cost, adaptable high-speed RC hardware. In order for the hybrid system to run, the IDL software was modified to account for platform differences. It was interesting to examine the gains and losses in performance on the new platform, as well as unexpected observations before implementing hardware. After substantial pre-hardware optimization steps, the necessity of hardware implementation for bottleneck code in the PC environment became evident and solvable beginning with the methodology described in [1], [2], and implementing a novel methodology for this specific application [6]. The PC-RC interface bandwidth problem for the

  16. 77 FR 52277 - Approval of Air Quality Implementation Plans; California; South Coast Air Quality Management...

    Science.gov (United States)

    2012-08-29

    ... Quality Implementation Plans; California; South Coast Air Quality Management District; Prevention of... rule. SUMMARY: EPA is proposing approval of a permitting rule submitted for the South Coast Air Quality Management District (District) portion of the California State Implementation Plan (SIP). The State is...

  17. 75 FR 65594 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; Ohio Ambient Air Quality...

    Science.gov (United States)

    2010-10-26

    ... Promulgation of Air Quality Implementation Plans; Ohio; Ohio Ambient Air Quality Standards AGENCY... the Ohio Administrative Code (OAC) relating to the consolidation of Ohio's Ambient Air Quality Standards (AAQS) into Ohio's State Implementation Plan (SIP) under the Clean Air Act. On April 8, 2009, and...

  18. Fiber beam-column element implementation in academic CAD software Matrix 3D

    Directory of Open Access Journals (Sweden)

    Jovanović Đorđe

    2017-01-01

    Full Text Available Theoretical foundations of the beam-column finite element implemented (and tested) within academic CAD software developed at FTN (Department of Civil Engineering) are presented in this paper. The aforementioned FE is a force-based fibre element, divided into a discrete number of monitored sections. Besides material nonlinearity, the finite element is capable of capturing geometrical nonlinearity. Some of the numerical issues involved in performing incremental-iterative solution procedures with these elements are addressed in the paper. Finally, results and a comparison with available data are shown.

  19. Identifying strengths and weaknesses of Quality Management Unit University of Sumatera Utara software using SCAMPI C

    Science.gov (United States)

    Gunawan, D.; Amalia, A.; Rahmat, R. F.; Muchtar, M. A.; Siregar, I.

    2018-02-01

    Identification of the software maturity level is a technique to determine the quality of software. By identifying the software maturity level, the weaknesses of the software can be observed, and the resulting recommendations can serve as a reference for future software maintenance and development. This paper discusses the software Capability Level (CL) with a case study of the Quality Management Unit (Unit Manajemen Mutu) of the University of Sumatera Utara (UMM-USU). This research utilized the Standard CMMI Appraisal Method for Process Improvement class C (SCAMPI C) model with continuous representation. This model focuses on activities for developing quality products and services. The observation covers three process areas: Project Planning (PP), Project Monitoring and Control (PMC), and Requirements Management (REQM). According to the measurement of the software capability level for the UMM-USU software, it turns out that the capability level for the observed process areas is in the range of CL1 to CL2. Project Planning (PP) is the only process area which reaches capability level 2, while PMC and REQM are still at CL1, the performed level. This research reveals several weaknesses of the existing UMM-USU software and therefore proposes several recommendations for UMM-USU to improve the capability level of the observed process areas.

  20. Development of innovative computer software to facilitate the setup and computation of water quality index.

    Science.gov (United States)

    Nabizadeh, Ramin; Valadi Amin, Maryam; Alimohammadi, Mahmood; Naddafi, Kazem; Mahvi, Amir Hossein; Yousefzadeh, Samira

    2013-04-26

    Developing a water quality index, which is used to convert a water quality dataset into a single number, is the most important task of most water quality monitoring programmes. As the setup of a water quality index depends on different local constraints, it is not feasible to introduce a single definitive water quality index to reveal the water quality level. In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to facilitate calculation of a water quality index based on dynamic weight factors, which will help users to compute the water quality index in cases where some parameters are missing from the datasets. A dataset containing 735 water samples of drinking water quality in different parts of the country was used to show the performance of this software using different criteria parameters. The software proved to be an efficient tool to facilitate the setup of water quality indices based on flexible use of variables and water quality databases.
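
    The exact IWQIS formulation is not given in the abstract; the sketch below only illustrates the general idea of dynamic weight factors, i.e. renormalizing the nominal weights over whichever parameters are actually present. The parameter names and weights are made up for the example.

    ```python
    def dynamic_wqi(sub_indices, weights):
        """Weighted-average water quality index with dynamic weight factors.

        sub_indices maps parameter names to 0-100 sub-index values (None when the
        measurement is missing); weights holds each parameter's nominal weight.
        Missing parameters are dropped and the remaining weights renormalized.
        """
        available = {p: v for p, v in sub_indices.items() if v is not None}
        if not available:
            raise ValueError("no parameters measured")
        total_weight = sum(weights[p] for p in available)
        return sum(weights[p] / total_weight * v for p, v in available.items())

    # Example with a missing turbidity measurement (names and weights illustrative).
    print(dynamic_wqi({"pH": 88, "nitrate": 72, "turbidity": None},
                      {"pH": 0.2, "nitrate": 0.3, "turbidity": 0.5}))
    ```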

  1. 76 FR 54800 - International Business Machines (IBM), Software Group Business Unit, Quality Assurance Group, San...

    Science.gov (United States)

    2011-09-02

    ... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-74,554] International Business... workers of International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA... workers of International Business Machines (IBM), Software Group Business Unit, Quality Assurance Group...

  2. A Framework for Evaluating the Software Product Quality of Pregnancy Monitoring Mobile Personal Health Records.

    Science.gov (United States)

    Idri, Ali; Bachiri, Mariam; Fernández-Alemán, José Luis

    2016-03-01

    Stakeholders' needs and expectations are identified by means of software quality requirements, which have an impact on software product quality. In this paper, we present a set of requirements for mobile personal health records (mPHRs) for pregnancy monitoring, which have been extracted from literature and existing mobile apps on the market. We also use the ISO/IEC 25030 standard to suggest the requirements that should be considered during the quality evaluation of these mPHRs. We then go on to design a checklist in which we contrast the mPHRs for pregnancy monitoring requirements with software product quality characteristics and sub-characteristics in order to calculate the impact of these requirements on software product quality, using the ISO/IEC 25010 software product quality standard. The results obtained show that the requirements related to the user's actions and the app's features have the most impact on the external sub-characteristics of the software product quality model. The only sub-characteristic affected by all the requirements is Appropriateness of Functional suitability. The characteristic Operability is affected by 95% of the requirements while the lowest degrees of impact were identified for Compatibility (15%) and Transferability (6%). Lastly, the degrees of the impact of the mPHRs for pregnancy monitoring requirements are discussed in order to provide appropriate recommendations for the developers and stakeholders of mPHRs for pregnancy monitoring.
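
    The percentages quoted above can be thought of as the column sums of a requirements-versus-characteristics checklist divided by the number of requirements; the toy matrix and characteristic names below are hypothetical and only illustrate that calculation, not the authors' actual checklist.

    ```python
    def impact_percentages(matrix, characteristics):
        """Share of requirements (rows) affecting each quality characteristic (column)."""
        n_requirements = len(matrix)
        return {
            name: 100.0 * sum(row[j] for row in matrix) / n_requirements
            for j, name in enumerate(characteristics)
        }

    # Toy 3-requirement checklist against three ISO/IEC 25010 characteristics.
    characteristics = ["Functional suitability", "Operability", "Compatibility"]
    matrix = [
        [True, True, False],
        [True, True, False],
        [True, False, True],
    ]
    print(impact_percentages(matrix, characteristics))
    # -> 100% Functional suitability, ~67% Operability, ~33% Compatibility
    ```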

  3. Maximizing the utilization and impact of medical educational software by designing for local area network (LAN) implementation.

    OpenAIRE

    Stevens, R.; Reber, E.

    1993-01-01

    The design, development and implementation of medical education software often occur without sufficient consideration of the potential benefits that can be realized by making the software network aware. These benefits can be considerable and can greatly enhance the utilization and potential impact of the software. This article details how multiple aspects of the IMMEX problem-solving project have benefited from taking maximum advantage of LAN resources.

  4. Analisis Kriteria Pengembangan Penganggaran Elektronik Menggunakan Software Quality Function Deployment (SQFD) dari Sudut Pandang Pengguna [Analysis of Electronic Budgeting Development Criteria Using Software Quality Function Deployment (SQFD) from the User's Perspective]

    Directory of Open Access Journals (Sweden)

    Rahmi Kartika Jati

    2017-08-01

    Full Text Available The rapid development of information systems has become a trigger for developing applications that support organizational operations, and the Indonesian government is no exception. As a large organization, the Indonesian government sees the need to improve services to the public, which it addresses by mandating the implementation of e-government. E-government is one of the tools used to improve bureaucracy and governance. Research Center X (Pusat Penelitian X), as a government institution, carries out continuous improvement to strengthen its organizational governance. Of the several areas targeted for e-government development, the management of Research Center X decided to prioritize improvement of the financial process, in particular the budgeting process. To improve the budgeting process, a study on the development of electronic budgeting is needed. Accordingly, this research aims to develop electronic budgeting at Research Center X. The development is carried out using Software Quality Function Deployment (SQFD), an extension of the traditional Quality Function Deployment (QFD) commonly used in manufacturing product development. The study finds that, in developing electronic budgeting at Research Center X, programmers need to pay attention to eight priority criteria that users consider necessary in electronic budgeting. These eight priority criteria can be used as the basis for system testing to finalize the development of the electronic budgeting system.

  5. Determinants of quality management systems implementation in hospitals.

    Science.gov (United States)

    Wardhani, Viera; Utarini, Adi; van Dijk, Jitse Pieter; Post, Doeke; Groothoff, Johan Willem

    2009-03-01

    To identify the problems and facilitating factors in the implementation of quality management systems (QMS) in hospitals through a systematic review. A search strategy was performed on the Medline database for articles written in English published between 1992 and early 2006. Using the thesaurus terms 'Total Quality Management' and 'Quality Assurance Health Care', combined with the term 'hospital' and 'implement*', we identified 533 publications. The screening process was based on empirical articles describing organization-wide QMS implementation. Fourteen empirical articles fulfilled the inclusion criteria and were reviewed in this paper. An organization culture emphasizing standards and values associated with affiliation, teamwork and innovation, assumption of change and risk taking, emerges as the key success factor in QMS implementation. This culture needs to be supported by sufficient technical competence to apply a scientific problem-solving approach. A clear distribution of QMS function within the organizational structure is more important than establishing a formal quality structure. In addition to management leadership, physician involvement also plays an important role in implementing QMS. Six supporting and limiting factors determining QMS implementation are identified in this review. These are the organization culture, design, leadership for quality, physician involvement, quality structure and technical competence.

  6. Report on the working conference on requirements engineering: foundation for software quality (REFSQ'09)

    NARCIS (Netherlands)

    Glinz, Martin; Heymans, Patrick; Persson, Anne; Sindre, Guttorm; Aurum, Aybüke; Madhavji, Nazim; Madhavji, N.; Paech, Barbara; Regev, Gil; Wieringa, Roelf J.

    This report summarizes the presentations and discussions at REFSQ’09, the 15th International Working Conference on Requirements Engineering: Foundation for Software Quality which was held on June 8-9, 2009 in Amsterdam, The Netherlands.

  7. THE PROCEDURE OF IMPLEMENTATION OF ELECTRONIC SCIENTIFIC JOURNAL USING THE OPEN JOURNAL SYSTEMS SOFTWARE PLATFORM

    Directory of Open Access Journals (Sweden)

    Spirin O.

    2017-12-01

    Full Text Available The article deals with the procedure of implementing an electronic scientific periodical using electronic open journal systems (EOJS). The activities at each of its stages (prognostic, organizational, technical, preparatory, practical, generalizing and prospective) are specified. Recommendations on the main aspects of electronic journal implementation are given: the normative and legal basis for its functioning, its type, scientific scope, target audience, economic model, editorial policies and periodicity of publication; the selection of software to support the editorial and publishing process, and the Open Journal Systems (OJS) user roles system; the expected type and format of the journal content, the type of access, including open access, archiving, indexing, and information and analytical monitoring of published scientific works; and the formation of the editorial board and staff, the involvement of ICT professionals, their duties and workload, and the training of users and the technical team.

  8. Exploring Issues of Implementation, Equity, and Student Achievement With Educational Software in the DC Public Schools

    Directory of Open Access Journals (Sweden)

    June Ahn

    2016-08-01

    Full Text Available In this article, we present analyses from a researcher-practitioner partnership in the District of Columbia Public Schools, where we are exploring the impact of educational software on students’ academic achievement. We analyze a unique data set that combines student-level information from the district with data of student usage of a mathematics game platform: First in Math (FIM. These data offer a window into long-standing issues in the educational technology literature around implementation, equity, and student achievement. We show that time spent in FIM was correlated with improved future performance on standardized math assessments for students in Grades 4–8. However, student time spent using FIM was highly related to factors such as race, gender, and prior achievement. Such observations from data are helpful for school districts and researchers to inform equitable implementation of new technologies and maximize benefits to learners.

  9. Implementing Total Quality Management in a University Setting.

    Science.gov (United States)

    Coate, L. Edwin

    1991-01-01

    Oregon State University implemented Total Quality Management in nine phases: exploration; establishing a pilot study team; defining customer needs; adopting the breakthrough planning process; performing breakthrough planning in divisions; forming daily management teams; initiating cross-functional pilot projects; implementing cross-functional…

  10. Implementation of quality management and quality measurements in education

    DEFF Research Database (Denmark)

    Madsen, Ole Nørgaard; Carlsson, Ricci

    A review of a programme for implementing quality management at an educational institution. In 7 phases, an approach is presented that is based on the Danish/European self-assessment model as the foundation for describing the quality of the organisation, prioritising the quality effort, and planning of quali...

  11. Implementing an integrative multi-agent clinical decision support system with open source software.

    Science.gov (United States)

    Sayyad Shirabad, Jelber; Wilk, Szymon; Michalowski, Wojtek; Farion, Ken

    2012-02-01

    Clinical decision making is a complex multi-stage process. Decision support can play an important role at each stage of this process. At present, the majority of clinical decision support systems have been focused on supporting only certain stages. In this paper we present the design and implementation of MET3-a prototype multi-agent system providing an integrative decision support that spans over the entire decision making process. The system helps physicians with data collection, diagnosis formulation, treatment planning and finding supporting evidence. MET3 integrates with external hospital information systems via HL7 messages and runs on various computing platforms available at the point of care (e.g., tablet computers, mobile phones). Building MET3 required sophisticated and reliable software technologies. In the past decade the open source software movement has produced mature, stable, industrial strength software systems with a large user base. Therefore, one of the decisions that should be considered before developing or acquiring a decision support system is whether or not one could use open source technologies instead of proprietary ones. We believe MET3 shows that the answer to this question is positive.

  12. A Comparison of Four Software Programs for Implementing Decision Analytic Cost-Effectiveness Models.

    Science.gov (United States)

    Hollman, Chase; Paulden, Mike; Pechlivanoglou, Petros; McCabe, Christopher

    2017-08-01

    The volume and technical complexity of both academic and commercial research using decision analytic modelling has increased rapidly over the last two decades. The range of software programs used for their implementation has also increased, but it remains true that a small number of programs account for the vast majority of cost-effectiveness modelling work. We report a comparison of four software programs: TreeAge Pro, Microsoft Excel, R and MATLAB. Our focus is on software commonly used for building Markov models and decision trees to conduct cohort simulations, given their predominance in the published literature around cost-effectiveness modelling. Our comparison uses three qualitative criteria as proposed by Eddy et al.: "transparency and validation", "learning curve" and "capability". In addition, we introduce the quantitative criterion of processing speed. We also consider the cost of each program to academic users and commercial users. We rank the programs based on each of these criteria. We find that, whilst Microsoft Excel and TreeAge Pro are good programs for educational purposes and for producing the types of analyses typically required by health technology assessment agencies, the efficiency and transparency advantages of programming languages such as MATLAB and R become increasingly valuable when more complex analyses are required.
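
    The cohort simulations compared above follow a common pattern regardless of the program used. The sketch below, written in Python rather than any of the four compared programs, shows a minimal three-state Markov cohort model with discounted costs and QALYs; all transition probabilities, costs, utilities and the time horizon are hypothetical illustration values.

```python
# Minimal sketch of a discrete-time Markov cohort simulation of the kind the
# compared programs implement. States, transition probabilities, costs and
# utilities below are hypothetical illustration values, not from the paper.
import numpy as np

states = ["Well", "Sick", "Dead"]
P = np.array([            # annual transition probability matrix (rows sum to 1)
    [0.85, 0.10, 0.05],
    [0.00, 0.80, 0.20],
    [0.00, 0.00, 1.00],
])
cost = np.array([500.0, 3000.0, 0.0])     # annual cost per state
utility = np.array([0.90, 0.60, 0.0])     # annual QALY weight per state
discount = 0.035                          # annual discount rate

cohort = np.array([1.0, 0.0, 0.0])        # whole cohort starts in "Well"
total_cost = total_qaly = 0.0
for year in range(1, 41):                 # 40-year horizon
    cohort = cohort @ P                   # redistribute the cohort
    df = 1.0 / (1.0 + discount) ** year   # discount factor
    total_cost += df * cohort @ cost
    total_qaly += df * cohort @ utility

print(f"Discounted cost per patient:  {total_cost:,.0f}")
print(f"Discounted QALYs per patient: {total_qaly:.2f}")
```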

  13. Testing process for software products at a quality laboratory

    Directory of Open Access Journals (Sweden)

    Dalila Jústiz-Núñez

    2014-04-01

    Full Text Available In general terms, the quality of software, as of other products, is an element of increasing importance worldwide, and it is strongly linked to the process used to obtain it. This work presents a proposal for a software testing process for a Quality Laboratory integrated into an academic environment. The activities of the main processes and the output artifacts are detailed, as well as the testing levels applied, among other elements of interest. A practical experience of applying the process and the results of several case studies are also presented. The proposal includes the definition of the methodological aspects and the selection of tools to automate the process.

  14. Handbook of software quality assurance techniques applicable to the nuclear industry

    International Nuclear Information System (INIS)

    Bryant, J.L.; Wilburn, N.P.

    1987-08-01

    Pacific Northwest Laboratory is conducting a research project to recommend good engineering practices in the application of 10 CFR 50, Appendix B requirements to assure quality in the development and use of computer software for the design and operation of nuclear power plants for NRC and industry. This handbook defines the content of a software quality assurance program by enumerating the techniques applicable. Definitions, descriptions, and references where further information may be obtained are provided for each topic

  15. The Impact of Organization, Project and Governance Variables on Software Quality and Project Success

    OpenAIRE

    Abbas, Noura; Gravell, Andy; Wills, Gary

    2010-01-01

    In this paper we present statistically tested evidence about how quality and success rate are correlated with variables reflecting the organization and aspects of its project's governance, namely retrospectives and metrics. The results presented in this paper are based on the Agile Projects Governance Survey, which collected 129 responses. This paper discusses an in-depth analysis of this survey, and the main findings suggest that when applying agile software development, the quality of software i...

  16. Handbook of software quality assurance techniques applicable to the nuclear industry

    Energy Technology Data Exchange (ETDEWEB)

    Bryant, J.L.; Wilburn, N.P.

    1987-08-01

    Pacific Northwest Laboratory is conducting a research project to recommend good engineering practices in the application of 10 CFR 50, Appendix B requirements to assure quality in the development and use of computer software for the design and operation of nuclear power plants for NRC and industry. This handbook defines the content of a software quality assurance program by enumerating the techniques applicable. Definitions, descriptions, and references where further information may be obtained are provided for each topic.

  17. Value of Investment as a Key Driver for Prioritization and Implementation of Healthcare Software.

    Science.gov (United States)

    Bata, Seth A; Richardson, Terry

    2018-01-01

    Health systems across the nation are recovering from massive financial and resource investments in electronic health record applications. In the midst of these recovery efforts, implementations of new care models, including accountable care organizations and population health initiatives, are underway. The shift from fee-for-service to fee-for-outcomes and fee-for-value payment models calls for care providers to work in new ways. It also changes how physicians are compensated and reimbursed. These changes necessitate that healthcare systems further invest in information technology solutions. Selecting which information technology (IT) projects are of most value is vital, especially in light of recent expenditures. Return-on-investment analysis is a powerful tool used in various industries to select the most appropriate IT investments. It has proven vital in selecting, justifying, and implementing software projects. Other financial metrics, such as net present value, economic value added, and total economic impact, also quantify the success of expenditures on information systems. This paper extends the concept of quantifying project value to include clinical outcomes and nonfinancial value as investment returns, applying a systematic approach to healthcare software projects. We term this inclusive approach Value of Investment. It offers a necessary extension for application in clinical settings where a strictly financial view may fall short in providing a complete picture of important benefits. This paper outlines the Value of Investment process and its attributes, and uses illustrative examples to explore the efficacy of this methodology within a midsized health system.

  18. Global Crisis as Enterprise Software Motivator: from Lifecycle Optimization to Efficient Implementation Series

    Directory of Open Access Journals (Sweden)

    Sergey V. Zykov

    2012-04-01

    Full Text Available It is generally known that the software system development lifecycle (SSDL) should be managed adequately. The global economic crisis and subsequent depression have taught us certain lessons on the subject, which is so vital for enterprises. The paper presents an adaptive methodology for the enterprise SSDL, which makes it possible to avoid "local crises" while producing large-scale software. The methodology is based on extracting common ERP module-level patterns and applying them to series of heterogeneous implementations. The approach includes a lifecycle model, which extends the conventional spiral model with formal data representation/management models and DSL-based "low-level" CASE tools supporting the formalisms. The methodology has been successfully implemented as a series of portal-based ERP systems in the ITERA oil-and-gas corporation, and in a number of trading/banking enterprise applications for other enterprises. A semantic network-based airline dispatch system and a 6D-model-driven nuclear power plant construction support system are currently in progress. Combining various SSDL models is discussed, schedule- and cost-reduction factors are examined, and adjusting the SSDL according to project size and scope is overviewed. The so-called "human factor errors" resulting from a non-systematic SSDL approach, and their influence on crisis and depression, are analyzed. The ways to a systematic and efficient SSDL are outlined, and troubleshooting advice is given for the problems concerned.

  19. External Quality Metrics for Object-Oriented Software: A Systematic Literature Review

    Directory of Open Access Journals (Sweden)

    Danilo Santos

    2017-12-01

    Full Text Available Software quality metrics can be categorized into internal quality, external quality, and quality-in-use metrics. Although a close relationship exists between internal and external software quality, there is no explicit evidence in the literature that the attributes and metrics of internal quality impact external quality. Knowing this is essential for choosing which metric to use according to the software characteristic one wants to improve. Hence, we carried out a systematic literature review to identify this relationship. After analyzing 664 papers, 12 papers were studied in depth. As a result, we found 65 metrics related to maintainability, usability, reliability, and other quality characteristics, as well as the main attributes that impact external metrics (size, coupling, and cohesion). We then filtered the metrics that have clear definitions, are appropriately related to the characteristic they purport to measure, and do not use subjective attributes in their computation. These metrics are more robust and reliable for evaluating software characteristics and are therefore better suited for use in practice by professionals working in the software market.
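
    To make the internal-attribute side of the discussion concrete, the sketch below counts efferent (outgoing) import coupling per module in a Python code base, a crude proxy for the coupling attribute mentioned above. It is not one of the 65 metrics catalogued by the review, and the "src" directory is a hypothetical example path.

```python
# Rough sketch of measuring one internal attribute (import coupling) often
# linked to external quality. This is a crude proxy, not one of the metrics
# catalogued in the paper; the "src" directory path is hypothetical.
import ast
from pathlib import Path

def efferent_coupling(project_dir: str) -> dict:
    """Count distinct top-level modules imported by each .py file."""
    result = {}
    for path in Path(project_dir).rglob("*.py"):
        tree = ast.parse(path.read_text(encoding="utf-8"), filename=str(path))
        imports = set()
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                imports.update(alias.name.split(".")[0] for alias in node.names)
            elif isinstance(node, ast.ImportFrom) and node.module:
                imports.add(node.module.split(".")[0])
        result[str(path)] = len(imports)
    return result

if __name__ == "__main__":
    # Modules with the highest efferent coupling are listed first.
    for module, ce in sorted(efferent_coupling("src").items(), key=lambda kv: -kv[1]):
        print(f"{ce:3d}  {module}")
```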

  20. New tools for digital medical image processing implemented in DIP software

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Erica A.C.; Santana, Ivan E. [Instituto Federal de Educacao, Ciencia e Tecnologia de Pernambuco, Recife, PE (Brazil); Lima, Fernando R.A., E-mail: falima@cnen.gov.b [Centro Regional de Ciencias Nucleares, (CRCN/NE-CNEN-PE), Recife, PE (Brazil); Viera, Jose W. [Escola Politecnica de Pernambuco, Recife, PE (Brazil)

    2011-07-01

    The anthropomorphic models used in computational dosimetry, also called phantoms, are mostly built from stacks of CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) images obtained from scans of patients or volunteers. The construction of voxel phantoms requires computational processing for transforming image formats, combining two-dimensional (2D) images into three-dimensional (3D) arrays, quantization, resampling, enhancement, restoration and image segmentation, among other tasks. Researchers in computational dosimetry rarely find all of these capabilities in a single software package, which often slows the development of their research or leads to the inadequate use of alternative tools. The need to integrate the various digital image processing tasks required to obtain an image that can be used in a computational exposure model led to the development of the DIP (Digital Image Processing) software. This software reads, writes and edits binary files containing the 3D matrix corresponding to a stack of cross-sectional images of a given geometry, which can be a human body or another volume of interest. It can also read any type of computer image and perform conversions. When the task involves only one output image, it is saved in the standard Windows JPEG format. When it involves a stack of images, the binary output file is called SGI (Interactive Graphic Simulations, a designation already used in other publications of the Research Group in Numerical Dosimetry). This paper presents the third version of the DIP software and emphasizes the new tools it implements. It currently has the menus Basics, Views, Spatial Domain, Frequency Domain, Segmentations and Study. Each menu contains items and subitems with features that generally take an image as input and produce an image or an attribute as output. (author)

  1. New tools for digital medical image processing implemented in DIP software

    International Nuclear Information System (INIS)

    Araujo, Erica A.C.; Santana, Ivan E.; Lima, Fernando R.A.; Viera, Jose W.

    2011-01-01

    The anthropomorphic models used in computational dosimetry, also called phantoms, are mostly built from stacks of CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) images obtained from scans of patients or volunteers. The construction of voxel phantoms requires computational processing for transforming image formats, combining two-dimensional (2D) images into three-dimensional (3D) arrays, quantization, resampling, enhancement, restoration and image segmentation, among other tasks. Researchers in computational dosimetry rarely find all of these capabilities in a single software package, which often slows the development of their research or leads to the inadequate use of alternative tools. The need to integrate the various digital image processing tasks required to obtain an image that can be used in a computational exposure model led to the development of the DIP (Digital Image Processing) software. This software reads, writes and edits binary files containing the 3D matrix corresponding to a stack of cross-sectional images of a given geometry, which can be a human body or another volume of interest. It can also read any type of computer image and perform conversions. When the task involves only one output image, it is saved in the standard Windows JPEG format. When it involves a stack of images, the binary output file is called SGI (Interactive Graphic Simulations, a designation already used in other publications of the Research Group in Numerical Dosimetry). This paper presents the third version of the DIP software and emphasizes the new tools it implements. It currently has the menus Basics, Views, Spatial Domain, Frequency Domain, Segmentations and Study. Each menu contains items and subitems with features that generally take an image as input and produce an image or an attribute as output. (author)
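
    The core data structure both records describe is a 3D voxel matrix assembled from 2D slices. The sketch below, which assumes the NumPy and Pillow libraries and hypothetical file names, illustrates that kind of operation in Python; it is not the actual SGI format or code used by DIP.

```python
# Sketch of the kind of operation DIP performs: stacking 2D slices into a 3D
# voxel matrix and writing it as a raw binary file. The directory, file names
# and header layout are hypothetical; this is not DIP's actual SGI format.
import numpy as np
from PIL import Image
from pathlib import Path

def stack_slices(slice_dir: str) -> np.ndarray:
    """Read grayscale slice images in name order and stack them into a 3D array."""
    files = sorted(Path(slice_dir).glob("*.png"))
    slices = [np.asarray(Image.open(f).convert("L"), dtype=np.uint8) for f in files]
    return np.stack(slices, axis=0)          # shape: (n_slices, rows, cols)

def write_raw(volume: np.ndarray, out_path: str) -> None:
    """Write a minimal header (three int32 dimensions) followed by the voxels."""
    with open(out_path, "wb") as fh:
        np.asarray(volume.shape, dtype=np.int32).tofile(fh)
        volume.tofile(fh)

if __name__ == "__main__":
    volume = stack_slices("ct_slices")        # hypothetical directory of slices
    write_raw(volume, "phantom.raw")
    print("voxel matrix shape:", volume.shape)
```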

  2. Quality assurance strategies in hospitals: Development, implementation and impact of quality assurance methods in Iranian hospitals

    NARCIS (Netherlands)

    Aghaei Hashjin, A.

    2015-01-01

    This thesis concentrates on the subject of quality assurance strategies in hospitals; exploring the development, implementation and impact of quality assurance (QA) methods in Iranian hospitals. A series of descriptive and analytical studies using qualitative and quantitative data were performed.

  3. A Case of Engineering Quality for Mobile Healthcare Applications Using Augmented Personal Software Process Improvement

    Directory of Open Access Journals (Sweden)

    Shahbaz Ahmed Khan Ghayyur

    2016-01-01

    Full Text Available Mobile healthcare systems are currently considered a key research area in the domain of software engineering. The adoption of modern technologies for mobile healthcare systems is a quick option for industry professionals. Software architecture is a key feature that contributes towards a software product, solution, or service. Software architecture helps in better communication, documentation of design decisions and risk identification, provides a basis for reusability, scalability and scheduling, reduces maintenance cost and, lastly, helps to avoid software failures. Hence, in order to solve the abovementioned issues in mobile healthcare, software architecture is integrated with the personal software process. The personal software process has been applied successfully, but it is unable to address issues related to architectural design and evaluation capabilities. Hence, a new technique, the architecture-augmented personal process, is presented in order to enhance the quality of mobile healthcare systems through the use of architectural design integrated with the personal software process. The proposed process was validated by case studies. It was found that the proposed process helped in reducing overall costs and effort. Moreover, an improved architectural design helped in the development of high-quality mobile healthcare systems.

  4. Quality Assurance of Software Used In Aircraft Or Related Products

    Science.gov (United States)

    1993-02-01

    This advisory circular (AC) provides an acceptable means, but not the only means, to show compliance with the quality assurance requirements of Federal Aviation Regulations (FAR) Part 21, Certification Procedures for Products and Parts, as applicable...

  5. [Quality management in diabetology by the PC software program DIQUAL].

    Science.gov (United States)

    Köhler, S; Use, G; Schumann, M; Müller, U A

    2000-06-01

    Today, effective therapies are available for patients with diabetes mellitus. However, these therapeutic strategies are often not applied, or are applied incorrectly. Given these discrepancies, it is mandatory for health care providers to document the efficacy of diabetes treatment. However, it is hard to prove the outcome of diabetes care because of unsuitable documentation, missing parameters and different definitions of quality indicators. The computer programme DIQUAL was developed to improve diabetes management on the diabetes ward and in the diabetes out-patient department, and to document outcome quality. DIQUAL is a patient database offering structured data collection and administration, text processing, referral letters, application and list functions, internal and external quality control, cohort and cross-sectional analysis, a data export function for nationwide data collection, system-internal plausibility checks and extension modules for scientific studies. On the basis of DIQUAL, the first nationwide comparison of outcome quality in the routine treatment of type 1 diabetes became possible. In 1998, we analysed the original data of 1789 patients from 32 district and university hospitals on the basis of the therapeutic goals. HbA1c decreased by 1.8% and the incidence of severe hypoglycaemia was halved. There was a substantial improvement in processes in the health care institutions and in the quality of information transmission to general practitioners. This system of quality measurement and improvement is also suitable for other areas of health care and medicine.

  6. Fuzzy system for risk analysis in software projects through the attributes of quality standards iso 25000

    Directory of Open Access Journals (Sweden)

    Chau Sen Shia

    2014-02-01

    Full Text Available With the growth in demand for products and services in the IT area, companies encounter difficulties in establishing a metric or measure of service quality that addresses qualitative values measurably in their planning. In this work, fuzzy logic, the SQuaRE standard (for measuring the quality of software products), the Likert scale, the GQM method (Goal-Question-Metric, an indicator of software quality) and Boehm's project risk analysis model were used to assess the quality of services and to support decision-making, according to the demand and requests for software development. With the aim of improving the quality of the services provided, the application is used to integrate the team and follow the life cycle of a project from its initial phase, and to assist in the comparison with the proposed schedule during requirements elicitation.
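
    The following sketch illustrates the kind of fuzzy scoring the paper combines with SQuaRE attributes and Likert-scale inputs: triangular membership functions map an attribute's Likert score to low/medium/high risk memberships. The membership functions and score ranges are illustrative assumptions, not the paper's actual rule base.

```python
# Minimal sketch of fuzzy risk scoring over Likert-scale quality inputs.
# Membership shapes and the example attributes are illustrative only.
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def risk_level(likert_score: float) -> dict:
    """Map a 1-5 Likert score for a quality attribute to fuzzy risk memberships."""
    return {
        "low":    tri(likert_score, 3.0, 5.0, 7.0),   # high scores -> low risk
        "medium": tri(likert_score, 1.5, 3.0, 4.5),
        "high":   tri(likert_score, -1.0, 1.0, 3.0),  # low scores -> high risk
    }

if __name__ == "__main__":
    for attribute, score in {"reliability": 2, "usability": 4}.items():  # Likert 1-5
        print(attribute, risk_level(score))
```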

  7. 78 FR 63934 - Approval of Air Quality Implementation Plans; California; El Dorado County Air Quality Management...

    Science.gov (United States)

    2013-10-25

    ...] Approval of Air Quality Implementation Plans; California; El Dorado County Air Quality Management District... California for the El Dorado County Air Quality Management District (EDAQMD) portion of the California SIP... 24, 1987 Federal Register, May 25, 1988, U.S. EPA, Air Quality Management Division, Office of Air...

  8. 77 FR 12482 - Approval and Promulgation of Air Quality Implementation Plans; Indiana; Lead Ambient Air Quality...

    Science.gov (United States)

    2012-03-01

    ... Promulgation of Air Quality Implementation Plans; Indiana; Lead Ambient Air Quality Standards AGENCY... incorporates the National Ambient Air Quality Standards (NAAQS) for Pb promulgated by EPA in 2008. DATES: This... FR 66964) and codified at 40 CFR 50.16, ``National primary and secondary ambient air quality...

  9. 75 FR 65572 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; Ohio Ambient Air Quality...

    Science.gov (United States)

    2010-10-26

    ... Promulgation of Air Quality Implementation Plans; Ohio; Ohio Ambient Air Quality Standards AGENCY... Ohio Administrative Code (OAC) relating to the consolidation of Ohio's Ambient Air Quality Standards... apply to Ohio's SIP. Incorporating the air quality standards into Ohio's SIP helps assure that...

  10. 78 FR 19990 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; Ohio Ambient Air Quality...

    Science.gov (United States)

    2013-04-03

    ... Promulgation of Air Quality Implementation Plans; Ohio; Ohio Ambient Air Quality Standards; Correction AGENCY... approved revisions to Ohio regulations that consolidated air quality standards in a new chapter of rules... State's air quality standards into Ohio Administrative Code (OAC) 3745-25 and modifying an assortment of...

  11. Contribution of computer software to quality assurance process in radiopharmacy service at Grenoble University Hospital

    International Nuclear Information System (INIS)

    Sylvoz, N.; Desruet, M.-D.; Hustacheb, C.; Foroni, L.; Allenet, B.

    2009-01-01

    The optimization of the drug use process chain has become a priority for improving the quality of care and for logistic and economic rationalization. The aim was to assess the impact of computerization, in radiopharmacy and nuclear medicine services, on the safety of the supply chain and on costs. We performed a before-after survey: 50 weeks before and 50 weeks after the introduction of the software. Indicators were selected to assess safety and costs. Regarding the safety of the drug supply chain, computerization led to a significant increase in the rate of nominative drug dispensing satisfying legal requirements and a significant decrease in dispensing delays, especially those due to problems in radiopharmacy organization. Regarding costs, the rate of drugs ordered but not used was significantly lower after computerization. This study made it possible to identify critical control points and to implement corrective actions. From a legal point of view, computerization of drug prescribing and dispensing led to quantitative and qualitative improvements in the quality, safety and traceability of the drug supply chain. Lastly, computerization would also be a potential tool for cost saving. (authors) [fr

  12. Balanced Scorecard software: proposal of an implementation guideline

    Directory of Open Access Journals (Sweden)

    Oswaldo Keiji Hikage

    2006-04-01

    Full Text Available In 1992, the Balanced Scorecard (BSC) concept was presented by Robert Kaplan and David Norton, and as a result of its dissemination, many of the companies that adopted it and went through merger and acquisition processes saw the information in their databases grow considerably. This volume of information and the need to manage strategic indicators efficiently, to make management reports quickly available, and to analyse and simulate scenarios led companies to look for an automated system. Consequently, some concerns gained importance: how should a BSC software package be selected? How should a BSC software package be implemented? Beyond the difficulties inherent in acquiring any software, the situation specifically addressed in this article has some peculiarities, given the intense interaction between the BSC methodology and the software that supports it. Through a case study carried out in a company in the telecommunications sector, this article focuses on the implementation of BSC software, aiming at the development of a guideline that can systematize the process.

  13. C++ Software Quality in the ATLAS Experiment: Tools and Experience

    CERN Document Server

    Kluth, Stefan; The ATLAS collaboration; Obreshkov, Emil; Roe, Shaun; Seuster, Rolf; Snyder, Scott; Stewart, Graeme

    2016-01-01

    The ATLAS experiment at CERN uses about six million lines of code and currently has about 420 developers whose background is largely from physics. In this paper we explain how the C++ code quality is managed using a range of tools from compile-time through to run time testing and reflect on the great progress made in the last year largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other tools including cppcheck, Include-What-You-Use and run-time 'sanitizers' are also discussed.

  14. C++ software quality in the ATLAS experiment: tools and experience

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00236968; The ATLAS collaboration; Kluth, Stefan; Seuster, Rolf; Snyder, Scott; Obreshkov, Emil; Roe, Shaun; Sherwood, Peter; Stewart, Graeme

    2017-01-01

    In this paper we explain how the C++ code quality is managed in ATLAS using a range of tools from compile-time through to run time testing and reflect on the substantial progress made in the last two years largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing with an example of how the GoogleTest framework can be applied to our codebase.

  15. C++ software quality in the ATLAS experiment: tools and experience

    Science.gov (United States)

    Martin-Haugh, S.; Kluth, S.; Seuster, R.; Snyder, S.; Obreshkov, E.; Roe, S.; Sherwood, P.; Stewart, G. A.

    2017-10-01

    In this paper we explain how the C++ code quality is managed in ATLAS using a range of tools from compile-time through to run time testing and reflect on the substantial progress made in the last two years largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing with an example of how the GoogleTest framework can be applied to our codebase.

  16. THE EXISTING BARRIERS IN IMPLEMENTING TOTAL QUALITY MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Salagean Horatiu Catalin

    2014-07-01

    Full Text Available In the current market economy, companies are constantly struggling to achieve a sustained competitive advantage that will enable them to improve performance, which results in increased competitiveness and, of course, profit. Among the few competitive advantages that can become sustainable competitive advantages, quality plays a crucial role. Recent research shows that about 90% of buyers in the international market consider quality as having at least equal importance with price in making the decision to purchase. In the opinion of some specialists in economic theory and practice, total quality refers to the holistic approach to quality, which actually means addressing all economic, social and technical aspects of quality. Thus, the organisation-wide holistic approach to quality involves a procedural approach to quality; in this respect, the study focuses on this type of quality approach, i.e. the procedural approach, taking into account the strategic aspects of the continuous improvement of quality, which in fact means quality management. Total Quality Management is seen as a way to transform the economies of some countries to be more competitive than others. However, Total Quality Management does not and will not produce results overnight; it is not a panacea for all the problems facing the organization. Total Quality Management requires a change in organizational culture, which must focus on meeting customer expectations and increasing the involvement of all employees in meeting this objective, as an expression of the ethics of continuous improvement. In general, research on quality aims to identify why an organization should adopt the principles of total quality management, while attempts to identify why companies fail in implementing total quality management principles are less visible. Companies' concerns with introducing quality management systems are becoming more pronounced; therefore, in this study we try to

  17. Implementation of a Quality Management System in regulatory inspection activities

    International Nuclear Information System (INIS)

    Pires do Rio, Monica; Ferreira, Paulo Roberto; Cunha, Paulo G. da; Acar, Maria Elizabeth

    2005-01-01

    The Institute for Radioprotection and Dosimetry - IRD - of the Brazilian National Nuclear Energy Commission, CNEN, started in 2001 the implementation of a quality management system (SGQ) in its inspection, testing and calibration activities. The SGQ was an institutional guideline and is part of a broader IRD management system started in 1999, with the adoption of the National Quality Award criteria - PNQ, within the Project for Excellence in Technological Research of Associacao Brasileira das Instituicoes de Pesquisas Tecnologicas - ABIPTI (Brazilian Association of Technological Research Institutions). The quality management system proposed and adopted at the IRD was developed and implemented in accordance with the requirements of NBR ISO/IEC 17025 - General requirements for the competence of testing and calibration laboratories, and ISO/IEC 17020 - General criteria for the operation of various types of bodies performing inspections. For regulatory inspection activities, the quality system was implemented in three radiological protection inspection programs covering, respectively, clinics and hospitals that operate radiotherapy services; industries that use nuclear gauges in their control or production processes; and power reactor operators (CNAAA) - the environmental part only. A pioneering team of inspectors was formed to standardize the processes and procedures and to start the implementation of the system in these areas. This work describes the steps of the implementation process, including difficulties, lessons learned and advantages of adopting a quality management system in inspection activities

  18. Achieving dependable software through Continuous Delivery and Quality Monitoring

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The idea of the presentation is to present our implementation of the CI/CD paradigms and to explain, using real-life examples, the advantages and drawbacks of the current solution. During the presentation we will try to cover all the required steps which should be automatically triggered by a developer's commit. The presentation should give users good hands-on experience of basic CI/CD principles and allow them to design and i...

  19. Design and Implementation of a Mobile Phone Locator Using Software Defined Radio

    National Research Council Canada - National Science Library

    Larsen, Ian P

    2007-01-01

    ...) signal using software defined radio and commodity computer hardware. Using software designed by the GNU free software project as a base, standard GSM packets were transmitted and received over the air, and their arrival times detected...

  20. World Workshop on Oral Medicine VI: Utilization of Oral Medicine-specific software for support of clinical care, research, and education: current status and strategy for broader implementation.

    Science.gov (United States)

    Brailo, Vlaho; Firriolo, Francis John; Tanaka, Takako Imai; Varoni, Elena; Sykes, Rosemary; McCullough, Michael; Hua, Hong; Sklavounou, Alexandra; Jensen, Siri Beier; Lockhart, Peter B; Mattsson, Ulf; Jontell, Mats

    2015-08-01

    To assess the current scope and status of Oral Medicine-specific software (OMSS) utilized to support clinical care, research, and education in Oral Medicine and to propose a strategy for broader implementation of OMSS within the global Oral Medicine community. An invitation letter explaining the objectives was sent to the global Oral Medicine community. Respondents were interviewed to obtain information about different aspects of OMSS functionality. Ten OMSS tools were identified. Four were being used for clinical care, one was being used for research, two were being used for education, and three were multipurpose. Clinical software was being utilized as databases developed to integrate different types of clinical information. Research software was designed to facilitate multicenter research. Educational software represented interactive, case-orientated technology designed for clinical training in Oral Medicine. Easy access to patient data was the most commonly reported advantage. Difficulty of use and poor integration with other software were the most commonly reported disadvantages. The OMSS presented in this paper demonstrate how information technology (IT) can have an impact on the quality of patient care, research, and education in the field of Oral Medicine. A strategy for broader implementation of OMSS is proposed. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Software quality and process improvement in scientific simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, J.; Webster, R. [Los Alamos National Lab., NM (United States)

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. This study is descriptive rather than scientific.

  2. Introductory Molecular Orbital Theory: An Honors General Chemistry Computational Lab as Implemented Using Three-Dimensional Modeling Software

    Science.gov (United States)

    Ruddick, Kristie R.; Parrill, Abby L.; Petersen, Richard L.

    2012-01-01

    In this study, a computational molecular orbital theory experiment was implemented in a first-semester honors general chemistry course. Students used the GAMESS (General Atomic and Molecular Electronic Structure System) quantum mechanical software (as implemented in ChemBio3D) to optimize the geometry for various small molecules. Extended Huckel…

  3. A pioneering application of NQA-1 quality assurance standards in the development of software

    International Nuclear Information System (INIS)

    Weisbin, A.N.

    1988-01-01

    One reason for systematically applying quality assurance to computer software is the extensive use of results from computer programs to characterize potential sites for nuclear waste repositories, leading ultimately to important policy-making decisions. Because data from these programs characterize the likely radioactivity profile for many hundreds of years, experimental validation is not feasible. The Sensitivity and Uncertainty Analysis Methods Development Project (SUAMDP) was developed to formulate and utilize efficient and comprehensive methods for determining sensitivities of calculated results with respect to changes in all input parameters. The computerized methodology was embodied in the Gradient Enhanced Software System (GRESS). Because GRESS was to be used in site characterization for waste storage, stringent NQA-1 requirements were imposed by the sponsor. A working relationship between the Oak Ridge National Laboratory (ORNL) Quality Department and the research scientists developing GRESS was essential in achieving understanding and acceptance of the quality assurance requirements as applied to the SUAMDP. The relationship resulted in the SUAMDP becoming the first software project at ORNL to develop a comprehensive NQA-1 Quality Assurance Plan; this plan now serves as a model for software quality assurance at ORNL. This paper describes the evolution of this plan and its impact on the application of quality assurance procedures to software. 2 refs
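
    GRESS obtains sensitivities by enhancing the program itself to propagate derivatives; the simpler sketch below answers the same question (how sensitive is a calculated result to each input parameter?) with central finite differences. The toy decay model and its parameter values are hypothetical illustrations, not GRESS output.

```python
# Finite-difference sensitivity sketch: not GRESS, but the same question it
# answers. The toy decay model and parameter values are hypothetical.
import math

def sensitivities(model, params, rel_step=1e-4):
    """Return d(result)/d(p) for each parameter p of model(params)."""
    grads = {}
    for name, value in params.items():
        h = rel_step * (abs(value) or 1.0)
        hi = dict(params, **{name: value + h})   # perturbed-up parameter set
        lo = dict(params, **{name: value - h})   # perturbed-down parameter set
        grads[name] = (model(hi) - model(lo)) / (2 * h)
    return grads

def decay_model(p):
    """Toy model: activity remaining after `years`, given a decay constant."""
    return p["initial_activity"] * math.exp(-p["decay_constant"] * p["years"])

if __name__ == "__main__":
    params = {"initial_activity": 1000.0, "decay_constant": 0.01, "years": 300.0}
    print("result:", decay_model(params))
    print("sensitivities:", sensitivities(decay_model, params))
```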

  4. Building better guidelines with BRIDGE-Wiz: development and evaluation of a software assistant to promote clarity, transparency, and implementability

    Science.gov (United States)

    Michel, George; Rosenfeld, Richard M; Davidson, Caryn

    2011-01-01

    Objective To demonstrate the feasibility of capturing the knowledge required to create guideline recommendations in a systematic, structured, manner using a software assistant. Practice guidelines constitute an important modality that can reduce the delivery of inappropriate care and support the introduction of new knowledge into clinical practice. However, many guideline recommendations are vague and underspecified, lack any linkage to supporting evidence or documentation of how they were developed, and prove to be difficult to transform into systems that influence the behavior of care providers. Methods The BRIDGE-Wiz application (Building Recommendations In a Developer's Guideline Editor) uses a wizard approach to address the questions: (1) under what circumstances? (2) who? (3) ought (with what level of obligation?) (4) to do what? (5) to whom? (6) how and why? Controlled natural language was applied to create and populate a template for recommendation statements. Results The application was used by five national panels to develop guidelines. In general, panelists agreed that the software helped to formalize a process for authoring guideline recommendations and deemed the application usable and useful. Discussion Use of BRIDGE-Wiz promotes clarity of recommendations by limiting verb choices, building active voice recommendations, incorporating decidability and executability checks, and limiting Boolean connectors. It enhances transparency by incorporating systematic appraisal of evidence quality, benefits, and harms. BRIDGE-Wiz promotes implementability by providing a pseudocode rule, suggesting deontic modals, and limiting the use of ‘consider’. Conclusion Users found that BRIDGE-Wiz facilitates the development of clear, transparent, and implementable guideline recommendations. PMID:21846779
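
    The six structured questions above lend themselves to simple template filling. The toy sketch below assembles an active-voice recommendation from structured answers, with a controlled verb list and a deontic-modal mapping; the vocabulary and the example values are hypothetical and do not reproduce BRIDGE-Wiz itself.

```python
# Toy sketch of the template-filling idea behind BRIDGE-Wiz: assemble an
# active-voice recommendation from structured answers to the six questions.
# The verb list, modal mapping and example values are hypothetical, not the
# tool's actual controlled vocabulary.
ALLOWED_VERBS = {"prescribe", "order", "document", "counsel", "refer"}
MODALS = {"strong": "should", "weak": "may"}   # deontic modals instead of a vague "consider"

def build_recommendation(circumstance, actor, obligation, verb, action, recipient, rationale):
    """Fill a controlled-natural-language template for one recommendation."""
    if verb not in ALLOWED_VERBS:
        raise ValueError(f"verb '{verb}' is not in the controlled vocabulary")
    return (f"{circumstance}, {actor} {MODALS[obligation]} {verb} {action} "
            f"for {recipient}, because {rationale}.")

if __name__ == "__main__":
    print(build_recommendation(
        circumstance="In adults with uncomplicated acute otitis externa",
        actor="clinicians",
        obligation="strong",
        verb="prescribe",
        action="topical preparations as initial therapy",
        recipient="patients with an intact tympanic membrane",
        rationale="randomized trials show faster symptom resolution",
    ))
```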

  5. 78 FR 16474 - Extension of the Period for Comments on the Enhancement of Quality of Software-Related Patents

    Science.gov (United States)

    2013-03-15

    ... announcing the formation of a partnership with the software community to enhance the quality of software-related patents (Software Partnership), and a request for comments on the preparation of patent... discussion by the Software Partnership; and potential practices that applicants can employ at the drafting...

  6. The impact of software quality characteristics on healthcare outcome: a literature review.

    Science.gov (United States)

    Aghazadeh, Sakineh; Pirnejad, Habibollah; Moradkhani, Alireza; Aliev, Alvosat

    2014-01-01

    The aim of this study was to discover the effect of software quality characteristics on healthcare quality and efficiency indicators. Through a systematic literature review, we selected and analyzed 37 original research papers to investigate the impact of the software indicators (coming from the standard ISO 9126 quality characteristics and sub-characteristics) on some important healthcare outcome indicators, and finally ranked these software indicators. The results showed that the software characteristics usability, reliability and efficiency were mostly favored in the studies, indicating their importance. On the other hand, user satisfaction, quality of patient care, clinical workflow efficiency, providers' communication and information exchange, patient satisfaction and care costs were among the healthcare outcome indicators frequently evaluated in relation to the mentioned software characteristics. Logistic regression was the most common assessment method, and Confirmatory Factor Analysis and Structural Equation Modeling were performed to test the structural model's fit. The software characteristics were considered to impact the healthcare outcome indicators through other intermediate factors (variables).

  7. Design of common software for quality control of SPECT

    International Nuclear Information System (INIS)

    Li Xiaohua; Gao Ruzhen; Chen Shengzu

    1993-01-01

    The goal of this study is to design a common testing system for SPECT quality control according to the NEMA standard. Using the system, the performance of different types of SPECT can be tested, so that acceptance testing, performance comparison and routine quality control for SPECT can be standardized. The system is based on the IBM PC series of microcomputers. Testing data are acquired from various types of SPECT, transferred into the IBM PC through an interface, and evaluated with a single testing program. The system comprises two parts: the interface and the SPECT testing program. Particular attention was given to the management program of the RS232 interface, the design techniques, and the mathematical models of the SPECT testing program. The system, which is composed of 11 subroutines, can be used to measure the performance of both gamma cameras and SPECT. It was tested on an OMEGA 500/MCS 560 SPECT system and the results showed that it is effective, accurate and easy to use

  8. The Software-Based Implementation of the CFAR Reciever for the Processing of the Radar Signal

    Directory of Open Access Journals (Sweden)

    Erik Gemzicky

    2007-01-01

    Full Text Available The performance of a radar receiver is greatly dependent on the presence of noise. The receiver should achieve a constant false alarm rate (CFAR) and the maximum probability of target detection. CFAR receivers are usually used in radar applications for radar signal processing when the background noise statistics are unknown or time-varying, especially when clutter or jamming signals rise above the receiver threshold. CFAR automatically adjusts the threshold to keep the rate of false threshold crossings constant. The paper deals with a software-based implementation of CFAR. The software-based CFAR processing was tested on various types of processors and the processing times were compared. The best results were achieved using the dual-core AMD Athlon64 X2 4800+ using SSE2 instructions for processing.
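
    The threshold logic described above can be summarized in a few lines. The sketch below is a minimal cell-averaging CFAR in Python; it does not reproduce the paper's SSE2-optimized implementation, and the window sizes, scaling factor and synthetic signal are illustrative values.

```python
# Minimal cell-averaging CFAR sketch illustrating the adaptive-threshold logic.
# Window sizes, scaling factor and the synthetic signal are illustrative values.
import numpy as np

def ca_cfar(power, num_train=16, num_guard=4, scale=5.0):
    """Return indices of cells whose power exceeds the locally adapted threshold."""
    detections = []
    half = num_train // 2 + num_guard
    for i in range(half, len(power) - half):
        # Training cells on both sides of the cell under test, excluding guard cells.
        window = np.r_[power[i - half : i - num_guard],
                       power[i + num_guard + 1 : i + half + 1]]
        noise_estimate = window.mean()          # average of training cells
        if power[i] > scale * noise_estimate:   # adaptive threshold
            detections.append(i)
    return detections

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    power = rng.exponential(1.0, 512)           # noise-only background
    power[[100, 300]] += 30.0                   # two synthetic targets
    print("detections at cells:", ca_cfar(power))
```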

  9. Considerations for control system software verification and validation specific to implementations using distributed processor architectures

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1993-01-01

    Until recently, digital control systems have been implemented on centralized processing systems to function in one of several ways: (1) as a single processor control system; (2) as a supervisor at the top of a hierarchical network of multiple processors; or (3) in a client-server mode. Each of these architectures uses a very different set of communication protocols. The latter two architectures also belong to the category of distributed control systems. Distributed control systems can have a central focus, as in the cases just cited, or be quite decentralized in a loosely coupled, shared responsibility arrangement. This last architecture is analogous to autonomous hosts on a local area network. Each of the architectures identified above will have a different set of architecture-associated issues to be addressed in the verification and validation activities during software development. This paper summarizes results of efforts to identify, describe, contrast, and compare these issues

  10. Implementation of highly parallel and large scale GW calculations within the OpenAtom software

    Science.gov (United States)

    Ismail-Beigi, Sohrab

    The need to describe electronic excitations with better accuracy than provided by band structures produced by Density Functional Theory (DFT) has been a long-term enterprise for the computational condensed matter and materials theory communities. In some cases, appropriate theoretical frameworks have existed for some time but have been difficult to apply widely due to computational cost. For example, the GW approximation incorporates a great deal of important non-local and dynamical electronic interaction effects but has been too computationally expensive for routine use in large materials simulations. OpenAtom is an open source massively parallel ab initio density functional software package based on plane waves and pseudopotentials (http://charm.cs.uiuc.edu/OpenAtom/) that takes advantage of the Charm++ parallel framework. At present, it is developed via a three-way collaboration, funded by an NSF SI2-SSI grant (ACI-1339804), between Yale (Ismail-Beigi), IBM T. J. Watson (Glenn Martyna) and the University of Illinois at Urbana Champaign (Laxmikant Kale). We will describe the project and our current approach towards implementing large scale GW calculations with OpenAtom. Potential applications of large scale parallel GW software for problems involving electronic excitations in semiconductor and/or metal oxide systems will also be pointed out.

  11. Implementation of a data management software system for SSME test history data

    Science.gov (United States)

    Abernethy, Kenneth

    1986-01-01

    The implementation of a software system for managing Space Shuttle Main Engine (SSME) test/flight historical data is presented. The software system uses the database management system RIM7 for primary data storage and routine data management, but includes several FORTRAN programs, described here, which provide customized access to the RIM7 database. The consolidation, modification, and transfer of data from the database THIST to the RIM7 database THISRM is discussed. The RIM7 utility modules for generating some standard reports from THISRM and performing some routine updating and maintenance are briefly described. The FORTRAN accessing programs described include programs for initial loading of large data sets into the database, capturing data from files for database inclusion, and producing specialized statistical reports which cannot be provided by the RIM7 report generator utility. An expert system tutorial, constructed using the expert system shell product INSIGHT2, is described. Finally, a potential expert system, which would analyze data in the database, is outlined. This system could use INSIGHT2 as well and would take advantage of RIM7's compatibility with the microcomputer database system RBase 5000.

  12. Design and Implementation of Integrated Software Research and Community Service at State Polytechnic of Manado

    Science.gov (United States)

    Saroinsong, T.; A. S Kondoj, M.; Kandiyoh, G.; Pontoh, G.

    2018-01-01

    The State Polytechnic of Manado (Polimdo) is one of the reliable institutions in North Sulawesi and was among the first to implement ISO 9001. However, the institution's accreditation results have not yet been satisfactory, which means there is still much to prepare in order to achieve the expected targets. One of the assessment criteria for institutional accreditation relates to research and community service activities, in accordance with standard seven. The data documentation systems related to research and community service activities are not well integrated or well documented across the existing work units. As a result, the process of gathering information on these activities and their results to support institutional accreditation is still inefficient. This study aims to build integrated software across all work units in Polimdo to obtain documentation and data synchronization in support of the activities and the reporting of accreditation documents in accordance with standard seven, specifically regarding the submission of research and community service proposals. The software will be developed using the RUP method, with analysis based on data flow diagrams and entity-relationship modeling (ERM), so that the results of this research are the documentation and synchronization of data and information on research and community service activities, which can be used in preparing report documents for institutional accreditation.

  13. Improving quality through effective implementation of information technology in healthcare.

    Science.gov (United States)

    Øvretveit, John; Scott, Tim; Rundall, Thomas G; Shortell, Stephen M; Brommels, Mats

    2007-10-01

    To describe an implementation of one information technology system (electronic medical record, EMR) in one hospital, the perceived impact, the factors thought to help and hinder implementation and the success of the system and compare this with theories of effective IT implementation. To draw on previous research, empirical data from this study is used to develop IT implementation theory. Qualitative case study, replicating the methods and questions of a previously published USA EMR implementation study using semi-structured interviews and documentation. Large Swedish teaching hospital shortly after a merger of two hospital sites. Thirty senior clinicians, managers, project team members, doctors and nurses. The Swedish implementation was achieved within a year and for under half the budget, with a generally popular EMR which was thought to save time and improve the quality of patient care. Evidence from this study and findings from the more problematic USA implementation case suggests that key factors for cost effective implementation and operation were features of the system itself, the implementation process and the conditions under which the implementation was carried out. There is empirical support for the IT implementation theory developed in this study, which provides a sound basis for future research and successful implementation. Successful implementation of an EMR is likely with an intuitive system, requiring little training, already well developed for clinical work but allowing flexibility for development, where clinicians are involved in selection and in modification for their department needs and where a realistic timetable is made using an assessment of the change-capability of the organization. Once a system decision is made, the implementation should be driven by top and departmental leaders assisted by competent project teams involving information technology specialists and users. Corrections for unforeseen eventualities will be needed, especially

  14. A tool to include gamma analysis software into a quality assurance program.

    Science.gov (United States)

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created comprising two geometric images to independently test the distance to agreement (DTA) and dose difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3mm and over 2% at 1%/1mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
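
    For orientation, the gamma comparison validated here combines a dose-difference (DD) and a distance-to-agreement (DTA) criterion for every reference point. The sketch below is a minimal 1-D global/local gamma pass-rate calculation in Python; the function name and the sample profile are illustrative only and are not taken from the study or from any of the tested packages.

```python
import numpy as np

def gamma_pass_rate(ref_dose, eval_dose, positions, dd=0.03, dta=3.0, local=False):
    """1-D gamma analysis.

    ref_dose, eval_dose : dose values sampled at `positions` (mm).
    dd  : dose-difference criterion as a fraction (0.03 -> 3%).
    dta : distance-to-agreement criterion in mm.
    local : if True, normalise the dose difference to the local reference dose.
    """
    ref_dose = np.asarray(ref_dose, dtype=float)
    eval_dose = np.asarray(eval_dose, dtype=float)
    positions = np.asarray(positions, dtype=float)
    norm = ref_dose.max()  # global normalisation dose

    gammas = []
    for r_pos, r_dose in zip(positions, ref_dose):
        dose_norm = r_dose if local else norm
        # dose difference and spatial distance to every evaluated point
        delta_d = (eval_dose - r_dose) / (dd * dose_norm)
        delta_r = (positions - r_pos) / dta
        gammas.append(np.min(np.sqrt(delta_d**2 + delta_r**2)))
    gammas = np.array(gammas)
    return 100.0 * np.mean(gammas <= 1.0)  # percentage of points passing

# Identical profiles should pass 100% at any criterion.
x = np.linspace(-50, 50, 201)
d = np.exp(-x**2 / 800.0)
print(gamma_pass_rate(d, d, x))  # -> 100.0
```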

  15. Architecture and Implementation of OpenPET Firmware and Embedded Software

    International Nuclear Information System (INIS)

    Abu-Nimeh, Faisal T.; Ito, Jennifer; Moses, William W.; Peng, Qiyu; Choong, Woon-Seng

    2016-01-01

    OpenPET is an open source, modular, extendible, and high-performance platform suitable for multi-channel data acquisition and analysis. Due to the versatility of the hardware, firmware, and software architectures, the platform is capable of interfacing with a wide variety of detector modules, not only in medical imaging but also in homeland security applications. Analog signals from radiation detectors share similar characteristics: a pulse whose area is proportional to the deposited energy and whose leading edge is used to extract a timing signal. As a result, a generic design method is adopted for the platform's hardware, firmware, and software architectures and implementations. The analog front-end is hosted on a module called a Detector Board, where each board can filter, combine, timestamp, and process multiple channels independently. The processed data is formatted and sent through a backplane bus to a module called a Support Board, where one Support Board can host up to eight Detector Board modules. The data in the Support Board, coming from eight Detector Board modules, can be aggregated or correlated (if needed) depending on the algorithm implemented or the runtime mode selected. It is then sent out to a computer workstation for further processing. The number of channels (detector modules) to be processed mandates the overall OpenPET System Configuration, which is designed to handle up to 1,024 channels using 16-channel Detector Boards in the Standard System Configuration and 16,384 channels using 32-channel Detector Boards in the Large System Configuration.

  16. On the Characterization and Software Implementation of General Protein Lattice Models

    Science.gov (United States)

    Bechini, Alessio

    2013-01-01

    Abstract models of proteins have been widely used as a practical means to computationally investigate general properties of the system. In lattice models any sterically feasible conformation is represented as a self-avoiding walk on a lattice, and residue types are limited in number. So far, only two- or three-dimensional lattices have been used. The inspection of the neighborhood of alpha carbons in the core of real proteins reveals that also lattices with higher coordination numbers, possibly in higher dimensional spaces, can be adopted. In this paper, a new general parametric lattice model for simplified protein conformations is proposed and investigated. It is shown how the supporting software can be consistently designed to let algorithms that operate on protein structures be implemented in a lattice-agnostic way. The necessary theoretical foundations are developed and organically presented, pinpointing the role of the concept of main directions in lattice-agnostic model handling. Subsequently, the model features across dimensions and lattice types are explored in tests performed on benchmark protein sequences, using a Python implementation. Simulations give insights on the use of square and triangular lattices in a range of dimensions. The trend of potential minimum for sequences of different lengths, varying the lattice dimension, is uncovered. Moreover, an extensive quantitative characterization of the usage of the so-called “move types” is reported for the first time. The proposed general framework for the development of lattice models is simple yet complete, and an object-oriented architecture can be proficiently employed for the supporting software, by designing ad-hoc classes. The proposed framework represents a new general viewpoint that potentially subsumes a number of solutions previously studied. The adoption of the described model pushes to look at protein structure issues from a more general and essential perspective, making computational
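
    As a companion to the lattice-agnostic design idea described above, the sketch below shows how a parametric lattice can be reduced to its set of main directions, so that a routine such as a self-avoiding-walk check never needs to know the dimension or coordination number. It is a minimal Python illustration; the class and function names are hypothetical and do not reproduce the paper's software.

```python
class Lattice:
    """Minimal lattice-agnostic representation: a lattice is fully
    described by its set of main directions (neighbour offsets)."""
    def __init__(self, main_directions):
        self.main_directions = [tuple(d) for d in main_directions]

    def neighbours(self, site):
        return [tuple(s + d for s, d in zip(site, step))
                for step in self.main_directions]

def square_lattice(dim):
    """d-dimensional (hyper)cubic lattice: steps of +/-1 along each axis."""
    dirs = []
    for axis in range(dim):
        for sign in (1, -1):
            step = [0] * dim
            step[axis] = sign
            dirs.append(step)
    return Lattice(dirs)

def is_self_avoiding_walk(lattice, conformation):
    """Valid if consecutive residues sit on neighbouring sites and
    no lattice site is occupied twice; works for any Lattice."""
    if len(set(conformation)) != len(conformation):
        return False
    return all(b in lattice.neighbours(a)
               for a, b in zip(conformation, conformation[1:]))

# Example: a 4-residue conformation on the 3-D cubic lattice
lat3d = square_lattice(3)
conf = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]
print(is_self_avoiding_walk(lat3d, conf))  # True
```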

  17. ALGORITHM OF PLACEMENT OF VIDEO SURVEILLANCE CAMERAS AND ITS SOFTWARE IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    Loktev Alexey Alexeevich

    2012-10-01

    Full Text Available Comprehensive distributed safety, control, and monitoring systems applied by companies and organizations of different ownership structures play a substantial role in present-day society. Video surveillance elements that ensure image processing and decision making in automated or automatic modes are essential components of new systems. This paper covers the modeling of video surveillance systems installed in buildings and an algorithm, or pattern, for video camera placement that takes into account nearly all characteristics of the buildings, the detection and recognition facilities, and the cameras themselves. This algorithm will subsequently be implemented as a user application. The project takes a comprehensive approach to the automatic placement of cameras that accounts for their mutual positioning and the compatibility of their tasks. The project objective is to develop the principal elements of an algorithm for recognizing a moving object detected by several cameras. The images obtained by the different cameras will be processed, and parameters of motion will be identified to build a table of possible route options. The implementation of the recognition algorithm represents an independent research project to be covered by a different article. This project consists in assessing the complexity of the camera placement algorithm, identifying cases of inaccurate algorithm implementation, and formulating supplementary requirements and input data by means of intersecting the sectors covered by neighbouring cameras. The project also covers the identification of potential problems in the development of a physical security and monitoring system at the design and testing stages. The camera placement algorithm has been implemented as a software application that has already been pilot tested on buildings and inside premises that have irregular dimensions. The
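
    To make the geometric core of such placement algorithms concrete, the Python sketch below checks whether a point is inside one camera's coverage sector, given the camera's position, heading, field of view and range. It is an illustrative building block only; the function and parameter names are assumptions, not the authors' implementation.

```python
import math

def covers(camera_xy, heading_deg, fov_deg, max_range, point_xy):
    """True if point_xy lies within the camera's range and angular field of view."""
    dx = point_xy[0] - camera_xy[0]
    dy = point_xy[1] - camera_xy[1]
    dist = math.hypot(dx, dy)
    if dist > max_range:
        return False
    bearing = math.degrees(math.atan2(dy, dx))
    # smallest signed angle between the bearing to the point and the camera heading
    off_axis = (bearing - heading_deg + 180.0) % 360.0 - 180.0
    return abs(off_axis) <= fov_deg / 2.0

print(covers((0, 0), heading_deg=0, fov_deg=60, max_range=20, point_xy=(10, 3)))   # True
print(covers((0, 0), heading_deg=0, fov_deg=60, max_range=20, point_xy=(-5, 0)))   # False
```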

  18. Design, implementation and verification of software code for radiation dose assessment based on simple generic environmental model

    International Nuclear Information System (INIS)

    I Putu Susila; Arif Yuniarto

    2017-01-01

    Radiation dose assessment to determine the potential radiological impacts of the various installations within a nuclear facility complex is necessary to ensure environmental and public safety. A simple generic model-based method for calculating radiation doses caused by the release of radioactive substances into the environment has been published by the International Atomic Energy Agency (IAEA) as Safety Report Series No. 19 (SRS-19). To assist the application of the assessment method, and as a basis for the development of more complex assessment methods, an open-source software code has been designed and implemented. The software comes with maps and is very easy to use because assessment scenarios can be set up through diagrams. Software verification was performed by comparing its results to the SRS-19 and CROM software calculation results. Doses estimated with SRS-19 are higher than those from the developed software; however, this is still acceptable since the dose estimation in SRS-19 is based on a conservative approach. On the other hand, compared to the CROM software, the same results were obtained for three scenarios, with a non-significant difference of 2.25% in another scenario. These results indicate the correctness of our implementation and imply that the developed software is ready for use in real scenarios. In the future, various features should be added and new models developed to improve the capability of the software. (author)
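
    As a point of reference for the kind of screening calculation such software automates, the sketch below chains an activity concentration, an annual intake and an ingestion dose coefficient into an annual dose. It is a generic Python illustration with made-up numbers, not the SRS-19 formulation or the code described above.

```python
def annual_ingestion_dose(concentration_bq_per_l, intake_l_per_year, dose_coeff_sv_per_bq):
    """Generic screening-level ingestion dose:
    activity concentration x annual intake x ingestion dose coefficient."""
    return concentration_bq_per_l * intake_l_per_year * dose_coeff_sv_per_bq

# Illustrative numbers only: 1 Bq/L in drinking water, 600 L/y intake,
# dose coefficient 1e-8 Sv/Bq  ->  6e-6 Sv/y (6 microsievert per year)
print(annual_ingestion_dose(1.0, 600.0, 1e-8))
```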

  19. SSCL magnet systems quality program implementation for laboratory and industry

    International Nuclear Information System (INIS)

    Warner, D.G.; Bever, D.L.

    1992-01-01

    The development and delivery of reliable and producible magnets for the Superconducting Super Collider Laboratory (SSCL) require the teamwork of a large and diverse workforce composed of personnel with backgrounds in laboratory research, defense, and energy. The SSCL Magnet Quality Program is being implemented with focus on three definitive objectives: (1) communication of requirements, (2) teamwork, and (3) verification. Examination of the SSCL Magnet Systems Division's (MSD) current and planned approach to implementation of the SSCL Magnet Quality Program utilizing these objectives is discussed

  20. Software for creating quality control database in diagnostic radiology

    International Nuclear Information System (INIS)

    Stoeva, M.; Spassov, G.; Tabakov, S.

    2000-01-01

    The paper describes a PC-based program with a database for quality control (QC). It keeps information about all surveyed equipment and measured parameters. The first function of the program is to extract information from old (existing) MS Excel spreadsheets with QC surveys. The second function is used for the input of measurements, which are automatically organized in MS Excel spreadsheets and built into the database. The spreadsheets are based on the protocols described in the EMERALD Training Scheme. In addition, the program can compute statistics for all measured parameters, both in absolute terms and over time
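
    To illustrate what such a QC data store boils down to, the sketch below keeps survey measurements in a small SQLite table and computes simple statistics for one parameter. The schema and sample values are hypothetical Python illustrations, not the structure of the program described in the record, which is built around MS Excel spreadsheets.

```python
import sqlite3
import statistics

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE qc_measurement (
                   equipment   TEXT,
                   parameter   TEXT,    -- e.g. 'kVp accuracy'
                   value       REAL,
                   unit        TEXT,
                   survey_date TEXT)""")

rows = [("Room 1 X-ray tube", "kVp accuracy", 81.2, "kV", "2000-03-01"),
        ("Room 1 X-ray tube", "kVp accuracy", 79.8, "kV", "2000-09-01"),
        ("Room 1 X-ray tube", "kVp accuracy", 80.5, "kV", "2001-03-01")]
con.executemany("INSERT INTO qc_measurement VALUES (?,?,?,?,?)", rows)

# Simple statistics of one measured parameter, in absolute terms and over time
values = [v for (v,) in con.execute(
    "SELECT value FROM qc_measurement WHERE parameter = 'kVp accuracy' "
    "ORDER BY survey_date")]
print(f"mean={statistics.mean(values):.2f}  stdev={statistics.stdev(values):.2f}")
```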

  1. PLANNING QUALITY ASSURANCE PROCESSES IN A LARGE SCALE GEOGRAPHICALLY SPREAD HYBRID SOFTWARE DEVELOPMENT PROJECT

    Directory of Open Access Journals (Sweden)

    Святослав Аркадійович МУРАВЕЦЬКИЙ

    2016-02-01

    Full Text Available This paper discusses key aspects of operational activities in a large-scale, geographically distributed software development project and examines the structure of the quality assurance (QA) processes such a project requires. Up-to-date methods for integrating quality assurance processes into software development processes are presented. Existing groups of software development methodologies are reviewed, namely sequential, agile and PRINCE2-based approaches, with a condensed overview of the quality assurance processes in each group. Common challenges that sequential and agile models face in a large, geographically distributed hybrid software development project are reviewed, and recommendations are given to tackle those challenges. Conclusions are drawn about choosing the most suitable methodology and applying it to a particular project.

  2. Quality circles in the public health sector: implementation and effect.

    Science.gov (United States)

    Schmele, J A; Allen, M E; Butler, S; Gresham, D

    1991-09-01

    Although the quality circle (QC) process has been used in health care, there is a conspicuous gap in the literature about its use in community health nursing. The purpose of this service/education project was to implement QCs in the public health nursing sector throughout a southern central state. The major objective was to provide QC training to approximately 250 supervisors and staff nurses so that this participative group problem-solving approach might be used as a systematic method of dealing with concerns related to quality of care. Evaluation tools, such as the Science Research Associates' attitude scale and the quality management maturity index, were used to determine whether or not the implementation of the QC program influenced the level of morale and quality management maturity. The data obtained reflected positive changes and favorable supervisory responses.

  3. Challenges of Implementing Free and Open Source Software (FOSS): Evidence from the Indian Educational Setting

    Science.gov (United States)

    Thankachan, Briju; Moore, David Richard

    2017-01-01

    The use of Free and Open Source Software (FOSS), a subset of Information and Communication Technology (ICT), can reduce the cost of purchasing software. Despite the benefit in the initial purchase price of software, deploying software requires total cost that goes beyond the initial purchase price. Total cost is a silent issue of FOSS and can only…

  4. QUASAR: A Method for the Quality Assessment of Software-Intensive System Architectures

    Science.gov (United States)

    2006-07-01

    International Organization for Standardization. International Standard ISO/IEC 9126. Information technology - Software product evaluation - Quality... quality models [ISO 91] and associated taxonomies and ontologies [Firesmith 03]. As illustrated in Figure 21, the following examples taken primarily... "Quality Requirements." Journal of Object Technology (JOT) 2, 5 (September-October 2003): 67-75. http://www.jot.fm/issues/issue_2003_09/column6 [ISO 91

  5. Woods Hole Image Processing System Software implementation; using NetCDF as a software interface for image processing

    Science.gov (United States)

    Paskevich, Valerie F.

    1992-01-01

    The Branch of Atlantic Marine Geology has been involved in the collection, processing and digital mosaicking of high-, medium- and low-resolution side-scan sonar data during the past 6 years. In the past, processing and digital mosaicking were accomplished with a dedicated, shore-based computer system. With the need to process side-scan data in the field, and with the increased power and reduced cost of major workstations, a need was identified for an image processing package on a UNIX-based computer system that could be utilized in the field and also be more generally available to Branch personnel. This report describes the initial development of that package, referred to as the Woods Hole Image Processing System (WHIPS). The software was developed using the Unidata NetCDF software interface to allow data to be more readily portable between different computer operating systems.
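
    The portability argument rests on writing the gridded imagery to a self-describing NetCDF file that any NetCDF-aware tool can read back on another platform. The Python sketch below shows this round trip with the netCDF4 package; the variable and attribute names are illustrative and have nothing to do with the actual WHIPS file layout.

```python
import numpy as np
from netCDF4 import Dataset  # assumes the netCDF4 package is installed

# Illustrative gridded "mosaic" to stand in for processed sonar imagery
mosaic = np.random.default_rng(0).random((256, 256)).astype("f4")

# Write the grid, its dimensions and a few attributes to a NetCDF file
with Dataset("mosaic_demo.nc", "w") as nc:
    nc.createDimension("y", mosaic.shape[0])
    nc.createDimension("x", mosaic.shape[1])
    var = nc.createVariable("backscatter", "f4", ("y", "x"))
    var[:, :] = mosaic
    var.units = "dimensionless"
    nc.title = "Demo side-scan mosaic"

# Any NetCDF-aware tool (or another OS) can now read the same grid back
with Dataset("mosaic_demo.nc") as nc:
    print(nc.variables["backscatter"].shape)  # (256, 256)
```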

  6. [Computer graphic display of retinal examination results. Software improving the quality of documenting fundus changes].

    Science.gov (United States)

    Jürgens, Clemens; Grossjohann, Rico; Czepita, Damian; Tost, Frank

    2009-01-01

    Graphic documentation of retinal examination results in clinical ophthalmological practice is often depicted using pictures or in handwritten form. Popular software products used to describe changes in the fundus do not differ much from simple graphics programs that allow the user to insert, scale and edit basic graphic elements such as a circle, rectangle, arrow or text. Displaying the results of retinal examinations in a unified way is difficult to achieve. Therefore, we devised and implemented modern software tools for this purpose. A computer program was created that enables fundus graphs to be formed quickly and intuitively and to be digitally archived or printed. Especially for the needs of ophthalmological clinics, a set of standard digital symbols used to document the results of retinal examinations was developed and installed in a library of graphic symbols. These symbols are divided into the following categories: preoperative, postoperative, neovascularization, retinopathy of prematurity. The appropriate symbol can be selected with a click of the mouse and dragged-and-dropped onto the canvas of the fundus. Current forms of documenting the results of retinal examinations are unsatisfactory because they are time consuming and imprecise; unequivocal interpretation is difficult or in some cases impossible. Using the developed computer program, a sketch of the fundus can be created much more quickly than by hand drawing. Additionally, the quality of the medical documentation will be enhanced by using a system of well-described and standardized symbols. (1) Graphic symbols used to document the results of retinal examinations are a part of everyday clinical practice. (2) The designed computer program will allow quick and intuitive graphical creation of fundus sketches that can be either digitally archived or printed.

  7. Space Shuttle Program Primary Avionics Software System (PASS) Success Legacy - Quality and Reliability Data

    Science.gov (United States)

    Orr, James K.; Peltier, Daryl

    2010-01-01

    This slide presentation reviews the avionics software system on board the space shuttle, with particular emphasis on its quality and reliability. The Primary Avionics Software System (PASS) provides automatic and fly-by-wire control of critical shuttle systems and executes in redundant computers. Charts show the number of space shuttle flights versus time, PASS's development history, and other data that point to the reliability of the system's development process. The reliability of the system is also compared to predicted reliability.

  8. Software Quality and Testing: What DoD Can Learn from Commercial Practices

    Science.gov (United States)

    1992-08-31

    Defects, and Correction of Processes... [Figure 1. Software Quality Control: process improvement, development, and testing] ...only when users understand the manual procedure the tool will automate, and the benefit of automating it. With regard to software testing in DoD, we can... testing - the process of exercising or evaluating a system or system components by manual or automated means to verify that it satisfies specified

  9. 78 FR 10589 - Revision of Air Quality Implementation Plan; California; Sacramento Metropolitan Air Quality...

    Science.gov (United States)

    2013-02-14

    ... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 52 [EPA-R09-OAR-2013-0064; FRL-9777-7] Revision of Air Quality Implementation Plan; California; Sacramento Metropolitan Air Quality Management District... Sacramento Metropolitan Air Quality Management District (SMAQMD or District) portion of the California State...

  10. Implementation of New Process Models for Tailored Polymer Composite Structures into Processing Software Packages

    International Nuclear Information System (INIS)

    Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin; Phelps, Jay; Tucker, Charles L. III; Kunc, Vlastimil; Bapanapalli, Satish K.; Smith, Mark T.

    2010-01-01

    This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction Section (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model as it explores the concept of anisotropic rotary diffusion to capture the fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project, no standard model was developed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementations of the models in AMI, and the conclusions drawn from this work is presented in Section 8.

  11. Implementation of New Process Models for Tailored Polymer Composite Structures into Processing Software Packages

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin; Phelps, Jay; Tucker III, Charles L.; Kunc, Vlastimil; Bapanapalli, Satish K.; Smith, Mark T.

    2010-02-23

    This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction Section (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model as it explores the concept of anisotropic rotary diffusion to capture the fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project, no standard model was developed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementations of the models in AMI, and the conclusions drawn from this work is presented in Section 8.

  12. Implementation of the ATLAS trigger within the multi-threaded software framework AthenaMT

    Science.gov (United States)

    Wynne, Ben; ATLAS Collaboration

    2017-10-01

    We present an implementation of the ATLAS High Level Trigger, HLT, that provides parallel execution of trigger algorithms within the ATLAS multithreaded software framework, AthenaMT. This development will enable the ATLAS HLT to meet future challenges due to the evolution of computing hardware and upgrades of the Large Hadron Collider, LHC, and ATLAS Detector. During the LHC data-taking period starting in 2021, luminosity will reach up to three times the original design value. Luminosity will increase further, to up to 7.5 times the design value, in 2026 following LHC and ATLAS upgrades. This includes an upgrade of the ATLAS trigger architecture that will result in an increase in the HLT input rate by a factor of 4 to 10 compared to the current maximum rate of 100 kHz. The current ATLAS multiprocess framework, AthenaMP, manages a number of processes that each execute algorithms sequentially for different events. AthenaMT will provide a fully multi-threaded environment that will additionally enable concurrent execution of algorithms within an event. This has the potential to significantly reduce the memory footprint on future manycore devices. An additional benefit of the HLT implementation within AthenaMT is that it facilitates the integration of offline code into the HLT. The trigger must retain high rejection in the face of increasing numbers of pileup collisions. This will be achieved by greater use of offline algorithms that are designed to maximize the discrimination of signal from background. Therefore a unification of the HLT and offline reconstruction software environment is required. This has been achieved while at the same time retaining important HLT-specific optimisations that minimize the computation performed to reach a trigger decision. Such optimizations include early event rejection and reconstruction within restricted geometrical regions. We report on an HLT prototype in which the need for HLT-specific components has been reduced to a minimum
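
    For readers unfamiliar with intra-event parallelism, the Python sketch below mimics the idea of scheduling independent reconstruction algorithms of the same event concurrently and combining their outputs into a trigger decision. It is a toy illustration only; the algorithm names and the thread-pool mechanics are placeholders, not AthenaMT or ATLAS components.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy stand-ins for HLT algorithms; in the real framework these are
# scheduler-managed components, not plain functions.
def track_reco(event_id):
    return f"tracks({event_id})"

def calo_clustering(event_id):
    return f"clusters({event_id})"

def hlt_decision(event_id, algo_pool):
    # Independent algorithms operating on the *same* event are submitted
    # concurrently, mimicking intra-event parallelism.
    futures = [algo_pool.submit(algo, event_id)
               for algo in (track_reco, calo_clustering)]
    results = [f.result() for f in futures]
    # Trivial stand-in for a trigger decision: accept if every algorithm produced output.
    return all(r is not None for r in results)

with ThreadPoolExecutor(max_workers=4) as algo_pool:
    for event_id in range(3):
        print(event_id, hlt_decision(event_id, algo_pool))
```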

  13. Improving Code Quality of the Compact Muon Solenoid Electromagnetic Calorimeter Control Software to Increase System Maintainability

    CERN Multimedia

    Holme, Oliver; Dissertori, Günther; Djambazov, Lubomir; Lustermann, Werner; Zelepoukine, Serguei

    2013-01-01

    The Detector Control System (DCS) software of the Electromagnetic Calorimeter (ECAL) of the Compact Muon Solenoid (CMS) experiment at CERN is designed primarily to enable safe and efficient operation of the detector during Large Hadron Collider (LHC) data-taking periods. Through a manual analysis of the code and the adoption of ConQAT [1], a software quality assessment toolkit, the CMS ECAL DCS team has made significant progress in reducing complexity and improving code quality, with observable results in terms of a reduction in the effort dedicated to software maintenance. This paper explains the methodology followed, including the motivation to adopt ConQAT, the specific details of how this toolkit was used and the outcomes that have been achieved. [1] ConQAT, Continuous Quality Assessment Toolkit; https://www.conqat.org/

  14. Manual on quality assurance for computer software related to the safety of nuclear power plants

    International Nuclear Information System (INIS)

    1988-01-01

    The objective of the Manual is to provide guidance in the assurance of quality of specification, design, maintenance and use of computer software related to items and activities important to safety (hereinafter referred to as safety related) in nuclear power plants. This guidance is consistent with, and supplements, the requirements and recommendations of Quality Assurance for Safety in Nuclear Power Plants: A Code of Practice, 50-C-QA, and related Safety Guides on quality assurance for nuclear power plants. Annex A identifies the IAEA documents referenced in the Manual. The Manual is intended to be of use to all those who, in any way, are involved with software for safety related applications for nuclear power plants, including auditors who may be called upon to audit management systems and product software. Figs

  15. Software Quality Validation for Web Applications Developed Using Geographically Distributed Human Resources

    Directory of Open Access Journals (Sweden)

    Mihai GHEORGHE

    2015-01-01

    Full Text Available Developing web applications using Geographically Distributed Team Members has seen increased popularity during the last years, mainly because of the rise of Open Source technologies, the fast penetration of the Internet in emerging economies, the continuous quest for reduced costs, as well as the fast adoption of online platforms and services which successfully address project planning, coordination and other development tasks. This paper identifies general software process stages for both collocated and distributed development and analyses the impact that the use of planning, management and testing online services has on the duration, cost and quality of each stage. Given that Quality Assurance is one of the most important concerns in Geographically Distributed Software Development (GDSD), the focus is on Software Quality Validation.

  16. Flexible Implementation of Multiphysics and Discretizations in PyLith Crustal Deformation Modeling Software

    Science.gov (United States)

    Aagaard, B.; Knepley, M.; Williams, C. A.

    2016-12-01

    We are creating a flexible implementation of multiphysics and finite-element discretizations in PyLith, a community, open-source code (http://geodynamics.org/cig/software/pylith/) for modeling quasi-static and dynamic crustal deformation with an emphasis on earthquake faulting. The goals include expanding the current suite of elastic, viscoelastic, and elastoplastic bulk rheologies to include poroelasticity, thermoelasticity, and incompressible elasticity. We cast the governing equations in a form that involves the product of the finite-element basis function or its derivatives with pointwise functions that look very much like the strong form of the governing equation. This allows the finite-element integration to be decomposed into a routine for the numerical integration over cells and boundaries of the finite-element mesh and simple routines implementing the physics (pointwise functions). The finite-element integration routine works in any spatial dimension with an arbitrary number of physical fields (e.g., displacement, temperature, and fluid pressure). It also makes it much easier to optimize the finite-element integrations for proper vectorization, tiling, and other traversal optimization on multiple architectures (e.g., CUDA and OpenCL) independent of the pointwise functions. Users can easily extend the code by adding new routines for the pointwise functions to implement different rheologies and/or governing equations. Tight integration with the Portable, Extensible Toolkit for Scientific Computation (PETSc) provides support for a wide range of linear and nonlinear solvers and time-stepping algorithms so that a wide variety of governing equations can be solved efficiently.
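
    To make the decomposition concrete, the Python sketch below separates a generic 1-D finite-element "flux" residual assembly from the pointwise functions that carry the physics, so that swapping the kernel changes the rheology without touching the integration loop. The interface is hypothetical and greatly simplified relative to PyLith/PETSc.

```python
import numpy as np

# Pointwise kernels: each returns the integrand at a quadrature point.
# Swapping the kernel changes the physics; the integration loop is untouched.
def kernel_elasticity_1d(x, u, du_dx, params):
    return params["modulus"] * du_dx           # stress-like flux term

def kernel_diffusion_1d(x, u, du_dx, params):
    return params["conductivity"] * du_dx      # heat-flux-like term

def assemble_residual(nodes, u, kernel, params):
    """Generic 1-D P1 finite-element flux residual,
    R_i = sum over elements of integral(dphi_i/dx * kernel) dx,
    evaluated with a one-point midpoint rule per element."""
    residual = np.zeros_like(u)
    for e in range(len(nodes) - 1):
        h = nodes[e + 1] - nodes[e]
        x_mid = 0.5 * (nodes[e] + nodes[e + 1])
        u_mid = 0.5 * (u[e] + u[e + 1])
        du_dx = (u[e + 1] - u[e]) / h
        flux = kernel(x_mid, u_mid, du_dx, params)
        # gradients of the two linear basis functions are -1/h and +1/h,
        # so the element integral contributes -flux and +flux
        residual[e]     += -flux
        residual[e + 1] += +flux
    return residual

nodes = np.linspace(0.0, 1.0, 5)
u = nodes.copy()                       # linear field -> constant gradient
print(assemble_residual(nodes, u, kernel_elasticity_1d, {"modulus": 2.0}))
print(assemble_residual(nodes, u, kernel_diffusion_1d, {"conductivity": 1.0}))
```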

  17. Implementation of quality control systematics for personnel monitoring services

    International Nuclear Information System (INIS)

    Franco, J.O.A.

    1984-01-01

    The implementation of statistical quality control techniques used in industrial practice is proposed for dosimetric services. 'Control charts' and 'sampling inspection' are adapted for control of the measuring process and of the dose results produced in routine, respectively. A chapter on Radiation Protection and Personnel Monitoring was included. (M.A.C.) [pt

  18. Determinants of quality management systems implementation in hospitals

    NARCIS (Netherlands)

    Wardhani, Viera; Utarini, Adi; van Dijk, Jitse Pieter; Post, Doeke; Groothoff, Johan Willem

    Objective: To identify the problems and facilitating factors in the implementation of quality management systems (QMS) in hospitals through a systematic review. Method: A search strategy was performed on the Medline database for articles written in English published between 1992 and early 2006.

  19. When Spreadsheets Become Software - Quality Control Challenges and Approaches - 13360

    International Nuclear Information System (INIS)

    Fountain, Stefanie A.; Chen, Emmie G.; Beech, John F.; Wyatt, Elizabeth E.; Quinn, Tanya B.; Seifert, Robert W.; Bonczek, Richard R.

    2013-01-01

    As part of a preliminary waste acceptance criteria (PWAC) development, several commercial models were employed, including the Hydrologic Evaluation of Landfill Performance model (HELP) [1], the Disposal Unit Source Term - Multiple Species model (DUSTMS) [2], and the Analytical Transient One, Two, and Three-Dimensional model (AT123D) [3]. The results of these models were post-processed in MS Excel spreadsheets to convert the model results to alternate units, compare the groundwater concentrations to the groundwater concentration thresholds, and then to adjust the waste contaminant masses (based on average concentration over the waste volume) as needed in an attempt to achieve groundwater concentrations at the limiting point of assessment that would meet the compliance concentrations while maximizing the potential use of the landfill (i.e., maximizing the volume of projected waste being generated that could be placed in the landfill). During the course of the PWAC calculation development, one of the Microsoft (MS) Excel spreadsheets used to post-process the results of the commercial model packages grew to include more than 575,000 formulas across 18 worksheets. This spreadsheet was used to assess six base scenarios as well as nine uncertainty/sensitivity scenarios. The complexity of the spreadsheet resulted in the need for a rigorous quality control (QC) procedure to verify data entry and confirm the accuracy of formulas. (authors)
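
    One practical ingredient of a QC procedure for a spreadsheet of this size is an automated inventory of which cells hold formulas and which hold hard-coded constants, so reviewers know where to focus manual verification. The Python sketch below does this with the openpyxl package; the file name is hypothetical and the script is an illustration, not the QC procedure used for the PWAC calculation.

```python
from collections import Counter
from openpyxl import load_workbook  # assumes openpyxl is installed

def formula_inventory(path):
    """Count formula cells and constant (non-formula) cells per worksheet."""
    wb = load_workbook(path, data_only=False)  # keep formulas, not cached values
    report = {}
    for ws in wb.worksheets:
        counts = Counter()
        for row in ws.iter_rows():
            for cell in row:
                if cell.value is None:
                    continue
                # openpyxl marks formula cells with data_type 'f'
                counts["formulas" if cell.data_type == "f" else "constants"] += 1
        report[ws.title] = dict(counts)
    return report

# Example usage (hypothetical file name):
# print(formula_inventory("pwac_postprocessing.xlsx"))
```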

  20. When Spreadsheets Become Software - Quality Control Challenges and Approaches - 13360

    Energy Technology Data Exchange (ETDEWEB)

    Fountain, Stefanie A.; Chen, Emmie G.; Beech, John F. [Geosyntec Consultants, Inc., 1255 Roberts Boulevard NW, Suite 200, Kennesaw, GA 30144 (United States); Wyatt, Elizabeth E. [LATA Environmental Services of Kentucky, LLC, 761 Veterans Ave, Kevil, KY 42053 (United States); Quinn, Tanya B. [Geosyntec Consultants, Inc., 2002 Summit Boulevard NE, Suite 885, Atlanta, GA 30319 (United States); Seifert, Robert W. [Portsmouth/Paducah Project Office, United States Department of Energy, 5600 Hobbs Rd, Kevil, KY 42053 (United States); Bonczek, Richard R. [Portsmouth/Paducah Project Office, United States Department of Energy, 1017 Majestic Drive, Lexington, KY 40513 (United States)

    2013-07-01

    As part of a preliminary waste acceptance criteria (PWAC) development, several commercial models were employed, including the Hydrologic Evaluation of Landfill Performance model (HELP) [1], the Disposal Unit Source Term - Multiple Species model (DUSTMS) [2], and the Analytical Transient One, Two, and Three-Dimensional model (AT123D) [3]. The results of these models were post-processed in MS Excel spreadsheets to convert the model results to alternate units, compare the groundwater concentrations to the groundwater concentration thresholds, and then to adjust the waste contaminant masses (based on average concentration over the waste volume) as needed in an attempt to achieve groundwater concentrations at the limiting point of assessment that would meet the compliance concentrations while maximizing the potential use of the landfill (i.e., maximizing the volume of projected waste being generated that could be placed in the landfill). During the course of the PWAC calculation development, one of the Microsoft (MS) Excel spreadsheets used to post-process the results of the commercial model packages grew to include more than 575,000 formulas across 18 worksheets. This spreadsheet was used to assess six base scenarios as well as nine uncertainty/sensitivity scenarios. The complexity of the spreadsheet resulted in the need for a rigorous quality control (QC) procedure to verify data entry and confirm the accuracy of formulas. (authors)

  1. SME's Implementing a Quality Management System- Risks and Opportunities

    Directory of Open Access Journals (Sweden)

    Luise ZEININGER

    2017-12-01

    Full Text Available Now part of our daily lives, risk has ceased to be just something that managers are desperate to alleviate, if not eliminate; it is now rightfully considered a path to exploring new opportunities. The latest version of ISO 9001, the 2015 edition, asks organizations implementing it to determine the risks and opportunities associated with their main processes, the impact on relevant third parties, and the internal and external context. With this duality as a starting point, we investigated what SMEs implementing a quality management system consider to be the risks and opportunities associated with this action. This paper presents the partial results of a wider study on the management of Romanian SMEs, especially the findings related to what managers consider a risk or an opportunity when implementing a quality management system.

  2. [Quality assurance in acute pain therapy : Development of software for the acute pain service].

    Science.gov (United States)

    Czaplik, M; Joppich, R; Rossaint, R

    2010-08-01

    A detailed documentation system is essential for an effectively working acute pain service. Patient-related documentation aids the physician with check lists and algorithms and may thus further improve clinical practice. As adequate software was missing, we developed a database that was first adapted to the in-house conditions, but can also be adjusted to other surroundings. By integrating "one-click documentation" and new codes for clinical observations, a user-friendly software was created that notably improved the quality of documentation. In the first test period more than 30,000 ward rounds were collected, and a considerably improved documentation quality could be achieved.

  3. Quality assurance for CORAL and COOL within the LCG software stack for the LHC experiments

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    CORAL and COOL are software packages used by the LHC experiments for managing different categories of physics data using a variety of relational database technologies. The core components are written in C++, but Python bindings are also provided. CORAL is a generic relational access layer, while COOL includes the implementation of a specific relational data model and optimization of SQL queries for "conditions data". The software is the result of more than 10 years of development in collaboration between the IT department and the LHC experiments. The packages are built and released within the LCG software stack, for which automatic nightly builds and release installations are provided by PH-SFT (cmake, jenkins, cdash) for many different platforms, compilers and software version configurations. Test-driven development and functional tests of both C++ and Python components (CppUnit, unittest) have been key elements in the success of the projects. Dedicated test suites have also been prepared to commission and ma...

  4. Implementation of quality control program in radiodiagnostic services

    International Nuclear Information System (INIS)

    Herrera S, A.; Roas Z, N.

    1995-01-01

    This monograph is the first version of the implementation of the quality control programme in radiology diagnostic services. Here, all information related to diagnostic quality was collected with a view to better radiation protection of patients and personnel. The programme was implemented on the X-ray equipment at three hospitals (named hospital A, hospital B and hospital C) and included the evaluation of technical parameters such as kilovoltage, exposure time, filtration and fields. In addition, dark rooms, cassettes and image intensifiers were also evaluated. The procedures to carry out the quality control, and the manner in which the observations, conclusions and recommendations should be formulated, are based on documents issued by the International Commission on Radiological Protection (I.C.R.P.), the International Atomic Energy Agency (I.A.E.A.) and the World Health Organization (W.H.O.)

  5. Implementation Of Quality Management System For Irradiation Processing Services

    Science.gov (United States)

    Lungu, Ion-Bogdan; Manea, Maria-Mihaela

    2015-07-01

    In today's market, due to increasing competitiveness, quality management has established itself as an indispensable tool and a reference point for every business. It is ultimately focused on customer satisfaction, which is a stringent factor for every business. Implementing and maintaining a QMS is a rather difficult, time-consuming and expensive process which must be done with respect to many factors. The aim of this paper is to present a case study of implementing the ISO 9001 QMS at a gamma irradiation treatment service provider. The research goals are the identification of key benefits, reasons, advantages, disadvantages, drawbacks, etc. for a successful QMS implementation and use. Finally, the expected results focus on creating a general framework for implementing an efficient QMS plan that can be easily adapted to other kinds of services and markets.

  6. Software: our quest for excellence. Honoring 50 years of software history, progress, and process

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    The Software Quality Forum was established by the Software Quality Assurance (SQA) Subcommittee, which serves as a technical advisory group on software engineering and quality initiatives and issues for DOE's quality managers. The forum serves as an opportunity for all those involved in implementing SQA programs to meet and share ideas and concerns. Participation from managers, quality engineers, and software professionals provides an ideal environment for identifying and discussing issues and concerns. The interaction provided by the forum contributes to the realization of a shared goal--high quality software product. Topics include: testing, software measurement, software surety, software reliability, SQA practices, assessments, software process improvement, certification and licensing of software professionals, CASE tools, software project management, inspections, and management's role in ensuring SQA. The bulk of this document consists of vugraphs. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  7. Design and implementation of a mobile phone locator using software defined radio

    OpenAIRE

    Larsen, Ian Paul

    2007-01-01

    This thesis presents an approach for generating, detecting, and decoding a Global System for Mobile Communications (GSM) signal using software defined radio and commodity computer hardware. Using software designed by the GNU free software project as a base, standard GSM packets were transmitted and received over the air, and their arrival times detected. A method is provided to use software analysis of multiple receivers to locate an emitter based on the information received by the softwar...

  8. Digital radiography: optimization of image quality and dose using multi-frequency software.

    Science.gov (United States)

    Precht, H; Gerke, O; Rosendahl, K; Tingberg, A; Waaler, D

    2012-09-01

    New developments in the processing of digital radiographs (DR), including multi-frequency processing (MFP), allow optimization of image quality and radiation dose. This is particularly promising in children, as they are believed to be more sensitive to ionizing radiation than adults. To examine whether the use of MFP software reduces the radiation dose without compromising quality at DR of the femur in 5-year-old-equivalent anthropomorphic and technical phantoms. A total of 110 images of an anthropomorphic phantom were imaged on a DR system (Canon DR with CXDI-50 C detector and MLT[S] software) and analyzed by three pediatric radiologists using Visual Grading Analysis. In addition, 3,500 images taken of a technical contrast-detail phantom (CDRAD 2.0) provided an objective image-quality assessment. Optimal image quality was maintained at a dose reduction of 61% with MLT(S)-optimized images. Even for images of diagnostic quality, MLT(S) provided a dose reduction of 88% as compared to the reference image. The software's impact on image quality was found to be significant for dose (mAs), dynamic range dark region and frequency band. By optimizing image processing parameters, a significant dose reduction is possible without significant loss of image quality.

  9. Implementations of service oriented architecture and agile software development: What works and what are the challenges?

    NARCIS (Netherlands)

    Schramm, Milan; Daneva, Maia

    2016-01-01

    Today many organizations use service-oriented architecture and agile software development as their software paradigms. While both certainly have their advantages, in the fields of Empirical Software Engineering and Information Systems these have been treated in relative isolation and their impact on

  10. COST OF QUALITY MODELS AND THEIR IMPLEMENTATION IN MANUFACTURING FIRMS

    Directory of Open Access Journals (Sweden)

    N.M. Vaxevanidis

    2009-03-01

    Full Text Available In order to improve quality, an organization must take into account the costs associated with achieving quality, since the objective of continuous improvement programs is not only to meet customer requirements, but also to do so at the lowest possible cost. This can only be obtained by reducing the costs needed to achieve quality, and the reduction of these costs is only possible if they are identified and measured. Therefore, measuring and reporting the cost of quality (CoQ) should be considered an important issue for achieving quality excellence. To collect quality costs an organization needs to adopt a framework to classify costs; however, there is no general agreement on a single broad definition of quality costs. CoQ is usually understood as the sum of conformance plus non-conformance costs, where the cost of conformance is the price paid for prevention of poor quality (for example, inspection and quality appraisal) and the cost of non-conformance is the cost of poor quality caused by product and service failure (for example, rework and returns). The objective of this paper is to give a survey of research articles on the topic of CoQ; it opens with a literature review focused on existing CoQ models; then, it briefly presents the most common CoQ parameters and the metrics (indices) used for monitoring CoQ. Finally, the use of CoQ models in practice, i.e., the implementation of a quality costing system and cost of quality reporting in companies, is discussed, with emphasis on cases concerning manufacturing firms.
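
    As a small worked example of the conformance/non-conformance split described above, the Python sketch below totals illustrative prevention, appraisal, internal-failure and external-failure costs and reports CoQ as a share of sales. All figures are made up, and the category breakdown follows the generic prevention-appraisal-failure (PAF) view rather than any specific model surveyed in the paper.

```python
# Illustrative cost-of-quality items (currency units are arbitrary)
coq_items = {
    "prevention":       {"training": 12_000, "process_planning": 8_000},
    "appraisal":        {"inspection": 15_000, "calibration": 5_000},
    "internal_failure": {"rework": 22_000, "scrap": 9_000},
    "external_failure": {"returns": 14_000, "warranty": 11_000},
}

# Conformance = prevention + appraisal; non-conformance = internal + external failure
conformance = sum(coq_items["prevention"].values()) + sum(coq_items["appraisal"].values())
non_conformance = (sum(coq_items["internal_failure"].values())
                   + sum(coq_items["external_failure"].values()))
total_coq = conformance + non_conformance

print(f"conformance={conformance}  non-conformance={non_conformance}  CoQ={total_coq}")

# A common monitoring index is CoQ as a share of sales
sales = 1_500_000
print(f"CoQ/sales = {100 * total_coq / sales:.1f}%")
```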

  11. Sandia software guidelines, Volume 4: Configuration management

    Energy Technology Data Exchange (ETDEWEB)

    1992-06-01

    This volume is one in a series of Sandia Software Guidelines for use in producing quality software within Sandia National Laboratories. This volume is based on the IEEE standard and guide for software configuration management. The basic concepts and detailed guidance on implementation of these concepts are discussed for several software project types. Example planning documents for both projects and organizations are included.

  12. Assessing climate model software quality: a defect density analysis of three models

    Directory of Open Access Journals (Sweden)

    J. Pipitone

    2012-08-01

    Full Text Available A climate model is an executable theory of the climate; the model encapsulates climatological theories in software so that they can be simulated and their implications investigated. Thus, in order to trust a climate model, one must trust that the software it is built from is built correctly. Our study explores the nature of software quality in the context of climate modelling. We performed an analysis of defect reports and defect fixes in several versions of leading global climate models by collecting defect data from bug tracking systems and version control repository comments. We found that the climate models all have very low defect densities compared to well-known, similarly sized open-source projects. We discuss the implications of our findings for the assessment of climate model software trustworthiness.
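
    Defect density in such comparisons is simply the number of reported (or fixed) defects normalized by code size, typically per thousand source lines of code. The Python sketch below computes it for placeholder project figures; the numbers are illustrative, not the paper's data.

```python
def defect_density(defects, sloc):
    """Defects per thousand source lines of code (KSLOC)."""
    return 1000.0 * defects / sloc

# Placeholder figures for comparison purposes only
projects = {
    "climate_model_A": (25, 400_000),
    "open_source_ref": (900, 350_000),
}
for name, (defects, sloc) in projects.items():
    print(f"{name}: {defect_density(defects, sloc):.2f} defects/KSLOC")
```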

  13. ThermoData Engine (TDE): software implementation of the dynamic data evaluation concept. 9. Extensible thermodynamic constraints for pure compounds and new model developments.

    Science.gov (United States)

    Diky, Vladimir; Chirico, Robert D; Muzny, Chris D; Kazakov, Andrei F; Kroenlein, Kenneth; Magee, Joseph W; Abdulagatov, Ilmutdin; Frenkel, Michael

    2013-12-23

    ThermoData Engine (TDE) is the first full-scale software implementation of the dynamic data evaluation concept, as reported in this journal. The present article describes the background and implementation for new additions in latest release of TDE. Advances are in the areas of program architecture and quality improvement for automatic property evaluations, particularly for pure compounds. It is shown that selection of appropriate program architecture supports improvement of the quality of the on-demand property evaluations through application of a readily extensible collection of constraints. The basis and implementation for other enhancements to TDE are described briefly. Other enhancements include the following: (1) implementation of model-validity enforcement for specific equations that can provide unphysical results if unconstrained, (2) newly refined group-contribution parameters for estimation of enthalpies of formation for pure compounds containing carbon, hydrogen, and oxygen, (3) implementation of an enhanced group-contribution method (NIST-Modified UNIFAC) in TDE for improved estimation of phase-equilibrium properties for binary mixtures, (4) tools for mutual validation of ideal-gas properties derived through statistical calculations and those derived independently through combination of experimental thermodynamic results, (5) improvements in program reliability and function that stem directly from the recent redesign of the TRC-SOURCE Data Archival System for experimental property values, and (6) implementation of the Peng-Robinson equation of state for binary mixtures, which allows for critical evaluation of mixtures involving supercritical components. Planned future developments are summarized.
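
    Of the additions listed, the Peng-Robinson equation of state is the most self-contained to illustrate. The Python sketch below solves the standard Peng-Robinson cubic for the compressibility factor of a pure compound; it is a textbook-style sketch using approximate critical constants for CO2, not TDE's mixture implementation.

```python
import numpy as np

R = 8.314462618  # universal gas constant, J/(mol K)

def peng_robinson_Z(T, P, Tc, Pc, omega):
    """Real, positive compressibility-factor roots of the Peng-Robinson EOS
    for a pure compound (vapour root = largest, liquid root = smallest)."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc)))**2
    a = 0.45724 * (R * Tc)**2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A = a * P / (R * T)**2
    B = b * P / (R * T)
    # Z^3 - (1 - B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
    coeffs = [1.0, -(1.0 - B), A - 3.0 * B**2 - 2.0 * B, -(A * B - B**2 - B**3)]
    roots = np.roots(coeffs)
    return sorted(z.real for z in roots if abs(z.imag) < 1e-10 and z.real > 0)

# Example: CO2 at 310 K and 5 MPa (approximate critical constants)
print(peng_robinson_Z(310.0, 5.0e6, Tc=304.13, Pc=7.377e6, omega=0.224))
```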

  14. Total Quality Management Implementation and Guest Satisfaction in Hospitality

    Directory of Open Access Journals (Sweden)

    Miroslav Knežević

    2017-02-01

    Full Text Available Total quality management (TQM) has become a modern system of continuous improvement of the quality of all company activities. The purpose of this study is to measure guests' expectations and their satisfaction with the attribute quality of the hotel product. The obtained results were compared so as to analyse separately the reviews of hotels that have implemented TQM and hold ISO 9001 certificates and the reviews of hotels that have not implemented TQM and do not hold ISO 9001 certificates. The analysis covered 55 hotels in Serbia in the 4- and 5-star categories, i.e. 1308 guests who have stayed in them. The results show that between the observed groups of guests there are fewer differences in expectations than in perception, and that, generally speaking, guests who have stayed in the hotels that have implemented TQM are more satisfied. The biggest difference in guest satisfaction with the quality of service in the observed hotels is noticeable in relation to the employees and the value for money.

  15. A systematic literature review of open source software quality assessment models.

    Science.gov (United States)

    Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    Many open source software (OSS) quality assessment models are proposed and available in the literature. However, there is little or no adoption of these models in practice. In order to guide the formulation of newer models so that they can be accepted by practitioners, there is a need for clear discrimination among the existing models based on their specific properties. Based on this, the aim of this study is to perform a systematic literature review to investigate the properties of the existing OSS quality assessment models by classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer and Google Search were performed to retrieve all relevant primary studies in this regard. Journal and conference papers between 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected. To select these models we developed assessment criteria to evaluate the quality of the existing studies. Quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded-category, community-only-attribute, non-community-attribute, and non-quality-in-use models. Our study reflects that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that the majority (47%) of the existing models do not specify any domain of application. In conclusion, our study will be a valuable contribution to the community and will help quality assessment model developers in formulating newer models, as well as practitioners (software evaluators) in selecting suitable OSS from among alternatives.

  16. Taking advantage of ground data systems attributes to achieve quality results in testing software

    Science.gov (United States)

    Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.

    1994-01-01

    During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved, only approached to varying degrees. With the emphasis on building low-cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission-specific versions of the TASS. Very little new software needs to be developed, mainly mission-specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.

  17. Assessing the impact of continuous quality improvement/total quality management: concept versus implementation.

    Science.gov (United States)

    Shortell, S M; O'Brien, J L; Carman, J M; Foster, R W; Hughes, E F; Boerstler, H; O'Connor, E J

    1995-01-01

    OBJECTIVE: This study examines the relationships among organizational culture, quality improvement processes and selected outcomes for a sample of up to 61 U. S. hospitals. DATA SOURCES AND STUDY SETTING: Primary data were collected from 61 U. S. hospitals (located primarily in the midwest and the west) on measures related to continuous quality improvement/total quality management (CQI/TQM), organizational culture, implementation approaches, and degree of quality improvement implementation based on the Baldrige Award criteria. These data were combined with independently collected data on perceived impact and objective measures of clinical efficiency (i.e., charges and length of stay) for six clinical conditions. STUDY DESIGN: The study involved cross-sectional examination of the named relationships. DATA COLLECTION/EXTRACTION METHODS: Reliable and valid scales for the organizational culture and quality improvement implementation measures were developed based on responses from over 7,000 individuals across the 61 hospitals with an overall completion rate of 72 percent. Independent data on perceived impact were collected from a national survey and independent data on clinical efficiency from a companion study of managed care. PRINCIPAL FINDINGS: A participative, flexible, risk-taking organizational culture was significantly related to quality improvement implementation. Quality improvement implementation, in turn, was positively associated with greater perceived patient outcomes and human resource development. Larger-size hospitals experienced lower clinical efficiency with regard to higher charges and higher length of stay, due in part to having more bureaucratic and hierarchical cultures that serve as a barrier to quality improvement implementation. CONCLUSIONS: What really matters is whether or not a hospital has a culture that supports quality improvement work and an approach that encourages flexible implementation. Larger-size hospitals face more difficult

  18. Implementing a DVB-T/H Receiver on a Software-Defined Radio Platform

    Directory of Open Access Journals (Sweden)

    Yong Jiang

    2009-01-01

    Full Text Available Digital multimedia broadcasting is available in more and more countries and in various forms. One of the most successful forms is Digital Video Broadcasting for Terrestrial (DVB-T), which has been deployed in most countries of the world for years. In order to bring digital multimedia broadcasting services to battery-powered handheld receivers in a mobile environment, Digital Video Broadcasting for Handheld (DVB-H) has been formally adopted by ETSI. More advanced and complex digital multimedia broadcasting systems are under development, for example the next generation of DVB-T, a.k.a. DVB-T2. Current commercial DVB-T/H receivers are usually built upon dedicated application-specific integrated circuits (ASICs). However, when a DVB-T/H receiver is integrated into a mobile phone, ASICs are not flexible enough to accommodate evolving standards and are less area efficient overall, since they cannot be efficiently reused and shared among different radio standards. This paper presents an example implementation of a DVB-T/H receiver on the prototype of Infineon Technologies' Software-Defined Radio (SDR) platform called MuSIC (Multiple SIMD Cores), which is a DSP-centered and accelerator-assisted architecture aimed at battery-powered mass-market handheld terminals.

  19. Designing and Implementing a Distributed System Architecture for the Mars Rover Mission Planning Software (Maestro)

    Science.gov (United States)

    Goldgof, Gregory M.

    2005-01-01

    Distributed systems allow scientists from around the world to plan missions concurrently, while being updated on the revisions of their colleagues in real time. However, permitting multiple clients to simultaneously modify a single data repository can quickly lead to data corruption or inconsistent states between users. Since our message broker, the Java Message Service, does not ensure that messages will be received in the order they were published, we must implement our own numbering scheme to guarantee that changes to mission plans are performed in the correct sequence. Furthermore, distributed architectures must ensure that as new users connect to the system, they synchronize with the database without missing any messages or falling into an inconsistent state. Robust systems must also guarantee that all clients will remain synchronized with the database even in the case of multiple client failure, which can occur at any time due to lost network connections or a user's own system instability. The final design for the distributed system behind the Mars rover mission planning software fulfills all of these requirements and upon completion will be deployed to MER at the end of 2005 as well as Phoenix (2007) and MSL (2009).
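
    The custom numbering scheme described above can be sketched as a reordering buffer on the client side: every plan update carries a monotonically increasing sequence number assigned by the publisher, and the client applies updates strictly in order, holding back anything that arrives early. The class and method names below are hypothetical; this is not the Maestro implementation.

```python
# Minimal sketch of a sequence-number reordering buffer of the kind the
# abstract describes: updates carry a monotonically increasing number assigned
# by the publisher, and the client applies them strictly in order, buffering
# any update that arrives early. Names are hypothetical, not Maestro's API.

class OrderedPlanApplier:
    def __init__(self, apply_fn, next_expected: int = 1):
        self.apply_fn = apply_fn          # callback that mutates the local plan copy
        self.next_expected = next_expected
        self.pending = {}                 # seq -> update, for out-of-order arrivals

    def on_message(self, seq: int, update) -> None:
        if seq < self.next_expected:
            return                        # duplicate or already-applied update
        self.pending[seq] = update
        # Drain the buffer as long as the next expected update is available.
        while self.next_expected in self.pending:
            self.apply_fn(self.pending.pop(self.next_expected))
            self.next_expected += 1

if __name__ == "__main__":
    applied = []
    applier = OrderedPlanApplier(applied.append)
    for seq, update in [(2, "move rover"), (1, "take image"), (3, "downlink")]:
        applier.on_message(seq, update)
    print(applied)   # ['take image', 'move rover', 'downlink']
```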

  20. Development and Implementation of Software for Visualizing and Editing Multidimensional Flight Simulation Input Data

    Science.gov (United States)

    Whelan, Todd Michael

    1996-01-01

    In a real-time or batch mode simulation that is designed to model aircraft dynamics over a wide range of flight conditions, a table look-up scheme is implemented to determine the forces and moments on the vehicle based upon the values of parameters such as angle of attack, altitude, Mach number, and control surface deflections. Simulation Aerodynamic Variable Interface (SAVI) is a graphical user interface to the flight simulation input data, designed to operate on workstations that support X Windows. The purpose of the application is to provide two- and three-dimensional visualization of the data, to allow an intuitive sense of the data set. SAVI also allows the user to manipulate the data, either to conduct an interactive study of the influence of changes on the vehicle dynamics, or to make revisions to the data set based on new information such as flight test results. This paper discusses the reasons for developing the application, provides an overview of its capabilities, and outlines the software architecture and operating environment.
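
    The table look-up described above can be sketched as a gridded interpolation over flight-condition breakpoints. The breakpoints and coefficient values below are placeholders rather than SAVI data, and SciPy's RegularGridInterpolator is simply one convenient way to express the idea.

```python
# Minimal sketch of the kind of table look-up the abstract describes: an
# aerodynamic coefficient stored on a grid of flight-condition breakpoints and
# linearly interpolated at run time. Breakpoints and values are placeholders.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

alpha_bp = np.array([-5.0, 0.0, 5.0, 10.0, 15.0])     # angle of attack, deg
mach_bp  = np.array([0.3, 0.6, 0.9])                   # Mach number

# Lift-coefficient table CL[alpha, mach]; synthetic values for illustration.
cl_table = np.array([[-0.20, -0.25, -0.30],
                     [ 0.10,  0.12,  0.15],
                     [ 0.45,  0.50,  0.55],
                     [ 0.80,  0.85,  0.90],
                     [ 1.00,  1.05,  1.10]])

cl_lookup = RegularGridInterpolator((alpha_bp, mach_bp), cl_table)

if __name__ == "__main__":
    # Query the table at an off-grid flight condition.
    print(cl_lookup([[7.5, 0.75]])[0])
```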

  1. Implementation of the ATLAS trigger within the multi-threaded software framework AthenaMT

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00225867; The ATLAS collaboration

    2016-01-01

    We present an implementation of the ATLAS High Level Trigger, HLT, that provides parallel execution of trigger algorithms within the ATLAS multithreaded software framework, AthenaMT. This development will enable the ATLAS HLT to meet future challenges due to the evolution of computing hardware and upgrades of the Large Hadron Collider, LHC, and ATLAS Detector. During the LHC data-taking period starting in 2021, luminosity will reach up to three times the original design value. Luminosity will increase further, to up to 7.5 times the design value, in 2026 following LHC and ATLAS upgrades. This includes an upgrade of the ATLAS trigger architecture that will result in an increase in the HLT input rate by a factor of 4 to 10 compared to the current maximum rate of 100 kHz. The current ATLAS multiprocess framework, AthenaMP, manages a number of processes that each execute algorithms sequentially for different events. AthenaMT will provide a fully multi-threaded environment that will additionally enable concurrent ...
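
    The shift the abstract describes, from one sequential process per event to concurrent execution, follows the general pattern sketched below: a pool of worker threads pulls events and runs a trigger chain on each. The "algorithms" are trivial placeholders; this sketch is unrelated to the ATLAS framework's actual scheduler.

```python
# Minimal sketch of the general pattern the abstract describes: instead of one
# sequential process per event, a pool of worker threads processes many events
# concurrently. The "algorithms" here are trivial placeholders, not ATLAS code.
from concurrent.futures import ThreadPoolExecutor

def run_trigger_chain(event_id: int) -> tuple[int, bool]:
    # Placeholder HLT chain: pretend to reconstruct a quantity and apply a cut.
    reconstructed_pt = (event_id * 37) % 100          # fake reconstructed quantity
    accepted = reconstructed_pt > 60                   # fake hypothesis test
    return event_id, accepted

if __name__ == "__main__":
    events = range(16)
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(run_trigger_chain, events))
    accepted = [eid for eid, ok in results if ok]
    print(f"accepted {len(accepted)} of {len(results)} events: {accepted}")
```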

  2. Software implementation of a high speed interface between a PDP-10 and several PDP-11s

    International Nuclear Information System (INIS)

    De Mesa, N.P. III.

    1975-01-01

    The DMA10 is a high speed link between a PDP-10 and up to eight PDP-11s; specifically, the PDP-10 shares sections of its memory with the PDP-11s. The two-segment concept on the PDP-10 of shared/reentrant code and non-shared code is implemented. The inclusion of read-only memory on the PDP-11s allows for the development of "PROM" software which all the PDP-11s may share. The principal difference between the DMA10 and other communications interfaces is that it is not a block transfer device. Because of the shared memory concept, the features of the DMA10 are high data bandwidth and minimal processor intervention between data transfers. Communication programs between the PDP-10 and the PDP-11 may be tested wholly in either processor, independent of the DMA10 interface. In the current mode of operation the PDP-11s simply act as device controllers. Future plans include separate operating systems in various PDP-11s.

  3. An empirical investigation of quality performance expectations in the software industry: A gap analysis approach

    OpenAIRE

    Mehrez, Ahmed

    2014-01-01

    The assessment of quality is pervasive but vital in organisations. It is particularly important in the design of an effective quality management system. The assessment is often considered from the perspective of fulfilling customers' requirements; however, the level of compatibility between a product or service providers' perceptions of what to deliver and the customers' desires or expectations is often uncertain. This paper describes an empirical study evaluating, within the software industr...

  4. Implementation of Quality Management System for Historical Building Conservation

    Directory of Open Access Journals (Sweden)

    Zahari N.F.

    2014-01-01

    Full Text Available The main objectives of this study are twofold: firstly, to identify how ISO 9001 procedures are being used as references for conservation works, and secondly, to support the development of Quality Management System (QMS) guidelines. Data were solicited from three (3) conservation areas. The analysis involved a descriptive approach and statistical methods. The findings revealed that a QMS is not structurally established, implemented and enforced as part of conservation practice in Malaysia. From the findings, the authors hope to give the reader a clear perception of current preservation practice and of the existence of a QMS with reference to ISO 9001 as a future conservation mechanism.

  5. Guidelines to minimize cost of software quality in agile scrum process

    OpenAIRE

    Vijay, Deepa; Ganapathy, Gopinath

    2014-01-01

    This paper presents a case study of the Agile Scrum process followed in a retail domain project. It also reveals the impact on the cost of software quality when the Scrum process is not followed efficiently. In analyzing the case study, gaps were identified and guidelines for process improvement are suggested.

  6. Software quality assurance and information management, October 1986 to October 1992

    International Nuclear Information System (INIS)

    Hill, I.E.

    1993-01-01

    This report describes the work carried out by Cedar Design Systems Limited under contract PECD 7/9/384. The brief for the contract was initially to provide advice on Software Quality Assurance (SQA) as part of the CEC PACOMA project. This was later extended to include further SQA and information management tasks specific to the HMIP Radioactive Waste Disposal Assessments Research Programme. (Author)

  7. Creating high-quality behavioural designs for software-intensive systems

    NARCIS (Netherlands)

    Gülesir, G.; America, Pierre; Benschop, Frank; van den Berg, Klaas; Aksit, Mehmet; van der Laar, Pierre; Punter, Teade

    2010-01-01

    In today's industrial practice, behavioral designs of software-intensive systems such as embedded systems are often imprecisely documented as plain text in a natural language such as English, supplemented with ad-hoc diagrams. Lack of quality in behavioral design documents causes poor communication

  8. A quality control method for nuclear instrumentation and control systems based on software safety prediction

    Science.gov (United States)

    Son, Han Seong; Seong, Poong Hyun

    2000-04-01

    In the case of safety-related applications like nuclear instrumentation and control (NI&C), safety-oriented quality control is required. The objective of this paper is to present a software safety classification method as a safety-oriented quality control tool. Based on this method, we predict the risk (and thus safety) of software items that are at the core of NI&C systems. Then we classify the software items according to the degree of the risk. The method can be used earlier than at the detailed design phase. Furthermore, the method can also be used in all the development phases without major changes. The proposed method seeks to utilize the measures that can be obtained from the safety analysis and requirements analysis. Using the measures proved to be desirable in a few aspects. The authors have introduced fuzzy approximate reasoning to the classification method because experts' knowledge covers the vague frontiers between good quality and bad quality with linguistic uncertainty and fuzziness. Fuzzy Colored Petri Net (FCPN) is introduced in order to offer a formal framework for the classification method and facilitate the knowledge representation, modification, or verification. Through the proposed quality control method, high-quality NI&C systems can be developed effectively and used safely.
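
    The role of fuzzy approximate reasoning in the classification can be illustrated with a minimal sketch: triangular membership functions map a crisp risk measure onto "low", "medium", and "high" classes, and the item is assigned the class with the largest membership degree. The membership boundaries are invented for illustration and do not reproduce the paper's FCPN-based model.

```python
# Minimal sketch of fuzzy risk classification in the spirit of the abstract:
# triangular membership functions turn a crisp risk measure into degrees of
# "low", "medium", and "high", and the item is assigned the class with the
# largest degree. Membership boundaries are illustrative assumptions only.

def triangular(x: float, a: float, b: float, c: float) -> float:
    """Membership of x in a triangular fuzzy set with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_risk(risk: float) -> dict:
    """Return membership degrees of a normalized risk measure in [0, 1]."""
    return {
        "low":    triangular(risk, -0.01, 0.0, 0.5),
        "medium": triangular(risk, 0.2, 0.5, 0.8),
        "high":   triangular(risk, 0.5, 1.0, 1.01),
    }

if __name__ == "__main__":
    for risk in (0.1, 0.45, 0.9):
        degrees = classify_risk(risk)
        label = max(degrees, key=degrees.get)
        print(f"risk={risk:.2f} -> {label} {degrees}")
```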

  9. 76 FR 44535 - Revisions to the California State Implementation Plan, Northern Sierra Air Quality Management...

    Science.gov (United States)

    2011-07-26

    ... the California State Implementation Plan, Northern Sierra Air Quality Management District, Sacramento Metropolitan Air Quality Management District, and South Coast Air Quality Management District AGENCY... the Northern Sierra Air Quality Management District (NSAQMD), Sacramento Metropolitan Air Quality...

  10. Image Quality Improvement after Implementation of a CT Accreditation Program

    International Nuclear Information System (INIS)

    Kim, You Sung; Jung, Seung Eun; Choi, Byung Gil; Shin, Yu Ri; Hwang, Seong Su; Ku, Young Mi; Lim, Yeon Soo; Lee, Jae Mun

    2010-01-01

    The purpose of this study was to evaluate any improvement in the quality of abdominal CTs after the utilization of the nationally based accreditation program. Approval was obtained from the Institutional Review Board, and informed consent was waived. We retrospectively analyzed 1,011 outside abdominal CTs, from 2003 to 2007. We evaluated images using a fill-up sheet form of the national accreditation program, and subjectively by grading the overall CT image quality. CT scans were divided into two categories according to time period: before and after the implementation of the accreditation program. We compared CT scans between the two periods according to the parameters pertaining to the evaluation of images, and determined whether there was a correlation between the results of the subjective assessment of image quality and the evaluation scores of the clinical images. The following parameters were significantly different after the implementation of the accreditation program: identifying data, display parameters, scan length, spatial and contrast resolution, window width and level, optimal contrast enhancement, slice thickness, and total score. The remaining parameters were not significantly different between scans obtained from the two different periods: scan parameters, film quality, and artifacts. After the implementation of the CT accreditation program, the quality of outside abdominal CTs showed marked improvement, especially for the parameters related to the scanning protocol.

  11. Development, implementation and quality assurance of biokinetic models within CONRAD

    International Nuclear Information System (INIS)

    Nosske, D.; Birchall, A.; Blanchardon, E.; Breustedt, B.; Giussani, A.; Luciani, A.; Oeh, U.; Lopez, M. A.

    2008-01-01

    The work of the Task Group 5.2 'Research Studies on Biokinetic Models' of the CONRAD project is presented. New biokinetic models have been implemented by several European institutions. Quality assurance procedures included intercomparison of the results as well as quality assurance of model formulation. Additionally, the use of the models was examined leading to proposals of tuning parameters. Stable isotope studies were evaluated with respect to their implications to the new models, and new biokinetic models were proposed on the basis of their results. Furthermore, the development of a biokinetic model describing the effects of decorporation of actinides by diethylenetriaminepentaacetic acid treatment was initiated. (authors)

  12. Efficient FPGA Implementation of a STBC-OFDM Combiner for an IEEE 802.16 Software Radio Receiver

    DEFF Research Database (Denmark)

    Cattoni, Andrea Fabio; Le Moullec, Yannick; Sacchi, Claudio

    2014-01-01

    In this paper, an efficient FPGA implementation of a 4x4 Space-Time Block Coding (STBC) combiner for MIMO-OFDM software radio receivers is considered. The proposed combiner is based on a low-complexity algorithm which reduces the interference due to the Quasi-Orthogonality of the STBC decoding...
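
    For readers unfamiliar with STBC combining, the sketch below shows the classic 2x2 Alamouti combining step, a simpler orthogonal relative of the 4x4 quasi-orthogonal code targeted by the paper. It illustrates the kind of linear combining such a block performs; it is not the paper's low-complexity interference-reducing algorithm, and the channel values are synthetic.

```python
# Minimal sketch of STBC combining for the classic 2x2 Alamouti code, a simpler
# orthogonal relative of the 4x4 quasi-orthogonal scheme the paper targets.
# This illustrates the kind of linear combining such a block performs; it is
# not the paper's low-complexity algorithm. Channel values are synthetic.
import numpy as np

def alamouti_combine(r1: complex, r2: complex, h1: complex, h2: complex):
    """Combine two received symbols (over two slots) into estimates of s1, s2."""
    s1_hat = np.conj(h1) * r1 + h2 * np.conj(r2)
    s2_hat = np.conj(h2) * r1 - h1 * np.conj(r2)
    gain = abs(h1) ** 2 + abs(h2) ** 2
    return s1_hat / gain, s2_hat / gain

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    s1, s2 = 1 + 1j, -1 + 1j                       # transmitted QPSK symbols
    h1, h2 = rng.standard_normal(2) + 1j * rng.standard_normal(2)
    r1 = h1 * s1 + h2 * s2                          # slot 1
    r2 = -h1 * np.conj(s2) + h2 * np.conj(s1)       # slot 2 (Alamouti encoding)
    print(alamouti_combine(r1, r2, h1, h2))         # ≈ (s1, s2) in the noiseless case
```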

  13. The Feasibility Study of Implementing a Fiber Optic Local Area Network in Software Metrics Laboratory in Ingersoll 158

    National Research Council Canada - National Science Library

    Be, Chai

    2004-01-01

    ... fiber components compared to the increased electronic costs of carrying Gigabit Ethernet over Cat 5 or Cat 5E UTP copper cabling has also accelerated the migration to optical fiber LANs. The thesis conducts a feasibility study of implementing a Fiber Optic Local Area Network in the Software Metrics Laboratory in Ingersoll 158.

  14. IMPLEMENTATION STRATEGY OF FREE SOFTWARE IN THE PROCESS OF PREPARATION OF TEACHERS OF MATHEMATICS, PHYSICS AND COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Vladyslav Ye. Velychko

    2016-01-01

    Full Text Available Information processes in society encourage a rethinking of the forms and methods of learning and call for the use of the didactic capabilities of information and communication technologies in teaching. No less important in this context is the problem of training professionals who are able to use the modern possibilities of computer technology. The training of highly qualified teachers is only possible using advanced technologies that cover the entire range of existing opportunities. An analysis of the software used in teacher preparation has shown insufficient use of a whole class of software - free software - in the educational process. To overcome this problem, an implementation strategy for free software in the preparation of teachers of mathematics, physics and computer science is proposed.

  15. Clinical audit and quality systems - practical implementation in Finland

    International Nuclear Information System (INIS)

    Jaervinen, H.

    2003-01-01

    Clinical audit is a new concept of significant importance for the quality of radiological practices, introduced by the EC Medical Exposure Directive (MED, 97/43/EURATOM). By definition, clinical audit means 'a systematic examination or review of medical radiological procedures which seeks to improve the quality and the outcome of patient care, through structured review whereby radiological practices, procedures, and results are examined against agreed standards for good medical radiological procedures, with modifications of the practices where indicated and the application of new standards if necessary'. In its most profound meaning, being introduced in the medical exposure directive, clinical audit can be seen as a review of the success in implementing the justification and optimization principles, and therefore, it is to a large extent an issue of radiation safety for the patient. According to the directive, clinical audits shall be 'carried out in accordance with national procedures'. For the last few years, parallel to the development of the MED in Europe, there has been a worldwide tendency to implement appropriate quality systems (QS) in the health care organizations, in accordance with the international quality standards (ISO 9000 series etc). Such quality systems have been applied for a long time and very widely by the industry. It is a strong belief that the development of quality systems for health care would result in equal benefits as trusted in industry, in terms of efficiency and safety of health care services. For radiological practices, the quality systems are expected to become a framework for improving the optimization of practices and for maintaining good radiation safety, as well as providing a mechanism to prevent mistakes and accidents. In some countries, like the UK and The Netherlands, there are legal requirements to establish and maintain quality systems at certain type of radiological units. In some countries and some radiological units

  16. Clinical audit and quality systems - practical implementation in Finland

    Energy Technology Data Exchange (ETDEWEB)

    Jaervinen, H. [Radiation and Nuclear Safety Authority, Helsinki (Finland)

    2003-06-01

    Clinical audit is a new concept of significant importance for the quality of radiological practices, introduced by the EC Medical Exposure Directive (MED, 97/43/EURATOM). By definition, clinical audit means 'a systematic examination or review of medical radiological procedures which seeks to improve the quality and the outcome of patient care, through structured review whereby radiological practices, procedures, and results are examined against agreed standards for good medical radiological procedures, with modifications of the practices where indicated and the application of new standards if necessary'. In its most profound meaning, being introduced in the medical exposure directive, clinical audit can be seen as a review of the success in implementing the justification and optimization principles, and therefore, it is to a large extent an issue of radiation safety for the patient. According to the directive, clinical audits shall be 'carried out in accordance with national procedures'. For the last few years, parallel to the development of the MED in Europe, there has been a worldwide tendency to implement appropriate quality systems (QS) in the health care organizations, in accordance with the international quality standards (ISO 9000 series etc). Such quality systems have been applied for a long time and very widely by the industry. It is a strong belief that the development of quality systems for health care would result in equal benefits as trusted in industry, in terms of efficiency and safety of health care services. For radiological practices, the quality systems are expected to become a framework for improving the optimization of practices and for maintaining good radiation safety, as well as providing a mechanism to prevent mistakes and accidents. In some countries, like the UK and The Netherlands, there are legal requirements to establish and maintain quality systems at certain type of radiological units. In some countries and some

  17. An empirical evaluation of software quality assurance practices and challenges in a developing country: a comparison of Nigeria and Turkey.

    Science.gov (United States)

    Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    The importance of quality assurance in the software development process cannot be overemphasized, because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization and improvement, amongst others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered quality planning, adherence to standardized processes and the inherent challenges, this work has been extended to include quality control, software process improvement and membership of international quality standards organizations. It also makes a comparison based on a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. A qualitative research approach, specifically the use of questionnaire research instruments, was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected, and this can be the cause of low patronage. Moreover, software practitioners are aware neither of international standards organizations nor of the required process improvement techniques; as such, their claimed standards are not aligned to those of accredited bodies and are limited to their local experience and knowledge, which makes them questionable. The comparison with Turkey also yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using Cronbach's alpha and proved reliable. For the software industry in developing countries to grow strong and be a viable source of external revenue, software assurance practices have to be taken seriously.

  18. IMPLEMENTATION OF INTEGRAL SYSTEM OF QUALITY MANAGEMENT IN TOURISM

    Directory of Open Access Journals (Sweden)

    Vidoje Vujic

    2007-12-01

    Full Text Available Theory and practice have corroborated the need for, and usefulness of, an integral approach to quality management in tourist destinations, since a destination and its tourist offering involve a number of disparate participants and interested parties. An integral system of quality management is one of the models of tourism management that, through the implementation of contemporary principles, strives to achieve business excellence and competitive advantage. The paper establishes the need for and importance of implementing an integral system and accordingly seeks to form a model for its development. By studying the oneness of the whole we have established the dependences and firm connections between particular norms and elements, and by analyzing them in this paper we describe the structure and associated characteristics of the whole.

  19. Implementing quality assurance with multiple contractors under changing regulations

    International Nuclear Information System (INIS)

    Hanrahan, T.

    1989-01-01

    This paper discusses the tools and techniques used to establish the responsibilities for quality implementation on the California Low-Level Radioactive Waste Disposal Project. The project is structured to use the traditional nuclear-oriented quality assurance criteria in the non-traditional application of the scientific investigations required for site characterization activities. It has required a careful blending of approaches and mentalities from backgrounds in both nuclear QA and EPA QA, as well as the selective application of guidance from both agencies. As a significant portion of the work was subcontracted to organizations ranging from a small group of college professors to a multi-national corporation, the program needed to be versatile and easy to implement, yet definitive. The author describes some of the misconceptions encountered and the areas of weakness to be identified and strengthened.

  20. Balancing compliance and cost when implementing a Quality Assurance program

    International Nuclear Information System (INIS)

    Pickering, S.Y.

    1997-12-01

    When implementing a Quality Assurance (QA) program, compliance and cost must be balanced. A QA program must be developed that hits the mark in terms of adequate control and documentation, but does not unnecessarily expend resources. As the Waste Isolation Pilot Plant (WIPP) has moved towards certification, Sandia National Laboratories has learned much about balancing compliance and costs. Some of these lessons are summarized here.

  1. Regulatory inspection of the implementation of quality assurance programmes

    International Nuclear Information System (INIS)

    1989-01-01

    This Manual provides guidance to Member States in the organization and performance of their regulatory inspection functions regarding the implementation of nuclear power plant quality assurance programmes. It addresses the interface between, and is consistent with, the IAEA Nuclear Safety Standards (NUSS programme) documents on quality assurance and governmental organization. The Manual offers a practical model and examples for performing regulatory inspections to ensure that the quality assurance programme is operating satisfactorily in the siting, design, manufacturing, construction, commissioning, operation and decommissioning of nuclear power plants. The primary objective is to confirm that the licensee has the capability to manage and control the effective performance of all quality assurance responsibilities during all phases of a nuclear power project. The guidance provided through this Manual for proper establishment and execution of the regulatory inspections helps to enforce the effective implementation of the quality assurance programme as a management control system that the nuclear industry should establish and use in attaining the safety and reliability objectives for nuclear installations. This enforcement action by national regulatory bodies and the emphasis on the purposes and advantages of quality assurance as an important management tool integrated within the total project task have been recommended by the IAEA International Nuclear Safety Advisory Group (INSAG). The primary intended users of this Manual are the management personnel and high level staff from regulatory bodies but it will also be helpful to management personnel from nuclear utilities and vendors. They all are inevitable partners in a nuclear power project and this document offers all of them valuable information on the better accomplishment of quality assurance activities to ensure the common objective of safe and reliable nuclear power production

  2. Report of AAPM Task Group 162: Software for planar image quality metrology.

    Science.gov (United States)

    Samei, Ehsan; Ikejimba, Lynda C; Harrawood, Brian P; Rong, John; Cunningham, Ian A; Flynn, Michael J

    2018-02-01

    The AAPM Task Group 162 aimed to provide a standardized approach for the assessment of image quality in planar imaging systems. This report offers a description of the approach as well as the details of the resultant software bundle to measure detective quantum efficiency (DQE) as well as its basis components and derivatives. The methodology and the associated software include the characterization of the noise power spectrum (NPS) from planar images acquired under specific acquisition conditions, modulation transfer function (MTF) using an edge test object, the DQE, and effective DQE (eDQE). First, a methodological framework is provided to highlight the theoretical basis of the work. Then, a step-by-step guide is included to assist in proper execution of each component of the code. Lastly, an evaluation of the method is included to validate its accuracy against model-based and experimental data. The code was built using a Macintosh OSX operating system. The software package contains all the source codes to permit an experienced user to build the suite on a Linux or other *nix type system. The package further includes manuals and sample images and scripts to demonstrate use of the software for new users. The results of the code are in close alignment with theoretical expectations and published results of experimental data. The methodology and the software package offered in AAPM TG162 can be used as baseline for characterization of inherent image quality attributes of planar imaging systems. © 2017 American Association of Physicists in Medicine.
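
    One ingredient of such an analysis, estimating a noise power spectrum from flat-field regions of interest, can be sketched as follows. The ROI size, pixel pitch, and normalization shown are common choices but are assumptions here; this is a simplified illustration, not the TG-162 software itself.

```python
# Minimal sketch of one ingredient of such an analysis: estimating a 2D noise
# power spectrum (NPS) from flat-field regions of interest by averaging the
# squared Fourier transform of mean-subtracted patches. ROI size and pixel
# pitch are assumptions; this is not the TG-162 code itself.
import numpy as np

def nps_2d(flat_image: np.ndarray, roi: int = 128, pixel_pitch_mm: float = 0.1):
    """Return (frequencies, NPS) estimated from non-overlapping ROIs of a flat image."""
    rows, cols = flat_image.shape
    patches = []
    for r in range(0, rows - roi + 1, roi):
        for c in range(0, cols - roi + 1, roi):
            patch = flat_image[r:r + roi, c:c + roi].astype(float)
            patches.append(patch - patch.mean())        # remove the DC component
    spectra = [np.abs(np.fft.fft2(p)) ** 2 for p in patches]
    nps = np.mean(spectra, axis=0) * pixel_pitch_mm ** 2 / (roi * roi)
    freqs = np.fft.fftshift(np.fft.fftfreq(roi, d=pixel_pitch_mm))   # cycles/mm
    return freqs, np.fft.fftshift(nps)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    flat = 1000 + 20 * rng.standard_normal((512, 512))   # synthetic flat-field image
    freqs, nps = nps_2d(flat)
    print(nps.shape, nps.mean())
```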

  3. Nursing Leader Collaboration to Drive Quality Improvement and Implementation Science.

    Science.gov (United States)

    Ryan, Rosemary W; Harris, Karen K; Mattox, Lisa; Singh, Olivine; Camp, Melanie; Shirey, Maria R

    2015-01-01

    Nursing leadership opportunities to improve quality and align resources in health care exist. An estimated 18% of United States gross domestic product is spent on health care delivery systems that produce poor outcomes. The purpose of this article was to describe how quality improvement and implementation science initiatives enhance outcomes using nursing leadership strategies that play an integral role in aligning key colleagues to drive the collaborative process. A critical appraisal of the literature was conducted, which supports the importance of evidenced-based practice improvement, collaborative change process, and professional role of nursing leadership. Limited evidence exists related to practice strategies for nursing leaders to implement sustainable change at the unit level for successful alignment of resources. Strategies based on Rogers' Diffusion of Innovation Theory are recommended to address the gap in the literature. The strategies aim to increase meaningful knowledge or the "why," create a tipping point, and implement sustainable change starting with the end in mind. Nurse leaders are a central component for driving alignment and implementing change at the unit level. Uses of the described evidenced-based strategies have implications for nursing practice, education, and scholarship.

  4. Quality Assurance Framework Implementation Guide for Isolated Community Power Systems

    Energy Technology Data Exchange (ETDEWEB)

    Esterly, Sean R. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Baring-Gould, Edward I. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Burman, Kari A. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Greacen, Chris [Independent Consultant (United States)

    2017-08-15

    This implementation guide is a companion document to the 'Quality Assurance Framework for Mini-Grids' technical report. This document is intended to be used by one of the many stakeholder groups that take part in the implementation of isolated power systems. Although the QAF could be applied to a single system, it was designed primarily to be used within the context of a larger national or regional rural electrification program in which many individual systems are being installed. This guide includes a detailed overview of the Quality Assurance Framework and provides guidance focused on the implementation of the Framework from the perspective of the different stakeholders that are commonly involved in expanding energy development within specific communities or regions. For the successful long-term implementation of a specific rural electrification program using mini-grid systems, six key stakeholders have been identified that are typically engaged, each with a different set of priorities: (1) regulatory agency, (2) governmental ministry, (3) system developers, (4) mini-utility, (5) investors, and (6) customers/consumers. This document is broken into two distinct sections. The first focuses on the administrative processes in the development and operation of community-based mini-grid programs, while the second focuses on the process around the installation of the mini-grid project itself.

  5. Implementing hospital quality assurance policies in Iran: balancing licensing, annual evaluation, inspections and quality management systems.

    Science.gov (United States)

    Aghaei Hashjin, Asgar; Delgoshaei, Bahram; Kringos, Dionne S; Tabibi, Seyed Jamaladin; Manouchehri, Jila; Klazinga, Niek S

    2015-01-01

    The purpose of this paper is to provide an overview of applied hospital quality assurance (QA) policies in Iran. A mixed method (quantitative data and qualitative document analysis) study was carried out between 1996 and 2010. The QA policy cycle forms a tight monitoring system to assure hospital quality by combining mandatory and voluntary methods in Iran. Licensing, annual evaluation and grading, and regulatory inspections are statutorily implemented by the government as a national package to assure and improve hospital care quality, while implementing quality management systems (QMS) is voluntary for hospitals. The government's strong role in QA policy legislation and support has been an important factor in the successful QA implementation in Iran, though it may have affected the independence and validity of QA assessment. Increased independence and repositioning of hospital evaluation, updated standards, professional involvement and effectiveness studies could increase QA policy impact and maturity. The study highlights the current QA policy implementation cycle in Iranian hospitals. It provides a basis for further quality strategy development in Iranian hospitals and elsewhere. It also draws attention to finding the optimal balance between different QA policies, which is topical for many countries. This paper describes experiences with implementing a unique approach, combining mandatory and voluntary QA policies simultaneously in a developing country that has invested considerably over time to improve hospital quality. The experiences with a mixed obligatory/voluntary approach and comprehensive policies in Iran may contain lessons for policy makers in developing and developed countries.

  6. The six critical attributes of the next generation of quality management software systems.

    Science.gov (United States)

    Clark, Kathleen

    2011-07-01

    Driven by both the need to meet regulatory requirements and a genuine desire to drive improved quality, quality management systems encompassing standard operating procedures, corrective and preventive actions, and related processes have existed for many years, both in paper and electronic form. The impact of quality management systems on 'actual' quality, however, is often reported as far less than desired. A quality management software system that moves beyond formal forms-driven processes to include a true closed loop design, manage disparate processes across the enterprise, provide support for collaborative processes and deliver insight into the overall state of control has the potential to close the gap between simply accomplishing regulatory compliance and delivering measurable improvements in quality and efficiency.

  7. EarthCollab, building geoscience-centric implementations of the VIVO semantic software suite

    Science.gov (United States)

    Rowan, L. R.; Gross, M. B.; Mayernik, M. S.; Daniels, M. D.; Krafft, D. B.; Kahn, H. J.; Allison, J.; Snyder, C. B.; Johns, E. M.; Stott, D.

    2017-12-01

    EarthCollab, an EarthCube Building Block project, is extending an existing open-source semantic web application, VIVO, to enable the exchange of information about scientific researchers and resources across institutions. EarthCollab is a collaboration between UNAVCO, a geodetic facility and consortium that supports diverse research projects informed by geodesy, The Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory, and Cornell University. VIVO has been implemented by more than 100 universities and research institutions to highlight research and institutional achievements. This presentation will discuss benefits and drawbacks of working with and extending open source software. Some extensions include plotting georeferenced objects on a map, a mobile-friendly theme, integration of faceting via Elasticsearch, extending the VIVO ontology to capture geoscience-centric objects and relationships, and the ability to cross-link between VIVO instances. Most implementations of VIVO gather information about a single organization. The EarthCollab project created VIVO extensions to enable cross-linking of VIVO instances to reduce the amount of duplicate information about the same people and scientific resources and to enable dynamic linking of related information across VIVO installations. As the list of customizations grows, so does the effort required to maintain compatibility between the EarthCollab forks and the main VIVO code. For example, dozens of libraries and dependencies were updated prior to the VIVO v1.10 release, which introduced conflicts in the EarthCollab cross-linking code. The cross-linking code has been developed to enable sharing of data across different versions of VIVO, however, using a JSON output schema standardized across versions. We will outline lessons learned in working with VIVO and its open source dependencies, which include Jena, Solr, Freemarker, and jQuery and discuss future

  8. Design and Implementation of Real-Time Software Radio for Anti-Interference GPS/WAAS Sensors

    Directory of Open Access Journals (Sweden)

    Per Enge

    2012-10-01

    Full Text Available Adaptive antenna array processing is widely known to provide significant anti-interference capabilities within a Global Navigation Satellite Systems (GNSS) receiver. A main challenge in the quest for such a receiver architecture has always been the computational/processing requirements. Even more demanding would be to try and incorporate the flexibility of the Software-Defined Radio (SDR) design philosophy in such an implementation. This paper documents a feasible approach to a real-time SDR implementation of a beam-steered GNSS receiver and validates its performance. This research implements a real-time software receiver on a widely-available x86-based multi-core microprocessor to process four-element antenna array data streams sampled with 16-bit resolution. The software receiver is capable of 12-channel all-in-view Controlled Reception Pattern Antenna (CRPA) array processing able to reject multiple interferers. Single Instruction Multiple Data (SIMD) instruction assembly coding and multithreaded programming, the keys to reducing the computational complexity of such an implementation, are fully documented within the paper. In conventional antenna array systems, receivers use the geometry of antennas and cable lengths known in advance. The documented CRPA implementation is architected to operate without extensive set-up and pre-calibration and leverages Space-Time Adaptive Processing (STAP) to provide adaptation in both the frequency and space domains. The validation component of the paper demonstrates that the developed software receiver operates in real time with live Global Positioning System (GPS) and Wide Area Augmentation System (WAAS) L1 C/A code signals. Further, the interference rejection capabilities of the implementation are also demonstrated using multiple synthetic interferers added to the live data stream.
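
    A heavily simplified, space-only special case of the adaptive processing described above can be sketched as MVDR-style weights computed from an estimated array covariance and a steering vector. The four-element uniform linear array with half-wavelength spacing and the synthetic jammer are assumptions; the paper's actual STAP implementation adapts in both space and frequency and is far more involved.

```python
# Minimal sketch of space-only adaptive beamforming (a special case of the
# STAP processing the abstract describes): MVDR-style weights computed from an
# estimated array covariance and a steering vector. The 4-element uniform
# linear array, half-wavelength spacing, and synthetic data are assumptions.
import numpy as np

def steering_vector(theta_deg: float, n_elements: int = 4, spacing_wavelengths: float = 0.5):
    theta = np.deg2rad(theta_deg)
    n = np.arange(n_elements)
    return np.exp(-2j * np.pi * spacing_wavelengths * n * np.sin(theta))

def mvdr_weights(snapshots: np.ndarray, steer: np.ndarray) -> np.ndarray:
    """snapshots: (n_elements, n_samples) complex baseband samples."""
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    R += 1e-3 * np.trace(R).real / R.shape[0] * np.eye(R.shape[0])  # diagonal loading
    Rinv_s = np.linalg.solve(R, steer)
    return Rinv_s / (steer.conj() @ Rinv_s)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n_samp = 2000
    jammer = steering_vector(40.0)[:, None] * (10 * (rng.standard_normal(n_samp)
                                                     + 1j * rng.standard_normal(n_samp)))
    noise = rng.standard_normal((4, n_samp)) + 1j * rng.standard_normal((4, n_samp))
    x = jammer + noise
    w = mvdr_weights(x, steering_vector(0.0))
    # Response toward the jammer should be strongly attenuated relative to boresight.
    print(abs(w.conj() @ steering_vector(0.0)), abs(w.conj() @ steering_vector(40.0)))
```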

  9. Geoscience data standards, software implementations, and the Internet. Where we came from and where we might be going.

    Science.gov (United States)

    Blodgett, D. L.

    2014-12-01

    Geographic information science and the coupled database and software systems that have grown from it have been evolving since the early 1990s. The multi-file shapefile package, invented early in this evolution, is an example of a highly generalized file format that can be used as an archival, interchange, and program-execution format. There are other formats, such as GeoTIFF and NetCDF, that have similar characteristics. These de facto standard formats (in contrast to formally defined and published standards), while not initially designed for machine-readable web services, are used in them extensively. Relying on these formats allows legacy software to be adapted to web services, but may require complicated software development to handle dynamic introspection of these legacy file formats' metadata. A generalized system of web-service types that offer archive, interchange, and run-time capabilities based on commonly implemented file formats and established web-service specifications has emerged from exemplar implementations. For example, an Open Geospatial Consortium (OGC) Web Feature Service is used to serve sites or model polygons, and an OGC Sensor Observation Service provides time series data for the sites. The broad system of data formats, web-service types, and freely available software that implements the system will be described. The presentation will include a perspective on the future of this basic system and how it relates to scientific domain-specific information models such as the Open Geospatial Consortium standards for geographic, hydrologic, and hydrogeologic data.
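
    As an illustration of the web-service side of this system, the sketch below issues a standard OGC WFS 2.0 GetFeature request for site features using key-value parameters. The server URL and feature type name are hypothetical placeholders; a real endpoint advertises its own in its GetCapabilities response.

```python
# Minimal sketch of querying an OGC Web Feature Service for site features with
# standard GetFeature key-value parameters. The server URL and typeNames value
# are placeholders; a real endpoint advertises its own via GetCapabilities.
import requests

WFS_URL = "https://example.org/geoserver/wfs"   # hypothetical endpoint

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "monitoring:sites",            # hypothetical feature type
    "outputFormat": "application/json",         # GeoJSON, if the server supports it
    "count": 10,
}

if __name__ == "__main__":
    response = requests.get(WFS_URL, params=params, timeout=30)
    response.raise_for_status()
    for feature in response.json().get("features", []):
        props = feature.get("properties", {})
        print(feature.get("id"), props.get("name"))
```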

  10. Secure software development training course

    Directory of Open Access Journals (Sweden)

    Victor S. Gorbatov

    2017-06-01

    Full Text Available Information security is one of the most important criteria for the quality of developed software. To obtain a sufficient level of application security, companies integrate a security process into the software development life cycle. At this stage, software companies face a shortage of employees able to solve problems of software design, implementation and application security. This article provides a description of a secure software development training course. The application security training course is designed for the co-education of students of different IT specializations.

  11. Process-based quality management for clinical implementation of adaptive radiotherapy

    International Nuclear Information System (INIS)

    Noel, Camille E.; Santanam, Lakshmi; Parikh, Parag J.; Mutic, Sasa

    2014-01-01

    Purpose: Intensity-modulated adaptive radiotherapy (ART) has been the focus of considerable research and developmental work due to its potential therapeutic benefits. However, in light of its unique quality assurance (QA) challenges, no one has described a robust framework for its clinical implementation. In fact, recent position papers by ASTRO and AAPM have firmly endorsed pretreatment patient-specific IMRT QA, which limits the feasibility of online ART. The authors aim to address these obstacles by applying failure mode and effects analysis (FMEA) to identify high-priority errors and appropriate risk-mitigation strategies for clinical implementation of intensity-modulated ART. Methods: An experienced team of two clinical medical physicists, one clinical engineer, and one radiation oncologist was assembled to perform a standard FMEA for intensity-modulated ART. A set of 216 potential radiotherapy failures composed by the forthcoming AAPM task group 100 (TG-100) was used as the basis. Of the 216 failures, 127 were identified as most relevant to an ART scheme. Using the associated TG-100 FMEA values as a baseline, the team considered how the likeliness of occurrence (O), outcome severity (S), and likeliness of failure being undetected (D) would change for ART. New risk priority numbers (RPN) were calculated. Failures characterized by RPN ≥ 200 were identified as potentially critical. Results: FMEA revealed that ART RPN increased for 38% (n = 48/127) of potential failures, with 75% (n = 36/48) attributed to failures in the segmentation and treatment planning processes. Forty-three of 127 failures were identified as potentially critical. Risk-mitigation strategies include implementing a suite of quality control and decision support software, specialty QA software/hardware tools, and an increase in specially trained personnel. Conclusions: Results of the FMEA-based risk assessment demonstrate that intensity-modulated ART introduces different (but not necessarily
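
    The risk arithmetic used in the analysis is simple enough to sketch directly: each failure mode receives occurrence (O), severity (S), and detectability (D) scores, the risk priority number is RPN = O x S x D, and failures with RPN >= 200 are flagged as potentially critical. The example failure modes and scores below are invented and are not the TG-100 or study values.

```python
# Minimal sketch of the FMEA arithmetic described in the abstract:
# RPN = O x S x D, with failures at RPN >= 200 flagged as potentially critical.
# The example failure modes and scores are invented, not the TG-100 values.

FAILURES = [
    # (description, occurrence O, severity S, detectability D)
    ("wrong structure propagated to adapted plan", 4, 9, 6),
    ("outdated image registration used for recontouring", 3, 8, 5),
    ("plan exported without physics review", 2, 10, 9),
]

RPN_THRESHOLD = 200

def risk_priority(o: int, s: int, d: int) -> int:
    return o * s * d

if __name__ == "__main__":
    for desc, o, s, d in sorted(FAILURES, key=lambda f: -risk_priority(*f[1:])):
        rpn = risk_priority(o, s, d)
        flag = "CRITICAL" if rpn >= RPN_THRESHOLD else "ok"
        print(f"RPN={rpn:4d} [{flag}] {desc}")
```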

  12. Implementation of quality systems by Mexican exporters of processed meat.

    Science.gov (United States)

    Maldonado-Siman, E; Bernal-Alcántara, R; Cadena-Meneses, J A; Altamirano-Cárdenas, J R; Martinez-Hernández, P A

    2014-12-01

    Requirements of hazard analysis and critical control points (HACCP) are becoming essential for international trade in food commodities as a safety assurance component. This research reports the level of the adoption of ISO 9000 and the HACCP system by Federal Inspection Type (TIF) pork-exporting enterprises. Implementation and operating costs are reported as well as the benefits involved in this food industry process. In Mexico, there are 97 companies classified as TIF enterprises, and 22 are registered as exporters of processed pork with the National Services for Safety and Quality and Animal Health of the Secretariat of Agriculture, Livestock, Rural Development, Fisheries and Food. Surveys were administered to 22 companies, with a 95.2% response rate. Enterprise characteristics were evaluated, as well as their operating activities. Fieldwork consisted of administering structured questionnaires to TIF exporters. All the surveyed enterprises had implemented HACCP, whereas the ISO 9000 regulation was applied in only 30%. Of total production, 75% is exported to 13 countries, and 25% goes to the Mexican market niche. Results indicate that the main factors for adopting HACCP are related to accessibility to international markets, improving quality, and reducing product quality audits by customers. The results also indicated that staff training was the most important issue. Microbiological testing was the highest cost of the operation. The main benefits reported were related to better access to international markets and a considerable reduction in microbial counts. This study shows the willingness of Mexican pork processors to implement food safety protocols for producing safe and quality products to compete in the international food trade.

  13. Quality control in urodynamics and the role of software support in the QC procedure.

    Science.gov (United States)

    Hogan, S; Jarvis, P; Gammie, A; Abrams, P

    2011-11-01

    This article aims to identify quality control (QC) best practice, to review published QC audits in order to identify how closely good practice is followed, and to carry out a market survey of the software features that support QC offered by urodynamics machines available in the UK. All UK distributors of urodynamic systems were contacted and asked to provide information on the software features relating to data quality of the products they supply. The results of the market survey show that the features offered by manufacturers differ greatly. Automated features, which can be turned off in most cases, include: cough recognition, detrusor contraction detection, and high pressure alerts. There are currently no systems that assess data quality based on published guidelines. A literature review of current QC guidelines for urodynamics was carried out; QC audits were included in the literature review to see how closely guidelines were being followed. This review highlights the fact that basic QC is not being carried out effectively by urodynamicists. Based on the software features currently available and the results of the literature review there is both the need and capacity for a greater degree of automation in relation to urodynamic data quality and accuracy assessment. Some progress has been made in this area and certain manufacturers have already developed automated cough detection. Copyright © 2011 Wiley Periodicals, Inc.
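
    As a rough illustration of one automated feature mentioned in the market survey, the sketch below implements a naive cough-recognition check: a cough appears as a brief, simultaneous pressure rise in both the vesical and abdominal channels. The threshold and the synthetic traces are arbitrary assumptions and do not correspond to any manufacturer's algorithm.

```python
# Heavily hedged sketch of a naive cough-recognition check of the kind the
# survey discusses: a cough appears as a brief, simultaneous pressure spike in
# both the vesical (pves) and abdominal (pabd) channels. The threshold and the
# synthetic traces are arbitrary assumptions, not any vendor's algorithm.
import numpy as np

def detect_cough_spikes(pves: np.ndarray, pabd: np.ndarray,
                        threshold_cmH2O: float = 20.0) -> np.ndarray:
    """Return sample indices where both channels jump by more than the threshold."""
    dves = np.diff(pves)
    dabd = np.diff(pabd)
    return np.where((dves > threshold_cmH2O) & (dabd > threshold_cmH2O))[0] + 1

if __name__ == "__main__":
    n = 200
    pves = np.full(n, 15.0)
    pabd = np.full(n, 10.0)
    pves[100:103] += 40.0     # simulated cough transient present in both channels
    pabd[100:103] += 38.0
    print(detect_cough_spikes(pves, pabd))   # expect index 100
```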

  14. QCScreen: a software tool for data quality control in LC-HRMS based metabolomics.

    Science.gov (United States)

    Simader, Alexandra Maria; Kluger, Bernhard; Neumann, Nora Katharina Nicole; Bueschl, Christoph; Lemmens, Marc; Lirk, Gerald; Krska, Rudolf; Schuhmacher, Rainer

    2015-10-24

    Metabolomics experiments often comprise large numbers of biological samples resulting in huge amounts of data. This data needs to be inspected for plausibility before data evaluation to detect putative sources of error e.g. retention time or mass accuracy shifts. Especially in liquid chromatography-high resolution mass spectrometry (LC-HRMS) based metabolomics research, proper quality control checks (e.g. for precision, signal drifts or offsets) are crucial prerequisites to achieve reliable and comparable results within and across experimental measurement sequences. Software tools can support this process. The software tool QCScreen was developed to offer a quick and easy data quality check of LC-HRMS derived data. It allows a flexible investigation and comparison of basic quality-related parameters within user-defined target features and the possibility to automatically evaluate multiple sample types within or across different measurement sequences in a short time. It offers a user-friendly interface that allows an easy selection of processing steps and parameter settings. The generated results include a coloured overview plot of data quality across all analysed samples and targets and, in addition, detailed illustrations of the stability and precision of the chromatographic separation, the mass accuracy and the detector sensitivity. The use of QCScreen is demonstrated with experimental data from metabolomics experiments using selected standard compounds in pure solvent. The application of the software identified problematic features, samples and analytical parameters and suggested which data files or compounds required closer manual inspection. QCScreen is an open source software tool which provides a useful basis for assessing the suitability of LC-HRMS data prior to time consuming, detailed data processing and subsequent statistical analysis. It accepts the generic mzXML format and thus can be used with many different LC-HRMS platforms to process both multiple
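
    The kind of target-feature check the tool performs can be sketched with a few lines of pandas: for each target compound, compare the spread of observed retention times and the parts-per-million mass error across samples against user-defined tolerances. The column names, tolerances, and example values are assumptions, not QCScreen's implementation.

```python
# Simplified sketch of the kind of check the tool performs: for each target
# feature, compare observed retention time and m/z across QC samples against
# tolerances. Column names, tolerances, and the example data are assumptions.
import pandas as pd

RT_TOLERANCE_MIN = 0.2      # allowed retention-time spread, minutes
MASS_TOLERANCE_PPM = 5.0    # allowed mass error, parts per million

observations = pd.DataFrame({
    "feature":     ["caffeine", "caffeine", "caffeine", "reserpine", "reserpine"],
    "expected_mz": [195.0877, 195.0877, 195.0877, 609.2807, 609.2807],
    "observed_mz": [195.0879, 195.0875, 195.0901, 609.2810, 609.2805],
    "rt_min":      [3.41, 3.44, 3.43, 7.02, 7.55],
})

def qc_report(df: pd.DataFrame) -> pd.DataFrame:
    df = df.assign(ppm_error=(df.observed_mz - df.expected_mz) / df.expected_mz * 1e6)
    grouped = df.groupby("feature").agg(
        rt_spread=("rt_min", lambda s: s.max() - s.min()),
        max_abs_ppm=("ppm_error", lambda s: s.abs().max()),
    )
    grouped["rt_ok"] = grouped.rt_spread <= RT_TOLERANCE_MIN
    grouped["mass_ok"] = grouped.max_abs_ppm <= MASS_TOLERANCE_PPM
    return grouped

if __name__ == "__main__":
    print(qc_report(observations))
```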

  15. [Quality of vocational education in speech therapy - development and implementation of a quality assurance programme].

    Science.gov (United States)

    Ullrich, A; Kawski, S; Koch, U; Härter, M

    2014-12-01

    The provision of high-quality health-services is only possible if it is based on vocational education of corresponding quality. To promote the quality of vocational education in speech therapy, a quality assurance programme was developed in a scientifically supervised multi-step process. The main goals of the quality assurance programme include: (i) external review of the quality of education by means of well-defined criteria, (ii) certification of schools that meet the requirements, and (iii) provision of feedback to schools about their results. A total of 208 quality indicators cover the essential aspects of vocational education in speech therapy, and apply to the structural, process and outcome quality. These indicators are based on a literature survey as well as on expert opinion, and are calibrated by data. The data are collected by using questionnaires (school management, teachers in speech therapy, students, consecutive patient sample) and are validated by specific document analyses and telephone audits. Each school receives an individual quality report of its achieved results benchmarked to other schools. Since the initial implementation in 2008, a total of 50 schools participated in the quality assurance programme and 41 achieved certification. Therefore, the defined set of quality criteria has been disseminated and utilized by about half of all German schools for vocational education in speech therapy. The evaluation of the data on quality collected across all schools highlights the strengths and weaknesses of vocational education as well as the demands for quality improvement. © Georg Thieme Verlag KG Stuttgart · New York.

  16. Design: Vital step for Quality management system implementation.

    Directory of Open Access Journals (Sweden)

    Paula Evenilda Cruz Esmoris

    2009-12-01

    Full Text Available The Sancti Spíritus organization Acopio, Beneficio y Comercialización de Productos Apícolas developed a Quality Management System (QMS) according to NC-ISO 9001:2008 "Quality management systems - Requirements". In this organization, QMS implementation is not only a strategy to increase competitiveness and client satisfaction, but also a requirement to improve its performance, and it answers a demand of the international market, because honey is an exportable product. This article presents the QMS design through a description of its key elements. The design was based on legal and regulatory requirements from the corresponding organizations and on technical guidance issued by the Empresa Nacional Apícola. The items taken into account were related to the definition of: the scope (products, processes and areas); the QMS processes and their relations (map); product-related processes and activities; the needed resources; responsibilities within the QMS; barriers and favorable forces for QMS implementation; and the documentation to support QMS processes. A correct design allows successful implementation of the next steps of the QMS and compliance with the Instituto Nacional de Normalización methodology.

  17. Implementing SNOMED CT for Quality Reporting: Avoiding Pitfalls.

    Science.gov (United States)

    Wade, G

    2011-01-01

    To implement the SNOMED CT electronic specifications for reporting quality measures and to identify critical issues that affect implementation. The Centers for Medicare and Medicaid (CMS) have issued the electronic specifications for reporting quality measures requiring vendors and hospital systems to use standardized data elements to provide financial incentives for eligible providers. The electronic specifications from CMS were downloaded and extracted. All SNOMED CT codes were examined individually as part of the creation of a mapping table for distribution by a vendor for incorporation into electronic health record systems. A qualitative and quantitative evaluation of the SNOMED CT codes was done as a follow up to the mapping project. A total of 10643 SNOMED codes were examined for the 44 measures. The approved SNOMED CT code sets contain aberrancies in content such as incomplete IDs, the use of description IDs instead of concept IDs, inactive codes, morphology and observable codes for clinical findings and the inclusion of non-human content. Implementers of these approved specifications must do additional rigorous review and make edits in order to avoid incorporating errors into their EHR products and systems.
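
    The aberrancies listed in this record (incomplete IDs, description IDs where concept IDs are expected, inactive codes) lend themselves to simple automated screening before a code set is loaded into an EHR. The Python sketch below is hypothetical and not the mapping-table tooling described here; it assumes an externally supplied set of active concept identifiers, and the partition-digit check encodes the usual SCTID convention as an assumption of this illustration, not something stated in the abstract.

```python
# Hypothetical pre-load screening of SNOMED CT identifiers (SCTIDs); not the
# mapping-table process described in the abstract.
def screen_sctid(code: str, active_concepts: set[str]) -> list[str]:
    """Return a list of problems found for one candidate concept identifier."""
    problems = []
    if not code.isdigit() or not 6 <= len(code) <= 18:
        problems.append("malformed or incomplete ID")
        return problems
    # Digits 2-3 from the right are the partition identifier:
    # 00/10 denote concepts, 01/11 denote descriptions (assumed convention).
    partition = code[-3:-1]
    if partition in ("01", "11"):
        problems.append("description ID supplied where a concept ID is expected")
    elif partition not in ("00", "10"):
        problems.append("unexpected partition identifier")
    if code not in active_concepts:
        problems.append("inactive or unknown concept")
    return problems

# Example with a well-known concept: 22298006 |Myocardial infarction|.
print(screen_sctid("22298006", active_concepts={"22298006"}))  # []
```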

  18. Characteristics for Software Optimization Projects

    Directory of Open Access Journals (Sweden)

    Iulian NITESCU

    2008-01-01

Full Text Available The increasing complexity of software systems requires the identification and implementation of methods and techniques to manage it. The software optimization project is one way in which software complexity is controlled. Such a project must also address the organization's need to earn a profit. The software optimization project is an integral part of the application life cycle because it shares the same resources as, depends on, and influences the other stages and phases. The optimization project has some particularities because it works on a finished product, centred on its quality. The process is quality- and performance-oriented and assumes that the product life cycle is almost finished.

  19. Projeto Seis Sigma para a implementação de software de programação Six Sigma project for scheduling software implementation

    Directory of Open Access Journals (Sweden)

    Rogério Cerávolo Calia

    2005-12-01

Full Text Available The article aims to analyze the organizational effectiveness of the Six Sigma methodology in project management for reducing delays and inventory in manufacturing, through the implementation of software with Theory of Constraints algorithms. Initially, a bibliographic review on project management is presented from the perspective of managing organizational change in business processes. The article then reviews the concepts of the Six Sigma methodology for project management and of the Theory of Constraints algorithms. Case studies of two implementation projects of the Theory of Constraints software are then described, of which only one adopted the Six Sigma methodology for managing the project. In the analysis of the results, the article discusses the reasons why the project that used the Six Sigma methodology reduced inventory three times faster than the project without Six Sigma.

  20. Are We Working Well with Others? How the Multi Team Systems Impact Software Quality

    Directory of Open Access Journals (Sweden)

    Mathieu Lavallée

    2018-01-01

Full Text Available Background: There are many studies on software development teams, but few about the interactions between teams. Current findings suggest that these multi-team systems may have a significant impact on software development projects. Aim: The objective of this exploratory study is to provide more evidence on multi-team systems in software engineering and identify challenges with a potential impact on software quality. Method: A non-participatory approach was used to collect data on one development project within a large telecommunication organization. Verbal interactions between team members were analyzed using a coding scheme following the Grounded Theory approach. Results: The results show that the interactions between teams are often technical in nature, outlining technical dependencies between departments, external providers, and even clients. Conclusion: This article hypothesizes that managers of large software projects should (1) identify external teams most likely to interfere with their development work, (2) appoint brokers to redirect external requests to the appropriate resource, and (3) ensure that there are opportunities to discuss technical issues at the multi-team level. Failure to do so could result in delays and the persistence of codebase-wide issues.

  1. STATIC CODE ANALYSIS FOR SOFTWARE QUALITY IMPROVEMENT: A CASE STUDY IN BCI FRAMEWORK DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Indar Sugiarto

    2008-01-01

Full Text Available This paper shows how a systematic approach to software testing using static code analysis can be used to improve the software quality of a BCI framework. The method is best performed during the development phase of framework programs. In the proposed approach, we evaluate several software metrics which are based on the principles of object-oriented design. Since such a method depends on the underlying programming language, we describe it in terms of C++, with the Qt platform currently in use. One of the most important metrics is the so-called software complexity. Applying the software complexity calculation, using both the McCabe and Halstead methods, to the BCI framework, which covers two important types of BCI (SSVEP and P300), we found two classes in the framework which are very complex and prone to violating the cohesion principle of OOP. The other metrics fit the criteria of the proposed framework aspects: MPC is less than 20, the average complexity is around 5, and the maximum depth is below 10 blocks. Such variables are considered very important when further developing the BCI framework in the future.
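
    The abstract singles out McCabe's cyclomatic complexity as a key metric; the authors' own tooling targets C++/Qt and is not reproduced here. As a minimal illustration of the metric itself, the hypothetical Python sketch below approximates V(G) for each function as one plus the number of decision points found in its syntax tree; the function name and the choice of decision-node types are this sketch's own, not taken from the paper.

```python
import ast

# Node types treated as decision points (an approximation of McCabe's V(G):
# each one adds an independent path through the code).
_DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                   ast.BoolOp, ast.IfExp, ast.comprehension)

def cyclomatic_complexity(source: str) -> dict:
    """Return an approximate McCabe complexity per function found in `source`."""
    results = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            decisions = sum(isinstance(child, _DECISION_NODES)
                            for child in ast.walk(node))
            results[node.name] = decisions + 1   # +1 for the single entry path
    return results

if __name__ == "__main__":
    sample = """
def classify(x):
    if x < 0:
        return "negative"
    for _ in range(3):
        if x % 2 == 0 and x > 10:
            return "big even"
    return "other"
"""
    print(cyclomatic_complexity(sample))   # {'classify': 5}
```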

  2. Implementing and validating of pan-sharpening algorithms in open-source software

    Science.gov (United States)

    Pesántez-Cobos, Paúl; Cánovas-García, Fulgencio; Alonso-Sarría, Francisco

    2017-10-01

Several approaches have been used in remote sensing to integrate images with different spectral and spatial resolutions in order to obtain fused enhanced images. The objective of this research is three-fold: to implement in R three image fusion techniques (High Pass Filter, Principal Component Analysis and Gram-Schmidt); to apply these techniques to the merging of multispectral and panchromatic images from five scenes with different spatial resolutions; and finally, to evaluate the results using the universal image quality index (Q index) and the ERGAS index. As regards the qualitative analysis, Landsat-7 and Landsat-8 show greater colour distortion with the three pansharpening methods, although the results for the other images were better. The Q index revealed that HPF fusion performs better for the QuickBird, IKONOS and Landsat-7 images, followed by GS fusion, whereas for the Landsat-8 and Natmur-08 images the results were more even. Regarding the ERGAS spatial index, the PCA algorithm performed better for the QuickBird, IKONOS, Landsat-7 and Natmur-08 images, followed closely by the GS algorithm; only for the Landsat-8 image did the GS fusion give the best result. In the evaluation of the spectral components, HPF results tended to be better and PCA results worse; the opposite was the case for the spatial components. Better quantitative results are obtained for the Landsat-7 and Landsat-8 images with the three fusion methods than for the QuickBird, IKONOS and Natmur-08 images. This contrasts with the qualitative evaluation, reflecting the importance of separating the two evaluation approaches (qualitative and quantitative). Significant disagreement may arise when different methodologies are used to assess the quality of an image fusion. Moreover, it is not possible to designate, a priori, a given algorithm as the best, not only because of the different characteristics of the sensors, but also because of the different atmospheric conditions or peculiarities of the
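
    The ERGAS index used above has a standard closed form, ERGAS = 100 (h/l) sqrt((1/N) Σ_k RMSE_k² / μ_k²), where h/l is the ratio of panchromatic to multispectral pixel sizes, N the number of bands and μ_k the mean of band k in the reference image. The Python sketch below is only an illustrative translation of that formula (the authors implemented their work in R), not their code.

```python
import numpy as np

def ergas(reference: np.ndarray, fused: np.ndarray, ratio: float) -> float:
    """ERGAS (Erreur Relative Globale Adimensionnelle de Synthèse).

    reference, fused: arrays of shape (bands, rows, cols).
    ratio: spatial resolution ratio h/l (e.g. 0.25 for 1 m pan vs 4 m MS).
    Lower values indicate better spectral quality of the fused image.
    """
    bands = reference.shape[0]
    acc = 0.0
    for k in range(bands):
        rmse_k = np.sqrt(np.mean((reference[k] - fused[k]) ** 2))
        mean_k = np.mean(reference[k])
        acc += (rmse_k / mean_k) ** 2
    return 100.0 * ratio * np.sqrt(acc / bands)
```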

  3. Clinical implementation and quality assurance for intensity modulated radiation therapy

    International Nuclear Information System (INIS)

    Ma, C.-M.; Price, R.; McNeeley, S.; Chen, L.; Li, J.S.; Wang, L.; Ding, M.; Fourkal, E.; Qin, L.

    2002-01-01

    This paper describes the clinical implementation and quality assurance (QA) for intensity-modulated radiation therapy (IMRT) based on the experience at Fox Chase Cancer Center, Philadelphia, USA. We will review our procedures for the clinical implementation of the IMRT technique and the requirements for patient immobilization, target delineation, treatment optimization, beam delivery and system administration. We will discuss the dosimetric requirements and measurement procedures for beam commissioning and dosimetry verification for IMRT. We will examine the details of model-based dose calculation for IMRT treatment planning and the potential problems with such dose calculation algorithms. We will discuss the effect of beam delivery systems on the actual dose distributions received by the patients and the methods to incorporate such effects in the treatment optimization process. We will investigate the use of the Monte Carlo method for dose calculation and treatment verification for IMRT

  4. 78 FR 21540 - Revisions to the California State Implementation Plan, Butte County Air Quality Management...

    Science.gov (United States)

    2013-04-11

    ... the California State Implementation Plan, Butte County Air Quality Management District and Sacramento Metropolitan Air Quality Management District AGENCY: Environmental Protection Agency (EPA). ACTION: Direct... Quality Management District (BCAQMD) and Sacramento Metropolitan Air Quality Management District (SMAQMD...

  5. 76 FR 44493 - Revisions to the California State Implementation Plan, Northern Sierra Air Quality Management...

    Science.gov (United States)

    2011-07-26

    ... California State Implementation Plan, Northern Sierra Air Quality Management District, Sacramento Metropolitan Air Quality Management District, and South Coast Air Quality Management District AGENCY... approve revisions to the Northern Sierra Air Quality Management District (NSAQMD), Sacramento Metropolitan...

  6. 77 FR 12495 - Revisions to the California State Implementation Plan, Antelope Valley Air Quality Management...

    Science.gov (United States)

    2012-03-01

    ... the California State Implementation Plan, Antelope Valley Air Quality Management District and Mojave Desert Quality Management District AGENCY: Environmental Protection Agency (EPA). ACTION: Direct final... Quality Management District (AVAQMD) and Mojave Desert Air Quality Management District (MDAQMD) portion of...

  7. 75 FR 40726 - Revisions to the California State Implementation Plan, Sacramento Metropolitan Air Quality...

    Science.gov (United States)

    2010-07-14

    ... the California State Implementation Plan, Sacramento Metropolitan Air Quality Management District and South Coast Air Quality Management District AGENCY: Environmental Protection Agency (EPA). ACTION... Metropolitan Air Quality Management District (SMAQMD) and South Coast Air Quality Management District (SCAQMD...

  8. The Data Quality Monitoring Software for the CMS experiment at the LHC

    CERN Document Server

    AUTHOR|(CDS)2071602

    2014-01-01

The Data Quality Monitoring (DQM) Software is a central tool in the CMS experiment. Its flexibility allows for integration in several key environments: Online, for real-time detector monitoring; Offline, for the final, fine-grained Data Certification; Release-Validation, to constantly validate the functionalities and the performance of the reconstruction software; and in Monte Carlo productions. Since the end of data taking at a center of mass energy of 8 TeV, the environment in which the DQM lives has undergone fundamental changes. In turn, the DQM system has made significant upgrades in many areas to respond not only to the changes in infrastructure, but also to the growing specialized needs of the collaboration, with an emphasis on more sophisticated methods for evaluating data quality, as well as advancing the DQM system to provide quality assessments of various Monte Carlo simulations versus data distributions, monitoring changes in physical effects due to modifications of algorithms or framework, and enabling reg...

  9. System For Inspection And Quality Assurance Of Software: A Knowledge-Based Experiment With Code Understanding

    Science.gov (United States)

    Das, Bikas K.

    1989-03-01

    This paper describes a knowledge-based prototype that inspects and quality assures software components. The prototype model, which offers a singular representation of these components, is used to automate both the mechanical and nonmechanical activities in the quality assurance (QA) process. It will be shown that the prototype, in addition to automating the QA process, provides a novel approach to understanding code. Our approaches are compared with recent approaches to code understanding. The paper also presents the results of an experiment with several classes of nonsyntactic bugs. It is argued that a structured environment, as facilitated by our unique architecture along with "software development standards" used in the QA process, is essential for meaningful analysis of code. Initial success with the prototype has generated several interesting directions for future work.

  10. System for inspection and quality assurance of software - A knowledge-based experiment with code understanding

    International Nuclear Information System (INIS)

    Das, B.K.

    1989-01-01

    This paper describes a knowledge-based prototype that inspects and quality-assures software components. The prototype model, which offers a singular representation of these components, is used to automate both the mechanical and nonmechanical activities in the quality assurance (QA) process. It is shown that the prototype, in addition to automating the QA process, provides a novel approach to understanding code. These approaches are compared with recent approaches to code understanding. The paper also presents the results of an experiment with several classes of nonsyntactic bugs. It is argued that a structured environment, as facilitated by this unique architecture, along with software development standards used in the QA process, is essential for meaningful analysis of code. 8 refs

  11. Implementation of Successful Practices Using an Iterative Development Methodology for an AEGIS Configuration Management Software Application

    National Research Council Canada - National Science Library

    Colston, Sharon

    1998-01-01

    This paper documents a two-and-a-half year software development project of the Combat Systems Configuration Management Branch of the Combat Systems Department at Naval Surface Warfare Center, Dahlgren Division (NSWCDD...

  12. Specialized software for optimization of the quality control of the mammography units

    International Nuclear Information System (INIS)

    Stoeva, M.; Vassileva, J.

    2004-01-01

Quality control is essential to ensure that the equipment used is reliable and consistent in order to keep radiation doses as low as reasonably achievable whilst optimizing image quality and performance in mammography. The effectiveness of mammographic screening is highly dependent on the consistent production of high quality diagnostic images. Mammography is highly dependent on the equipment status, which requires an effective Quality Control (QC) programme to provide tools for continuous assessment of the equipment performance as well as storage and analysis of the protocols' data. The objective of this paper is to present specialized software for quality control of mammography units, as a tool providing additional functionality for optimization of mammography QC data storage and management. The PC program was developed according to the requirements stated in the European protocol for quality control of mammography screening and the data collected as a result of its application in several Bulgarian hospitals. The Structured Analysis method was used to perform a case analysis, which resulted in the development of the specialized software with a database module providing the following functionality: Data Storage, Preliminary Data Processing and Post-Processing, Manual Data Entry, Data Import from XLS format, Data Export to XLS format, Printing, Data Filters, Automated Calculation, Automated Graphical Representation, and Archiving. The development of specialized QC software with a database for mammography units facilitates the process of QC data storage and handling and minimizes errors. The electronic format for data storage is especially useful for long-term storage and periodic data analysis/access. The integrated data processing functionality and the automated import/export features, based on a standard platform, increase the compatibility of the data. (authors)

  13. Implementation and evaluation of a gravimetric i.v. workflow software system in an oncology ambulatory care pharmacy.

    Science.gov (United States)

    Reece, Kelley M; Lozano, Miguel A; Roux, Ryan; Spivey, Susan M

    2016-02-01

    The implementation and evaluation of a gravimetric i.v. workflow software system in an oncology ambulatory care pharmacy are described. To estimate the risk involved in the sterile i.v. compounding process, a failure modes and effects analysis (FMEA) in the oncology ambulatory care pharmacy was performed. When a volumetric-based process was used to reconstitute vials, the actual concentration was unknown since an assumption must be made that the exact volume of diluent was used when reconstituting the drug. This gap in our process was discovered during the FMEA and was resolved with the implementation of an i.v. workflow software solution. The i.v. software system standardized preparation steps and documented each process step, enabling a systematic review of the metrics for safety, productivity, and drug waste. Over the study period, 15,843 doses were prepared utilizing the new technology, with a total of 1,126 errors (7%) detected by the workflow software during dose preparation. Barcode scanning detected 292 (26%) of the total errors, the gravimetric weighing step detected 797 (71%) deviation errors, and 37 (3%) errors were detected at the vial reconstitution step. All errors were detected during compounding, eliminating the need to correct errors after production. Technician production time decreased by 34%, and pharmacist checking time decreased by 37%. Implementation of a gravimetric-based software system that used barcode verification and real-time alerts improved the detection of errors in the chemotherapy preparation process when compared with self-reporting. Standardized workflow processes and the elimination of time-consuming manual steps increased productivity while vial management decreased costs. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
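
    The gravimetric step described above compares the measured weight of each prepared dose against the weight expected from the ordered volume and the solution's density. The snippet below is a hypothetical illustration of that check, not the commercial workflow software evaluated in the study; the 5% tolerance is a placeholder.

```python
# Hypothetical gravimetric check: illustrates the weight-based verification idea,
# not the vendor system described in the abstract.
def check_dose(expected_volume_ml: float,
               density_g_per_ml: float,
               measured_weight_g: float,
               tolerance_pct: float = 5.0) -> tuple[bool, float]:
    """Compare the measured weight of a prepared dose against its expected weight.

    Returns (within_tolerance, deviation_percent).
    """
    expected_weight_g = expected_volume_ml * density_g_per_ml
    deviation_pct = 100.0 * (measured_weight_g - expected_weight_g) / expected_weight_g
    return abs(deviation_pct) <= tolerance_pct, deviation_pct

# Example: 50 mL of a solution with density 1.02 g/mL, scale reads 49.4 g.
ok, dev = check_dose(50.0, 1.02, 49.4)
print(ok, round(dev, 1))  # True -3.1 (within the 5% tolerance)
```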

  14. Software Quality Assurance and Verification for the MPACT Library Generation Process

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yuxuan [Univ. of Michigan, Ann Arbor, MI (United States); Williams, Mark L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wiarda, Dorothea [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Clarno, Kevin T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Celik, Cihangir [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-05-01

    This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to a degree to (1) ensure that it can be run without user intervention and (2) to ensure that the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library’s release is described in detail.

  15. Quality of healthcare related software applications--setting up an accreditation system in Hungary.

    Science.gov (United States)

    Lakner, G; Balkányi, L; Surján, G; Kovács, J

    1997-01-01

To meet expectations of high quality health care, the safe and secure operation of medical information systems is a "must". However, for healthcare software, nationwide quality control systems are not widely used. A quality control project for health care applications in Hungary was launched in 1996 by the Hungarian Society of Healthcare Informatics (MEIT) and the Medico-Biological Section of the Johann Neumann Society of Computing (NJSZT) through the establishment of a joint Healthcare Informatics Applications Accreditation Board (Board ESAB). The Board developed an evaluation methodology and a legal procedure to test health care software application modules. The evaluation method is based on international standards such as ISO-9126 and on emerging European standards of CEN/TC 251. The first rounds of accreditation have already shown that there is a need among providers and users for the accreditation process. The authors hope that establishing an accreditation system will lead to a more balanced health care software market in which users can inform themselves, through the opinion of independent experts, about the product they intend to purchase.

  16. Implementation of Similarity Based Kriging in Open Source Software and Application to Uncertainty Quantification and Reduction in Hydrogeological Inversion

    Science.gov (United States)

    Komara, R.; Ginsbourger, D.

    2014-12-01

    We present the implementation of Similarity Based Kriging (SBK). This approach extends Gaussian process regression (GPR) methods, typically restricted to Euclidean spaces, to spaces that are non-Euclidean or perhaps even non-metric. SBK was inspired by problems in aquifer modeling, where inputs of numerical simulations are typically curves and parameter fields, and predicting scalar or vector outputs by Kriging with such very high-dimensional inputs may seem not feasible at first. SBK combines ideas from the distance-based set-up of Scheidt and Caers (2009) with GPR and allows calculating Kriging predictions based only on similarities between inputs rather than on their high-dimensional representation. Written in open source code, this proposed approach includes automated construction of SBK models and provides diagnostics to assess model quality both in terms of covariance fitting and internal/external prediction validation. Covariance hyperparameters can be estimated both by maximum likelihood and leave-one-out cross validation relying in both cases on efficient formulas and a hybrid genetic optimization algorithm using derivatives. The determination of the best dimension for Classical multidimensional scaling (MDS) and non-metric MDS of the data will be investigated. Application of this software to real life data examples in Euclidean and non-Euclidean (dis)similarity settings will be covered and touch on aquifer modeling, hydrogeological forecasting, and sequential inverse problem solving. In the last case, a novel approach where a variant of the expected improvement criterion is used for choosing several points at a time will be presented. This part of the method and the previous covariance hyperparameter estimation parallelize naturally and we demonstrate how to save computation time by optimally distributing function evaluations over multiple cores or processors.
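
    The core idea reported here, Kriging predictions computed from similarities alone, can be sketched with ordinary Gaussian process algebra once a covariance is built from pairwise dissimilarities. The Python fragment below is such a sketch under assumptions of this illustration (zero-mean process, Gaussian kernel k(x, x') = exp(-d(x, x')²/(2ℓ²))); it is not the SBK implementation described in the abstract.

```python
import numpy as np

def kriging_predict(D_train: np.ndarray, D_cross: np.ndarray,
                    y_train: np.ndarray, length_scale: float,
                    noise: float = 1e-8) -> np.ndarray:
    """Simple-kriging-style prediction built purely from (dis)similarities.

    D_train: (n, n) pairwise dissimilarities between training inputs.
    D_cross: (m, n) dissimilarities between test and training inputs.
    Assumes a zero-mean Gaussian process with covariance
    k(x, x') = exp(-d(x, x')^2 / (2 * length_scale^2)).
    """
    K = np.exp(-D_train ** 2 / (2.0 * length_scale ** 2))
    K += noise * np.eye(len(y_train))           # jitter for numerical stability
    k_star = np.exp(-D_cross ** 2 / (2.0 * length_scale ** 2))
    weights = np.linalg.solve(K, y_train)       # K^{-1} y
    return k_star @ weights                     # posterior mean at test points
```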

  17. UMTRA project technical assistance contractor quality assurance implementation plan

    International Nuclear Information System (INIS)

    1994-03-01

The Uranium Mill Tailings Remedial Action (UMTRA) Project Technical Assistance contractor (TAC) Quality Assurance Implementation Plan (QAIP) outlines the primary requirements for integrating quality functions for TAC technical activities applied to the surface and ground water phases of the UMTRA Project. The QAIP is subordinate to the latest issue of the UMTRA Project TAC Quality Assurance Program Plan (QAPP) (DOE, 1993a), which was developed using US Department of Energy (DOE) Order 5700.6C quality assurance (QA) criteria. The QAIP addresses technical aspects of the TAC UMTRA Project surface and ground water programs. All QA issues in the QAIP shall comply with requirements contained in the TAC QAPP (DOE, 1993a). Because industry standards for data acquisition and data control are not addressed in DOE Order 5700.6C, the QAIP has been formatted to the 14 US Environmental Protection Agency (EPA) Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) QA requirements. DOE Order 5700.6C criteria that are not contained in the CERCLA requirements are added to the QAIP as additional requirements in Sections 15.0 through 18.0. Project documents that contain CERCLA requirements and 5700.6 criteria shall be referenced in this document to avoid duplication. Referenced documents are not included in this QAIP but are available through the UMTRA Project Document Control Center.

  18. Impact on dose and image quality of a software-based scatter correction in mammography.

    Science.gov (United States)

    Monserrat, Teresa; Prieto, Elena; Barbés, Benigno; Pina, Luis; Elizalde, Arlette; Fernández, Belén

    2017-01-01

Background: In 2014, Siemens developed a new software-based scatter correction (Progressive Reconstruction Intelligently Minimizing Exposure [PRIME]), enabling grid-less digital mammography. Purpose: To compare doses and image quality between PRIME (grid-less) and standard (with anti-scatter grid) modes. Material and Methods: Contrast-to-noise ratio (CNR) was measured for various polymethylmethacrylate (PMMA) thicknesses, and the dose values provided by the mammography unit were recorded. CDMAM phantom images were acquired for various PMMA thicknesses and the inverse Image Quality Figure (IQF_inv) was calculated. Values of incident entrance surface air kerma (ESAK) and average glandular dose (AGD) were obtained from the DICOM header for a total of 1088 pairs of clinical cases. Two experienced radiologists subjectively compared the image quality of a total of 149 pairs of clinical cases. Results: CNR values were higher and doses were lower in PRIME mode for all thicknesses. IQF_inv values in PRIME mode were lower for all thicknesses except for 40 mm of PMMA equivalent, for which IQF_inv was slightly greater in PRIME mode. A mean reduction of 10% in ESAK and 12% in AGD was obtained in PRIME mode with respect to standard mode. The clinical image quality of PRIME and standard acquisitions proved to be similar in most of the cases (84% for the first radiologist and 67% for the second one). Conclusion: The use of PRIME software reduces, on average, the radiation dose to the breast without affecting image quality. This reduction is greater for thinner and denser breasts.
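
    CNR, the main quantitative figure in this record, is straightforward to compute from two regions of interest. The sketch below uses one common definition (signal-minus-background mean divided by the background standard deviation); the abstract does not state which variant the authors used, so treat the exact formula as an assumption of this illustration.

```python
import numpy as np

def contrast_to_noise_ratio(signal_roi: np.ndarray, background_roi: np.ndarray) -> float:
    """CNR under one common mammography-QC definition: (mean of the signal ROI
    minus mean of the background ROI) divided by the standard deviation of the
    background ROI. Other definitions pool the standard deviations of both ROIs.
    """
    return (signal_roi.mean() - background_roi.mean()) / background_roi.std()

# Example with synthetic ROIs: a slightly brighter patch over a noisy background.
rng = np.random.default_rng(0)
background = rng.normal(loc=100.0, scale=5.0, size=(50, 50))
signal = rng.normal(loc=115.0, scale=5.0, size=(20, 20))
print(round(contrast_to_noise_ratio(signal, background), 2))  # roughly 3
```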

  19. Evaluation of multiple intelligences in children aged 7 to 11 years through the implementation of an interactive software

    OpenAIRE

    Rebolledo Rodríguez, Rigel A.; Samaniego González, Euclides

    2017-01-01

This document describes a project to evaluate multiple intelligences in children aged 7 to 11 years through the implementation of interactive software, developed on Android for tablets, allowing parents, teachers, tutors, psychologists or other responsible adults to identify the different types of intelligences that children have, in order to know them better and develop their skills and their future potential. Similarly, research was carried out to evaluate the effectiveness of the t...

  20. ThermoData Engine (TDE): software implementation of the dynamic data evaluation concept. 5. Experiment planning and product design.

    Science.gov (United States)

    Diky, Vladimir; Chirico, Robert D; Kazakov, Andrei F; Muzny, Chris D; Magee, Joseph W; Abdulagatov, Ilmutdin; Kang, Jeong Won; Kroenlein, Kenneth; Frenkel, Michael

    2011-01-24

    ThermoData Engine (TDE) is the first full-scale software implementation of the dynamic data evaluation concept, as reported recently in this journal. In the present paper, we describe development of an algorithmic approach to assist experiment planning through assessment of the existing body of knowledge, including availability of experimental thermophysical property data, variable ranges studied, associated uncertainties, state of prediction methods, and parameters for deployment of prediction methods and how these parameters can be obtained using targeted measurements, etc., and, indeed, how the intended measurement may address the underlying scientific or engineering problem under consideration. A second new feature described here is the application of the software capabilities for aid in the design of chemical products through identification of chemical systems possessing desired values of thermophysical properties within defined ranges of tolerance. The algorithms and their software implementation to achieve this are described. Finally, implementation of a new data validation and weighting system is described for vapor-liquid equilibrium (VLE) data, and directions for future enhancements are outlined.

  1. Implementation of Clinical Quality Management for Rehabilitation in Malaysia

    Directory of Open Access Journals (Sweden)

    Julia Patrick Engkasan

    2017-11-01

Full Text Available In February 2017, the World Health Organization (WHO) launched its historic "Rehabilitation 2030: A Call for Action". Scaling up rehabilitation in health systems requires concerted action across all 6 components of WHO's Health Systems Framework. For rehabilitation, information about functioning is essential, as it is required for effective rehabilitation at all levels of the health system. What is missing is a countrywide demonstration project involving the implementation of a clinical quality management system for the continuous improvement of rehabilitation, both at the level of clinical care for individual patients and at the level of rehabilitation service provision. Consequently, the Department of Rehabilitation Medicine at the University of Malaya and University Malaya Medical Centre, together with the Cheras Rehabilitation Hospital of the Ministry of Health, and the Social Security Organisation (SOCSO) Rehabilitation Centre in Malacca, Malaysia, initiated a project to develop a Malaysian-wide clinical quality management system for rehabilitation (CQM-R Malaysia). The objective of this paper is to describe CQM-R Malaysia. First, a conceptual description of a CQM-R based on the International Classification of Functioning, Disability and Health (ICF) is set out. The methods, results and conclusions of a situation analysis conducted in January 2017 are then reported. Finally, the building blocks and implementation action plan developed for CQM-R Malaysia are presented.

  2. Implementation of quality by design toward processing of food products.

    Science.gov (United States)

    Rathore, Anurag S; Kapoor, Gautam

    2017-05-28

Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps with regard to our ability to monitor multiple parameters/variables associated with the manufacturing process will be alleviated over time. Investments made in the development of tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding would pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress that has been made in recent years on the topic of QbD implementation in the processing of food products, and second, to present a case study that illustrates the benefits of such QbD implementation.

  3. 77 FR 47535 - Revisions to the California State Implementation Plan, Sacramento Metropolitan Air Quality...

    Science.gov (United States)

    2012-08-09

    ... the California State Implementation Plan, Sacramento Metropolitan Air Quality Management District... final action to approve revisions to the Sacramento Metropolitan Air Quality Management District portion of the California State Implementation Plan (SIP). This revision concerns the definition of volatile...

  4. 77 FR 47581 - Revisions to the California State Implementation Plan, Sacramento Metropolitan Air Quality...

    Science.gov (United States)

    2012-08-09

    ... the California State Implementation Plan, Sacramento Metropolitan Air Quality Management District... approve revisions to the Sacramento Metropolitan Air Quality Management District (SMAQMD) portion of the California State Implementation Plan (SIP). This revision concerns the definition of volatile organic...

  5. Quality assurance applied to mammographic equipments using phantoms and software for its evaluation

    International Nuclear Information System (INIS)

    Mayo, Patricia; Rodenas, Francisco; Manuel Campayo, Juan; Verdu, Gumersido

    2010-01-01

Image quality assessment of radiographic equipment is a very important item for complete quality control of the radiographic image chain. The periodic evaluation of radiographic image quality must guarantee the constancy of this quality so that a suitable diagnosis can be carried out. Mammographic phantom images are usually used to study the quality of images obtained with a given mammographic unit. Digital image processing techniques make it possible to carry out an automatic analysis of the phantom image. In this work we apply digital image processing techniques to automatically analyze the image quality of two mammographic phantoms, namely CIRS SP01 and RACON, for different operating conditions of the mammographic equipment. The CIRS SP01 phantom is usually used with analogue mammographic equipment, while the RACON phantom has been specifically developed by the authors for acceptance and constancy tests of image quality in digital radiographic equipment, following the recommendations of international associations. The purpose of this work is to analyze the image quality for both phantoms by means of an automatic software utility. This analysis allows us to study the functioning of the image chain of the mammographic system in an objective way, so that abnormal functioning of the radiographic equipment can be detected.

  6. Quality assurance applied to mammographic equipments using phantoms and software for its evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Mayo, Patricia, E-mail: p.mayo@titaniast.co [Titania Servicios Tecnologicos S.L., Grupo Dominguis, Apartado 46015, Valencia (Spain); Rodenas, Francisco [Departamento de Matematica Aplicada, Universidad Politecnica de Valencia, Apartado 46022, Valencia (Spain); Manuel Campayo, Juan [Hospital Clinico Universitario de Valencia, Avda. Blasco Ibanez, Apartado 46017, Valencia (Spain); Verdu, Gumersido [Departamento de Ingenieria Quimica y Nuclear, Universidad Politecnica de Valencia, Apartado 46022, Valencia (Spain)

    2010-07-21

Image quality assessment of radiographic equipment is a very important item for complete quality control of the radiographic image chain. The periodic evaluation of radiographic image quality must guarantee the constancy of this quality so that a suitable diagnosis can be carried out. Mammographic phantom images are usually used to study the quality of images obtained with a given mammographic unit. Digital image processing techniques make it possible to carry out an automatic analysis of the phantom image. In this work we apply digital image processing techniques to automatically analyze the image quality of two mammographic phantoms, namely CIRS SP01 and RACON, for different operating conditions of the mammographic equipment. The CIRS SP01 phantom is usually used with analogue mammographic equipment, while the RACON phantom has been specifically developed by the authors for acceptance and constancy tests of image quality in digital radiographic equipment, following the recommendations of international associations. The purpose of this work is to analyze the image quality for both phantoms by means of an automatic software utility. This analysis allows us to study the functioning of the image chain of the mammographic system in an objective way, so that abnormal functioning of the radiographic equipment can be detected.

  7. Implementing clinical governance in English primary care groups/trusts: reconciling quality improvement and quality assurance.

    Science.gov (United States)

    Campbell, S M; Sheaff, R; Sibbald, B; Marshall, M N; Pickard, S; Gask, L; Halliwell, S; Rogers, A; Roland, M O

    2002-03-01

    To investigate the concept of clinical governance being advocated by primary care groups/trusts (PCG/Ts), approaches being used to implement clinical governance, and potential barriers to its successful implementation in primary care. Qualitative case studies using semi-structured interviews and documentation review. Twelve purposively sampled PCG/Ts in England. Fifty senior staff including chief executives, clinical governance leads, mental health leads, and lay board members. Participants' perceptions of the role of clinical governance in PCG/Ts. PCG/Ts recognise that the successful implementation of clinical governance in general practice will require cultural as well as organisational changes, and the support of practices. They are focusing their energies on supporting practices and getting them involved in quality improvement activities. These activities include, but move beyond, conventional approaches to quality assessment (audit, incentives) to incorporate approaches which emphasise corporate and shared learning. PCG/Ts are also engaged in setting up systems for monitoring quality and for dealing with poor performance. Barriers include structural barriers (weak contractual levers to influence general practices), resource barriers (perceived lack of staff or money), and cultural barriers (suspicion by practice staff or problems overcoming the perceived blame culture associated with quality assessment). PCG/Ts are focusing on setting up systems for implementing clinical governance which seek to emphasise developmental and supportive approaches which will engage health professionals. Progress is intentionally incremental but formidable challenges lie ahead, not least reconciling the dual role of supporting practices while monitoring (and dealing with poor) performance.

  8. THE USE OF ANALOG PROCESSES AS A FACTOR TO IMPROVE SOFTWARE QUALITY: A CASE STUDY OF ORTHO-MEDICAL SOFTWARE

    OpenAIRE

    GABRIEL VIEIRA MONTEIRO

    2003-01-01

This dissertation evaluates a medical software product using ergonomic parameters and criteria. First, the concepts related to Information Technology are described and software is characterized as one of the subsystems of any Information Technology system. From there, the software development stages were examined in relation to the context of usability. The main interaction problems found in the...

  9. [Eva-Reha: a computer software supporting outcome-based quality management in medical rehabilitation].

    Science.gov (United States)

    Noack, M; Schneider, T; Nosper, M

    2005-04-01

Development of computer software for supporting medical quality management by documenting the progression and results of medical rehabilitation in neurologic, orthopaedic, and geriatric patients. The software "Eva-Reha" (Evaluation of Medical Rehabilitation) was generated using C++ in a client-server structure, with Interbase being the underlying relational database management system. The software is network-compatible and runs under Windows NT and Windows 2000. "Eva-Reha", developed by the "Medizinischer Dienst der Krankenversicherung Rheinland-Pfalz" (MDK RLP), supports quality management systems in medical rehabilitation. Since 2003, the MDK RLP has provided neurologic and geriatric rehabilitation centres with the software free of charge. With the help of "Eva-Reha", the progression and results of medical rehabilitation can be displayed metrically, thus facilitating individual rehabilitation planning and supporting the motivation of the rehabilitation team. Therapeutic strategies can be evaluated for different ICD-10 diagnoses or impairment groups. Moreover, "Eva-Reha" provides valuable data for administration and controlling purposes, e.g. age structure, case mix, impairment on admission, and medical as well as rehabilitative procedures. The system generates a request for extension in a set form, which facilitates communication between rehabilitation centres and sponsors.

  10. 78 FR 67327 - Approval and Promulgation of Air Quality Implementation Plans; State of Colorado; Revised...

    Science.gov (United States)

    2013-11-12

    ... Guidance for Developing Transportation Conformity State Implementation Plans (SIPs) for further background... consulted our document ``Guidance for Developing Transportation Conformity State Implementation Plans (SIPs...] Approval and Promulgation of Air Quality Implementation Plans; State of Colorado; Revised Transportation...

  11. Air Quality in Mexico City: Policies Implemented for its Improvement

    Science.gov (United States)

    Paramo, V.

    2007-12-01

stringent emission levels of the gasoline fleet; update of the detention of polluting vehicles program; partial exemption from the inspection and maintenance program for cleaner and/or highly efficient vehicles; substitution of 3,000 microbuses, 40,000 taxis and 1,200 buses; commissioning of the first Bus Rapid Transit system; implementation of a program for emission reductions at the 300 most polluting industrial facilities; and continuous updating of the air quality environmental management programs. To continue improving the air quality in the MCMA, the environmental authorities will continue the implementation of the 2002-2010 Air Quality Improvement Program. In 2007 the Green Program was started; this includes those actions that have proven to be effective in reducing pollutant emissions and incorporates new actions for the reduction of local and global pollutant emissions. The most important of these new actions are: substitution of 9,500 microbuses; renewal of the entire taxi fleet; commissioning of 10 Bus Rapid Transit lines; commissioning of Line 12 of the underground system; schedule and route limitations for the cargo fleet; a 5 percent increase in the number of non-motorized trips (bicycling and walking); regulation of private public-transport passenger stops; a requirement for private schools to provide school transport; regulation of unoccupied taxis in circulation; modifications to the circulation of 350 critical crossing points in the city; adoption of intelligent traffic light systems; complete substitution of the local government vehicle fleet; implementation of inspection and maintenance of the cargo fleet; and introduction of low-sulfur diesel, among other measures.

  12. Software-Defined GPS Receiver Implemented on the Parallella-16 Board

    DEFF Research Database (Denmark)

    Olesen, Daniel Madelung; Jakobsen, Jakob; Knudsen, Per

    2015-01-01

This paper describes a GPS software receiver design made of inexpensive and physically small hardware components. The small embedded platform, known as the Parallella-16 computer, has been utilized in conjunction with a commercial RF front-end to construct a 4-channel real-time software GPS receiv.... The total cost of the hardware is below $150 and the size is comparable to that of a credit card. The receiver has been developed for research in GNSS/INS integration on small Unmanned Aerial Vehicles (UAVs)....

  13. Increasing quality and managing complexity in neuroinformatics software development with continuous integration

    Directory of Open Access Journals (Sweden)

    Yury V. Zaytsev

    2013-01-01

Full Text Available High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffers, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.
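
    The gate that CI adds to every change can be pictured with a toy script: run the test suite on each commit and fail the build whenever anything breaks. The example below is only a generic illustration (it assumes a pytest-based test suite), not the actual CI infrastructure described in the paper.

```python
#!/usr/bin/env python3
"""Minimal CI quality gate: run the test suite and fail the build on any error."""
import subprocess
import sys

def run_quality_gate() -> int:
    # -q keeps the log short; a real pipeline would also archive the full report.
    result = subprocess.run([sys.executable, "-m", "pytest", "-q"],
                            capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        print("Quality gate FAILED: fix the failing tests before merging.")
    return result.returncode

if __name__ == "__main__":
    sys.exit(run_quality_gate())
```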

  14. PREDICTION OF SMARTPHONES’ PERCEIVED IMAGE QUALITY USING SOFTWARE EVALUATION TOOL VIQET

    Directory of Open Access Journals (Sweden)

    Pinchas ZOREA

    2016-12-01

Full Text Available A great deal of resources and effort has been invested in recent years to assess how smartphone users perceive image quality. Unfortunately, only limited success has been achieved, and image quality assessment is still based on many physical human visual tests. The paper describes a new model proposed for perceived quality, based on human visual tests compared with image analysis by the software application tool. The values of the parameters of perceived image quality (brightness, contrast, color saturation and sharpness) were calibrated based on the results of human visual experiments.
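
    The four parameters named in this record (brightness, contrast, colour saturation, sharpness) are commonly approximated with simple image statistics. The sketch below uses textbook proxies (mean luminance, luminance standard deviation, mean HSV-style saturation, variance of a Laplacian); these are assumptions of this illustration and not VIQET's actual algorithms, which the abstract does not describe.

```python
import numpy as np

# Hypothetical proxies for the four perceived-quality parameters; common
# approximations only, not the tool's actual algorithms.
def image_quality_params(rgb: np.ndarray) -> dict:
    """rgb: float array in [0, 1] with shape (rows, cols, 3)."""
    luminance = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
    cmax, cmin = rgb.max(axis=-1), rgb.min(axis=-1)
    saturation = np.where(cmax > 0, (cmax - cmin) / np.maximum(cmax, 1e-12), 0.0)
    # 4-neighbour Laplacian as a crude sharpness estimate.
    lap = (-4 * luminance
           + np.roll(luminance, 1, 0) + np.roll(luminance, -1, 0)
           + np.roll(luminance, 1, 1) + np.roll(luminance, -1, 1))
    return {
        "brightness": float(luminance.mean()),
        "contrast": float(luminance.std()),
        "color_saturation": float(saturation.mean()),
        "sharpness": float(lap.var()),
    }

# Example on a random image.
print(image_quality_params(np.random.default_rng(1).random((64, 64, 3))))
```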

  15. Specific developed phantoms and software to assess radiological equipment image quality

    Energy Technology Data Exchange (ETDEWEB)

    Verdu, G., E-mail: gverdu@iqn.upv.es [Universidad Politecnica de Valencia (Spain). Dept. de Ingenieria Quimica y Nuclear; Mayo, P., E-mail: p.mayo@titaniast.com [TITANIA Servicios Teconologicos, Valencia (Spain); Rodenas, F., E-mail: frodenas@mat.upv.es [Universidad Politecnica de Valencia (Spain). Dept. de Matematica Aplicada; Campayo, J.M., E-mail: j.campayo@lainsa.com [Logistica y Acondicionamientos Industriales S.A.U (LAINSA), Valencia (Spain)

    2011-07-01

The use of radiographic phantoms specifically designed to evaluate the operation of radiographic equipment allows the image quality obtained by this equipment to be studied in an objective way. In digital radiographic equipment, the analysis of image quality can be automated because the image can be acquired with different technologies, namely computed radiography (phosphor plate) and direct radiography (detector). In this work we present an application to automatically assess the constancy of image quality in the image chain of the radiographic equipment. This application comprises purpose-designed radiographic phantoms, adapted to conventional and dental equipment, and specifically developed software for the automatic evaluation of phantom image quality. The software is based on digital image processing techniques that allow the automatic detection of the different phantom tests by means of edge detectors, morphological operators, histogram thresholding techniques, etc. The utility developed is sufficiently sensitive to the operating conditions of the radiographic equipment in terms of voltage (kV) and charge (mAs). It is a user-friendly programme connected to a database of the hospital or clinic where it is used. After the phantom image processing, the user can obtain a report summarizing the state of the imaging system, with acceptance and constancy results. (author)
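
    The processing chain sketched in this record (histogram thresholding, morphological operators, edge detection) can be illustrated with a few scikit-image calls. The fragment below is only a generic illustration of that chain; the phantom layouts, test objects and pass/fail criteria of the authors' software are not described in the abstract, so the function is hypothetical.

```python
import numpy as np
from skimage.filters import threshold_otsu, sobel
from skimage.measure import label, regionprops
from skimage.morphology import binary_closing, disk

def detect_phantom_objects(image: np.ndarray, min_area: int = 20):
    """Segment candidate phantom test objects in a grayscale phantom image."""
    edges = sobel(image)                          # edge map of the phantom
    mask = image > threshold_otsu(image)          # histogram-based thresholding
    mask = binary_closing(mask, disk(3))          # morphological clean-up
    labelled = label(mask)
    regions = [r for r in regionprops(labelled, intensity_image=image)
               if r.area >= min_area]
    # Return per-object statistics that a constancy check could track over time.
    return [(r.centroid, r.area, r.mean_intensity) for r in regions], edges
```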

  16. Specific developed phantoms and software to assess radiological equipment image quality

    International Nuclear Information System (INIS)

    Verdu, G.; Rodenas, F.

    2011-01-01

The use of radiographic phantoms specifically designed to evaluate the operation of radiographic equipment allows the image quality obtained by this equipment to be studied in an objective way. In digital radiographic equipment, the analysis of image quality can be automated because the image can be acquired with different technologies, namely computed radiography (phosphor plate) and direct radiography (detector). In this work we present an application to automatically assess the constancy of image quality in the image chain of the radiographic equipment. This application comprises purpose-designed radiographic phantoms, adapted to conventional and dental equipment, and specifically developed software for the automatic evaluation of phantom image quality. The software is based on digital image processing techniques that allow the automatic detection of the different phantom tests by means of edge detectors, morphological operators, histogram thresholding techniques, etc. The utility developed is sufficiently sensitive to the operating conditions of the radiographic equipment in terms of voltage (kV) and charge (mAs). It is a user-friendly programme connected to a database of the hospital or clinic where it is used. After the phantom image processing, the user can obtain a report summarizing the state of the imaging system, with acceptance and constancy results. (author)

  17. [Methods of process analysis, instruments of quality management. Successful implementation of quality projects].

    Science.gov (United States)

    Seyfarth-Metzger, I; Liebich, B; Volz, A

    2001-09-15

    Meaningful experience and knowledge--the prerequisite for the successful implementation of quality projects--are described in terms of their practical importance on the basis of the experience gained with quality management in the municipal hospital at München-Schwabing (HMS). Against the background of the process of quality assurance or PDCA (Plan Do Check Act), the approach to the selection of suitable themes (problem selection), the prioritization of projects, the appropriate composition of the project group, the description of the problem, and the organization of the project, are discussed. The authors describe the implementation of methods of process analysis such as flow diagrams and cause-and-effect diagrams. The importance of evaluation is justified, and pragmatic approaches are presented. The importance of project documentation and essential contents are discussed.

  18. Measurement and Management of the Level of Quality Control Process in SoC (System on Chip Embedded Software Development

    Directory of Open Access Journals (Sweden)

    Ki-Won Song

    2012-04-01

Full Text Available This paper presents a process for measuring the level of the quality control process in order to ensure the quality of the delivered software package during the development cycle. The success of a project requires three prerequisites, and they constrain one another. Quality is the most important factor for successful project completion; in other words, quality should not be sacrificed for the sake of meeting the cost budget or delivering on schedule. Moreover, the cost caused by any quality issue, such as defect resolution, increases exponentially once the product is out the door. Having said that, we also have to consider the schedule side of the constraints for a successful project: there is little time to do a quality job, and we have to compete with other competitors to ship the product to the market earlier than they do. Therefore, a quality measurement and management concept is introduced to suit the agile software development environment, in conjunction with performance strategies to be executed within the organization. There are many key performance indices derivable from the actual data associated with quality control activities, and it is desirable to create a quality process measure that integrally represents the overall level of quality control activities performed while developing the software deliverables. With this quality process measure, it is possible to evaluate whether enough quality control activities have been performed for the project and to secure the quality of the software deliverables before they are delivered to the customers.

  19. How Does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable...

  20. Implementing a nationwide quality improvement approach in health services.

    Science.gov (United States)

    Sahel, Amina; DeBrouwere, Vincent; Dujardin, Bruno; Kegels, Guy; Belkaab, Nejoua; Alaoui Belghiti, Abdelali

    2015-01-01

    The purpose of this paper is to present an innovative quality improvement intervention developed in Morocco and discuss its implementation. Until 2004, the Moroccan Ministry of Health (MoH) encouraged pilots of quality improvement approaches but none of them were revealed to be sustainable. Internal assessments pinpointed factors such as lack of recognition of the participating team's efforts and lack of pressure on managers to become more accountable. In 2005, Morocco opted for an intervention called "Quality Contest" (QC) targeting health centres, hospitals and health district offices and combining quality measurement with structures ranking, performance disclosure and reward system. The QC is organized every 18 months. After the self-assessment and external audit step, the participating structures are ranked according to their scores. Their performances are then disseminated and the highest performing structures are rewarded. The results showed an improvement in performance among participating structures, constructive exchange of successful experiences between structures, as well as communication of constraints, needs and expectations between MoH managers at central and local levels; the use of peer-auditors was appreciated as it enabled an exchange of best practices between auditors and audited teams but this was mitigated by the difficulty of ensuring their neutrality; and the recognition of efforts was appreciated but seemed insufficient to ensure a sense of justice and maintain motivation. This intervention is an example of MoH leadership that has succeeded in introducing transparency and accountability mechanisms (ranking and performance disclosure) as leverage to change the management culture of the public health services; setting up a reward system to reinforce motivation and adapting continuously the intervention to enhance its sustainability and acceptability.