WorldWideScience

Sample records for standard commercial software

  1. GRO/EGRET data analysis software: An integrated system of custom and commercial software using standard interfaces

    Science.gov (United States)

    Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.

    1992-01-01

The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood with optional contours that are used to evaluate regions for sources. Likelihood also evaluates the background corrected intensity at each location for each energy interval from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, takes advantage of the XView toolkit for display and Postscript output for the color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products. It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and
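The map-generation step described above, a counts map divided by an exposure map to give an intensity map for each energy band, can be sketched as follows (a minimal illustration with hypothetical pixel values; the real EGRET pipeline reads and writes FITS images):

```python
# Sketch of the counts/exposure -> intensity step. Pixel values are
# hypothetical; in the real pipeline each map is a FITS image per
# energy band.

def intensity_map(counts, exposure):
    """Divide a counts map by an exposure map, pixel by pixel.

    counts   -- photons per pixel (list of rows)
    exposure -- cm^2 s per pixel (same shape)
    Returns intensity in photons / (cm^2 s) per pixel.
    """
    return [
        [c / e if e > 0 else 0.0 for c, e in zip(crow, erow)]
        for crow, erow in zip(counts, exposure)
    ]

counts = [[4, 9], [0, 16]]
exposure = [[2.0, 3.0], [1.0, 4.0]]
print(intensity_map(counts, exposure))  # [[2.0, 3.0], [0.0, 4.0]]
```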

  2. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

requirements. This allows the projects leeway to meet these requirements in many forms that best suit a particular project's needs and safety risk. In other words, it tells the project what to do, not how to do it. This update also incorporated advances in the state of the practice of software safety from academia and private industry. It addresses some of the more common issues now facing software developers in the NASA environment, such as the use of Commercial-Off-the-Shelf Software (COTS), Modified OTS (MOTS), Government OTS (GOTS), and reused software. A team from across NASA developed the update, which has undergone NASA-wide internal reviews by software engineering, quality, safety, and project management, as well as expert external review. This presentation and paper will discuss the new NASA Software Safety Standard, its organization, and key features. It will start with a brief discussion of some NASA mission failures and incidents that had software as one of their root causes. It will then give a brief overview of the NASA Software Safety Process. This will include an overview of the key personnel responsibilities and functions that must be performed for safety-critical software.

  3. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes there is an enormous amount of software available for use by teachers of reading and literacy--whereas drill-and-practice software is the largest category of software available, large numbers of…

  4. Prediction of ice accretion and anti-icing heating power on wind turbine blades using standard commercial software

    International Nuclear Information System (INIS)

    Villalpando, Fernando; Reggio, Marcelo; Ilinca, Adrian

    2016-01-01

An approach to numerically simulate ice accretion on 2D sections of a wind turbine blade is presented. The method uses standard commercial ANSYS-Fluent and Matlab tools. The Euler-Euler formulation is used to calculate the water impingement on the airfoil, and a UDF (User Defined Function) has been devised to turn the airfoil's solid wall into a permeable boundary. Mayer's thermodynamic model is implemented in Matlab for computing ice thickness and for updating the airfoil contour. A journal file is executed to systematize the procedure: meshing, droplet trajectory calculation, thermodynamic model application for computing ice accretion, and the updating of airfoil contours. The proposed ice prediction strategy has been validated using iced airfoil contours obtained experimentally in the AMIL refrigerated wind tunnel (Anti-icing Materials International Laboratory). Finally, a numerical prediction method has been generated for anti-icing assessment, and its results compared with data obtained in this laboratory. - Highlights: • A methodology for ice accretion prediction using commercial software is proposed. • Euler model gives better prediction of airfoil water collection with detached flow. • A source term is used to change from a solid wall to a permeable wall in Fluent. • Energy needed for ice-accretion mitigation system is predicted.
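The ice-growth bookkeeping behind such a procedure can be illustrated with a minimal rime-ice sketch (this is not Mayer's model, which also balances heat fluxes; all impinging water is assumed to freeze, and the inputs are hypothetical):

```python
# Minimal rime-ice growth sketch (NOT Mayer's full thermodynamic model):
# impinging water mass flux = LWC * V * beta, where beta is the local
# collection efficiency delivered by the droplet-trajectory solver.

RHO_ICE = 917.0  # kg/m^3, assumed ice density

def ice_thickness_growth(lwc, v, beta, dt):
    """Ice thickness [m] added over a time step dt [s].

    lwc  -- liquid water content of the air [kg/m^3]
    v    -- free-stream speed [m/s]
    beta -- local collection efficiency (0..1)
    """
    mass_flux = lwc * v * beta          # kg/(m^2 s) hitting the surface
    return mass_flux * dt / RHO_ICE     # converted to ice thickness

# e.g. 0.5 g/m^3 cloud, 70 m/s, beta = 0.4, over 10 minutes:
print(ice_thickness_growth(0.5e-3, 70.0, 0.4, 600.0))
```

In a contour-updating loop like the one described, this thickness would be added along the local surface normal before remeshing.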

  5. Standard software for CAMAC

    International Nuclear Information System (INIS)

    Lenkszus, F.R.

    1978-01-01

    The NIM Committee (National Instrumentation Methods Committee) of the U.S. Department of Energy and the ESONE Committee of European Laboratories have jointly specified standard software for use with CAMAC. Three general approaches were followed: the definition of a language called IML for use in CAMAC systems, the definition of a standard set of subroutine calls, and real-time extensions to the BASIC language. This paper summarizes the results of these efforts. 1 table

  6. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  7. Commercial off-the-shelf software dedication process based on the commercial grade survey of supplier

    International Nuclear Information System (INIS)

    Kim, J. Y.; Lee, J. S.; Chon, S. W.; Lee, G. Y.; Park, J. K.

    2000-01-01

Commercial Off-The-Shelf (COTS) software dedication can apply a combination of methods, like the hardware commercial grade item dedication process. In general, these methods are: method 1 (special test and inspection), method 2 (commercial grade survey of supplier), method 3 (source verification), and method 4 (acceptance of supplier/item performance record). In this paper, the suggested procedure-oriented dedication process for COTS software, based on method 2, is consistent with EPRI TR-106439 and NUREG/CR-6421 requirements. An additional tailoring policy based on the codes and standards related to COTS software may also be found in the suggested dedication process. The suggested process has been developed for a commercial I and C software dedicator who performs COTS qualification according to the dedication procedure
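The four dedication methods enumerated above can be captured in a simple lookup; the structure below is illustrative only, with method names taken from the abstract:

```python
# Illustrative lookup of the four commercial-grade dedication methods
# enumerated in the abstract; a real dedication plan combines them.
DEDICATION_METHODS = {
    1: "special test and inspection",
    2: "commercial grade survey of supplier",
    3: "source verification",
    4: "acceptance of supplier/item performance record",
}

def describe(methods):
    """Return the names of the methods chosen for a dedication plan."""
    return [DEDICATION_METHODS[m] for m in methods]

# A method-2-based process like the one in the paper might combine:
print(describe([2, 4]))
```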

  8. Future of Software Engineering Standards

    Science.gov (United States)

    Poon, Peter T.

    1997-01-01

In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools, as well as consumer-oriented systems.

  9. Software Development Standard Processes (SDSP)

    Science.gov (United States)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; hide

    2011-01-01

A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  10. 48 CFR 1852.227-86 - Commercial computer software-Licensing.

    Science.gov (United States)

    2010-10-01

    .../contractor proposes its standard commercial software license, those applicable portions thereof consistent... its standard commercial software license until after this purchase order/contract has been issued, or at or after the time the computer software is delivered, such license shall nevertheless be deemed...

  11. Integrating commercial software in accelerator control- case study

    International Nuclear Information System (INIS)

    Pace, Alberto

    1994-01-01

Using existing commercial software is the dream of any control system engineer, for the development cost reduction can reach one order of magnitude. This dream often vanishes when the requirement appears for a uniform and consistent architecture across a wide number of components and applications. This makes it difficult to integrate several commercial packages that often impose different user interface and communication standards. This paper will describe the approach and standards that have been chosen for the CERN ISOLDE control system, which have allowed several commercial packages to be integrated in the system as they are, permitting the software development cost to be reduced to a minimum. (author). 10 refs., 2 tabs., 9 figs

  12. 48 CFR 52.227-19 - Commercial Computer Software License.

    Science.gov (United States)

    2010-10-01

    ... Software License. 52.227-19 Section 52.227-19 Federal Acquisition Regulations System FEDERAL ACQUISITION... Clauses 52.227-19 Commercial Computer Software License. As prescribed in 27.409(g), insert the following clause: Commercial Computer Software License (DEC 2007) (a) Notwithstanding any contrary provisions...

  13. 48 CFR 27.405-3 - Commercial computer software.

    Science.gov (United States)

    2010-10-01

    ... software. 27.405-3 Section 27.405-3 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Commercial computer software. (a) When contracting other than from GSA's Multiple Award Schedule contracts for the acquisition of commercial computer software, no specific contract clause prescribed in this...

  14. The ANS mathematics and computation software standards

    Energy Technology Data Exchange (ETDEWEB)

    Smetana, A. O. [Savannah River National Laboratory, Washington Savannah River Company, Aiken, SC 29808 (United States)

    2006-07-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  15. The ANS mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A. O.

    2006-01-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  16. Software Quality Assurance and Controls Standard

    Science.gov (United States)

    2010-04-27

assurance that work products and processes comply with predefined provisions and plans. • According to International Standard (IS) 12207 – of the 44... from document (plan) focus to process focus – Alignment with framework standard IS 12207 software life cycle (SLC) processes with exact... Books and Publications, IEEE Software and Systems Engineering curriculum, ABET, Certified Software Development Professional, Standards, ISO/IEC

  17. Regional vegetation management standards for commercial pine ...

    African Journals Online (AJOL)

    Although the understanding gained from these trials allowed for the development of vegetation management standards, their operational and economic viability need to be tested on a commercial basis. Four pine trials were thus initiated to test the applicability of these standards when utilised on a commercial scale. Two of ...

  18. Diversification and Challenges of Software Engineering Standards

    Science.gov (United States)

    Poon, Peter T.

    1994-01-01

    The author poses certain questions in this paper: 'In the future, should there be just one software engineering standards set? If so, how can we work towards that goal? What are the challenges of internationalizing standards?' Based on the author's personal view, the statement of his position is as follows: 'There should NOT be just one set of software engineering standards in the future. At the same time, there should NOT be the proliferation of standards, and the number of sets of standards should be kept to a minimum.It is important to understand the diversification of the areas which are spanned by the software engineering standards.' The author goes on to describe the diversification of processes, the diversification in the national and international character of standards organizations, the diversification of the professional organizations producing standards, the diversification of the types of businesses and industries, and the challenges of internationalizing standards.

  19. A company perspective on software engineering standards

    International Nuclear Information System (INIS)

    Steer, R.W.

    1988-01-01

    Software engineering standards, as implemented via formal policies and procedures, have historically been used in the nuclear industry, especially for codes used in the design, analysis, or operation of the plant. Over the past two decades, a significant amount of software has been put in place to perform these functions, while the overall software life cycle has become better understood, more and different computer systems have become available, and industry has become increasingly aware of the advantages gained when these procedures are used in the development and maintenance of this large amount of software. The use of standards and attendant procedures is thus becoming increasingly important as more computerization is taking place, both in the design and the operation of the plant. It is difficult to categorize software used in activities related to nuclear plants in a simple manner. That difficulty is due to the diversity of those uses, with attendant diversity in the methods and procedures used in the production of the software, compounded by a changing business climate in which significant software engineering expertise is being applied to a broader range of applications on a variety of computing systems. The use of standards in the various phases of the production of software thus becomes more difficult as well. This paper discusses the various types of software and the importance of software standards in the development of each of them

  20. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
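Prediction-quality tests of this kind commonly compare predicted with metered energy use through normalized error statistics such as CV(RMSE) and NMBE; the specific metrics of the report are not reproduced here, so the following is a minimal sketch with hypothetical data:

```python
import math

def cv_rmse(measured, predicted):
    """Coefficient of variation of the RMSE, as a fraction of the mean
    measured value (lower is better)."""
    n = len(measured)
    mean = sum(measured) / n
    rmse = math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / n)
    return rmse / mean

def nmbe(measured, predicted):
    """Normalized mean bias error: positive means under-prediction."""
    n = len(measured)
    mean = sum(measured) / n
    bias = sum(m - p for m, p in zip(measured, predicted)) / n
    return bias / mean

# Hypothetical monthly energy use (e.g. MWh), metered vs. model-predicted:
measured = [100.0, 120.0, 110.0]
predicted = [98.0, 125.0, 108.0]
print(cv_rmse(measured, predicted), nmbe(measured, predicted))
```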

  1. ESSCOTS for Learning: Transforming Commercial Software into Powerful Educational Tools.

    Science.gov (United States)

    McArthur, David; And Others

    1995-01-01

    Gives an overview of Educational Support Systems based on commercial off-the-shelf software (ESSCOTS), and discusses the benefits of developing such educational software. Presents results of a study that revealed the learning processes of middle and high school students who used a geographical information system. (JMV)

  2. Improvement of gamma calibration procedures with commercial management software

    International Nuclear Information System (INIS)

    Lucena, Rodrigo F.; Potiens, Maria da Penha A.; Santos, Gelson P.; Vivolo, Vitor

    2007-01-01

In this work, the gamma calibration procedure of the Instruments Calibration Laboratory (LCI) of IPEN-CNEN-SP was improved with the use of the commercial management software Autolab TM from the Automa Company. That software was adapted for our specific use in the calibration procedures. The evaluation of the uncertainties in the gamma calibration protocol was improved by the LCI staff, and all the worksheets and the final calibration report layout were developed in commercial software such as Excel TM and Word TM from Microsoft TM . (author)

  3. Standards Interoperability: Application of Contemporary Software Safety Assurance Standards to the Evolution of Legacy Software

    National Research Council Canada - National Science Library

    Meacham, Desmond J

    2006-01-01

    .... The proposed formal model is then applied to the requirements for RTCA DO-178B and MIL-STD-498 as representative examples of contemporary and legacy software standards. The results provide guidance on how to achieve airworthiness certification for modified legacy software, whilst maximizing the use of software products from the previous development.

  4. Dilemmas within Commercial Involvement in Open Source Software

    DEFF Research Database (Denmark)

    Ciesielska, Malgorzata; Westenholz, Ann

    2016-01-01

Purpose – The purpose of this paper is to contribute to the literature about the commercial involvement in open source software, levels of this involvement and consequences of attempting to mix various logics of action. Design/methodology/approach – This paper uses the case study approach based on mixed methods: literature reviews and news searches, electronic surveys, qualitative interviews and observations. It combines discussions from several research projects as well as previous publications to present the scope of commercial choices within open source software and their consequences... to free-riding. There are six levels of commercial involvement in open source communities, and each of them is characterized by a different dilemma. Originality/value – The paper sheds light on the various levels of involvement of business in the open source movement and emphasizes that the popularized "open...

  5. Quench Simulation of Superconducting Magnets with Commercial Multiphysics Software

    CERN Document Server

    AUTHOR|(SzGeCERN)751171; Auchmann, Bernhard; Jarkko, Niiranen; Maciejewski, Michal

    The simulation of quenches in superconducting magnets is a multiphysics problem of highest complexity. Operated at 1.9 K above absolute zero, the material properties of superconductors and superfluid helium vary by several orders of magnitude over a range of only 10 K. The heat transfer from metal to helium goes through different transfer and boiling regimes as a function of temperature, heat flux, and transferred energy. Electrical, magnetic, thermal, and fluid dynamic effects are intimately coupled, yet live on vastly different time and spatial scales. While the physical models may be the same in all cases, it is an open debate whether the user should opt for commercial multiphysics software like ANSYS or COMSOL, write customized models based on general purpose network solvers like SPICE, or implement the physics models and numerical solvers entirely in custom software like the QP3, THEA, and ROXIE codes currently in use at the European Organisation for Nuclear Research (CERN). Each approach has its strengt...

  6. 78 FR 17875 - Commercial Driver's License Testing and Commercial Learner's Permit Standards

    Science.gov (United States)

    2013-03-25

    ... [Docket No. FMCSA-2007-27659] RIN 2126-AB59 Commercial Driver's License Testing and Commercial Learner's.... The 2011 final rule amended the commercial driver's license (CDL) knowledge and skills testing standards and established new minimum Federal standards for States to issue the commercial learner's permit...

  7. 77 FR 26989 - Commercial Driver's License Testing and Commercial Learner's Permit Standards

    Science.gov (United States)

    2012-05-08

    ... [Docket No. FMCSA-2007-27659] RIN 2126-AB02 Commercial Driver's License Testing and Commercial Learner's... effective on July 8, 2011. That final rule amended the commercial driver's license (CDL) knowledge and skills testing standards and established new minimum Federal standards for States to issue the commercial...

  8. An IMRT dose distribution study using commercial verification software

    International Nuclear Information System (INIS)

    Grace, M.; Liu, G.; Fernando, W.; Rykers, K.

    2004-01-01

Full text: The introduction of IMRT requires users to confirm that the isodose distributions and relative doses calculated by their planning system match the doses delivered by their linear accelerators. To this end the commercially available software, VeriSoft TM (PTW-Freiburg, Germany), was trialled to determine if the tools and functions it offered would be of benefit to this process. The CMS Xio (Computer Medical System) treatment planning system was used to generate IMRT plans that were delivered with an upgraded Elekta SL15 linac. Kodak EDR2 film sandwiched in RW3 solid water (PTW-Freiburg, Germany) was used to measure the IMRT fields delivered with 6 MV photons. The isodoses and profiles measured with the film generally agreed to within ± 3% or ± 3 mm with the planned doses; in some regions (outside the IMRT field) the match fell to within ± 5%. The isodose distributions of the planning system and the film could be compared on screen, allowing electronic records of the comparison to be kept if so desired. The features and versatility of this software have been of benefit to our IMRT QA program. Furthermore, the VeriSoft TM software allows for quick, accurate, automated planar film analysis. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine
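The ± 3% part of the acceptance criterion above can be illustrated with a point-by-point relative dose-difference check; this is a simplified stand-in, since verification software of this kind also applies distance-to-agreement and gamma criteria, and the dose values below are hypothetical:

```python
def dose_diff_pass(planned, measured, tol=0.03):
    """Point-by-point relative dose-difference check.

    planned, measured -- doses normalized to the same reference
    tol               -- 0.03 corresponds to the +/-3% criterion,
                         taken relative to the local planned dose
    Returns a pass/fail flag per point.
    """
    return [abs(m - p) <= tol * p for p, m in zip(planned, measured)]

planned = [1.00, 0.80, 0.50, 0.20]
measured = [1.02, 0.79, 0.54, 0.20]
print(dose_diff_pass(planned, measured))  # [True, True, False, True]
```

A full comparison would rescue the failing point if a planned dose within 3 mm of it matches the measurement (the distance-to-agreement part of the criterion).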

  9. Noise data management using commercially available data-base software

    International Nuclear Information System (INIS)

    Damiano, B.; Thie, J.A.

    1988-01-01

    A data base has been created using commercially available software to manage the data collected by an automated noise data acquisition system operated by Oak Ridge National Laboratory at the Fast Flux Test Facility (FFTF). The data base was created to store, organize, and retrieve selected features of the nuclear and process signal noise data, because the large volume of data collected by the automated system makes manual data handling and interpretation based on visual examination of noise signatures impractical. Compared with manual data handling, use of the data base allows the automatically collected data to be utilized more fully and effectively. The FFTF noise data base uses the Oracle Relational Data Base Management System implemented on a desktop personal computer
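The store-organize-retrieve pattern described above can be sketched with a relational table; sqlite3 stands in here for the Oracle RDBMS used at FFTF, and the table and column names are hypothetical:

```python
import sqlite3

# Sketch of storing selected noise-signature features in a relational
# table, so trends can be queried instead of re-reading raw signatures.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE noise_features (
    signal    TEXT,   -- instrument channel name
    rec_date  TEXT,   -- acquisition date
    rms       REAL,   -- RMS level of the noise signal
    peak_freq REAL)   -- dominant spectral peak [Hz]
""")
con.execute("INSERT INTO noise_features VALUES ('neutron_det_1', '1987-06-01', 0.42, 12.5)")
con.execute("INSERT INTO noise_features VALUES ('neutron_det_1', '1987-07-01', 0.57, 12.4)")

# Retrieve the RMS trend for one signal:
rows = con.execute(
    "SELECT rec_date, rms FROM noise_features "
    "WHERE signal = 'neutron_det_1' ORDER BY rec_date").fetchall()
print(rows)  # [('1987-06-01', 0.42), ('1987-07-01', 0.57)]
```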

  10. Managing mapping data using commercial data base management software.

    Science.gov (United States)

    Elassal, A.A.

    1985-01-01

Electronic computers are involved in almost every aspect of the map making process. This involvement has become so thorough that it is practically impossible to find a recently developed process or device in the mapping field which does not employ digital processing in some form or another. This trend, which has been evolving over two decades, is accelerated by the significant improvements in capability, reliability, and cost-effectiveness of electronic devices. Computerized mapping processes and devices share a common need for machine readable data. Integrating groups of these components into automated mapping systems requires careful planning for data flow amongst them. Exploring the utility of commercial data base management software to assist in this task is the subject of this paper. -Author

  11. Software measurement standards for areal surface texture parameters: part 2—comparison of software

    International Nuclear Information System (INIS)

    Harris, P M; Smith, I M; Giusca, C; Leach, R K; Wang, C

    2012-01-01

    A companion paper in this issue describes reference software for the evaluation of areal surface texture parameters, focusing on the definitions of the parameters and giving details of the numerical algorithms employed in the software to implement those definitions. The reference software is used as a benchmark against which software in a measuring instrument can be compared. A data set is used as input to both the software under test and the reference software, and the results delivered by the software under test are compared with those provided by the reference software. This paper presents a comparison of the results returned by the reference software with those reported by proprietary software for surface texture measurement. Differences between the results can be used to identify where algorithms and software for evaluating the parameters differ. They might also be helpful in identifying where parameters are not sufficiently well-defined in standards. (paper)
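As an illustration of the kind of parameter such reference software evaluates, the areal arithmetical mean height Sa (the mean absolute deviation of surface heights from the reference plane) can be sketched as follows; this simplified version uses a flat mean plane in place of the least-squares reference surface a full implementation would fit:

```python
def sa(heights):
    """Areal arithmetical mean height Sa over a 2D height map.

    heights -- list of rows of surface heights (same units returned).
    Simplification: the reference is the flat mean plane, not the
    least-squares plane used in a full implementation.
    """
    flat = [z for row in heights for z in row]
    mean = sum(flat) / len(flat)
    return sum(abs(z - mean) for z in flat) / len(flat)

# Hypothetical 2x2 height map (micrometres):
surface = [[0.0, 2.0], [4.0, 2.0]]
print(sa(surface))  # 1.0
```

Comparing a value like this between an instrument's software and the reference software is exactly the benchmark exercise the paper describes; discrepancies point to differing algorithms (e.g. the choice of reference surface).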

  12. 75 FR 32983 - Commercial Driver's License (CDL) Standards: Exemption

    Science.gov (United States)

    2010-06-10

    ...-28480] Commercial Driver's License (CDL) Standards: Exemption AGENCY: Federal Motor Carrier Safety... commercial driver's license (CDL) as required by current regulations. FMCSA reviewed NAAA's application for... demonstrate alternatives its members would employ to ensure that their commercial motor vehicle (CMV) drivers...

  13. Anticipatory Standards and the Commercialization of Nanotechnology

    International Nuclear Information System (INIS)

    Rashba, Edward; Gamota, Daniel

    2003-01-01

Standardization will play an increasing role in creating a smooth transition from the laboratory to the marketplace as products based on nanotechnology are developed and move into broad use. Traditionally, standards have evolved out of a need to achieve interoperability among existing products, create order in markets, simplify production and ensure safety. This view does not account for the escalating trend in standardization, especially in emerging technology sectors, in which standards working groups anticipate the evolution of a technology and facilitate its rapid development and entry to the marketplace. It is important that the nanotechnology community view standards as a vital tool to promote progress along the nanotechnology value chain - from nanoscale materials that form the building blocks for components and devices to the integration of these devices into functional systems. This paper describes the need for and benefits derived from developing consensus standards in nanotechnology, and how standards are created. Anticipatory standards can nurture the growth of nanotechnology by drawing on the lessons learned from a standards effort that has and continues to revolutionize the telecommunications industry. Also, a brief review is presented on current efforts in the US to create nanotechnology standards

  14. Standardization of Software Application Development and Governance

    Science.gov (United States)

    2015-03-01

of their systems or applications. DOD systems do not have the luxury of replacing systems at the same pace as commercial companies. DOD has to...is not that the commercial market purposefully sells products that are not complete, but having a 100% complete product requires extensive testing...develop applications for Google's Android and Apple's iOS devices. Both these companies have SDKs online as well as a number of resources available

  15. Measuring the Software Product Quality during the Software Development Life-Cycle: An ISO Standards Perspective

    OpenAIRE

    Rafa E. Al-Qutaish

    2009-01-01

    Problem statement: The International Organization for Standardization (ISO) published a set of international standards related to software engineering, such as ISO 12207 and ISO 9126. However, there is a set of cross-references between the two standards. Approach: The ISO 9126 standard on software product quality and the ISO 12207 standard on software life cycle processes were analyzed to investigate the relationships between them and to make a mapping from the ISO 9126 quality characteristics to the ISO 1...

  16. Standardization of software application development and governance

    OpenAIRE

    Labbe, Peter P.

    2015-01-01

    Approved for public release; distribution is unlimited A number of Defense Department initiatives focus on how to engineer better systems that directly influence software architecture, including Open Architecture, Enterprise Architecture, and Joint Information Enterprise. Additionally, the Department of Defense (DOD) mandates moving applications to consolidated datacenters and cloud computing. When examined from an application development perspective, the DOD lacks a common approach for in...

  17. Archival standards in archival open access software, and appropriate software for internal archival centers

    Directory of Open Access Journals (Sweden)

    Abdolreza Izadi

    2016-12-01

    Full Text Available The purpose of this study is to examine descriptive metadata standards in archival open source software, to determine the most appropriate descriptive metadata standard(s) and the software support for these standards. The study combines library research, the Delphi method and a descriptive survey; data were gathered by fiche in the library study, by questionnaire in the Delphi method and by checklist in the descriptive survey. The statistical population consists of 5 archival open source software packages. The findings suggest that 5 metadata standards, namely EAD, ISAD, EAC-CPF, ISAAR and ISDF, were judged by the Delphi panel members to be the most appropriate descriptive metadata standards for use in archival software. Moreover, in terms of support for these standards, ICA-ATOM and Archivist Toolkit were judged the most appropriate archival software.

  18. Contracting for Computer Software in Standardized Computer Languages

    Science.gov (United States)

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.

  19. Transport behaviour of commercially available 100-Omega standard resistors

    CSIR Research Space (South Africa)

    Schumacher, B

    2001-04-01

    Full Text Available Several types of commercial 100-Omega resistors can be used with the cryogenic current comparator to maintain the resistance unit, derived from the Quantized Hall Effect (QHE), and to disseminate this unit to laboratory resistance standards. Up...

  20. Commercial Discount Rate Estimation for Efficiency Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fujita, K. Sydny [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-04-13

    Underlying each of the Department of Energy's (DOE's) federal appliance and equipment standards are a set of complex analyses of the projected costs and benefits of regulation. Any new or amended standard must be designed to achieve significant additional energy conservation, provided that it is technologically feasible and economically justified (42 U.S.C. 6295(o)(2)(A)). A proposed standard is considered economically justified when its benefits exceed its burdens, as represented by the projected net present value of costs and benefits. DOE performs multiple analyses to evaluate the balance of costs and benefits of commercial appliance and equipment efficiency standards, at the national and individual building or business level, each framed to capture different nuances of the complex impact of standards on the commercial end user population. The Life-Cycle Cost (LCC) analysis models the combined impact of appliance first cost and operating cost changes on a representative commercial building sample in order to identify the fraction of customers achieving LCC savings or incurring net cost at the considered efficiency levels. Thus, the choice of commercial discount rate value(s) used to calculate the present value of energy cost savings within the Life-Cycle Cost model implicitly plays a key role in estimating the economic impact of potential standard levels. This report is intended to provide a more in-depth discussion of the commercial discount rate estimation process than can be readily included in standard rulemaking Technical Support Documents (TSDs).
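    The sensitivity of the LCC result to the chosen commercial discount rate can be illustrated with a short present-value sketch. The numbers, rates and function name below are hypothetical illustrations, not DOE's actual model or data:

```python
def npv_energy_savings(annual_savings, discount_rate, lifetime_years):
    """Present value of a stream of equal annual energy cost savings."""
    return sum(annual_savings / (1 + discount_rate) ** t
               for t in range(1, lifetime_years + 1))

# A higher commercial discount rate shrinks the present value of the
# same savings stream, making a given efficiency level harder to justify.
pv_low_rate = npv_energy_savings(100.0, 0.03, 15)   # ~1193.8
pv_high_rate = npv_energy_savings(100.0, 0.07, 15)  # ~910.8
```

    The same calculation underlies why discount rate estimation matters: the engineering inputs are unchanged, yet the present value of the savings differs by roughly a quarter between the two rates.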

  1. Practical support for Lean Six Sigma software process definition using IEEE software engineering standards

    CERN Document Server

    Land, Susan K; Walz, John W

    2012-01-01

    Practical Support for Lean Six Sigma Software Process Definition: Using IEEE Software Engineering Standards addresses the task of meeting the specific documentation requirements in support of Lean Six Sigma. This book provides a set of templates supporting the documentation required for basic software project control and management and covers the integration of these templates for their entire product development life cycle. Find detailed documentation guidance in the form of organizational policy descriptions, integrated set of deployable document templates, artifacts required in suppo

  2. Contracting for Computer Software in Standardized Computer Languages

    OpenAIRE

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the co...

  3. Reducing the risk of failure: Software Quality assurance standards and methods

    International Nuclear Information System (INIS)

    Elphick, J.; Cope, H.

    1992-01-01

    An effective Software Quality Assurance (SQA) program provides an overall approach to software engineering and the establishment of proven methods for the production of reliable software. In the authors' experience, the overall costs for the software life cycle are diminished by the application of quality methods, and the issues involved in implementing quality standards and practices are many. This paper addresses those issues as well as the lessons learned from developing and implementing a number of software quality assurance programs. The authors' experience includes the development and implementation of their own NRC-accepted SQA program and an SQA program for an engineering software developer, as well as developing SQA procedures, standards, and methods for utilities and for medical and commercial clients. Some of the issues addressed in this paper are: setting goals and defining quality; applying the software life cycle; addressing organizational issues; providing flexibility and increasing productivity; producing effective documentation; maintaining quality records; imposing software configuration management; conducting reviews, audits, and controls; verification and validation; and controlling software procurement.

  4. Company's Unusual Plan to Package Commercial Software with Business Textbooks Produces a Measure of Success.

    Science.gov (United States)

    Watkins, Beverly T.

    1992-01-01

    Course Technology Inc. has developed 10 products combining textbooks with commercial software for college accounting, business, computer science, and statistics courses. Five of the products use Lotus 1-2-3 spreadsheet software. The products have been positively received by teachers and students. (DB)

  5. Standards guide for space and earth sciences computer software

    Science.gov (United States)

    Mason, G.; Chapman, R.; Klinglesmith, D.; Linnekin, J.; Putney, W.; Shaffer, F.; Dapice, R.

    1972-01-01

    Guidelines for the preparation of systems analysis and programming work statements are presented. The data is geared toward the efficient administration of available monetary and equipment resources. Language standards and the application of good management techniques to software development are emphasized.

  6. Computer-assisted operational management of power plants in the field of tension between standard and individual software; IT-unterstuetzte Betriebsfuehrung von Kraftwerken. Im Spannungsfeld von Standard- und Individual-Software

    Energy Technology Data Exchange (ETDEWEB)

    Hippmann, Norbert [RWE Power AG, Essen (Germany). Sparte Steinkohle-/Gas-Kraftwerke

    2010-07-01

    Process routines in the operational management of power plants - particularly maintenance - are now largely planned, controlled and documented with the help of IT. Depending on corporate policy, IT support for routines is currently realised either with commercially available standard ERP software or with dedicated applications that have been specially developed for a given company. Whereas standard software has certain technical benefits (homogeneous databases, data integrity, standard user interface, no software interfaces, standard maintenance and service), customised applications have the undisputed advantage of offering the best possible mapping of company-specific process routines. By exploiting the full spectrum of IT enhancement options of its SAP system, RWE Power has largely combined the respective benefits of both standard and customised software, while also realising high-end user requirements that go beyond the mere standard. (orig.)

  7. Commercialization and Standardization Progress Towards an Optical Communications Earth Relay

    Science.gov (United States)

    Edwards, Bernard L.; Israel, David J.

    2015-01-01

    NASA is planning to launch the next generation of a space based Earth relay in 2025 to join the current Space Network, consisting of Tracking and Data Relay Satellites in space and the corresponding infrastructure on Earth. While the requirements and architecture for that relay satellite are unknown at this time, NASA is investing in communications technologies that could be deployed to provide new communications services. One of those new technologies is optical communications. The Laser Communications Relay Demonstration (LCRD) project, scheduled for launch in 2018 as a hosted payload on a commercial communications satellite, is a critical pathfinder towards NASA providing optical communications services on the next generation space based relay. This paper will describe NASA efforts in the on-going commercialization of optical communications and the development of inter-operability standards. Both are seen as critical to making optical communications a reality on future NASA science and exploration missions. Commercialization is important because NASA would like to eventually be able to simply purchase an entire optical communications terminal from a commercial provider. Inter-operability standards are needed to ensure that optical communications terminals developed by one vendor are compatible with the terminals of another. International standards in optical communications would also allow the space missions of one nation to use the infrastructure of another.

  8. Energy efficiency standards for residential and commercial equipment: Additional opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Rosenquist, Greg; McNeil, Michael; Iyer, Maithili; Meyers, Steve; McMahon, Jim

    2004-08-02

    Energy efficiency standards set minimum levels of energy efficiency that must be met by new products. Depending on the dynamics of the market and the level of the standard, the effect on the market for a given product may be small, moderate, or large. Energy efficiency standards address a number of market failures that exist in the buildings sector. Decisions about efficiency levels often are made by people who will not be responsible for the energy bill, such as landlords or developers of commercial buildings. Many buildings are occupied for their entire lives by very temporary owners or renters, each unwilling to make long-term investments that would mostly reward subsequent users. And sometimes what looks like apathy about efficiency merely reflects inadequate information or time invested to evaluate it. In addition to these sector-specific market failures, energy efficiency standards address the endemic failure of energy prices to incorporate externalities. In the U.S., energy efficiency standards for consumer products were first implemented in California in 1977. National standards became effective starting in 1988. By the end of 2001, national standards were in effect for over a dozen residential appliances, as well as for a number of commercial sector products. Updated standards will take effect in the next few years for several products. Outside the U.S., over 30 countries have adopted minimum energy performance standards. Technologies and markets are dynamic, and additional opportunities to improve energy efficiency exist. There are two main avenues for extending energy efficiency standards. One is upgrading standards that already exist for specific products. The other is adopting standards for products that are not covered by existing standards. In the absence of new and upgraded energy efficiency standards, it is likely that many new products will enter the stock with lower levels of energy efficiency than would otherwise be the case. Once in the stock

  9. Using commercial software products for atmospheric remote sensing

    Science.gov (United States)

    Kristl, Joseph A.; Tibaudo, Cheryl; Tang, Kuilian; Schroeder, John W.

    2002-02-01

    The Ontar Corporation (www.Ontar.com) has developed several products for atmospheric remote sensing to calculate radiative transport, atmospheric transmission, and sensor performance in both the normal atmosphere and the atmosphere disturbed by battlefield conditions of smoke, dust, explosives and turbulence. These products include: PcModWin: Uses the USAF standard MODTRAN model to compute the atmospheric transmission and radiance at medium spectral resolution (2 cm-1) from the ultraviolet/visible into the infrared and microwave regions of the spectrum. It can be used for any geometry and atmospheric conditions such as aerosols, clouds and rain. PcLnWin: Uses the USAF standard FASCOD model to compute atmospheric transmission and emission at high (line-by-line) spectral resolution using the HITRAN 2000 database. It can be used over the same spectrum, from the UV/visible into the infrared and microwave regions. HitranPC: Computes the absolute high (line-by-line) spectral resolution transmission spectrum of the atmosphere for different temperatures and pressures. HitranPC is a user-friendly program developed by the University of South Florida (USF) and uses the international standard molecular spectroscopic database, HITRAN. LidarPC: A computer program to calculate the Laser Radar/Lidar Equation for hard targets and atmospheric backscatter, using manually input atmospheric parameters or HitranPC and BETASPEC transmission and backscatter calculations of the atmosphere. Also developed by the University of South Florida (USF). PcEosael: A library of programs that mathematically describe aspects of electromagnetic propagation in battlefield environments. Its 25 modules are connected but can be exercised individually. Covers eight general categories of atmospheric effects, including gases, aerosols and laser propagation. Based on codes developed by the Army Research Lab. NVTherm: NVTherm models parallel scan, serial scan, and staring thermal imagers that operate
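    The "Laser Radar/Lidar Equation" that a program such as LidarPC evaluates can be sketched in its generic single-scatter textbook form (this is the standard elastic lidar equation for a homogeneous atmosphere, not LidarPC's actual implementation; all parameter values are illustrative):

```python
import math

def lidar_return_power(p0, rng_m, beta, sigma, area_m2, pulse_len_m, eff=1.0):
    """Single-scatter elastic lidar equation, homogeneous atmosphere:
    P(R) = P0 * eff * (A / R^2) * (L/2) * beta * exp(-2*sigma*R),
    where L/2 is the effective pulse length and exp(-2*sigma*R) the
    two-way atmospheric transmission to range R."""
    two_way_transmission = math.exp(-2.0 * sigma * rng_m)
    return (p0 * eff * (area_m2 / rng_m ** 2)
            * (pulse_len_m / 2.0) * beta * two_way_transmission)

# With zero extinction the return falls off purely as 1/R^2; adding
# extinction attenuates it further through the two-way transmission.
p_1km = lidar_return_power(1.0, 1000.0, 1e-6, 0.0, 0.1, 15.0)
p_2km = lidar_return_power(1.0, 2000.0, 1e-6, 0.0, 0.1, 15.0)
```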

  10. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    Science.gov (United States)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time, manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software also could be operated in browser/server mode, which gives the possibility to use it anywhere the internet is accessible. By switching the nuclide library and the related formula behind, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
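    The core concentration calculation such a program automates can be illustrated with the generic relative-standardization method used across activation analysis. This is a simplified sketch, not the described software's actual code; decay and flux corrections are omitted, and all names and numbers are illustrative:

```python
import math

def concentration_relative(area_sample, area_std, conc_std,
                           mass_sample, mass_std):
    """Relative standardization: the element concentration in the sample
    follows from the ratio of specific net peak areas (counts per unit
    mass), scaled by the known concentration in the co-irradiated
    standard."""
    specific_sample = area_sample / mass_sample
    specific_std = area_std / mass_std
    return conc_std * specific_sample / specific_std

def combined_uncertainty(value, relative_errors):
    """Propagate independent relative uncertainties in quadrature."""
    return value * math.sqrt(sum(e * e for e in relative_errors))

# Example: net peak areas 5000 (sample) vs. 8000 (standard), equal
# masses, standard concentration 120 ppm -> 75 ppm in the sample.
conc = concentration_relative(5000.0, 8000.0, 120.0, 1.0, 1.0)
unc = combined_uncertainty(conc, [0.02, 0.01])  # counting + weighing
```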

  11. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time, manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software also could be operated in browser/server mode, which gives the possibility to use it anywhere the internet is accessible. By switching the nuclide library and the related formula behind, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  12. Outsourcing the development of specific application software using the ESA software engineering standards the SPS software Interlock System

    CERN Document Server

    Denis, B

    1995-01-01

    CERN is considering outsourcing as a solution to the reduction of staff. The need to re-engineer the SPS Software Interlock System provided an opportunity to explore the applicability of outsourcing to our specific controls environment, and the ESA PSS-05 standards were selected for the requirements specification, the development, the control and monitoring, and the project management. The software produced by the contractor is now fully operational. After outlining the scope and the complexity of the project, a discussion of the ESA PSS-05 standards will be presented: the choice, the way these standards improve the outsourcing process, the quality induced, but also the need to adapt them and their limitations in the definition of the customer-supplier relationship. The success factors and the difficulties of development under contract will also be discussed. The maintenance aspect and the impact on in-house developments will finally be addressed.

  13. Employing industrial standards in software engineering for W7X

    Energy Technology Data Exchange (ETDEWEB)

    Kuehner, Georg [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Teilinstitut Greifswald, Wendelsteinstrasse 1, D-17491 Greifswald (Germany)], E-mail: kuehner@ipp.mpg.de; Bluhm, Torsten [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Teilinstitut Greifswald, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Heimann, Peter [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany); Hennig, Christine [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Teilinstitut Greifswald, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Kroiss, Hugo [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany); Krueger, Alexander [University of Applied Sciences, Schwedenschanze 135, 18435 Stralsund (Germany); Laqua, Heike; Lewerentz, Marc [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Teilinstitut Greifswald, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Maier, Josef [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany); Riemann, Heike; Schacht, Joerg; Spring, Anett; Werner, Andreas [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Teilinstitut Greifswald, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Zilker, Manfred [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2009-06-15

    The stellarator W7X is a large complex experiment designed for continuous operation and planned to be operated for about 20 years. Software support is highly demanded for experiment preparation, operation and data analysis, which in turn induces serious non-functional requirements on software quality, e.g.: high availability, stability and maintainability; high flexibility concerning changes of functionality, technology and personnel; and high versatility concerning the scale of system size and performance. These challenges are best met by exploiting industrial experience in quality management and assurance (QM/QA), e.g. focusing on top-down development methods, developing an integral functional system model, using UML as a diagramming standard, building vertical prototypes, and supporting distributed development, all of which have been used for W7X, however on an 'as necessary' basis. Proceeding in this manner gave significant results for control, data acquisition, corresponding database structures and user applications over many years. As soon as production systems started using the software in the labs or on a prototype, the development activity demanded to be organized in a more rigorous process, mainly to provide stable operation conditions. Thus a process improvement activity was started for the stepwise introduction of quality assuring processes with tool support, taking standards like CMMI and ISO-15504 (SPICE) as a guideline. Experiences obtained so far will be reported. We conclude that software engineering and quality assurance have to be an integral part of systems engineering right from the beginning of projects, and be organized according to industrial standards, to be prepared for the challenges of nuclear fusion research.

  14. Software for the IAEA Occupational Radiation Protection Standards

    International Nuclear Information System (INIS)

    Mocaun, N.M.; Paul, F.; Griffith, R.V.; Gustafsson, M.; Webb, G.A.M.; Enache, A.

    2000-01-01

    The software version of the International Basic Safety Standards (BSS) for Protection against Ionizing Radiation and for the Safety of Radiation Sources, jointly sponsored by the Food and Agriculture Organization of the United Nations, International Atomic Energy Agency, International Labour Organization, Nuclear Energy Agency of the Organization for Economic Co-operation and Development, Pan American Health Organization and World Health Organization, was issued on diskette (SS115 software version) by the IAEA in 1997. This Windows-based software was written in Visual Basic and is designed to provide the user with a powerful and flexible retrieval system to access the 364-page BSS. The code enables the user to search the BSS, including 22 tables and 254 topics, directly through the 'contents' tree. Access is based on keywords, a subject index or cross-referencing between portions of the document dealing with different aspects of the same issue or concept. Definitions of important terms used in the Standards can be found by accessing the Glossary. Text and data can be extracted using familiar copy, paste and print features. Publication of three Safety Guides on Occupational Radiation Protection, with co-sponsorship of the IAEA and the International Labour Office, is planned for the second half of 1999. The same system will be used to provide these on diskette or CD-ROM (ORPGUIDE version 4.1). The new software will include the Safety Guides: Occupational Radiation Protection, Assessment of Occupational Exposure due to Intakes of Radionuclides, and Assessment of Occupational Exposure due to External Sources of Radiation, as well as the BSS and the Safety Fundamentals, Radiation Protection and the Safety of Radiation Sources. The capabilities of the new software have been expanded to include free-form text search and cross-referencing of the five documents which will comprise the guidance of the IAEA and its co-sponsors on Occupational Radiation Protection. It is envisioned that the

  15. A proposed acceptance process for commercial off-the-shelf (COTS) software in reactor applications

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Scott, J.A.

    1996-03-01

    This paper proposes a process for acceptance of commercial off-the-shelf (COTS) software products for use in reactor systems important to safety. An initial set of four criteria establishes COTS software product identification and its safety category. Based on safety category, three sets of additional criteria, graded in rigor, are applied to approve/disapprove the product. These criteria fall roughly into three areas: product assurance, verification of safety function and safety impact, and examination of usage experience of the COTS product in circumstances similar to the proposed application. A report addressing the testing of existing software is included as an appendix

  16. Capacity Management as a Service for Enterprise Standard Software

    Directory of Open Access Journals (Sweden)

    Hendrik Müller

    2017-12-01

    Full Text Available Capacity management approaches optimize component utilization from a strongly technical perspective. In fact, the quality of the involved services is considered only implicitly, by linking it to resource capacity values. This practice hinders the evaluation of design alternatives with respect to given service levels that are expressed in user-centric metrics, such as the mean response time for a business transaction. We argue that historical workload traces often contain a variety of performance-related information that allows for the integration of performance prediction techniques through machine learning. Since enterprise applications extensively make use of standard software that is shipped by large software vendors to a wide range of customers, standardized prediction models can be trained and provisioned as part of a capacity management service, which we propose in this article. Therefore, we integrate knowledge discovery activities into well-known capacity planning steps, which we adapt to the special characteristics of enterprise applications. Using a real-world example, we demonstrate how prediction models that were trained on a large volume of monitoring data enable cost-efficient measurement-based prediction techniques to be used in early design and redesign phases of planned or running applications. Finally, based on the trained model, we demonstrate how to simulate and analyze future workload scenarios. Using a Pareto approach, we were able to identify cost-effective design alternatives for an enterprise application whose capacity is being managed.
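    The measurement-based prediction idea in this record can be sketched with a toy model: fit mean response time against concurrent users from monitoring traces, then check a planned workload scenario against a user-centric service-level target. The trace values, function names and SLA threshold are invented for illustration and are not taken from the article:

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = intercept + slope * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

users = [100.0, 200.0, 300.0, 400.0]    # workload from historical traces
resp_ms = [120.0, 180.0, 250.0, 310.0]  # observed mean response times
slope, intercept = fit_linear(users, resp_ms)

def predict_response_ms(concurrent_users):
    """Predicted mean response time for a future workload scenario."""
    return intercept + slope * concurrent_users

# Evaluate a planned scenario against a 400 ms mean-response-time target.
meets_sla = predict_response_ms(500.0) <= 400.0
```

    A production service would of course use richer models trained on far larger traces, but the design question is the same: express the target in a user-centric metric and let the model, rather than raw utilization thresholds, decide between alternatives.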

  17. Features of commercial computer software systems for medical examiners and coroners.

    Science.gov (United States)

    Hanzlick, R L; Parrish, R G; Ing, R

    1993-12-01

    There are many ways of automating medical examiner and coroner offices, one of which is to purchase commercial software products specifically designed for death investigation. We surveyed four companies that offer such products and requested information regarding each company and its hardware, software, operating systems, peripheral devices, applications, networking options, programming language, querying capability, coding systems, prices, customer support, and number and size of offices using the product. Although the four products (CME2, ForenCIS, InQuest, and Medical Examiner's Software System) are similar in many respects and each can be installed on personal computers, there are differences among the products with regard to cost, applications, and the other features. Death investigators interested in office automation should explore these products to determine the usefulness of each in comparison with the others and in comparison with general-purpose, off-the-shelf databases and software adaptable to death investigation needs.

  18. 77 FR 30919 - Commercial Driver's License Testing and Commercial Learner's Permit Standards

    Science.gov (United States)

    2012-05-24

    ..., and 385 [Docket No. FMCSA-2007-27659] Commercial Driver's License Testing and Commercial Learner's... published a final rule titled ``Commercial Driver's License Testing and Commercial Learner's Permit... additional drivers, primarily those transporting certain tanks temporarily attached to the commercial motor...

  19. SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects

    Science.gov (United States)

    Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M

    1998-01-01

    SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programs. SoftLab addresses the inadequacies of the existing soft-computing software by supporting comprehensive multidisciplinary functionalities from management tools to engineering systems. Furthermore, the built-in features help the user process/analyze information more efficiently by a friendly yet powerful interface, and will allow the user to specify user-specific processing modules, hence adding to the standard configuration of the software environment.

  20. Space and Missile Systems Center Standard: Software Development

    Science.gov (United States)

    2015-01-16

    waterfall development lifecycle models. Source: Adapted from (IEEE 610.12). See (IEEE 1074) for more information. Software ...spiral, and waterfall lifecycle models.) 2. The developer shall record the selected software development lifecycle model(s) in the Software ...through, i.e., waterfall, lifecycle model, the following requirements apply with the interpretation that the software is developed as a single build.

  1. Commercial software upgrades may significantly alter Perfusion CT parameter values in colorectal cancer

    International Nuclear Information System (INIS)

    Goh, Vicky; Shastry, Manu; Endozo, Raymondo; Groves, Ashley M.; Engledow, Alec; Peck, Jacqui; Reston, Jonathan; Wellsted, David M.; Rodriguez-Justo, Manuel; Taylor, Stuart A.; Halligan, Steve

    2011-01-01

    To determine how commercial software platform upgrades affect derived Perfusion CT parameters for colorectal cancer. Following ethical approval, 30 patients with suspected colorectal cancer underwent Perfusion CT using integrated 64-detector PET/CT before surgery. Analysis was performed using software based on modified distributed parameter analysis (Perfusion software version 4; Perfusion 4.0), then repeated using the previous version (Perfusion software version 3; Perfusion 3.0). Tumour blood flow (BF), blood volume (BV), mean transit time (MTT) and permeability surface area product (PS) were determined for identical regions of interest. Slice-by-slice and 'whole tumour' variance was assessed by Bland-Altman analysis. Mean BF, BV and PS were 20.4%, 59.5%, and 106% higher, and MTT was 14.3% shorter, for Perfusion 4.0 than for Perfusion 3.0. The mean differences (95% limits of agreement) were +13.5 (-44.9 to 72.0), +2.61 (-0.06 to 5.28), -1.23 (-6.83 to 4.36), and +14.2 (-4.43 to 32.8) for BF, BV, MTT and PS respectively. The within-subject coefficient of variation was 36.6%, 38.0%, 27.4% and 60.6% for BF, BV, MTT and PS respectively, indicating moderate to poor agreement. Software version upgrades of the same software platform may result in significantly different parameter values, requiring adjustments for cross-version comparison. (orig.)
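    The Bland-Altman analysis described above reduces to the mean difference (bias) and the 95% limits of agreement. A minimal sketch follows; it is a generic illustration, not the study's code, and the blood-flow values are invented for demonstration.

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Bland-Altman agreement statistics for paired measurements:
        the mean difference (bias) and the 95% limits of agreement,
        i.e. bias +/- 1.96 * SD of the differences."""
        diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
        bias = diff.mean()
        sd = diff.std(ddof=1)  # sample standard deviation
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    # Hypothetical blood-flow values (mL/100 g/min) from two software versions
    bf_v4 = [55.2, 48.7, 61.0, 52.3, 58.9]
    bf_v3 = [44.1, 40.2, 49.5, 43.0, 47.8]
    bias, (lo, hi) = bland_altman(bf_v4, bf_v3)
    print(f"bias = {bias:.2f}, 95% limits of agreement = ({lo:.2f}, {hi:.2f})")
    ```

    Wide limits of agreement relative to the bias, as in the study, indicate that the two software versions cannot be used interchangeably without adjustment.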

  2. 76 FR 39018 - Commercial Driver's License Testing and Commercial Learner's Permit Standards; Corrections

    Science.gov (United States)

    2011-07-05

    ... [Docket No. FMCSA-2007-27659] RIN 2126-AB02 Commercial Driver's License Testing and Commercial Learner's..., 2011, that will be effective on July 8, 2011. This final rule amends the commercial driver's license... to issue the commercial learner's permit (CLP). Since the final rule was published, FMCSA identified...

  3. Buying in to bioinformatics: an introduction to commercial sequence analysis software.

    Science.gov (United States)

    Smith, David Roy

    2015-07-01

    Advancements in high-throughput nucleotide sequencing techniques have brought with them state-of-the-art bioinformatics programs and software packages. Given the importance of molecular sequence data in contemporary life science research, these software suites are becoming an essential component of many labs and classrooms, and as such are frequently designed for non-computer specialists and marketed as one-stop bioinformatics toolkits. Although beautifully designed and powerful, user-friendly bioinformatics packages can be expensive and, as more arrive on the market each year, it can be difficult for researchers, teachers and students to choose the right software for their needs, especially if they do not have a bioinformatics background. This review highlights some of the currently available and most popular commercial bioinformatics packages, discussing their prices, usability, features and suitability for teaching. Although several commercial bioinformatics programs are arguably overpriced and overhyped, many are well designed, sophisticated and, in my opinion, worth the investment. Whether you are just beginning your foray into molecular sequence analysis or are an experienced genomicist, I encourage you to explore proprietary software bundles. They have the potential to streamline your research, increase your productivity, energize your classroom and, if anything, add a bit of zest to the often dry, detached world of bioinformatics. © The Author 2014. Published by Oxford University Press.

  4. Multi-institutional Validation Study of Commercially Available Deformable Image Registration Software for Thoracic Images

    International Nuclear Information System (INIS)

    Kadoya, Noriyuki; Nakajima, Yujiro; Saito, Masahide; Miyabe, Yuki; Kurooka, Masahiko; Kito, Satoshi; Fujita, Yukio; Sasaki, Motoharu; Arai, Kazuhiro; Tani, Kensuke; Yagi, Masashi; Wakita, Akihisa; Tohyama, Naoki; Jingu, Keiichi

    2016-01-01

    Purpose: To assess the accuracy of commercially available deformable image registration (DIR) software for thoracic images at multiple institutions. Methods and Materials: Thoracic 4-dimensional (4D) CT images of 10 patients with esophageal or lung cancer were used. Datasets for these patients were provided by DIR-lab (dir-lab.com) and included a coordinate list of anatomic landmarks (300 bronchial bifurcations) that had been manually identified. Deformable image registration was performed between the peak-inhale and peak-exhale images. Deformable image registration error was determined by calculating, at each landmark point, the difference between the displacement calculated by the DIR software and the displacement measured from the manually identified landmarks. Results: Eleven institutions participated in this study: 4 used RayStation (RaySearch Laboratories, Stockholm, Sweden), 5 used MIM Software (Cleveland, OH), and 3 used Velocity (Varian Medical Systems, Palo Alto, CA). The ranges of the average absolute registration errors over all cases were as follows: 0.48 to 1.51 mm (right-left), 0.53 to 2.86 mm (anterior-posterior), 0.85 to 4.46 mm (superior-inferior), and 1.26 to 6.20 mm (3-dimensional). For each DIR software package, the average 3-dimensional registration error (range) was as follows: RayStation, 3.28 mm (1.26-3.91 mm); MIM Software, 3.29 mm (2.17-3.61 mm); and Velocity, 5.01 mm (4.02-6.20 mm). These results demonstrate that there was moderate variation among institutions, even when the DIR software was the same. Conclusions: We evaluated commercially available DIR software using thoracic 4D-CT images from multiple centers. Our results demonstrated that DIR accuracy differed among institutions because it was dependent on both the DIR software and the procedure. Our results could be helpful for establishing prospective clinical trials and for the widespread use of DIR software. In addition, for clinical care, we should try to find the optimal DIR procedure using thoracic 4D
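    The error metric used in validation studies of this kind, comparing DIR-predicted displacements against displacements measured from manually identified landmark pairs, can be sketched as follows. This is a generic illustration with made-up arrays, not the study's code.

    ```python
    import numpy as np

    def dir_registration_error(dvf_disp, landmark_disp):
        """Mean absolute per-axis error and mean 3D (Euclidean) error, in mm,
        between displacements predicted by DIR software and displacements
        measured from manually identified landmark pairs.
        Both inputs are (N, 3) arrays of displacement vectors."""
        err = np.asarray(dvf_disp, float) - np.asarray(landmark_disp, float)
        per_axis = np.abs(err).mean(axis=0)          # e.g. RL, AP, SI components
        err_3d = np.linalg.norm(err, axis=1).mean()  # 3-dimensional error
        return per_axis, err_3d

    # Hypothetical displacements (mm) at three landmarks
    predicted = [[2.0, -1.0, 8.5], [0.5, 0.0, 4.0], [1.2, 2.2, 6.1]]
    measured  = [[1.5, -1.4, 9.5], [0.5, 0.6, 3.2], [1.0, 2.0, 6.0]]
    per_axis, err_3d = dir_registration_error(predicted, measured)
    print(per_axis, err_3d)
    ```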

  5. Increasing software testability with standard access and control interfaces

    Science.gov (United States)

    Nikora, Allen P; Some, Raphael R.; Tamir, Yuval

    2003-01-01

    We describe an approach to improving the testability of complex software systems with software constructs modeled after the hardware JTAG bus, which is used to provide visibility and controllability in testing digital circuits.

  6. Issues and relationships among software standards for nuclear safety applications. Version 2.0

    International Nuclear Information System (INIS)

    Scott, J.A.; Preckshot, G.G.; Lawrence, J.D.; Johnson, G.L.

    1996-01-01

    Lawrence Livermore National Laboratory is assisting the Nuclear Regulatory Commission with the development of draft regulatory guides for selected software engineering standards. This report describes the results of the initial task in this work. The selected software standards and a set of related software engineering standards were reviewed, and the resulting preliminary elements of the regulatory positions are identified in this report. The importance of a thorough understanding of the relationships among standards useful for developing safety-related software is emphasized. The relationship of this work to the update of the Standard Review Plan is also discussed

  7. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    Science.gov (United States)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draw conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results derive not only from the inherent power of the software package, but also from the skill and understanding of the data analyst.

  8. Software-Defined Solutions for Managing Energy Use in Small to Medium Sized Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Peffer, Therese [Univ. of California, Berkeley, CA (United States); Council on International Education Exchange (CIEE), Portland, ME (United States); Blumstein, Carl [Council on International Education Exchange (CIEE), Portland, ME (United States); Culler, David [Univ. of California, Berkeley, CA (United States). Electrical Engineering and Computer Sciences (EECS); Modera, Mark [Univ. of California, Davis, CA (United States). Western Cooling Efficiency Center (WCEC); Meier, Alan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-09-10

    The project uses state-of-the-art computer science to extend the benefits of Building Automation Systems (BAS), typically found in large buildings (>100,000 square feet), to medium-sized commercial buildings (<50,000 sq ft). The BAS developed in this project, termed OpenBAS, uses an open-source, open-architecture software platform, user interface, and plug-and-play control devices to facilitate adoption of energy efficiency strategies in the commercial building sector throughout the United States. At the heart of this “turn key” BAS is the platform with three types of controllers—thermostat, lighting controller, and general controller—that are easily “discovered” by the platform in a plug-and-play fashion. The user interface showcases the platform and provides control system set-up, a system status display, and a means of automatically mapping the control points in the system.

  9. 78 FR 73589 - Energy Conservation Program: Energy Conservation Standards for Commercial and Industrial Electric...

    Science.gov (United States)

    2013-12-06

    ... Conservation Program: Energy Conservation Standards for Commercial and Industrial Electric Motors; Proposed... Conservation Program: Energy Conservation Standards for Commercial and Industrial Electric Motors AGENCY... proposes energy conservation standards for a number of different groups of electric motors that DOE has not...

  10. Comparison of 3D reconstruction of mandible for pre-operative planning using commercial and open-source software

    Science.gov (United States)

    Abdullah, Johari Yap; Omar, Marzuki; Pritam, Helmi Mohd Hadi; Husein, Adam; Rajion, Zainul Ahmad

    2016-12-01

    3D printing of the mandible is important for pre-operative planning and diagnostic purposes, as well as for education and training. Currently, the processing of CT data is routinely performed with commercial software, which increases the cost of operation and patient management for a small clinical setting. Usage of open-source software as an alternative to commercial software for 3D reconstruction of the mandible from CT data is scarce. The aim of this study is to compare two methods of 3D reconstruction of the mandible, using the commercial Materialise Mimics software and the open-source Medical Imaging Interaction Toolkit (MITK) software. Head CT images with a slice thickness of 1 mm and a matrix of 512x512 pixels each were retrieved from the server located at the Radiology Department of Hospital Universiti Sains Malaysia. The CT data were analysed and 3D models of the mandible were reconstructed using both the commercial Materialise Mimics and the open-source MITK software. Both virtual 3D models were saved in STL format and exported to 3matic and MeshLab software for morphometric and image analyses. The models were compared using the Wilcoxon signed-rank test and the Hausdorff distance. No significant differences were obtained between the 3D models of the mandible produced using the Mimics and MITK software. The 3D model of the mandible produced using the open-source MITK software is comparable to that from the commercial Mimics software. Therefore, open-source software could be used in a clinical setting for pre-operative planning to minimise the operational cost.
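    The Hausdorff distance used to compare two reconstructed surfaces measures the worst-case disagreement between them. A brute-force sketch over vertex point sets follows; the toy "meshes" are invented for demonstration and this is not the study's code.

    ```python
    import numpy as np

    def hausdorff(a, b):
        """Symmetric Hausdorff distance between two point sets of shape
        (N, 3) and (M, 3): the largest distance from any point in one set
        to its nearest neighbour in the other. Brute force (N x M distance
        matrix), so suitable only for modest vertex counts."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        return max(d.min(axis=1).max(), d.min(axis=0).max())

    # Two toy "meshes" represented by their vertices
    m1 = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
    m2 = [[0.1, 0.0, 0.0], [1.0, 0.1, 0.0], [0.0, 1.2, 0.0]]
    print(hausdorff(m1, m2))
    ```

    For large meshes a spatial index (e.g. a k-d tree) replaces the full distance matrix, but the metric itself is unchanged.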

  11. The Image of User Instructions: Comparing Users' Expectations of and Experiences with an Official and a Commercial Software Manual

    NARCIS (Netherlands)

    de Jong, Menno D.T.; Karreman, Joyce

    2017-01-01

    Purpose: The market for (paid-for) commercial software manuals is flourishing, while (free) official manuals are often assumed to be neglected by users. To investigate differences in user perceptions of commercial and official manuals, we conducted two studies: one focusing on user expectations and

  12. Numerical Analyses of Subsoil-structure Interaction in Original Non-commercial Software based on FEM

    Science.gov (United States)

    Cajka, R.; Vaskova, J.; Vasek, J.

    2018-04-01

    For decades, attention has been paid to the interaction of foundation structures and subsoil and to the development of interaction models. Given that analytical solutions of subsoil-structure interaction can be deduced only for some simple shapes of load, analytical solutions are increasingly being replaced by numerical solutions (e.g. FEM, the finite element method). Numerical analysis provides greater possibilities for taking into account the real factors involved in subsoil-structure interaction and was also used in this article. This makes it possible to design foundation structures more efficiently while remaining reliable and secure. Currently there are several software packages that can deal with the interaction of foundations and subsoil. It has been demonstrated that the non-commercial software MKPINTER (created by Cajka) provides results appropriately close to actual measured values. In the MKPINTER software, stress-strain analysis of the elastic half-space is done by means of Gauss numerical integration and the Jacobian of transformation. Input data for the numerical analysis were obtained from an experimental loading test of a concrete slab. The loading was performed using unique experimental equipment constructed at the Faculty of Civil Engineering, VŠB-TU Ostrava. The purpose of this paper is to compare the resulting deformation of the slab with values observed during the experimental loading test.

  13. Technical report on the surface reconstruction of stacked contours by using the commercial software

    Science.gov (United States)

    Shin, Dong Sun; Chung, Min Suk; Hwang, Sung Bae; Park, Jin Seo

    2007-03-01

    After drawing and stacking contours of a structure identified in serially sectioned images, a three-dimensional (3D) image can be made by surface reconstruction. Usually, custom software is written for the surface reconstruction; to compose such software, medical doctors have to enlist the help of computer engineers. In this research, surface reconstruction of stacked contours was therefore attempted using commercial software. The purpose of this research is to enable medical doctors to perform surface reconstruction and make 3D images by themselves. The materials of this research were 996 anatomic images (1 mm intervals) of the left lower limb, which were made by serial sectioning of a cadaver. In Adobe Photoshop, contours of 114 anatomic structures were drawn and exported to Adobe Illustrator files. In Maya, the contours of each anatomic structure were stacked. In Rhino, superoinferior lines were drawn along all stacked contours to fill quadrangular surfaces between contours. In Maya, the contours were then deleted. 3D images of the 114 anatomic structures were assembled with their original locations preserved. With the surface reconstruction technique developed in this research, medical doctors themselves could make 3D images from serially sectioned images such as CTs and MRIs.

  14. High Resolution Topography of Polar Regions from Commercial Satellite Imagery, Petascale Computing and Open Source Software

    Science.gov (United States)

    Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, WIlliam; Bauer, Greg; Bates, Brian; Williamson, Cathleen

    2017-04-01

    Surface topography is among the most fundamental data sets for the geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter commercial imagery licensed by the National Geospatial-Intelligence Agency and open-source photogrammetry software to produce a time-tagged 2m posting elevation model of the Arctic and an 8m posting reference elevation model for the Antarctic. When complete, this publicly available data will be at a higher resolution than any elevation model that covers the entirety of the Western United States. These two polar projects are made possible by three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data; and the management strategies the team needed in order to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.

  15. Development of a coppice planting machine to commercial standards

    Energy Technology Data Exchange (ETDEWEB)

    Turton, J.S.

    2000-07-01

    This report gives details of the development work carried out on the Turton Engineering Coppice Planting machine in order to commercially market it. The background to the machine, which plants single rows of cuttings from rods, is traced, and previous development work, design work, production of sub-assemblies and the assembly of modules, inspection and assembly, static trials, and commercial planting are examined. Further machine developments, proving trials, and recommendations for further work are discussed. Appendices address relationships applicable to vertical planting, the Turton short rotation cultivation machine rod format, estimated prices and charges, and a list of main suppliers. (UK)

  16. Comparison of ISO 9000 and recent software life cycle standards to nuclear regulatory review guidance

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Scott, J.A.

    1998-01-01

    Lawrence Livermore National Laboratory is assisting the Nuclear Regulatory Commission with the assessment of certain quality and software life cycle standards to determine whether additional guidance for the U.S. nuclear regulatory context should be derived from the standards. This report describes the nature of the standards and compares the guidance of the standards to that of the recently updated Standard Review Plan

  17. 76 FR 38153 - California State Nonroad Engine Pollution Control Standards; Commercial Harbor Craft Regulations...

    Science.gov (United States)

    2011-06-29

    ... Standards; Commercial Harbor Craft Regulations; Opportunity for Public Hearing and Comment AGENCY... engines on commercial harbor craft. CARB has requested that EPA issue a new authorization under [email protected] . SUPPLEMENTARY INFORMATION: I. California's Commercial Harbor Craft Regulations In a...

  18. Round table discussion: Quality control and standardization of nuclear medicine software

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    In summary the round table came to the following important conclusions: Nuclear medicine software systems need better documentation, especially regarding details of algorithms and limitations, and user friendliness could be considerably improved. Quality control of software is an integral part of quality assurance in nuclear medicine and should be performed at all levels of the software. Quality control of applications software should preferably be performed with assistance of generally accepted software phantoms. A basic form of standardization was welcomed and partly regarded as essential by all participants. Some areas such as patient study files could be standardized in the near future, whereas other areas such as the standardization of clinical applications programs or acquisition protocols still present major difficulties. An international cooperation in the field of standardization of software and other topics has already been started on the European level and should be continued and supported. (orig.)

  19. 75 FR 52378 - Transfer of Commercial Standard Mail Parcels to Competitive Product List

    Science.gov (United States)

    2010-08-25

    ..., 2010, the United States Postal Service[reg] filed with the Postal Regulatory Commission a Request of the United States Postal Service to transfer commercial Standard Mail Parcels from the Mail... POSTAL SERVICE Transfer of Commercial Standard Mail Parcels to Competitive Product List AGENCY...

  20. Two‐year experience with the commercial Gamma Knife Check software

    Science.gov (United States)

    Bhatnagar, Jagdish; Bednarz, Greg; Novotny, Josef; Flickinger, John; Lunsford, L. Dade; Huq, M. Saiful

    2016-01-01

    The Gamma Knife Check software is an FDA approved second check system for dose calculations in Gamma Knife radiosurgery. The purpose of this study was to evaluate the accuracy and the stability of the commercial software package as a tool for independent dose verification. The Gamma Knife Check software version 8.4 was commissioned for a Leksell Gamma Knife Perfexion and a 4C unit at the University of Pittsburgh Medical Center in May 2012. Independent dose verifications were performed using this software for 319 radiosurgery cases on the Perfexion and 283 radiosurgery cases on the 4C units. The cases on each machine were divided into groups according to their diagnoses, and an averaged absolute percent dose difference for each group was calculated. The percentage dose difference for each treatment target was obtained as the relative difference between the Gamma Knife Check dose and the dose from the tissue maximum ratio algorithm (TMR 10) from the GammaPlan software version 10 at the reference point. For treatment plans with imaging skull definition, results obtained from the Gamma Knife Check software using the measurement‐based skull definition method are used for comparison. The collected dose difference data were also analyzed in terms of the distance from the treatment target to the skull, the number of treatment shots used for the target, and the gamma angles of the treatment shots. The averaged percent dose differences between the Gamma Knife Check software and the GammaPlan treatment planning system are 0.3%, 0.89%, 1.24%, 1.09%, 0.83%, 0.55%, 0.33%, and 1.49% for the trigeminal neuralgia, acoustic neuroma, arteriovenous malformation (AVM), meningioma, pituitary adenoma, glioma, functional disorders, and metastasis cases on the Perfexion unit. The corresponding averaged percent dose differences for the 4C unit are 0.33%, 1.2%, 2.78% 1.99%, 1.4%, 1.92%, 0.62%, and 1.51%, respectively. The dose difference is, in general, larger for treatment targets in the
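    The comparison described above, a relative difference at the reference point, averaged as absolute values over each diagnosis group, reduces to simple arithmetic. A minimal sketch (with hypothetical dose values, not the study's data):

    ```python
    def percent_dose_difference(check_dose, tps_dose):
        """Relative difference (%) between the second-check dose and the
        treatment-planning-system dose at the reference point."""
        return 100.0 * (check_dose - tps_dose) / tps_dose

    def averaged_abs_percent_diff(pairs):
        """Average of |percent dose difference| over a group of
        (check_dose, tps_dose) pairs, e.g. one diagnosis group."""
        return sum(abs(percent_dose_difference(c, t)) for c, t in pairs) / len(pairs)

    # Hypothetical doses (Gy) for three targets in one diagnosis group
    group = [(20.2, 20.0), (19.8, 20.0), (20.1, 20.0)]
    print(averaged_abs_percent_diff(group))
    ```

    Averaging the absolute values, rather than the signed differences, keeps over- and under-estimates from cancelling out.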

  1. Software life cycle management standards real-world solutions and scenarios for savings

    CERN Document Server

    Wright, David

    2011-01-01

    Software Life Cycle Management Standards will help you apply ISO/IEC 19770 to your business and enjoy the rewards it offers. David Wright calls on his vast experience to explain how the Standard applies to the whole of the software life cycle, not just the software asset management aspects. His informative guide gives up-to-date information using practical examples, clear diagrams and entertaining anecdotes.

  2. Numerical simulation of nonequilibrium flows by using the state-to-state approach in commercial software

    Science.gov (United States)

    Kunova, O. V.; Shoev, G. V.; Kudryavtsev, A. N.

    2017-01-01

    Nonequilibrium flows of a two-component oxygen mixture O2/O behind a shock wave are studied with due allowance for state-to-state vibrational and chemical kinetics. The system of gas-dynamic equations is supplemented with kinetic equations including the contributions of VT (TV) exchange and dissociation processes. A method for the numerical solution of this system with the ANSYS Fluent commercial software package is proposed, used in combination with the authors' code that takes nonequilibrium kinetics into account. The computed results are compared with parameters obtained by solving the problem in the shock-fitting formulation. The vibrational temperature is compared with experimental data. The numerical tool proposed in the present paper is applied to study the flow around a cylinder.

  3. Performance assessment of the commercial CFD software for the prediction of the PWR internal flow - Corrected version

    International Nuclear Information System (INIS)

    Lee, Gong Hee; Bang, Young Seok; Woo, Sweng Woong; Cheong, Ae Ju; Kim, Do Hyeong; Kang, Min Ku

    2013-01-01

    As computer hardware technology develops, license applicants for nuclear power plants use commercial CFD software with the aim of reducing the excessive conservatism associated with simplified and conservative analysis tools. Even if some CFD software developers and users think that state-of-the-art CFD software can reasonably solve at least single-phase nuclear reactor safety problems, there are still limitations and uncertainties in the calculated results. From a regulatory perspective, the Korea Institute of Nuclear Safety (KINS) is presently conducting a performance assessment of commercial CFD software for nuclear reactor safety problems. In this study, in order to examine the prediction performance of commercial CFD software with a porous model in the analysis of the scale-down APR+ (Advanced Power Reactor Plus) internal flow, simulations were conducted with the on-board numerical models in ANSYS CFX R.14 and FLUENT R.14. It was concluded that, depending on the CFD software, the internal flow distribution of the scale-down APR+ was locally somewhat different. Although there was a limitation in estimating the prediction performance of the commercial CFD software due to the limited number of measured data, CFX R.14 showed more reasonable predicted results in comparison with FLUENT R.14. Meanwhile, due to the difference in discretization methodology, FLUENT R.14 required more computational memory than CFX R.14 for the same grid system. Therefore, CFD software suitable to the available computational resources should be selected for massive parallel computation. (authors)

  4. IEEE [Institute of Electrical and Electronics Engineers] standards and nuclear software quality engineering

    International Nuclear Information System (INIS)

    Daughtrey, T.

    1988-01-01

    Significant new nuclear-specific software standards have recently been adopted under the sponsorship of the American Nuclear Society and the American Society of Mechanical Engineers. The interest of the US Nuclear Regulatory Commission has also been expressed through their issuance of NUREG/CR-4640. These efforts all indicate a growing awareness of the need for thorough, referenceable expressions of the way to build in and evaluate quality in nuclear software. A broader professional perspective can be seen in the growing number of software engineering standards sponsored by the Institute of Electrical and Electronics Engineers (IEEE) Computer Society. This family of standards represents a systematic effort to capture professional consensus on quality practices throughout the software development life cycle. The only omission, the implementation phase, is treated by accepted American National Standards Institute or de facto standards for programming languages

  5. [Development of a software standardizing optical density with operation settings related to several limitations].

    Science.gov (United States)

    Tu, Xiao-Ming; Zhang, Zuo-Heng; Wan, Cheng; Zheng, Yu; Xu, Jin-Mei; Zhang, Yuan-Yuan; Luo, Jian-Ping; Wu, Hai-Wei

    2012-12-01

    To develop a software tool that standardizes optical density and normalizes the procedures and results of standardization, in order to effectively solve several problems generated during standardization of indirect ELISA results. The software was designed based on the I-STOD method, with operation settings to solve the problems that one might encounter during standardization. Matlab GUI was used as the development tool. The software was tested with the results of the detection of sera of persons from schistosomiasis japonica endemic areas. I-STOD V1.0 (WINDOWS XP/WIN 7, 0.5 GB) was successfully developed to standardize optical density. A series of serum samples from schistosomiasis japonica endemic areas was used to examine the operational effects of the I-STOD V1.0 software. The results indicated that the software successfully overcame several problems, including the reliability of the standard curve, the applicable scope of samples, and the determination of dilution for samples outside that scope, so that I-STOD was performed more conveniently and the results of standardization were more consistent. I-STOD V1.0 is a professional software tool based on I-STOD. It can be easily operated and can effectively standardize the test results of indirect ELISA.

  6. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software

    Science.gov (United States)

    Graydon, Patrick J.; Holloway, C. Michael

    2015-01-01

    We need well-founded means of determining whether software is fit for use in safety-critical applications. While software in industries such as aviation has an excellent safety record, the fact that software flaws have contributed to deaths illustrates the need for justifiably high confidence in software. It is often argued that software is fit for safety-critical use because it conforms to a standard for software in safety-critical systems. But little is known about whether such standards `work.' Reliance upon a standard without knowing whether it works is an experiment; without collecting data to assess the standard, this experiment is unplanned. This paper reports on a workshop intended to explore how standards could practicably be assessed. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software (AESSCS) was held on 13 May 2014 in conjunction with the European Dependable Computing Conference (EDCC). We summarize and elaborate on the workshop's discussion of the topic, including both the presented positions and the dialogue that ensued.

  7. TCV software test and validation tools and technique. [Terminal Configured Vehicle program for commercial transport aircraft operation

    Science.gov (United States)

    Straeter, T. A.; Williams, J. R.

    1976-01-01

    The paper describes techniques for testing and validating software for the TCV (Terminal Configured Vehicle) program, which is intended to solve problems associated with operating a commercial transport aircraft in the terminal area. The TCV research test bed is a Boeing 737 specially configured with digital computer systems to carry out research in automatic navigation, guidance, flight controls, and electronic displays. The techniques developed for time and cost reduction include automated documentation aids, automatic software configuration management, and an all-software generation and validation system.

  8. Customizing Standard Software as a Business Model in the IT Industry

    DEFF Research Database (Denmark)

    Kautz, Karlheinz; Rab, Sameen M.; Sinnet, Michael

    2011-01-01

    This research studies a new business model in the IT industry, the customization of standard software as the sole foundation for a software company's earnings. Based on a theoretical background which combines the concepts of inter-organizational networks and open innovation, we provide an interpretive case study of a small software company which customizes a standard product. We investigate the company's interactions with the large global software company which produces the original software product, and with other companies involved in the software customization process. We ... primarily on complex, formal partnerships, in which opportunistic behavior also occurs and where informal relations are invaluable sources of knowledge. In addition, the original software producer's view and treatment of these companies has a vital impact on the customizing company's practice, which ...

  9. Quantitative 177Lu-SPECT/CT imaging and validation of a commercial dosimetry software

    International Nuclear Information System (INIS)

    D'Ambrosio, L.; Aloj, L.; Morisco, A.; Aurilio, M.; Prisco, A.; Di Gennaro, F.; Lastoria, S.; Madesani, D.

    2015-01-01

    Full text of publication follows. Aim: 3D dosimetry is an appealing yet complex application of SPECT/CT in patients undergoing radionuclide therapy. In this study we developed a quantitative imaging protocol and validated commercially available dosimetry software (Dosimetry Toolkit package, GE Healthcare) in patients undergoing 177Lu-DOTATATE therapy. Materials and methods: the Dosimetry Toolkit uses multiple SPECT/CT and/or whole-body planar datasets to quantify changes in radiopharmaceutical uptake over time and determine residence times. The software includes tools for reconstructing SPECT/CT data, registering all scans to a common reference, segmenting the different organs, creating time-activity curves, curve fitting, and calculating residence times. All acquisitions were performed on a hybrid dual-head SPECT/CT camera (Discovery 670, GE Healthcare) equipped with a medium-energy collimator, using a triple-energy window. SPECT images were reconstructed using an iterative algorithm with attenuation, scatter, and collimator depth-dependent three-dimensional resolution recovery corrections. Camera sensitivity and dead time were evaluated. The accuracy of activity quantification was assessed on a large homogeneous source with added attenuating/scattering medium. A NEMA/IEC body phantom was used to measure the recovery coefficient, which the software does not take into account. Residence times for organs at risk were calculated in five patients, and the OLINDA/EXM software was used to calculate absorbed doses. Results: the 177Lu sensitivity factor was 13 counts/MBq/s. Dead time was <3% with 1.11 GBq in the field of view. The measured activity was consistent with the decay-corrected calibrated activity for large volumes (>100 cc). The recovery coefficient varied from 0.71 (26.5 ml) to 0.16 (2.5 ml) in the absence of background activity, and from 0.58 to 0.13 with a source-to-background activity concentration ratio of 20:1. The
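The workflow above fits time-activity curves and converts them to residence times. As a hedged illustration of that step (the commercial tool's fitting model is not given here; a mono-exponential model and all numbers below are assumptions), a minimal version is:

```python
import math

# Hedged sketch: residence-time estimation from a time-activity curve,
# assuming a mono-exponential washout A(t) = A0 * exp(-lam * t).
# The residence time is the time-integrated activity divided by the
# injected activity.

def fit_monoexp(times_h, activities_mbq):
    """Log-linear least-squares fit of A(t) = A0 * exp(-lam * t)."""
    xs = times_h
    ys = [math.log(a) for a in activities_mbq]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    lam = -sxy / sxx                 # effective decay constant (1/h)
    a0 = math.exp(my + lam * mx)     # back-extrapolated activity (MBq)
    return a0, lam

def residence_time_h(a0_mbq, lam_per_h, injected_mbq):
    """Residence time = integral of A(t) from 0 to infinity / injected."""
    return (a0_mbq / lam_per_h) / injected_mbq

# Synthetic example: 7400 MBq injected, an organ sampled at 4, 24, 72 h
# with an ideal exponential washout (A0 = 300 MBq, lam = 0.02 /h).
times = [4.0, 24.0, 72.0]
acts = [300.0 * math.exp(-0.02 * t) for t in times]
a0, lam = fit_monoexp(times, acts)
tau = residence_time_h(a0, lam, 7400.0)
```

In practice the fit would be weighted, decay-corrected, and often multi-exponential; the sketch only shows where the residence time comes from before a tool like OLINDA/EXM turns it into absorbed dose.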

  10. 78 FR 54197 - Energy Efficiency Program for Commercial and Industrial Equipment: Energy Conservation Standards...

    Science.gov (United States)

    2013-09-03

    .... EERE-2013-BT-STD-0030] RIN 1904-AD01 Energy Efficiency Program for Commercial and Industrial Equipment: Energy Conservation Standards for Commercial Packaged Boilers AGENCY: Office of Energy Efficiency and..., Office of Energy Efficiency and Renewable Energy, Building Technologies Office, EE-2J, 1000 Independence...

  11. The Effective Use of System and Software Architecture Standards for Software Technology Readiness Assessments

    Science.gov (United States)

    2011-05-01

    ... icons, mouse-control and network paradigms. Successfully directed engineering and quality process development on all levels of the enterprise. ... TRL 9: actual system proven through successful mission operations. TRL 8: actual system completed and qualified through test and demonstration. ... A software technology example: net centricity, a typical new mission requirement, as in Network Centric Warfare (NCW); NCW is a state-of-the-art war...

  12. Software database creature for investment property measurement according to international standards

    Science.gov (United States)

    Ponomareva, S. V.; Merzliakova, N. A.

    2018-05-01

    The article deals with investment property measurement and accounting problems at the international, national and enterprise levels. The need to create the software for investment property measurement according to International Accounting Standards was substantiated. The necessary software functions and the processes were described.

  13. Software architecture standard for simulation virtual machine, version 2.0

    Science.gov (United States)

    Sturtevant, Robert; Wessale, William

    1994-01-01

    The Simulation Virtual Machine (SVM) is an Ada architecture which eases the effort involved in real-time software maintenance and sustaining engineering. The Software Architecture Standard defines the infrastructure from which all the simulation models are built. SVM was developed for and used in the Space Station Verification and Training Facility.

  14. Standardization and software infrastructure for gas hydrate data communications

    Energy Technology Data Exchange (ETDEWEB)

    Kroenlein, K.; Chirico, R.D.; Kazakov, A.; Frenkel, M. [National Inst. of Standards and Technology, Boulder, CO (United States). Physical and Chemical Properties Div.; Lowner, R. [GeoForschungsZentrum Potsdam (Germany); Wang, W. [Chinese Academy of Science, Beijing (China). Computer Network Information Center; Smith, T. [MIT Systems, Flushing, NY (United States); Sloan, E.D. [Colorado School of Mines, Golden, CO (United States). Centre for Hydrate Research

    2008-07-01

    The perceived value of gas hydrates as an energy resource for the future has led to extensive hydrate research studies and experiments. The hydrate deposits are widely dispersed throughout the world, and many countries are now investigating methods of extracting gas hydrate resources. This paper described a gas hydrates markup language (GHML) developed as an international standard for data transfer and storage within the gas hydrates community. The language is related to a hydrates database developed to facilitate a greater understanding of naturally occurring hydrate interactions with geophysical processes, and aid in the development of hydrate technologies for resource recovery and storage. Recent updates to the GHML included the addition of ThermoML, a communication standard for thermodynamic data into the GHML schema. The standard will be used to represent all gas hydrates thermodynamic data. A new element for the description of crystal structures has also been developed, as well as a guided data capture tool. The tool is available free of charge and is publicly licensed for use by gas hydrate data producers. A web service has also been provided to ensure that access to GHML files for gas hydrates and data files are available for users. It was concluded that the tool will help to ensure data quality assurance for the conversion of data and meta-data within the database. 28 refs., 9 figs.
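GHML, like ThermoML, is an XML-based exchange format. To make the idea of such a markup concrete (the element names below are purely illustrative; the real GHML and ThermoML schemas define their own tags and structure), a record in that spirit could be built as follows:

```python
import xml.etree.ElementTree as ET

# Hedged sketch: an XML data-exchange record in the general spirit of
# GHML. Tag names and fields here are invented for illustration and do
# NOT reflect the actual GHML/ThermoML schemas.

def hydrate_record(site, temperature_k, pressure_mpa):
    """Serialize one (hypothetical) hydrate measurement as XML."""
    root = ET.Element("HydrateSample")
    ET.SubElement(root, "Site").text = site
    ET.SubElement(root, "TemperatureK").text = str(temperature_k)
    ET.SubElement(root, "PressureMPa").text = str(pressure_mpa)
    return ET.tostring(root, encoding="unicode")

xml_text = hydrate_record("Mallik 5L-38", 277.15, 6.2)
```

The point of a schema-backed format like GHML is that producers and consumers agree on the tags and units in advance, so records such as this can be validated and merged into a shared database.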

  15. Technique of semiautomatic surface reconstruction of the visible Korean human data using commercial software.

    Science.gov (United States)

    Park, Jin Seo; Shin, Dong Sun; Chung, Min Suk; Hwang, Sung Bae; Chung, Jinoh

    2007-11-01

    This article describes the technique of semiautomatic surface reconstruction of anatomic structures using widely available commercial software. This technique would enable researchers to promptly and objectively perform surface reconstruction, creating three-dimensional anatomic images without any assistance from computer engineers. To develop the technique, we used data from the Visible Korean Human project, which produced digitized photographic serial images of an entire cadaver. We selected 114 anatomic structures (skin [1], bones [32], knee joint structures [7], muscles [60], arteries [7], and nerves [7]) from the 976 anatomic images which were generated from the left lower limb of the cadaver. Using Adobe Photoshop, the selected anatomic structures in each serial image were outlined, creating a segmented image. The Photoshop files were then converted into Adobe Illustrator files to prepare isolated segmented images, so that the contours of the structure could be viewed independent of the surrounding anatomy. Using Alias Maya, these isolated segmented images were then stacked to construct a contour image. Gaps between the contour lines were filled with surfaces, and the three-dimensional surface reconstruction could be visualized with Rhinoceros. Surface imperfections were then corrected in Alias Maya to complete the three-dimensional images. We believe that the three-dimensional anatomic images created by these methods will have widespread application in both medical education and research. (c) 2007 Wiley-Liss, Inc.

  16. A Randomised Controlled Trial of the Use of a Piece of Commercial Software for the Acquisition of Reading Skills

    Science.gov (United States)

    Khan, Muhammad Ahmad; Gorard, Stephen

    2012-01-01

    We report here the overall results of a cluster randomised controlled trial of the use of computer-aided instruction with 672 Year 7 pupils in 23 secondary school classes in the north of England. A new piece of commercial software, claimed on the basis of publisher testing to be effective in improving reading after just six weeks of use in the…

  17. Interactive reconstructions of cranial 3D implants under MeVisLab as an alternative to commercial planning software.

    Directory of Open Access Journals (Sweden)

    Jan Egger

    Full Text Available In this publication, the interactive planning and reconstruction of cranial 3D implants under the medical prototyping platform MeVisLab is introduced as an alternative to commercial planning software. In doing so, a MeVisLab prototype consisting of a customized data-flow network and a custom C++ module was set up. As a result, the Computer-Aided Design (CAD) software prototype guides a user through the whole workflow of generating an implant. The workflow begins with loading and mirroring the patient's head for an initial curvature of the implant. Then, the user can perform an additional Laplacian smoothing, followed by a Delaunay triangulation. The result is an aesthetically pleasing and well-fitting 3D implant, which can be stored in a CAD file format, e.g. STereoLithography (STL), for 3D printing. The 3D-printed implant can finally be used for an in-depth pre-surgical evaluation or even as a real implant for the patient. In a nutshell, our research and development shows that a customized MeVisLab software prototype can be used as an alternative to complex commercial planning software, which may not be available in every clinic. Thus, we need not confine ourselves to the commercial software that is available and can look for other options that might improve the workflow.

  18. Interactive reconstructions of cranial 3D implants under MeVisLab as an alternative to commercial planning software

    Science.gov (United States)

    Egger, Jan; Gall, Markus; Tax, Alois; Ücal, Muammer; Zefferer, Ulrike; Li, Xing; von Campe, Gord; Schäfer, Ute; Schmalstieg, Dieter; Chen, Xiaojun

    2017-01-01

    In this publication, the interactive planning and reconstruction of cranial 3D implants under the medical prototyping platform MeVisLab is introduced as an alternative to commercial planning software. In doing so, a MeVisLab prototype consisting of a customized data-flow network and a custom C++ module was set up. As a result, the Computer-Aided Design (CAD) software prototype guides a user through the whole workflow of generating an implant. The workflow begins with loading and mirroring the patient's head for an initial curvature of the implant. Then, the user can perform an additional Laplacian smoothing, followed by a Delaunay triangulation. The result is an aesthetically pleasing and well-fitting 3D implant, which can be stored in a CAD file format, e.g. STereoLithography (STL), for 3D printing. The 3D-printed implant can finally be used for an in-depth pre-surgical evaluation or even as a real implant for the patient. In a nutshell, our research and development shows that a customized MeVisLab software prototype can be used as an alternative to complex commercial planning software, which may not be available in every clinic. Thus, we need not confine ourselves to the commercial software that is available and can look for other options that might improve the workflow. PMID:28264062

  19. Planning the Unplanned Experiment: Towards Assessing the Efficacy of Standards for Safety-Critical Software

    Science.gov (United States)

    Graydon, Patrick J.; Holloway, C. M.

    2015-01-01

    Safe use of software in safety-critical applications requires well-founded means of determining whether software is fit for such use. While software in industries such as aviation has a good safety record, little is known about whether standards for software in safety-critical applications 'work' (or even what that means). It is often (implicitly) argued that software is fit for safety-critical use because it conforms to an appropriate standard. Without knowing whether a standard works, such reliance is an experiment; without carefully collecting assessment data, that experiment is unplanned. To help plan the experiment, we organized a workshop to develop practical ideas for assessing software safety standards. In this paper, we relate and elaborate on the workshop discussion, which revealed subtle but important study design considerations and practical barriers to collecting appropriate historical data and recruiting appropriate experimental subjects. We discuss assessing standards as written and as applied, several candidate definitions for what it means for a standard to 'work,' and key assessment strategies and study techniques and the pros and cons of each. Finally, we conclude with thoughts about the kinds of research that will be required and how academia, industry, and regulators might collaborate to overcome the noted barriers.

  20. Overview of the ANS [American Nuclear Society] mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A.O.

    1991-01-01

    The Mathematics and Computations Division of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains four ANSI/ANS software standards. These standards are: Recommended Programming Practices to Facilitate the Portability of Scientific Computer Programs, ANS-10.2; Guidelines for the Documentation of Computer Software, ANS-10.3; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Guidelines for Accommodating User Needs in Computer Program Development, ANS-10.5. 5 refs

  1. Accounting Education Approach in the Context of New Turkish Commercial Code and Turkish Accounting Standards

    Directory of Open Access Journals (Sweden)

    Cevdet Kızıl

    2014-08-01

    Full Text Available The aim of this article is to investigate the impact of the new Turkish commercial code and Turkish accounting standards on accounting education. The study takes advantage of the survey method for gathering information and running the research analysis. For this purpose, questionnaire forms were distributed to university students personally and via the internet. This paper addresses significant research questions such as "Are accounting academicians informed and knowledgeable about the new Turkish commercial code and Turkish accounting standards?", "Do accounting academicians integrate the new Turkish commercial code and Turkish accounting standards into their lectures?", "How do modern accounting education methodology and technology coincide with the teaching of the new Turkish commercial code and Turkish accounting standards?", "Do universities offer mandatory and elective courses which cover the new Turkish commercial code and Turkish accounting standards?" and "If such courses are offered, what are their names, percentage in the curriculum and degree of coverage?" The research contributes to the literature in several ways. Firstly, the new Turkish commercial code and Turkish accounting standards are significant current topics for the accounting profession. Furthermore, accounting education provides a basis for implementation in the public and private sectors. Besides, one of the intentions of the new Turkish commercial code and Turkish accounting standards is to foster transparency, which is definitely a critical concept also in terms of mergers, acquisitions and investments. Stakeholders of today's business world, such as investors, shareholders, entrepreneurs, auditors and government, are in need of more standardized global accounting principles. Thus, the revision and redesign of accounting education plays an important role. The points emphasized also clearly prove the necessity and functionality of this research.

  2. MRI/TRUS fusion software-based targeted biopsy: the new standard of care?

    Science.gov (United States)

    Manfredi, M; Costa Moretti, T B; Emberton, M; Villers, A; Valerio, M

    2015-09-01

    The advent of multiparametric MRI has made it possible to change the way in which prostate biopsy is done, allowing biopsies to be directed to suspicious lesions rather than taken randomly. This review concerns a computer-assisted strategy, MRI/US fusion software-based targeted biopsy, and its performance compared to other sampling methods. Different devices, with different methods of registering MR images to live TRUS, are currently in use to enable software-based targeted biopsy. The main clinical indications for MRI/US fusion software-based targeted biopsy are re-biopsy in men with a persistent suspicion of prostate cancer after a first negative standard biopsy, and the follow-up of patients under active surveillance. Several studies have compared MRI/US fusion software-based targeted biopsy with standard biopsy. In men at risk with an MRI-suspicious lesion, targeted biopsy consistently detects more men with clinically significant disease than standard biopsy; some studies have also shown decreased detection of insignificant disease. Only two studies directly compared MRI/US fusion software-based targeted biopsy with MRI/US fusion visual targeted biopsy, and the diagnostic ability seems to be in favor of the software approach. To date, no study comparing software-based targeted biopsy against in-bore MRI biopsy is available. The new software-based targeted approach seems to have the characteristics to be added into the standard pathway for achieving accurate risk stratification. Once reproducibility and cost-effectiveness have been verified, the real issue will be to determine whether MRI/TRUS fusion software-based targeted biopsy represents an add-on test or a replacement for standard TRUS biopsy.

  3. Commercial counterboard for 10 ns software correlator for photon and fluorescence correlation spectroscopy

    Science.gov (United States)

    Molteni, Matteo; Ferri, Fabio

    2016-11-01

    A 10 ns time resolution, multi-tau software correlator, capable of computing simultaneous auto- (A-A, B-B) and cross- (A-B) correlation functions at count rates up to ~10 MHz with no data loss, has been developed in LabVIEW and C++ using a National Instruments timer/counterboard (NI PCIe-6612) and a fast personal computer (Intel Core i7-4790 processor, 3.60 GHz). The correlator works by using two algorithms: for large lag times (τ ≳ 1 μs), a classical time-mode scheme, based on measuring the number of pulses per time interval, is used; for τ ≲ 1 μs, a photon-mode (PM) scheme is adopted and the correlation function is retrieved from the sequence of photon arrival times. Single auto- and cross-correlation functions can be processed online in full real time up to count rates of ~1.8 MHz and ~1.2 MHz, respectively. Two autocorrelation (A-A, B-B) and one cross-correlation (A-B) functions can be simultaneously processed in full real time only up to count rates of ~750 kHz. At higher count rates, online processing takes place in a delayed modality, but with no data loss. When tested with simulated correlation data and latex sphere solutions, the overall performance of the correlator is comparable with that of commercial hardware correlators, but with several nontrivial advantages related to its flexibility, low cost, and easy adaptability to future developments of PC and data acquisition technology.
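The multi-tau scheme mentioned above computes correlations on a quasi-logarithmic lag grid by coarsening the intensity trace between lag "levels". As a hedged, offline illustration of that lag structure (the paper's real-time photon-mode algorithm is more involved; the 8-lags-per-level layout and all names here are assumptions), a minimal version is:

```python
# Hedged sketch of the classic multi-tau lag structure: 8 linear lags per
# level, with the counts trace rebinned 2:1 between levels. Offline only;
# a real correlator maintains these accumulators incrementally.

def autocorr(trace, lag):
    """Unnormalized correlation <n(t) n(t + lag)> over a counts trace."""
    prods = [a * b for a, b in zip(trace, trace[lag:])]
    return sum(prods) / len(prods)

def multi_tau(trace, lags_per_level=8, levels=3):
    """Normalized g2 on a quasi-logarithmic lag grid.

    Returns (lag_in_base_bins, g2) pairs. Level 0 uses lags 1..8; each
    further level halves the time resolution and reuses lags 5..8 only,
    so lags never repeat across levels.
    """
    out = []
    level_trace = list(trace)
    bin_width = 1
    for level in range(levels):
        mean = sum(level_trace) / len(level_trace)
        start = 1 if level == 0 else lags_per_level // 2 + 1
        for lag in range(start, lags_per_level + 1):
            g2 = autocorr(level_trace, lag) / (mean * mean)
            out.append((lag * bin_width, g2))
        # Rebin 2:1 for the next, coarser level.
        level_trace = [level_trace[i] + level_trace[i + 1]
                       for i in range(0, len(level_trace) - 1, 2)]
        bin_width *= 2
    return out

# Sanity check: an uncorrelated, constant-rate trace gives g2 = 1 everywhere.
g2_points = multi_tau([2] * 1024)
```

This is why multi-tau correlators span many decades of lag time with a fixed, small number of channels: each level costs the same handful of accumulators while covering twice the time range of the previous one.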

  4. 76 FR 67480 - Standard on Commercial Diving Operations; Extension of the Office of Management and Budget's (OMB...

    Science.gov (United States)

    2011-11-01

    ...] Standard on Commercial Diving Operations; Extension of the Office of Management and Budget's (OMB) Approval... Commercial Diving Operations Standard (29 CFR part 1910, subpart T). DATES: Comments must be submitted... existing Standard on Commercial Diving Operations (29 CFR part 1910, subpart T)

  5. Analysis of Potential Benefits and Costs of Adopting a Commercial Building Energy Standard in South Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, David B.; Cort, Katherine A.; Winiarski, David W.; Richman, Eric E.

    2005-03-04

    The state of South Dakota is considering adopting a commercial building energy standard. This report evaluates the potential costs and benefits to South Dakota residents from requiring compliance with the most recent edition of the ANSI/ASHRAE/IESNA 90.1-2001 Energy Standard for Buildings except Low-Rise Residential Buildings. These standards were developed in an effort to set minimum requirements for the energy efficient design and construction of new commercial buildings. The quantitative benefits and costs of adopting a commercial building energy code are modeled by comparing the characteristics of assumed current building practices with the most recent edition of the ASHRAE Standard, 90.1-2001. Both qualitative and quantitative benefits and costs are assessed in this analysis. Energy and economic impacts are estimated using results from a detailed building simulation tool (Building Loads Analysis and System Thermodynamics [BLAST] model) combined with a Life-Cycle Cost (LCC) approach to assess corresponding economic costs and benefits.
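The report's LCC approach weighs the incremental first cost of building to the stricter standard against the discounted stream of energy savings. As a hedged sketch of that core calculation (the discount rate, horizon, and dollar figures below are illustrative, not the report's inputs), the arithmetic is:

```python
# Hedged sketch of a life-cycle cost (LCC) comparison: net benefit of a
# stricter energy standard = present value of annual energy savings minus
# the incremental first cost. All numbers are illustrative assumptions.

def present_value_annuity(annual_amount, rate, years):
    """Present value of a constant annual cash flow over `years` years."""
    return annual_amount * (1 - (1 + rate) ** -years) / rate

def lcc_savings(incremental_first_cost, annual_energy_savings, rate, years):
    """Net LCC benefit: PV(energy savings) minus the extra construction cost."""
    pv = present_value_annuity(annual_energy_savings, rate, years)
    return pv - incremental_first_cost

# Example: $10,000 extra first cost, $1,500/yr energy savings,
# 5% real discount rate, 15-year analysis period.
net = lcc_savings(incremental_first_cost=10_000.0,
                  annual_energy_savings=1_500.0,
                  rate=0.05, years=15)
```

A positive `net` means the standard pays for itself over the analysis period; the actual report drives the savings term with detailed building simulation (BLAST) rather than a single assumed number.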

  6. Understanding the Perception of Very Small Software Companies towards the Adoption of Process Standards

    Science.gov (United States)

    Basri, Shuib; O'Connor, Rory V.

    This paper is concerned with understanding the issues that affect the adoption of software process standards by Very Small Entities (VSEs), their needs from process standards and their willingness to engage with the new ISO/IEC 29110 standard in particular. In order to achieve this goal, a series of industry data collection studies were undertaken with a collection of VSEs. A twin track approach of a qualitative data collection (interviews and focus groups) and quantitative data collection (questionnaire) were undertaken. Data analysis was being completed separately and the final results were merged, using the coding mechanisms of grounded theory. This paper serves as a roadmap for both researchers wishing to understand the issues of process standards adoption by very small companies and also for the software process standards community.

  7. Accounting treatment of software development costs according to applicable accounting standards

    Directory of Open Access Journals (Sweden)

    Dilyana Markova

    2017-05-01

    Full Text Available The growth of the software sector worldwide is outpacing the creation and updating of the accounting standards that regulate the reporting of the products and services it creates. Applicable standards are interpreted differently across countries, and this leads to incomplete reports. This imposes the adoption and application of explanatory guidance giving specific guidelines and rules on the accounting treatment of R&D expenditure at each phase of the software project life cycle, and on the disclosure of this information in the financial statements.

  8. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analyze core photos and images, waveforms and NMR; and handle external files and documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam-assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board; the product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In
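One of the features mentioned above, decline curve analysis, automates a standard reservoir-engineering calculation. As a hedged sketch of the simplest case (an Arps exponential decline; the parameters below are illustrative and unrelated to any product named above):

```python
import math

# Hedged sketch: Arps exponential decline, the simplest decline-curve
# model. q(t) = qi * exp(-D t); cumulative production = (qi - q(t)) / D.
# Rates are in volume units per year; all numbers are illustrative.

def exponential_rate(qi, decline_per_year, t_years):
    """Production rate after t years under exponential decline."""
    return qi * math.exp(-decline_per_year * t_years)

def cumulative(qi, decline_per_year, t_years):
    """Cumulative production from time 0 to t under exponential decline."""
    return (qi - exponential_rate(qi, decline_per_year, t_years)) / decline_per_year

# Example: initial rate 1000 units/yr, 30%/yr nominal decline, 2-year horizon.
q2 = exponential_rate(1000.0, 0.3, 2.0)
np2 = cumulative(1000.0, 0.3, 2.0)
```

Commercial modules fit the decline parameters (and hyperbolic/harmonic variants) to historical rate data; the sketch only shows the forecasting arithmetic once those parameters are known.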

  9. Academic and Non-Profit Accessibility to Commercial Remote Sensing Software

    Science.gov (United States)

    O'Connor, A. S.; Farr, B.

    2013-12-01

    Remote sensing as a topic of teaching and research at the university and college level continues to grow. As more data are made freely available and software becomes easier to use, more and more academic and non-profit institutions are turning to remote sensing to solve their tough, large-spatial-scale problems. Exelis Visual Information Solutions (VIS) has been supporting teaching and research endeavors for over 30 years, with a special emphasis over the last 5 years, with scientifically proven software and accessible training materials. The Exelis VIS academic program extends to US and Canadian 2-year and 4-year colleges and universities with tools for analyzing aerial and satellite multispectral and hyperspectral imagery, airborne LiDAR and synthetic aperture radar. The Exelis VIS academic programs, using the ENVI platform, enable labs and classrooms to be outfitted with software and make the software accessible to students. The ENVI software gives students hands-on experience with remote sensing software, offers an easy teaching platform for professors, and provides researchers with scientifically vetted software they can trust. Training materials are provided at no additional cost and can either serve as a basis for course curriculum development or for self-paced learning. Non-profit organizations like The Nature Conservancy (TNC) and CGIAR have deployed ENVI and IDL under enterprise-wide licensing, allowing researchers all over the world cost-effective access to COTS software for their research. Exelis VIS has also contributed licenses to the NASA DEVELOP program. Exelis VIS is committed to supporting the academic and NGO community with affordable enterprise licensing, access to training materials, and technical expertise to help researchers tackle today's Earth and planetary science big data challenges.

  10. The role of open-source software in innovation and standardization in radiology.

    Science.gov (United States)

    Erickson, Bradley J; Langer, Steve; Nagy, Paul

    2005-11-01

    The use of open-source software (OSS), in which developers release the source code to applications they have developed, is popular in the software industry. This is done to allow others to modify and improve software (which may or may not be shared back to the community) and to allow others to learn from the software. Radiology was an early participant in this model, supporting OSS that implemented the ACR-National Electrical Manufacturers Association (now Digital Imaging and Communications in Medicine) standard for medical image communications. In radiology and in other fields, OSS has promoted innovation and the adoption of standards. Popular OSS is of high quality because access to source code allows many people to identify and resolve errors. Open-source software is analogous to the peer-review scientific process: one must be able to see and reproduce results to understand and promote what is shared. The authors emphasize that support for OSS need not threaten vendors; most vendors embrace and benefit from standards. Open-source development does not replace vendors but more clearly defines their roles, typically focusing on areas in which proprietary differentiators benefit customers and on professional services such as implementation planning and service. Continued support for OSS is essential for the success of our field.

  11. Application of industry-standard guidelines for the validation of avionics software

    Science.gov (United States)

    Hayhurst, Kelly J.; Shagnea, Anita M.

    1990-01-01

    The application of industry standards to the development of avionics software is discussed, focusing on verification and validation activities. It is pointed out that the procedures that guide the avionics software development and testing process are under increased scrutiny. The DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, are used by the FAA for certifying avionics software. To investigate the effectiveness of the DO-178A guidelines for improving the quality of avionics software, guidance and control software (GCS) is being developed according to the DO-178A development method. It is noted that, due to the extent of the data collection and configuration management procedures, any phase in the life cycle of a GCS implementation can be reconstructed. Hence, a fundamental development and testing platform has been established that is suitable for investigating the adequacy of various software development processes. In particular, the overall effectiveness and efficiency of the development method recommended by the DO-178A guidelines are being closely examined.

  12. The family of standard hydrogen monitoring system computer software design description: Revision 2

    International Nuclear Information System (INIS)

    Bender, R.M.

    1994-01-01

In March 1990, 23 waste tanks at the Hanford Nuclear Reservation were identified as having the potential for the buildup of gas to a flammable or explosive level. As a result of the potential for hydrogen gas buildup, a project was initiated to design a standard hydrogen monitoring system (SHMS) for use at any waste tank to analyze gas samples for hydrogen content. Since the original system was deployed three years ago, two variations have been developed: the SHMS-B and SHMS-C. All three are currently in operation at the tank farms and are discussed in this document. To avoid confusion, when a feature is common to all three SHMS variants, it is referred to as "the family of SHMS"; when it is specific to only one or two, they are identified. The purpose of this computer software design document is to provide the following: the computer software requirements specification that documents the essential requirements of the computer software and its external interfaces; the computer software design description; the computer software user documentation for using and maintaining the computer software and any dedicated hardware; and the requirements for computer software design verification and validation

  13. 76 FR 3517 - Standards of Performance for Fossil-Fuel-Fired, Electric Utility, Industrial-Commercial...

    Science.gov (United States)

    2011-01-20

    ... Standards of Performance for Fossil-Fuel-Fired, Electric Utility, Industrial-Commercial-Institutional, and... following: Category NAICS \\1\\ Examples of regulated entities Industry 221112 Fossil fuel-fired electric utility steam generating units. Federal Government 22112 Fossil fuel-fired electric utility steam...

  14. 76 FR 17573 - Energy Conservation Standards for Commercial Refrigeration Equipment: Public Meeting and...

    Science.gov (United States)

    2011-03-30

    ... INFORMATION: I. Statutory Authority II. History of Standards Rulemaking for Commercial Refrigeration Equipment... feedback from interested parties on its analytical framework, models, and preliminary results. II. History... equipment installed in the field, such as in grocery stores and restaurants. DOE also carries out additional...

  15. DIGITAL IMAGE CORRELATION FROM COMMERCIAL TO FOS SOFTWARE: A MATURE TECHNIQUE FOR FULL-FIELD DISPLACEMENT MEASUREMENTS

    Directory of Open Access Journals (Sweden)

    V. Belloni

    2018-05-01

Full Text Available In the last few decades, there has been growing interest in non-contact methods for full-field displacement and strain measurement. Among such techniques, Digital Image Correlation (DIC) has received particular attention thanks to its ability to provide this information by comparing digital images of a sample surface before and after deformation. The method is now commonly adopted in civil, mechanical and aerospace engineering, and various companies and research groups have implemented 2D and 3D DIC software. This work first reviews the status of DIC software. It then presents a free and open-source 2D DIC software, named py2DIC, developed in Python at the Geodesy and Geomatics Division of DICEA of the University of Rome “La Sapienza”. Its potential was evaluated by processing images captured during tensile tests performed in the Structural Engineering Lab of the University of Rome “La Sapienza” and comparing the results to those obtained with the commercial software Vic-2D developed by Correlated Solutions Inc., USA. The agreement of these results at the one-hundredth-of-a-millimetre level demonstrates that this open-source software is a valuable 2D DIC tool for measuring full-field displacements on the investigated sample surface.
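The subset-matching principle this record describes (comparing digital images of a sample surface before and after deformation) can be sketched with integer-pixel zero-normalized cross-correlation. This is a minimal illustration only: the function names and the synthetic shifted texture are assumptions for the sketch, not code from py2DIC or Vic-2D, and real DIC software adds sub-pixel interpolation and strain computation.

```python
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def dic_displacement(ref, cur, y, x, half=5, search=4):
    """Integer-pixel displacement of the subset centred at (y, x) in `ref`,
    found by maximizing ZNCC over a +/- `search` pixel window in `cur`."""
    subset = ref[y - half:y + half + 1, x - half:x + half + 1]
    best, best_uv = -2.0, (0, 0)
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            cand = cur[y + dv - half:y + dv + half + 1,
                       x + du - half:x + du + half + 1]
            score = zncc(subset, cand)
            if score > best:
                best, best_uv = score, (dv, du)
    return best_uv  # (row, col) displacement in pixels

# Synthetic check: shift a random texture by (2, 3) pixels and recover it.
rng = np.random.default_rng(0)
img = rng.random((40, 40))
shifted = np.roll(np.roll(img, 2, axis=0), 3, axis=1)
print(dic_displacement(img, shifted, 20, 20))  # → (2, 3)
```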

  16. Analysis in the Utility of Commercial Wargaming Simulation Software for Army Organizational Leadership Development

    National Research Council Canada - National Science Library

    Macintyre, Kerry

    2000-01-01

    ... analysis, operational test and evaluation, and campaign development. The intent of this monograph was to determine if commercial wargame simulations could be used to develop the organizational leadership abilities of Army officers...

  17. Development of a consensus standard for verification and validation of nuclear system thermal-fluids software

    International Nuclear Information System (INIS)

    Harvego, Edwin A.; Schultz, Richard R.; Crane, Ryan L.

    2011-01-01

    With the resurgence of nuclear power and increased interest in advanced nuclear reactors as an option to supply abundant energy without the associated greenhouse gas emissions of the more conventional fossil fuel energy sources, there is a need to establish internationally recognized standards for the verification and validation (V and V) of software used to calculate the thermal–hydraulic behavior of advanced reactor designs for both normal operation and hypothetical accident conditions. To address this need, ASME (American Society of Mechanical Engineers) Standards and Certification has established the V and V 30 Committee, under the jurisdiction of the V and V Standards Committee, to develop a consensus standard for verification and validation of software used for design and analysis of advanced reactor systems. The initial focus of this committee will be on the V and V of system analysis and computational fluid dynamics (CFD) software for nuclear applications. To limit the scope of the effort, the committee will further limit its focus to software to be used in the licensing of High-Temperature Gas-Cooled Reactors. Although software verification will be an important and necessary part of the standard, much of the initial effort of the committee will be focused on the validation of existing software and new models that could be used in the licensing process. In this framework, the Standard should conform to Nuclear Regulatory Commission (NRC) and other regulatory practices, procedures and methods for licensing of nuclear power plants as embodied in the United States (U.S.) Code of Federal Regulations and other pertinent documents such as Regulatory Guide 1.203, “Transient and Accident Analysis Methods” and NUREG-0800, “NRC Standard Review Plan”. In addition, the Standard should be consistent with applicable sections of ASME NQA-1-2008 “Quality Assurance Requirements for Nuclear Facility Applications (QA)”. This paper describes the general

  18. Leveraging Software Architectures through the ISO/IEC 42010 standard: A Feasibility Study

    NARCIS (Netherlands)

    Tamburri, D.A.; Lago, P.; Muccini, H.; Proper, E.; Lankhorst, M.; Schoenherr, M.

    2011-01-01

The state of the practice in enterprise and software architecture has shown that relevant architectural aspects should be illustrated in multiple views, targeting the various concerns of different stakeholders. This has been expressed, among others, in the ISO/IEC 42010 Standard on architecture descriptions. In

  19. Standard gamma-ray spectra for the comparison of spectral analysis software

    International Nuclear Information System (INIS)

    Woods, S.; Hemingway, J.; Bowles, N.

    1997-01-01

    Three sets of standard γ-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance and the problems encountered in the analysis are discussed. (author)

  20. Standard gamma-ray spectra for the comparison of spectral analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Woods, S.; Hemingway, J.; Bowles, N. [and others

    1997-08-01

    Three sets of standard {gamma}-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance and the problems encountered in the analysis are discussed. (author)

  1. New Modelling Capabilities in Commercial Software for High-Gain Antennas

    DEFF Research Database (Denmark)

    Jørgensen, Erik; Lumholt, Michael; Meincke, Peter

    2012-01-01

characterization of the reflectarray element, an initial phase-only synthesis, followed by a full optimization procedure taking into account the near field from the feed and the finite extent of the array. Another interesting new modelling capability is made available through the DIATOOL software, which is a new...... type of EM software tool aimed at extending the ways engineers can use antenna measurements in the antenna design process. The tool allows reconstruction of currents and near fields on a 3D surface conformal to the antenna, using the measured antenna field as input. The currents on the antenna...... surface can provide valuable information about the antenna performance, or undesired contributions, e.g. currents on a cable, can be artificially removed. Finally, the CHAMP software will be extended to cover reflector shaping and more complex materials, which combined with a much faster execution speed...

  2. Computer systems and software description for Standard-E+ Hydrogen Monitoring System (SHMS-E+)

    International Nuclear Information System (INIS)

    Tate, D.D.

    1997-01-01

    The primary function of the Standard-E+ Hydrogen Monitoring System (SHMS-E+) is to determine tank vapor space gas composition and gas release rate, and to detect gas release events. Characterization of the gas composition is needed for safety analyses. The lower flammability limit, as well as the peak burn temperature and pressure, are dependent upon the gas composition. If there is little or no knowledge about the gas composition, safety analyses utilize compositions that yield the worst case in a deflagration or detonation. Knowledge of the true composition could lead to reductions in the assumptions and therefore there may be a potential for a reduction in controls and work restrictions. Also, knowledge of the actual composition will be required information for the analysis that is needed to remove tanks from the Watch List. Similarly, the rate of generation and release of gases is required information for performing safety analyses, developing controls, designing equipment, and closing safety issues. This report outlines the computer system design layout description for the Standard-E+ Hydrogen Monitoring System

  3. Improved detection of pulmonary nodules on energy-subtracted chest radiographs with a commercial computer-aided diagnosis software: comparison with human observers

    International Nuclear Information System (INIS)

    Szucs-Farkas, Zsolt; Patak, Michael A.; Yuksel-Hatz, Seyran; Ruder, Thomas; Vock, Peter

    2010-01-01

To retrospectively analyze the performance of a commercial computer-aided diagnosis (CAD) software package in the detection of pulmonary nodules on original and energy-subtracted (ES) chest radiographs. Original and ES chest radiographs of 58 patients with 105 pulmonary nodules measuring 5-30 mm and images of 25 control subjects with no nodules were randomized. Five blinded readers first evaluated the original postero-anterior images alone and then together with the subtracted radiographs. In a second phase, original and ES images were analyzed by a commercial CAD program. CT was used as the reference standard. CAD results were compared to the readers' findings. True-positive (TP) and false-positive (FP) findings with CAD on subtracted and non-subtracted images were compared. Depending on the reader's experience, CAD detected between 11 and 21 nodules missed by readers. Human observers found three to 16 lesions missed by the CAD software. CAD used with ES images produced significantly fewer FPs than with non-subtracted images: 1.75 versus 2.14 FPs per image, respectively (p=0.029). The difference for TP nodules was not significant (40 nodules on ES images versus 34 lesions on non-subtracted radiographs, p=0.142). CAD can improve lesion detection on both energy-subtracted and non-subtracted chest images, especially for less experienced readers. The CAD program marked fewer FPs on energy-subtracted images than on original chest radiographs. (orig.)

  4. A pioneering application of NQA-1 quality assurance standards in the development of software

    International Nuclear Information System (INIS)

    Weisbin, A.N.

    1988-01-01

The application of NQA-1 Quality Assurance Standards to computer software programs has been recent at the Oak Ridge National Laboratory. One reason for systematically applying quality assurance to computer software is the extensive use of results from computer programs to characterize potential sites for nuclear waste repositories, leading ultimately to important policy-making decisions. Because data from these programs characterize the likely radioactivity profile for many hundreds of years, experimental validation is not feasible. The Sensitivity and Uncertainty Analysis Methods Development Project (SUAMDP) was developed to formulate and utilize efficient and comprehensive methods for determining sensitivities of calculated results with respect to changes in all input parameters. The computerized methodology was embodied in the Gradient Enhanced Software System (GRESS). Because GRESS was to be used in site characterization for waste storage, stringent NQA-1 requirements were imposed by the sponsor. A working relationship between the Oak Ridge National Laboratory (ORNL) Quality Department and the research scientists developing GRESS was essential in achieving understanding and acceptance of the quality assurance requirements as applied to the SUAMDP. The relationship resulted in the SUAMDP becoming the first software project at ORNL to develop a comprehensive NQA-1 Quality Assurance Plan; this plan now serves as a model for software quality assurance at ORNL. This paper describes the evolution of this plan and its impact on the application of quality assurance procedures to software

  5. Software engineering of a navigation and guidance system for commercial aircraft

    Science.gov (United States)

    Lachmann, S. G.; Mckinstry, R. G.

    1975-01-01

    The avionics experimental configuration of the considered system is briefly reviewed, taking into account the concept of an advanced air traffic management system, flight critical and noncritical functions, and display system characteristics. Cockpit displays and the navigation computer are examined. Attention is given to the functions performed in the navigation computer, major programs in the navigation computer, and questions of software development.

  6. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    Energy Technology Data Exchange (ETDEWEB)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing ‘blinds’, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those
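The empirical baseline models this record describes (relating whole-building energy consumption to influencing parameters such as ambient weather) can be illustrated with a minimal ordinary least-squares sketch. The linear load model, variable names, and synthetic numbers below are assumptions for illustration, not the proprietary models embedded in any EMIS product.

```python
import numpy as np

def fit_baseline(temps, loads):
    """Ordinary least-squares baseline: load ~ a + b * outdoor temperature.
    A stand-in for the regression/change-point models EMIS tools embed."""
    X = np.column_stack([np.ones_like(temps), temps])
    coef, *_ = np.linalg.lstsq(X, loads, rcond=None)
    return coef  # (intercept, slope)

def predicted_savings(coef, temps, measured):
    """Avoided energy use: baseline prediction minus measured consumption."""
    baseline = coef[0] + coef[1] * np.asarray(temps)
    return baseline - np.asarray(measured)

# Synthetic pre-retrofit data: load = 100 + 2*T (kWh/day), noise-free.
t_pre = np.array([10.0, 15.0, 20.0, 25.0, 30.0])
y_pre = 100.0 + 2.0 * t_pre
coef = fit_baseline(t_pre, y_pre)

# Post-retrofit, the building uses 10 kWh/day less at every temperature.
t_post = np.array([12.0, 22.0])
y_post = (100.0 + 2.0 * t_post) - 10.0
print(predicted_savings(coef, t_post, y_post))  # ≈ [10. 10.]
```

In M&V practice the baseline is fitted on pre-retrofit data only and then projected onto post-retrofit conditions, exactly as the two calls above separate fitting from savings estimation.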

  7. Fuzzy system for risk analysis in software projects through the attributes of quality standard ISO 25000

    Directory of Open Access Journals (Sweden)

    Chau Sen Shia

    2014-02-01

Full Text Available With the growth in demand for products and services in the IT area, companies encounter difficulties in establishing a metric or measure of service quality that addresses qualitative values measurably in their planning. In this work, fuzzy logic, the SQuaRE standard (measurement of the quality of software products), the Likert scale, the GQM (Goal-Question-Metric) method as an indicator of software quality, and Boehm's project risk analysis model were used to assess service quality and support decision-making according to demand and requests for software development. With the aim of improving the quality of service provision, the application is used to integrate the team, follow the life cycle of a project from its initial phase, and assist in comparison against the proposed schedule during requirements elicitation.

  8. Progress on standardization and automation in software development on W7X

    International Nuclear Information System (INIS)

    Kühner, Georg; Bluhm, Torsten; Heimann, Peter; Hennig, Christine; Kroiss, Hugo; Krom, Jon; Laqua, Heike; Lewerentz, Marc; Maier, Josef; Schacht, Jörg; Spring, Anett; Werner, Andreas; Zilker, Manfred

    2012-01-01

Highlights: ► For W7X software development the use of ISO/IEC 15504-5 is further extended. ► The standard provides a basis to manage software multi-projects for a large system project. ► Adoption of a scrum-like management allows for quick reaction on priority changes. ► A high degree of software build automation allows for quick responses to user requests. ► It provides additional resources to concentrate work on product quality (ISO/IEC 25000). - Abstract: For a complex experiment like W7X being subject to changes all along its projected lifetime the advantages of a formalized software development method have already been stated. Quality standards like ISO/IEC-12207 provide a guideline for structuring of development work and improving process and product quality. A considerable number of tools has emerged supporting and automating parts of development work. On W7X progress has been made during the last years in exploiting the benefit of automation and management during software development: – Continuous build, integration and automated test of software artefacts. ∘ Syntax checks and code quality metrics. ∘ Documentation generation. ∘ Feedback for developers by temporal statistics. – Versioned repository for build products (libraries, executables). – Separate snapshot and release repositories and automatic deployment. – Semi-automatic provisioning of applications. – Feedback from testers and feature requests by ticket system. This toolset is working efficiently and allows the team to concentrate on development. The activity there is presently focused on increasing the quality of the existing software to become a dependable product. Testing of single functions and qualities must be simplified. So a restructuring is underway which relies more on small, individually testable components with standardized interfaces providing the capability to construct arbitrary function aggregates for dedicated tests of quality attributes as availability, reliability

  9. Progress on standardization and automation in software development on W7X

    Energy Technology Data Exchange (ETDEWEB)

    Kuehner, Georg, E-mail: kuehner@ipp.mpg.de [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Bluhm, Torsten [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Heimann, Peter [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany); Hennig, Christine [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Kroiss, Hugo [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany); Krom, Jon; Laqua, Heike; Lewerentz, Marc [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Maier, Josef [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany); Schacht, Joerg; Spring, Anett; Werner, Andreas [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Zilker, Manfred [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2012-12-15

Highlights: ► For W7X software development the use of ISO/IEC 15504-5 is further extended. ► The standard provides a basis to manage software multi-projects for a large system project. ► Adoption of a scrum-like management allows for quick reaction on priority changes. ► A high degree of software build automation allows for quick responses to user requests. ► It provides additional resources to concentrate work on product quality (ISO/IEC 25000). - Abstract: For a complex experiment like W7X being subject to changes all along its projected lifetime the advantages of a formalized software development method have already been stated. Quality standards like ISO/IEC-12207 provide a guideline for structuring of development work and improving process and product quality. A considerable number of tools has emerged supporting and automating parts of development work. On W7X progress has been made during the last years in exploiting the benefit of automation and management during software development: – Continuous build, integration and automated test of software artefacts. ∘ Syntax checks and code quality metrics. ∘ Documentation generation. ∘ Feedback for developers by temporal statistics. – Versioned repository for build products (libraries, executables). – Separate snapshot and release repositories and automatic deployment. – Semi-automatic provisioning of applications. – Feedback from testers and feature requests by ticket system. This toolset is working efficiently and allows the team to concentrate on development. The activity there is presently focused on increasing the quality of the existing software to become a dependable product. Testing of single functions and qualities must be simplified. So a restructuring is underway which relies more on small, individually testable components with standardized

  10. Biosafety decisions and perceived commercial risks: The role of GM-free private standards

    OpenAIRE

    Gruère, Guillaume; Sengupta, Debdatta

    2009-01-01

    "We herein investigate the observed discrepancy between real and perceived commercial risks associated with the use of genetically modified (GM) products in developing countries. We focus particularly on the effects of GM-free private standards set up by food companies in Europe and other countries on biotechnology and biosafety policy decisions in food-exporting developing countries. Based on field visits made to South Africa, Namibia, and Kenya in June 2007, and secondary information from t...

  11. Accounting Education Approach in the Context of New Turkish Commercial Code and Turkish Accounting Standards

    OpenAIRE

    Cevdet Kızıl; Ayşe Tansel Çetin; Ahmed Bulunmaz

    2014-01-01

The aim of this article is to investigate the impact of the new Turkish commercial code and Turkish accounting standards on accounting education. This study takes advantage of the survey method for gathering information and running the research analysis. For this purpose, questionnaire forms were distributed to university students personally and via the internet. This paper includes significant research questions such as “Are accounting academicians informed and knowledgeable on new Turkish commerc...

  12. Comparison of Standard 90.1-2007 and the 2009 IECC with Respect to Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Conover, David R.; Bartlett, Rosemarie; Halverson, Mark A.

    2009-12-11

The U.S. Department of Energy’s (DOE’s) Building Energy Codes Program (BECP) has been asked by some states and energy code stakeholders to address the comparability of the 2009 International Energy Conservation Code® (IECC) as applied to commercial buildings and ANSI/ASHRAE/IESNA Standard 90.1-2007 (hereinafter referred to as Standard 90.1-07). An assessment of comparability will help states respond to and implement conditions specified in the State Energy Program (SEP) Formula Grants American Recovery and Reinvestment Act Funding Opportunity, Number DE-FOA-0000052, and eliminate the need for the states individually or collectively to perform comparative studies of the 2009 IECC and Standard 90.1-07. The funding opportunity announcement contains the following conditions: (2) The State, or the applicable units of local government that have authority to adopt building codes, will implement the following: (A) A residential building energy code (or codes) that meets or exceeds the most recent International Energy Conservation Code, or achieves equivalent or greater energy savings. (B) A commercial building energy code (or codes) throughout the State that meets or exceeds the ANSI/ASHRAE/IESNA Standard 90.1-2007, or achieves equivalent or greater energy savings. (C) A plan to achieve 90 percent compliance with the above energy codes within eight years. This plan will include active training and enforcement programs and annual measurement of the rate of compliance. With respect to item (B) above, many more states, regardless of the edition date, directly adopt the IECC than Standard 90.1-07. This is predominantly because the IECC is a model code and part of a coordinated set of model building codes that state and local governments have historically adopted to regulate building design and construction. This report compares the 2009 IECC to Standard 90.1-07 with the intent of helping states address whether the adoption and application of the 2009 IECC for commercial

  13. A ternary phase-field model incorporating commercial CALPHAD software and its application to precipitation in superalloys

    International Nuclear Information System (INIS)

    Wen, Y.H.; Lill, J.V.; Chen, S.L.; Simmons, J.P.

    2010-01-01

A ternary phase-field model was developed that is linked directly to commercial CALPHAD software to provide quantitative thermodynamic driving forces. A recently available diffusion mobility database for ordered phases is also implemented to give a better description of the diffusion behavior in alloys. Because the targeted application of this model is the study of precipitation in Ni-based superalloys, a Ni-Al-Cr model alloy was constructed. A detailed description of this model is given in the paper. We have considered the misfit effects of the partitioning of the two solute elements. Transformation rules of the dual representation of the γ+γ′ microstructure by CALPHAD and by the phase field are established and the link with commercial CALPHAD software is described. Proof-of-concept tests were performed to evaluate the model and the results demonstrate that the model can qualitatively reproduce observed γ′ precipitation behavior. Uphill diffusion of Al is observed in a few diffusion couples, showing the significant influence of Cr on the chemical potential of Al. Possible applications of this model are discussed.

  14. 76 FR 9817 - Standard on Commercial Diving Operations; Extension of the Office of Management and Budget's (OMB...

    Science.gov (United States)

    2011-02-22

    ...] Standard on Commercial Diving Operations; Extension of the Office of Management and Budget's (OMB) Approval... Commercial Diving Operations Standard (29 CFR part 1910, subpart T). DATES: Comments must be submitted... obtaining information (29 U.S.C. 657). Subpart T applies to diving and related support operations conducted...

  15. SIMPLIFIED CHARGED PARTICLE BEAM TRANSPORT MODELING USING COMMONLY AVAILABLE COMMERCIAL SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    D. Douglas; K. Beard; J. Eldred; P. Evtushenko; A. Jenkins; W. Moore; L. Osborne; D. Sexton; C. Tennant

    2007-06-18

Particle beam modeling in accelerators has been the focus of considerable effort since the 1950s. Many generations of tools have resulted from this process, each leveraging both prior experience and increases in computer power. However, continuing innovation in accelerator technology results in systems that are not well described by existing tools, so the software development process is on-going. We discuss a novel response to this situation, which was encountered when Jefferson Lab began operation of its energy-recovering linacs. These machines were not readily described with legacy software; therefore a model was built using Microsoft Excel. This interactive simulation can query data from the accelerator, use it to compute machine parameters, analyze difference orbit data, and evaluate beam properties. It can also derive new accelerator tunings and rapidly evaluate the impact of changes in machine configuration. As it is spreadsheet-based, it can be easily user-modified in response to changing requirements. Examples for the JLab IR Upgrade FEL are presented.
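A spreadsheet-style linear transport model of the kind this record describes reduces to multiplying element transfer matrices along the beamline. The sketch below uses the standard textbook drift and thin-lens quadrupole matrices; the example beamline is a hypothetical point-to-point imaging cell, not the JLab FEL lattice.

```python
import numpy as np

def drift(L):
    """2x2 horizontal transfer matrix of a field-free drift of length L (m)."""
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_quad(f):
    """Thin-lens quadrupole of focal length f (m); f > 0 focuses."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def track(elements, state):
    """Propagate an (x, x') state through element matrices in beamline order."""
    for m in elements:
        state = m @ state
    return state

# Drift f, thin lens f, drift f: a parallel input ray (x0, 0) is brought
# to the axis at the downstream focal plane, as a spreadsheet model of the
# same line would show cell by cell.
f = 2.0
line = [drift(f), thin_quad(f), drift(f)]
x, xp = track(line, np.array([1.0e-3, 0.0]))
print(x)  # → 0.0
```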

  16. SIMPLIFIED CHARGED PARTICLE BEAM TRANSPORT MODELING USING COMMONLY AVAILABLE COMMERCIAL SOFTWARE

    International Nuclear Information System (INIS)

    D. Douglas; K. Beard; J. Eldred; P. Evtushenko; A. Jenkins; W. Moore; L. Osborne; D. Sexton; C. Tennant

    2007-01-01

Particle beam modeling in accelerators has been the focus of considerable effort since the 1950s. Many generations of tools have resulted from this process, each leveraging both prior experience and increases in computer power. However, continuing innovation in accelerator technology results in systems that are not well described by existing tools, so the software development process is on-going. We discuss a novel response to this situation, which was encountered when Jefferson Lab began operation of its energy-recovering linacs. These machines were not readily described with legacy software; therefore a model was built using Microsoft Excel. This interactive simulation can query data from the accelerator, use it to compute machine parameters, analyze difference orbit data, and evaluate beam properties. It can also derive new accelerator tunings and rapidly evaluate the impact of changes in machine configuration. As it is spreadsheet-based, it can be easily user-modified in response to changing requirements. Examples for the JLab IR Upgrade FEL are presented

  17. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. It explains two maintenance standards (IEEE/EIA 1219 and ISO/IEC 14764), discusses several commercial reverse- and domain-engineering toolkits, and bases its material on the IEEE SWEBOK (Software Engineering Body of Knowledge). Slides for instructors are available online.

  18. AR2, a novel automatic muscle artifact reduction software method for ictal EEG interpretation: Validation and comparison of performance with commercially available software [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Shennan Aibel Weiss

    2017-04-01

    Objective: To develop a novel software method (AR2) for reducing muscle contamination of ictal scalp electroencephalogram (EEG), and to validate this method on the basis of its performance, in comparison with a commercially available software method (AR1), in accurately depicting seizure-onset location. Methods: A blinded investigation used 23 EEG recordings of seizures from 8 patients. Each recording was uninterpretable with digital filtering because of muscle artifact, was processed using AR1 and AR2, and was reviewed by 26 EEG specialists. EEG readers assessed seizure-onset time, lateralization, and region, and specified confidence for each determination. The two methods were validated on the basis of the number of readers able to render assignments, their confidence, the intra-class correlation (ICC), and agreement with other clinical findings. Results: Among the 23 seizures, two-thirds of the readers were able to delineate seizure-onset time in 10 of 23 using AR1 and 15 of 23 using AR2 (p<0.01). Fewer readers could lateralize seizure-onset (p<0.05). The confidence measures of the assignments were low (probable-unlikely) but increased using AR2 (p<0.05). The ICC for identifying the time of seizure-onset was 0.15 (95% confidence interval (CI) 0.11-0.18) using AR1 and 0.26 (95% CI 0.21-0.30) using AR2. The EEG interpretations were often consistent with behavioral, neurophysiological, and neuro-radiological findings, with left-sided assignments correct in 95.9% (CI 85.7-98.9%, n=4) of cases using AR2 and 91.9% (CI 77.0-97.5%, n=4) of cases using AR1. Conclusions: EEG artifact reduction methods for localizing seizure-onset do not result in high rates of interpretability, reader confidence, and inter-reader agreement. However, the assignments by groups of readers are often congruent with other clinical data. Utilization of the AR2 software method may improve the validity of ictal EEG interpretation.

  19. An independent monitor unit calculation by commercial software as a part of a radiotherapy treatment planning system quality control

    International Nuclear Information System (INIS)

    Nechvil, K.; Mynarik, J.

    2014-01-01

    For the independent calculation of monitor units (MU), the commercial software RadCalc (Lifeline Software Inc., Tyler, TX) was chosen from among several similar available programs. The program was configured and used to verify doses calculated by the commercially available treatment planning system (TPS) Eclipse, version 8.6.17 (Varian Medical Systems Inc., Palo Alto), which is used clinically for the creation of treatment plans. The results of each plan were compared with phantom dose measurements made with an ionization chamber at the same point at which the calculations (Eclipse, RadCalc) were done - the isocentre. The TPS is configured with beam data (PDD and OAR). These beam data were exported and then imported into RadCalc, yielding consistent yet independent data between the TPS and RadCalc. The reference conditions in RadCalc were set identical to those in the TPS, so consistency between TPS and RadCalc output factors was achieved (collimator scatter factor Sc, phantom scatter factor Sp). These output factors were also measured with the ionization chamber in a water phantom and compared with the TPS. Based on clinical dose-response data, the ICRU recommends ensuring that dosimetric systems can deliver doses with an accuracy of at least 5%. Many factors, such as the layout of anatomic structures, patient positioning, and accelerator-related factors (dose calibration and mechanical parameters), cause random and systematic errors in dose delivery. Further problems can arise from the system databases and the related information transfer, and from the TPS itself, which contains, among other things, different dose calculation algorithms. (authors)
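The independent check described above can be illustrated with a simplified open-field point-dose recalculation of the kind hand calculations (and programs such as RadCalc) perform. This is a generic sketch, not RadCalc's actual algorithm; the MU, output factors, and PDD values are hypothetical, and the 5% tolerance echoes the ICRU accuracy recommendation quoted in the abstract:

```python
# Simplified independent point-dose check for an open field:
# D = MU x (reference dose rate) x Sc x Sp x PDD/100, compared against the
# TPS-reported dose at the isocentre. All numeric values are hypothetical.

def point_dose(mu, dose_rate_cgy_per_mu, sc, sp, pdd_percent):
    """Dose (cGy) at the calculation point for an open field."""
    return mu * dose_rate_cgy_per_mu * sc * sp * pdd_percent / 100.0

def within_tolerance(tps_dose, independent_dose, tol=0.05):
    """True if the independent result agrees with the TPS within tol."""
    return abs(independent_dose - tps_dose) / tps_dose <= tol

independent = point_dose(mu=210, dose_rate_cgy_per_mu=1.0,
                         sc=0.995, sp=1.005, pdd_percent=95.3)
tps = 200.0  # dose reported by the TPS, cGy (hypothetical)
ok = within_tolerance(tps, independent)
```

Real treatments add wedge, tray, off-axis, and inverse-square factors to the product, but the comparison-against-tolerance step stays the same.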

  20. An adaptive software defined radio design based on a standard space telecommunication radio system API

    Science.gov (United States)

    Xiong, Wenhao; Tian, Xin; Chen, Genshe; Pham, Khanh; Blasch, Erik

    2017-05-01

    Software defined radio (SDR) has become a popular tool for the implementation and testing of communications performance. The advantages of the SDR approach include: a re-configurable design, adaptive response to changing conditions, efficient development, and highly versatile implementation. In order to realize the benefits of SDR, the space telecommunication radio system (STRS) was proposed by the NASA Glenn Research Center (GRC), along with a standard application program interface (API) structure. Each component of the system uses a well-defined API to communicate with other components. The benefit of a standard API is that it relaxes the platform limitations of each component, allowing additional options. For example, the waveform generating process can run on a field programmable gate array (FPGA), a personal computer (PC), or an embedded system; as long as the API requirements are met, the generated waveform will work with the complete system. In this paper, we demonstrate the design and development of an adaptive SDR following the STRS and standard API protocol. We introduce, step by step, the SDR testbed system, including the controlling graphical user interface (GUI), database, GNU Radio hardware control, and universal software radio peripheral (USRP) transceiving front end. In addition, a performance evaluation is shown on the effectiveness of the SDR approach for space telecommunication.
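The decoupling the abstract describes - components that interact only through a well-defined interface, so a waveform back end can live on an FPGA, a PC, or an embedded target - can be sketched as follows. This is a toy illustration of the idea, not the actual NASA STRS API; all class and method names are invented:

```python
# Toy illustration of interface-based component decoupling: the rest of the
# system depends only on the WaveformSource contract, so back ends can be
# swapped (FPGA, PC, embedded) without changing the callers.

from abc import ABC, abstractmethod

class WaveformSource(ABC):
    """Common interface every waveform back end must implement."""
    @abstractmethod
    def samples(self, n: int) -> list:
        ...

class PCWaveform(WaveformSource):
    """Software back end: a constant-amplitude placeholder waveform."""
    def samples(self, n):
        return [1.0] * n

def transmit(source: WaveformSource, n: int) -> int:
    # Callers see only the interface, never the back-end implementation.
    return len(source.samples(n))

sent = transmit(PCWaveform(), 128)
```

An FPGA-backed implementation would satisfy the same contract, which is the portability benefit the standard API is meant to provide.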

  1. Experience implementing energy standards for commercial buildings and its lessons for the Philippines

    Energy Technology Data Exchange (ETDEWEB)

    Busch, John; Deringer, Joseph

    1998-10-01

    Energy efficiency standards for buildings have been adopted in over forty countries. This policy mechanism is pursued by governments as a means of increasing energy efficiency in the buildings sector, which typically accounts for about a third of most nations' energy consumption and half of their electricity consumption. This study reports on experience with implementation of energy standards for commercial buildings in a number of countries and U.S. states. It is conducted from the perspective of providing useful input to the Government of the Philippines' (GOP) current effort at implementing their building energy standard. While the impetus for this work is technical assistance to the Philippines, the intent is to shed light on the broader issues attending implementation of building energy standards that would be applicable there and elsewhere. The background on the GOP building energy standard is presented, followed by the objectives for the study, the approach used to collect and analyze information about other jurisdictions' implementation experience, results, and conclusions and recommendations.

  2. Use of other industry standards to facilitate the procurement and dedication of commercial-grade items

    International Nuclear Information System (INIS)

    Beard, R.L.; Rosch, F.C.; Sanwarwalla, M.H.; Tjernlund, R.M.

    1994-01-01

    Nuclear utilities are embarking on innovative approaches for reducing costs in all aspects of engineering, operation, maintenance, and procurement to produce power cheaply and efficiently and remain competitive with other power producers. In the area of procurement, utilities are increasingly obtaining commercial-grade items for use in safety-related applications. This trend is occurring because of a lack of suppliers capable of, or willing to, meet 10 CFR 21 and 10 CFR 50 Appendix B requirements, and because of the absence of original equipment suppliers or the spiraling cost of procuring items as safety-related basic components from original suppliers. Utilities have been looking at ways to reduce procurement costs. One promising means to reduce costs is to utilize information provided in other, nonnuclear industry standards regarding the specification, control, manufacture, and acceptance of the critical characteristics required for an item to perform its design function. A task force was instituted under the sponsorship of the Electric Power Research Institute to investigate the feasibility of using items manufactured to other industry standards in nuclear safety-related applications. This investigation looked at a broad spectrum of available industry standards pertaining to the design, function, manufacture, and testing of items and determined that some standards are more useful than others. This paper discusses the results of this investigation and how credit can be taken from the controls exercised for items manufactured to certain existing industry standards to minimize dedication costs.

  3. Effective dose and organ doses estimation taking tube current modulation into account with a commercial software package

    International Nuclear Information System (INIS)

    Lopez-Rendon, X.; Bosmans, H.; Zanca, F.; Oyen, R.

    2015-01-01

    To evaluate the effect of including tube current modulation (TCM) versus using the average mAs in estimating organ and effective dose (E) using commercial software. Forty adult patients (24 females, 16 males) with normal BMI underwent chest/abdomen computed tomography (CT) performed with TCM at 120 kVp, reference mAs of 110 (chest) and 200 (abdomen). Doses to fully irradiated organs (breasts, lungs, stomach, liver and ovaries) and E were calculated using two versions of a dosimetry software package: v.2.0, which uses the average mAs, and v.2.2, which accounts for TCM by implementing a gender-specific mAs profile. Student's t-test was used to assess statistically significant differences between organ doses calculated with the two versions. A statistically significant difference (p < 0.001) was found for E on chest and abdomen CT, with E being lower by 4.2 % when TCM is considered. Similarly, organ doses were also significantly lower (p < 0.001): 13.7 % for breasts, 7.3 % for lungs, 9.1 % for the liver and 8.5 % for the stomach. Only the dose to the ovaries was higher with TCM (11.5 %). When TCM is used, for the stylized phantom, the doses to lungs, breasts, stomach and liver decreased while the dose to the ovaries increased. (orig.)

  4. Effective dose and organ doses estimation taking tube current modulation into account with a commercial software package

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Rendon, X. [KU Leuven, Department of Imaging and Pathology, Division of Medical Physics and Quality Assessment, Herestraat 49, box 7003, Leuven (Belgium); Bosmans, H.; Zanca, F. [KU Leuven, Department of Imaging and Pathology, Division of Medical Physics and Quality Assessment, Herestraat 49, box 7003, Leuven (Belgium); University Hospitals Leuven, Department of Radiology, Leuven (Belgium); Oyen, R. [University Hospitals Leuven, Department of Radiology, Leuven (Belgium)

    2015-07-15

    To evaluate the effect of including tube current modulation (TCM) versus using the average mAs in estimating organ and effective dose (E) using commercial software. Forty adult patients (24 females, 16 males) with normal BMI underwent chest/abdomen computed tomography (CT) performed with TCM at 120 kVp, reference mAs of 110 (chest) and 200 (abdomen). Doses to fully irradiated organs (breasts, lungs, stomach, liver and ovaries) and E were calculated using two versions of a dosimetry software package: v.2.0, which uses the average mAs, and v.2.2, which accounts for TCM by implementing a gender-specific mAs profile. Student's t-test was used to assess statistically significant differences between organ doses calculated with the two versions. A statistically significant difference (p < 0.001) was found for E on chest and abdomen CT, with E being lower by 4.2 % when TCM is considered. Similarly, organ doses were also significantly lower (p < 0.001): 13.7 % for breasts, 7.3 % for lungs, 9.1 % for the liver and 8.5 % for the stomach. Only the dose to the ovaries was higher with TCM (11.5 %). When TCM is used, for the stylized phantom, the doses to lungs, breasts, stomach and liver decreased while the dose to the ovaries increased. (orig.)

  5. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by the adaptation of standard software packages for manufacturing control. After investigation and testing of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC programming. Experience gained from process control is planned to be investigated for discrete parts manufacturing.

  6. Simple method for the determination of rosiglitazone in human plasma using a commercially available internal standard.

    Science.gov (United States)

    Mamidi, Rao N V S; Benjamin, Biju; Ramesh, Mullangi; Srinivas, Nuggehally R

    2003-09-01

    To the best of our knowledge, bioanalytical methods reported in the literature for determining rosiglitazone in human plasma use internal standards that are not commercially available. Our purpose was to develop a simple method for the determination of rosiglitazone in plasma employing a commercially available internal standard (IS). After the addition of celecoxib (IS), plasma (0.25 mL) samples were extracted into ethyl acetate. The residue after evaporation of the organic layer was dissolved in 750 microL of mobile phase and 50 microL was injected onto the HPLC column. The separation was achieved using a Hichrom KR 100, 250 x 4.6 mm C(18) column with a mobile phase composed of potassium dihydrogen phosphate buffer (0.01 M, pH 6.5):acetonitrile:methanol (40:50:10, v/v/v). The flow-rate of the mobile phase was set at 1 mL/min. The column eluate was monitored by a fluorescence detector set at an excitation wavelength of 247 nm and an emission wavelength of 367 nm. Linear relationships (r(2) > 0.99) were observed between the peak area ratio of rosiglitazone to IS and rosiglitazone concentrations across the concentration range 5-1000 ng/mL. The intra-run precision (%RSD) and accuracy (%Dev) in the measurement of rosiglitazone were acceptable, and recovery was 80% for both rosiglitazone and IS from human plasma. The lower limit of quantitation of the assay was 5 ng/mL. In summary, the methodology for rosiglitazone measurement in plasma was simple, sensitive and employed a commercially available IS. Copyright 2003 John Wiley & Sons, Ltd.
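Quantitation in internal-standard methods like the one above fits a calibration line to (concentration, peak-area ratio) pairs and inverts it for unknown samples. A stdlib least-squares sketch; the calibration points below are hypothetical, not the assay's data:

```python
# Internal-standard calibration: fit peak-area ratio (analyte/IS) vs.
# concentration, then back-calculate unknowns from the fitted line.
# Calibration points are invented for illustration.

def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Hypothetical calibration: concentration (ng/mL) vs. peak-area ratio.
conc  = [5, 50, 250, 500, 1000]
ratio = [0.01, 0.10, 0.50, 1.00, 2.00]
m, b = fit_line(conc, ratio)

def concentration(peak_ratio):
    """Back-calculate concentration from an observed peak-area ratio."""
    return (peak_ratio - b) / m
```

Ratioing against the IS cancels run-to-run variation in extraction recovery and injection volume, which is why the line is fitted on ratios rather than raw peak areas.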

  7. Developing evidence-based prescriptive ventilation rate standards for commercial buildings in California: a proposed framework

    Energy Technology Data Exchange (ETDEWEB)

    Mendell, Mark J.; Fisk, William J.

    2014-02-01

    Background - The goal of this project, with a focus on commercial buildings in California, was to develop a new framework for evidence-based minimum ventilation rate (MVR) standards that protect occupants in buildings while also considering energy use and cost. This was motivated by research findings suggesting that current prescriptive MVRs in commercial buildings do not provide occupants with fully safe and satisfactory indoor environments. Methods - The project began with a broad review in several areas: the diverse strategies now used for standards or guidelines for MVRs or for environmental contaminant exposures, current knowledge about adverse human effects associated with VRs, and current knowledge about contaminants in commercial buildings, including their presence, their adverse human effects, and their relationships with VRs. Based on a synthesis of the reviewed information, new principles and approaches are proposed for setting evidence-based VR standards for commercial buildings, considering a range of human effects including health, performance, and acceptability of air. Results - A review and evaluation is first presented of current approaches to setting prescriptive building ventilation standards and setting acceptable limits for human contaminant exposures in outdoor air and occupational settings. Recent research on approaches to setting acceptable levels of environmental exposures in evidence-based MVR standards is also described. From a synthesis and critique of these materials, a set of principles for setting MVRs is presented, along with an example approach based on these principles. The approach combines two sequential strategies. In a first step, an acceptable threshold is set for each adverse outcome that has a demonstrated relationship to VRs, as an increase from a (low) outcome level at a high reference ventilation rate (RVR, the VR needed to attain the best achievable levels of the adverse outcome); MVRs required to meet each
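One way to read the two-step strategy sketched above: for each adverse outcome with a demonstrated VR relationship, find the smallest ventilation rate that keeps the outcome within its acceptable threshold, then take the maximum across outcomes as the prescriptive MVR. The outcome models and thresholds below are invented placeholders, not the report's actual dose-response data:

```python
# Illustrative MVR selection: smallest VR meeting each outcome's threshold,
# then the maximum across outcomes. Outcome models are hypothetical.

def min_vr_meeting(outcome_at_vr, threshold, candidate_vrs):
    """Smallest candidate VR whose predicted outcome is within threshold."""
    for vr in sorted(candidate_vrs):
        if outcome_at_vr(vr) <= threshold:
            return vr
    return None  # no candidate VR satisfies the threshold

candidates = range(5, 31)  # L/s per person, hypothetical grid
outcomes = {
    # outcome model (decreasing in VR), acceptable threshold
    "sick-building symptoms": (lambda vr: 30.0 / vr, 2.0),
    "perceived air quality":  (lambda vr: 50.0 / vr, 4.0),
}
mvr = max(min_vr_meeting(f, t, candidates) for f, t in outcomes.values())
```

Taking the maximum guarantees every outcome stays within its threshold; a cost-aware variant would weigh each increment of VR against its energy use, as the framework intends.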

  8. XML as a standard I/O data format in scientific software development

    International Nuclear Information System (INIS)

    Song Tianming; Yang Jiamin; Yi Rongqing

    2010-01-01

    XML is an open standard data format with strict syntax rules, which is widely used in large-scale software development. It was adopted as the I/O file format in the development of SpectroSim, a simulation and data-processing system for the soft X-ray spectrometer used in ICF experiments. XML data that describe spectrometer configurations, schema codes that define syntax rules for XML, and report generation techniques for the visualization of XML data are introduced. The characteristics of XML, such as its capability to express structured information, its self-descriptive nature, and the automation of visualization, are explained with examples, and its feasibility as a standard scientific I/O data file format is discussed. (authors)
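The pattern described above - structured, self-describing instrument configuration stored as XML - can be sketched with Python's standard library. The element and attribute names are invented for illustration, not SpectroSim's actual schema:

```python
# Write and read a structured instrument configuration as XML using the
# stdlib ElementTree API. Element/attribute names are hypothetical.

import xml.etree.ElementTree as ET

# Write a configuration document.
root = ET.Element("spectrometer")
ET.SubElement(root, "crystal", material="KAP", spacing_nm="1.33")
ET.SubElement(root, "detector", type="image_plate")
xml_text = ET.tostring(root, encoding="unicode")

# Read it back: the structure is recovered without a custom parser.
parsed = ET.fromstring(xml_text)
material = parsed.find("crystal").get("material")
```

A schema (XSD or similar) can then enforce the "strict syntax rules" the abstract mentions, so malformed configuration files are rejected before a simulation run.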

  9. Computer-aided auscultation of murmurs in children: evaluation of commercially available software.

    Science.gov (United States)

    Lee, Cecilia; Rankin, Kathryn N; Zuo, Kevin J; Mackie, Andrew S

    2016-10-01

    Heart murmurs are common in children and may represent congenital or acquired cardiac pathology. Auscultation is challenging and many primary-care physicians lack the skill to differentiate innocent from pathologic murmurs. We sought to determine whether computer-aided auscultation (Cardioscan™) identifies which children require referral to a cardiologist. We consecutively enrolled children aged between 0 and 17 years with a murmur, innocent or pathologic, being evaluated in a tertiary-care cardiology clinic. Children being evaluated for the first time and patients with known cardiac pathology were eligible. We excluded children who had undergone cardiac surgery previously or were unable to sit still for auscultation. Cardioscan™ auscultation was performed in a quiet room with the subject in the supine position. The sensitivity and specificity of a potentially pathologic murmur designation by Cardioscan™ - that is, one requiring referral - were determined using echocardiography as the reference standard. We enrolled 126 subjects (44% female) with a median age of 1.7 years, 93 (74%) of whom had cardiac pathology. The sensitivity and specificity of a potentially pathologic murmur determination by Cardioscan™ for identification of cardiac pathology were 83.9 and 30.3%, respectively, versus 75.0 and 71.4%, respectively, when limited to subjects with a heart rate of 50-120 beats per minute. The combination of a Cardioscan™ potentially pathologic murmur designation or an abnormal electrocardiogram improved sensitivity to 93.5%, with no haemodynamically significant lesions missed. Sensitivity of Cardioscan™ when interpreted in conjunction with an abnormal electrocardiogram was high, although specificity was poor. Re-evaluation of computer-aided auscultation will remain necessary as advances in this technology become available.
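The sensitivity and specificity figures above follow the usual confusion-matrix definitions against the echocardiography reference standard. The counts below are reconstructed to be arithmetically consistent with the reported 83.9% and 30.3% (93 pathologic and 33 innocent murmurs), but they are illustrative, not taken from the paper:

```python
# Sensitivity and specificity from a confusion matrix against a reference
# standard. Counts are reconstructed for illustration, not study data.

def sensitivity(tp, fn):
    """Fraction of true positives among all actually positive cases."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of true negatives among all actually negative cases."""
    return tn / (tn + fp)

# Hypothetical counts: 78 of 93 pathologic murmurs flagged as potentially
# pathologic; 10 of 33 innocent murmurs correctly passed.
sens = sensitivity(tp=78, fn=15)
spec = specificity(tn=10, fp=23)
```

The trade-off the abstract reports - high sensitivity, poor specificity - means few pathologic murmurs are missed at the cost of many unnecessary referrals.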

  10. Inside a VAMDC data node—putting standards into practical software

    Science.gov (United States)

    Regandell, Samuel; Marquart, Thomas; Piskunov, Nikolai

    2018-03-01

    Access to molecular and atomic data is critical for many forms of remote sensing analysis across different fields. Many atomic and molecular databases are, however, highly specialised for their intended application, complicating querying and the combination of data between sources. The Virtual Atomic and Molecular Data Centre, VAMDC, is an electronic infrastructure that allows each database to register as a ‘node’. Through services such as VAMDC’s portal website, users can then access and query all nodes in a homogenised way. Today all major atomic and molecular databases are attached to VAMDC. This article describes the software tools we developed to help data providers create and manage a VAMDC node. It gives an overview of the VAMDC infrastructure and of the various standards it uses. The article then discusses the development choices made and how the standards are implemented in practice. It concludes with a full example of implementing a VAMDC node using a real-life case as well as future plans for the node software.

  11. EVENT GENERATION OF STANDARD MODEL HIGGS DECAY TO DIMUON PAIRS USING PYTHIA SOFTWARE

    CERN Document Server

    Yusof, Adib

    2015-01-01

    My project for the CERN Summer Student Programme 2015 is on Event Generation of Standard Model Higgs Decay to Dimuon Pairs using the Pythia Software. Briefly, Pythia (specifically, Pythia 8.1) is a program for the generation of high-energy physics events that is able to describe collisions at any given energy between elementary particles such as electrons, positrons, protons, and anti-protons. It contains theory and models for a number of physics aspects, including hard and soft interactions, parton distributions, initial-state and final-state parton showers, multiparton interactions, fragmentation, and decay. All programming code is written in C++ for this version (the previous version used FORTRAN) and can be linked to the ROOT software for displaying output in the form of histograms. For my project, I need to generate events for the decay of the Standard Model Higgs boson into muon-antimuon pairs (H → μ+μ−) to study the expected significance value for this particular process at a centre-of-mass energy of 13 TeV...
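A common first estimate of the "expected significance" mentioned above treats the search as a counting experiment and computes S/√(S+B) from expected signal and background yields (e.g. taken from generated-event counts times cross-sections and luminosity). The yields below are placeholders, not experiment numbers:

```python
# Counting-experiment significance estimate, S / sqrt(S + B).
# Signal and background yields are hypothetical placeholders.

import math

def significance(s, b):
    """Expected significance for s signal events over b background events."""
    return s / math.sqrt(s + b)

expected = significance(s=50.0, b=2450.0)  # hypothetical yields
```

For small signals on large backgrounds, S/√(S+B) ≈ S/√B; more careful analyses use the profile-likelihood (Asimov) formula instead.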

  12. Commercial and Industrial Solid Waste Incineration Units (CISWI): New Source Performance Standards (NSPS) and Emission Guidelines (EG) for Existing Sources

    Science.gov (United States)

    Learn about the New Source Performance Standards (NSPS) for commercial and industrial solid waste incineration (CISWI) units, including emission guidelines and compliance times for the rule. Read the rule history and summary, and find supporting documents.

  13. Evaluations of UltraiQ software for objective ultrasound image quality assessment using images from a commercial scanner.

    Science.gov (United States)

    Long, Zaiyang; Tradup, Donald J; Stekel, Scott F; Gorny, Krzysztof R; Hangiandreou, Nicholas J

    2018-03-01

    We evaluated a commercially available software package that uses B-mode images to semi-automatically measure quantitative metrics of ultrasound image quality, such as contrast response, depth of penetration (DOP), and spatial resolution (lateral, axial, and elevational). Since measurement of elevational resolution is not a part of the software package, we achieved it by acquiring phantom images with transducers tilted at 45 degrees relative to the phantom. Each measurement was assessed in terms of measurement stability, sensitivity, repeatability, and semi-automated measurement success rate. All assessments were performed on a GE Logiq E9 ultrasound system with linear (9L or 11L), curved (C1-5), and sector (S1-5) transducers, using a CIRS model 040GSE phantom. In stability tests, the measurements of contrast, DOP, and spatial resolution remained within a ±10% variation threshold in 90%, 100%, and 69% of cases, respectively. In sensitivity tests, contrast, DOP, and spatial resolution measurements followed the expected behavior in 100%, 100%, and 72% of cases, respectively. In repeatability testing, intra- and inter-individual coefficients of variations were equal to or less than 3.2%, 1.3%, and 4.4% for contrast, DOP, and spatial resolution (lateral and axial), respectively. The coefficients of variation corresponding to the elevational resolution test were all within 9.5%. Overall, in our assessment, the evaluated package performed well for objective and quantitative assessment of the above-mentioned image qualities under well-controlled acquisition conditions. We are finding it to be useful for various clinical ultrasound applications including performance comparison between scanners from different vendors. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
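The intra- and inter-individual repeatability figures quoted above are coefficients of variation (CV = standard deviation / mean). A stdlib sketch on hypothetical repeated depth-of-penetration readings:

```python
# Coefficient of variation (sample standard deviation over mean) for a set
# of repeated measurements. Readings are hypothetical.

import math

def coefficient_of_variation(values):
    """CV of a sample: sample standard deviation divided by the mean."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return sd / mean

dop_cm = [6.1, 6.2, 6.1, 6.0, 6.1]  # repeated DOP readings, hypothetical
cv_percent = 100 * coefficient_of_variation(dop_cm)
```

A small CV (here on the order of 1%) is what qualifies a metric like DOP as highly repeatable in the assessment above.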

  14. Need for standardization of methodology and components in commercial radioimmunoassay kits

    Energy Technology Data Exchange (ETDEWEB)

    Wood, W G; Marschner, I; Scriba, P C [Muenchen Univ. (Germany, F.R.). Medizinische Klinik Innenstadt

    1977-01-01

    The problems arising from increasing use of commercial kits in radioimmunoassay (RIA) and related fields are discussed. The problems arising in various RIAs differ according to the substance under test. The quality of individual components is often good, although methodology is often not optimal and contains short-cuts, which although commercially attractive, can lead to erroneous values and poor sensitivity and precision. Minor modification of methodology often leads to major improvements in sensitivity and precision, and this has been demonstrated in the case of three digoxin kits employing antibody-coated tube techniques and in four kits for thyrotropin (TSH) using different techniques. It has also been noted that in many imported quality control sera from the USA no values have been ascribed to European kits for the components listed, thus reducing these sera to the function of precision control. The deductions from this study are that a standardization of kit components and assay methods is desirable in order to allow comparison of results between laboratories using different kits.

  15. The need for standardization of methodology and components in commercial radioimmunoassay kits

    International Nuclear Information System (INIS)

    Wood, W.G.; Marschner, I.; Scriba, P.C.

    1978-01-01

    The problems arising from the increasing use of commercial kits in radioimmunoassay (RIA) and related fields are discussed. These problems differ according to the substance under test. The quality of individual reagents is often good, but the methodology is often not optimal and may contain short-cuts which, although commercially attractive, can lead to erroneous values and poor sensitivity and precision. Minor modifications in the methodology often lead to big improvements in sensitivity and precision. This has been demonstrated in three digoxin kits employing antibody-coated tube techniques and in four kits for thyrotropin (TSH) using different techniques. It has also been noted that with many quality-control sera imported from the USA no values are ascribed to European kits for the components listed, thus reducing these sera to the function of precision control. The study underlines the need to standardize kit components and assay methods to enable the results obtained by different laboratories with different kits to be compared. (author)

  16. Assessing the Content and Quality of Commercially Available Reading Software Programs: Do They Have the Fundamental Structures to Promote the Development of Early Reading Skills in Children?

    Science.gov (United States)

    Grant, Amy; Wood, Eileen; Gottardo, Alexandra; Evans, Mary Ann; Phillips, Linda; Savage, Robert

    2012-01-01

    The current study developed a taxonomy of reading skills and compared this taxonomy with skills being trained in 30 commercially available software programs designed to teach emergent literacy or literacy-specific skills for children in preschool, kindergarten, and Grade 1. Outcomes suggest that, although some skills are being trained in a…

  17. Starworld: Preparing Accountants for the Future: A Case-Based Approach to Teach International Financial Reporting Standards Using ERP Software

    Science.gov (United States)

    Ragan, Joseph M.; Savino, Christopher J.; Parashac, Paul; Hosler, Jonathan C.

    2010-01-01

    International Financial Reporting Standards now constitute an important part of educating young professional accountants. This paper looks at a case based process to teach International Financial Reporting Standards using integrated Enterprise Resource Planning software. The case contained within the paper can be used within a variety of courses…

  18. Accuracy evaluation of fusion of CT, MR, and SPECT images using commercially available software packages (SRS PLATO and IFS)

    International Nuclear Information System (INIS)

    Mongioj, Valeria; Brusa, Anna; Loi, Gianfranco; Pignoli, Emanuele; Gramaglia, Alberto; Scorsetti, Marta; Bombardieri, Emilio; Marchesini, Renato

    1999-01-01

    Purpose: A problem for clinicians is to mentally integrate information from multiple diagnostic sources, such as computed tomography (CT), magnetic resonance (MR), and single photon emission computed tomography (SPECT), whose images give anatomic and metabolic information. Methods and Materials: To combine the information from these different imaging procedures, and to overlay corresponding slices, we used commercially available software packages (SRS PLATO and IFS). The algorithms utilize a fiducial-based coordinate system (or frame) with 3 N-shaped markers, which allows coordinate transformation of a clinical examination data set (9 spots for each transaxial section) to a stereotactic coordinate system. The N-shaped markers were filled with fluids visible in each modality (gadolinium for MR, calcium chloride for CT, and 99m Tc for SPECT). The frame is relocatable in the different acquisition modalities by means of a head holder to which a face mask is fixed so as to immobilize the patient. Position errors due to the algorithms were obtained by evaluating the stereotactic coordinates of five sources detectable in each modality. Results: SPECT and MR position errors due to the algorithms were evaluated with respect to CT: Δx was ≤ 0.9 mm for MR and ≤ 1.4 mm for SPECT; Δy was ≤ 1 mm and ≤ 3 mm for MR and SPECT, respectively. Maximal differences in distance between estimated and actual fiducial centers (geometric mismatch) were on the order of the pixel size (0.8 mm for CT, 1.4 mm for MR, and 1.8 mm for SPECT). In an attempt to distinguish necrosis from residual disease, the image fusion protocol was studied in 35 primary or metastatic brain tumor patients. Conclusions: The image fusion technique has a good degree of accuracy as well as the potential to improve the specificity of tissue identification and the precision of the subsequent treatment planning.
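The "geometric mismatch" reported above is the Euclidean distance between the estimated and actual fiducial centres after the coordinate transformation. A minimal sketch; the point coordinates below are illustrative, not the study's measurements:

```python
# Geometric mismatch: Euclidean distance between an estimated and an actual
# fiducial centre in stereotactic coordinates. Coordinates are hypothetical.

import math

def mismatch_mm(estimated, actual):
    """Euclidean distance between two 3-D points, in mm."""
    return math.sqrt(sum((e - a) ** 2 for e, a in zip(estimated, actual)))

d = mismatch_mm((10.3, -4.1, 55.0), (10.9, -4.9, 54.4))
```

Keeping this distance on the order of the pixel size, as reported, means the registration error is dominated by image resolution rather than by the algorithm itself.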

  19. Diagnostic X-Ray dosimeters using standard Float Zone (FZ) and XRA-50 commercial diodes

    Energy Technology Data Exchange (ETDEWEB)

    Gonçalves, Josemary A.C.; Bueno, Carmen C., E-mail: josemary@ipen.br, E-mail: ccbueno@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN-CNEN/SP), São Paulo, SP (Brazil); Barros, Vinicius S.M.; Asfora, Viviane K.; Khoury, Helen J., E-mail: vsmdbarros@gmail.com, E-mail: vikhoury@gmail.com, E-mail: hjkhoury@gmail.com [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Departamento de Física

    2017-07-01

    The results obtained with a standard float zone (FZ) silicon diode, processed at the Helsinki Institute of Physics and used as an on-line diagnostic X-ray dosimeter, are described in this work. The device was connected in short-circuit current mode to the input of an integrating electrometer. The response repeatability and the current sensitivity coefficient of the diode were measured with diagnostic X-ray beams in the range of 40-80 kV. The dose-response of the device, evaluated from 10 mGy up to 500 mGy, was linear with high charge sensitivity. Nevertheless, significant energy dependence was observed in the charge sensitivity of the FZ device for energies below 70 kV. The dosimetric characteristics of this FZ diode were compared to those of an XRA-50 commercial Si diode specially designed for X-ray dosimetry. The results obtained with the FZ diode showed that it can be an alternative choice for diagnostic X-ray dosimetry, although it needs to be calibrated for individual X-ray beam energies. Studies of the long-term stability and radiation hardness of these diodes are under way. (author)

  20. A real-time computer simulation of nuclear simulator software using standard PC hardware and linux environments

    International Nuclear Information System (INIS)

    Cha, K. H.; Kweon, K. C.

    2001-01-01

    A feasibility study in which standard PC hardware and Real-Time Linux are applied to the real-time computer simulation of software for a nuclear simulator is presented in this paper. The feasibility prototype was established with the existing software in the Compact Nuclear Simulator (CNS). Through the real-time implementation in the feasibility prototype, we identified that the approach can enable computer-based predictive simulation, owing both to the remarkable improvement in real-time performance and to the reduced effort needed for real-time implementation under standard PC hardware and Real-Time Linux environments
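Real-time simulation of this kind comes down to running each model step within a fixed period and detecting deadline overruns. A minimal fixed-timestep scheduling sketch follows; it is not the CNS code, and the injectable clock/sleep hooks are an assumption added so the loop can be exercised deterministically.

```python
import time

class FakeClock:
    """Deterministic stand-in for the wall clock, used only for testing the loop."""
    def __init__(self):
        self.t = 0.0
    def clock(self):
        return self.t
    def sleep(self, seconds):
        self.t += seconds

def run_fixed_step(step_fn, dt, n_steps, clock=time.monotonic, sleep=time.sleep):
    """Run step_fn every dt seconds; return the number of deadline overruns."""
    overruns = 0
    next_deadline = clock() + dt
    for _ in range(n_steps):
        step_fn()
        now = clock()
        if now > next_deadline:
            overruns += 1               # the step ran past its period
            next_deadline = now + dt    # resynchronize rather than burst
        else:
            sleep(next_deadline - now)  # idle out the remainder of the period
            next_deadline += dt
    return overruns
```

On a real RT kernel the `sleep` would be a high-resolution absolute-deadline wait; resynchronizing after an overrun (rather than running late steps back-to-back) is one common policy, not the only one.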

  1. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools are passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  2. Collaboration using open standards and open source software (examples of DIAS/CEOS Water Portal)

    Science.gov (United States)

    Miura, S.; Sekioka, S.; Kuroiwa, K.; Kudo, Y.

    2015-12-01

    The DIAS/CEOS Water Portal is a part of the DIAS (Data Integration and Analysis System, http://www.editoria.u-tokyo.ac.jp/projects/dias/?locale=en_US) systems for data distribution to users including, but not limited to, scientists, decision makers, and officers such as river administrators. One of the functions of this portal is to enable one-stop search of, and access to, various water-related data archived at multiple data centers located all over the world. The portal itself does not store data. Instead, according to requests made by users on the web page, it retrieves data from distributed data centers on the fly and lets users download the data and see rendered images/plots. Our system mainly relies on the open-source software GI-cat (http://essi-lab.eu/do/view/GIcat) and on open standards such as OGC-CSW, OpenSearch, and the OPeNDAP protocol to enable the above functions. Details on how it works will be introduced during the presentation. Although some data centers have unique metadata formats and/or data search protocols, our portal's brokering function enables users to search across various data centers at one time. This portal is also connected to other data brokering systems, including the GEOSS DAB (Discovery and Access Broker). As a result, users can search over thousands of datasets and millions of files at one time. Users can access the DIAS/CEOS Water Portal system at http://waterportal.ceos.org/.
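Brokered search of this kind is typically driven by an OpenSearch-style query URL. A minimal sketch of building one follows; the endpoint and parameter names are illustrative assumptions based on common OpenSearch conventions, not the portal's actual API.

```python
from urllib.parse import urlencode

def build_opensearch_url(base_url, search_terms, bbox=None, start=1, count=10):
    """Build an OpenSearch-style query URL (parameter names are illustrative)."""
    params = {
        "searchTerms": search_terms,  # free-text query
        "startIndex": start,          # 1-based paging offset
        "count": count,               # results per page
    }
    if bbox is not None:
        # Geographic filter in the common "west,south,east,north" order.
        params["geo:box"] = ",".join(str(v) for v in bbox)
    return base_url + "?" + urlencode(params)
```

A broker would issue such a URL to each federated catalog and merge the returned result lists before presenting them to the user.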

  3. Software methodologies for the SSC

    International Nuclear Information System (INIS)

    Loken, S.C.

    1990-01-01

    This report describes some of the considerations that will determine how software is developed for the SSC. It begins with a review of the general computing problem for SSC experiments and of recent experiences in software engineering for the present generation of experiments. This leads to a discussion of the software technologies that will be critical for the SSC experiments. The report describes the emerging software standards and commercial products that may be useful in addressing SSC needs, and concludes with some comments on how collaborations and the SSC Lab should approach the software development issue

  4. A Comparison of Two Commercial Volumetry Software Programs in the Analysis of Pulmonary Ground-Glass Nodules: Segmentation Capability and Measurement Accuracy

    Science.gov (United States)

    Kim, Hyungjin; Lee, Sang Min; Lee, Hyun-Ju; Goo, Jin Mo

    2013-01-01

    Objective To compare the segmentation capability of the 2 currently available commercial volumetry software programs with specific segmentation algorithms for pulmonary ground-glass nodules (GGNs) and to assess their measurement accuracy. Materials and Methods In this study, 55 patients with 66 GGNs underwent unenhanced low-dose CT. GGN segmentation was performed by using 2 volumetry software programs (LungCARE, Siemens Healthcare; LungVCAR, GE Healthcare). Successful nodule segmentation was assessed visually and morphologic features of GGNs were evaluated to determine factors affecting segmentation by both types of software. In addition, the measurement accuracy of the software programs was investigated by using an anthropomorphic chest phantom containing simulated GGNs. Results The successful nodule segmentation rate was significantly higher in LungCARE (90.9%) than in LungVCAR (72.7%) (p = 0.012). Vascular attachment was a negatively influencing morphologic feature of nodule segmentation for both software programs. As for measurement accuracy, mean relative volume measurement errors in nodules ≥ 10 mm were 14.89% with LungCARE and 19.96% with LungVCAR. The mean relative attenuation measurement errors in nodules ≥ 10 mm were 3.03% with LungCARE and 5.12% with LungVCAR. Conclusion LungCARE shows significantly higher segmentation success rates than LungVCAR. Measurement accuracy of volume and attenuation of GGNs is acceptable in GGNs ≥ 10 mm by both software programs. PMID:23901328
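The relative measurement errors reported above are straightforward to compute against the phantom's known ground truth. A minimal sketch follows; whether the study averaged signed or absolute errors is not stated, so the absolute-error version below is an assumption.

```python
def relative_error_pct(measured, reference):
    """Relative measurement error as a percentage of the reference (true) value."""
    return (measured - reference) / reference * 100.0

def mean_abs_relative_error_pct(pairs):
    """Mean absolute relative error over (measured, reference) pairs, in percent."""
    errs = [abs(relative_error_pct(m, r)) for m, r in pairs]
    return sum(errs) / len(errs)
```

With an anthropomorphic phantom, `reference` would be the nominal volume (or attenuation) of each simulated nodule and `measured` the value reported by the volumetry software.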

  5. A comparison of two commercial volumetry software programs in the analysis of pulmonary ground-glass nodules: Segmentation capability and measurement accuracy

    International Nuclear Information System (INIS)

    Kim, Hyung Jin; Park, Chang Min; Lee, Sang Min; Lee, Hyun Joo; Goo, Jin Mo

    2013-01-01

    To compare the segmentation capability of the 2 currently available commercial volumetry software programs with specific segmentation algorithms for pulmonary ground-glass nodules (GGNs) and to assess their measurement accuracy. In this study, 55 patients with 66 GGNs underwent unenhanced low-dose CT. GGN segmentation was performed by using 2 volumetry software programs (LungCARE, Siemens Healthcare; LungVCAR, GE Healthcare). Successful nodule segmentation was assessed visually and morphologic features of GGNs were evaluated to determine factors affecting segmentation by both types of software. In addition, the measurement accuracy of the software programs was investigated by using an anthropomorphic chest phantom containing simulated GGNs. The successful nodule segmentation rate was significantly higher in LungCARE (90.9%) than in LungVCAR (72.7%) (p = 0.012). Vascular attachment was a negatively influencing morphologic feature of nodule segmentation for both software programs. As for measurement accuracy, mean relative volume measurement errors in nodules ≥ 10 mm were 14.89% with LungCARE and 19.96% with LungVCAR. The mean relative attenuation measurement errors in nodules ≥ 10 mm were 3.03% with LungCARE and 5.12% with LungVCAR. LungCARE shows significantly higher segmentation success rates than LungVCAR. Measurement accuracy of volume and attenuation of GGNs is acceptable in GGNs ≥ 10 mm by both software programs.

  6. A comparison of two commercial volumetry software programs in the analysis of pulmonary ground-glass nodules: Segmentation capability and measurement accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyung Jin; Park, Chang Min; Lee, Sang Min; Lee, Hyun Joo; Goo, Jin Mo [Dept. of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul National University Medical Research Center, Seoul (Korea, Republic of)

    2013-08-15

    To compare the segmentation capability of the 2 currently available commercial volumetry software programs with specific segmentation algorithms for pulmonary ground-glass nodules (GGNs) and to assess their measurement accuracy. In this study, 55 patients with 66 GGNs underwent unenhanced low-dose CT. GGN segmentation was performed by using 2 volumetry software programs (LungCARE, Siemens Healthcare; LungVCAR, GE Healthcare). Successful nodule segmentation was assessed visually and morphologic features of GGNs were evaluated to determine factors affecting segmentation by both types of software. In addition, the measurement accuracy of the software programs was investigated by using an anthropomorphic chest phantom containing simulated GGNs. The successful nodule segmentation rate was significantly higher in LungCARE (90.9%) than in LungVCAR (72.7%) (p = 0.012). Vascular attachment was a negatively influencing morphologic feature of nodule segmentation for both software programs. As for measurement accuracy, mean relative volume measurement errors in nodules ≥ 10 mm were 14.89% with LungCARE and 19.96% with LungVCAR. The mean relative attenuation measurement errors in nodules ≥ 10 mm were 3.03% with LungCARE and 5.12% with LungVCAR. LungCARE shows significantly higher segmentation success rates than LungVCAR. Measurement accuracy of volume and attenuation of GGNs is acceptable in GGNs ≥ 10 mm by both software programs.

  7. A Real-Time GPP Software-Defined Radio Testbed for the Physical Layer of Wireless Standards

    NARCIS (Netherlands)

    Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.

    2005-01-01

    We present our contribution to the general-purpose-processor-(GPP)-based radio. We describe a baseband software-defined radio testbed for the physical layer of wireless LAN standards. All physical layer functions have been successfully mapped on a Pentium 4 processor that performs these functions in

  8. BioContainers: an open-source and community-driven framework for software standardization

    Science.gov (United States)

    da Veiga Leprevost, Felipe; Grüning, Björn A.; Alves Aflitos, Saulo; Röst, Hannes L.; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C.; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I.; Perez-Riverol, Yasset

    2017-01-01

    Abstract Motivation BioContainers (biocontainers.pro) is an open-source and community-driven framework that provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software, and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source container frameworks Docker and rkt, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage, and distribute bioinformatics containers, with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments, or HPC clusters). Availability and Implementation The software is freely available at github.com/BioContainers/. Contact yperez@ebi.ac.uk PMID:28379341

  9. BioContainers: an open-source and community-driven framework for software standardization.

    Science.gov (United States)

    da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset

    2017-08-15

    BioContainers (biocontainers.pro) is an open-source and community-driven framework that provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software, and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source container frameworks Docker and rkt, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage, and distribute bioinformatics containers, with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments, or HPC clusters). The software is freely available at github.com/BioContainers/. yperez@ebi.ac.uk. © The Author(s) 2017. Published by Oxford University Press.

  10. Comparison of CT- and radiograph-based post-implant dosimetry for transperineal 125I prostate brachytherapy using single seeds and a commercial treatment-planning software

    International Nuclear Information System (INIS)

    Siebert, F.A.; Kohr, P.; Kovacs, G.

    2006-01-01

    Background and purpose: The objective of this investigation was a direct comparison of the dosimetry of CT-based and radiograph-based postplanning procedures for seed implants. Patients and methods: CT- and radiograph-based postplans were carried out for eight iodine-125 (125I) seed implant patients with a commercial treatment-planning system (TPS). To enable a direct comparison of the dosimetric indices (D90, V100, V400), the radiograph-based seed coordinates were transformed to the coordinate system of the CT postplan. Afterwards, the CT-based seed positions were replaced by the radiograph-based coordinates in the TPS and the dose distribution was recalculated. Results: The computations demonstrated that the radiograph-based dosimetric values for the prostate (Dp90, Vp100, and Vp400) were on average lower than the values of the CT postplan. Normalized to the CT postplan, the following mean values were found: Dp90: 90.6% (standard deviation [SD]: 9.0%), Vp100: 86.1% (SD: 14.7%), and Vp400: 79.4% (SD: 14.4%). For three out of the eight patients, the Dp90 decreased to 90% of the initial CT postplan values. This dosimetric difference is presumed to be caused by an error in the reconstruction software used: the TPS algorithm was found to assign some sources to wrong coordinates, partly outside the prostate gland. Conclusion: The radiograph-based postplanning technique of the investigated TPS should only be used in combination with CT postplanning. Furthermore, thorough testing of reconstruction algorithms is recommended to minimize calculation errors. (orig.)
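The dosimetric indices compared above have simple voxel-based definitions: D90 is the dose covering at least 90% of the target volume, and V100/V400 are the volume fractions receiving at least 100%/400% of the prescription dose. A simplified sketch over a flat list of voxel doses follows; a real TPS interpolates a dose-volume histogram rather than ranking voxels, so this is illustrative only.

```python
def d90(doses):
    """Dose received by at least 90% of the volume: the 10th-percentile voxel dose
    (nearest-rank style, no interpolation)."""
    s = sorted(doses)
    idx = int(0.10 * (len(s) - 1))
    return s[idx]

def v_pct(doses, prescription, pct):
    """Fraction of the volume receiving at least pct% of the prescription dose."""
    threshold = prescription * pct / 100.0
    return sum(1 for d in doses if d >= threshold) / len(doses)
```

Replacing the seed coordinates and recalculating the dose grid, as in the study, would change `doses` and hence all three indices.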

  11. 77 FR 48108 - Energy Conservation Standards for Commercial Clothes Washers: Public Meeting and Availability of...

    Science.gov (United States)

    2012-08-13

    ... commercial matters regulated by U.S. antitrust laws. After the public meeting and the expiration of the.... Code, for editorial reasons.) The Energy Policy Act of 2005 (EPACT 2005), Public Law 109-58, further...

  12. New AICPA standards aid accounting for the costs of internal-use software.

    Science.gov (United States)

    Luecke, R W; Meeting, D T; Klingshirn, R G

    1999-05-01

    Statement of Position (SOP) No. 98-1, "Accounting for the Costs of Computer Software Developed or Obtained for Internal Use," issued by the American Institute of Certified Public Accountants in March 1998, provides financial managers with guidelines regarding which costs involved in developing or obtaining internal-use software should be expensed and which should be capitalized. The SOP identifies three stages in the development of internal-use software: the preliminary project stage, the application development stage, and the postimplementation-operation stage. The SOP provides that all costs incurred during the preliminary project stage should be expensed as incurred. During the application development stage, costs associated with developing or obtaining the software should be capitalized, while costs associated with preparing data for use within the new system should be expensed. Costs incurred during the postimplementation-operation stage, typically associated with training and application maintenance, should be expensed.

  13. CT and MR perfusion can discriminate severe cerebral hypoperfusion from perfusion absence: evaluation of different commercial software packages by using digital phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Uwano, Ikuko; Kudo, Kohsuke; Sasaki, Makoto [Iwate Medical University, Advanced Medical Research Center, Morioka (Japan); Christensen, Soren [University of Melbourne, Royal Melbourne Hospital, Departments of Neurology and Radiology, Victoria (Australia); Oestergaard, Leif [Aarhus University Hospital, Department of Neuroradiology, Center for Functionally Integrative Neuroscience, DK, Aarhus C (Denmark); Ogasawara, Kuniaki; Ogawa, Akira [Iwate Medical University, Department of Neurosurgery, Morioka (Japan)

    2012-05-15

    Computed tomography perfusion (CTP) and magnetic resonance perfusion (MRP) are expected to be usable for ancillary tests of brain death by detecting the complete absence of cerebral perfusion; however, the detection limit for hypoperfusion has not been determined. Hence, we examined whether commercial software can visualize very low cerebral blood flow (CBF) and cerebral blood volume (CBV) by creating and using digital phantoms. Digital phantoms simulating 0-4% of normal CBF (60 mL/100 g/min) and CBV (4 mL/100 g) were analyzed by ten software packages from CT and MRI manufacturers. Region-of-interest measurements were performed to determine whether there was a significant difference between areas of 0% and areas of 1-4% of normal flow. The CTP software detected hypoperfusion down to 2-3% in CBF and 2% in CBV, while the MRP software detected hypoperfusion down to 1-3% in CBF and 1-4% in CBV, although the lower limits varied among software packages. CTP and MRP can thus distinguish profound hypoperfusion of <5% from perfusion absence (0%) in digital phantoms, suggesting their potential efficacy for assessing brain death. (orig.)
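The region-of-interest comparison described above can be mimicked on synthetic data: draw noisy samples for a 0% region and a low-flow region, then test whether their means differ. A minimal sketch follows; the noise level, sample size, and use of Welch's t statistic are assumptions for illustration, not the study's protocol.

```python
import math
import random
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples (here, ROI value lists)."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

def simulate_roi(flow_pct, normal_cbf=60.0, noise_sd=0.2, n=100, seed=0):
    """Synthetic ROI samples for a phantom region at flow_pct% of normal CBF."""
    rng = random.Random(seed)
    mean = normal_cbf * flow_pct / 100.0
    return [rng.gauss(mean, noise_sd) for _ in range(n)]
```

With these assumed parameters, a 2%-of-normal region (mean 1.2 mL/100 g/min) separates clearly from a 0% region, mirroring the detectability the study reports.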

  14. A pioneering application of NQA-1 quality assurance standards in the development of software

    International Nuclear Information System (INIS)

    Weisbin, A.N.

    1988-01-01

    One reason for systematically applying quality assurance to computer software is the extensive use of results from computer programs to characterize potential sites for nuclear waste repositories, leading ultimately to important policy-making decisions. Because data from these programs characterize the likely radioactivity profile for many hundreds of years, experimental validation is not feasible. The Sensitivity and Uncertainty Analysis Methods Development Project (SUAMDP) was developed to formulate and utilize efficient and comprehensive methods for determining the sensitivities of calculated results with respect to changes in all input parameters. The computerized methodology was embodied in the Gradient Enhanced Software System (GRESS). Because GRESS was to be used in site characterization for waste storage, stringent NQA-1 requirements were imposed by the sponsor. A working relationship between the Oak Ridge National Laboratory (ORNL) Quality Department and the research scientists developing GRESS was essential in achieving understanding and acceptance of the quality assurance requirements as applied to the SUAMDP. This relationship resulted in the SUAMDP becoming the first software project at ORNL to develop a comprehensive NQA-1 Quality Assurance Plan; this plan now serves as a model for software quality assurance at ORNL. This paper describes the evolution of this plan and its impact on the application of quality assurance procedures to software. 2 refs

  15. Commercial lumber

    Science.gov (United States)

    Kent A. McDonald; David E. Kretschmann

    1999-01-01

    In a broad sense, commercial lumber is any lumber that is bought or sold in the normal channels of commerce. Commercial lumber may be found in a variety of forms, species, and types, and in various commercial establishments, both wholesale and retail. Most commercial lumber is graded by standardized rules that make purchasing more or less uniform throughout the country...

  16. Software Epistemology

    Science.gov (United States)

    2016-03-01

    ...in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment... Acronyms: LTS, Label Transition System; MUSE, Mining and Understanding Software Enclaves; RTEMS, Real-Time Executive for Multi-processor Systems; SaaS, Software as a Service; SSA, Static Single Assignment; SWE, Software Epistemology; UD/DU, Def-Use/Use-Def Chains (dataflow graph)

  17. Standard assays do not predict the efficiency of commercial cellulase preparations towards plant materials

    NARCIS (Netherlands)

    Kabel, M.A.; Maarel, van der M.J.E.C.; Klip, G.; Voragen, A.G.J.; Schols, H.A.

    2006-01-01

    Commercial cellulase preparations are potentially effective for processing biomass feedstocks in order to obtain bioethanol. In plant cell walls, cellulose fibrils occur in close association with xylans (monocotyls) or xyloglucans (dicotyls). The enzymatic conversion of cellulose/xylans is a complex

  18. Standard Assays Do Not Predict the Efficiency of Commercial Cellulase Preparations Towards Plant Materials

    NARCIS (Netherlands)

    Kabel, Mirjam A.; Maarel, Marc J.E.C. van der; Klip, Gert; Voragen, Alphons G.J.; Schols, Henk A.

    2006-01-01

    Commercial cellulase preparations are potentially effective for processing biomass feedstocks in order to obtain bioethanol. In plant cell walls, cellulose fibrils occur in close association with xylans (monocotyls) or xyloglucans (dicotyls). The enzymatic conversion of cellulose/xylans is a complex

  19. 75 FR 33661 - Commercial Driver's License (CDL) Standards; Rotel North American Tours, LLC; Application for...

    Science.gov (United States)

    2010-06-14

    ... named drivers, employed by Rotel and possessing German CDLs, to operate commercial motor vehicles (CMVs.... ADDRESSES: You may submit comments identified by Federal Docket Management System Number FMCSA-2008-0078 by... . Follow the online instructions for submitting comments. Fax: 1-202-493-2251. Mail: Docket Management...

  20. 77 FR 3404 - Energy Conservation Standards for Automatic Commercial Ice Makers: Public Meeting and...

    Science.gov (United States)

    2012-01-24

    .... Email: [email protected] . SUPPLEMENTARY INFORMATION: I. Statutory Authority II. History of... feedback from interested parties on its analytical framework, models, and preliminary results. II. History... automatic commercial ice makers installed in the field, such as in hospitals and restaurants. Details of the...

  1. Automated Facial Coding Software Outperforms People in Recognizing Neutral Faces as Neutral from Standardized Datasets

    Directory of Open Access Journals (Sweden)

    Peter Lewinski

    2015-09-01

    Little is known about people's accuracy in recognizing neutral faces as neutral. In this paper, I demonstrate the importance of knowing how well people recognize neutral faces. I contrasted human recognition scores for 100 typical, neutral, front-up facial images with the scores of an arguably objective judge: automated facial coding (AFC) software. I hypothesized that the software would outperform humans in recognizing neutral faces because of the inherently objective nature of computer algorithms. Results confirmed this hypothesis. I provide the first-ever evidence that computer software (90%) was more accurate in recognizing neutral faces than people were (59%). I posit two theoretical mechanisms, i.e., smile-as-a-baseline and false recognition of emotion, as possible explanations for my findings.

  2. Testing of Software Routine to Determine Deviate and Cumulative Probability: ModStandardNormal Version 1.0

    International Nuclear Information System (INIS)

    A.H. Monib

    1999-01-01

    The purpose of this calculation is to document that the software routine ModStandardNormal Version 1.0, a Visual Fortran 5.0 module, provides correct results for a normal distribution to five significant figures (three significant figures at the function tails) for a specified range of input parameters. The software routine may be used for quality-affecting work. Two types of output are generated in ModStandardNormal: a deviate, x, given a cumulative probability, p, between 0 and 1; and a cumulative probability, p, given a deviate, x, between -8 and 8. This calculation supports Performance Assessment, under Technical Product Development Plan, TDP-EBS-MD-000006 (Attachment I, DIRS 3), and is written in accordance with the AP-3.12Q Calculations procedure (Attachment I, DIRS 4)
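The two outputs described above are the standard normal CDF and its inverse. A minimal sketch of reference implementations against which such a routine could be checked (bisection over the routine's stated -8 to 8 deviate range; this is not the Fortran module itself):

```python
import math

def std_normal_cdf(x):
    """Cumulative probability p for deviate x of the standard normal distribution."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def std_normal_deviate(p, lo=-8.0, hi=8.0, tol=1e-10):
    """Deviate x for cumulative probability p, by bisection on the monotone CDF."""
    if not 0.0 < p < 1.0:
        raise ValueError("p must lie strictly between 0 and 1")
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if std_normal_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Bisection is slow but dependable here, which suits a verification-style calculation; the five-significant-figure acceptance criterion quoted above is far looser than the tolerance used.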

  3. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer

    International Nuclear Information System (INIS)

    La Macchia, Mariangela; Fellin, Francesco; Amichetti, Maurizio; Cianchetti, Marco; Gianolini, Stefano; Paola, Vitali; Lomax, Antony J; Widesott, Lamberto

    2012-01-01

    To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Fifteen patients, five for each group, with cancer of the Head&Neck, pleura, or prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined on the pCT and rCT all the volumes of interest (VOIs). We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista, and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were subsequently corrected manually. We recorded the time needed for: 1) ex novo VOI definition on rCT; 2) generation of AC by the three software solutions; 3) manual correction of AC. To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. The time saved by the three software solutions for all the sites, compared to manual contouring from scratch, is statistically significant and similar for all three solutions. The time saved for each site is as follows: about an hour for Head&Neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index was obtained with the manual correction for: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed
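The overlap indexes used above have simple set-based definitions when contours are represented as sets of voxel indices. A minimal sketch follows; the inclusiveness index is taken here as the fraction of the automatic contour lying inside the reference, which is an assumption, since definitions vary between papers.

```python
def dice(auto, ref):
    """DICE similarity coefficient between two voxel sets: 2|A∩B| / (|A|+|B|)."""
    inter = len(auto & ref)
    return 2.0 * inter / (len(auto) + len(ref))

def sensitivity(auto, ref):
    """Fraction of the reference volume covered by the automatic contour."""
    return len(auto & ref) / len(ref)

def inclusiveness(auto, ref):
    """Assumed definition: fraction of the automatic contour inside the reference."""
    return len(auto & ref) / len(auto)
```

In the study's workflow, `ref` would be the physician's contour drawn from scratch on the rCT and `auto` the (optionally corrected) atlas-based contour.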

  4. Design of software for calculation of shielding based on various standards radiodiagnostic calculation

    International Nuclear Information System (INIS)

    Falero, B.; Bueno, P.; Chaves, M. A.; Ordiales, J. M.; Villafana, O.; Gonzalez, M. J.

    2013-01-01

    The aim of this study was to develop a software application that performs shielding calculations for radiology rooms depending on the type of equipment. The calculation is done with the user selecting among the method proposed in Guide 5.11, that of Reports 144 and 147, and the methodology given by the Portuguese Health Ministry. (Author)

  5. Development of a Standard Set of Software Indicators for Aeronautical Systems Center.

    Science.gov (United States)

    1992-09-01

    29:12). The composite models listed include COCOMO and the Software Productivity, Quality, and Reliability Model (SPQR) (29:12). The SPQR model was...determine the values of the 68 input parameters. The source provides no specifics. Indicator name: SPQR (Software Productivity, Quality, Reliability); indicator class...

  6. 78 FR 55889 - Energy Conservation Program: Energy Conservation Standards for Commercial Refrigeration Equipment

    Science.gov (United States)

    2013-09-11

    ... in ANSI/Air- Conditioning and Refrigeration Institute (ARI) Standard 1200-2006, appendix D. [dagger... Manufacturers C. National Benefits II. Introduction A. Authority B. Background 1. Current Standards 2. History... class terminology. ** ``TDA'' is the total display area of the case, as measured in the Air...

  7. An evaluation of the impact of state Renewable Portfolio Standards (RPS) on retail, commercial, and industrial electricity prices

    Science.gov (United States)

    Puram, Rakesh

    The Renewable Portfolio Standard (RPS) has become a popular mechanism for states to promote renewable energy, and its popularity has spurred a potential bill within Congress for a nationwide federal RPS. While RPS benefits have been touted by several groups, the policy also has detractors. Among the concerns is that RPS standards could raise electricity rates, given that renewable energy is costlier than traditional fossil fuels. The evidence on the impact of RPS on electricity prices is murky at best: complex models by NREL and USEIA rely on computer programs with many built-in assumptions, which makes empirical comparison difficult, and they predict only slight increases in electricity rates associated with RPS standards. Recent theoretical models and empirical studies have found price increases, but often fail to include comprehensive sets of control variables, which could confound results. Utilizing a combination of past papers and studies to triangulate variables, this study aims to develop both a rigorous fixed-effects regression model and a theoretical framework to explain the results. This study analyzes state-level panel data from 2002 to 2008 to estimate the effect of RPS on residential, commercial, and industrial electricity prices, controlling for several factors including the amount of electricity generation from renewable and non-renewable sources, customer incentives for renewable energy, macroeconomic and demographic indicators, and the fuel price mix. The study contrasts several regressions to illustrate important relationships and how the inclusion and exclusion of various variables affect electricity rates. Regression results indicate that the presence of RPS within a state increases commercial and residential electricity rates but has no discernible effect on the industrial electricity rate; although RPS tends to increase electricity prices, the effect is small.
The models also indicate that jointly all

  8. Assessing the performance of commercial Agisoft PhotoScan software to deliver reliable data for accurate 3D modelling

    Directory of Open Access Journals (Sweden)

    Jebur Ahmed

    2018-01-01

    Full Text Available 3D models delivered by digital photogrammetric techniques have increased massively and developed to meet the requirements of many applications. The reliability of these models depends essentially on the data processing cycle and the adopted tool solution, in addition to data quality. Agisoft PhotoScan is a professional image-based 3D modelling package that seeks to create orderly, precise 3D content from still images. It works with arbitrary images taken under both controlled and uncontrolled conditions. Following the recommendations of many users around the globe, Agisoft PhotoScan has become an important source for generating precise 3D data for different applications. How reliable these data are for accurate 3D modelling applications is the question that needs an answer. Therefore, in this paper the performance of the Agisoft PhotoScan software was assessed and analyzed to show the potential of the software for accurate 3D modelling applications. To investigate this, a study was carried out at the University of Baghdad / Al-Jaderia campus using data collected from an airborne metric camera at a 457 m flying height. Following statistical and validation shape analysis, the Agisoft results show potential with respect to the research objective and the dataset quality.
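One common way to assess a photogrammetric model of the kind evaluated above is the root-mean-square error between surveyed check points and their model-derived coordinates. The sketch below uses invented coordinates purely for illustration; it is not the paper's dataset or procedure.

```python
import math

# Hypothetical check points: (surveyed XYZ, model-derived XYZ), in metres.
check_points = [
    ((100.00, 200.00, 50.00), (100.03, 199.98, 50.05)),
    ((150.00, 250.00, 52.00), (149.97, 250.02, 51.94)),
    ((120.00, 230.00, 48.00), (120.01, 229.99, 48.03)),
]

def rmse_per_axis(points):
    """Root-mean-square error on each axis between surveyed and model coordinates."""
    n = len(points)
    out = []
    for axis in range(3):
        sq = sum((model[axis] - surveyed[axis]) ** 2 for surveyed, model in points)
        out.append(math.sqrt(sq / n))
    return out

rmse_x, rmse_y, rmse_z = rmse_per_axis(check_points)
```

Per-axis RMSE makes it easy to see, for example, that height (Z) accuracy is typically worse than planimetric accuracy in airborne blocks.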

  9. Use of commercial grade equipment and industrial standards for equipment qualification

    International Nuclear Information System (INIS)

    Gradin, L.P.; Muller, E.S.

    1984-01-01

    One of the most controversial issues is the proper application of Arrhenius aging methodology. Naturally and artificially aging equipment for use in type test programs is often a costly and burdensome task. Appropriate use of non-nuclear industrial standards and comparison to past history can demonstrate, with reasonable assurance, that equipment is qualified with proper consideration of aging. A specific review of the available industrial standards and application examples are provided.
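The Arrhenius aging methodology mentioned above relates test-oven time to equivalent service time through an acceleration factor. The sketch below shows the standard Arrhenius relation; the 1.0 eV activation energy and the temperatures are illustrative values, not figures from the record.

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(ea_ev, t_use_c, t_test_c):
    """Arrhenius acceleration factor between service and test temperatures.
    ea_ev is the activation energy in eV; it is material-specific, and the
    example value used below is purely illustrative."""
    t_use = t_use_c + 273.15
    t_test = t_test_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_test))

# Example: 1.0 eV activation energy, 50 C service temperature, 120 C oven test.
af = acceleration_factor(1.0, 50.0, 120.0)
test_days_per_service_year = 365.0 / af
```

With these (hypothetical) numbers, less than a day in the oven represents a year of service, which is why the choice of activation energy dominates any qualification argument.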

  10. Fermenter control and modelling system. On-line coupling of standard software for the modelling of biological processes

    Energy Technology Data Exchange (ETDEWEB)

    Goldschmidt, B [Halle-Wittenberg Univ., Halle (Germany). Inst. fuer Bioprozesstechnik]; Diehl, U; Lauterbach, U [Diessel Biotech GmbH, Melsungen (Germany)

    1991-10-01

    The development and operation of small biotechnological plants increasingly requires process control technology that is powerful and robust, yet flexible. One criterion for the performance of a process control system is its ability to process and evaluate online process data in a project-specific way. This contribution describes this capability for the control system Micro-MFCS and its coupling with a Modelling System. The Modelling System is a software package for the acquisition, processing, and evaluation of data from biochemical, chemical, and physico-technical experiments. It was developed at the Martin-Luther-University in Halle (Germany) and offers simulation of fermentation processes using mathematical models as well as fitting of mathematical models to fermentation processes. In the context of a joint project, the online coupling of the Micro-MFCS software package and the Modelling System was realised. (orig.).
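Fitting a mathematical model to fermentation data, as the Modelling System above does, can be illustrated with the simplest case: estimating the specific growth rate of an exponential-growth model by a log-linear least-squares fit. The time points and biomass readings below are invented for illustration.

```python
import math

# Hypothetical biomass readings (time in h, optical density) during
# exponential growth; the numbers are illustrative only.
times = [0.0, 1.0, 2.0, 3.0, 4.0]
biomass = [0.10, 0.15, 0.22, 0.33, 0.50]

def fit_growth_rate(t, x):
    """Least-squares slope of ln(x) versus t, i.e. the specific growth
    rate mu of the model x(t) = x0 * exp(mu * t)."""
    y = [math.log(v) for v in x]
    n = len(t)
    mt = sum(t) / n
    my = sum(y) / n
    num = sum((ti - mt) * (yi - my) for ti, yi in zip(t, y))
    den = sum((ti - mt) ** 2 for ti in t)
    return num / den

mu = fit_growth_rate(times, biomass)          # 1/h
doubling_time = math.log(2) / mu              # h
```

A real fermentation model (Monod kinetics, substrate limitation) would need nonlinear fitting, but the workflow — measure online, fit, compare model to process — is the same.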

  11. Using a Commercial Framework to Implement and Enhance the IEEE 1451.1 Standard

    OpenAIRE

    Viegas, Vítor; Pereira, José Dias; Girão, P. Silva

    2005-01-01

    In 1999, the 1451.1 Std was published, defining a common object model and interface specification for developing open, multi-vendor distributed measurement and control systems. However, despite the well-known advantages of the model, few initiatives to implement it have appeared. In this paper we describe the implementation of an NCAP – Network Capable Application Processor – on a well-known and well-proven infrastructure: the Microsoft .NET Framework. The choice of a commercial framework was part o...

  12. The series production in a standardized fabrication line for silicide fuels and commercial aspects

    International Nuclear Information System (INIS)

    Wehner, E.L.; Hassel, H.W.

    1987-01-01

    NUKEM has been responsible for the development and fabrication of LEU fuel elements for MTR reactors within the framework of the German AF program since 1979. The AF program is part of the international RERTR effort, which was initiated by the INFCE Group in 1978. This paper describes the current status of development and the transition from prototype to series production in a standardized manufacturing line for silicide fuels at NUKEM. Technical provisions and a customer-oriented, standardized product range aim at economical manufacturing. (Author)

  13. Using a commercial mathematics software package for on-line analysis at the BNL Accelerator Test Facility

    International Nuclear Information System (INIS)

    Malone, R.; Wang, X.J.

    1999-01-01

    By writing both a custom Windows NT(TM) dynamic link library and generic companion server software, the intrinsic functions of MathSoft Mathcad(TM) have been extended with new capabilities which permit direct access to the control system databases of the Brookhaven National Laboratory Accelerator Test Facility. Under this scheme, a Mathcad worksheet executing on a personal computer becomes a client which can both import and export data to a control system server via a network stream socket connection. The result is an alternative, mathematically oriented view of controlling the accelerator interactively.
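The client/server exchange over a stream socket described above can be sketched in a few lines. This is a generic illustration, not the ATF protocol: the JSON request/response format, the parameter name, and the returned value are all invented, and a connected socket pair stands in for a real network connection.

```python
import json
import socket
import threading

def serve(conn):
    """Toy 'control system server': answer a named-value request.
    The line-delimited JSON protocol here is invented for illustration."""
    request = json.loads(conn.makefile("r").readline())
    reply = {"name": request["name"], "value": 42.0}  # hypothetical reading
    conn.sendall((json.dumps(reply) + "\n").encode())
    conn.close()

# A connected pair of stream sockets stands in for a real TCP connection.
client_sock, server_sock = socket.socketpair()
t = threading.Thread(target=serve, args=(server_sock,))
t.start()

# The 'worksheet' side: request a value, read the reply off the stream.
client_sock.sendall((json.dumps({"op": "get", "name": "beam_current"}) + "\n").encode())
response = json.loads(client_sock.makefile("r").readline())
t.join()
client_sock.close()
```

The same pattern — serialize a request, block on the stream, deserialize the reply — is what lets a worksheet-style client treat remote control-system data as just another function call.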

  14. Quantitative comparison and evaluation of two commercially available, two-dimensional electrophoresis image analysis software packages, Z3 and Melanie.

    Science.gov (United States)

    Raman, Babu; Cheung, Agnes; Marten, Mark R

    2002-07-01

    While a variety of software packages are available for analyzing two-dimensional electrophoresis (2-DE) gel images, no comparisons between these packages have been published, making it difficult for end users to determine which package would best meet their needs. The goal here was to develop a set of tests to quantitatively evaluate and then compare two software packages, Melanie 3.0 and Z3, in three of the fundamental steps involved in 2-DE image analysis: (i) spot detection, (ii) gel matching, and (iii) spot quantitation. To test spot detection capability, automatically detected protein spots were compared to manually counted, "real" protein spots. Spot matching efficiency was determined by comparing distorted (both geometrically and nongeometrically) gel images with undistorted original images, and quantitation tests were performed on artificial gels with spots of varying Gaussian volumes. In spot detection tests, Z3 performed better than Melanie 3.0 and required minimal user intervention to detect approximately 89% of the actual protein spots and relatively few extraneous spots. Results from gel matching tests depended on the type of image distortion used. For geometric distortions, Z3 performed better than Melanie 3.0, matching 99% of the spots, even for extreme distortions. For nongeometrical distortions, both Z3 and Melanie 3.0 required user intervention and performed comparably, matching 95% of the spots. In spot quantitation tests, both Z3 and Melanie 3.0 predicted spot volumes relatively well for spot ratios less than 1:6. For higher ratios, Melanie 3.0 did much better. In summary, results suggest Z3 requires less user intervention than Melanie 3.0, thus simplifying differential comparison of 2-DE gel images. Melanie 3.0, however, offers many more optional tools for image editing, spot detection, data reporting and statistical analysis than Z3. All image files used for these tests and updated information on the software are available on the internet.
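The quantitation test above uses artificial gels with Gaussian spots of known volume. The sketch below reproduces that idea on a synthetic image: build two circular Gaussian spots whose analytic volumes differ by a factor of six, integrate the pixel intensities as a 2-DE package would, and check that the measured ratio recovers the known one. All sizes and amplitudes are invented.

```python
import math

def gaussian_spot(cx, cy, amplitude, sigma):
    """Return the intensity function of a circular 2-D Gaussian spot."""
    def f(x, y):
        return amplitude * math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
    return f

def integrated_volume(f, size=96):
    """Sum pixel intensities over a size x size image -- the 'spot volume'
    an analysis package reports after background subtraction."""
    return sum(f(x, y) for x in range(size) for y in range(size))

spot_a = gaussian_spot(25, 25, amplitude=100.0, sigma=3.0)
# sigma scaled by sqrt(6) gives 6x the analytic volume (A * 2*pi*sigma^2).
spot_b = gaussian_spot(60, 60, amplitude=100.0, sigma=3.0 * math.sqrt(6))

ratio = integrated_volume(spot_b) / integrated_volume(spot_a)
```

Because the true ratio is known by construction, any deviation in `ratio` measures the quantitation error of the integration step — the same logic the published tests applied to Z3 and Melanie 3.0.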

  15. Spiked proteomic standard dataset for testing label-free quantitative software and statistical methods

    Directory of Open Access Journals (Sweden)

    Claire Ramus

    2016-03-01

    Full Text Available This data article describes a controlled, spiked proteomic dataset for which the “ground truth” of variant proteins is known. It is based on the LC-MS analysis of samples composed of a fixed background of yeast lysate and different spiked amounts of the UPS1 mixture of 48 recombinant proteins. It can be used to objectively evaluate bioinformatic pipelines for label-free quantitative analysis, and their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. More specifically, it can be useful for tuning software tool parameters, but also for testing new algorithms for label-free quantitative analysis, or for evaluating downstream statistical methods. The raw MS files can be downloaded from ProteomeXchange with identifier http://www.ebi.ac.uk/pride/archive/projects/PXD001819. Starting from some raw files of this dataset, we also provide here some processed data obtained through various bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold in different workflows, to exemplify the use of such data in the context of software benchmarking, as discussed in detail in the accompanying manuscript [1]. The experimental design used here for data processing takes advantage of the different spike levels introduced in the samples composing the dataset, and processed data are merged in a single file to facilitate the evaluation and illustration of software tools results for the detection of variant proteins with different absolute expression levels and fold change values.
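Scoring a pipeline against a known ground truth, as the dataset above enables, reduces to counting true and false calls. The sketch below is a toy version of that evaluation: the protein names, fold changes, and threshold rule are all invented, and serve only to show how sensitivity and false discovery rate are computed.

```python
# Hypothetical ground truth: proteins that really were spiked at different levels.
truth_variants = {"UPS1_P01", "UPS1_P02", "UPS1_P03"}

# Hypothetical measured fold change between two spike levels, per protein.
fold_changes = {
    "UPS1_P01": 4.8, "UPS1_P02": 3.9, "UPS1_P03": 1.4,   # true variants
    "YEAST_A": 1.1, "YEAST_B": 0.9, "YEAST_C": 2.3,      # background proteins
}

def evaluate(fc, truth, threshold=2.0):
    """Call proteins whose fold change exceeds the threshold in either
    direction, then score sensitivity and FDR against the ground truth."""
    called = {p for p, v in fc.items() if v >= threshold or v <= 1.0 / threshold}
    tp = len(called & truth)
    sensitivity = tp / len(truth)
    fdr = (len(called) - tp) / len(called) if called else 0.0
    return called, sensitivity, fdr

called, sens, fdr = evaluate(fold_changes, truth_variants)
```

Here the low-level spike `UPS1_P03` is missed and the background protein `YEAST_C` is falsely called, illustrating exactly the sensitivity/FDR trade-off the benchmark dataset is designed to expose.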

  16. Critical Review of Commercial Secondary Lithium-Ion Battery Safety Standards

    Science.gov (United States)

    Jones, Harry P.; Chapin, Thomas J.; Tabaddor, Mahmod

    2010-09-01

    The development of Li-ion cells with greater energy density has led to safety concerns that must be carefully assessed as Li-ion cells power a wide range of products from consumer electronics to electric vehicles to space applications. Documented field failures and product recalls for Li-ion cells, mostly for consumer electronic products, highlight the risk of fire, smoke, and even explosion. These failures have been attributed to the occurrence of internal short circuits and the subsequent thermal runaway that can lead to fire and explosion. As packaging for some applications includes a large number of cells, the risk of failure is likely to be magnified. To address concerns about the safety of battery-powered products, safety standards have been developed. This paper provides a review of various international safety standards specific to lithium-ion cells. It shows that though the standards are harmonized on a host of abuse conditions, most lack a test simulating internal short circuits, and it describes some efforts to introduce internal short circuit tests into safety standards.

  17. 77 FR 43015 - Energy Conservation Standards for Commercial and Industrial Electric Motors: Public Meeting and...

    Science.gov (United States)

    2012-07-23

    ... following seven factors: 1. The economic impact of the standard on manufacturers and customers of equipment... evaluating. This relationship serves as the basis for cost-benefit calculations for individual customers... Period Analyses E. National Impact Analysis IV. Public Participation I. Statutory Authority The Energy...

  18. MOLCARE development towards MCFC commercial power plants based on 500 kW standard modules

    Energy Technology Data Exchange (ETDEWEB)

    Torazza, A; Dufour, A; Perfumo, A; Ricerche, A; Gegundez, J; Sanson, F; Moreno, A

    1998-07-01

    Fuel cell technologies for stationary applications are expected to play a remarkable role in next-decade energy production systems ranging from a few hundred kW to several MW. The interest in using fuel cells to produce electric energy comes from the advantages they offer in terms of high efficiency, good behaviour at base and partial load, very low emissions, modularity (easy adjustment of plant capacity to power-demand increases), and reduced plant erection time. At least four types of fuel cells can be considered suitable for stationary applications. With reference to their electrolyte they can be classified as: Polymer Electrolyte Membrane Fuel Cells (PEMFC), Phosphoric Acid Fuel Cells (PAFC), Molten Carbonate Fuel Cells (MCFC), and Solid Oxide Fuel Cells (SOFC). Each of them works at a temperature level that depends on the type of electrolyte. From a general point of view, all the fuel cell technologies present, to various extents, the advantages listed above. Nevertheless, the specific features of each fuel cell type suggest identifying a specific field of application for each solution, in order to exploit the potential advantages of each technology and minimize its possible drawbacks. However, the different levels of maturity of the various fuel cell technologies do not allow a homogeneous comparison of key technical and economic parameters. PAFCs, due to their present commercial availability and operating experience, are well characterized in terms of performance and costs; for the other technologies--PEMFC, MCFC and SOFC--which are still under development, commercialization is expected within a period of 7 to 13 years, depending on each technology's maturity level (MCFC seems to be the most ready), kind of application, competitors, environmental constraints, etc.

  19. Development, analysis, and evaluation of a commercial software framework for the study of Extremely Low Probability of Rupture (xLPR) events at nuclear power plants.

    Energy Technology Data Exchange (ETDEWEB)

    Kalinich, Donald A.; Helton, Jon Craig; Sallaberry, Cedric M.; Mattie, Patrick D.

    2010-12-01

    Sandia National Laboratories (SNL) participated in a Pilot Study to examine the process and requirements to create a software system to assess the extremely low probability of pipe rupture (xLPR) in nuclear power plants. This project was tasked to develop a prototype xLPR model leveraging existing fracture mechanics models and codes coupled with a commercial software framework to determine the framework, model, and architecture requirements appropriate for building a modular-based code. The xLPR pilot study was conducted to demonstrate the feasibility of the proposed developmental process and framework for a probabilistic code to address degradation mechanisms in piping system safety assessments. The pilot study includes a demonstration problem to assess the probability of rupture of DM pressurizer surge nozzle welds degraded by primary water stress-corrosion cracking (PWSCC). The pilot study was designed to define and develop the framework and model; then construct a prototype software system based on the proposed model. The second phase of the project will be a longer term program and code development effort focusing on the generic, primary piping integrity issues (xLPR code). The results and recommendations presented in this report will be used to help the U.S. Nuclear Regulatory Commission (NRC) define the requirements for the longer term program.
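A probabilistic code of the kind piloted above ultimately estimates a small failure probability by sampling uncertain inputs. The sketch below is a deliberately minimal Monte Carlo caricature of that idea: it samples a flaw depth and a growth rate and counts wall-penetrating cracks. Every distribution, parameter, and number is invented for illustration and is not taken from xLPR or any PWSCC model.

```python
import random

def simulate_rupture_probability(n_samples=100_000, service_years=40.0, seed=1):
    """Minimal Monte Carlo sketch in the spirit of a probabilistic piping
    integrity code: sample an initial flaw depth and a growth rate, and count
    how often the crack grows through the wall over the service life.
    All distributions and constants here are hypothetical."""
    rng = random.Random(seed)
    wall_thickness_mm = 30.0
    failures = 0
    for _ in range(n_samples):
        a0 = rng.lognormvariate(0.0, 0.5)      # initial flaw depth, mm (toy)
        rate = rng.lognormvariate(-1.0, 0.6)   # growth rate, mm/yr (toy)
        if a0 + rate * service_years >= wall_thickness_mm:
            failures += 1
    return failures / n_samples

p_through_wall = simulate_rupture_probability()
```

A production framework adds the pieces the report discusses — modular physics models, correlated inputs, importance sampling for genuinely "extremely low" probabilities — but the sampling loop is the architectural core being prototyped.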

  20. Development and validation of a HPLC method for standardization of herbal and commercial extracts of Myrcia uniflora

    Directory of Open Access Journals (Sweden)

    Andrea N. de L. Batista

    2011-06-01

    Full Text Available Myrcia uniflora Barb. Rodr., Myrtaceae, popularly known as "pedra-hume-caá" in Brazil, is sold as dry extracts in capsules or as tinctures for the treatment of diabetes mellitus. Previous phytochemical studies on this species described the occurrence of the flavonoids mearnsitrin and myricitrin. In the present study, the chromatographic profiles of M. uniflora leaves and commercial extracts were determined using HPLC-PAD. Myricitrin was used as an external standard in the development and validation of the HPLC method. The proposed method is simple, rapid and reliable and can be successfully applied in industry for standardization of herbs and phytomedicines commercialised in Brazil as "pedra-hume-caá".
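External-standard quantitation, as used above with myricitrin, amounts to a linear calibration curve: fit peak area against standard concentration, then read the unknown extract off the curve. The concentrations and peak areas below are invented for illustration, not values from the validated method.

```python
# Hypothetical calibration injections: (myricitrin conc., ug/mL; peak area).
standards = [
    (5.0, 1020.0), (10.0, 2050.0), (20.0, 4080.0), (40.0, 8150.0),
]

def linear_fit(points):
    """Ordinary least-squares slope and intercept for area vs. concentration."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    slope = (sum((x - mx) * (y - my) for x, y in points)
             / sum((x - mx) ** 2 for x, _ in points))
    return slope, my - slope * mx

slope, intercept = linear_fit(standards)

# Quantify a hypothetical extract injection from its peak area.
unknown_area = 3000.0
unknown_conc = (unknown_area - intercept) / slope
```

Method validation then checks this curve's linearity, precision, and accuracy across the working range before it is used for routine standardization.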

  1. Development and validation of a HPLC method for standardization of herbal and commercial extracts of Myrcia uniflora

    Directory of Open Access Journals (Sweden)

    Andrea N. de L. Batista

    2011-04-01

    Full Text Available Myrcia uniflora Barb. Rodr., Myrtaceae, popularly known as "pedra-hume-caá" in Brazil, is sold as dry extracts in capsules or as tinctures for the treatment of diabetes mellitus. Previous phytochemical studies on this species described the occurrence of the flavonoids mearnsitrin and myricitrin. In the present study, the chromatographic profiles of M. uniflora leaves and commercial extracts were determined using HPLC-PAD. Myricitrin was used as an external standard in the development and validation of the HPLC method. The proposed method is simple, rapid and reliable and can be successfully applied in industry for standardization of herbs and phytomedicines commercialised in Brazil as "pedra-hume-caá".

  2. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer.

    Science.gov (United States)

    La Macchia, Mariangela; Fellin, Francesco; Amichetti, Maurizio; Cianchetti, Marco; Gianolini, Stefano; Paola, Vitali; Lomax, Antony J; Widesott, Lamberto

    2012-09-18

    To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Fifteen patients, five for each group, with cancer of the Head&Neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined on the pCT and rCT all the volumes of interest (VOIs). We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were successively corrected manually. We recorded the time needed for: 1) ex novo ROIs definition on rCT; 2) generation of AC by the three software solutions; 3) manual correction of AC. To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. The time saved by the three software solutions for all the sites, compared to the manual contouring from scratch, is statistically significant and similar for all the three software solutions. The time saved for each site is as follows: about an hour for Head&Neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index was obtained with the manual correction for: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed.
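The DICE overlap coefficient used above to score automatic against manual contours is simple to compute from voxel sets: 2|A∩B| / (|A| + |B|). The sketch below applies it to two hypothetical 2-D contours (a 10x10 region and the same region shifted by one pixel); the voxel grids are invented for illustration.

```python
def dice(a, b):
    """Dice similarity coefficient between two voxel sets: 2|A∩B| / (|A|+|B|).
    Returns 1.0 for identical sets and 0.0 for disjoint ones."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0
    return 2.0 * len(a & b) / (len(a) + len(b))

# Hypothetical contours as voxel index sets: manual vs. auto (shifted by 1 row).
manual = {(i, j) for i in range(10) for j in range(10)}      # 100 voxels
auto = {(i, j) for i in range(1, 11) for j in range(10)}     # 100 voxels, shifted

score = dice(manual, auto)
```

Even a one-voxel shift of a small structure drops DICE noticeably, which is why the study complements it with volume differences and per-axis displacement of the isocenter.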

  3. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer

    Directory of Open Access Journals (Sweden)

    La Macchia Mariangela

    2012-09-01

    Full Text Available Abstract Purpose To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Methods and materials Fifteen patients, five for each group, with cancer of the Head&Neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined on the pCT and rCT all the volumes of interest (VOIs). We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were successively corrected manually. We recorded the time needed for: 1) ex novo ROIs definition on rCT; 2) generation of AC by the three software solutions; 3) manual correction of AC. To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. Results The time saved by the three software solutions for all the sites, compared to the manual contouring from scratch, is statistically significant and similar for all the three software solutions. The time saved for each site is as follows: about an hour for Head&Neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index was obtained with the manual correction for: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). Conclusions From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed.

  4. A brain-computer interface as input channel for a standard assistive technology software.

    Science.gov (United States)

    Zickler, Claudia; Riccio, Angela; Leotta, Francesco; Hillian-Tress, Sandra; Halder, Sebastian; Holz, Elisa; Staiger-Sälzer, Pit; Hoogerwerf, Evert-Jan; Desideri, Lorenzo; Mattia, Donatella; Kübler, Andrea

    2011-10-01

    Recently, brain-computer interface (BCI) control was integrated into the commercial assistive technology product QualiWORLD (QualiLife Inc., Paradiso-Lugano, CH). Usability of the first prototype was evaluated in terms of effectiveness (accuracy), efficiency (information transfer rate and subjective workload/NASA Task Load Index) and user satisfaction (Quebec User Evaluation of Satisfaction with Assistive Technology, QUEST 2.0) by four end-users with severe disabilities. Three assistive technology experts evaluated the device from a third-person perspective. The results revealed high performance levels in communication and internet tasks. Users and assistive technology experts were quite satisfied with the device. However, none could imagine using the device in daily life without improvements. The main obstacles were the EEG cap and low speed.
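The information transfer rate used as the efficiency measure above is commonly computed with the Wolpaw formula, which converts selection accuracy and speed into bits per minute. The sketch below implements that formula; the 36-class matrix, 80% accuracy, and 4 selections/min are example inputs, not the study's measured values.

```python
import math

def information_transfer_rate(n_classes, accuracy, selections_per_min):
    """Wolpaw information transfer rate (bits/min) for an N-class BCI
    with the given selection accuracy."""
    n, p = n_classes, accuracy
    if p >= 1.0:
        bits_per_selection = math.log2(n)
    else:
        bits_per_selection = (math.log2(n)
                              + p * math.log2(p)
                              + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits_per_selection * selections_per_min

# Example: 36-character matrix speller, 80% accuracy, 4 selections per minute.
itr = information_transfer_rate(36, 0.80, 4.0)
```

The formula makes the "low speed" complaint quantitative: raising accuracy helps, but throughput is capped by how few selections per minute an EEG-based interface allows.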

  5. Classical table services in commercial catering: standardization proposal and clarifications for future researches

    Directory of Open Access Journals (Sweden)

    Rodolfo Wendhausen Krause

    2016-08-01

    Full Text Available This study aims to synthesize the scientific knowledge and the empirical knowledge of the authors of this article on the four main types/styles of individual service in full-service gastronomic establishments. As secondary objectives, it seeks to simplify and standardize the types of classic services in restaurants. These objectives were met through a positivist methodological approach, using as research techniques a comparative analysis and synthesis of the state of the art on the typology of classical services together with the empirical knowledge of the authors. Subsequently, the standardization proposal was validated by a panel of evaluators. We simplified the services into three basic categories: French Service, Direct English Service, and Plated Service. Because this is an exploratory study, the proposal is the beginning of scientific research on the subject and should be investigated in greater depth in future studies; the mise en place is the research field most in need of work of this nature.

  6. Standard software for automated testing of infrared imagers, IRWindows, in practical applications

    Science.gov (United States)

    Irwin, Alan; Nicklin, Robert L.

    1998-08-01

    In the past, ad hoc and manual testing of infrared imagers was not a deterrent to the characterization of these systems, due to the low volume of production and the high ratio of skilled personnel to units under test. However, with higher-volume production, increasing numbers of development labs in emerging markets, and the push towards less expensive, faster development cycles, there is a strong need for standardized testing that is quickly configurable by test engineers, can be run by less experienced test technicians, and produces repeatable, accurate results. The IRWindows(TM) system addresses these needs using a standard computing platform and existing automated IR test equipment. This paper looks at the general capabilities of the IRWindows(TM) system, and then examines specific results from its application in the PalmIR and Automotive IR production environments.

  7. Utilizing Commercial Hardware and Open Source Computer Vision Software to Perform Motion Capture for Reduced Gravity Flight

    Science.gov (United States)

    Humphreys, Brad; Bellisario, Brian; Gallo, Christopher; Thompson, William K.; Lewandowski, Beth

    2016-01-01

    Long duration space travel to Mars or to an asteroid will expose astronauts to extended periods of reduced gravity. Since gravity is not present to aid loading, astronauts will use resistive and aerobic exercise regimes for the duration of the space flight to minimize the loss of bone density, muscle mass, and aerobic capacity that occurs during exposure to a reduced-gravity environment. Unlike the International Space Station (ISS), the area available for an exercise device in the next generation of spacecraft is limited. Therefore, compact resistance exercise device prototypes are being developed. The NASA Digital Astronaut Project (DAP) is supporting the Advanced Exercise Concepts (AEC) Project, the Exercise Physiology and Countermeasures (ExPC) project, and researchers funded by the National Space Biomedical Research Institute (NSBRI) by developing computational models of exercising with these new advanced exercise device concepts. To perform validation of these models and to support the Advanced Exercise Concepts Project, several candidate devices have been flown onboard NASA's Reduced Gravity Aircraft. In terrestrial laboratories, researchers typically have motion capture systems available for the measurement of subject kinematics. Onboard the parabolic flight aircraft it is not practical to use traditional motion capture systems, due to the large working volume they require and their relatively high replacement cost if damaged. To support measuring kinematics on board parabolic aircraft, a motion capture system is being developed using open source computer vision code with commercial off-the-shelf (COTS) video camera hardware. While the system's accuracy is lower than laboratory setups, it provides a means to produce quantitative comparative motion capture kinematic data. Additionally, data such as the required exercise volume for small spaces such as the Orion capsule can be determined. 
METHODS: OpenCV is an open source computer vision library that provides the
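A core step in marker-based motion capture of the kind described above is locating a bright marker in each frame. The sketch below is a pure-Python stand-in for that step — threshold an intensity image and take the intensity-weighted centroid — using a small synthetic frame rather than real camera data or OpenCV calls.

```python
def marker_centroid(image, threshold=128):
    """Intensity-weighted centroid (row, col) of pixels at or above threshold,
    i.e. the kind of blob center a computer-vision pipeline tracks per frame.
    Returns None if no pixel clears the threshold."""
    total = rsum = csum = 0.0
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            if v >= threshold:
                total += v
                rsum += r * v
                csum += c * v
    if total == 0:
        return None
    return rsum / total, csum / total

# Synthetic 8x8 frame with a bright 2x2 "marker" spanning rows 3-4, cols 4-5.
frame = [[0] * 8 for _ in range(8)]
for r in (3, 4):
    for c in (4, 5):
        frame[r][c] = 255

centroid = marker_centroid(frame)
```

Tracking these centroids across frames, then triangulating between cameras, yields the kinematic trajectories that lab-grade motion capture systems provide.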

  8. Calculation of residence times and radiation doses using the standard PC software Excel

    International Nuclear Information System (INIS)

    Herzog, H.; Zilken, H.; Niederbremer, A.; Friedrich, W.; Mueller-Gaertner, H.W.

    1997-01-01

    We developed a program which aims to facilitate the calculation of radiation doses to single organs and the whole body. IMEDOSE uses Excel to include calculations, graphical displays, and interactions with the user in a single general-purpose PC software tool. To start the procedure the input data are copied into a spreadsheet. They must represent percentage uptake values of several organs derived from measurements in animals or humans. To extrapolate these data up to seven half-lives of the radionuclide, fitting to one or two exponential functions is included and can be checked by the user. By means of the approximate time-activity information the cumulated activity or residence times are calculated. Finally these data are combined with the absorbed fraction doses (S-values) given by MIRD pamphlet No. 11 to yield radiation doses, the effective dose equivalent and the effective dose. These results are presented in a final table. Interactions are realized with push-buttons and drop-down menus. Calculations use the Visual Basic tool of Excel. In order to test our program, biodistribution data of fluorine-18 fluorodeoxyglucose were taken from the literature (Meija et al., J Nucl Med 1991; 32:699-706). For a 70-kg adult the resulting radiation doses of all target organs listed in MIRD 11 were different from the ICRP 53 values by 1%±18% on the average. When the residence times were introduced into MIRDOSE3 (Stabin, J Nucl Med 1996; 37:538-546) the mean difference between our results and those of MIRDOSE3 was -3%±6%. Both outcomes indicate the validity of the present approach. (orig.)
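The core computation described above — integrate an exponential organ retention curve to get the residence time, then multiply by an S-value — can be sketched in a few lines. This is a hedged illustration of the mono-exponential case only: the 5% uptake, the 10 h biological half-life, and the S-value are invented example numbers, not MIRD 11 data.

```python
import math

T_PHYS_H = 1.83  # physical half-life of F-18, hours

def residence_time_h(f0, t_bio_half_h):
    """Residence time (h) for an organ with initial uptake fraction f0 that
    clears biologically with half-life t_bio_half_h while the nuclide decays
    physically with half-life T_PHYS_H. Analytic integral of
    f0 * exp(-lambda_eff * t) from 0 to infinity = f0 / lambda_eff."""
    lam_eff = math.log(2) / T_PHYS_H + math.log(2) / t_bio_half_h
    return f0 / lam_eff

# Hypothetical organ: 5% uptake, 10 h biological half-life.
tau = residence_time_h(0.05, 10.0)

# Hypothetical S-value (absorbed dose per unit cumulated activity).
s_value = 1.0e-5
organ_dose = tau * s_value
```

A spreadsheet tool like the one described sums such products over all source-target organ pairs; the multi-exponential fit only changes how `tau` is obtained, not the final multiplication.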

  9. Calculation of residence times and radiation doses using the standard PC software Excel.

    Science.gov (United States)

    Herzog, H; Zilken, H; Niederbremer, A; Friedrich, W; Müller-Gärtner, H W

    1997-12-01

    We developed a program which aims to facilitate the calculation of radiation doses to single organs and the whole body. IMEDOSE uses Excel to include calculations, graphical displays, and interactions with the user in a single general-purpose PC software tool. To start the procedure the input data are copied into a spreadsheet. They must represent percentage uptake values of several organs derived from measurements in animals or humans. To extrapolate these data up to seven half-lives of the radionuclide, fitting to one or two exponential functions is included and can be checked by the user. By means of the approximate time-activity information the cumulated activity or residence times are calculated. Finally these data are combined with the absorbed fraction doses (S-values) given by MIRD pamphlet No. 11 to yield radiation doses, the effective dose equivalent and the effective dose. These results are presented in a final table. Interactions are realized with push-buttons and drop-down menus. Calculations use the Visual Basic tool of Excel. In order to test our program, biodistribution data of fluorine-18 fluorodeoxyglucose were taken from the literature (Meija et al., J Nucl Med 1991; 32:699-706). For a 70-kg adult the resulting radiation doses of all target organs listed in MIRD 11 were different from the ICRP 53 values by 1%+/-18% on the average. When the residence times were introduced into MIRDOSE3 (Stabin, J Nucl Med 1996; 37:538-546) the mean difference between our results and those of MIRDOSE3 was -3%+/-6%. Both outcomes indicate the validity of the present approach.

  10. Calculation of residence times and radiation doses using the standard PC software Excel

    Energy Technology Data Exchange (ETDEWEB)

    Herzog, H.; Zilken, H.; Niederbremer, A.; Friedrich, W. [Institute of Medicine, Research Center Juelich, Juelich (Germany); Mueller-Gaertner, H.W. [Institute of Medicine, Research Center Juelich, Juelich (Germany)]; [Department of Nuclear Medicine, Heinrich-Heine University Hospital Duesseldorf (Germany)

    1997-12-01

    We developed a program which aims to facilitate the calculation of radiation doses to single organs and the whole body. IMEDOSE uses Excel to include calculations, graphical displays, and interactions with the user in a single general-purpose PC software tool. To start the procedure the input data are copied into a spreadsheet. They must represent percentage uptake values of several organs derived from measurements in animals or humans. To extrapolate these data up to seven half-lives of the radionuclide, fitting to one or two exponential functions is included and can be checked by the user. By means of the approximate time-activity information the cumulated activity or residence times are calculated. Finally, these data are combined with the absorbed fraction doses (S-values) given by MIRD pamphlet No. 11 to yield radiation doses, the effective dose equivalent and the effective dose. These results are presented in a final table. Interactions are realized with push-buttons and drop-down menus. Calculations use the Visual Basic tool of Excel. In order to test our program, biodistribution data of fluorine-18 fluorodeoxyglucose were taken from the literature (Meija et al., J Nucl Med 1991; 32:699-706). For a 70-kg adult the resulting radiation doses of all target organs listed in MIRD 11 differed from the ICRP 53 values by 1%+/-18% on average. When the residence times were introduced into MIRDOSE3 (Stabin, J Nucl Med 1996; 37:538-546) the mean difference between our results and those of MIRDOSE3 was -3%+/-6%. Both outcomes indicate the validity of the present approach. (orig.) With 5 figs., 2 tabs., 18 refs.
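The MIRD-style chain described above (fit the organ uptake to an exponential, integrate it to a cumulated activity or residence time, then multiply by an S-value) can be sketched outside Excel as well. The uptake values, S-value, and injected activity below are illustrative assumptions, not data from the paper:

```python
import numpy as np

# Hypothetical uptake data (fraction of injected activity) for one organ,
# sampled at times t in hours; values are illustrative only.
t = np.array([0.5, 1.0, 2.0, 4.0, 6.0])           # h
uptake = 0.30 * np.exp(-0.35 * t)                 # synthetic mono-exponential

# Log-linear least-squares fit of A(t) = A0 * exp(-lam_eff * t)
coeffs = np.polyfit(t, np.log(uptake), 1)
lam_eff = -coeffs[0]                              # effective decay constant, 1/h
A0 = np.exp(coeffs[1])                            # extrapolated uptake at t = 0

# Residence time tau = integral of A(t) dt from 0 to infinity = A0 / lam_eff
tau = A0 / lam_eff                                # h

# Absorbed dose in the MIRD schema: D = injected activity * tau * S
S = 1.2e-5                                        # mGy/(MBq*h), hypothetical S-value
injected = 370.0                                  # MBq, hypothetical administration
dose = injected * tau * S                         # mGy
```

The log-linear fit is the same operation a spreadsheet trendline performs; a bi-exponential fit, as the paper also supports, would replace `polyfit` with a nonlinear least-squares step.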

  11. Identification of Water Quality Significant Parameter with Two Transformation/Standardization Methods on Principal Component Analysis and Scilab Software

    Directory of Open Access Journals (Sweden)

    Jovan Putranda

    2016-09-01

    Full Text Available Water quality monitoring is prone to error in its recording or measuring process. Monitoring of river water quality aims not only to recognize water quality dynamics, but also to evaluate the data in order to create river management and water pollution policy, so as to maintain human health, sanitation requirements, and biodiversity preservation. Evaluation of water quality monitoring needs to start by identifying the important water quality parameters. This research aimed to identify the significant parameters by using two transformation or standardization methods on water quality data: the river Water Quality Index, WQI (Indeks Kualitas Air Sungai, IKAs) transformation or standardization method, and transformation or standardization to mean 0 and variance 1, so that the variability of the water quality parameters could be aggregated with one another. Both methods were applied to water quality monitoring data whose validity and reliability had been tested. PCA, Principal Component Analysis (Analisa Komponen Utama, AKU), with the help of Scilab software, was used to process the secondary data on water quality parameters of the Gadjah Wong river in 2004-2013. The Scilab result was cross-examined with the result from the Excel-based Biplot Add-In software. The research result showed that only 18 of the total 35 water quality parameters had passable data quality. The two transformation or standardization methods gave different types and numbers of significant parameters. With the mean 0, variance 1 transformation or standardization, the significant water quality parameters were dynamic with respect to the mean concentration of each parameter: TDS, SO4, EC, TSS, NO3N, COD, BOD5, Grease Oil and NH3N. On the river WQI transformation or standardization, the water quality significant parameter showed the level of
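The mean-0, variance-1 route described above can be sketched with a generic PCA, independent of Scilab. The synthetic monitoring matrix below is illustrative and is not the Gadjah Wong data; the parameter names in the comment merely echo those in the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic monitoring matrix: 50 samples x 4 water-quality parameters
# (think TDS, BOD5, NO3-N, EC); the scales are deliberately very different.
X = rng.normal(loc=[400.0, 3.0, 1.5, 600.0],
               scale=[80.0, 1.0, 0.4, 120.0], size=(50, 4))

# Standardization to mean 0, variance 1, so parameters with large units
# (TDS, EC) do not dominate the principal components.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via singular value decomposition of the standardized matrix
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)        # variance fraction per component
loadings = Vt                          # rows: components, cols: parameters

# A crude significance rule: a parameter matters for PC1 if its absolute
# loading exceeds the uniform-loading level 1/sqrt(p).
significant = np.abs(loadings[0]) > 1.0 / np.sqrt(X.shape[1])
```

Running the same decomposition on the raw matrix `X` instead of `Z` shows why standardization matters: the components would then be dominated by whichever parameters have the largest numeric range.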

  12. Direct Search for Standard Model-Like Higgs Boson and Software Integration of Data Acquisition Cards

    CERN Document Server

    Potterat, Cédric

    2010-01-01

    The Large Hadron Collider (LHC) accelerator at CERN near Geneva is designed to collide protons with a centre-of-mass energy up to 14 TeV. It was tested at lower energy in November 2009, when world-record collisions with beams of 1180 GeV were achieved. The LHC has four interaction points for the four large experiments: ALICE, ATLAS, CMS, and LHCb. LHCb is the "LHC beauty" experiment, located at interaction point P8 (France). It is a single-arm forward spectrometer dedicated to the $b$-hadron sector, optimised to study CP-violating processes and rare decays. In particular, the LHCb detector has the capability to measure decay vertices with a resolution of a few tens of microns. Two topics are addressed in this thesis. In the first part we study the LHCb sensitivity to detect a Standard Model Higgs boson in the $HW^{\pm} \rightarrow b\overline{b}+\ell^{\pm}\nu$ channel

  13. High-performance liquid chromatographic analysis of cyclosporin A in rat blood and liver using a commercially available internal standard.

    Science.gov (United States)

    Chimalakonda, Anjaneya P; Shah, Rakhi B; Mehvar, Reza

    2002-05-25

    All the available HPLC assays of cyclosporin A (CyA) use internal standards that are not commercially available. Our purpose was to develop an HPLC assay for measurements of CyA in rat blood and liver using a commercially available internal standard (I.S.). After the addition of tamoxifen (I.S.), blood (0.25 ml) or the liver homogenate (1 ml) samples were extracted into a mixture of ether:methanol (95:5). The residue after evaporation of the organic layer was dissolved in 200 microl of an injection solution and washed with 1 ml of hexane before analysis. The separation was achieved using an LC-1 column (70 degrees C) with a mobile phase of methanol-acetonitrile-0.01 M KH(2)PO(4) (50:25:25, v/v) and a flow-rate of 1 ml/min. Detection was at 205 nm. Cyclosporin A and I.S. eluted at 5 and 7 min, respectively, free from endogenous peaks. Linear relationships (r>0.98) were observed between the CyA:I.S. peak area ratios and the CyA concentrations within the range of 0.2-10 microg/ml for blood and 0.1-4 microg/ml for the liver homogenates. The intra- and inter-run C.V.s and errors for both the blood and liver samples were <15%. The extraction efficiency (n=5) was close to 100% for both CyA and I.S. in both blood and liver homogenates. The lower limit of quantitation of the assay was 0.2 or 0.1 microg/ml based on 250 microl of blood or 1 ml of liver homogenate, respectively. The assay was capable of measuring blood and liver concentrations of CyA in a rat injected intravenously with a single 5-mg/kg dose of the drug.
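The calibration step described above, regressing the CyA : internal-standard (tamoxifen) peak-area ratios against known spiked concentrations, can be sketched as follows. The ratio values are illustrative, not the paper's data:

```python
import numpy as np

# Hypothetical calibration: CyA concentrations (ug/ml) spiked into blood
# and the measured CyA : internal-standard peak-area ratios.
conc = np.array([0.2, 0.5, 1.0, 2.0, 5.0, 10.0])        # ug/ml
ratio = np.array([0.05, 0.13, 0.26, 0.51, 1.27, 2.55])  # illustrative values

# Linear calibration: ratio = slope * conc + intercept (least squares)
slope, intercept = np.polyfit(conc, ratio, 1)
r = np.corrcoef(conc, ratio)[0, 1]                      # should exceed 0.98

def quantify(sample_ratio):
    """Back-calculate an unknown CyA concentration from its peak-area ratio."""
    return (sample_ratio - intercept) / slope
```

Because the internal standard is carried through the same extraction as the analyte, the ratio cancels much of the run-to-run recovery variation; that is the point of adding tamoxifen before extraction rather than after.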

  14. 48 CFR 12.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...

  15. Definition and specification for PACS. A checklist based on the standard ''IEEE Recommended Practice for Software Requirements Specifications''

    International Nuclear Information System (INIS)

    Koenig, H.; Klose, K.J.

    1999-01-01

    Problem: The formulation of requirements is necessary to control the goals of a PACS project. Furthermore, in this way, the scope of functionality necessary to support radiological working processes becomes clear. Method: Definitions of requirements and specification are formulated independently of systems according to the IEEE standard 'Recommended Practice for Software Requirements Specifications'. Definitions are given in the Request for Information, specifications in the Request for Proposal. Functional and non-functional requirements are distinguished. The solutions are rated with respect to scope, appropriateness and quality of implementation. Results: A PACS checklist was created according to the methods described above. It is published on the homepage of the 'Arbeitsgemeinschaft Informationstechnologie' (AGIT) within the 'Deutsche Roentgengesellschaft' (DRG) (http://www.uni-marburg.de/mzr.agit). Conclusion: The checklist provides a discussion forum which should contribute to an agreement on accepted basic PACS functionalities. (orig.) [de

  16. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    Science.gov (United States)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based, designed to work on a single computer, which represents a major limitation in many ways, starting from limited computer processing, storage power, accessibility, availability, etc. The only feasible solution lies in the web and cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid deployment model of public-private cloud running on two separate virtual machines (VMs). The first one (VM1) runs on Amazon Web Services (AWS) and the second one (VM2) runs on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computer environment, creates a real-time multiuser collaboration platform, its programming-language code and components are interoperable, and it is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on two VMs that communicate over the internet, providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state

  17. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  18. LHCb software strategy

    CERN Document Server

    Van Herwijnen, Eric

    1998-01-01

    This document describes the software strategy of the LHCb experiment. The main objective is to reuse designs and code wherever possible. We will implement an architecturally driven design process; this architectural process will be implemented using Object Technology. We aim for platform independence; we will try to take advantage of distributed computing, use industry standards and commercial software, and profit from HEP developments. We will implement a common software process and development environment. One of the major problems that we are immediately faced with is the conversion of our current code from Fortran into an Object Oriented language and the conversion of our current developers to Object technology. Some technical terms related to OO programming are defined in Annex A.1

  19. Requirement Volatility, Standardization and Knowledge Integration in Software Projects: An Empirical Analysis on Outsourced IS Development Projects

    Directory of Open Access Journals (Sweden)

    Rajesri Govindaraju

    2015-08-01

    Full Text Available Information systems development (ISD) projects are highly complex, with different groups of people having to collaborate and exchange their knowledge. Considering the intensity of knowledge exchange that takes place in outsourced ISD projects, in this study a conceptual model was developed, aiming to examine the influence of four antecedents, i.e. standardization, requirement volatility, internal integration, and external integration, on two dependent variables, i.e. process performance and product performance. Data were collected from 46 software companies in four big cities in Indonesia. The collected data were examined to verify the proposed theoretical model using the partial least squares structural equation modeling (PLS-SEM) technique. The results show that process performance is significantly influenced by internal integration and standardization, while product performance is significantly influenced by external integration and requirement volatility. This study contributes to a better understanding of how knowledge integration can be managed in outsourced ISD projects in view of increasing their success.

  20. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data.

    Science.gov (United States)

    Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and
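The filtering, artifact-rejection, and re-referencing stages of such a pipeline can be illustrated in miniature on synthetic data. The channel count, thresholds, and simple amplitude-based rejection below are illustrative assumptions only, and are not HAPPE's actual algorithms (HAPPE uses wavelet and ICA-based artifact removal rather than amplitude cuts):

```python
import numpy as np

fs = 250.0                            # sampling rate, Hz (assumed)
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(1)
# Synthetic 8-channel EEG: a 10 Hz oscillation plus noise;
# channel 3 carries a large movement-like artifact.
eeg = 10.0 * np.sin(2 * np.pi * 10.0 * t) + rng.normal(0.0, 2.0, (8, t.size))
eeg[3] += 500.0 * np.exp(-((t - 2.0) ** 2) / 0.01)

# 1) Band-pass filter 1-40 Hz with a crude frequency-domain mask
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
spec = np.fft.rfft(eeg, axis=1)
spec[:, (freqs < 1.0) | (freqs > 40.0)] = 0.0
filtered = np.fft.irfft(spec, n=t.size, axis=1)

# 2) Simple artifact rejection: drop channels with extreme peak amplitude
peak = np.max(np.abs(filtered), axis=1)
good = peak < 5.0 * np.median(peak)

# 3) Re-reference the surviving channels to their common average
clean = filtered[good]
clean = clean - clean.mean(axis=0, keepdims=True)
```

Even this toy version shows why a standardized order of operations matters: rejecting the artifactual channel before computing the common average keeps the artifact from leaking into every re-referenced channel.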

  1. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data

    Directory of Open Access Journals (Sweden)

    Laurel J. Gabard-Durnam

    2018-02-01

    Full Text Available Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact

  2. Computational tools and resources for metabolism-related property predictions. 1. Overview of publicly available (free and commercial) databases and software.

    Science.gov (United States)

    Peach, Megan L; Zakharov, Alexey V; Liu, Ruifeng; Pugliese, Angelo; Tawa, Gregory; Wallqvist, Anders; Nicklaus, Marc C

    2012-10-01

    Metabolism has been identified as a defining factor in drug development success or failure because of its impact on many aspects of drug pharmacology, including bioavailability, half-life and toxicity. In this article, we provide an outline and descriptions of the resources for metabolism-related property predictions that are currently either freely or commercially available to the public. These resources include databases with data on, and software for prediction of, several end points: metabolite formation, sites of metabolic transformation, binding to metabolizing enzymes and metabolic stability. We attempt to place each tool in historical context and describe, wherever possible, the data it was based on. For predictions of interactions with metabolizing enzymes, we show a typical set of results for a small test set of compounds. Our aim is to give a clear overview of the areas and aspects of metabolism prediction in which the currently available resources are useful and accurate, and the areas in which they are inadequate or missing entirely.

  3. An operational approach to standard nuclear process model (SNPM) and SAP nuclear software implementation at Slovenske Elektrarne

    International Nuclear Information System (INIS)

    Warren, C.C.

    2010-01-01

    Benchmarking efforts in the fall of 2006 showed significant performance gaps in multiple measured processes between the Slovenske Elektrarne (SE) nuclear organization and the highest performing nuclear organizations in the world. While overall performance of the SE nuclear fleet was good and in the second quartile, when compared to the worldwide population of Pressurized Water Reactors (PWR), SE leadership set new goals to improve safety and operational performance to the first decile of the worldwide PWR Fleet. To meet these goals the SE nuclear team initiated a project to identify and implement the Best Practice nuclear processes in multiple areas. The benchmarking process identified the Standard Nuclear Performance Model (SNPM), used in the US nuclear fleet, as the industry best practice process model. The Slovenske Elektrarne nuclear management team used various change management techniques to clearly establish the case for organizational and process change within the nuclear organization. The project organization established by the SE nuclear management team relied heavily on functional line organization personnel to gain early acceptance of the project goals and methods thereby reducing organizational opposition to the significant organizational and process changes. The choice of a standardized process model used, all or in part, by approximately one third of the nuclear industry worldwide greatly facilitated the development and acceptance of the changes. Use of a nuclear proven templated software platform significantly reduced development and testing efforts for the resulting fully integrated solution. In the spring of 2007 SE set in motion a set of initiatives that has resulted in a significant redesign of most processes related to nuclear plant maintenance and continuous improvement. Significant organizational structure changes have been designed and implemented to align the organization to the SNPM processes and programs. The completion of the initial

  4. An operational approach to standard nuclear process model (SNPM) and SAP nuclear software implementation at Slovenske Elektrarne

    Energy Technology Data Exchange (ETDEWEB)

    Warren, C.C. [Nuclear Power Plants Operation Department, Slovenske Elektrarne, a.s., Mlynske nivy 47, 821 09 Bratislava (Slovakia)

    2010-07-01

    Benchmarking efforts in the fall of 2006 showed significant performance gaps in multiple measured processes between the Slovenske Elektrarne (SE) nuclear organization and the highest performing nuclear organizations in the world. While overall performance of the SE nuclear fleet was good and in the second quartile, when compared to the worldwide population of Pressurized Water Reactors (PWR), SE leadership set new goals to improve safety and operational performance to the first decile of the worldwide PWR Fleet. To meet these goals the SE nuclear team initiated a project to identify and implement the Best Practice nuclear processes in multiple areas. The benchmarking process identified the Standard Nuclear Performance Model (SNPM), used in the US nuclear fleet, as the industry best practice process model. The Slovenske Elektrarne nuclear management team used various change management techniques to clearly establish the case for organizational and process change within the nuclear organization. The project organization established by the SE nuclear management team relied heavily on functional line organization personnel to gain early acceptance of the project goals and methods thereby reducing organizational opposition to the significant organizational and process changes. The choice of a standardized process model used, all or in part, by approximately one third of the nuclear industry worldwide greatly facilitated the development and acceptance of the changes. Use of a nuclear proven templated software platform significantly reduced development and testing efforts for the resulting fully integrated solution. In the spring of 2007 SE set in motion a set of initiatives that has resulted in a significant redesign of most processes related to nuclear plant maintenance and continuous improvement. Significant organizational structure changes have been designed and implemented to align the organization to the SNPM processes and programs. The completion of the initial

  5. Software testing in roughness calculation

    International Nuclear Information System (INIS)

    Chen, Y L; Hsieh, P F; Fu, W E

    2005-01-01

    A test method to determine the function quality provided by software for roughness measurement is presented in this study. The function quality of the software requirements should be part of, and assessed through, the entire life cycle of the software package. The specific function, or output accuracy, is crucial for the analysis of experimental data. For scientific applications, however, commercial software is usually embedded in a specific instrument used for measurement or analysis during the manufacturing process. In general, the error ratio caused by the software becomes more apparent when dealing with relatively small quantities, like measurements in the nanometer-scale range. The model of 'using a data generator' proposed by the NPL (UK) was applied in this study. An example of roughness software is tested and analyzed by the above-mentioned process. After selecting the 'reference results', the 'reference data' were generated by a programmable 'data generator'. The filter function with a 0.8 mm long cutoff value, defined in ISO 11562, was tested with 66 sinusoidal datasets at different wavelengths. Test results from the commercial software and a CMS-written program were compared to the theoretical data calculated from the ISO standards. As for the filter function in this software, the results showed a significant disagreement between the reference and test results. The short cutoff feature for filtering at high frequencies does not function properly, while the long cutoff feature has a maximum difference in the filtering ratio of more than 70% between wavelengths of 300 μm and 500 μm. In conclusion, the commercial software needs to be tested more extensively for specific applications by appropriate design of reference datasets to ensure its function quality.
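The 'data generator' test described above can be reproduced in miniature: generate a sinusoid of known wavelength, apply the ISO 11562 Gaussian weighting function, and compare the measured attenuation against the filter's closed-form transmission. The sampling step, record length, and test wavelength below are illustrative choices, not the study's 66 datasets:

```python
import numpy as np

ALPHA = np.sqrt(np.log(2.0) / np.pi)        # ISO 11562 Gaussian constant

def gaussian_mean_line(z, dx, cutoff):
    """Apply the ISO 11562 Gaussian weighting function by direct convolution."""
    half = int(np.ceil(cutoff / dx))        # truncate kernel at +/- one cutoff
    x = np.arange(-half, half + 1) * dx
    s = np.exp(-np.pi * (x / (ALPHA * cutoff)) ** 2)
    s /= s.sum()                            # normalize the discrete weights
    return np.convolve(z, s, mode="same")

# Reference dataset: a pure sinusoid of known wavelength, standing in for
# the programmable data generator's output.
dx = 0.001                                  # sampling step, mm
cutoff = 0.8                                # long cutoff value, mm
wavelength = 0.25                           # test wavelength, mm
x = np.arange(0.0, 40 * cutoff, dx)
z = np.sin(2 * np.pi * x / wavelength)

mean_line = gaussian_mean_line(z, dx, cutoff)
core = slice(len(x) // 4, 3 * len(x) // 4)  # ignore filter edge effects
measured = np.ptp(mean_line[core]) / np.ptp(z[core])

# Closed-form mean-line transmission of the Gaussian filter
theory = np.exp(-np.pi * (ALPHA * cutoff / wavelength) ** 2)
```

Sweeping `wavelength` over a grid and tabulating `measured` against `theory` is exactly the reference comparison the study performs; a correct implementation tracks the theoretical curve, while the faulty cutoff features show up as large ratios between the two.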

  6. Evaluation of technetium-99m-Mag3-clearance. A pilot study of industry for the standardization of software

    International Nuclear Information System (INIS)

    Stritzke, H.P.

    1996-01-01

    The manufacturers of gamma camera systems and their distribution organisations in Germany (ADAC, Elscint, Gaede, General Electric, Picker, Siemens, Sopha and Toshiba) initiated, in cooperation with the Central Association of the Electrical and Electronic Industry (Zentralverband der Elektro- und Elektronikindustrie e.V., ZVEI), a pilot study to test whether the clinical computer programs for the calculation of Technetium-99m-Mag3 clearance according to Bubeck deliver comparable results. For this purpose three dynamic scintigraphic renal studies were converted to Interfile format and sent to the firms on two occasions, including blood sampling data and anonymized patient information. The results were returned to SAM GmbH System Analysen in der Medizin (Bad Oeynhausen) and analyzed. All but one of the eight participating firms calculated correct clearance values. The reason for the failure in the one case turned out to be a bug in the computer program. This pilot study has demonstrated that it is possible not only to establish certain technical standards for clinical software in nuclear medicine but also to detect and correct errors. (orig.) [de

  7. Net-VISA used as a complement to standard software at the CTBTO: initial operational experience with next-generation software.

    Science.gov (United States)

    Le Bras, R. J.; Arora, N. S.; Kushida, N.; Kebede, F.; Feitio, P.; Tomuta, E.

    2017-12-01

    The International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has reached out to the broader scientific community through a series of conferences, the latest of which took place in June 2017 in Vienna, Austria. Stemming from this outreach effort, and following the inception of research and development efforts in 2009, the NET-VISA software, which follows a Bayesian modelling approach, has been developed to improve on the key step of automatic association of joint seismic, hydro-acoustic, and infrasound detections. When compared with the current operational system, it has consistently been shown in off-line tests to improve the overlap with the analyst-reviewed Reviewed Event Bulletin (REB) by ten percent, for an average of 85% overlap, while the inconsistency rate is essentially the same at about 50%. Testing by analysts in realistic conditions on a few days of data has also demonstrated the software's performance in finding additional events which qualify for publication in the REB. Starting in August 2017, the automatic events produced by the software will be reviewed by analysts at the CTBTO, and we report on the initial evaluation of this introduction into operations.
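Overlap and inconsistency rates like those quoted above come from associating automatic events with REB events. A minimal sketch follows; the event tuples and the matching tolerances are illustrative assumptions, not the CTBTO's actual association criteria:

```python
from datetime import datetime, timedelta

# Hypothetical event lists: (origin time, latitude, longitude).
reb = [
    (datetime(2017, 8, 1, 0, 10, 0), 35.2, 139.5),
    (datetime(2017, 8, 1, 2, 45, 30), -21.1, -68.9),
    (datetime(2017, 8, 1, 5, 5, 12), 51.8, 104.2),
]
auto = [
    (datetime(2017, 8, 1, 0, 10, 4), 35.3, 139.4),   # matches REB event 1
    (datetime(2017, 8, 1, 5, 5, 20), 51.9, 104.0),   # matches REB event 3
    (datetime(2017, 8, 1, 9, 0, 0), 10.0, 10.0),     # no REB counterpart
]

def matches(a, b, dt=timedelta(seconds=30), ddeg=1.0):
    """Crude association: origin times and epicentres within tolerance."""
    return (abs(a[0] - b[0]) <= dt
            and abs(a[1] - b[1]) <= ddeg
            and abs(a[2] - b[2]) <= ddeg)

# Overlap: fraction of REB events recovered by the automatic bulletin
matched = [r for r in reb if any(matches(r, a) for a in auto)]
overlap = len(matched) / len(reb)

# Inconsistency: fraction of automatic events with no REB counterpart
inconsistent = [a for a in auto if not any(matches(a, r) for r in reb)]
inconsistency = len(inconsistent) / len(auto)
```

On this toy data the overlap is 2/3 and the inconsistency rate 1/3; the reported 85% overlap and roughly 50% inconsistency are the same quantities computed over full bulletins.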

  8. Software Intensive Systems

    National Research Council Canada - National Science Library

    Horvitz, E; Katz, D. J; Rumpf, R. L; Shrobe, H; Smith, T. B; Webber, G. E; Williamson, W. E; Winston, P. H; Wolbarsht, James L

    2006-01-01

    .... Additionally, recommend that DoN invest in software engineering, particularly as it complements commercial industry developments and promotes the application of systems engineering methodology...

  9. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what kind of standards software should meet. In a software project, quality is the key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software-related organizations follow standards introduced by the Capability Maturity Model Integration (CMMI) to achieve good quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  10. First experiences with the implementation of the European standard EN 62304 on medical device software for the quality assurance of a radiotherapy unit

    International Nuclear Information System (INIS)

    Höss, Angelika; Lampe, Christian; Panse, Ralf; Ackermann, Benjamin; Naumann, Jakob; Jäkel, Oliver

    2014-01-01

    According to the latest amendment of the Medical Device Directive, standalone software qualifies as a medical device when intended by the manufacturer to be used for medical purposes. In this context, the EN 62304 standard is applicable, which defines the life-cycle requirements for the development and maintenance of medical device software. A pilot project was launched to acquire skills in implementing this standard in a hospital-based environment (in-house manufacture). The EN 62304 standard outlines minimum requirements for each stage of the software life-cycle, defines the activities and tasks to be performed, and scales documentation and testing according to criticality. The required processes were established for the pre-existing decision-support software FlashDumpComparator (FDC) used during the quality assurance of treatment-relevant beam parameters. As the EN 62304 standard implies compliance with the EN ISO 14971 standard on the application of risk management to medical devices, a risk analysis was carried out to identify potential hazards and reduce the associated risks to acceptable levels. The EN 62304 standard is difficult to implement without proper tools, so open-source software was selected and integrated into a dedicated development platform. The control measures yielded by the risk analysis were independently implemented and verified, and a script-based test automation was retrofitted to reduce the associated test effort. After all documents facilitating the traceability of the specified requirements to the corresponding tests and of the control measures to the proof of execution were generated, the FDC was released as an accessory to the HIT facility. The implementation of the EN 62304 standard was time-consuming, and a learning curve had to be overcome during the first iterations of the associated processes, but many process descriptions and all software tools can be re-utilized in follow-up projects. It has been demonstrated that a

  11. First experiences with the implementation of the European standard EN 62304 on medical device software for the quality assurance of a radiotherapy unit.

    Science.gov (United States)

    Höss, Angelika; Lampe, Christian; Panse, Ralf; Ackermann, Benjamin; Naumann, Jakob; Jäkel, Oliver

    2014-03-21

    According to the latest amendment of the Medical Device Directive standalone software qualifies as a medical device when intended by the manufacturer to be used for medical purposes. In this context, the EN 62304 standard is applicable which defines the life-cycle requirements for the development and maintenance of medical device software. A pilot project was launched to acquire skills in implementing this standard in a hospital-based environment (in-house manufacture). The EN 62304 standard outlines minimum requirements for each stage of the software life-cycle, defines the activities and tasks to be performed and scales documentation and testing according to its criticality. The required processes were established for the pre-existent decision-support software FlashDumpComparator (FDC) used during the quality assurance of treatment-relevant beam parameters. As the EN 62304 standard implicates compliance with the EN ISO 14971 standard on the application of risk management to medical devices, a risk analysis was carried out to identify potential hazards and reduce the associated risks to acceptable levels. The EN 62304 standard is difficult to implement without proper tools, thus open-source software was selected and integrated into a dedicated development platform. The control measures yielded by the risk analysis were independently implemented and verified, and a script-based test automation was retrofitted to reduce the associated test effort. After all documents facilitating the traceability of the specified requirements to the corresponding tests and of the control measures to the proof of execution were generated, the FDC was released as an accessory to the HIT facility. The implementation of the EN 62304 standard was time-consuming, and a learning curve had to be overcome during the first iterations of the associated processes, but many process descriptions and all software tools can be re-utilized in follow-up projects. It has been demonstrated that a

  12. Analysis of Potential Benefits and Costs of Adopting ASHRAE Standard 90.1-1999 as a Commercial Building Energy Code in Illinois Jurisdictions

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, David B.; Cort, Katherine A.; Winiarski, David W.; Richman, Eric E.; Friedrich, Michele

    2002-05-01

ASHRAE Standard 90.1-1999 was developed in an effort to set minimum requirements for the energy-efficient design and construction of new commercial buildings. This report assesses the benefits and costs of adopting this standard as the building energy code in Illinois. Energy and economic impacts are estimated using BLAST simulations combined with a Life-Cycle Cost approach to assess corresponding economic costs and benefits.

  13. Challenges in Commercial Buildings | Buildings | NREL

    Science.gov (United States)

systems ... Assessing the energy and economic impacts of various technologies, giving priority to those that ... standardized language for commercial building energy audit data that can be used by software developers to exchange data between audit tools, and can be required by building owners and audit program managers to ...

  14. Analysis of Potential Benefits and Costs of Adopting ASHRAE Standard 90.1-2001 as the Commercial Building Energy Code in Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A.; Winiarski, David W.; Belzer, David B.; Richman, Eric E.

    2004-09-30

    ASHRAE Standard 90.1-2001 Energy Standard for Buildings except Low-Rise Residential Buildings (hereafter referred to as ASHRAE 90.1-2001 or 90.1-2001) was developed in an effort to set minimum requirements for the energy efficient design and construction of new commercial buildings. The State of Tennessee is considering adopting ASHRAE 90.1-2001 as its commercial building energy code. In an effort to evaluate whether or not this is an appropriate code for the state, the potential benefits and costs of adopting this standard are considered in this report. Both qualitative and quantitative benefits and costs are assessed. Energy and economic impacts are estimated using the Building Loads Analysis and System Thermodynamics (BLAST) simulations combined with a Life-Cycle Cost (LCC) approach to assess corresponding economic costs and benefits. Tennessee currently has ASHRAE Standard 90A-1980 as the statewide voluntary/recommended commercial energy standard; however, it is up to the local jurisdiction to adopt this code. Because 90A-1980 is the recommended standard, many of the requirements of ASHRAE 90A-1980 were used as a baseline for simulations.
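The BLAST-plus-LCC methodology pairs simulated annual energy costs with discounted cash flows. A minimal sketch of such a life-cycle cost comparison follows; all dollar figures, the discount rate, and the study period are hypothetical illustrations, not values from the report:

```python
def life_cycle_cost(first_cost, annual_energy_cost, discount_rate, years):
    """Present value of first cost plus discounted annual energy costs."""
    pv_factor = sum(1.0 / (1.0 + discount_rate) ** t for t in range(1, years + 1))
    return first_cost + annual_energy_cost * pv_factor

# Hypothetical comparison: baseline (90A-1980) vs. upgraded (90.1-2001) design
lcc_base = life_cycle_cost(first_cost=100_000, annual_energy_cost=12_000,
                           discount_rate=0.05, years=25)
lcc_new = life_cycle_cost(first_cost=104_000, annual_energy_cost=10_500,
                          discount_rate=0.05, years=25)
net_benefit = lcc_base - lcc_new  # positive => the standard pays for itself
```

A positive net benefit over the study period is the kind of result the report summarizes as "positive net benefits to the state."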

  15. National Software Capacity: Near-Term Study

    Science.gov (United States)

    1990-05-01

"sweatshops" [Singhal 90]. Because they work for below-market wages, they allow software development costs in the commercial sector to be reduced or...arrangements. Presently, the command/management director is far too often at a technological disadvantage because of the job assignment structure. A...facto commercial standards on the supply of both raw and skilled labor needs to be evaluated in light of the purely technological disadvantages or

16. The GOLM database standard - a framework for time-series data management based on free software

    Science.gov (United States)

    Eichler, M.; Francke, T.; Kneis, D.; Reusser, D.

    2009-04-01

Monitoring and modelling projects usually involve time series data originating from different sources. File formats, temporal resolution and meta-data documentation rarely adhere to a common standard. As a result, much effort is spent on converting, harmonizing, merging, checking, resampling and reformatting these data. Moreover, in work groups or during the course of time, these tasks tend to be carried out redundantly and repeatedly, especially when new data becomes available. The resulting duplication of data in various formats consumes additional resources. We propose a database structure and complementary scripts for facilitating these tasks. The GOLM (General Observation and Location Management) framework allows for import and storage of time series data of different types while assisting in meta-data documentation, plausibility checking and harmonization. The imported data can be visually inspected and its coverage among locations and variables may be visualized. Supplementing scripts provide options for data export for selected stations and variables and resampling of the data to the desired temporal resolution. These tools can, for example, be used for generating model input files or reports. Since GOLM fully supports network access, the system can be used efficiently by distributed working groups accessing the same data over the internet. GOLM's database structure and the complementary scripts can easily be customized to specific needs. Any involved software such as MySQL, R, PHP, OpenOffice as well as the scripts for building and using the data base, including documentation, are free for download. GOLM was developed out of the practical requirements of the OPAQUE project. It has been tested and further refined in the ERANET-CRUE and SESAM projects, all of which used GOLM to manage meteorological, hydrological and/or water quality data.
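The harmonization and resampling steps described above can be sketched in a few lines. The gauge readings and hourly aggregation below are illustrative only and do not reflect GOLM's actual schema or scripts:

```python
from collections import defaultdict
from datetime import datetime

def resample_hourly(records):
    """Aggregate (timestamp, value) pairs into hourly sums, the kind of
    harmonization step a time-series framework automates before model input
    generation. Summing is appropriate for flux-like variables such as
    precipitation."""
    bins = defaultdict(float)
    for ts, value in records:
        # Truncate each timestamp to the start of its hour
        bins[ts.replace(minute=0, second=0, microsecond=0)] += value
    return dict(sorted(bins.items()))

# Hypothetical precipitation gauge readings at irregular times (mm)
raw = [(datetime(2009, 4, 1, 0, 7), 0.2),
       (datetime(2009, 4, 1, 0, 21), 0.0),
       (datetime(2009, 4, 1, 1, 3), 1.4),
       (datetime(2009, 4, 1, 1, 48), 0.6)]
hourly = resample_hourly(raw)  # two hourly bins: 0.2 mm and 2.0 mm
```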

  17. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  18. Software reliability assessment

    International Nuclear Information System (INIS)

    Barnes, M.; Bradley, P.A.; Brewer, M.A.

    1994-01-01

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom has spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  19. Construction cost impact analysis of the U.S. Department of Energy mandatory performance standards for new federal commercial and multi-family, high-rise residential buildings

    International Nuclear Information System (INIS)

    Di Massa, F.V.; Hadley, D.L.; Halverson, M.A.

    1993-12-01

In accordance with federal legislation, the U.S. Department of Energy (DOE) has conducted a project to demonstrate use of its Energy Conservation Voluntary Performance Standards for Commercial and Multi-Family High-Rise Residential Buildings; Mandatory for New Federal Buildings; Interim Rule (referred to in this report as DOE-1993). A key requisite of the legislation requires DOE to develop commercial building energy standards that are cost-effective. During the demonstration project, DOE specifically addressed this issue by assessing the impacts of the standards on (1) construction costs, (2) builders (and especially small builders) of multi-family, high-rise buildings, and (3) the ability of low- to moderate-income persons to purchase or rent units in such buildings. This document reports on this project.

  20. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: Choosing the right architecture for the market – laboratory, military, or commercial Hardware platforms – FPGAs, GPPs, specialized and hybrid devices Standardization efforts to ens...

  1. SU-G-JeP2-06: Dosimetric and Workflow Evaluation of First Commercial Synthetic CT Software for Clinical Use in Pelvis

    Energy Technology Data Exchange (ETDEWEB)

    Tyagi, N; Zhang, J; Happersett, L; Kadbi, M; Mechalakos, J; Deasy, J; Hunt, M [Memorial Sloan Kettering Cancer Center, New York, NY (United States)

    2016-06-15

Purpose: To evaluate a commercial synthetic CT (syn-CT) software for use in prostate radiotherapy. Methods: Twenty prostate patients underwent CT and MR simulation scans in treatment position on a 3T Philips scanner. The MR protocol consisted of a T2w turbo spin-echo for soft tissue contrast, a 2D balanced fast field echo (b-FFE) for fiducial identification, a dual-echo 3D FFE B0 map for distortion analysis and a 3D mDIXON FFE sequence to generate the syn-CT. Two echoes are acquired during the mDIXON scan, allowing water, fat, and in-phase images to be derived using the frequency shift of the fat and water protons. Tissues were classified as air, adipose, water, trabecular/spongy bone and compact/cortical bone, and assigned specific bulk HU values. Bone structures are segmented based on a pelvis bone atlas. Accuracy of the syn-CT for patient treatment planning was analyzed by transferring the original plan and structures from the CT to the syn-CT via rigid registration and recalculating dose. In addition, new IMRT plans were generated on the syn-CT using structures contoured on MR and transferred to the syn-CT. Accuracy of fiducial-based localization at the treatment machine performed using the syn-CT or DRRs generated from the syn-CT was assessed by comparison to orthogonal kV radiographs or CBCT. Results: Dosimetric comparison between CT and syn-CT was within 0.5% for all structures. The de-novo optimized plans generated on the syn-CT met our institutional clinical objectives for target and normal structures. Patient-induced susceptibility distortion based on B0 maps was within 1 mm and 0.4 mm in the body and prostate, respectively. The rectal and bladder outlines on the syn-CT were deemed sufficient for assessing rectal and bladder filling on the CBCT at the time of treatment. CBCT localization showed a median error of < ±1 mm in the LR, AP and SI directions. Conclusion: MRI-derived syn-CT can be used clinically in an MR-alone planning and treatment process for prostate. Drs. Deasy, Hunt and Tyagi have Master
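The bulk-HU assignment step described in the abstract amounts to a per-voxel lookup from tissue class to Hounsfield unit. The sketch below uses typical textbook HU figures as a hypothetical illustration; the vendor's actual assignments are not published in the abstract:

```python
# Hypothetical bulk HU values per tissue class (illustrative, not the
# commercial software's published assignments)
BULK_HU = {"air": -1000, "adipose": -100, "water": 0,
           "spongy_bone": 300, "cortical_bone": 1000}
CLASSES = ["air", "adipose", "water", "spongy_bone", "cortical_bone"]

def synthetic_ct(class_map):
    """Map a 2-D integer tissue-class image to bulk HU values via a lookup
    table indexed by class number."""
    lut = [BULK_HU[c] for c in CLASSES]
    return [[lut[idx] for idx in row] for row in class_map]

seg = [[0, 2, 2], [2, 3, 4]]   # toy 2x3 tissue-class map
hu = synthetic_ct(seg)         # [[-1000, 0, 0], [0, 300, 1000]]
```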

  2. Validation of quality control tests of a multi leaf collimator using electronic portal image devices and commercial software; Validacion de unas pruebas de control de calidad del colimador multilamina utilizando dispositivos electronicos de imagen portal y una aplicacion comercial

    Energy Technology Data Exchange (ETDEWEB)

    Latorre-Musoll, A.; Jornet Sala, N.; Carrasco de Fez, P.; Edualdo Puell, T.; Ruiz Martinez, A.; Ribas Morales, M.

    2013-07-01

We describe a daily quality control procedure for the multileaf collimator (MLC) based on electronic portal imaging devices and commercial software. We designed tests that compare portal images of a set of static and dynamic MLC configurations to a set of reference images using commercial portal dosimetry software. Reference images were acquired using the same set of MLC configurations after calibration of the MLC. To assess the sensitivity to detect MLC underperformance, we modified the MLC configurations by inserting a range of leaf position and speed errors. Distance measurements on portal images correlated with leaf position errors down to 0.1 mm in static MLC configurations. Dose differences between portal images correlated both with speed errors down to 0.5% of the nominal leaf velocities and with leaf position errors down to 0.1 mm in dynamic MLC configurations. The proposed quality control procedure can assess static and dynamic MLC configurations with high sensitivity and reliability. (Author)
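The pass/fail logic of such a portal-image comparison can be sketched as a pixelwise tolerance check. The 0.5% default below merely echoes the velocity-error sensitivity quoted in the abstract; a clinical tolerance would be commissioned locally, and the real software also performs distance measurements not shown here:

```python
def mlc_qc_pass(portal, reference, tol_percent=0.5):
    """Return True when no pixel of the portal image deviates from the
    reference image by more than tol_percent of the reference maximum.
    Images are lists of rows of dose values."""
    ref_max = max(max(row) for row in reference)
    worst = max(abs(p - r)
                for prow, rrow in zip(portal, reference)
                for p, r in zip(prow, rrow))
    return worst <= tol_percent / 100.0 * ref_max

ref = [[100.0] * 4 for _ in range(4)]                # toy reference image
shifted = [[v + 0.3 for v in row] for row in ref]    # 0.3% deviation: passes
drifted = [[v + 1.0 for v in row] for row in ref]    # 1.0% deviation: fails
```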

  3. Comparison of a commercial blood cross-matching kit to the standard laboratory method for establishing blood transfusion compatibility in dogs.

    Science.gov (United States)

    Guzman, Leo Roa; Streeter, Elizabeth; Malandra, Allison

    2016-01-01

To evaluate the accuracy of a commercial blood transfusion cross-match kit when compared to the standard laboratory method for establishing blood transfusion compatibility. A prospective observational in vitro study performed from July 2009 to July 2013. Private referral veterinary center. Ten healthy dogs, 11 anemic dogs, and 24 previously transfused dogs. None. Forty-five dogs were enrolled in a prospective study in order to compare the standard blood transfusion cross-match technique to a commercial blood transfusion cross-matching kit. These dogs were divided into 3 different groups that included 10 healthy dogs (control group), 11 anemic dogs in need of a blood transfusion, and 24 sick dogs that were previously transfused. Thirty-five dogs diagnosed with anemia secondary to multiple disease processes were cross-matched using both techniques. All dogs cross-matched via the kit had a compatible major and minor result, whereas 16 dogs out of 45 (35%) had an incompatible cross-match result when the standard laboratory technique was performed. The average time to perform the commercial kit was 15 minutes, 3 times shorter than the manual cross-match laboratory technique, which averaged 45-50 minutes to complete. While the gel-based cross-match kit is quicker and less technically demanding than standard laboratory cross-match procedures, microagglutination and low-grade hemolysis are difficult to identify by using the gel-based kits. This could result in transfusion reactions if the gel-based kits are used as the sole determinant of blood compatibility prior to transfusion. Based on our results, the standard manual cross-match technique remains the gold standard test to determine blood transfusion compatibility. © Veterinary Emergency and Critical Care Society 2016.

  4. A Model for Joint Software Reviews

    Science.gov (United States)

    1998-10-01

CEPMAN 1, 1996; Gabb, 1997], and with the growing popularity of outsourcing, they are becoming more important in the commercial sector [ISO/IEC 12207...technical and management reviews [MIL-STD-498, 1996; ISO/IEC 12207, 1995]. Management reviews occur after technical reviews, and are focused on the cost...characteristics, Standard (No. ISO/IEC 9126-1). [ISO/IEC 12207, 1995] Information Technology Software Life Cycle Processes, Standard (No. ISO/IEC 12207

  5. Software Quality Assurance for Nuclear Safety Systems

    International Nuclear Information System (INIS)

    Sparkman, D R; Lagdon, R

    2004-01-01

The US Department of Energy has undertaken an initiative to improve the quality of software used to design and operate their nuclear facilities across the United States. One aspect of this initiative is to revise or create new directives and guides associated with quality practices for the safety software in its nuclear facilities. Safety software includes the safety structures, systems, and components software and firmware, support software and design and analysis software used to ensure the safety of the facility. DOE nuclear facilities are unique when compared to commercial nuclear or other industrial activities in terms of the types and quantities of hazards that must be controlled to protect workers, the public and the environment. Because of these differences, DOE must develop an approach to software quality assurance that ensures appropriate risk mitigation by developing a framework of requirements that accomplishes the following goals: (1) ensures the software processes developed to address nuclear safety in design, operation, construction and maintenance of its facilities are safe; (2) considers the larger system that uses the software and its impacts; (3) ensures that software failures do not create unsafe conditions. Software designers for nuclear systems and processes must reduce risks in software applications by incorporating processes that recognize, detect, and mitigate software failure in safety-related systems. They must also ensure that fail-safe modes and component testing are incorporated into software design. For nuclear facilities, the consideration of risk is not necessarily sufficient to ensure safety. Systematic evaluation, independent verification and system safety analysis must be considered for software design, implementation, and operation. The software industry primarily uses risk analysis to determine the appropriate level of rigor applied to software practices. This risk-based approach distinguishes safety

  6. Commercial Law Reform in territories subject to International Administration. Kosovo & Iraq. Different standards of legitimacy and accountability?

    Directory of Open Access Journals (Sweden)

    Alejandro Carballo Leyda

    2008-01-01

The paper will address questions of legality and accountability of the legislative functions exerted by international territorial administrations in the field of commercial law in two recent scenarios that are theoretically different: a UN-authorized mission under Chapter VII of the UN Charter and that of a strictly Occupying Power. No attempt will be made to study other important and interrelated issues, such as the problematic privatizations carried out in Kosovo and Iraq, which do not seem to be compatible with the obligation of administration of public assets (Art. 55 of the 1907 Hague Regulations). This paper will first provide a brief overview of the deep economic legislative reform that took place in Iraq and Kosovo during the very early stages. Most of the scholarly literature focused on criminal law and human rights aspects, leaving aside commercial law reforms; yet, those profound commercial reforms have resulted in a drastic economic transformation from a planned, centrally controlled, socialist system into a liberal, market-oriented, capitalist economy. The radical nature of those changes raises the question of their conformity with relevant international law and the need for public accountability. Part III will then explore the sources of legality invoked so far (namely UN mandates, international humanitarian law, and authority invested by local intervention) by the academic world, experts and intervening actors as the basis for the commercial reform in Kosovo and Iraq, and whether the actual results comply with the discretion vested in the temporary administrations by those sources. Finally, in Part IV, problems of judicial review and public accountability in relation to the law-making function of those international administrations in Iraq and Kosovo will be considered.

  7. Implementation and Testing of the JANUS Standard with SSC Pacific’s Software-Defined Acoustic Modem

    Science.gov (United States)

    2017-12-01

Communications Outpost (FDECO) Innovative Naval Prototype (INP) Program by the Advanced Photonic Technologies Branch (Code 55360), Space and Naval Warfare...Systems Center Pacific (SSC Pacific), San Diego, CA. Further support was provided by the 55340 Enterprise Communications and Networks Branch (Code 55340...names of manufacturers is not to be construed as official government endorsement or approval of commercial products or services referenced in

  8. Avionics and Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the AES Avionics and Software (A&S) project is to develop a reference avionics and software architecture that is based on standards and that can be...

  9. lessons and challenges from software quality assessment

    African Journals Online (AJOL)

    DJFLEX

www.globaljournalseries.com, Email: info@globaljournalseries.com ... ASSESSMENT: THE CASE OF SPACE SYSTEMS SOFTWARE. ... KEYWORDS: Software, Software Quality, Quality Standard, Characteristics, ... and communication, etc.

  10. Analysis of Potential Benefits and Costs of Adopting ASHRAE Standard 90.1-1999 as a Commercial Building Energy Code in Michigan

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A.; Belzer, David B.; Halverson, Mark A.; Richman, Eric E.; Winiarski, David W.

    2002-09-30

The state of Michigan is considering adopting ASHRAE 90.1-1999 as its commercial building energy code. In an effort to evaluate whether or not this is an appropriate code for the state, the potential benefits and costs of adopting this standard are considered. Both qualitative and quantitative benefits are assessed. The energy simulation and economic results suggest that adopting ASHRAE 90.1-1999 would provide positive net benefits to the state relative to the building and design requirements currently in place.

  11. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  12. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

Final report for Folsom Labs’ Solar Permit Generator project, which was completed successfully, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  13. Computational Tools and Resources for Metabolism-Related Property Predictions. 1. Overview of Publicly Available (Free and Commercial) Databases and Software

    Science.gov (United States)

    2012-01-01

inhibitors, kinetics, UGT substrates, clearance QSAR models [211] ADMEworks Predictor Fujitsu Commercial CYP-binding affinities QSAR models [253] CypScore...fragment energies pre-calculated using density functional theory. The second descriptor is a measure of how far each site is from the center of the...semiempirical molecular orbital theory. ChemMedChem 4(4), 657–669 (2009). 44 Rydberg P, Gloriam DE, Zaretzki J, Breneman C, Olsen L. SMARTCyp: A 2D method for

  14. First-trimester risk calculation for trisomy 13, 18, and 21: comparison of the screening efficiency between 2 locally developed programs and commercial software

    DEFF Research Database (Denmark)

    Sørensen, Steen; Momsen, Günther; Sundberg, Karin

    2011-01-01

    -A) in maternal plasma from unaffected pregnancies. Means and SDs of these parameters in unaffected and affected pregnancies are used in the risk calculation program. Unfortunately, our commercial program for risk calculation (Astraia) did not allow use of local medians. We developed 2 alternative risk...... calculation programs to assess whether the screening efficacies for T13, T18, and T21 could be improved by using our locally estimated medians....
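Risk calculators of this kind conventionally convert each serum marker to a multiple of the median (MoM) using the local gestational-age median, then form a likelihood ratio from Gaussian densities fitted to log MoM in affected and unaffected pregnancies. A generic sketch follows; the distribution parameters and prior odds are hypothetical illustrations, not the authors' fitted values or Astraia's internals:

```python
import math

def mom(value, gestational_median):
    """Multiple of the median for a serum marker at a given gestational age."""
    return value / gestational_median

def gaussian_lr(log_mom, mu_aff, sd_aff, mu_unaff, sd_unaff):
    """Likelihood ratio (affected vs. unaffected) from log10-MoM Gaussians."""
    def pdf(x, mu, sd):
        return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
    return pdf(log_mom, mu_aff, sd_aff) / pdf(log_mom, mu_unaff, sd_unaff)

# Hypothetical parameters: unaffected centred on log-MoM 0, affected shifted low
x = math.log10(mom(0.5, 1.0))      # marker at half the local median
lr = gaussian_lr(x, mu_aff=-0.3, sd_aff=0.3, mu_unaff=0.0, sd_unaff=0.25)
posterior_odds = lr * (1 / 250)    # prior odds from maternal age (illustrative)
```

Using locally estimated medians in `mom` is exactly the flexibility the authors' in-house programs restored.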

  15. Upgrade of the CATS sample changer on FIP-BM30A at the ESRF: towards a commercialized standard

    International Nuclear Information System (INIS)

    Jacquamet, L.; Joly, J.; Charrault, P.; Pirocchi, M.; Vernede, X.; Bouis, F.; Borel, F.; Ferrer, J.L.; Perin, J.P.; Denis, T.; Rechatin, J.L.

    2009-01-01

    An upgraded version of the sample changer 'CATS' (Cryogenic Automated Transfer System) that was developed on the FIP-BM30A beamline at the ESRF is presented. At present, CATS is installed at SLS (three systems), BESSY (one system), DLS (two systems) and APS (four systems for the LSCAT beamline). It consists mainly of an automated Dewar with an assortment of specific grippers designed to obtain a fast and reliable mounting/dismounting rate without jeopardizing the flexibility of the system. The upgraded system has the ability to manage any sample standard stored in any kind of puck. (authors)

  16. Comparison of Kayzero for Windows and k0-IAEA software packages for k0 standardization in neutron activation analysis

    Czech Academy of Sciences Publication Activity Database

    Kubešová, Marie; Kučera, Jan

    2011-01-01

Roč. 654, č. 1 (2011), s. 206-212 ISSN 0168-9002 R&D Projects: GA ČR GA202/09/0363 Institutional research plan: CEZ:AV0Z10480505 Keywords: neutron activation analysis * k0 standardization * Kayzero for Windows program * k0-IAEA program Subject RIV: BG - Nuclear, Atomic and Molecular Physics, Colliders Impact factor: 1.207, year: 2011

  17. Implementation and Testing of the JANUS Standard with SSC Pacific’s Software-Defined Acoustic Modem

    Science.gov (United States)

    2017-10-01

    JANUS limit) was typed from the keyboard by the user. Next, the transmitter bash script was run and the user typed the ASCII text of choice, or the...FDECO Forward Deployed Energy and Communications Outpost CONOPS concept of operation DSP digital signal processor FPGA field programmable gate array...deployed by U.S. and international military and civilian organizations have for decades operated without any type of widely adopted standards or

  18. Comparison of ultraviolet A light protection standards in the United States and European Union through in vitro measurements of commercially available sunscreens.

    Science.gov (United States)

    Wang, Steven Q; Xu, Haoming; Stanfield, Joseph W; Osterwalder, Uli; Herzog, Bernd

    2017-07-01

    The importance of adequate ultraviolet A light (UVA) protection has become apparent in recent years. The United States and Europe have different standards for assessing UVA protection in sunscreen products. We sought to measure the in vitro critical wavelength (CW) and UVA protection factor (PF) of commercially available US sunscreen products and see if they meet standards set by the United States and the European Union. Twenty sunscreen products with sun protection factors ranging from 15 to 100+ were analyzed. Two in vitro UVA protection tests were conducted in accordance with the 2011 US Food and Drug Administration final rule and the 2012 International Organization for Standardization method for sunscreen effectiveness testing. The CW of the tested sunscreens ranged from 367 to 382 nm, and the UVA PF of the products ranged from 6.1 to 32. Nineteen of 20 sunscreens (95%) met the US requirement of CW >370 nm. Eleven of 20 sunscreens (55%) met the EU desired ratio of UVA PF/SPF > 1:3. The study only evaluated a small number of sunscreen products. The majority of tested sunscreens offered adequate UVA protection according to US Food and Drug Administration guidelines for broad-spectrum status, but almost half of the sunscreens tested did not pass standards set in the European Union. Copyright © 2017. Published by Elsevier Inc.
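The critical wavelength behind the US broad-spectrum criterion is the wavelength at which the cumulative absorbance from 290 nm first reaches 90% of the total integrated absorbance over 290-400 nm. A sketch of that calculation on a hypothetical flat absorbance spectrum (real test spectra come from plate-based transmittance measurements):

```python
def critical_wavelength(wavelengths, absorbance):
    """Smallest wavelength at which cumulative absorbance from 290 nm reaches
    90% of the total integrated absorbance over 290-400 nm (trapezoid rule)."""
    cumulative = [0.0]
    for i in range(1, len(wavelengths)):
        step = wavelengths[i] - wavelengths[i - 1]
        cumulative.append(cumulative[-1]
                          + 0.5 * (absorbance[i] + absorbance[i - 1]) * step)
    target = 0.9 * cumulative[-1]
    for wl, cum in zip(wavelengths, cumulative):
        if cum >= target:
            return wl
    return wavelengths[-1]

wl = [290.0 + i for i in range(111)]          # 1-nm grid, 290-400 nm
cw = critical_wavelength(wl, [1.0] * 111)     # flat absorber: ~389 nm
```

A flat absorber lands well above the 370 nm threshold, which is why broad, UVA-extended filters pass the US test.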

  19. Sensory characteristics of liquids thickened with commercial thickeners to levels specified in the International Dysphagia Diet Standardization Initiative (IDDSI) framework.

    Science.gov (United States)

    Ong, Jane Jun-Xin; Steele, Catriona M; Duizer, Lisa M

    2018-06-01

Sensory characteristics are important for the acceptance of thickened liquids, but those of liquids thickened to the new standards put forth by the International Dysphagia Diet Standardization Initiative (IDDSI) are unknown. This research sought to identify and rate the perception of important sensory properties of liquids thickened to levels specified in the IDDSI framework. Samples were made with water, with and without added barium sulfate, and were thickened with a cornstarch or xanthan gum based thickener. Samples were characterized using projective mapping/ultra-flash profiling to identify important sample attributes, and then with trained descriptive analysis panels to characterize those attributes in non-barium and barium thickened liquids. Three main groups of attributes were observed. Taste and flavor attributes decreased in intensity with increasing thickener. Thickener specific attributes included graininess and chalkiness for the cornstarch thickened samples, and slipperiness for the xanthan gum samples. Within the same type of thickener, ratings of thickness-related attributes (perceived viscosity, adhesiveness, manipulation, and swallowing) at different IDDSI levels were significantly different from each other. However, in non-barium samples, cornstarch samples were perceived as thicker than xanthan gum samples even though they had similar apparent viscosities at 50 s⁻¹. On the other hand, the two thickeners had similar perceived thickness in the barium samples even though the apparent viscosities of cornstarch samples were higher than those of the xanthan gum samples. In conclusion, IDDSI levels can be distinguished based on sensory properties, but these properties may be affected by the type of thickener and medium being thickened.

  20. Evaluation of the botanical origin of commercial dry bee pollen load batches using pollen analysis: a proposal for technical standardization

    Directory of Open Access Journals (Sweden)

    Ortrud M. Barth

    2010-12-01

Full Text Available High quality is required of bee pollen sold commercially. In order to provide the consumer with the best identification of the botanical and floral origin of the product, 25 bee pollen batches were investigated using two techniques of pollen grain preparation. The first identified pollen loads of different colors in two grams of each well-mixed batch; the second identified pollen grains in a pool made of all the pollen loads contained in two grams. The best result was obtained with the latter technique, in which a pollen grain suspension was dropped onto a microscope slide and circa 500 pollen grains were counted per sample. This analysis resulted in the recognition of monofloral and bifloral pollen batches, whereas with the first technique all samples received a heterofloral diagnosis.

  1. Hospital Management Software Development

    OpenAIRE

    sobogunGod, olawale

    2012-01-01

The purpose of this thesis was to implement hospital management software suitable for small private hospitals in Nigeria, especially for the ones that use a file-based system for storing information rather than having it stored in a more efficient and safer environment such as a database or spreadsheet software. The software developed within this thesis project was specifically designed for the Rainbow specialist hospital which is based in Lagos, the commercial neurological cente...

  2. Comparison between In-house developed and Diamond commercial software for patient specific independent monitor unit calculation and verification with heterogeneity corrections.

    Science.gov (United States)

    Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya

    2016-02-01

The study aimed to compare two different monitor unit (MU) or dose verification software packages for volumetric modulated arc therapy (VMAT), using a modified Clarkson's integration technique for 6 MV photon beams. An in-house Excel spreadsheet-based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as a secondary check to verify the MU or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head and neck, 39 thorax and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS 4D phantom and also measured using the 729-chamber detector array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with the isocentre in a region above -350 HU, using heterogeneity corrections, the in-house spreadsheet-based MUVC program and DIAMOND SCS showed good agreement with the TPS. The overall average percentage deviations for all sites were (-0.93% ± 1.59%) and (1.37% ± 2.72%) for the in-house spreadsheet-based MUVC program and DIAMOND SCS, respectively. The 26 clinically approved VMAT plans with the isocentre in a region below -350 HU showed higher variations for both the in-house spreadsheet-based MUVC program and DIAMOND SCS. It can be concluded that for patient-specific quality assurance (QA), the in-house Excel spreadsheet-based MUVC program and DIAMOND SCS can be used as a simple and fast complement to measurement-based verification for plans with the isocentre in a region above -350 HU. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
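Independent MU verification of this kind ultimately reduces to comparing the secondary calculation against the TPS value and flagging plans whose deviation exceeds an action level. A minimal sketch (the function names and the 3% tolerance are illustrative assumptions, not the paper's code):

```python
def percent_deviation(mu_tps, mu_check):
    """Percentage deviation of an independent MU check from the TPS value."""
    return 100.0 * (mu_check - mu_tps) / mu_tps

def verify_plan(mu_tps, mu_check, tolerance=3.0):
    """Flag a plan whose secondary MU check deviates beyond `tolerance` (%)."""
    dev = percent_deviation(mu_tps, mu_check)
    return abs(dev) <= tolerance, dev

# Illustrative numbers: a 2.5 MU shortfall on a 250 MU field is a -1% deviation.
ok, dev = verify_plan(mu_tps=250.0, mu_check=247.5)
```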

  3. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  4. An open, interoperable, transdisciplinary approach to a point cloud data service using OGC standards and open source software.

    Science.gov (United States)

    Steer, Adam; Trenham, Claire; Druken, Kelsey; Evans, Benjamin; Wyborn, Lesley

    2017-04-01

    High resolution point clouds and other topology-free point data sources are widely utilised for research, management and planning activities. A key goal for research and management users is making these data and common derivatives available in a way which is seamlessly interoperable with other observed and modelled data. The Australian National Computational Infrastructure (NCI) stores point data from a range of disciplines, including terrestrial and airborne LiDAR surveys, 3D photogrammetry, airborne and ground-based geophysical observations, bathymetric observations and 4D marine tracers. These data are stored alongside a significant store of Earth systems data including climate and weather, ecology, hydrology, geoscience and satellite observations, and available from NCI's National Environmental Research Data Interoperability Platform (NERDIP) [1]. Because of the NERDIP requirement for interoperability with gridded datasets, the data models required to store these data may not conform to the LAS/LAZ format - the widely accepted community standard for point data storage and transfer. The goal for NCI is making point data discoverable, accessible and useable in ways which allow seamless integration with earth observation datasets and model outputs - in turn assisting researchers and decision-makers in the often-convoluted process of handling and analyzing massive point datasets. With a use-case of providing a web data service and supporting a derived product workflow, NCI has implemented and tested a web-based point cloud service using the Open Geospatial Consortium (OGC) Web Processing Service [2] as a transaction handler between a web-based client and server-side computing tools based on a native Linux operating system. Using this model, the underlying toolset for driving a data service is flexible and can take advantage of NCI's highly scalable research cloud. Present work focusses on the Point Data Abstraction Library (PDAL) [3] as a logical choice for
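The service pattern described (a WPS transaction handler driving server-side tools such as PDAL) ultimately assembles and executes a PDAL pipeline per request. A hedged sketch of building such a pipeline specification in Python (file names and crop bounds are placeholders; actually executing the pipeline requires the PDAL library, which is not shown):

```python
import json

def build_crop_pipeline(input_laz, output_laz, bounds):
    """Assemble a PDAL pipeline spec that reads a LAZ file, crops it to a
    bounding box, and writes the compressed result.  Stage names follow
    PDAL's documented readers/filters/writers conventions."""
    spec = {
        "pipeline": [
            {"type": "readers.las", "filename": input_laz},
            {"type": "filters.crop", "bounds": bounds},
            {"type": "writers.las", "filename": output_laz,
             "compression": "laszip"},
        ]
    }
    return json.dumps(spec, indent=2)

# Placeholder paths and bounds for illustration only.
pipeline_json = build_crop_pipeline(
    "survey.laz", "subset.laz", "([440000, 441000], [6200000, 6201000])")
```

A WPS endpoint would accept the request parameters, build a spec like this, and hand it to PDAL for execution on the compute cloud.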

  5. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  6. Implementation of the k0-standardization Method for Instrumental Neutron Activation Analysis: Use of k0-IAEA Software as a Demonstration

    International Nuclear Information System (INIS)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Kim, Hark Rho; Ho, Manh Dung

    2006-03-01

    Under the RCA post-doctoral program, from May 2005 through February 2006, there was an opportunity to review the present work being carried out in the Neutron Activation Analysis Laboratory, HANARO Center, KAERI. The scope of this research included: a calibration of the counting system, a characterization of the irradiation facility, and a validation of the established k0-NAA procedure. The k0-standardization method for Instrumental Neutron Activation Analysis (k0-NAA), which is becoming increasingly popular and widespread, is an absolute calibration technique in which the nuclear data are replaced by compound nuclear constants that are experimentally determined. The k0-IAEA software distributed by the IAEA in 2005 was used as a demonstration for this work. The NAA no. 3 irradiation hole in the HANARO research reactor and the gamma-ray spectrometers no. 1 and 5 in the NAA Laboratory were used
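For context, the concentration equation at the heart of the k0 method is commonly written as follows. This is a sketch reproduced from the general k0 literature (with gold as the comparator), not taken from the report itself:

```latex
\rho_a \;=\;
\frac{\left( \dfrac{N_p / t_m}{S\,D\,C\,w} \right)_{\!a}}
     {\left( \dfrac{N_p / t_m}{S\,D\,C\,w} \right)_{\!\mathrm{Au}}}
\cdot \frac{1}{k_0(\mathrm{Au},a)}
\cdot \frac{f + Q_{0,\mathrm{Au}}(\alpha)}{f + Q_{0,a}(\alpha)}
\cdot \frac{\varepsilon_{p,\mathrm{Au}}}{\varepsilon_{p,a}}
```

where N_p is the net peak area, t_m the measuring time, S, D and C the saturation, decay and counting factors, w the sample mass, f the thermal-to-epithermal flux ratio, Q_0(α) the resonance-integral-to-cross-section ratio corrected for the epithermal shape parameter α, and ε_p the full-energy peak detection efficiency. Characterizing the irradiation facility (f, α) and calibrating the counting system (ε_p), as described in the report, supplies exactly the quantities this formula needs.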

  7. Implementation of the k{sub 0}-standardization Method for Instrumental Neutron Activation Analysis: Use of k{sub 0}-IAEA Software as a Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Kim, Hark Rho [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Ho, Manh Dung [Nuclear Research Institute, Dalat (Viet Nam)

    2006-03-15

    Under the RCA post-doctoral program, from May 2005 through February 2006, there was an opportunity to review the present work being carried out in the Neutron Activation Analysis Laboratory, HANARO Center, KAERI. The scope of this research included: a calibration of the counting system, a characterization of the irradiation facility, and a validation of the established k{sub 0}-NAA procedure. The k{sub 0}-standardization method for Instrumental Neutron Activation Analysis (k{sub 0}-NAA), which is becoming increasingly popular and widespread, is an absolute calibration technique in which the nuclear data are replaced by compound nuclear constants that are experimentally determined. The k{sub 0}-IAEA software distributed by the IAEA in 2005 was used as a demonstration for this work. The NAA no. 3 irradiation hole in the HANARO research reactor and the gamma-ray spectrometers no. 1 and 5 in the NAA Laboratory were used.

  8. [Definition and specification requirements for PAC-systems (picture archiving and communication system). A performance index with reference to the standard "IEEE Recommended Practice for Software Requirement Specifications"].

    Science.gov (United States)

    König, H; Klose, K J

    1999-04-01

    The formulation of requirements is necessary to control the goals of a PACS project. Furthermore, in this way, the scope of functionality necessary to support radiological working processes becomes clear. Definitions of requirements and specification are formulated independently of systems according to the IEEE standard "Recommended Practice for Software Requirements Specifications". Definitions are given in the Request for Information, specifications in the Request for Proposal. Functional and non-functional requirements are distinguished. The solutions are rated with respect to scope, appropriateness and quality of implementation. A PACS checklist was created according to the methods described above. It is published on the homepage of the "Arbeitsgemeinschaft Informationstechnologie" (AGIT) within the "Deutsche Röntgengesellschaft" (DRG) (http://www.uni-marburg.de/mzr/agit). The checklist provides a discussion forum which should contribute to an agreement on accepted basic PACS functionalities.

  9. Application of standard software of the CDC-6500 interactive graphics terminal for representation of spiral scanning data and event saving

    International Nuclear Information System (INIS)

    Nehrguj, B.; Ososkov, G.A.

    1978-01-01

    A system of programs, based on standard graphic display software, was developed which enables the user to display the results of spiral scanning using a terminal keyboard and a FILTR program. Quality assessment of the filtering is also available. The use of a cursor, which provides feedback between the display and the CDC-6500 computer, gives good capabilities for investigating filtering program failures and for saving the most interesting events. To speed up the scanning of events, a special program was written which performs pre-filtration and reduces the amount of source numerical data of track projections 4- to 5-fold. Its flowchart is based on the well-known method of chords, which allows 10-12 events/hour to be saved
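A chord-based pre-filtration of digitized track points can be illustrated by a simple thinning rule: drop a point when it lies within a tolerance of the chord joining its retained neighbours. This is an illustrative reconstruction under that assumption, not the report's actual algorithm:

```python
def chord_thin(points, tol):
    """Thin a sequence of (x, y) track points: walking along the polyline,
    drop each interior point that lies within `tol` of the chord joining
    the last kept point to the candidate's successor.  Illustrative only;
    achieves the kind of 4- to 5-fold data reduction described above when
    tracks are locally straight."""
    if len(points) <= 2:
        return list(points)
    kept = [points[0]]
    anchor = 0
    for i in range(1, len(points) - 1):
        x0, y0 = points[anchor]          # last kept point
        x1, y1 = points[i + 1]           # candidate's successor
        xp, yp = points[i]               # candidate point
        dx, dy = x1 - x0, y1 - y0
        # Perpendicular distance of the candidate from the chord.
        dist = abs(dy * (xp - x0) - dx * (yp - y0)) / max(
            (dx * dx + dy * dy) ** 0.5, 1e-12)
        if dist > tol:
            kept.append(points[i])
            anchor = i
    kept.append(points[-1])
    return kept
```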

  10. Computer software configuration management

    International Nuclear Information System (INIS)

    Pelletier, G.

    1987-08-01

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  11. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  12. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  13. Software To Go: A Catalog of Software Available for Loan.

    Science.gov (United States)

    Kurlychek, Ken, Comp.

    This catalog lists the holdings of the Software To Go software lending library and clearinghouse for programs and agencies serving students or clients who are deaf or hard of hearing. An introduction describes the clearinghouse and its collection of software, much of it commercial and copyrighted material, for Apple, Macintosh, and IBM (MS-DOS)…

  14. Usability in open source software development

    DEFF Research Database (Denmark)

    Andreasen, M. S.; Nielsen, H. V.; Schrøder, S. O.

    2006-01-01

    Open Source Software (OSS) development has gained significant importance in the production of software products. Open Source Software developers have produced systems with a functionality that is competitive with similar proprietary software developed by commercial software organizations. Yet OSS...

  15. Comparison of Protein Value of Commercial Baby Food with Homemade Baby Food and Casein Standard in Rats as the Reference Point

    Directory of Open Access Journals (Sweden)

    Z. Asemi

    2008-10-01

    Full Text Available Background and Objectives: Evaluation of protein quality in food is of great importance due to the biological and economic impacts of food proteins. This study was conducted with the aim of comparing the protein quality of homemade food (a mixture of macaroni and soy bean) with commercial baby food (Cerelac Wheat), using casein as the reference point. Methods: This study was conducted on 64 twenty-one-day-old male Wistar rats. The rats were divided into 8 groups, and each group was put on a different diet regimen. The diet regimens were as follows: 2 homemade food + Cerelac test diets, 1 casein + methionine standard diet, 1 protein-free basal diet, 2 test diets, 1 standard diet and 1 basal diet. The purpose of the protein-free diet was to evaluate True Protein Digestibility (TPD). Net Protein Ratio (NPR) and Protein Efficiency Ratio (PER) were investigated using the basal diet. Protein intake and weight gain were determined for calculating NPR and PER. Nitrogen intake and fecal nitrogen were determined to calculate TPD. Comparisons of TPD, NPR and PER among the groups were analyzed by ANOVA and Tukey methods. Results: TPD values of the standard, Cerelac and homemade food diets were 92.8±4, 87±8 and 85.4±3.2; NPR values were 4.3±0.4, 4.3±0.9 and 3.8±0.6; and PER values were 3±0.2, 2.5±0.4 and 1.7±0.1, respectively. The differences in TPD and PER values were statistically significant (p < 0.05), whereas NPR differences were insignificant (p > 0.05). Conclusion: These results show that the TPD and PER of homemade foods are lower than those of Cerelac, while their NPR is acceptable. Keywords: Protein; Cerelac; Macaroni; Soybeans.
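The three indices used in this kind of study have standard definitions in the rat bioassay literature. As a hedged sketch (the formulas follow standard protein-quality methodology; the numeric example is invented):

```python
def per(weight_gain_g, protein_intake_g):
    """Protein Efficiency Ratio: grams of weight gained per gram of
    protein consumed over the trial."""
    return weight_gain_g / protein_intake_g

def npr(weight_gain_g, weight_loss_protein_free_g, protein_intake_g):
    """Net Protein Ratio: credits the test protein with preventing the
    weight loss observed in the protein-free control group."""
    return (weight_gain_g + weight_loss_protein_free_g) / protein_intake_g

def tpd(n_intake_g, fecal_n_g, metabolic_fecal_n_g):
    """True Protein Digestibility (%): corrects fecal nitrogen for the
    metabolic (endogenous) fraction measured on the protein-free diet."""
    return 100.0 * (n_intake_g - (fecal_n_g - metabolic_fecal_n_g)) / n_intake_g
```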

  16. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures and has significantly improved the efficiency and standardization of the hazard analysis process.

  17. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements along with a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase its quality throughout the software life cycle. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce it, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.
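As a concrete illustration of the kind of product and process metrics discussed, a minimal sketch (the metric choices and names are illustrative, not drawn from the report):

```python
def defect_density(defects_found, kloc):
    """Defects per thousand lines of code (KLOC), a common SQA
    product metric used to compare releases or components."""
    return defects_found / kloc

def phase_containment(found_in_phase, found_later):
    """Fraction of defects caught in the phase that introduced them,
    a process metric: higher is better, since late-found defects
    cost more to fix."""
    total = found_in_phase + found_later
    return found_in_phase / total if total else 1.0
```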

  18. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. * Includes supplementary website with an instructor's guide and solutions * Applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI) * Illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors

  19. Open source software and libraries

    OpenAIRE

    Randhawa, Sukhwinder

    2008-01-01

    Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not need the initial cost of commercial software and enables libraries to have greater control over their working environment. Library professionals should be aware of the advantages of open source software and should be involved in its development. They should have basic knowledge about the selection, installation and main...

  20. 48 CFR 212.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other non...

  1. Comparison of Protein Value of Commercial Baby Food with Homemade Baby Food and Casein Standard in Rats as the Reference Point

    Directory of Open Access Journals (Sweden)

    Z Asemi

    2012-05-01

    Full Text Available

    Background and Objectives

    Evaluation of protein quality in food is of great importance due to the biological and economic impacts of food proteins. This study was conducted with the aim of comparing the protein quality of homemade food (a mixture of macaroni and soy bean) with commercial baby food (Cerelac Wheat), using casein as the reference point.

     

    Methods

    This study was conducted on 64 twenty-one-day-old male Wistar rats. The rats were divided into 8 groups, and each group was put on a different diet regimen. The diet regimens were as follows: 2 homemade food + Cerelac test diets, 1 casein + methionine standard diet, 1 protein-free basal diet, 2 test diets, 1 standard diet and 1 basal diet. The purpose of the protein-free diet was to evaluate True Protein Digestibility (TPD). Net Protein Ratio (NPR) and Protein Efficiency Ratio (PER) were investigated using the basal diet. Protein intake and weight gain were determined for calculating NPR and PER. Nitrogen intake and fecal nitrogen were determined to calculate TPD. Comparisons of TPD, NPR and PER among the groups were analyzed by ANOVA and Tukey methods.

     

    Results

    TPD values of the standard, Cerelac and homemade food diets were 92.8±4, 87±8 and 85.4±3.2; NPR values were 4.3±0.4, 4.3±0.9 and 3.8±0.6; and PER values were 3±0.2, 2.5±0.4 and 1.7±0.1, respectively. The differences in TPD and PER values were statistically significant (p < 0.05), whereas NPR differences were insignificant (p > 0.05).

     

    Conclusion

    These results show that the TPD and PER of homemade foods are lower than those of Cerelac, while their NPR is acceptable.

  2. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop

    1997-07-01

    This technical report presents a V and V guideline development methodology for safety-critical software in NPP safety systems. It provides a V and V guideline for the planning phase of the NPP safety system, in addition to critical safety items, for example: the independence philosophy, the software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, drawing on the concepts of the existing industrial standards IEEE Std-1012 and IEEE Std-1059. This technical report covers the scope of the V and V guideline; the guideline framework as part of the acceptance criteria; V and V activities and task entrance as part of the V and V activity and exit criteria; review and audit; testing and QA records of V and V material and configuration management; software verification and validation plan production, etc.; and safety-critical software V and V methodology. (author). 11 refs.

  3. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop.

    1997-07-01

    This technical report presents a V and V guideline development methodology for safety-critical software in NPP safety systems. It provides a V and V guideline for the planning phase of the NPP safety system, in addition to critical safety items, for example: the independence philosophy, the software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, drawing on the concepts of the existing industrial standards IEEE Std-1012 and IEEE Std-1059. This technical report covers the scope of the V and V guideline; the guideline framework as part of the acceptance criteria; V and V activities and task entrance as part of the V and V activity and exit criteria; review and audit; testing and QA records of V and V material and configuration management; software verification and validation plan production, etc.; and safety-critical software V and V methodology. (author). 11 refs.

  4. FASTBUS software status

    International Nuclear Information System (INIS)

    Gustavson, D.B.

    1980-10-01

    Computer software will be needed in addition to the mechanical, electrical, protocol and timing specifications of the FASTBUS, in order to facilitate the use of this flexible new multiprocessor and multisegment data acquisition and processing system. Software considerations have been important in the FASTBUS design, but standard subroutines and recommended algorithms will be needed as the FASTBUS comes into use. This paper summarizes current FASTBUS software projects, goals and status

  5. Freeware Versus Commercial Office Productivity Software

    Science.gov (United States)

    2016-12-01

    STRENGTHS, WEAKNESSES, OPPORTUNITIES, THREATS (SWOT) ANALYSIS: Comparing multiple categories in a product selection process helps the DOD choose the correct productivity suite...DOD (Taylor, 2016). Because SWOT identifies both internal and external forces, the DOD's product selection process needs to consider both. Once...search on alternate third-party apps for things like hotels and restaurants instead of going to Google's website. Google does not get paid if the users

  6. Engineering high quality medical software

    CERN Document Server

    Coronato, Antonio

    2018-01-01

    This book focuses on high-confidence medical software in the growing field of e-health, telecare services and health technology. It covers the development of methodologies and engineering tasks together with standards and regulations for medical software.

  7. Report on the Observance of Standards and Codes, Accounting and Auditing : Module B - Institutional Framework for Corporate Financial Reporting, B.1 Commercial Enterprises (including SMEs)

    OpenAIRE

    World Bank

    2017-01-01

    The purpose of this report is to gain an understanding of the general financial reporting and audit requirements for commercial enterprises in a jurisdiction as established by law or other regulation (for example, companies’ act). Commercial enterprises are defined as companies established with a profit-making objective that do not issue equity and debt on a public exchange, are not financ...

  8. Commercialization of NESSUS: Status

    Science.gov (United States)

    Thacker, Ben H.; Millwater, Harry R.

    1991-01-01

    A plan was initiated in 1988 to commercialize the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) probabilistic structural analysis software. The goal of the ongoing commercialization effort is to begin the transfer of Probabilistic Structural Analysis Method (PSAM) developed technology into industry and to develop additional funding resources in the general area of structural reliability. The commercialization effort is summarized. The SwRI NESSUS Software System is a general-purpose probabilistic finite element computer program using state-of-the-art methods for predicting stochastic structural response due to random loads, material properties, part geometry, and boundary conditions. NESSUS can be used to assess structural reliability, to compute probability of failure, to rank the input random variables by importance, and to provide a more cost-effective design than traditional methods. The goal is to develop a general probabilistic structural analysis methodology to assist in the certification of critical components in the next-generation Space Shuttle Main Engine.

  9. SEER Data & Software

    Science.gov (United States)

    Options for accessing datasets for incidence, mortality, county populations, standard populations, expected survival, and SEER-linked and specialized data. Plus variable definitions, documentation for reporting and using datasets, statistical software (SEER*Stat), and observational research resources.

  10. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  11. Professional Issues In Software Engineering

    CERN Document Server

    Bott, Frank; Eaton, Jack; Rowland, Diane

    2000-01-01

    A comprehensive text covering all the issues that software engineers now have to take into account apart from the technical side of things. Includes information on the legal, professional and commercial context in which they work.

  12. Impacts of Reinsurance Operations on Significant Items of the Financial Statements of Commercial Insurance Companies According to Czech Accounting Legislation and International Accounting Standards

    OpenAIRE

    Jana Gláserová; Eva Vávrová

    2015-01-01

    The principal aim of the paper is to determine the impact of reinsurance operations in commercial insurance companies, in accordance with the relevant accounting legislation, on certain significant items of the financial statements. Reinsurance operations in fact affect the profit of a commercial insurance company as reported in the financial statements. The prerequisite for fulfilling the objective of the paper is to analyse the accounting legislation for reinsurance operations in c...

  13. An Empirical Study of a Free Software Company

    OpenAIRE

    Pakusch, Cato

    2010-01-01

    Free software has matured well into the commercial software market, yet little qualitative research exists that accurately describes the state of commercial free software today. For this thesis, an instrumental case study was performed on a prominent free software company in Norway. The study found that the commercial free software market is largely driven by social networks, which carry a social capital of their own that attracts more people, who in turn become members of the ...

  14. Software Formal Inspections Guidebook

    Science.gov (United States)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  15. Impacts of Reinsurance Operations on Significant Items of the Financial Statements of Commercial Insurance Companies According to Czech Accounting Legislation and International Accounting Standards

    Directory of Open Access Journals (Sweden)

    Jana Gláserová

    2015-01-01

    Full Text Available The principal aim of the paper is to determine the impact of reinsurance operations in commercial insurance companies, in accordance with the relevant accounting legislation, on certain significant items of the financial statements. Reinsurance operations in fact affect the profit of a commercial insurance company as reported in the financial statements. The prerequisite for fulfilling the objective of the paper is to analyse the accounting legislation for reinsurance operations in commercial insurance companies. Attention is devoted also to the method of accounting for reinsurance operations and their specific reporting in various parts of the financial statements of commercial insurance companies. The partial aim of this paper is to identify significant differences in the area of accounting of commercial insurance companies, based on a comparison of the accounting practices for the issues examined in accordance with IAS/IFRS. In the conclusion, the authors address the latest developments in the steps necessary for adopting the concept of IFRS 4 Phase II and accomplishing the process of applying IFRS 4 Phase II to the accounts of commercial insurance companies.

  16. Calibration of GafChromic XR-RV3 radiochromic film for skin dose measurement using standardized x-ray spectra and a commercial flatbed scanner

    International Nuclear Information System (INIS)

    McCabe, Bradley P.; Speidel, Michael A.; Pike, Tina L.; Van Lysel, Michael S.

    2011-01-01

    Purpose: In this study, newly formulated XR-RV3 GafChromic film was calibrated with National Institute of Standards and Technology (NIST) traceability for measurement of patient skin dose during fluoroscopically guided interventional procedures. Methods: The film was calibrated free-in-air to air kerma levels between 15 and 1100 cGy using four moderately filtered x-ray beam qualities (60, 80, 100, and 120 kVp). The calibration films were scanned with a commercial flatbed document scanner. Film reflective density-to-air kerma calibration curves were constructed for each beam quality, with both the orange and white sides facing the x-ray source. A method to correct for nonuniformity in scanner response (up to 25% depending on position) was developed to enable dose measurement with large films. The response of XR-RV3 film under patient backscattering conditions was examined using on-phantom film exposures and Monte Carlo simulations. Results: The response of XR-RV3 film to a given air kerma depended on kVp and film orientation. For a 200 cGy air kerma exposure with the orange side of the film facing the source, the film response increased by 20% from 60 to 120 kVp. At 500 cGy, the increase was 12%. When 500 cGy exposures were performed with the white side facing the x-ray source, the film response increased by 4.0% (60 kVp) to 9.9% (120 kVp) compared to the orange-facing orientation. On-phantom film measurements and Monte Carlo simulations show that using a NIST-traceable free-in-air calibration curve to determine air kerma in the presence of backscatter results in an error from 2% up to 8% depending on beam quality. The combined uncertainty in the air kerma measurement from the calibration curves and scanner nonuniformity correction was ±7.1% (95% C.I.). The film showed notable stability: calibrations of film and scanner separated by 1 yr differed by 1.0%. Conclusions: XR-RV3 radiochromic film response to a given air kerma shows dependence on beam quality and film
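
    The density-to-air-kerma calibration curve construction described above can be sketched as a simple polynomial fit; the calibration points below are invented for illustration and are not the paper's measured XR-RV3 data:

```python
import numpy as np

# Illustrative calibration points (net reflective density vs. air kerma in cGy).
air_kerma = np.array([15, 50, 100, 200, 500, 1100], dtype=float)
net_density = np.array([0.05, 0.16, 0.30, 0.52, 0.95, 1.40])

# Fit kerma as a smooth function of density; a low-order polynomial is one
# common choice for radiochromic film calibration curves.
coeffs = np.polyfit(net_density, air_kerma, deg=3)

def density_to_kerma(d):
    """Convert a measured net reflective density to air kerma (cGy)."""
    return float(np.polyval(coeffs, d))
```

    In practice a separate curve is needed per beam quality and film orientation, as the abstract notes, and large-film measurements additionally apply the scanner nonuniformity correction before the density lookup.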

  17. Improving Software Sustainability: Lessons Learned from Profiles in Science.

    Science.gov (United States)

    Gallagher, Marie E

    2013-01-01

    The Profiles in Science® digital library features digitized surrogates of historical items selected from the archival collections of the U.S. National Library of Medicine as well as collaborating institutions. In addition, it contains a database of descriptive, technical and administrative metadata. It also contains various software components that allow creation of the metadata, management of the digital items, and access to the items and metadata through the Profiles in Science Web site [1]. The choices made building the digital library were designed to maximize the sustainability and long-term survival of all of the components of the digital library [2]. For example, selecting standard and open digital file formats rather than proprietary formats increases the sustainability of the digital files [3]. Correspondingly, using non-proprietary software may improve the sustainability of the software--either through in-house expertise or through the open source community. Limiting our digital library software exclusively to open source software or to software developed in-house has not been feasible. For example, we have used proprietary operating systems, scanning software, a search engine, and office productivity software. We did this when either lack of essential capabilities or the cost-benefit trade-off favored using proprietary software. We also did so knowing that in the future we would need to replace or upgrade some of our proprietary software, analogous to migrating from an obsolete digital file format to a new format as the technological landscape changes. Since our digital library's start in 1998, all of its software has been upgraded or replaced, but the digitized items have not yet required migration to other formats. 
Technological changes that compelled us to replace proprietary software included the cost of product licensing, product support, incompatibility with other software, prohibited use due to evolving security policies, and product abandonment.

  18. Development of a methodology and software for analysis of energy and economic feasibility of introducing natural gas facilities in residential an commercial sector; Desenvolvimento de metodologia e de software para analise de viabilidade energetica e economica da introducao de instalacoes para gas natural no setor residencial e comercial

    Energy Technology Data Exchange (ETDEWEB)

    Jesus, Marcos Fabio de; Torres, Ednildo Andrade [Universidade Federal da Bahia (UFBA), Salvador, BA (Brazil). Escola Politecnica. Lab. de Energia e Gas; Santos, Carlos Antonio Cabral dos [Universidade Federal da Paraiba (UFPB), Joao Pessoa, PB (Brazil). Lab. de Energia Solar; Campos, Michel Fabianski [PETROBRAS, Rio de Janeiro, RJ (Brazil). RedeGasEnergia

    2004-07-01

    With the growing share of natural gas in the national and world energy matrices, and the constant search for alternative energy sources with acceptable environmental behavior, studies that enable the expanded use of this fuel in the various energy sectors (industrial, commercial, residential, and vehicular, among others) become ever more necessary. Of these sectors, the residential one most needs innovation and technological adaptation in order to take a substantial share of natural gas demand. The objective of this work is to establish a suitable methodology for analysing the energy and economic viability of introducing natural gas installations in the residential and commercial sectors, and to implement a software tool that facilitates decision making, from the floor plan of the building through the choice of suitable materials for installing the piping, while demonstrating the technical and economic viability of using natural gas to supply all of the building's energy needs, or to supply them jointly with electricity or LPG. The methodology rests mainly on the first and second laws of thermodynamics, together with the Brazilian technical standards that govern this sector of civil construction, taking into account the fixed and variable costs of the installation and the forms of energy involved. Based on the literature, the introduction of natural gas installations in the residential and commercial sectors is expected to prove technically and economically viable, thereby increasing demand for this fuel and, consequently, its share in the national energy matrix. (author)
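
    The kind of economic feasibility comparison such a tool supports can be sketched as a simple payback calculation; the efficiencies, prices, and capital cost below are placeholder assumptions, not tariffs or costs from the study:

```python
def simple_payback_years(capex, annual_energy_kwh, price_current, price_gas,
                         efficiency_current=1.0, efficiency_gas=0.85):
    """Years to recover the gas-installation investment from fuel savings.

    Prices are per kWh of delivered energy; the demand is expressed as
    useful energy, so each fuel's consumption is demand / efficiency.
    """
    cost_current = annual_energy_kwh / efficiency_current * price_current
    cost_gas = annual_energy_kwh / efficiency_gas * price_gas
    savings = cost_current - cost_gas
    if savings <= 0:
        return float("inf")  # switching never pays back
    return capex / savings
```

    A full analysis would add discounting (NPV) and the exergy considerations implied by the second-law basis of the methodology, but the structure of the comparison is the same.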

  19. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available With the enactment of the Intellectual Property Rights Law (HAKI), a new alternative has emerged: using open source software. The use of open source software is spreading along with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software that is unlicensed. Not all of these notions are correct, so the concept of open source software needs to be properly introduced, starting with its history, its licenses and how to choose among them, and the considerations involved in selecting from the open source software that is available. Keywords: license, open source, HAKI

  20. Software piracy: A study of causes, effects and preventive measures

    OpenAIRE

    Khadka, Ishwor

    2015-01-01

    Software piracy is a serious issue that has been affecting software companies for decades. According to the Business Software Alliance (BSA), the global software piracy rate in 2013 was 43 percent and the commercial value of unlicensed software installations was $62.7 billion, resulting in millions in lost revenue and jobs at software companies. The goal of this study was to better understand software piracy behaviours: how it happens, and how it affects individuals and software compani...

  1. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion of failure rate in software reliability growth models

  2. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  3. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    Full Text Available In this paper, the definition and the key features of Software Radar, a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar, and Intelligent Radar, and the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open system architecture that decouples application software from low-level hardware can easily adopt a "user requirements-oriented" development methodology instead of the traditional "specific function-oriented" one. Compared with traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, which has been developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are given in the conclusion.
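
    The decoupling of application software from standardized hardware components can be sketched as a minimal plug-in processing chain; this is a hypothetical interface for illustration, not the RadarLab 2.0 design:

```python
from abc import ABC, abstractmethod

class ProcessingStage(ABC):
    """A hardware-independent signal-processing stage (hypothetical interface).

    Application code composes stages without knowing which hardware module
    or implementation sits behind each one.
    """
    @abstractmethod
    def process(self, samples):
        ...

class Normalize(ProcessingStage):
    def process(self, samples):
        # Scale so the largest magnitude becomes 1.0 (guard against all-zero).
        peak = max(abs(s) for s in samples) or 1.0
        return [s / peak for s in samples]

class Threshold(ProcessingStage):
    def __init__(self, level):
        self.level = level
    def process(self, samples):
        # Keep only samples exceeding the detection threshold.
        return [s for s in samples if abs(s) >= self.level]

def run_chain(stages, samples):
    """Run samples through a reconfigurable chain of stages in order."""
    for stage in stages:
        samples = stage.process(samples)
    return samples
```

    Reconfiguring the radar then amounts to assembling a different list of stages, which is the "user requirements-oriented" flexibility the paper argues for.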

  4. Enhancing requirements engineering for patient registry software systems with evidence-based components.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2017-07-01

    Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches to the selection of software, as well as to the construction of proprietary systems, are needed. We propose an evidence-based checklist summarizing essential items for patient registry software systems (CIPROS) to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) a systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates, and standards shows a broad consensus, but differences on registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions.

  5. Standardization Documents

    Science.gov (United States)

    2011-08-01

    Specifications and Standards; Guide Specifications; Commercial Item Descriptions (CIDs); and Non-Government Standards (NGSs). Federal Specifications; Commercial ... a national or international standardization document developed by a private sector association, organization, or technical society that plans ... Defense Handbook: maintains lessons learned; examples include guidance for the application of a technology and lists of options.

  6. Gammasphere software development. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information.

  7. jMRUI plugin software (jMRUI2XML) to allow automated MRS processing and XML-based standardized output

    Czech Academy of Sciences Publication Activity Database

    Mocioiu, V.; Ortega-Martorell, S.; Olier, I.; Jabłoński, Michal; Starčuková, Jana; Lisboa, P.; Arús, C.; Julia-Sapé, M.

    2015-01-01

    Roč. 28, S1 (2015), S518 ISSN 0968-5243. [ESMRMB 2015. Annual Scientific Meeting /32./. 01.09.2015-03.09.2015, Edinburgh] Institutional support: RVO:68081731 Keywords : MR Spectroscopy * signal processing * jMRUI * software development * XML Subject RIV: BH - Optics, Masers, Lasers

  8. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  9. Software product quality measurement

    OpenAIRE

    Godliauskas, Eimantas

    2016-01-01

    This paper analyses Ruby product quality measures, suggesting three new measures for the Ruby product quality measurement tool Rubocop to measure the Ruby product quality characteristics defined in the ISO 2502n standard series. The paper consists of four main chapters. The first chapter gives a brief view of software product quality and software product quality measurement. The second chapter analyses object-oriented quality measures. The third chapter gives a brief view of the most popular Ruby qualit...

  10. Software System for the Calibration of X-Ray Measuring Instruments

    International Nuclear Information System (INIS)

    Gaytan-Gallardo, E.; Tovar-Munoz, V. M.; Cruz-Estrada, P.; Vergara-Martinez, F. J.; Rivero-Gutierrez, T.

    2006-01-01

    A software system that facilitates the calibration of X-ray measuring instruments used in medical applications is presented. The Secondary Standard Dosimetry Laboratory (SSDL) of the Nuclear Research National Institute in Mexico (ININ in Spanish) supports activities concerning ionizing radiation in the medical area. One of these activities is the calibration of X-ray measuring instruments, in terms of air kerma or exposure, by the substitution method in an X-ray beam at a point where the rate has been determined by means of a standard ionization chamber. To automate this process, a software system has been developed; the calibration system is composed of an X-ray unit, a Dynalizer IIIU X-ray meter by RADCAL, a commercial data acquisition card, the software system, and the units to be tested and calibrated. A quality control plan has been applied in the development of the software system, ensuring that quality assurance procedures and standards are being followed.
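
    The substitution method reduces to a simple ratio: the test instrument's reading at a point is compared with the air-kerma rate established at the same point by the standard chamber. A minimal sketch, with hypothetical readings:

```python
def calibration_factor(standard_kerma_rate, instrument_reading):
    """Substitution method: the standard chamber establishes the air-kerma
    rate at a point in the beam; the factor converts the test instrument's
    reading at that same point into the reference quantity."""
    return standard_kerma_rate / instrument_reading

# Hypothetical readings at the same point in the X-ray beam
N = calibration_factor(standard_kerma_rate=10.0, instrument_reading=9.6)
# Corrected reading for a future measurement m is then N * m.
```

    The automation in the described system lies in driving the X-ray unit and the meter, logging both readings, and computing such factors under the quality control plan.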

  11. Criteria for software modularization

    Science.gov (United States)

    Card, David N.; Page, Gerald T.; Mcgarry, Frank E.

    1985-01-01

    A central issue in programming practice involves determining the appropriate size and information content of a software module. This study attempted to determine the effectiveness of two widely used criteria for software modularization, strength and size, in reducing fault rate and development cost. Data from 453 FORTRAN modules developed by professional programmers were analyzed. The results indicated that module strength is a good criterion with respect to fault rate, whereas arbitrary module size limitations inhibit programmer productivity. This analysis is a first step toward defining empirically based standards for software modularization.
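
    An analysis like the one above, comparing fault rates across module categories, can be sketched as follows; the module data are invented for illustration, not the 453-module FORTRAN dataset from the study:

```python
from collections import defaultdict

def fault_rate_by_strength(modules):
    """Group modules by rated strength and compare faults per KLOC.

    `modules` is a list of (strength, faults, lines_of_code) tuples.
    """
    faults = defaultdict(int)
    loc = defaultdict(int)
    for strength, n_faults, n_loc in modules:
        faults[strength] += n_faults
        loc[strength] += n_loc
    # Faults per thousand lines of code, per strength category
    return {s: 1000.0 * faults[s] / loc[s] for s in faults}

sample = [("high", 1, 400), ("high", 0, 250), ("low", 6, 500), ("low", 4, 300)]
rates = fault_rate_by_strength(sample)
```

    The study's conclusion corresponds to the "high" category showing a lower fault rate, while no such systematic effect appears when grouping by size alone.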

  12. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality of software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing properly to understand, or design, the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software’s overall testability

  13. Method of V ampersand V for safety-critical software in NPPs

    International Nuclear Information System (INIS)

    Kim, Jang-Yeol; Lee, Jang-Soo; Kwon, Kee-Choon

    1997-01-01

    Safety-critical software is software used in systems in which a failure could affect personal or equipment safety or result in large financial or social loss. Examples of systems using safety-critical software are systems such as plant protection systems in nuclear power plants (NPPs), process control systems in chemical plants, and medical instruments such as the Therac-25 medical accelerator. This paper presents verification and validation (V ampersand V) methodology for safety-critical software in NPP safety systems. In addition, it addresses issues related to NPP safety systems, such as independence parameters, software safety analysis (SSA) concepts, commercial off-the-shelf (COTS) software evaluation criteria, and interrelationships among software and system assurance organizations. It includes the concepts of existing industrial standards on software V ampersand V, Institute of Electrical and Electronics Engineers (IEEE) Standards 1012 and 1059. This safety-critical software V ampersand V methodology covers V ampersand V scope, a regulatory framework as part of its acceptance criteria, V ampersand V activities and task entrance and exit criteria, reviews and audits, testing and quality assurance records of V ampersand V material, configuration management activities related to V ampersand V, and software V ampersand V (SVV) plan (SVVP) production

  14. Evidence synthesis software.

    Science.gov (United States)

    Park, Sophie Elizabeth; Thomas, James

    2018-06-07

    It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways, and different review conditions call for different software solutions: for example, facilitating contemporaneous collaboration across time and geographical space; in-built bias assessment tools; and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer has text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released. The EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription, with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence, using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews.
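
    The relevance-determination step can be illustrated with a toy term-frequency scorer; this is a stand-in for the machine learning and information retrieval pipeline, not EPPI-Reviewer code:

```python
import math
from collections import Counter

def relevance_score(abstract, query_terms):
    """Toy scorer: sum of log-damped counts of the query terms in the
    abstract. Real pipelines use trained classifiers and richer features."""
    words = Counter(abstract.lower().split())
    return sum(math.log1p(words[term.lower()]) for term in query_terms)

# Hypothetical newly published abstracts to triage for a review
papers = [
    "Randomised controlled trial of exercise therapy",
    "A history of medieval trade routes",
]
scores = [relevance_score(p, ["trial", "therapy"]) for p in papers]
```

    In a 'living' review workflow, such scores would rank each day's new publications so reviewers screen the most likely relevant ones first.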

  15. EMMC guidance on quality assurance for academic materials modelling software engineering

    OpenAIRE

    European Materials Modelling Council

    2015-01-01

    Proposed recommendations for software development in LEIT projects. This document presents the advice of software owners, commercial and academic, on what academic software developers could do to produce better-quality software, ready to be used by third parties.

  16. Low-cost approach for a software-defined radio based ground station receiver for CCSDS standard compliant S-band satellite communications

    Science.gov (United States)

    Boettcher, M. A.; Butt, B. M.; Klinkner, S.

    2016-10-01

    A major concern of a university satellite mission is downloading the payload and telemetry data from the satellite. While ground station antennas are in general easy to procure with limited effort, the receiving unit most certainly is not. The flexible, low-cost software-defined radio (SDR) transceiver "BladeRF" is used to receive the QPSK-modulated, CCSDS-compliant coded data of a satellite in the HAM radio S-band. The control software is based on the open-source program GNU Radio, which is also used to perform CCSDS post-processing of the binary bit stream. The test results show good performance of the receiving system.
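
    The QPSK demodulation step inside such a receiver can be sketched in a few lines; this is a minimal hard-decision sketch, not the GNU Radio flowgraph from the paper:

```python
import numpy as np

def qpsk_demod(symbols):
    """Hard-decision, Gray-mapped QPSK demodulation: each complex symbol
    yields two bits from the signs of its I and Q components."""
    bits = []
    for s in symbols:
        bits.append(0 if s.real > 0 else 1)
        bits.append(0 if s.imag > 0 else 1)
    return bits

# Ideal, noise-free constellation points (unit energy)
tx = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]) / np.sqrt(2)
rx_bits = qpsk_demod(tx)
```

    A real chain precedes this with carrier and timing recovery, and follows it with the CCSDS frame synchronization and channel decoding that GNU Radio handles in the described system.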

  17. Development of standardized approaches to reporting of minimal residual disease data using a reporting software package designed within the European LeukemiaNet

    DEFF Research Database (Denmark)

    Ostergaard, M; Nyvold, Charlotte Guldborg; Jovanovic, J V

    2011-01-01

    Quantitative PCR (qPCR) for detection of fusion transcripts and overexpressed genes is a promising tool for following minimal residual disease (MRD) in patients with hematological malignancies. Its widespread clinical use has to some extent been hampered by differences in data analysis and presentation that complicate multicenter clinical trials. To address these issues, we designed a highly flexible MRD-reporting software program, in which data from various qPCR platforms can be imported, processed, and presented in a uniform manner to generate intuitively understandable reports. The software was tested in a two-step quality control (QC) study; the first step involved eight centers, whose previous experience with the software ranged from none to extensive. The participants received cDNA from consecutive samples from a BCR-ABL+ chronic myeloid leukemia (CML) patient and an acute myeloid leukemia...
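
    The kind of uniform quantity such reporting software presents can be illustrated with a normalized log-reduction calculation; the copy numbers below are invented placeholders, not data from the QC study:

```python
import math

def mrd_log_reduction(target_copies, control_copies,
                      baseline_target, baseline_control):
    """Log10 reduction of the normalized fusion-transcript level in a
    follow-up sample relative to the diagnostic (baseline) sample.

    Each level is the target transcript count normalized to a control
    gene, which corrects for differences in sample quality and input.
    """
    follow_up = target_copies / control_copies
    baseline = baseline_target / baseline_control
    return math.log10(baseline / follow_up)

# e.g. a fusion transcript normalized to a control gene (numbers invented)
reduction = mrd_log_reduction(12, 120_000, 50_000, 100_000)
```

    Reporting such values identically across platforms and centers is exactly the standardization problem the software addresses.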

  18. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

      Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application.  Traditional forms of technical education pay little attention to creativity - often encouraging overly...

  19. Comparison of some lead and non-lead based glass systems, standard shielding concretes and commercial window glasses in terms of shielding parameters in the energy region of 1 keV-100 GeV: A comparative study

    International Nuclear Information System (INIS)

    Kurudirek, Murat; Ozdemir, Yueksel; Simsek, Onder; Durak, Ridvan

    2010-01-01

    The effective atomic numbers, Z_eff, of some glass systems with and without Pb have been calculated in the energy region of 1 keV-100 GeV, including the K absorption edges of the high-Z elements present in the glass. These glass systems have also been compared with some standard shielding concretes and commercial window glasses in terms of mean free paths and total mass attenuation coefficients over the continuous energy range. Comparisons with experiments were provided for the glasses wherever possible. It has been observed that the glass systems without Pb have higher values of Z_eff than the Pb-based glasses in some high-energy regions, even though they have lower mean atomic numbers. When compared with some standard shielding concretes and commercial window glasses, the given glass systems generally show superior radiation-shielding properties, confirming their suitability as substitutes for some shielding concretes and commercial window glasses to improve radiation shielding over the continuous energy region.
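    The quantities compared above follow from the mixture rule: the total mass attenuation coefficient of a glass is the weight-fraction-weighted sum of the elemental coefficients, and the mean free path is the reciprocal of the linear attenuation coefficient. A sketch with placeholder coefficient values (illustrative numbers, not tabulated photon cross-section data):

```python
# Mixture rule: (mu/rho)_glass = sum_i w_i * (mu/rho)_i, and the mean free
# path is MFP = 1 / (rho * (mu/rho)).  The coefficients below are
# placeholders for one photon energy, not real tabulated values.

def mass_attenuation(weight_fractions, coefficients):
    """Total mass attenuation coefficient (cm^2/g) of a mixture."""
    assert abs(sum(weight_fractions.values()) - 1.0) < 1e-9
    return sum(w * coefficients[el] for el, w in weight_fractions.items())

def mean_free_path(mu_rho, density):
    """Mean free path (cm) for a density given in g/cm^3."""
    return 1.0 / (mu_rho * density)

# Hypothetical lead-silicate glass composition by weight
fractions = {"Pb": 0.55, "Si": 0.15, "O": 0.30}
mu_over_rho = {"Pb": 0.10, "Si": 0.05, "O": 0.04}   # cm^2/g (illustrative)

mu_rho_glass = mass_attenuation(fractions, mu_over_rho)
print(round(mu_rho_glass, 4))                        # 0.0745
print(round(mean_free_path(mu_rho_glass, 5.2), 3))   # 2.581 (cm)
```

    A lower mean free path at a given energy means a thinner layer of material attenuates the same fraction of photons, which is the basis of the shielding comparison in the abstract.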

  20. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  1. A study on the establishment of safety assessment guidelines of commercial grade item dedication in digitalized safety systems

    International Nuclear Information System (INIS)

    Hwang, H. S.; Kim, B. R.; Oh, S. H.

    1999-01-01

    Because of the obsolescence of components used in the safety-related systems of nuclear power plants, the decreasing number of suppliers qualified under the nuclear QA program, and increasing maintenance costs, utilities have been considering the use of commercial grade digital computers as an alternative for resolving such issues. However, commercial digital computers use embedded pre-existing software, including operating system software, that was not developed under a nuclear grade QA program. Thus, it is necessary for utilities to establish processes for dedicating digital commercial grade items. A regulatory body also needs guidance to evaluate digital commercial products properly. This paper surveys the regulations and their regulatory guides, which establish the requirements for commercial grade item dedication, as well as the industry standards and guidance applicable to safety-related systems. It provides some guidelines to be applied in evaluating the safety of digital upgrades and new digital plant protection systems in Korea

  2. Commercial Toilets

    Science.gov (United States)

    Whether you are looking to reduce water use in a new facility or replace old, inefficient toilets in commercial restrooms, a WaterSense labeled flushometer-valve toilet is a high-performance, water-efficient option worth considering.

  3. Aircraft Design Software

    Science.gov (United States)

    1997-01-01

    Successful commercialization of the AirCraft SYNThesis (ACSYNT) tool has resulted in the creation of Phoenix Integration, Inc. ACSYNT has been exclusively licensed to the company, an outcome of a seven-year, $3 million effort to provide unique software technology to a focused design engineering market. Ames Research Center formulated ACSYNT and, working with the Virginia Polytechnic Institute CAD Laboratory, began to design and code a computer-aided design capability for ACSYNT. Using a Joint Sponsored Research Agreement, Ames formed an industry-government-university alliance to improve and foster research and development for the software. As a result of the ACSYNT Institute, the software is becoming a predominant tool for aircraft conceptual design. ACSYNT has been successfully applied to high-speed civil transport configurations, subsonic transports, and supersonic fighters.

  4. Space Commercialization

    Science.gov (United States)

    Martin, Gary L.

    2011-01-01

    A robust and competitive commercial space sector is vital to continued progress in space. The United States is committed to encouraging and facilitating the growth of a U.S. commercial space sector that supports U.S. needs, is globally competitive, and advances U.S. leadership in the generation of new markets and innovation-driven entrepreneurship. The policy goals are to: energize competitive domestic industries to participate in global markets and advance the development of satellite manufacturing, satellite-based services, space launch, terrestrial applications, and increased entrepreneurship; purchase and use commercial space capabilities and services to the maximum practical extent; actively explore the use of inventive, nontraditional arrangements for acquiring commercial space goods and services to meet United States Government requirements, including measures such as public-private partnerships; refrain from conducting United States Government space activities that preclude, discourage, or compete with U.S. commercial space activities; and pursue potential opportunities for transferring routine, operational space functions to the commercial space sector where beneficial and cost-effective.

  5. Evaluation of a Commercial Sandwich Enzyme-Linked Immunosorbent Assay for the Quantification of Beta-Casomorphin 7 in Yogurt Using Solid-Phase Extraction Coupled to Liquid Chromatography-Tandem Mass Spectrometry as the "Gold Standard" Method.

    Science.gov (United States)

    Nguyen, Duc Doan; Busetti, Francesco; Johnson, Stuart Keith; Solah, Vicky Ann

    2018-03-01

    This study investigated beta-casomorphin 7 (BCM7) in yogurt by means of LC-tandem MS (MS/MS) and enzyme-linked immunosorbent assay (ELISA), using LC-MS/MS as the "gold standard" method to evaluate the applicability of a commercial ELISA. The level of BCM7 in milk obtained from the ELISA analysis was much lower than that obtained by LC-MS/MS analysis and tended to increase during fermentation and storage of yogurt. Meanwhile, the results obtained from LC-MS/MS showed that BCM7 degraded during the stages of yogurt processing, and its degradation may have been caused by X-prolyl dipeptidyl aminopeptidase activity. As a result, the commercial sandwich ELISA kit was not suitable for the quantification of BCM7 in fermented dairy milk.

  6. An off-the-shelf guider for the Palomar 200-inch telescope: interfacing amateur astronomy software with professional telescopes for an easy life

    Science.gov (United States)

    Clarke, Fraser; Lynn, James; Thatte, Niranjan; Tecza, Matthias

    2014-08-01

    We have developed a simple but effective guider for use with the Oxford-SWIFT integral field spectrograph on the Palomar 200-inch telescope. The guider uses mainly off-the-shelf components, including commercial amateur astronomy software to interface with the CCD camera, calculate guiding corrections, and send guide commands to the telescope. The only custom piece of software is a driver that provides an interface between the Palomar telescope control system and the industry-standard 'ASCOM' system. Using existing commercial software provided a very cheap guider, and the approach could easily be adapted to any other professional telescope

  7. Modernization of tank floor scanning system (TAFLOSS) Software

    International Nuclear Information System (INIS)

    Mohd Fitri Abd Rahman; Jaafar Abdullah; Zainul A Hassan

    2002-01-01

    The main objective of the project is to develop new user-friendly software that combines the second-generation software (developed in-house) with commercial software. This paper describes the development of computer codes for analysing the initial data and plotting an exponential curve fit. The method used for curve fitting is the least-squares technique. The newly developed software is capable of giving results comparable to those of the commercial software. (Author)
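    The least-squares exponential fit mentioned above is commonly implemented by log-linearization: taking logarithms turns y = A·exp(b·x) into a straight-line fit. A generic numpy sketch of this approach (an illustration of the technique, not the TAFLOSS code itself):

```python
import numpy as np

# Fit y = A * exp(b * x) by taking logs: ln y = ln A + b x, then solving an
# ordinary least-squares line fit -- one common way to implement the
# least-squares exponential fit described above.

def fit_exponential(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    b, ln_a = np.polyfit(x, np.log(y), 1)  # slope, intercept of ln y vs x
    return np.exp(ln_a), b

x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * np.exp(-0.5 * x)          # exact exponential data
a, b = fit_exponential(x, y)
print(round(a, 6), round(b, 6))     # 2.0 -0.5
```

    Note that log-linearization weights small y values more heavily than a direct nonlinear fit would; for noisy data a nonlinear solver may be preferable.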

  8. Selection and Management of Open Source Software in Libraries

    OpenAIRE

    Vimal Kumar, V.

    2007-01-01

    Open source software was a revolutionary concept among computer programmers and users. To a certain extent, open source solutions can provide an alternative to costly commercial software. Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not carry the initial cost of commercial software and enables libraries to have greater control over their working environment...

  9. Software qualification in safety applications

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    2000-01-01

    The developers of safety-critical instrumentation and control systems must qualify the design of the components used, including the software in the embedded computer systems, in order to ensure that the component can be trusted to perform its safety function under the full range of operating conditions. There are well known ways to qualify analog systems using the facts that: (1) they are built from standard modules with known properties; (2) design documents are available and described in a well understood language; (3) the performance of the component is constrained by physics; and (4) physics models exist to predict the performance. These properties are not generally available for qualifying software, and one must fall back on extensive testing and qualification of the design process. Neither of these is completely satisfactory. The research reported here is exploring an alternative approach that is intended to permit qualification for an important subset of instrumentation software. The research goal is to determine if a combination of static analysis and limited testing can be used to qualify a class of simple, but practical, computer-based instrumentation components for safety application. These components are of roughly the complexity of a motion detector alarm controller. This goal is accomplished by identifying design constraints that enable meaningful analysis and testing. Once such design constraints are identified, digital systems can be designed to allow for analysis and testing, or existing systems may be tested for conformance to the design constraints as a first step in a qualification process. This will considerably reduce the cost and monetary risk involved in qualifying commercial components for safety-critical service

  10. Differences in serum thyroglobulin measurements by 3 commercial immunoradiometric assay kits and laboratory standardization using Certified Reference Material 457 (CRM-457).

    Science.gov (United States)

    Lee, Ji In; Kim, Ji Young; Choi, Joon Young; Kim, Hee Kyung; Jang, Hye Won; Hur, Kyu Yeon; Kim, Jae Hyeon; Kim, Kwang-Won; Chung, Jae Hoon; Kim, Sun Wook

    2010-09-01

    Serum thyroglobulin (Tg) is essential in the follow-up of patients with differentiated thyroid carcinoma (DTC). However, interchangeability and standardization between Tg assays have not yet been achieved, even with the development of an international Tg standard (Certified Reference Material 457 [CRM-457]). Serum Tg from 30 DTC patients and serially diluted CRM-457 were measured using 3 different immunoradiometric assays (IRMA-1, IRMA-2, IRMA-3). The intraclass correlation coefficient (ICC) method was used to describe the concordance of each IRMA with CRM-457. The serum Tg values measured by the 3 different IRMAs correlated well (r > .85). The IRMA calibrated to CRM-457 showed the best ICC (p(1) = .98) for the CRM-457. Hospitals caring for patients with DTC should either set their own cutoffs for Tg IRMAs based on their patient pools, or adopt IRMAs standardized to CRM-457 and calibrate their laboratory using CRM-457.
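    The concordance statistic used above, the intraclass correlation coefficient, can be computed from a subjects-by-assays matrix via the standard two-way random-effects ANOVA decomposition. The sketch below implements single-measure ICC(2,1) with illustrative data; it is not the statistical package the authors used, and the numbers are made up:

```python
import numpy as np

def icc_2_1(data):
    """Two-way random-effects, single-measure ICC(2,1) for an
    n-subjects x k-raters matrix (standard ANOVA-based formulation)."""
    data = np.asarray(data, float)
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)          # per-subject means
    col_means = data.mean(axis=0)          # per-assay means
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ((data - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                # between-subjects mean square
    msc = ss_cols / (k - 1)                # between-assays mean square
    mse = ss_err / ((n - 1) * (k - 1))     # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Three hypothetical "assays" measuring the same samples with small offsets
ratings = np.array([[1.0, 1.1, 0.9],
                    [5.0, 5.2, 4.9],
                    [9.0, 9.1, 8.8],
                    [2.0, 2.2, 1.9]])
print(round(icc_2_1(ratings), 3))
```

    An ICC near 1 indicates that assay-to-assay differences are small relative to between-patient variation, which is what calibration to a common standard such as CRM-457 aims to achieve.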

  11. Analysis of linear measurements on 3D surface models using CBCT data segmentation obtained by automatic standard pre-set thresholds in two segmentation software programs: an in vitro study.

    Science.gov (United States)

    Poleti, Marcelo Lupion; Fernandes, Thais Maria Freire; Pagin, Otávio; Moretti, Marcela Rodrigues; Rubira-Bullen, Izabel Regina Fischer

    2016-01-01

    The aim of this in vitro study was to evaluate the reliability and accuracy of linear measurements on three-dimensional (3D) surface models obtained by standard pre-set thresholds in two segmentation software programs. Ten mandibles with 17 silica markers were scanned at a 0.3-mm voxel size in the i-CAT Classic (Imaging Sciences International, Hatfield, PA, USA). Twenty linear measurements were carried out twice by two observers on the 3D surface models: in Dolphin Imaging 11.5 (Dolphin Imaging & Management Solutions, Chatsworth, CA, USA), using two filters (Translucent and Solid-1), and in InVesalius 3.0.0 (Centre for Information Technology Renato Archer, Campinas, SP, Brazil). The physical measurements were made twice by another observer using a digital caliper on the dry mandibles. Excellent intra- and inter-observer reliability for the markers, physical measurements, and 3D surface models was found (intra-class correlation coefficient (ICC) and Pearson's r ≥ 0.91). The linear measurements on 3D surface models by the Dolphin and InVesalius software programs were accurate (Dolphin Solid-1 > InVesalius > Dolphin Translucent). The highest absolute and percentage errors were obtained for the variables R1-R1 (1.37 mm) and MF-AC (2.53%) in the Dolphin Translucent and InVesalius software, respectively. Linear measurements on 3D surface models obtained by standard pre-set thresholds in the Dolphin and InVesalius software programs are reliable and accurate compared with physical measurements. Studies that evaluate the reliability and accuracy of 3D models are necessary to ensure error predictability and to establish diagnosis, treatment plan, and prognosis in a more realistic way.

  12. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements, and an effective system for managing them, the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cycle...

  13. Modular Software-Defined Radio

    Directory of Open Access Journals (Sweden)

    Rhiemeier Arnd-Ragnar

    2005-01-01

    In view of the technical and commercial boundary conditions for software-defined radio (SDR), it is worthwhile to reconsider the concept anew from an unconventional point of view. The organizational principles of signal processing (rather than the signal processing algorithms themselves) are the main focus of this work on modular software-defined radio. Modularity and flexibility are just two key characteristics of the SDR environment which extend smoothly into the modeling of hardware and software. In particular, the proposed model of signal processing software consists of irregular, connected, directed, acyclic graphs with random node weights and random edges. Several approaches for mapping such software to a given hardware are discussed. Taking into account previous findings as well as new results from the system simulations presented here, the paper concludes with the utility of pipelining as a general design guideline for modular software-defined radio.
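    A model of the kind described, a random directed acyclic graph of weighted tasks mapped onto processing elements, can be sketched as follows. This is a toy illustration, not the paper's simulator: fixing a node order and adding only forward edges guarantees acyclicity, and the mapping shown is a simple greedy list schedule.

```python
import random

def random_dag(n, edge_prob, seed=0):
    """Random DAG: node weights are task costs; edges only go from a
    lower-numbered node to a higher one, so the graph is acyclic."""
    rng = random.Random(seed)
    weights = {v: rng.randint(1, 10) for v in range(n)}
    edges = [(u, v) for u in range(n) for v in range(u + 1, n)
             if rng.random() < edge_prob]
    return weights, edges

def schedule_length(weights, edges, num_pe):
    """Greedy list schedule onto identical processing elements,
    visiting tasks in (already topological) index order."""
    finish = {}
    pe_free = [0] * num_pe
    for v in sorted(weights):
        ready = max((finish[u] for u, t in edges if t == v), default=0)
        pe = min(range(num_pe), key=lambda p: pe_free[p])
        start = max(ready, pe_free[pe])
        finish[v] = start + weights[v]
        pe_free[pe] = finish[v]
    return max(finish.values())            # makespan

w, e = random_dag(8, 0.3, seed=42)
print(schedule_length(w, e, num_pe=2))     # makespan on two PEs
```

    Comparing makespans for different numbers of processing elements on many random graphs is one simple way to study mapping strategies such as pipelining in this model.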

  14. Weighting Factors for the Commercial Building Prototypes Used in the Development of ANSI/ASHRAE/IESNA Standard 90.1-2010

    Energy Technology Data Exchange (ETDEWEB)

    Jarnagin, Ronald E.; Bandyopadhyay, Gopal K.

    2010-01-21

    Detailed construction data from the McGraw Hill Construction Database was used to develop construction weights by climate zones for use with DOE Benchmark Buildings and for the ASHRAE Standard 90.1-2010 development. These construction weights were applied to energy savings estimates from simulation of the benchmark buildings to establish weighted national energy savings.

  15. The STARLINK software collection

    Science.gov (United States)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  16. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  17. Software Reviews.

    Science.gov (United States)

    Davis, Shelly J., Ed.; Knaupp, Jon, Ed.

    1984-01-01

    Reviewed is computer software on: (1) classification of living things, a tutorial program for grades 5-10; and (2) polynomial practice using tiles, a drill-and-practice program for algebra students. (MNS)

  18. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Three pieces of computer software are described and reviewed: HyperCard, to build and use varied applications; Iggy's Gnees, for problem solving with shapes in grades kindergarten-two; and Algebra Shop, for practicing skills and problem solving. (MNS)

  19. Sandia software guidelines, Volume 4: Configuration management

    Energy Technology Data Exchange (ETDEWEB)

    1992-06-01

    This volume is one in a series of Sandia Software Guidelines for use in producing quality software within Sandia National Laboratories. This volume is based on the IEEE standard and guide for software configuration management. The basic concepts and detailed guidance on implementation of these concepts are discussed for several software project types. Example planning documents for both projects and organizations are included.

  20. ECLIPSE, an Emerging Standardized Modular, Secure and Affordable Software Toolset in Support of Product Assurance, Quality Assurance and Project Management for the Entire European Space Industry (from Innovative SMEs to Primes and Institutions)

    Science.gov (United States)

    Bennetti, Andrea; Ansari, Salim; Dewhirst, Tori; Catanese, Giuseppe

    2010-08-01

    The development of satellites and ground systems (and the technologies that support them) is complex and demands a great deal of rigour in the management of both the information it relies upon and the information it generates via the performance of well-established processes. To this extent, for the past fifteen years Sapienza Consulting has been supporting the European Space Agency (ESA) in the management of this information and has provided ESA with ECSS (European Cooperation for Space Standardization) Standards-based Project Management (PM), Product Assurance (PA) and Quality Assurance (QA) software applications. In 2009 Sapienza recognised the need to modernize, standardize and integrate its core ECSS-based software tools into a single yet modularised suite of applications named ECLIPSE, aimed at: • fulfilling a wider range of historical and emerging requirements, • providing a better experience for users, • increasing the value of the information it collects and manages, • lowering the cost of ownership and operation, • increasing collaboration within and between space sector organizations, and • aiding in the performance of several PM, PA, QA, and configuration management tasks in adherence to ECSS standards. In this paper, Sapienza will first present the toolset and a rationale for its development, describing and justifying its architecture and basic module composition. Having defined the toolset architecture, the paper will address the current status of the individual applications. A compliance assessment will be presented for each module in the toolset with respect to the ECSS standard it addresses. Lastly, experience from early industry and institutional users will be presented.

  1. Rapid Prototyping of Standard Compliant Visible Light Communications System

    OpenAIRE

    Gavrincea, Ciprian; Baranda, Jorge; Henarejos, Pol

    2014-01-01

    This article describes the implementation of a prototype visible light communications system based on the IEEE 802.15.7 standard using low-cost commercial off-the-shelf analog devices. The aim of this article is to show that this standard provides a framework that could promote the introduction of applications into the market. Thus, these specifications could be further developed, reducing the gap between the industry and research communities. The implemented prototype makes use of software d...

  2. Ground control station software design for micro aerial vehicles

    Science.gov (United States)

    Walendziuk, Wojciech; Oldziej, Daniel; Binczyk, Dawid Przemyslaw; Slowik, Maciej

    2017-08-01

    This article describes the process of designing the hardware and software of a ground control station used for configuring and operating micro unmanned aerial vehicles (UAV). All the work was conducted on a quadrocopter model, a commonly available commercial construction. The article presents the characteristics of the research object, the basics of operating micro aerial vehicles (MAV), and the components of the ground control station model. It also describes the communication standards used in building the station model. A further part of the work concerns the software of the product - the GIMSO application (Generally Interactive Station for Mobile Objects), which enables the user to manage the actions, communication and control processes of the UAV. The process of creating the software and the field tests of the station model are also presented in the article.

  3. VMStools: Open-source software for the processing, analysis and visualisation of fisheries logbook and VMS data

    NARCIS (Netherlands)

    Hintzen, N.T.; Bastardie, F.; Beare, D.J.; Piet, G.J.; Ulrich, C.; Deporte, N.; Egekvist, J.; Degel, H.

    2012-01-01

    VMStools is a package of open-source software, built using the free software environment R, specifically developed for the processing, analysis and visualisation of landings (logbooks) and vessel location data (VMS) from commercial fisheries. Analyses start with standardized data formats for logbook...

  4. Searches for standard model Higgs boson and supersymmetry - Trigger studies and software tools development for new phenomena in the DO experiment

    International Nuclear Information System (INIS)

    Duperrin, A.

    2007-04-01

    This document presents a summary of my research work during the past 7 years on the preparation of the D0 experiment at Fermilab and the analysis of the data collected at the Tevatron hadron collider. It mainly focuses on two topics: the trigger, and the direct search for new phenomena, particularly supersymmetry and standard model Higgs boson searches. This document is divided into 5 chapters: 1) the phenomenology of the standard model and beyond, 2) the phenomenology of pp-bar events, 3) the Tevatron and the D0 detector, 4) the trigger system and data acquisition, and 5) data analysis: the search for supersymmetry and the Higgs boson

  5. Safety Review related to Commercial Grade Digital Equipment in Safety System

    International Nuclear Information System (INIS)

    Yu, Yeongjin; Park, Hyunshin; Yu, Yeongjin; Lee, Jaeheung

    2013-01-01

    The upgrade or replacement of I and C systems in safety systems typically involves digital equipment developed in accordance with non-nuclear standards. However, the use of commercial grade digital equipment could introduce vulnerabilities to software common-mode failure, electromagnetic interference and unanticipated problems. Although guidelines and standards for dedication methods of commercial grade digital equipment are provided, there are some difficulties in applying the methods to commercial grade digital equipment for safety systems. This paper focuses on KINS regulatory guides and relevant documents for the dedication of commercial grade digital equipment and presents safety review experiences related to commercial grade digital equipment in safety systems. Dedication, including critical characteristics, is required to use commercial grade digital equipment in safety systems in accordance with KEPIC ENB 6370 and EPRI TR-106439. The dedication process should be controlled within a configuration management process. Appropriate methods, criteria and evaluation results should be provided to verify the acceptability of the commercial digital equipment used for safety functions

  6. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Liddle, P.

    2006-01-01

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM R XS (TXS) system is classified 1E, as defined in the Inst. of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  7. Commercial Slit-Lamp Anterior Segment Photography versus Digital Compact Camera Mounted on a Standard Slit-Lamp with an Adapter.

    Science.gov (United States)

    Oliphant, Huw; Kennedy, Alasdair; Comyn, Oliver; Spalton, David J; Nanavaty, Mayank A

    2018-06-16

    To compare slit-lamp mounted cameras (SLC) versus a digital compact camera (DCC) with a slit-lamp adaptor when used by an inexperienced technician. In this cross-sectional study, in which posterior capsule opacification (PCO) was used as the comparator, patients consented to one photograph with the SLC and two with the DCC (DCC1 and DCC2) mounted on the slit-lamp with an adaptor. An inexperienced clinic technician recruited one eye of each patient, took all the photographs and masked the images. Images were graded for PCO using EPCO2000 software by two independent masked graders. Repeatability between DCC1 and DCC2, and limits of agreement between the SLC and DCC1, were assessed; coefficients of repeatability and Bland-Altman plots were analyzed. Seventy-two patients (eyes) were recruited into the study. The first 9 patients (eyes) were excluded due to unsatisfactory image quality from both systems. The mean EPCO score for the SLC was 2.28 (95% CI: 2.09-2.45), for DCC1 was 2.28 (95% CI: 2.11-2.45), and for DCC2 was 2.11 (95% CI: 2.11-2.45). There was no significant difference in EPCO scores between the SLC and DCC1 (p = 0.98) or between DCC1 and DCC2 (p = 0.97). The coefficient of repeatability between DCC images was 0.42, and the coefficient of repeatability between the DCC and SLC was 0.58. A DCC on a slit-lamp with an adaptor is comparable to an SLC. There is an initial learning curve, which is similar for both for an inexperienced person. This opens up the possibility of low-cost anterior segment imaging in clinical, research and teaching settings.
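    The agreement statistics reported above (coefficient of repeatability, Bland-Altman limits of agreement) can be sketched for paired scores as follows. The data are illustrative, not the study's numbers, and the 1.96 x SD-of-differences definition used here is one common convention; the paper may use a slightly different formula.

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement between two methods: bias (mean difference),
    95% limits of agreement, and coefficient of repeatability taken here
    as 1.96 * SD of the paired differences."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)                  # sample SD of differences
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)
    cor = 1.96 * sd                        # coefficient of repeatability
    return bias, loa, cor

# Hypothetical paired PCO scores from the two imaging methods
slc = [2.1, 2.3, 1.9, 2.5, 2.2]
dcc = [2.0, 2.4, 1.8, 2.6, 2.1]
bias, loa, cor = bland_altman(slc, dcc)
print(round(bias, 3), round(cor, 3))       # 0.02 0.215
```

    A smaller coefficient of repeatability means the two measurements agree more closely, which is how the study compares the DCC images against each other (0.42) and against the SLC (0.58).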

  8. Software process in Geant4

    International Nuclear Information System (INIS)

    Cosmo, G.

    2001-01-01

    Since its earliest years of R and D, the GEANT4 simulation toolkit has been developed following software process standards which dictated the overall evolution of the project. The complexity of the software involved, the wide areas of application of the software product, the huge amount of code and category complexity, and the size and distributed nature of the Collaboration itself are all ingredients which involve and correlate a wide variety of software processes. Although in 'production' and available to the public since December 1998, the GEANT4 software product includes category domains which are still under active development. These therefore require different treatment also in terms of improvement of the development cycle, system testing and user support. This paper describes some of the software processes as they are applied in GEANT4 for the development, testing and maintenance of the software

  9. AN IMPROVED COCOMO SOFTWARE COST ESTIMATION MODEL

    African Journals Online (AJOL)

    DJFLEX

    developmental effort favourable to both software developers and customers, a standard effort multiplication factor (er) is introduced, to ... for recent changes in software engineering technology. The COCOMO ... application composition utilities.

  10. Development of a fatigue analysis software system

    International Nuclear Information System (INIS)

    Choi, B. I.; Lee, H. J.; Han, S. W.; Kim, J. Y.; Hwang, K. H.; Kang, J. Y.

    2001-01-01

    A general-purpose fatigue analysis software system to predict the fatigue lives of mechanical components and structures was developed. This software has some characteristic features, including functions for searching weak regions on the free surface in order to reduce computing time significantly, a database of fatigue properties for various materials, and an expert system which can assist users in obtaining more appropriate results. The software can be used in environments consisting of commercial finite element packages. Using the developed software, fatigue analyses of an SAE keyhole specimen and an automobile knuckle were carried out. The results were observed to agree well with those from commercial packages

  11. Software Authentication

    International Nuclear Information System (INIS)

    Wolford, J.K.; Geelhood, B.D.; Hamilton, V.A.; Ingraham, J.; MacArthur, D.W.; Mitchell, D.J.; Mullens, J.A.; Vanier, P. E.; White, G.K.; Whiteson, R.

    2001-01-01

    The effort to define guidance for authentication of software for arms control and nuclear material transparency measurements draws on a variety of disciplines and has involved synthesizing established criteria and practices with newer methods. Challenges include the need to protect classified information that the software manipulates as well as deal with the rapid pace of innovation in the technology of nuclear material monitoring. The resulting guidance will shape the design of future systems and inform the process of authentication of instruments now being developed. This paper explores the technical issues underlying the guidance and presents its major tenets

  12. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  13. FASTBUS software workshop

    International Nuclear Information System (INIS)

    1985-01-01

    FASTBUS is a standard for modular high-speed data acquisition, data-processing and control, development for use in high-energy physics experiments incorporating different types of computers and microprocessors. This Workshop brought together users from different laboratories for a review of current software activities, using the standard both in experiments and for test equipment. There are also papers on interfacing and the present state of systems being developed for use in future LEP experiments. Also included is a discussion on the proposed revision of FASTBUS Standard Routines. (orig.)

  14. Radiation safety assessment and development of environmental radiation monitoring technology; standardization of input parameters for the calculation of annual dose from routine releases from commercial reactor effluents

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, I. H.; Cho, D.; Youn, S. H.; Kim, H. S.; Lee, S. J.; Ahn, H. K. [Soonchunhyang University, Ahsan (Korea)

    2002-04-01

    This research is to develop a standard methodology for determining the input parameters that have a substantial impact on the radiation doses of residential individuals in the vicinity of the four nuclear power plants in Korea. We selected critical nuclides, pathways, and organs related to human exposure via simulated estimation with K-DOSE 60, based on the updated ICRP-60, and sensitivity analyses. From the results we found that 1) the critical nuclides were {sup 3}H, {sup 133}Xe, and {sup 60}Co for the Kori plants and {sup 14}C and {sup 41}Ar for the Wolsong plants. The most critical pathway was 'vegetable intake' for adults and 'milk intake' for infants. There was, however, no preference among the effective organs; and 2) sensitivity analyses showed that the chemical composition of a nuclide influenced the radiation dose much more than any other input parameter, such as food intake, radiation discharge, or transfer/concentration coefficients, by a factor of more than 10{sup 2}. The effect of the transfer/concentration coefficients on the radiation dose was negligible. All input parameters showed a high estimated correlation with the radiation dose, close to 1.0, except for food intake at the Wolsong power plant (partial correlation coefficient (PCC) = 0.877). Consequently, we suggest that a prediction model or scenarios for food intake reflecting current living trends, and formal publications including details of the chemical components of the critical nuclides from each plant, are needed. Also, standardized domestic values of the parameters used in the calculation must replace the existing or default-set imported factors via properly designed experiments and/or modelling, such as transport of liquid discharge in the waters near the plants, exposure tests on crops and plants, and so on. 4 figs., 576 tabs. (Author)
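    The first-order partial correlation coefficient (PCC) quoted above measures the correlation between an input parameter and the dose while controlling for a third variable, and can be computed from pairwise Pearson correlations. A minimal sketch with made-up data (the formula is standard; the variable names are illustrative only):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / sqrt(vx * vy)

def partial_corr(x, y, z):
    """First-order partial correlation of x and y, controlling for z."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

# Illustrative values only: dose vs. food intake, controlling for discharge
dose = [2, 1, 4, 3, 6]
intake = [1, 3, 2, 5, 4]
discharge = [1, 2, 3, 4, 5]
print(round(partial_corr(dose, intake, discharge), 3))
```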

  15. Commercial Banks

    Directory of Open Access Journals (Sweden)

    Abbas Asosheh

    2009-09-01

    Full Text Available Information systems outsourcing issues has been attracted in recent years because many information systems projects in organizations are done in this case. On the other hand, failure rate of this kind of projects is also high. The aim of this article is to find success factors in risk management of information systems outsourcing in commercial banks using these factors leads to increase the success rate of risk management of information systems outsourcing projects. Research methods in the present article based on purpose are applied and descriptive- survey. In addition, research tool is questionnaire which was used among commercial bank experts. For this purpose, First information systems outsourcing risks were identified and then ranked. In the next step, the information systems outsourcing reasons were surveyed and the most important reasons were identified. Then the risks which have not any relationship with the most important reasons were removed and success factors in managing residual risks were extracted.

  16. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically-oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...

  17. Nested Cohort - R software package

    Science.gov (United States)

    NestedCohort is an R software package for fitting Kaplan-Meier and Cox Models to estimate standardized survival and attributable risks for studies where covariates of interest are observed on only a sample of the cohort.
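    The Kaplan-Meier (product-limit) estimator at the core of such packages is straightforward to sketch. NestedCohort itself is an R package and additionally handles covariate sampling weights; this unweighted Python illustration of the basic estimator omits that:

```python
def kaplan_meier(times, events):
    """Unweighted product-limit survival estimate.

    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns (time, S(time)) pairs at each distinct event time.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, out, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        while i < len(data) and data[i][0] == t:   # group ties at time t
            removed += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk           # product-limit step
            out.append((t, surv))
        at_risk -= removed                         # drop events and censorings
    return out

print(kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 1, 0]))
```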

  18. Commercial Mobile Alert Service (CMAS) Scenarios

    Science.gov (United States)

    2012-05-01

    Commercial Mobile Alert Service (CMAS) Scenarios The WEA Project Team May 2012 SPECIAL REPORT CMU/SEI-2012-SR-020 CERT® Division, Software ...Homeland Security under Contract No. FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally...DISTRIBUTES IT “AS IS.” References herein to any specific commercial product, process, or service by trade name, trade mark, manufacturer, or otherwise

  19. Reviews, Software.

    Science.gov (United States)

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  20. Software Reviews.

    Science.gov (United States)

    Kinnaman, Daniel E.; And Others

    1988-01-01

    Reviews four educational software packages for Apple, IBM, and Tandy computers. Includes "How the West was One + Three x Four,""Mavis Beacon Teaches Typing,""Math and Me," and "Write On." Reviews list hardware requirements, emphasis, levels, publisher, purchase agreements, and price. Discusses the strengths…

  1. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game…

  2. Software Reviews.

    Science.gov (United States)

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  3. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    is automatically generated. Furthermore, MIAWARE software is accompanied with an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through the specially developed ontology. As a result...

  4. Coordination Implications of Software Coupling in Open Source Projects

    NARCIS (Netherlands)

    Amrit, Chintan; van Hillegersberg, Jos; Ågerfalk, Pär

    2010-01-01

    The effect of software coupling on the quality of software has been studied quite widely since the seminal paper on software modularity by Parnas [1]. However, the effect of the increase in software coupling on the coordination of the developers has not been researched as much. In commercial

  5. A Methodology for Integrating Maintainability Using Software Metrics

    OpenAIRE

    Lewis, John A.; Henry, Sallie M.

    1989-01-01

    Maintainability must be integrated into software early in the development process. But for practical use, the techniques used must be as unobtrusive to the existing software development process as possible. This paper defines a methodology for integrating maintainability into large-scale software and describes an experiment which implemented the methodology into a major commercial software development environment.

  6. First International Workshop on Variability in Software Architecture (VARSA 2011)

    NARCIS (Netherlands)

    Galster, Matthias; Avgeriou, Paris; Weyns, Danny; Mannisto, Tomi

    2011-01-01

    Variability is the ability of a software artifact to be changed for a specific context. Mechanisms to accommodate variability include software product lines, configuration wizards and tools in commercial software, configuration interfaces of software components, or the dynamic runtime composition of

  7. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Science.gov (United States)

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  8. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01

    Army Institute for Research in Management Information, Communications, and Computer Sciences (AIRMICS): Software Tools for Software Maintenance (ASQBG-1-89-001), October 1988. The remainder of the scanned excerpt is a garbled list of program analysis and restructuring tools for Cobol and Fortran (including a Cobol Structuring Facility for VS Cobol II and Fortran static code analyzers).

  9. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

    The EPIQR method is supported by a multimedia computer program. Several modules help users of the method to process the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and prepare quality reports. This article presents the structure and the main features of the software. (au)

  10. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, and communication switchboards). The company has also acquired extensive knowledge and practical experience with digital long-term preservation technologies. This wide spectrum of activities puts us in a position to discuss an often overlooked aspect of digital preservation: the preservation of software programs. There are many resources dedicated to the digital preservation of data, documents, and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - the dynamic response to inputs - make computer programs rich compared to documents or linear multimedia. The article opens the questions at the beginning of the way to permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...), where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  11. Prospective observer and software-based assessment of magnetic resonance imaging quality in head and neck cancer: Should standard positioning and immobilization be required for radiation therapy applications?

    Science.gov (United States)

    Ding, Yao; Mohamed, Abdallah S R; Yang, Jinzhong; Colen, Rivka R; Frank, Steven J; Wang, Jihong; Wassal, Eslam Y; Wang, Wenjie; Kantor, Michael E; Balter, Peter A; Rosenthal, David I; Lai, Stephen Y; Hazle, John D; Fuller, Clifton D

    2015-01-01

    The purpose of this study was to investigate the potential of a head and neck magnetic resonance simulation and immobilization protocol for reducing motion-induced artifacts and improving positional variance for radiation therapy applications. Two groups of patients with head and neck cancer (group 1, 17 patients; group 2, 14 patients) were included under a prospective, institutional review board-approved protocol with signed informed consent. A 3.0-T magnetic resonance imaging (MRI) scanner was used for anatomic and dynamic contrast-enhanced acquisitions, with a standard diagnostic MRI setup for group 1 and radiation therapy immobilization devices for group 2. The impact of magnetic resonance simulation/immobilization was evaluated qualitatively by 2 observers in terms of motion artifacts and positional reproducibility, and quantitatively using 3-dimensional deformable registration to track the intrascan maximum motion displacement of voxels inside 7 manually segmented regions of interest. The image quality of group 2 (29 examinations) was significantly better than that of group 1 (50 examinations) as rated by both observers in terms of motion minimization and imaging reproducibility. The quality of head and neck MRI in terms of motion-related artifacts and positional reproducibility was greatly improved by the use of radiation therapy immobilization devices. Consequently, immobilization with external and intraoral fixation in MRI examinations is required for radiation therapy applications. Copyright © 2015 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

  12. LaMMA Consortium Geoportal: near-real-time dissemination of meteorological data via OGC standards and Open Source software

    Directory of Open Access Journals (Sweden)

    Simone Giannechini

    2014-02-01

    Full Text Available This paper describes the spatial data infrastructure (SDI) used by the LaMMA Consortium - Environmental Modelling and Monitoring Laboratory for Sustainable Development of Tuscany Region - for sharing, viewing and cataloguing (with metadata and related information) all geospatial data that are daily processed and used operationally in many meteorological and environmental applications. The SDI was developed using Open Source technologies; moreover, the geospatial data have been implemented through protocols based on OGC (Open Geospatial Consortium) standards such as WMS, WFS and CSW. GeoServer was used for disseminating geospatial data and maps through the OGC WMS and WFS protocols, while GeoNetwork was used as the cataloguing and search portal, also through the CSW protocol; finally, MapStore was used to implement the mash-up front-end. The innovative aspect of this portal is that it currently ingests, fuses and disseminates geospatial data related to the MetOc field from various sources in near real-time, in a comprehensive manner that allows users to create added-value visualizations in support of operational use cases, as well as to access and download the underlying data (where applicable).
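    The WMS dissemination described above works through parameterized HTTP requests. A sketch of a standard OGC WMS 1.3.0 GetMap request follows; the endpoint and layer name are invented, while the parameter names come from the WMS specification:

```python
from urllib.parse import urlencode

# Hypothetical GeoServer endpoint and layer; parameter names per OGC WMS 1.3.0
base = "https://example.org/geoserver/wms"
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "meteo:temperature",  # assumed layer name
    "CRS": "EPSG:4326",
    "BBOX": "42.0,9.0,44.5,12.5",   # lat/lon axis order for EPSG:4326 in 1.3.0
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}
print(base + "?" + urlencode(params))
```

    A catalogue search against GeoNetwork follows the same pattern with the OGC CSW protocol (REQUEST=GetRecords).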

  13. Standard Fortran

    International Nuclear Information System (INIS)

    Marshall, N.H.

    1981-01-01

    Because of its vast software investment in Fortran programs, the nuclear community has an inherent interest in the evolution of Fortran. This paper reviews the impact of the new Fortran 77 standard and discusses the projected changes which can be expected in the future

  14. International Liability Issues for Software Quality

    National Research Council Canada - National Science Library

    Mead, Nancy

    2003-01-01

    This report focuses on international law related to cybercrime, international information security standards, and software liability issues as they relate to information security for critical infrastructure applications...

  15. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

    This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  16. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    Science.gov (United States)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.

  17. Software Prototyping

    Science.gov (United States)

    Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R.

    2016-01-01

    Summary Background Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. Objective To describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Methods Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Results Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included 1) having subject-matter experts heavily involved with the design; 2) flexibility to make rapid changes; 3) the ability to minimize software development efforts early in the design stage; 4) rapid finalization of requirements; 5) early visualization of designs; and 6) a powerful vehicle for communication of the design to the programmers. Challenges included 1) the time and effort to develop the prototypes and case scenarios; 2) no simulation of system performance; 3) not having all proposed functionality available in the final product; and 4) missing needed data elements in the PCC information system. PMID:27081404
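    The SUS evaluation mentioned above uses a fixed scoring rule (Brooke's ten-item scale): odd-numbered items contribute their response minus 1, even-numbered items contribute 5 minus their response, and the sum is multiplied by 2.5 to give a 0-100 score. A minimal sketch:

```python
def sus_score(responses):
    """System Usability Scale score from ten 1-5 Likert responses."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses in the range 1-5")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)   # i=0 is item 1 (odd)
                for i, r in enumerate(responses))
    return total * 2.5

print(sus_score([4, 2, 4, 1, 5, 2, 4, 1, 4, 2]))  # -> 82.5
```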

  18. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS combines digital image analysis and database systems, which makes it widely applicable and demanding of very high skills. There is a lot of commercial GIS software which is well advertised and whose functionality is fairly well known, while open source software is forgotten. In this diploma work, an analysis is made of the open source GIS software available on the Internet, in the scope of different projects interr...

  19. DAMARIS – a flexible and open software platform for NMR spectrometer control

    OpenAIRE

    Gädke, Achim; Rosenstihl, Markus; Schmitt, Christopher; Stork, Holger; Nestle, Nikolaus

    2016-01-01

    Home-built NMR spectrometers with self-written control software have a long tradition in porous media research. Advantages of such spectrometers are not just lower costs but also more flexibility in developing new experiments (while commercial NMR systems are typically optimized for standard applications such as spectroscopy, imaging or quality control applications). Increasing complexity of computer operating systems, higher expectations with respect to user-friendliness and graphical use...

  20. Automated software development tools in the MIS (Management Information Systems) environment

    Energy Technology Data Exchange (ETDEWEB)

    Arrowood, L.F.; Emrich, M.L.

    1987-09-11

    Quantitative and qualitative benefits can be obtained through the use of automated software development tools. Such tools are best utilized when they complement existing procedures and standards. They can assist systems analysts and programmers with project specification, design, implementation, testing, and documentation. Commercial products have been evaluated to determine their efficacy. User comments have been included to illustrate actual benefits derived from introducing these tools into MIS organizations.

  1. ROLE OF DATA MINING CLASSIFICATION TECHNIQUE IN SOFTWARE DEFECT PREDICTION

    OpenAIRE

    Dr.A.R.Pon Periyasamy; Mrs A.Misbahulhuda

    2017-01-01

    Software defect prediction is the process of locating defective modules in software. Software quality is a field of study and practice that describes the desirable attributes of software products. The performance should be excellent, without any defects. Software quality metrics are a subset of software metrics that focus on the quality aspects of the product, process, and project. The software defect prediction model helps in the early detection of defects and contributes to t...
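    A classifier of the kind used for defect prediction can be sketched from scratch. This illustrative Gaussian naive Bayes model uses two made-up module metrics (lines of code, cyclomatic complexity) and invented labels; real studies use richer metric suites and systematically evaluated classifiers:

```python
from math import exp, pi, prod, sqrt
from statistics import mean, stdev

def fit(X, y):
    """Estimate per-class priors and per-feature Gaussian parameters."""
    model = {}
    for cls in set(y):
        rows = [x for x, label in zip(X, y) if label == cls]
        model[cls] = (len(rows) / len(y),
                      [(mean(col), stdev(col) or 1e-9) for col in zip(*rows)])
    return model

def predict(model, x):
    """Return the class with the highest prior-weighted likelihood."""
    def gauss(v, mu, sd):
        return exp(-((v - mu) ** 2) / (2 * sd * sd)) / (sd * sqrt(2 * pi))
    return max(model, key=lambda cls: model[cls][0] *
               prod(gauss(v, mu, sd)
                    for v, (mu, sd) in zip(x, model[cls][1])))

# Hypothetical module metrics: (lines of code, cyclomatic complexity)
X = [(120, 9), (200, 14), (150, 11), (30, 2), (45, 3), (60, 4)]
y = [1, 1, 1, 0, 0, 0]  # 1 = defective, 0 = clean (illustrative labels)
model = fit(X, y)
print(predict(model, (180, 12)), predict(model, (40, 3)))
```

    With this toy training data, a large complex module such as (180, 12) falls on the defective side of the learned distributions, while (40, 3) does not.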

  2. RTSPM: real-time Linux control software for scanning probe microscopy.

    Science.gov (United States)

    Chandrasekhar, V; Mehta, M M

    2013-01-01

    Real time computer control is an essential feature of scanning probe microscopes, which have become important tools for the characterization and investigation of nanometer scale samples. Most commercial (and some open-source) scanning probe data acquisition software uses digital signal processors to handle the real time data processing and control, which adds to the expense and complexity of the control software. We describe here scan control software that uses a single computer and a data acquisition card to acquire scan data. The computer runs an open-source real time Linux kernel, which permits fast acquisition and control while maintaining a responsive graphical user interface. Images from a simulated tuning-fork based microscope as well as a standard topographical sample are also presented, showing some of the capabilities of the software.

  3. Flight Software Math Library

    Science.gov (United States)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter order, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers, since these utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation, and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance, Navigation, and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the library to be maintained with the same strategy used in its initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as in other environments, like the GN&C analysts' simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.

  4. The Commercial Open Source Business Model

    Science.gov (United States)

    Riehle, Dirk

    Commercial open source software projects are open source software projects that are owned by a single firm that derives a direct and significant revenue stream from the software. Commercial open source at first glance represents an economic paradox: How can a firm earn money if it is making its product available for free as open source? This paper presents the core properties of commercial open source business models and discusses how they work. Using a commercial open source approach, firms can get to market faster with a superior product at lower cost than possible for traditional competitors. The paper shows how these benefits accrue from an engaged and self-supporting user community. Lacking any prior comprehensive reference, this paper is based on an analysis of public statements by practitioners of commercial open source. It forges the various anecdotes into a coherent description of revenue generation strategies and relevant business functions.

  5. Software quality assurance plan for GCS

    Science.gov (United States)

    Duncan, Stephen E.; Bailey, Elizabeth K.

    1990-01-01

    The software quality assurance (SQA) function for the Guidance and Control Software (GCS) project which is part of a software error studies research program is described. The SQA plan outlines all of the procedures, controls, and audits to be carried out by the SQA organization to ensure adherence to the policies, procedures, and standards for the GCS project.

  6. Regional vegetation management standards for commercial pine ...

    African Journals Online (AJOL)

    Trial sites were selected across different physiographic regions such that a range of altitudinal, climatic and environmental gradients were represented. ... Two of the trials were situated at lower-altitude sites (900 m and 1 000 m above sea level [asl]), one at a mid-altitude site (1 267 m asl), and one at a higher-altitude site (1 ...

  7. Commercial applications

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    The objective of this paper is to assess the near-term (one-to-five-year) needs of domestic and foreign commercial suppliers of radiochemicals and radiopharmaceuticals for electromagnetically separated stable isotopes. Only isotopes purchased to make products for sale and profit are considered in this assessment. Radiopharmaceuticals produced from enriched stable isotopes supplied by the Calutron facility at ORNL are used in about 600,000 medical procedures each year in the United States. A temporary or permanent disruption of the supply of stable isotopes to the domestic radiopharmaceutical industry could curtail, if not eliminate, the use of such diagnostic procedures as the thallium heart scan, the gallium cancer scan, the gallium abscess scan, and the low-radiation-dose thyroid scan. The word "could" in the preceding sentence deserves emphasis because an alternative source of enriched stable isotopes does exist in the USSR. Alternative starting materials could, in theory, eventually be developed for both the thallium and gallium scans. The development of a new technology for these purposes, however, would take at least five years and would be expensive. Hence, any disruption of the supply of enriched isotopes from ORNL and the resulting unavailability of critical nuclear medicine procedures would have a dramatic negative effect on the level of health care in the United States.

  8. Commercial applications

    Science.gov (United States)

    The near-term (one-to-five-year) needs of domestic and foreign commercial suppliers of radiochemicals and radiopharmaceuticals for electromagnetically separated stable isotopes are assessed. Only isotopes purchased to make products for sale and profit are considered. Radiopharmaceuticals produced from enriched stable isotopes supplied by the Calutron facility at ORNL are used in about 600,000 medical procedures each year in the United States. A temporary or permanent disruption of the supply of stable isotopes to the domestic radiopharmaceutical industry could curtail, if not eliminate, the use of such diagnostic procedures as the thallium heart scan, the gallium cancer scan, the gallium abscess scan, and the low-radiation-dose thyroid scan. An alternative source of enriched stable isotopes exists in the USSR. Alternative starting materials could, in theory, eventually be developed for both the thallium and gallium scans. The development of a new technology for these purposes, however, would take at least five years and would be expensive. Hence, any disruption of the supply of enriched isotopes from ORNL and the resulting unavailability of critical nuclear medicine procedures would have a dramatic negative effect on the level of health care in the United States.

  9. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  10. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  11. Commercial lumber, round timbers, and ties

    Science.gov (United States)

    David E. Kretschmann

    2010-01-01

    When sawn, a log yields round timber, ties, or lumber of varying quality. This chapter presents a general discussion of grading, standards, and specifications for these commercial products. In a broad sense, commercial lumber is any lumber that is bought or sold in the normal channels of commerce. Commercial lumber may be found in a variety of forms, species, and types...

  12. Balancing energy conservation and occupant needs in ventilation rate standards for Big Box stores and other commercial buildings in California. Issues related to the ASHRAE 62.1 Indoor Air Quality Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Mendell, Mark J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Apte, Mike G. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2010-10-31

    This report considers the question of whether the California Energy Commission should incorporate the ASHRAE 62.1 ventilation standard into the Title 24 ventilation rate (VR) standards, thus allowing buildings to follow the Indoor Air Quality Procedure. In contrast to the current prescriptive standard, this procedure allows the option of using ventilation rate as one of several strategies, which might include source reduction and air cleaning, to meet specified targets of indoor air concentrations and occupant acceptability. The research findings reviewed in this report suggest that a revised approach to a ventilation standard for commercial buildings is necessary, because the current prescriptive ASHRAE 62.1 Ventilation Rate Procedure (VRP) apparently does not provide occupants with either sufficiently acceptable or sufficiently health-protective air quality. One possible solution would be a dramatic increase in the minimum ventilation rates (VRs) prescribed by a VRP. This solution, however, is not feasible for at least three reasons: the current need to reduce energy use rather than increase it further, the problem of polluted outdoor air in many cities, and the apparent limited ability of increased VRs to reduce all indoor airborne contaminants of concern (per Hodgson (2003)). Any feasible solution is thus likely to include methods of pollutant reduction other than increased outdoor air ventilation, e.g., source reduction or air cleaning. The alternative 62.1 Indoor Air Quality Procedure (IAQP) offers multiple possible benefits in this direction over the VRP, but seems too limited by insufficient specifications and inadequate available data to provide adequate protection for occupants. Ventilation system designers rarely choose to use it, finding it too arbitrary and requiring much non-engineering judgment and information that is not readily available. This report suggests strategies to revise the current ASHRAE IAQP to reduce its current limitations. These

  13. Year 2000 commercial issues

    Energy Technology Data Exchange (ETDEWEB)

    Kratz, M.P.J.; Booth, R.T. [Bennett Jones, Calgary, AB (Canada)

    1998-12-31

    This presentation focused on commercial aspects of Y2K, including: (1) special communication issues, (2) outsourcing transactions, (3) joint ventures and the significance for the oil and gas industry, and (4) contingency planning. Communication issues involve interaction with suppliers and vendors of critical systems, liability for Y2K communications (misrepresentation, defamation, promissory estoppel, statutory liability), securities disclosure (Canadian and US SEC requirements), protected communications, and protection for Year 2000 statements. Outsourcing problems highlighted include resistance of suppliers to assume responsibility for Y2K problem remediation, factors which support and negate supplier responsibility, scope of suppliers' obligation, and warranties in respect of third party software. Regarding joint ventures, questions concerning limitations on liability, supply warranties, stand-by arrangements, stockpiling inventory, indemnities, confidentiality, operator compensation versus operator risk, and insurance were raised and addressed. Among contingency planning issues the questions of Y2K legal audit, and disclosure aspects of contingency planning were the featured concerns. figs.

  14. Year 2000 commercial issues

    International Nuclear Information System (INIS)

    Kratz, M.P.J.; Booth, R.T.

    1998-01-01

    This presentation focused on commercial aspects of Y2K, including: (1) special communication issues, (2) outsourcing transactions, (3) joint ventures and the significance for the oil and gas industry, and (4) contingency planning. Communication issues involve interaction with suppliers and vendors of critical systems, liability for Y2K communications (misrepresentation, defamation, promissory estoppel, statutory liability), securities disclosure (Canadian and US SEC requirements), protected communications, and protection for Year 2000 statements. Outsourcing problems highlighted include resistance of suppliers to assume responsibility for Y2K problem remediation, factors which support and negate supplier responsibility, scope of suppliers' obligation, and warranties in respect of third party software. Regarding joint ventures, questions concerning limitations on liability, supply warranties, stand-by arrangements, stockpiling inventory, indemnities, confidentiality, operator compensation versus operator risk, and insurance were raised and addressed. Among contingency planning issues the questions of Y2K legal audit, and disclosure aspects of contingency planning were the featured concerns. figs

  15. Sandia Software Guidelines, Volume 2. Documentation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standards for software documentation, this volume provides guidance in the selection of an adequate document set for a software project and example formats for many types of software documentation. A tutorial on life cycle documentation is also provided. Extended document thematic outlines and working examples of software documents are available on electronic media as an extension of this volume.

  16. An engineering context for software engineering

    OpenAIRE

    Riehle, Richard D.

    2008-01-01

    New engineering disciplines are emerging in the late Twentieth and early Twenty-first Century. One such emerging discipline is software engineering. The engineering community at large has long harbored a sense of skepticism about the validity of the term software engineering. During most of the fifty-plus years of software practice, that skepticism was probably justified. Professional education of software developers often fell short of the standard expected for conventional engineers; so...

  17. Software And Systems Engineering Risk Management

    Science.gov (United States)

    2010-04-01

    Software and Systems Engineering Risk Management. John Walz, VP Technical and Conferences Activities, IEEE Computer Society; Vice-Chair Planning, Software & Systems Engineering Standards Committee, IEEE Computer Society; US TAG to ISO TMB Risk Management Working Group. Standards referenced: 2004 COSO Enterprise Risk Management (RSKM) Framework; 2006 ISO/IEC 16085 Risk Management Process; 2008 ISO/IEC 12207 Software Lifecycle Processes; 2009 ISO/IEC ...

  18. Producing and supporting sharable software

    International Nuclear Information System (INIS)

    Johnstad, H.; Nicholls, J.

    1987-02-01

    A survey is reported that addressed the question of shareable software for the High Energy Physics community. Statistics are compiled for the responses of 54 people attending a conference on the subject of shareable software to a questionnaire which addressed the usefulness of shareable software, preference of programming language, and source management tools. The results reflect a continued need for shareable software in the High Energy Physics community and a desire that the effort be coordinated. A strong mandate is also claimed for large facilities to support the community with software and to act as distribution points. Considerable interest is expressed in languages other than FORTRAN, and the desire for standards or rules in programming is expressed. A need is identified for source management tools.

  19. National Software Reference Library (NSRL)

    Science.gov (United States)

    National Software Reference Library (NSRL) (PC database for purchase)   A collaboration of the National Institute of Standards and Technology (NIST), the National Institute of Justice (NIJ), the Federal Bureau of Investigation (FBI), the Defense Computer Forensics Laboratory (DCFL), the U.S. Customs Service, software vendors, and state and local law enforcement organizations, the NSRL is a tool to assist in fighting crime involving computers.

  20. Experiment to evaluate software safety

    International Nuclear Information System (INIS)

    Soubies, B.; Henry, J.Y.

    1994-01-01

    The process of licensing nuclear power plants for operation consists of mandatory steps featuring detailed examination of the instrumentation and control system by the safety authorities, including its software. The criticality of this software obliges the manufacturer to develop it in accordance with the IEC 880 standard 'Computer software in nuclear power plant safety systems' issued by the International Electrotechnical Commission. The evaluation approach, a two-stage assessment, is described in detail. In this context, the IPSN (Institute of Protection and Nuclear Safety), the technical support body of the safety authority, uses the MALPAS tool to analyse the quality of the programs. (R.P.). 4 refs

  1. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    Keywords: software license, software usage, ELA, Software as a Service, SaaS, Software Asset ... Acronyms: PaaS, Platform as a Service; SaaS, Software as a Service; SAM, Software Asset Management; SMS, System Management Server; SEWP, Solutions for Enterprise Wide ... With the delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service.

  2. Free software and open source databases

    Directory of Open Access Journals (Sweden)

    Napoleon Alexandru SIRITEANU

    2006-01-01

    The emergence of free/open source software (FS/OSS) enterprises seeks to push software development out of the academic stream into the commercial mainstream, and as a result, end-user applications such as open source database management systems (PostgreSQL, MySQL, Firebird) are becoming more popular. Companies like Sybase, Oracle, Sun, and IBM are increasingly implementing open source strategies and porting programs/applications into the Linux environment. Open source software is redefining the software industry in general and database development in particular.

  3. Experimental research control software system

    International Nuclear Information System (INIS)

    Cohn, I A; Kovalenko, A G; Vystavkin, A N

    2014-01-01

    A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, significantly reducing the effort needed to automate an experimental setup. In particular, minimal programming skills are required, and the scripts remain easy for supervisors to review. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed through an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library handles interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development, with new features being implemented, it is already used in our laboratory for the automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.
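    The design the record describes, a modular interface library that hardware drivers plug into, driven by short imperative scripts, can be sketched as follows. This is a hypothetical illustration, not the authors' actual API; the registry, driver, and script names are all invented for the example, and a mock driver stands in for real hardware.

```python
# Hypothetical sketch: drivers register with a modular interface layer, and
# an imperative "script" uses them without knowing hardware details.

class InterfaceRegistry:
    """Maps instrument names to driver objects (the modular interface idea)."""
    def __init__(self):
        self._drivers = {}

    def register(self, name, driver):
        self._drivers[name] = driver

    def get(self, name):
        return self._drivers[name]

class MockThermometer:
    """Stands in for a real hardware driver (e.g., over GPIB or serial)."""
    def __init__(self, readings):
        self._readings = iter(readings)

    def read_kelvin(self):
        return next(self._readings)

def run_script(registry, setpoint_k, n_samples):
    """An imperative script: acquire n samples, report mean and in-range flag."""
    thermo = registry.get("cryostat_thermometer")
    samples = [thermo.read_kelvin() for _ in range(n_samples)]
    mean = sum(samples) / len(samples)
    return {"mean_k": mean, "at_setpoint": abs(mean - setpoint_k) < 0.05}
```

    Because the script only talks to the registry, the same script runs unchanged against real drivers in the lab or mock drivers in an analyst's simulation, which is the cross-environment benefit the abstract emphasizes.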

  4. Experimental research control software system

    Science.gov (United States)

    Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.

    2014-05-01

    A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, significantly reducing the effort needed to automate an experimental setup. In particular, minimal programming skills are required, and the scripts remain easy for supervisors to review. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed through an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library handles interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development, with new features being implemented, it is already used in our laboratory for the automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.

  5. Assessing the Army’s Software Patch Management Process

    Science.gov (United States)

    2016-03-04

    software maker or to antivirus vendors (Zetter, 2014). Fixing such a vulnerability within the zero-day period requires teamwork across multiple ... Commercial-Off-the-Shelf Software

  6. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out ... to new specialized frameworks. New and specialized frameworks account for the majority of the contributions found (approx. 38%). Furthermore, we find a growing interest in success factors (approx. 16%) to aid companies in conducting SPI and in adapting agile principles and practices for SPI (approx. 10 ...

  7. Dependability Analysis Methods For Configurable Software

    International Nuclear Information System (INIS)

    Dahll, Gustav; Pulkkinen, Urho

    1996-01-01

    Configurable software systems are systems which are built up from standard software components in the same way that a hardware system is built up from standard hardware components. Such systems are often used in the control of NPPs, including in safety-related applications, so a reliability analysis of such systems is necessary. This report discusses what configurable software is and what is particular about the reliability assessment of such software. Two techniques commonly used in traditional reliability analysis, viz. failure mode, effect and criticality analysis (FMECA) and fault tree analysis, are investigated. A real example is used to illustrate the discussed methods. Various aspects relevant to the assessment of software reliability in such systems are discussed. Finally, some models for quantitative software reliability assessment applicable to configurable software systems are described. (author)
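    To make the fault tree technique concrete, here is a minimal sketch (not taken from the report) of evaluating a top-event probability for a hypothetical configuration, assuming independent basic events. An OR gate fails if any input fails; an AND gate fails only if all inputs fail.

```python
# Fault tree gate evaluation under the independence assumption.

def p_or(*probs):
    """P(at least one input fails) = 1 - product of survival probabilities."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*probs):
    """P(all inputs fail) = product of failure probabilities."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Hypothetical configuration: two redundant sensor channels (AND gate)
# feeding a logic unit whose failure alone also fails the system (OR gate).
p_sensor = 1e-3
p_logic = 1e-4
p_top = p_or(p_and(p_sensor, p_sensor), p_logic)
```

    The redundancy shows up directly in the numbers: the sensor pair contributes only about 1e-6 to the top event, so the single logic unit dominates system unreliability, which is exactly the kind of insight a fault tree analysis of a configurable system is meant to expose.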

  8. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  9. Vertical bone measurements from cone beam computed tomography images using different software packages

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Livia Almeida Bueno; Freitas, Deborah Queiroz, E-mail: tataventorini@hotmail.com [Universidade Estadual de Campinas (UNICAMP), Piracicaba, SP (Brazil). Faculdade de Odontologia

    2015-03-01

    This article aimed at comparing the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass correlation coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; one-way analysis of variance with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The smallest differences between the software-derived measurements and the gold standard were obtained with OnDemand3D and KDIS3D (−0.11 and −0.14 mm, respectively), and the greatest with XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data. (author)
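    The study's core comparison, each package's mean deviation (bias) from the sectioned-mandible gold standard, is simple to compute. The sketch below uses invented toy numbers, not the study's data, purely to illustrate the calculation.

```python
# Mean bias of software-derived measurements against a gold standard.

def mean_bias(measurements, gold):
    """Average signed difference (software minus gold standard), in mm."""
    diffs = [m - g for m, g in zip(measurements, gold)]
    return sum(diffs) / len(diffs)

gold = [10.0, 12.5, 9.8, 11.2]          # mm, physical (sectioned) measurements
packages = {
    "A": [10.2, 12.8, 10.0, 11.5],      # hypothetical software readings
    "B": [9.9, 12.4, 9.7, 11.1],
}
biases = {name: mean_bias(vals, gold) for name, vals in packages.items()}
```

    A positive bias means the package overestimates bone height and a negative one means it underestimates; in the study the biases were all small (within about 0.25 mm) and not statistically distinguishable from zero.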

  10. Vertical bone measurements from cone beam computed tomography images using different software packages

    International Nuclear Information System (INIS)

    Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Livia Almeida Bueno; Freitas, Deborah Queiroz

    2015-01-01

    This article aimed at comparing the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass correlation coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; one-way analysis of variance with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The smallest differences between the software-derived measurements and the gold standard were obtained with OnDemand3D and KDIS3D (−0.11 and −0.14 mm, respectively), and the greatest with XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data. (author)

  11. Software for Optimizing Quality Assurance of Other Software

    Science.gov (United States)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
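    The allocation-as-optimization view in the abstract can be illustrated with a simple greedy heuristic: fund assurance activities in order of risk reduction per unit cost until the budget runs out. This is a hypothetical sketch of the idea, not the paper's actual tool, and the activity names, costs, and risk-reduction values are invented for the example.

```python
# Greedy budget allocation: rank activities by risk reduction per unit cost.

def allocate(activities, budget):
    """activities: list of (name, cost, risk_reduction) tuples."""
    chosen, spent = [], 0.0
    ranked = sorted(activities, key=lambda a: a[2] / a[1], reverse=True)
    for name, cost, reduction in ranked:
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen, spent

acts = [("code inspection", 4.0, 8.0),
        ("unit tests", 3.0, 5.0),
        ("design review", 2.0, 5.0),
        ("traceability matrix", 5.0, 4.0)]
chosen, spent = allocate(acts, budget=9.0)
```

    A greedy ranking is not guaranteed to be optimal for this 0/1 selection problem (it is a knapsack variant), which is precisely why the work described here applies a proper optimization treatment rather than a rule of thumb.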

  12. Software reliability studies

    Science.gov (United States)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  13. High-performance commercial building systems

    Energy Technology Data Exchange (ETDEWEB)

    Selkowitz, Stephen

    2003-10-01

    This report summarizes key technical accomplishments resulting from the three year PIER-funded R&D program, "High Performance Commercial Building Systems" (HPCBS). The program targets the commercial building sector in California, an end-use sector that accounts for about one-third of all California electricity consumption and an even larger fraction of peak demand, at a cost of over $10B/year. Commercial buildings also have a major impact on occupant health, comfort and productivity. Building design and operations practices that influence energy use are deeply engrained in a fragmented, risk-averse industry that is slow to change. Although California's aggressive standards efforts have resulted in new buildings designed to use less energy than those constructed 20 years ago, the actual savings realized are still well below technical and economic potentials. The broad goal of this program is to develop and deploy a set of energy-saving technologies, strategies, and techniques, and improve processes for designing, commissioning, and operating commercial buildings, while improving health, comfort, and performance of occupants, all in a manner consistent with sound economic investment practices. Results are to be broadly applicable to the commercial sector for different building sizes and types, e.g. offices and schools, for different classes of ownership, both public and private, and for owner-occupied as well as speculative buildings. The program aims to facilitate significant electricity use savings in the California commercial sector by 2015, while assuring that these savings are affordable and promote high quality indoor environments. The five linked technical program elements contain 14 projects with 41 distinct R&D tasks. Collectively they form a comprehensive Research, Development, and Demonstration (RD&D) program with the potential to capture large savings in the commercial building sector, providing significant economic benefits to

  14. Comparison of Chinese and European k0 software

    International Nuclear Information System (INIS)

    Sasajima, Fumio

    2004-01-01

    Element determination by neutron activation analysis is commonly done by the relative method, using comparison standard materials. However, when a simultaneous multi-element analysis of an unknown sample is carried out, this method requires the advance preparation of a reference material for each element present, their simultaneous irradiation with the sample, and measurement under the same conditions. It is a demanding technique, involving laborious work such as arranging reference materials. On the other hand, the k0 method does not usually require reference materials and allows easier and more accurate simultaneous multi-element analysis; it is therefore widely practiced in many countries, including European nations. This report describes two kinds of k0 software (KAYZERO/SOLCOI, ADVNAA), their characteristics, and the results of environmental standard sample (NIST 1632c, NIES No.8, JB-3) analyses using those software packages. As a result, both software packages achieved an accuracy within about 10% for all but a few elements. Each has drawbacks and advantages in its characteristics and features, although it might not be reasonable to compare two products with different development purposes. (author)

  15. Auditing Community Software Development

    Directory of Open Access Journals (Sweden)

    Mészáros Gergely

    2015-12-01

    Full Text Available In accordance with European efforts related to Critical Information Infrastructure Protection, a special department called LRL-IBEK has been formed in Hungary under the Disaster Management authority. While the specific security issues of commercial applications are well understood and regulated by widely applied standards, an increasing share of information systems is developed partly or entirely in a different way, by the community. In this paper, various issues of the open development style are discussed against the high requirements of Critical Information Infrastructures, and possible countermeasures are suggested for the identified problems.

  16. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter

    2016-01-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new emerging approaches? What are the open issues? Still, we struggle to answer the question: what is the current state of SPI and related research? We present initial results from a systematic mapping study to shed light on the field of SPI and to draw conclusions for future research directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories. In particular, standard SPI models are analyzed and evaluated for applicability...

  17. Software Innovation in a Mission Critical Environment

    Science.gov (United States)

    Fredrickson, Steven

    2015-01-01

    Operating in mission-critical environments requires trusted solutions, and the preference for "tried and true" approaches presents a potential barrier to infusing innovation into mission-critical systems. This presentation explores opportunities to overcome this barrier in the software domain. It outlines specific areas of innovation in software development achieved by the Johnson Space Center (JSC) Engineering Directorate in support of NASA's major human spaceflight programs, including International Space Station, Multi-Purpose Crew Vehicle (Orion), and Commercial Crew Programs. Software engineering teams at JSC work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements for genuinely mission critical applications. The innovations described, including the use of NASA Core Flight Software and its associated software tool chain, can lead to software that is more affordable, more reliable, better modelled, more flexible, more easily maintained, better tested, and enabling of automation.

  18. TOGAF usage in outsourcing of software development

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2013-12-01

    Full Text Available TOGAF is an Enterprise Architecture framework that provides a method for developing Enterprise Architecture, called the architecture development method (ADM). The purpose of this paper is to examine whether TOGAF ADM can be used for developing software application architecture. Because software application architecture is one of the disciplines in the application development life cycle, it is important to find out how an enterprise architecture development method can support application architecture development. Having an open standard that can be used in application architecture development could help in the outsourcing of software development. If ADM can be used for software application architecture development, then its usability in the outsourcing of software development can also be considered.

  19. Ubuntuism, commodification, and the software dialectic

    OpenAIRE

    Chege, Mike

    2008-01-01

    “Free as in speech, but not free as in beer,” is the refrain made famous by Richard Stallman, the standard-bearer of the free software movement. However, many free software advocates seem to be of the opinion that the purity of free software is somehow tainted by any preoccupation with money or profit. Inevitably, this has implications for the economic sustainability of free software, for without a source of income, how can free software hope to survive? The challenge of finding a way to ensu...

  20. NASA Software Engineering Benchmarking Study

    Science.gov (United States)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    was its software assurance practices, which seemed to rate well in comparison to the other organizational groups and also seemed to include a larger scope of activities. An unexpected benefit of the software benchmarking study was the identification of many opportunities for collaboration in areas including metrics, training, sharing of CMMI experiences and resources such as instructors and CMMI Lead Appraisers, and even sharing of assets such as documented processes. A further unexpected benefit of the study was the feedback on NASA practices that was received from some of the organizations interviewed. From that feedback, other potential areas where NASA could improve were highlighted, such as accuracy of software cost estimation and budgetary practices. The detailed report contains discussion of the practices noted in each of the topic areas, as well as a summary of observations and recommendations from each of the topic areas. The resulting 24 recommendations from the topic areas were then consolidated to eliminate duplication and culled into a set of 14 suggested actionable recommendations. This final set of actionable recommendations, listed below, comprises items that can be implemented to improve NASA's software engineering practices and to help address many of the items listed in the NASA top software engineering issues. 1. Develop and implement standard contract language for software procurements. 2. Advance accurate and trusted software cost estimates for both procured and in-house software, and improve the capture of actual cost data to facilitate further improvements. 3. Establish a consistent set of objectives and expectations, specifically types of metrics at the Agency level, so key trends and models can be identified and used to continuously improve software processes and each software development effort. 4. Maintain the CMMI Maturity Level requirement for critical NASA projects and use CMMI to measure organizations developing software for NASA. 5

  1. A within-trial cost-effectiveness analysis of primary care referral to a commercial provider for weight loss treatment, relative to standard care—an international randomised controlled trial

    Science.gov (United States)

    Fuller, N R; Colagiuri, S; Schofield, D; Olson, A D; Shrestha, R; Holzapfel, C; Wolfenstetter, S B; Holle, R; Ahern, A L; Hauner, H; Jebb, S A; Caterson, I D

    2013-01-01

    Background: Due to the high prevalence of overweight and obesity there is a need to identify cost-effective approaches for weight loss in primary care and community settings. Objective: We evaluated the cost effectiveness of two weight loss programmes of 1-year duration, either standard care (SC) as defined by national guidelines, or a commercial provider (Weight Watchers) (CP). Design: This analysis was based on a randomised controlled trial of 772 adults (87% female; age 47.4±12.9 years; body mass index 31.4±2.6 kg m−2) recruited by health professionals in primary care in Australia, United Kingdom and Germany. Both a health sector and societal perspective were adopted to calculate the cost per kilogram of weight loss and the ICER, expressed as the cost per quality adjusted life year (QALY). Results: The cost per kilogram of weight loss was USD122, 90 and 180 for the CP in Australia, the United Kingdom and Germany, respectively. For SC the cost was USD138, 151 and 133, respectively. From a health-sector perspective, the ICER for the CP relative to SC was USD18 266, 12 100 and 40 933 for Australia, the United Kingdom and Germany, respectively. Corresponding societal ICER figures were USD31 663, 24 996 and 51 571. Conclusion: The CP was a cost-effective approach from a health funder and societal perspective. Despite participants in the CP group attending two to three times more meetings than the SC group, the CP was still cost effective even including these added patient travel costs. This study indicates that it is cost effective for general practitioners (GPs) to refer overweight and obese patients to a CP, which may be better value than expending public funds on GP visits to manage this problem. PMID:22929209
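
    As background (not part of the trial report above), the two headline metrics in a cost-effectiveness analysis of this kind are cost per unit of effect and the incremental cost-effectiveness ratio (ICER). A minimal sketch, with hypothetical inputs for illustration only:

    ```python
    # Illustrative sketch of the two metrics used in within-trial
    # cost-effectiveness analyses such as the one summarized above.
    # All numeric inputs below are hypothetical, not trial data.

    def cost_per_kg(total_cost_usd: float, total_kg_lost: float) -> float:
        """Cost per kilogram of weight loss for one programme arm."""
        return total_cost_usd / total_kg_lost

    def icer(cost_new: float, cost_std: float,
             qaly_new: float, qaly_std: float) -> float:
        """Incremental cost-effectiveness ratio: extra cost per extra
        QALY of the new intervention relative to standard care."""
        return (cost_new - cost_std) / (qaly_new - qaly_std)

    print(cost_per_kg(61_000, 500))          # → 122.0 USD per kg
    print(icer(900.0, 350.0, 0.75, 0.72))    # extra cost per QALY gained
    ```

    An ICER is then judged against a willingness-to-pay threshold (commonly tens of thousands of USD per QALY) to decide whether the intervention is cost effective.
    
    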

  2. Evolution of Secondary Software Businesses: Understanding Industry Dynamics

    Science.gov (United States)

    Tyrväinen, Pasi; Warsta, Juhani; Seppänen, Veikko

    The primary software industry originates from IBM's decision to unbundle software-related computer system development activities to external partners. Outsourcing an enterprise's internal software development activity in this way is a common means of starting a new software business serving a vertical software market: it combines knowledge of the vertical market process with competence in software development. In this research, we present and analyze the key figures of the Finnish secondary software industry in order to quantify its interaction with the primary software industry during the period 2000-2003. On the basis of the empirical data, we present a model for the evolution of a secondary software business that makes the industry dynamics explicit. It represents the shift from internal software developed for competitive advantage to the development of products supporting standard business processes on top of standardized technologies. We also discuss the implications for software business strategies in each phase.

  3. ESTSC - Software Best Practices

    Science.gov (United States)

    DOE Scientific and Technical Software Best Practices December 2010 Table of Contents 1.0 Introduction 2.0 Responsibilities 2.1 OSTI/ESTSC 2.2 SIACs 2.3 Software Submitting Sites/Creators 2.4 Software Sensitivity Review 3.0 Software Announcement and Submission 3.1 STI Software Appropriate for Announcement 3.2

  4. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS...[2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the...

  5. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
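
    The monitoring idea described above can be sketched in a few lines. This is an illustrative runtime requirements monitor in the spirit of the tool, not a reproduction of SAGE or SAVAnT; the constraint names and state fields are hypothetical:

    ```python
    # Sketch of runtime monitoring of an executing program against
    # prespecified requirements constraints. Each constraint is a named
    # predicate over an observed program-state snapshot.

    from typing import Callable, Dict, List

    Constraint = Callable[[Dict[str, float]], bool]

    # Hypothetical constraints, standing in for a SAGE-style specification.
    constraints: Dict[str, Constraint] = {
        "tank_pressure_below_limit": lambda s: s["pressure_kpa"] < 800.0,
        "temperature_in_band": lambda s: 250.0 <= s["temp_k"] <= 400.0,
    }

    def check_state(state: Dict[str, float]) -> List[str]:
        """Return the names of constraints violated by one snapshot."""
        return [name for name, pred in constraints.items() if not pred(state)]

    print(check_state({"pressure_kpa": 750.0, "temp_k": 420.0}))
    # → ['temperature_in_band']
    ```

    A visualization layer such as the one the report describes would then render these violation events over the program's execution timeline.
    
    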

  6. The Synthetic Biology Open Language (SBOL) provides a community standard for communicating designs in synthetic biology.

    Science.gov (United States)

    Galdzicki, Michal; Clancy, Kevin P; Oberortner, Ernst; Pocock, Matthew; Quinn, Jacqueline Y; Rodriguez, Cesar A; Roehner, Nicholas; Wilson, Mandy L; Adam, Laura; Anderson, J Christopher; Bartley, Bryan A; Beal, Jacob; Chandran, Deepak; Chen, Joanna; Densmore, Douglas; Endy, Drew; Grünberg, Raik; Hallinan, Jennifer; Hillson, Nathan J; Johnson, Jeffrey D; Kuchinsky, Allan; Lux, Matthew; Misirli, Goksel; Peccoud, Jean; Plahar, Hector A; Sirin, Evren; Stan, Guy-Bart; Villalobos, Alan; Wipat, Anil; Gennari, John H; Myers, Chris J; Sauro, Herbert M

    2014-06-01

    The re-use of previously validated designs is critical to the evolution of synthetic biology from a research discipline to an engineering practice. Here we describe the Synthetic Biology Open Language (SBOL), a proposed data standard for exchanging designs within the synthetic biology community. SBOL represents synthetic biology designs in a community-driven, formalized format for exchange between software tools, research groups and commercial service providers. The SBOL Developers Group has implemented SBOL as an XML/RDF serialization and provides software libraries and specification documentation to help developers implement SBOL in their own software. We describe early successes, including a demonstration of the utility of SBOL for information exchange between several different software tools and repositories from both academic and industrial partners. As a community-driven standard, SBOL will be updated as synthetic biology evolves to provide specific capabilities for different aspects of the synthetic biology workflow.

  7. Building a virtual ligand screening pipeline using free software: a survey.

    Science.gov (United States)

    Glaab, Enrico

    2016-03-01

    Virtual screening, the search for bioactive compounds via computational methods, provides a wide range of opportunities to speed up drug development and reduce the associated risks and costs. While virtual screening is already a standard practice in pharmaceutical companies, its applications in preclinical academic research still remain under-exploited, in spite of an increasing availability of dedicated free databases and software tools. In this survey, an overview of recent developments in this field is presented, focusing on free software and data repositories for screening as alternatives to their commercial counterparts, and outlining how available resources can be interlinked into a comprehensive virtual screening pipeline using typical academic computing facilities. Finally, to facilitate the set-up of corresponding pipelines, a downloadable software system is provided, using platform virtualization to integrate pre-installed screening tools and scripts for reproducible application across different operating systems. © The Author 2015. Published by Oxford University Press.
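
    As a toy illustration of the kind of filtering stage such a pipeline typically begins with (an addition here, not taken from the survey), a Lipinski rule-of-five pre-filter can be sketched as follows; in practice the molecular properties would come from a toolkit such as RDKit, whereas here they are plain dictionaries:

    ```python
    # Hypothetical pre-filter for a virtual screening pipeline: keep only
    # compounds passing Lipinski's rule of five. Compound names and
    # property values below are made up for illustration.

    def passes_lipinski(mol: dict) -> bool:
        """Rule of five: MW <= 500, logP <= 5, H-bond donors <= 5,
        H-bond acceptors <= 10."""
        return (mol["mol_weight"] <= 500
                and mol["logp"] <= 5
                and mol["h_donors"] <= 5
                and mol["h_acceptors"] <= 10)

    library = [
        {"name": "cpd-1", "mol_weight": 320.4, "logp": 2.1,
         "h_donors": 2, "h_acceptors": 5},
        {"name": "cpd-2", "mol_weight": 712.9, "logp": 6.3,
         "h_donors": 6, "h_acceptors": 12},
    ]

    hits = [m["name"] for m in library if passes_lipinski(m)]
    print(hits)  # → ['cpd-1']
    ```

    Surviving compounds would then proceed to the more expensive docking and scoring stages of the pipeline.
    
    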

  8. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, ``Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses American National Standards...

  9. Building Energy Management Open Source Software

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Saifur [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States)

    2017-08-25

    Funded by the U.S. Department of Energy in November 2013, a Building Energy Management Open Source Software (BEMOSS) platform was engineered to improve sensing and control of equipment in small- and medium-sized commercial buildings. According to the Energy Information Administration (EIA), small (5,000 square feet or smaller) and medium-sized (5,001 to 50,000 square feet) commercial buildings constitute about 95% of all commercial buildings in the U.S. These buildings typically do not have Building Automation Systems (BAS) to monitor and control building operation. While commercial BAS solutions exist, including those from Siemens, Honeywell, Johnson Controls and many more, they are not cost effective in the context of small- and medium-sized commercial buildings, and typically work only with specific controller products from the same company. BEMOSS targets small- and medium-sized commercial buildings to address this gap.

  10. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP) [de

  11. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  12. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  13. Commercial microwave space power

    International Nuclear Information System (INIS)

    Siambis, J.; Gregorwich, W.; Walmsley, S.; Shockey, K.; Chang, K.

    1991-01-01

    This paper reports on central commercial space power: generating power via large-scale solar arrays and distributing it to satellites via docking, tethering, or beamed power such as microwave or laser beams. This approach is being investigated as a potentially advantageous alternative to present-day technology, in which each satellite carries its own power-generating capability. The cost, size, and weight of electrical power service, together with overall mission requirements and flexibility, are the principal selection criteria, with standard satellite-based solar array panels as the reference point. This paper presents and investigates a current-technology design point for beamed-microwave commercial space power. The design point requires that 25 kW be delivered to the user load with 30% overall system efficiency. The key elements of the design point are: an efficient rectenna at the user end; a high-gain, narrow-beamwidth, efficient antenna at the central space power station; and a reliable, efficient CW microwave tube. Design trades to optimize the proposed near-term design point and to explore the characteristics of future systems were performed. Future developments for making the beamed-microwave space power approach more competitive with docking and tethering are discussed
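
    The stated design point (25 kW delivered at 30% end-to-end efficiency) implies a simple power budget, sketched below. The split into stage efficiencies is hypothetical, chosen only so the product lands near the quoted 30%; the paper's actual allocations are not reproduced here:

    ```python
    # Back-of-the-envelope power budget for a beamed-microwave link:
    # solar-array power needed at the station to deliver a given load
    # power at a given end-to-end efficiency.

    def required_source_power(p_delivered_kw: float, eta_total: float) -> float:
        """Array power the station must generate to deliver p_delivered_kw."""
        return p_delivered_kw / eta_total

    # Hypothetical stage efficiencies, for illustration only.
    eta_tube = 0.65       # microwave tube DC-to-RF
    eta_beam = 0.60       # beam collection at the rectenna aperture
    eta_rectenna = 0.77   # rectenna RF-to-DC
    eta_total = eta_tube * eta_beam * eta_rectenna  # ≈ 0.30

    print(round(required_source_power(25.0, 0.30), 1))  # → 83.3 kW at the array
    ```

    This is why the tube, antenna, and rectenna efficiencies are singled out above: each stage multiplies into the end-to-end figure, so a few points lost in any one of them directly inflates the required solar array.
    
    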

  14. 7 CFR 51.2278 - U.S. Commercial.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false U.S. Commercial. 51.2278 Section 51.2278 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... Standards for Shelled English Walnuts (Juglans Regia) Grades § 51.2278 U.S. Commercial. “U.S. Commercial...

  15. Commercial Off-The-Shelf (COTS) Avionics Software Study

    National Research Council Canada - National Science Library

    Krodel, Jim

    2001-01-01

    .... The motivation is even a bit beyond monetary resources as the scarcity of highly trained personnel that can develop such systems has also provided fuel to the attractiveness of considering reuse...

  16. Re-purposing commercial entertainment software for military use

    OpenAIRE

    DeBrine, Jeffrey D.; Morrow, Donald E.

    2000-01-01

    Approved for public release; distribution is unlimited Virtual environments have achieved widespread use in the military in applications such as theater planning, training, and architectural walkthroughs. These applications are generally expensive and inflexible in design and implementation. Re-purposing these applications to meet the dynamic modeling and simulation needs of the military can be awkward or impossible. Video games are designed to be both technologically advanced and flexible...

  17. ARROW (Version 2) Commercial Software Validation and Configuration Control

    International Nuclear Information System (INIS)

    HEARD, F.J.

    2000-01-01

    ARROW (Version 2), a compressible flow piping network modeling and analysis computer program from Applied Flow Technology, was installed for use at the U.S. Department of Energy Hanford Site near Richland, Washington

  18. ARROW (Version 2) Commercial Software Validation and Configuration Control

    Energy Technology Data Exchange (ETDEWEB)

    HEARD, F.J.

    2000-02-10

    ARROW (Version 2), a compressible flow piping network modeling and analysis computer program from Applied Flow Technology, was installed for use at the U.S. Department of Energy Hanford Site near Richland, Washington.

  19. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    White, G

    2006-01-01

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden ''backdoors'' is crucial to a project's success. In this context, ''authentication'' is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects

  20. Computer systems and software engineering

    Science.gov (United States)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  1. Robotic Software for the Thacher Observatory

    Science.gov (United States)

    Lawrence, George; Luebbers, Julien; Eastman, Jason D.; Johnson, John A.; Swift, Jonathan

    2018-06-01

    The Thacher Observatory—a research and educational facility located in Ojai, CA—uses a 0.7 meter telescope to conduct photometric research on a variety of targets including eclipsing binaries, exoplanet transits, and supernovae. Currently, observations are automated using commercial software. In order to expand the flexibility for specialized scientific observations and to increase the educational value of the facility on campus, we are adapting and implementing the custom observatory control software and queue scheduling developed for the Miniature Exoplanet Radial Velocity Array (MINERVA) to the Thacher Observatory. We present the design and implementation of this new software as well as its demonstrated functionality on the Thacher Observatory.
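
    The core of queue scheduling for a robotic telescope can be sketched in a few lines. This is a minimal illustration in the spirit of such software, not the actual MINERVA-derived scheduler; the target names, priorities, and altitude limit are all made up:

    ```python
    # Sketch of priority-based queue scheduling: at each decision point,
    # pick the highest-priority target that is currently observable.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Target:
        name: str
        priority: float      # larger = more important
        altitude_deg: float  # current altitude above the horizon

    def next_target(queue: List[Target],
                    min_altitude: float = 30.0) -> Optional[Target]:
        """Highest-priority target above the altitude limit, else None."""
        observable = [t for t in queue if t.altitude_deg >= min_altitude]
        return max(observable, key=lambda t: t.priority, default=None)

    queue = [
        Target("eclipsing-binary-1", priority=0.8, altitude_deg=55.0),
        Target("transit-2", priority=0.9, altitude_deg=12.0),  # too low to observe
        Target("supernova-3", priority=0.6, altitude_deg=70.0),
    ]
    print(next_target(queue).name)  # → eclipsing-binary-1
    ```

    A production scheduler adds many more constraints (weather, moon separation, time-critical transit windows), but the select-best-observable loop above is the common skeleton.
    
    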

  2. Speech to Text Software Evaluation Report

    CERN Document Server

    Martins Santo, Ana Luisa

    2017-01-01

    This document compares the out-of-box performance of three commercially available speech recognition software packages: Vocapia VoxSigma, Google Cloud Speech, and Limecraft Transcriber. A set of evaluation criteria and test methods for speech recognition software is defined. Evaluation of the packages in noisy environments is also included for testing purposes. Recognition accuracy was compared across noisy environments and languages; testing in an "ideal" non-noisy environment of a quiet room was also performed for comparison.
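
    The standard accuracy metric for comparisons like this is word error rate (WER), the word-level edit distance between a reference transcript and the recognizer's output, normalized by reference length. The report's own scoring method is not specified here; the following is a generic sketch:

    ```python
    # Word error rate: (substitutions + insertions + deletions) divided
    # by the number of words in the reference, via dynamic programming.

    def wer(reference: str, hypothesis: str) -> float:
        ref, hyp = reference.split(), hypothesis.split()
        # d[i][j] = edit distance between ref[:i] and hyp[:j]
        d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
        for i in range(len(ref) + 1):
            d[i][0] = i
        for j in range(len(hyp) + 1):
            d[0][j] = j
        for i in range(1, len(ref) + 1):
            for j in range(1, len(hyp) + 1):
                cost = 0 if ref[i - 1] == hyp[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1,        # deletion
                              d[i][j - 1] + 1,        # insertion
                              d[i - 1][j - 1] + cost)  # substitution/match
        return d[len(ref)][len(hyp)] / len(ref)

    print(wer("the cat sat on the mat", "the cat sat on mat"))  # → 1/6 ≈ 0.167
    ```

    A WER of 0 means a perfect transcript; values above 1 are possible when the hypothesis contains many insertions.
    
    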

  3. Open Source Software and the Intellectual Commons.

    Science.gov (United States)

    Dorman, David

    2002-01-01

    Discusses the Open Source Software method of software development and its relationship to control over information content. Topics include digital library resources; reference services; preservation; the legal and economic status of information; technical standards; access to digital data; control of information use; and copyright and patent laws.…

  4. Reflections on Courses for Software Language Engineering

    NARCIS (Netherlands)

    Bagge, A.H.; Lämmel, R.; Zaytsev, V.; Demuth, B.; Stikkolorum, D.

    2014-01-01

    Software Language Engineering (SLE) has emerged as a field in computer science research and software engineering, but it has yet to become entrenched as part of the standard curriculum at universities. Many places have a compiler construction (CC) course and a programming languages (PL) course, but

  5. Technical Support Document for Version 3.9.0 of the COMcheck Software

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan; Halverson, Mark A.; Lucas, R. G.; Richman, Eric E.; Schultz, Ralph W.; Winiarski, David W.

    2011-09-01

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards. Beginning with COMcheck version 3.8.0, support for 90.1-1989, 90.1-1999, and the 1998 IECC are no longer included, but those sections remain in this document for reference purposes.

  6. Technical Support Document for Version 3.4.0 of the COMcheck Software

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan; Halverson, Mark A.; Lucas, Robert G.; Richman, Eric E.; Schultz, Robert W.; Winiarski, David W.

    2007-09-14

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards.

  7. Technical Support Document for Version 3.9.1 of the COMcheck Software

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan; Halverson, Mark A.; Lucas, Robert G.; Richman, Eric E.; Schultz, Robert W.; Winiarski, David W.

    2012-09-01

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards. Beginning with COMcheck version 3.8.0, support for 90.1-1989, 90.1-1999, and the 1998 IECC and version 3.9.0 support for 2000 and 2001 IECC are no longer included, but those sections remain in this document for reference purposes.

  8. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  9. Ensuring Software IP Cleanliness

    OpenAIRE

    Mahshad Koohgoli; Richard Mayer

    2007-01-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  10. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

    multiversion software subject to coincident errors. IEEE Trans. Software Eng. SE-11:1511-1517. Eckhardt, D.E., A.K Caglayan, J.C. Knight, L.D. Lee, D.F...J.C. and N.G. Leveson. 1986. Experimental evaluation of the assumption of independence in multiversion software. IEEE Trans. Software

  11. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  12. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  13. Software - Naval Oceanography Portal

    Science.gov (United States)

USNO Earth Orientation software products: search databases, auxiliary software, supporting software, and the Earth Orientation Matrix Calculator.

  14. Software Engineering Education Directory

    Science.gov (United States)

    1990-04-01

and Engineering (CMSC 735) Codes: GPEV2 * Textbooks: IEEE Tutorial on Models and Metrics for Software Management and Engineering by Basili, Victor R...Software Engineering (Comp 227) Codes: GPRY5 Textbooks: IEEE Tutorial on Software Design Techniques by Freeman, Peter and Wasserman, Anthony I. Software

  15. Commercial Skills Test Information Management System final report and self-sustainability plan : [technology brief].

    Science.gov (United States)

    2014-04-01

The Commercial Skills Test Information Management System (CSTIMS) was developed to address the fraudulent issuance of commercial driver's licenses (CDLs) across the United States. CSTIMS was developed as a Web-based, software-as-a-service system to...

  16. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  17. On the Ambiguity of Commercial Open Source

    Directory of Open Access Journals (Sweden)

    Lucian Luca

    2006-01-01

Open source and commercial applications used to be two separate worlds. The former was the work of amateurs who had little interest in making a profit, while the latter was only profit-oriented and was produced by big companies. Nowadays open source is both a threat and an opportunity to serious businesses of all kinds, generating good profits while delivering low-cost products to customers. The competition between commercial and open source software has impacted the industry and society as a whole. In recent years, the markets for commercial and open source software have been converging rapidly, and it is worth reviewing and discussing the implications of this new paradigm, taking into account arguments for and against it.

  18. Crew Transportation Operations Standards

    Science.gov (United States)

    Mango, Edward J.; Pearson, Don J. (Compiler)

    2013-01-01

    The Crew Transportation Operations Standards contains descriptions of ground and flight operations processes and specifications and the criteria which will be used to evaluate the acceptability of Commercial Providers' proposed processes and specifications.

  19. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

The field of testability is an active, well-established part of the engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or degrade the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the softwa...

  20. Agile software assessment

    OpenAIRE

    Nierstrasz Oscar; Lungu Mircea

    2012-01-01

    Informed decision making is a critical activity in software development but it is poorly supported by common development environments which focus mainly on low level programming tasks. We posit the need for agile software assessment which aims to support decision making by enabling rapid and effective construction of software models and custom analyses. Agile software assessment entails gathering and exploiting the broader context of software information related to the system at hand as well ...

  1. Experience with case tools in the design of process-oriented software

    Science.gov (United States)

    Novakov, Ognian; Sicard, Claude-Henri

    1994-12-01

In accelerator systems such as the CERN PS complex, process equipment has a lifetime that may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design, and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, the lack of real-time simulation tools, and the lack of object-oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.

  2. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment).
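The enactment idea described above can be sketched minimally (this is an illustration, not the actual SDA/TieFlow implementation): tasks with prerequisites are released to assignees only when every prerequisite is complete, i.e. in topological order.

```python
# Toy process-enactment sketch: release tasks in dependency order.
from collections import deque

def enactment_order(tasks):
    """tasks: dict task -> set of prerequisite tasks. Returns release order."""
    pending = {t: set(deps) for t, deps in tasks.items()}
    ready = deque(t for t, d in pending.items() if not d)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for other, deps in pending.items():
            if t in deps:              # prerequisite t is now satisfied
                deps.discard(t)
                if not deps:           # all prerequisites done: release task
                    ready.append(other)
    return order

process = {"design": set(), "code": {"design"},
           "review": {"code"}, "test": {"code"}}
print(enactment_order(process))  # ['design', 'code', 'review', 'test']
```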

  3. Commercial Buildings Characteristics, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1994-04-29

    Commercial Buildings Characteristics 1992 presents statistics about the number, type, and size of commercial buildings in the United States as well as their energy-related characteristics. These data are collected in the Commercial Buildings Energy Consumption Survey (CBECS), a national survey of buildings in the commercial sector. The 1992 CBECS is the fifth in a series conducted since 1979 by the Energy Information Administration. Approximately 6,600 commercial buildings were surveyed, representing the characteristics and energy consumption of 4.8 million commercial buildings and 67.9 billion square feet of commercial floorspace nationwide. Overall, the amount of commercial floorspace in the United States increased an average of 2.4 percent annually between 1989 and 1992, while the number of commercial buildings increased an average of 2.0 percent annually.

  4. Static Checking of Interrupt-driven Software

    DEFF Research Database (Denmark)

    Brylow, Dennis; Damgaard, Niels; Palsberg, Jens

    2001-01-01

    at the assembly level. In this paper we present the design and implementation of a static checker for interrupt-driven Z86-based software with hard real-time requirements. For six commercial microcontrollers, our checker has produced upper bounds on interrupt latencies and stack sizes, as well as verified...
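One idea behind bounding stack sizes statically can be sketched as follows (a simplified illustration, not the Z86 checker itself; names are made up): take the longest path through an acyclic call graph, weighting each routine by its stack frame size.

```python
# Bound worst-case stack depth over an acyclic call graph.
def max_stack(call_graph, frame_size, entry):
    """call_graph: routine -> list of callees (assumed acyclic);
    frame_size: routine -> bytes of stack it uses."""
    def depth(r):
        callees = call_graph.get(r, [])
        return frame_size[r] + (max(map(depth, callees)) if callees else 0)
    return depth(entry)

calls = {"main": ["isr_a", "isr_b"], "isr_a": ["log"], "isr_b": [], "log": []}
frames = {"main": 16, "isr_a": 8, "isr_b": 24, "log": 4}
print(max_stack(calls, frames, "main"))  # 16 + max(8+4, 24) = 40 bytes
```

A real checker must additionally account for interrupts preempting at any point, which adds the worst-case handler stack on top of the deepest ordinary path.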

  5. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  6. Programming Language Software For Graphics Applications

    Science.gov (United States)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  7. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  8. Techniques and tools for software qualification in KNICS

    International Nuclear Information System (INIS)

    Cha, Kyung H.; Lee, Yeong J.; Cheon, Se W.; Kim, Jang Y.; Lee, Jang S.; Kwon, Kee C.

    2004-01-01

This paper describes techniques and tools for qualifying safety software in the Korea Nuclear Instrumentation and Control System (KNICS). Safety software is developed and applied for a Reactor Protection System (RPS), an Engineered Safety Features and Component Control System (ESF-CCS), and a safety Programmable Logic Controller (PLC) in the KNICS. Requirements and design specifications of safety software are written in both natural language and formal specification languages. Statechart is used for the formal specification of software for the ESF-CCS and the safety PLC, while NuSCR is used for the formal specification of software for the RPS. pSET (POSCON Software Engineering Tool) has been developed and utilized as a software development tool for IEC 61131-3 based PLC programming. The qualification of the safety software consists of software verification and validation (V&V) throughout the software life cycle, software safety analysis, software configuration management, software quality assurance, and COTS (Commercial-Off-The-Shelf) dedication. The criteria and requirements for qualifying the safety software have been established in accordance with Software Review Plan (SRP)/Branch Technical Positions (BTP)-14, IEEE Std. 7-4.3.2-1998, NUREG/CR-6463, IEEE Std. 1012-1998, and so on. Figure 1 summarizes the qualification techniques and tools for the safety software.

  9. Design and Implementation of a Mobile Phone Locator Using Software Defined Radio

    National Research Council Canada - National Science Library

    Larsen, Ian P

    2007-01-01

    ...) signal using software defined radio and commodity computer hardware. Using software designed by the GNU free software project as a base, standard GSM packets were transmitted and received over the air, and their arrival times detected...
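The arrival-time detection mentioned here can be illustrated with a small sketch (not the thesis's GNU Radio code; the signals are invented): cross-correlate the received samples with a known reference burst and take the lag of the correlation peak.

```python
# Estimate a burst's arrival time (in samples) by correlation peak.
import numpy as np

def arrival_lag(received, reference):
    corr = np.correlate(received, reference, mode="valid")
    return int(np.argmax(corr))

ref = np.array([1.0, -1.0, 1.0, 1.0])
rx = np.concatenate([np.zeros(5), ref, np.zeros(3)])  # burst delayed 5 samples
print(arrival_lag(rx, ref))  # 5
```

With two or more receivers, differences between such lags give the time-difference-of-arrival measurements a locator needs.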

  10. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    Science.gov (United States)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  11. Satellite Communications Using Commercial Protocols

    Science.gov (United States)

    Ivancic, William D.; Griner, James H.; Dimond, Robert; Frantz, Brian D.; Kachmar, Brian; Shell, Dan

    2000-01-01

    NASA Glenn Research Center has been working with industry, academia, and other government agencies in assessing commercial communications protocols for satellite and space-based applications. In addition, NASA Glenn has been developing and advocating new satellite-friendly modifications to existing communications protocol standards. This paper summarizes recent research into the applicability of various commercial standard protocols for use over satellite and space- based communications networks as well as expectations for future protocol development. It serves as a reference point from which the detailed work can be readily accessed. Areas that will be addressed include asynchronous-transfer-mode quality of service; completed and ongoing work of the Internet Engineering Task Force; data-link-layer protocol development for unidirectional link routing; and protocols for aeronautical applications, including mobile Internet protocol routing for wireless/mobile hosts and the aeronautical telecommunications network protocol.
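A back-of-envelope calculation shows why standard transport protocols need satellite-friendly modifications (the link numbers below are illustrative, not from the paper): over a geostationary link, the bandwidth-delay product can far exceed a classic 64 KB TCP window, stalling throughput unless window scaling or similar mechanisms are used.

```python
# Bandwidth-delay product: bytes that must be "in flight" to fill the pipe.
def bandwidth_delay_product(bandwidth_bps, rtt_s):
    return bandwidth_bps * rtt_s / 8  # bits -> bytes

# Example GEO link: 10 Mbit/s with ~560 ms round-trip time
bdp = bandwidth_delay_product(10e6, 0.560)
print(int(bdp))          # 700000 bytes
print(bdp > 64 * 1024)   # True: far larger than a 64 KB window
```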

  12. Software metrics to improve software quality in HEP

    International Nuclear Information System (INIS)

    Lancon, E.

    1996-01-01

    The ALEPH reconstruction program maintainability has been evaluated with a case tool implementing an ISO standard methodology based on software metrics. It has been found that the overall quality of the program is good and has shown improvement over the past five years. Frequently modified routines exhibits lower quality; most buys were located in routines with particularly low quality. Implementing from the beginning a quality criteria could have avoided time losses due to bug corrections. (author)

  13. gr-MRI: A software package for magnetic resonance imaging using software defined radios

    Science.gov (United States)

    Hasselwander, Christopher J.; Cao, Zhipeng; Grissom, William A.

    2016-09-01

The goal of this work is to develop software that enables the rapid implementation of custom MRI spectrometers using commercially-available software defined radios (SDRs). The developed gr-MRI software package comprises a set of Python scripts, flowgraphs, and signal generation and recording blocks for GNU Radio, an open-source SDR software package that is widely used in communications research. gr-MRI implements basic event sequencing functionality, and tools for system calibrations, multi-radio synchronization, and MR signal processing and image reconstruction. It includes four pulse sequences: a single-pulse sequence to record free induction signals, a gradient-recalled echo imaging sequence, a spin echo imaging sequence, and an inversion recovery spin echo imaging sequence. The sequences were used to perform phantom imaging scans with a 0.5 Tesla tabletop MRI scanner and two commercially-available SDRs. One SDR was used for RF excitation and reception, and the other for gradient pulse generation. The total SDR hardware cost was approximately $2000. The frequency of radio desynchronization events and the frequency with which the software recovered from those events were also measured, and the SDR's ability to generate frequency-swept RF waveforms was validated and compared to the scanner's commercial spectrometer. The spin echo images geometrically matched those acquired using the commercial spectrometer, with no unexpected distortions. Desynchronization events were more likely to occur at the very beginning of an imaging scan, but were nearly eliminated if the user invoked the sequence for a short period before beginning data recording. The SDR produced a 500 kHz bandwidth frequency-swept pulse with high fidelity, while the commercial spectrometer produced a waveform with large frequency spike errors. In conclusion, the developed gr-MRI software can be used to develop high-fidelity, low-cost custom MRI spectrometers using commercially-available SDRs.
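The frequency-swept pulse validated above is a linear chirp; a minimal sketch of generating one as complex baseband samples (parameters are examples, not gr-MRI's) is:

```python
# Complex baseband linear chirp: instantaneous frequency sweeps f0 -> f1.
import numpy as np

def linear_chirp(f0, f1, duration, fs):
    """f0, f1 in Hz; duration in s; fs = sample rate in samples/s."""
    t = np.arange(int(duration * fs)) / fs
    phase = 2 * np.pi * (f0 * t + 0.5 * (f1 - f0) / duration * t**2)
    return np.exp(1j * phase)

# 500 kHz sweep (-250 kHz to +250 kHz) over 2 ms at 2 MS/s
pulse = linear_chirp(-250e3, 250e3, 2e-3, 2e6)
print(len(pulse))                       # 4000 samples
print(np.allclose(np.abs(pulse), 1.0))  # True: constant envelope
```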

  14. 7 CFR 51.1435 - U.S. Commercial Pieces.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false U.S. Commercial Pieces. 51.1435 Section 51.1435 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... STANDARDS) United States Standards for Grades of Shelled Pecans Grades § 51.1435 U.S. Commercial Pieces. The...

  15. 7 CFR 51.1433 - U.S. Commercial Halves.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false U.S. Commercial Halves. 51.1433 Section 51.1433 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... STANDARDS) United States Standards for Grades of Shelled Pecans Grades § 51.1433 U.S. Commercial Halves. The...

  16. Thermal comfort in commercial kitchens (RP-1469)

    DEFF Research Database (Denmark)

    Simone, Angela; Olesen, Bjarne W.; Stoops, John L.

    2013-01-01

The indoor climate in commercial kitchens is often unsatisfactory, and working conditions can have a significant effect on employees’ comfort and productivity. The type of establishment (fast food, casual, etc.) and climatic zone can influence thermal conditions in the kitchens. Moreover, the size and arrangement of the kitchen zones, appliances, etc., further complicate an evaluation of the indoor thermal environment in commercial kitchens. In general, comfort criteria are stipulated in international standards (e.g., ASHRAE 55 or ISO EN 7730), but are these standardized methods applicable? ... The predicted mean vote/predicted percentage of dissatisfied (PMV/PPD) index is not directly appropriate for all thermal conditions in commercial kitchens.
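The PMV/PPD relation whose applicability the authors question is a fixed curve defined in ISO EN 7730: the predicted percentage of dissatisfied (PPD) follows from the predicted mean vote (PMV), with a minimum of 5% at PMV = 0.

```python
# ISO EN 7730 relation between PMV and PPD.
import math

def ppd(pmv):
    """Predicted percentage of dissatisfied (%) from predicted mean vote."""
    return 100 - 95 * math.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

print(round(ppd(0.0), 1))  # 5.0: even at neutral PMV, 5% remain dissatisfied
print(ppd(2.0) > 70)       # True: a "warm" vote leaves most dissatisfied
```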

  17. Software agents for the dissemination of remote terrestrial sensing data

    Science.gov (United States)

    Toomey, Christopher N.; Simoudis, Evangelos; Johnson, Raymond W.; Mark, William S.

    1994-01-01

    Remote terrestrial sensing (RTS) data is constantly being collected from a variety of space-based and earth-based sensors. The collected data, and especially 'value-added' analyses of the data, are finding growing application for commercial, government, and scientific purposes. The scale of this data collection and analysis is truly enormous; e.g., by 1995, the amount of data available in just one sector, NASA space science, will reach 5 petabytes. Moreover, the amount of data, and the value of analyzing the data, are expected to increase dramatically as new satellites and sensors become available (e.g., NASA's Earth Observing System satellites). Lockheed and other companies are beginning to provide data and analysis commercially. A critical issue for the exploitation of collected data is the dissemination of data and value-added analyses to a diverse and widely distributed customer base. Customers must be able to use their computational environment (eventually the National Information Infrastructure) to obtain timely and complete information, without having to know the details of where the relevant data resides and how it is accessed. Customers must be able to routinely use standard, widely available (and, therefore, low cost) analyses, while also being able to readily create on demand highly customized analyses to make crucial decisions. The diversity of user needs creates a difficult software problem: how can users easily state their needs, while the computational environment assumes the responsibility of finding (or creating) relevant information, and then delivering the results in a form that users understand? A software agent is a self-contained, active software module that contains an explicit representation of its operational knowledge. This explicit representation allows agents to examine their own capabilities in order to modify their goals to meet changing needs and to take advantage of dynamic opportunities. In addition, the explicit representation

  18. Track counting and thickness measurement of LR115 radon detectors using a commercial image scanner

    International Nuclear Information System (INIS)

    De Cicco, F.; Pugliese, M.; Roca, V.; Sabbarese, C.

    2014-01-01

An original optical method for track counting and film thickness determination of etched LR115 radon detectors was developed. The method offers several advantages compared with standard techniques. In particular, it is non-destructive, very simple, and rather inexpensive, since it uses a commercial scanner and free software. The complete analysis and the calibration procedure carried out for the determination of radon specific activity are reported. A comparison with the results of spark counting defines the accuracy and the precision of the new technique. (authors)
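The counting step of such an image-based method can be sketched as follows (an illustration, not the paper's own software): after thresholding the scanned image, each connected blob of "hole" pixels is counted as one etched track.

```python
# Count etched tracks as 4-connected components in a thresholded image.
def count_tracks(binary):
    """binary: 2D list of 0/1 pixels; returns the number of blobs of 1s."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]
                while stack:  # flood-fill this blob so it is counted once
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and binary[y][x] and not seen[y][x]):
                        seen[y][x] = True
                        stack.extend([(y+1, x), (y-1, x), (y, x+1), (y, x-1)])
    return count

image = [[1, 1, 0, 0],
         [0, 0, 0, 1],
         [0, 1, 0, 1]]
print(count_tracks(image))  # 3 distinct blobs
```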

  19. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Ritsch, Elmar; The ATLAS collaboration

    2017-01-01

In the last year ATLAS has radically updated its software development infrastructure hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system over the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. This resulted in a far more standard build process that was married to the method of building ATLAS software as a series of 12 separate projects from Subversion. We then proceeded with a migration of the code base from Subversion to Git. As the Subversion repository had been structured to manage each package more or less independently there was no simple mapping that could be used to manage the migration into Git. Instead a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repositor...

  20. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Gaycken, Goetz; The ATLAS collaboration

    2017-01-01

    In the last year ATLAS has radically updated its software development infrastructure hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system over the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. This resulted in a far more standard build process that was married to the method of building ATLAS software as a series of 12 separate projects from SVN. We then proceeded with a migration of its code base from SVN to git. As the SVN repository had been structured to manage each package more or less independently there was no simple mapping that could be used to manage the migration into git. Instead a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repository and the policy of onl...

  1. 2016 International Conference on Software Process Improvement

    CERN Document Server

    Muñoz, Mirna; Rocha, Álvaro; Feliu, Tomas; Peña, Adriana

    2017-01-01

    This book offers a selection of papers from the 2016 International Conference on Software Process Improvement (CIMPS’16), held between the 12th and 14th of October 2016 in Aguascalientes, Aguascalientes, México. The CIMPS’16 is a global forum for researchers and practitioners to present and discuss the most recent innovations, trends, results, experiences and concerns in the different aspects of software engineering with a focus on, but not limited to, software processes, security in information and communication technology, and big data. The main topics covered include: organizational models, standards and methodologies, knowledge management, software systems, applications and tools, information and communication technologies and processes in non-software domains (mining, automotive, aerospace, business, health care, manufacturing, etc.) with a clear focus on software process challenges.

  2. TMT approach to observatory software development process

    Science.gov (United States)

    Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder

    2016-07-01

    The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate

  3. Bayesian Software Health Management for Aircraft Guidance, Navigation, and Control

    Science.gov (United States)

    Schumann, Johann; Mbaya, Timmy; Mengshoel, Ole

    2011-01-01

    Modern aircraft, both piloted fly-by-wire commercial aircraft and UAVs, increasingly depend on highly complex, safety-critical software systems with many sensors and computer-controlled actuators. Despite careful design and V&V of the software, severe incidents have occurred due to malfunctioning software. In this paper, we discuss the use of Bayesian networks (BNs) to monitor the health of the on-board software and sensor system, and to perform advanced on-board diagnostic reasoning. We focus on the approach to developing reliable and robust health models for the combined software and sensor systems.
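
The Bayesian update at the core of such diagnostic reasoning can be illustrated with a minimal two-node example (a sketch under assumed probabilities, not the authors' health model):

```python
# Minimal sketch of Bayesian-network-style diagnostic reasoning: a hidden
# "software/sensor healthy" state influences the probability that a sensor
# reading is plausible. All probabilities below are illustrative assumptions.

def posterior_health(p_healthy, p_plausible_given_healthy,
                     p_plausible_given_faulty, reading_plausible):
    """Posterior P(healthy | evidence) via Bayes' rule."""
    if reading_plausible:
        like_h = p_plausible_given_healthy
        like_f = p_plausible_given_faulty
    else:
        like_h = 1.0 - p_plausible_given_healthy
        like_f = 1.0 - p_plausible_given_faulty
    num = like_h * p_healthy
    den = num + like_f * (1.0 - p_healthy)
    return num / den

# A plausible reading raises belief in health; an implausible one lowers it.
p_ok = posterior_health(0.99, 0.95, 0.20, True)
p_bad = posterior_health(0.99, 0.95, 0.20, False)
```

Real on-board health models chain many such nodes and run the inference continuously against live sensor and software telemetry.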

  4. Development of Radio Frequency Antenna Radiation Simulation Software

    International Nuclear Information System (INIS)

    Mohamad Idris Taib; Rozaimah Abd Rahim; Noor Ezati Shuib; Wan Saffiey Wan Abdullah

    2014-01-01

    Antennas are widely used nationwide for radio frequency propagation, especially in communication systems. Radio frequency radiation is non-ionizing electromagnetic radiation in the spectrum from 10 kHz to 300 GHz, and human exposure to it carries a radiation hazard risk. This software was under development using LabVIEW for radio frequency exposure calculation. In the first phase of development, the software calculates the possible maximum exposure for quick base station assessment using prediction methods. The software can also be used for educational purposes. Some results of this software are compared with the commercial IXUS and freeware NEC software. (author)
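
The "possible maximum exposure" prediction the abstract mentions is commonly based on the far-field power density formula S = P·G / (4πd²); a hedged sketch with illustrative, non-regulatory numbers (function names and the limit value are assumptions, not the software's actual API):

```python
import math

# Quick maximum-exposure estimate: far-field power density of an antenna,
# S = P * G / (4 * pi * d**2), compared against an assumed exposure limit.

def power_density(power_w, gain_linear, distance_m):
    """Far-field power density in W/m^2 at distance d from the antenna."""
    return power_w * gain_linear / (4.0 * math.pi * distance_m ** 2)

def exceeds_limit(power_w, gain_linear, distance_m, limit_w_m2):
    """True if the predicted density exceeds the given exposure limit."""
    return power_density(power_w, gain_linear, distance_m) > limit_w_m2

# 20 W transmitter, gain ~50 (linear), at 10 m:
s = power_density(20.0, 50.0, 10.0)   # ~0.8 W/m^2
```

A real assessment tool would additionally account for antenna pattern, frequency-dependent limits, and near-field corrections close to the antenna.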

  5. BIM Software Capability and Interoperability Analysis : An analytical approach toward structural usage of BIM software (S-BIM)

    OpenAIRE

    A. Taher, Ali

    2016-01-01

    This study focused on the structural analysis of BIM models. Different commercial software packages (Autodesk products and Rhinoceros) are presented through modelling and analysis of different structures with varying complexity, section properties, geometry, and material. Besides the commercial software, different architectural tools and tools for structural analysis are evaluated (Dynamo, Grasshopper, add-on tools, direct link, indirect link via IFC). BIM and Structural BIM (S-BIM)

  6. Commercial Radio as Communication.

    Science.gov (United States)

    Rothenbuhler, Eric W.

    1996-01-01

    Compares the day-to-day work routines of commercial radio with the principles of a theoretical communication model. Illuminates peculiarities of the conduct of communication by commercial radio. Discusses the application of theoretical models to the evaluation of practicing institutions. Offers assessments of commercial radio deriving from…

  7. Commercial Banking Industry Survey.

    Science.gov (United States)

    Bright Horizons Children's Centers, Cambridge, MA.

    Work and family programs are becoming increasingly important in the commercial banking industry. The objective of this survey was to collect information and prepare a commercial banking industry profile on work and family programs. Fifty-nine top American commercial banks from the Fortune 500 list were invited to participate. Twenty-two…

  8. Software for Data Acquisition AMC Module with PCI Express Interface

    CERN Document Server

    Szachowalow, S; Makowski, D; Butkowski, L

    2010-01-01

    Free Electron Laser in Hamburg (FLASH) and the X-Ray Free Electron Laser (XFEL) are linear accelerators that require a complex and accurate Low Level Radio Frequency (LLRF) control system. Currently working systems are based on the aged Versa Module Eurocard (VME) architecture. One alternative to the VME bus is the Advanced Telecommunications Computing Architecture (ATCA) standard. The ATCA-based LLRF controller consists mainly of a few ATCA carrier boards and several Advanced Mezzanine Cards (AMC). AMC modules are available with a variety of functions, such as ADC, DAC, data storage, data links and even CPU cards. This paper focuses on the software that allows the user to collect and plot data from the commercially available TAMC900 board.
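
The data-collection side of such a tool typically decodes raw ADC buffers into physical units before plotting. A minimal sketch under assumed conventions (little-endian signed 16-bit samples and a 1 V full scale; this is not the actual TAMC900 driver API):

```python
import struct

# Unpack a buffer of little-endian signed 16-bit ADC samples and scale
# them to volts, ready for plotting or storage.

def decode_samples(raw: bytes, full_scale_v=1.0, bits=16):
    """Convert a raw sample buffer to a list of voltages."""
    count = len(raw) // 2
    codes = struct.unpack("<%dh" % count, raw[:count * 2])
    lsb = full_scale_v / (2 ** (bits - 1))   # volts per ADC count
    return [c * lsb for c in codes]

# Example buffer: 0, half scale, minus half scale, near full scale.
buf = struct.pack("<4h", 0, 16384, -16384, 32767)
volts = decode_samples(buf)   # [0.0, 0.5, -0.5, ~1.0]
```

In practice the acquisition software would read such buffers from the board's DMA region over PCI Express rather than constructing them by hand.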

  9. Taking advantage of ground data systems attributes to achieve quality results in testing software

    Science.gov (United States)

    Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.

    1994-01-01

    During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved, only approached with varying degrees of success. With the emphasis on building low cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission-specific versions of the TASS. Very little new software needs to be developed, mainly mission-specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.

  10. Quality Assurance Results for a Commercial Radiosurgery System: A Communication.

    Science.gov (United States)

    Ruschin, Mark; Lightstone, Alexander; Beachey, David; Wronski, Matt; Babic, Steven; Yeboah, Collins; Lee, Young; Soliman, Hany; Sahgal, Arjun

    2015-10-01

    The purpose of this communication is to inform the radiosurgery community of quality assurance (QA) results requiring attention in a commercial FDA-approved linac-based cone stereotactic radiosurgery (SRS) system. Standard published QA guidelines from the American Association of Physicists in Medicine (AAPM) were followed during the SRS system's commissioning process, including end-to-end testing, cone concentricity testing, image transfer verification, and documentation. Several software and hardware deficiencies that were deemed risky were uncovered during the process, and QA processes were put in place to mitigate these risks during clinical practice. In particular, the present work focuses on daily cone concentricity testing and commissioning-related findings associated with the software. Cone concentricity/alignment is measured daily using both optical light field inspection and quantitative radiation field tests with the electronic portal imager. In 10 out of 36 clinical treatments, adjustments to the cone position had to be made to align the cone with the collimator axis to less than 0.5 mm, and on two occasions the pre-adjustment measured offset was 1.0 mm. Software-related errors discovered during commissioning included incorrect transfer of the isocentre in DICOM coordinates, improper handling of non-axial image sets, and complex handling of beam data, especially for multi-target treatments. QA processes were established to mitigate the occurrence of the software errors. With proper QA processes, the reported SRS system complies with tolerances set out in established guidelines. Discussions with the vendor are ongoing to address some of the hardware issues related to cone alignment. © The Author(s) 2014.
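
The daily concentricity check described above reduces to comparing a measured radiation-field centre with the collimator axis against the 0.5 mm action level; a minimal sketch (illustrative function names and geometry, not the clinic's actual QA software):

```python
import math

# Daily cone-concentricity check: compute the 2-D offset between the
# field centre measured on the portal imager and the collimator axis,
# and flag offsets above the 0.5 mm action level cited in the abstract.

TOLERANCE_MM = 0.5

def cone_offset_mm(field_centre, axis_centre):
    """Radial offset (mm) between measured field centre and collimator axis."""
    dx = field_centre[0] - axis_centre[0]
    dy = field_centre[1] - axis_centre[1]
    return math.hypot(dx, dy)

def needs_adjustment(field_centre, axis_centre, tol=TOLERANCE_MM):
    """True if the cone must be re-aligned before treatment."""
    return cone_offset_mm(field_centre, axis_centre) > tol
```

In practice the field centre would be extracted from a portal image (e.g. by centroiding the radiation field), not entered by hand.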

  11. A NEW EXHAUST VENTILATION SYSTEM DESIGN SOFTWARE

    Directory of Open Access Journals (Sweden)

    H. Asilian Mahabady

    2007-09-01

    A Microsoft Windows based ventilation software package was developed to reduce the time-consuming and tedious procedure of exhaust ventilation system design and to assure accurate and reliable air pollution control related calculations. The package is tentatively named Exhaust Ventilation Design Software and is developed in the VB6 programming environment. The most important features of Exhaust Ventilation Design Software that are ignored in formerly developed packages are collector design and fan dimension data calculations. Automatic system balance is another feature of this package. The design algorithm of Exhaust Ventilation Design Software is based on two methods: balance by design (static pressure balance) and design by blast gate. The most important section of the software is a spreadsheet designed according to American Conference of Governmental Industrial Hygienists calculation sheets, so that engineers familiar with the American Conference of Governmental Industrial Hygienists datasheet can easily employ it for ventilation system design. Other sections include a collector design section (settling chamber, cyclone, and packed tower), a fan geometry and dimension data section, a unit converter section (to help engineers deal with units), a hood design section and a Persian HTML help. Psychrometric correction is also considered in Exhaust Ventilation Design Software. In the design process, efforts were focused on improving the GUI (graphical user interface) and on applying programming standards in software design. The reliability of the software has been evaluated and the results show acceptable accuracy.
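
The static-pressure balance method the abstract names can be sketched as follows; the ratio bands are a hedged summary of ACGIH-style calculation-sheet practice (consult the ACGIH manual for the authoritative procedure), and the function names are illustrative:

```python
import math

# Static-pressure balance at a duct junction: compare the static pressures
# of the merging paths; if the governing (higher-loss) path dominates only
# slightly, boost the other branch's flow to equalize, otherwise redesign.

def velocity_pressure(v_fpm):
    """Velocity pressure (inches w.g.) for standard air, velocity in ft/min."""
    return (v_fpm / 4005.0) ** 2

def balance_branch(q_cfm, sp_branch, sp_governing):
    """Return (corrected_flow_cfm, action) per the usual ratio bands."""
    ratio = sp_governing / sp_branch
    if ratio <= 1.05:
        return q_cfm, "within 5%: no correction"
    if ratio <= 1.20:
        # Flow scales with the square root of the static-pressure ratio.
        return q_cfm * math.sqrt(ratio), "increase branch flow"
    return q_cfm, "ratio > 1.2: redesign branch"
```

A spreadsheet-style design tool applies this branch-by-branch from the hoods toward the fan, which is what makes automatic system balancing possible.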

  12. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  13. Spotting software errors sooner

    International Nuclear Information System (INIS)

    Munro, D.

    1989-01-01

    Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)
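
As a toy illustration of what static analysis does (inspecting code without executing it; this is not MALPAS), the following flags names that are read before any assignment in straight-line Python code:

```python
import ast

# Walk module-level statements in source order; report any name loaded
# before a statement has assigned it. The code under analysis is never run.

def read_before_write(src):
    """Return names read before assignment in straight-line code."""
    assigned, flagged = set(), []
    for stmt in ast.parse(src).body:
        # Names read in this statement.
        for node in ast.walk(stmt):
            if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Load):
                if node.id not in assigned:
                    flagged.append(node.id)
        # Names this statement assigns.
        for node in ast.walk(stmt):
            if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Store):
                assigned.add(node.id)
    return flagged
```

Industrial tools like MALPAS go far beyond this, checking control flow, data flow, and conformance of code to its formal specification.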

  14. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square-feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  15. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets…, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation…, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation…

  16. Process mining software repositories

    NARCIS (Netherlands)

    Poncin, W.; Serebrenik, A.; Brand, van den M.G.J.

    2011-01-01

    Software developers' activities are in general recorded in software repositories such as version control systems, bug trackers and mail archives. While abundant information is usually present in such repositories, successful information extraction is often challenged by the necessity to

  17. Software Assurance Curriculum Project Volume 1: Master of Software Assurance Reference Curriculum

    Science.gov (United States)

    2010-08-01

    Fragmentary record excerpts cite ISO/IEC 12207 (IEEE Std 12207-2008, Systems and Software Engineering) and CNSS 2009, including the definition of software quality as the capability of a software product to satisfy stated and implied needs when used under specified conditions, plus glossary entries: ISO, International Organization for Standardization; IT, information technology; KA, knowledge area; KU, knowledge unit; MBA, Master of

  18. An Introduction to Flight Software Development: FSW Today, FSW 2010

    Science.gov (United States)

    Gouveia, John

    2004-01-01

    Experience and knowledge gained from ongoing maintenance of Space Shuttle Flight Software and new development projects, including the Cockpit Avionics Upgrade, are applied to the projected needs of the National Space Exploration Vision through Spiral 2. Lessons learned from these current activities are applied to create a sustainable, reliable model for development of critical software to support Project Constellation. This presentation introduces the technologies, methodologies, and infrastructure needed to produce and sustain high quality software. It proposes what is needed to support a Vision for Space Exploration that places demands on the innovation and productivity needed to support future space exploration. The technologies in use today within FSW development include tools that provide requirements tracking, integrated change management, and modeling and simulation software. Specific challenges that have been met include the introduction and integration of a Commercial Off the Shelf (COTS) Real Time Operating System for critical functions. Though technology prediction has proved to be imprecise, Project Constellation requirements will need continued integration of new technology with evolving methodologies and changing project infrastructure. Targets for continued technology investment are integrated health monitoring and management, self-healing software, standard payload interfaces, autonomous operation, and improvements in training. Emulation of the target hardware will also allow significant streamlining of development and testing. The methodologies in use today for FSW development are object-oriented UML design, iterative development using independent components, and rapid prototyping. In addition, Lean Six Sigma and CMMI play a critical role in the quality and efficiency of the workforce processes.
Over the next six years, we expect these methodologies to merge with other improvements into a consolidated office culture with all processes being guided by

  19. Optimization of Antivirus Software

    OpenAIRE

    Catalin BOJA; Adrian VISOIU

    2007-01-01

    The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined that help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyze...

  20. Open Source Software Development

    Science.gov (United States)

    2011-01-01

    Fragmentary record excerpts: it may be appropriate to refer to FOSS or FLOSS (L for Libre, where the alternative term "libre software" has popularity in some parts of the world); a cited reference on applying social network analysis to community-driven libre software projects (Intern. J. Info. Tech. and Web Engineering, 2006, 1(3), 27-28); and the title page: Open Source Software Development, Walt Scacchi, Institute for Software Research, University of California, Irvine, Irvine, CA 92697-3455 USA.